# DogeGPT Meme Coin
The meme coin will be launched soon. Join our socials to find out more (and invest early). All other DogeGPTs are fake; only check the following socials for updates. Share them and mention us on X (Twitter).
# DogeGPT1-1B
DogeGPT1-1B is an open-source, 1.24B-parameter large language model (LLM) designed to bring the fun of meme coins and the power of AI together! Built on the LLaMA architecture, DogeGPT is tailored for conversational AI applications with a playful twist. Whether you're a meme coin enthusiast, a developer, or an AI explorer, DogeGPT is here to spark your creativity.
3B- and 8B-parameter LLMs will be announced soon.
## Model Overview
- Model Name: DogeGPT1-1B
- Architecture: LLaMA
- Model Size: 1.24B parameters
- Quantization Formats: GGUF (2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit)
- License: Apache 2.0
- Tags: PyTorch, LLaMA, TRL, GGUF, conversational
- Downloads Last Month: 115
## Features
- Conversational AI: Perfect for building chatbots, virtual assistants, or meme-themed conversational models.
- Quantization Support: Includes efficient GGUF formats for deployment in resource-constrained environments (see the download sketch after this list).
- Open Source: Fully available under the permissive Apache 2.0 license.
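The GGUF builds can be pulled straight from the Hub and loaded into any GGUF-compatible runtime (for example, llama.cpp). Below is a minimal download sketch using `huggingface_hub`; the filename `dogegpt1-1b.Q4_K_M.gguf` is a placeholder, so check the repository's file listing for the actual GGUF names.

```python
from huggingface_hub import hf_hub_download

# Fetch one of the quantized GGUF files from the Hub.
# "dogegpt1-1b.Q4_K_M.gguf" is a hypothetical filename; replace it with one
# of the GGUF files actually listed in the Doge-GPT/DogeGPT1-1B repository.
gguf_path = hf_hub_download(
    repo_id="Doge-GPT/DogeGPT1-1B",
    filename="dogegpt1-1b.Q4_K_M.gguf",
)
print(f"GGUF file downloaded to: {gguf_path}")
```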
## Getting Started
### Installation
Install the required dependencies:
```bash
pip install transformers huggingface_hub
```
### Usage Example
Here's how to load DogeGPT1-1B using `transformers`:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
model = AutoModelForCausalLM.from_pretrained("Doge-GPT/DogeGPT1-1B")
tokenizer = AutoTokenizer.from_pretrained("Doge-GPT/DogeGPT1-1B")

# Generate text
input_text = "What is DogeGPT?"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
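For chat-style use, the tokenizer's chat template can be applied if the repository ships one; that is an assumption here, so fall back to plain text prompts (as above) if `apply_chat_template` reports a missing template. A minimal sketch:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("Doge-GPT/DogeGPT1-1B")
tokenizer = AutoTokenizer.from_pretrained("Doge-GPT/DogeGPT1-1B")

# Build a single-turn conversation and format it with the model's chat template
# (assumes the repo defines one; otherwise pass the prompt directly as plain text).
messages = [{"role": "user", "content": "Explain DogeGPT in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(input_ids, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```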