---
license: mit
language:
- en
library_name: transformers
pipeline_tag: text-generation
datasets:
- Open-Orca/OpenOrca
tags:
- freeai
- conversational
- meowgpt
- gpt
- free
- opensource
- splittic
- ai
widget:
- text: <s> [|User|] Hello World </s>[|Assistant|]
---
# MeowGPT Readme

## Overview

MeowGPT, developed by CutyCat2000, is a Llama-based language model at checkpoint version 2.5. Trained on the OpenOrca dataset, it is designed to generate text in a conversational manner and can be used for a variety of natural language processing tasks.
## Usage

### Loading the Model

To use MeowGPT, load it via the `transformers` library in Python:
```python
from transformers import AutoModelForCausalLM, LlamaTokenizer

tokenizer = LlamaTokenizer.from_pretrained("cutycat2000x/MeowGPT-2.5")
model = AutoModelForCausalLM.from_pretrained("cutycat2000x/MeowGPT-2.5")
```
### Example Prompt

An example of how to prompt the model for generating text:

```python
prompt = "<s> [|User|] Hello World </s>[|Assistant|]"
```

Here `<s>` and `</s>` are the sequence start and end tokens.
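As a minimal sketch, the template above can be wrapped in a small helper, with generation following the standard `transformers` generate/decode flow. The helper names `format_prompt` and `generate_reply`, and the sampling parameters, are illustrative assumptions, not part of the official model card:

```python
def format_prompt(user_message: str) -> str:
    """Wrap a user message in the chat template shown above.

    This is a hypothetical helper, not an official API of the model.
    """
    return f"<s> [|User|] {user_message} </s>[|Assistant|]"


def generate_reply(model, tokenizer, user_message: str, max_new_tokens: int = 128) -> str:
    """Generate a reply using the standard transformers generate/decode pattern.

    The sampling settings are illustrative defaults, not values
    recommended by the model author.
    """
    prompt = format_prompt(user_message)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    # Decode only the newly generated tokens, skipping the prompt echo.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


# The template helper needs no model download:
print(format_prompt("Hello World"))
# <s> [|User|] Hello World </s>[|Assistant|]
```

With the tokenizer and model loaded as in the section above, `generate_reply(model, tokenizer, "Hello World")` would return the assistant's text only.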
## About the Model

- Base Model: Llama
- Checkpoint Version: 2.5
- Dataset Used: OpenOrca
## Citation

If you use MeowGPT in your research or projects, please consider citing CutyCat2000 and the relevant resources associated with the OpenOrca dataset.
## Disclaimer

Please note that while MeowGPT is trained to generate text from given prompts, it may not always produce accurate or contextually appropriate responses. Review and validate generated content before using it in critical applications.

For more information or support, refer to the `transformers` library documentation or CutyCat2000's resources.