cutycat2000x committed
Commit 6fd13e3
1 Parent(s): 8cde156

Create README.md

Files changed (1): README.md (+61, -0)
README.md CHANGED

---
license: mit
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- freeai
- conversational
- meowgpt
- gpt
- free
- opensource
- splittic
- ai
- llama
- llama3
widget:
- text: <s> [|User|] Hello World </s>[|Assistant|]
datasets:
- Open-Orca/SlimOrca-Dedup
- jondurbin/airoboros-3.2
- microsoft/orca-math-word-problems-200k
- m-a-p/Code-Feedback
- MaziyarPanahi/WizardLM_evol_instruct_V2_196k
- mlabonne/orpo-dpo-mix-40k
---

# MeowGPT Readme

## Overview
MeowGPT, developed by CutyCat2000x, is a conversational language model based on Llama 3 (checkpoint version ll3). It is designed to generate text in a chat format and can be used for a range of natural language processing tasks.

## Usage
### Loading the Model
To use MeowGPT, you can load it via the `transformers` library in Python using the following code:

```python
from transformers import AutoModelForCausalLM, LlamaTokenizer

# Download the tokenizer and model weights from the Hugging Face Hub
tokenizer = LlamaTokenizer.from_pretrained("cutycat2000x/MeowGPT-ll3")
model = AutoModelForCausalLM.from_pretrained("cutycat2000x/MeowGPT-ll3")
```
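
Once loaded, a minimal generation sketch might look like the following. It assumes the prompt format from the widget example above (`[|User|]` / `[|Assistant|]` markers) and uses generic sampling settings, so treat it as a starting point rather than an official recipe:

```python
# Minimal generation sketch; the prompt markers and sampling settings are assumptions.
prompt = "<s> [|User|] Hello World </s>[|Assistant|]"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation and decode it back to text
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```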

### Example Prompt
The Jinja chat template used to turn a conversation into a prompt for the model:

```jinja
{{ bos_token }}{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% else %}{% set loop_messages = messages %}{% set system_message = false %}{% endif %}{% for message in loop_messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% if loop.index0 == 0 and system_message != false %}{% set content = '<<SYS>>\\n' + system_message + '\\n<</SYS>>\\n\\n' + message['content'] %}{% else %}{% set content = message['content'] %}{% endif %}{% if message['role'] == 'user' %}{{ '[INST] ' + content.strip() + ' [/INST]' }}{% elif message['role'] == 'assistant' %}{{ ' ' + content.strip() + eos_token }}{% endif %}{% endfor %}
```

The `<s>` and `</s>` markers are the beginning-of-sequence and end-of-sequence tokens.
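
If the template above ships as the tokenizer's chat template (an assumption; check `tokenizer.chat_template`), a message list can be rendered into this format with `apply_chat_template`:

```python
# Sketch: format a conversation with the tokenizer's chat template.
# Assumes the template above is stored in the tokenizer config.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello World"},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(prompt)  # e.g. "<s>[INST] <<SYS>>\n...\n<</SYS>>\n\nHello World [/INST]"
```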

## About the Model
- **Base Model**: Llama 3
- **Checkpoint Version**: ll3
- **Datasets Used**: Open-Orca/SlimOrca-Dedup, jondurbin/airoboros-3.2, microsoft/orca-math-word-problems-200k, m-a-p/Code-Feedback, MaziyarPanahi/WizardLM_evol_instruct_V2_196k, mlabonne/orpo-dpo-mix-40k

## Citation
If you use MeowGPT in your research or projects, please consider citing CutyCat2000x.

## Disclaimer
MeowGPT is trained to generate text from given prompts, but it may not always produce accurate or contextually appropriate responses. Review and validate generated content before relying on it in critical applications.

For more information or support, refer to the `transformers` library documentation or CutyCat2000x's resources.