saucam committed on
Commit
cb4971e
•
1 Parent(s): 96b4ac9

Update README.md

Files changed (1)
  1. README.md +84 -1
README.md CHANGED
@@ -5,7 +5,7 @@ tags:
 - cognitivecomputations/dolphin-2.9-llama3-8b
 - NousResearch/Hermes-2-Pro-Llama-3-8B
 - abacusai/Llama-3-Smaug-8B
- base_model:
+ models:
 - cognitivecomputations/dolphin-2.9-llama3-8b
 - NousResearch/Hermes-2-Pro-Llama-3-8B
 - abacusai/Llama-3-Smaug-8B

@@ -49,6 +49,8 @@ slices:
 
 ## 💻 Usage
 
+ ### Using pipelines
+
 ```python
 !pip install -qU transformers accelerate
 

@@ -70,4 +72,85 @@ pipeline = transformers.pipeline(
 
 outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
 print(outputs[0]["generated_text"])
+ ```
+
+ ```
+ Loading checkpoint shards: 100%|██████████████████████████████████████████████████| 2/2 [00:03<00:00, 1.62s/it]
+ Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
+ <|begin_of_text|><|im_start|>user
+ What is a large language model?<|im_end|>
+ <|im_start|>assistant
+ A large language model is a type of artificial intelligence (AI) model trained on a massive dataset of text, which enables it to understand and generate human language at a level of sophistication that is comparable to or even surpassing human ability. These models are typically based on deep learning architectures, such as transformer models, and are trained on a large corpus of text data, often in the billions of parameters.
+
+ Large language models are designed to understand the context, nuances, and complexities of human language, allowing them to perform a variety of tasks such as text generation, question answering, language translation, and more. They can generate coherent and contextually relevant text based on prompts or input data, making them useful for applications like chatbots, virtual assistants, language translation tools, and content generation.
+
+ Some examples of large language models include:
+
+ 1. GPT-3 (Generative Pre-trained Transformer 3) - Developed by OpenAI, this model has 175 billion parameters and is capable of generating human-like text and performing a wide range of tasks.
+ 2. BERT (Bidirectional Encoder Representations from Transformers) - Developed by Google, this model is widely used for natural language processing tasks like question answering, sentiment analysis, and language translation.
+ 3. T5 (Text-to-Text
+ ```
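+
+ For reference, here is a minimal sketch of the pipeline setup elided from the hunk above. This is an assumption, not the exact elided code: fp16 weights, automatic device placement, and a chat-template-formatted prompt.
+
+ ```python
+ import torch
+ import transformers
+
+ model = "saucam/aqua-smaug-hermes-8B"
+
+ # Render the chat messages into a prompt string using the model's chat template
+ tokenizer = transformers.AutoTokenizer.from_pretrained(model)
+ messages = [{"role": "user", "content": "What is a large language model?"}]
+ prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
+
+ # fp16 and device_map="auto" are assumptions, not confirmed by the diff
+ pipeline = transformers.pipeline(
+     "text-generation",
+     model=model,
+     torch_dtype=torch.float16,
+     device_map="auto",
+ )
+ ```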
+
+ ### Using model generation
+
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+ import torch
+
+ model_name = "saucam/aqua-smaug-hermes-8B"
+
+ model = AutoModelForCausalLM.from_pretrained(model_name)
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+ messages = [
+     {"role": "system", "content": "You are a sentient, superintelligent artificial general intelligence, here to teach and assist me."},
+     {"role": "user", "content": "Write a short story about Goku discovering Kirby has teamed up with Majin Buu to destroy the world."}
+ ]
+
+ device = "cuda"
+
+ # Tokenize the chat messages with the model's chat template
+ gen_input = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
+ model_inputs = gen_input.to(device)
+ model.to(device)
+
+ # Generate a response, then decode only the newly generated tokens
+ out = model.generate(model_inputs, max_new_tokens=750, temperature=0.8, repetition_penalty=1.1, do_sample=True, eos_token_id=tokenizer.eos_token_id)
+ response = tokenizer.decode(out[0][model_inputs.shape[-1]:], skip_special_tokens=True, clean_up_tokenization_spaces=True)
+ print(f"Response: {response}")
+ ```
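+
+ As the sample output below shows, `generate` warns when no attention mask or pad token id is passed. A minimal sketch of setting both explicitly, layered on the snippet above as an assumption (the single unpadded prompt makes an all-ones mask correct):
+
+ ```python
+ # Explicit mask: every input position is a real (non-padded) token
+ attention_mask = torch.ones_like(model_inputs)
+ out = model.generate(
+     model_inputs,
+     attention_mask=attention_mask,
+     pad_token_id=tokenizer.eos_token_id,  # silences the open-end generation warning
+     max_new_tokens=750,
+     temperature=0.8,
+     repetition_penalty=1.1,
+     do_sample=True,
+     eos_token_id=tokenizer.eos_token_id,
+ )
+ ```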
+
+ ```
+ Loading checkpoint shards: 100%|██████████████████████████████████████████████████| 2/2 [00:17<00:00, 8.56s/it]
+ /usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
+   warnings.warn(
+ Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
+ The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
+ Setting `pad_token_id` to `eos_token_id`:128003 for open-end generation.
+
+ Response: In a world where superheroes and villains coexisted, Goku, the legendary warrior from Earth, had always fought for peace and justice alongside his comrades. One day, he received a shocking message that shook him to his core.
+
+ "Goku! You won't believe who I've teamed up with," a familiar yet startling voice echoed through the universe's communication channels. It was Kirby, the pink puffball known for his copy abilities and heroic feats. However, something in his tone wasn't right this time.
+
+ Goku's initial reaction was disbelief, but as he connected the dots, his heart sank. If it were true, then it meant one of the most loathed characters in the galaxy, Majin Buu, had somehow formed an alliance with the usually benevolent Kirby.
+
+ Summoning all the power within him, Goku immediately rushed towards the scene, ready to confront whatever danger lay ahead. As he arrived, he found Kirby and Majin Buu working in tandem, their destructive energies intertwining like a twisted dance.
+
+ "Kirby, what have you done?!" Goku demanded, his anger blazing brighter than his signature Kamehameha wave.
+
+ But before Kirby could respond, Majin Buu gloated, "Ah, Goku! Your ignorance is your downfall. Together, we will bring chaos and destruction upon this realm, proving the absurdity of your so-called 'peace.'"
+
+ As they began their assault, Goku knew he couldn't take on both foes alone. He quickly sent out a distress signal to his allies across the universe, rallying them to help defend against this unexpected threat.
+
+ The battle raged on, with Goku and his team pushing back against the unholy alliance. Though Kirby's copying abilities made him a formidable opponent, Goku's sheer strength and determination kept him grounded. Meanwhile, Majin Buu's monstrous form made him nearly unstoppable.
+
+ It took a combined effort from Goku, his friends, and even some of Kirby's previous allies for the tide to turn. The final blow came when Vegeta, using the power of the Dragon Balls, created a massive explosion that separated Kirby and Majin Buu, each consumed by the blast.
+
+ When the dust settled, Goku approached Kirby, who groggily regained consciousness amidst the wreckage. The once cheerful hero looked remorseful, realizing the depths to which he'd fallen.
+
+ "It...it didn't feel like me," Kirby whispered. "Majin Buu somehow manipulated my copy abilities..."
+
+ Understanding dawned on Goku. "No matter how powerful or influential a force may be, never forget who you truly are," Goku said softly, helping Kirby stand upright. "Together, we'll ensure such a betrayal never happens again."
+
+ And so, Goku and Kirby joined forces anew, now more vigilant than ever, protecting the universe from threats both inside and outside their ranks. Their bond stronger than before, they remained steadfast guardians, a testament to the resilience that defined them as heroes. The alliance between Kirby and Majin Buu would forever serve as a cautionary tale, reminding them of the importance of staying true to their principles. Despite the darkness, there was still light, and together, they would keep shining. 🌟🛸💥👊️ #DragonBall #Kirby #HeroesUnite #GuardiansOfTheCosmos #PeaceAndJustice #LightVsDarkness #AllianceTurnedBetrayal #RemorsefulRegret #StrongerThanEver #TrueHeroesEndure 👍💪✨🌈🌠 #NeverGiveUp #DefeatChaosAndDestruction #TogetherWeStand #UnitedAgainstEvil ⚡️💫⭐️🔥🌊💨
 ```