---
language:
- ru
library_name: transformers
license: mit
---

# llama-600M-rus

A simple, experimental amateur model pretrained from scratch on fiction books (the model is updated regularly).<br>
Given its modest number of training tokens, the output is amateur but more or less adequate.<br>
The model can be used as a checkpoint for further training or for experiments; a continued-training sketch follows the usage example below.<br>

Simple usage example:

```python
from transformers import LlamaTokenizerFast, LlamaForCausalLM

model = LlamaForCausalLM.from_pretrained('demetera/llama-600M-rus')
tokenizer = LlamaTokenizerFast.from_pretrained('demetera/llama-600M-rus')

# "I went out into the street and ..."
prompt = "Я вышел на улицу и"
inputs = tokenizer(prompt, return_tensors='pt')

# Sample up to 250 new tokens with top-k / nucleus sampling.
outputs = model.generate(inputs.input_ids, attention_mask=inputs.attention_mask,
                         max_new_tokens=250, do_sample=True, top_k=50, top_p=0.95)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
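
Since the model is intended as a checkpoint for further training, below is a minimal continued-training sketch using the 🤗 `Trainer` API. The dataset file (`my_corpus.txt`), hyperparameters, and output directory are illustrative placeholders, not part of the original release; adjust them to your own data and hardware.

```python
from transformers import (LlamaForCausalLM, LlamaTokenizerFast, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from datasets import load_dataset

model = LlamaForCausalLM.from_pretrained('demetera/llama-600M-rus')
tokenizer = LlamaTokenizerFast.from_pretrained('demetera/llama-600M-rus')

# Llama tokenizers often ship without a pad token; reuse EOS for padding.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# 'my_corpus.txt' is a placeholder for your own Russian text corpus.
dataset = load_dataset('text', data_files={'train': 'my_corpus.txt'})['train']

def tokenize(batch):
    return tokenizer(batch['text'], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=['text'])

# Causal LM objective: the collator builds labels from the input ids.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir='llama-600M-rus-finetuned',
    per_device_train_batch_size=4,   # illustrative; fit to your GPU memory
    num_train_epochs=1,
    learning_rate=2e-5,
    save_steps=500,
)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```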