Mistral 7B-Holodeck is a finetune created using Mistral's 7B model.
## Training data
The training data contains around 3000 ebooks in various genres.
Most parts of the dataset have been prepended using the following text: `[Genre: <genre1>, <genre2>]`
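Because the model saw these tags during training, generation can be steered toward a genre by prefixing the prompt with the same format. A minimal sketch (the genre names below are illustrative, not a documented list):

```python
# Build a genre-conditioned prompt using the dataset's tag format.
# "Science Fiction" and "Adventure" are example genre names, not a
# guaranteed vocabulary from the training data.
genres = ["Science Fiction", "Adventure"]
prompt = f"[Genre: {', '.join(genres)}] The ship dropped out of warp"
print(prompt)
# [Genre: Science Fiction, Adventure] The ship dropped out of warp
```

The resulting string can then be passed to the pipeline shown below in place of a plain prompt.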
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/Mistral-7B-Holodeck-1')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt\'s all right," Janeway said. "I\'m certain that you\'re doing your best to keep me informed of what\'s going on."'}]
```
### Limitations and Biases
Based on known problems with NLP technology, potential relevant factors include bias (gender, profession, race and religion).