---
language:
- en
license: apache-2.0
tags:
- pretrained
- mlx
pipeline_tag: text-generation
inference:
  parameters:
    temperature: 0.7
---

# mistral-7b-v0.1-GreeceRome-v0.1

This classics model is a fine-tune of Mistral 7B on 1,640 Q/A pairs about Greek and Roman history, trained for 3 epochs.

The dataset was generated with Mixtral-8x7B Instruct v0.1, run over 512-token chunks of volumes 2 and 3 of Will Durant's 11-volume **Story of Civilization** (*The Life of Greece* and *Caesar and Christ*).

Training data was formatted with [INST] and [/INST] delimiting instructions:

```json
{"text": "Q: \"Why did many Greeks come to resent Rome's 'liberation' and 'peacekeeping' efforts, such as forbidding class war and interfering in disputes, despite Rome having given Greece freedom from previous conflicts?\"\nA: Many Greeks came to resent Rome's \"liberation\" and \"peacekeeping\" efforts due to several reasons. First, after the Romans had given Greece freedom...(blah blah blah)...interfering in their domestic affairs, and ultimately"}
```
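As a rough sketch of the data-preparation steps described above (chunk length is approximated here by whitespace-separated words rather than real tokens, and the file name and sample pair are hypothetical):

```python
import json

def chunk_text(text: str, chunk_len: int = 512) -> list[str]:
    # Approximate 512-token chunks by splitting on whitespace; the
    # actual run would count tokens with the model's tokenizer.
    words = text.split()
    return [" ".join(words[i:i + chunk_len])
            for i in range(0, len(words), chunk_len)]

def format_record(question: str, answer: str) -> str:
    # One JSONL line per Q/A pair, matching the record shown above.
    return json.dumps({"text": f'Q: "{question}"\nA: {answer}'})

# Hypothetical usage: write one record per generated Q/A pair.
pairs = [("Why did many Greeks come to resent Rome's 'liberation'?",
          "Many Greeks came to resent it for several reasons...")]
with open("train.jsonl", "w") as f:
    for q, a in pairs:
        f.write(format_record(q, a) + "\n")
```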

The model was converted to MLX format from [`mistralai/Mistral-7B-v0.1`](https://huggingface.co/mistralai/Mistral-7B-v0.1).

The Mistral-7B-v0.1 Large Language Model (LLM) is a pretrained generative text model with 7 billion parameters.

Refer to the [original model card](https://huggingface.co/mistralai/Mistral-7B-v0.1) for more details on the model.

## Use with mlx

```bash
pip install mlx
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
python generate.py --model mlx-community/mistral-7b-v0.1-GreeceRome-v0.1 --prompt "How does Aristotle define the soul?"
```
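Alternatively, a minimal Python sketch, assuming the separate `mlx-lm` package (`pip install mlx-lm`, not mentioned in this card) and its `load`/`generate` helpers:

```python
# Sketch assuming the mlx-lm package; load() downloads the model
# from the Hugging Face Hub on first use (several GB).
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/mistral-7b-v0.1-GreeceRome-v0.1")
text = generate(model, tokenizer,
                prompt="How does Aristotle define the soul?",
                max_tokens=256)
print(text)
```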