Update README.md
README.md
CHANGED
@@ -209,6 +209,8 @@ You can use GGUF models from Python using the [llama-cpp-python](https://github.
### How to load this model in Python code, using ctransformers

+I have not tested ctransformers with Mistral models. It may work, but will require that you set the `model_type` to `llama` for now, until ctransformers updates with specific support.
+
#### First install the package

Run one of the following commands, according to your system:
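For reference, a minimal sketch of what loading a GGUF model with ctransformers could look like under the workaround described in the added note. The repository id, GGUF file name, and `gpu_layers` value below are placeholders for illustration, not taken from this README:

```python
# Install the package first, e.g.: pip install ctransformers
from ctransformers import AutoModelForCausalLM

# Placeholder repo id and GGUF file name; substitute the actual model you are using.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Some-Mistral-Model-GGUF",           # hypothetical repository id
    model_file="some-mistral-model.Q4_K_M.gguf",  # hypothetical quantisation file
    model_type="llama",  # workaround: use "llama" until ctransformers adds Mistral-specific support
    gpu_layers=50,       # set to 0 if no GPU acceleration is available
)

print(llm("AI is going to"))
```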