---
title: Huggingface
---
To use Open Interpreter with Huggingface models, set the `model` flag:

<CodeGroup>

```bash Terminal
interpreter --model huggingface/<huggingface-model>
```

```python Python
from interpreter import interpreter

interpreter.llm.model = "huggingface/<huggingface-model>"
interpreter.chat()
```

</CodeGroup>
You may also need to specify your Huggingface API base URL:

<CodeGroup>

```bash Terminal
interpreter --api_base <https://my-endpoint.huggingface.cloud>
```

```python Python
from interpreter import interpreter

interpreter.llm.api_base = "https://my-endpoint.huggingface.cloud"
interpreter.chat()
```

</CodeGroup>
# Supported Models

Open Interpreter should work with almost any text-based Huggingface model.
# Required Environment Variables

Set the following environment variables [(click here to learn how)](https://chat.openai.com/share/1062cdd8-62a1-4aa8-8ec9-eca45645971a) to use these models.
| Environment Variable  | Description                 | Where to Find                                                                      |
| --------------------- | --------------------------- | ---------------------------------------------------------------------------------- |
| `HUGGINGFACE_API_KEY` | Huggingface account API key | [Huggingface -> Settings -> Access Tokens](https://huggingface.co/settings/tokens) |
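If you prefer not to export the variable in your shell, you can also set it from Python before starting a chat. A minimal sketch (the `hf_xxx` token value is a placeholder, not a real key):

```python
import os

# Set the Huggingface API key for the current process. Anything that
# reads the environment afterwards (e.g. Open Interpreter) will see it.
# "hf_xxx" is a hypothetical placeholder -- substitute your real token
# from https://huggingface.co/settings/tokens
os.environ["HUGGINGFACE_API_KEY"] = "hf_xxx"

print(os.environ["HUGGINGFACE_API_KEY"])
```

Note that this only affects the current process and its children; it does not persist the key across sessions the way a shell profile entry would.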