Update README.md
README.md CHANGED
````diff
@@ -50,8 +50,9 @@ model = AutoModelForCausalLM.from_pretrained(
     torch_dtype=torch.bfloat16
 )
 ```
-
-
+Currently, we support LLM endpoint generation, where you send a POST request to the generation endpoint (we recommend using Text Generation Inference by HuggingFace).
+First, modify config.py and add your generation endpoint.
+Then you can use it via the GOAT-STORYTELLING-AGENT framework:
 
 ```python
 from goat_storytelling_agent.story_processor.prompt_manager import generate_story
````
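For context on the endpoint setup the added lines describe, here is a minimal sketch of pointing the framework at a Text Generation Inference server. It uses TGI's documented `/generate` route; the `GENERATION_ENDPOINT` name in config.py and the `query_endpoint` helper are hypothetical illustrations, not the repository's actual config variables or API.

```python
# Sketch only: config.py in the framework is assumed to hold something like
#   GENERATION_ENDPOINT = "http://127.0.0.1:8080"   # your TGI server (hypothetical name)
# The framework would then send requests to that endpoint; a raw request looks like this.

import requests

def query_endpoint(prompt: str, endpoint: str = "http://127.0.0.1:8080") -> str:
    """POST a prompt to a Text Generation Inference /generate route and return the completion."""
    payload = {
        "inputs": prompt,
        "parameters": {"max_new_tokens": 512, "temperature": 0.9},
    }
    resp = requests.post(f"{endpoint}/generate", json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["generated_text"]

if __name__ == "__main__":
    print(query_endpoint("Write the opening scene of a treasure-hunt novel."))
```

TGI also exposes a `/generate_stream` route for token-by-token streaming, which can be convenient for long-form story generation.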