# OpenAI Assistant
The OpenAI Assistant class provides a wrapper around OpenAI's Assistants API, integrating it with the swarms framework.
## Overview
The `OpenAIAssistant` class allows you to create and interact with OpenAI Assistants, providing a simple interface for:
- Creating assistants with specific roles and capabilities
- Adding custom functions that the assistant can call
- Managing conversation threads
- Handling tool calls and function execution
- Getting responses from the assistant
## Installation
```bash
pip install swarms
```
## Basic Usage
```python
from swarms import OpenAIAssistant
# Create an assistant
assistant = OpenAIAssistant(
    name="Math Tutor",
    instructions="You are a helpful math tutor.",
    model="gpt-4o",
    tools=[{"type": "code_interpreter"}]
)

# Run a task
response = assistant.run("Solve the equation: 3x + 11 = 14")
print(response)

# Continue the conversation in the same thread
follow_up = assistant.run("Now explain how you solved it")
print(follow_up)
```
## Function Calling
The assistant supports custom function integration:
```python
def get_weather(location: str, unit: str = "celsius") -> str:
# Mock weather function
return f"The weather in {location} is 22 degrees {unit}"
# Add function to assistant
assistant.add_function(
description="Get the current weather in a location",
parameters={
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "City name"
},
"unit": {
"type": "string",
"enum": ["celsius", "fahrenheit"],
"default": "celsius"
}
},
"required": ["location"]
}
)
```
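Once the function is registered, the assistant can call it when a task requires weather information. A minimal usage sketch (the prompt and the returned wording are illustrative):
```python
# Ask a question that should trigger the custom function
response = assistant.run("What is the current weather in Paris?")
print(response)
```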
## API Reference
### Constructor
```python
OpenAIAssistant(
    name: str,
    instructions: Optional[str] = None,
    model: str = "gpt-4o",
    tools: Optional[List[Dict[str, Any]]] = None,
    file_ids: Optional[List[str]] = None,
    metadata: Optional[Dict[str, Any]] = None,
    functions: Optional[List[Dict[str, Any]]] = None,
)
```
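For example, an assistant can be created with pre-uploaded files and metadata attached (the file ID and metadata values below are placeholders):
```python
assistant = OpenAIAssistant(
    name="Research Helper",
    instructions="Answer questions using the attached documents.",
    model="gpt-4o",
    tools=[{"type": "code_interpreter"}],
    file_ids=["file-abc123"],      # placeholder ID from a prior file upload
    metadata={"project": "demo"},  # arbitrary key/value metadata
)
```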
### Methods
#### run(task: str) -> str
Sends a task to the assistant and returns its response. The conversation thread is maintained between calls.
#### add_function(func: Callable, description: str, parameters: Dict[str, Any]) -> None
Adds a callable function that the assistant can use during conversations.
#### add_message(content: str, file_ids: Optional[List[str]] = None) -> None
Adds a message to the current conversation thread.
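A brief sketch of adding context to the thread before requesting a response (the message content is illustrative):
```python
# Add extra context to the thread, then ask a question about it
assistant.add_message("Background: the dataset covers daily temperatures for 2023.")
response = assistant.run("Summarize the dataset in one sentence.")
print(response)
```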
## Error Handling
The assistant implements robust error handling:
- Retries on rate limits
- Graceful handling of API errors
- Clear error messages for debugging
- Status monitoring for runs and completions
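Retries on rate limits are handled inside the class; if you also want an application-level safety net, a simple wrapper like the one below can be layered on top (the backoff values are arbitrary and the exception handling is intentionally broad):
```python
import time

def run_with_retry(assistant, task: str, max_retries: int = 3, backoff: float = 2.0) -> str:
    """Call assistant.run with simple exponential backoff on failure."""
    for attempt in range(max_retries):
        try:
            return assistant.run(task)
        except Exception:  # narrow this to specific API errors in real code
            if attempt == max_retries - 1:
                raise
            time.sleep(backoff * (2 ** attempt))
```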
## Best Practices
1. Thread Management
    - Use the same assistant instance for related conversations
    - Create new instances for unrelated tasks
    - Monitor thread status during long-running operations
2. Function Integration
    - Keep functions simple and focused
    - Provide clear descriptions and parameter schemas
    - Handle errors gracefully in custom functions
    - Test functions independently before integration
3. Performance
    - Reuse assistant instances when possible
    - Monitor and handle rate limits appropriately
    - Use appropriate polling intervals for status checks
    - Consider implementing timeouts for long-running operations (see the sketch after this list)
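As a sketch of the last point, a timeout can be enforced around a blocking `run` call with the standard library (the 120-second limit is arbitrary):
```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def run_with_timeout(assistant, task: str, timeout_s: float = 120.0) -> str:
    """Run a task, raising if no response arrives within timeout_s seconds."""
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(assistant.run, task)
    try:
        return future.result(timeout=timeout_s)
    except TimeoutError:
        raise RuntimeError(f"Assistant did not respond within {timeout_s} seconds")
    finally:
        pool.shutdown(wait=False)  # do not block on the still-running background call
```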
## References
- [OpenAI Assistants API Documentation](https://platform.openai.com/docs/assistants/overview)
- [OpenAI Function Calling Guide](https://platform.openai.com/docs/guides/function-calling)
- [OpenAI Rate Limits](https://platform.openai.com/docs/guides/rate-limits)