zerrin committed on
Commit 11086ea • 1 Parent(s): 4ab2d90

Update README.md

Files changed (1)
  1. README.md +15 -92
README.md CHANGED
@@ -1,98 +1,21 @@
- # DeepInfra-Wrapper
-
- DeepInfra-Wrapper is a Python Flask project designed to provide a convenient and free interface for utilizing the DeepInfra API through reverse-engineering. It serves as a local and global server host, allowing users to interact with the DeepInfra chat completion models using Python requests.
-
- ## Features
-
- - **Local and Global Server**: Choose between a local server or utilize a global server with Cloudflare integration for enhanced performance.
-
- - **Chat Completion**: Easily generate chat completions by sending messages to the DeepInfra API.
-
- - **Model Selection**: Access a variety of models for different use cases.
-
- - **Streaming Support**: Enable real-time streaming for dynamic chat interactions.
-
- ## Getting Started
-
- ### Prerequisites
-
- - Python 3.6 or higher
- - Flask
- - Flask-CORS
- - Flask-Cloudflared
- - Requests
- - Fake User Agent
-
- ### Installation
-
- 1. Clone the repository:
-
- ```bash
- git clone https://github.com/Recentaly/DeepInfra-Wrapper.git
- ```
-
- 2. Install dependencies:
-
- ```bash
- pip install -r requirements.txt
- ```
-
- 3. Run the Flask application:
-
- ```bash
- python app.py
- ```
-
- ## Configuration
-
- Adjust the configuration settings in the `assets/config.json` file to customize your DeepInfra-Wrapper experience.
-
- ```json
- {
-     "use_global": true
- }
- ```
-
- ## Usage
-
- ### Chat Completion
-
- Send a POST request to `/chat/completions` with the following JSON payload (messages must be in OpenAI format):
-
- ```json
- {
-     "messages": [{"role": "user", "content": "Hello, World!"}],
-     "model": "meta-llama/Llama-2-70b-chat-hf",
-     "max_tokens": 150,
-     "top_p": 1,
-     "stream": true
- }
- ```
-
- ### Get Models
-
- Retrieve the available models by sending a GET request to `/models`.
-
- ### Check API Status
-
- Verify the API status by accessing the root route `/`.
-
- ## Error Handling
-
- The API gracefully handles errors, such as forbidden requests, providing meaningful error messages.
-
- ## Google Colab
-
- The server is also usable on [This Google Colab Link](https://colab.research.google.com/drive/15sQ6sjZJYUincL3otypxfH96aCfLQ7HZ?usp=sharing)
-
- ## License
-
- This project is licensed under the [MIT License](LICENSE).
-
- ## Acknowledgments
-
- - Special thanks to the DeepInfra team for providing the chat completion models.
-
- ## Contact
-
- For issues and inquiries, please open an [issue](https://github.com/Recentaly/DeepInfra-Wrapper/issues).
+ ---
+ title: Test Space
+ emoji: 🌍
+ colorFrom: yellow
+ colorTo: indigo
+ sdk: docker
+ pinned: false
+ license: mit
+ ---
+
+ This is a templated Space for [Shiny for Python](https://shiny.rstudio.com/py/).
+
+
+
+ To get started with a new app do the following:
+
+ 1) Install Shiny with `pip install shiny`
+ 2) Create a new app with `shiny create .`
+ 3) Then run the app with `shiny run --reload`
+
+ To learn more about this framework please see the [Documentation](https://shiny.rstudio.com/py/docs/overview.html).