zerrin committed on
Commit
315d075
1 Parent(s): 1d483bb

Upload 5 files

Files changed (5)
  1. LEGAL +26 -0
  2. LICENSE +21 -0
  3. README.md +93 -15
  4. app.py +197 -151
  5. requirements.txt +4 -4
LEGAL ADDED
@@ -0,0 +1,26 @@
+ Legal Notice
+ Reverse Engineering Disclaimer
+
+ DeepInfra-Wrapper is a project developed for educational purposes and is not affiliated with or endorsed by DeepInfra or any associated entities. The use of this project is subject to the following legal notice.
+ Educational Purposes Only
+
+ DeepInfra-Wrapper was created solely for educational and research purposes to explore and understand the functionality of the DeepInfra API. The reverse-engineering efforts undertaken in this project are intended to enhance knowledge and skills related to software development and API interaction.
+ Not Authorized by DeepInfra
+
+ This project, including its codebase and associated documentation, is not authorized by DeepInfra or any affiliated parties. It is an independent initiative that aims to provide a learning resource for individuals interested in working with APIs and Flask applications.
+ Ethical Usage
+
+ Users are expected to adhere to ethical standards when using DeepInfra-Wrapper. The project should be utilized responsibly and in compliance with applicable laws and regulations. Any use of DeepInfra-Wrapper for malicious or unauthorized activities is strictly prohibited.
+ No Warranty
+
+ DeepInfra-Wrapper is distributed "as is" and without any warranty. The authors and contributors of this project make no representations or warranties regarding the accuracy, functionality, or reliability of the code. Users are solely responsible for their use of the project.
+ Legal Implications
+
+ Users of DeepInfra-Wrapper are responsible for ensuring that their use complies with all relevant laws and regulations. Any unauthorized or improper use of the project may result in legal consequences. The authors and contributors disclaim any liability arising from the use of DeepInfra-Wrapper.
+ Open Source License
+
+ DeepInfra-Wrapper is released under the MIT License. Users are encouraged to review and comply with the terms of this license.
+
+ By using DeepInfra-Wrapper, you acknowledge that you have read and understood this legal notice and agree to use the project responsibly and in accordance with applicable laws.
+
+ For any inquiries or concerns related to this legal notice, please contact the project maintainers.
LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2023 4sh
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
README.md CHANGED
@@ -1,20 +1,98 @@
- ---
- title: Test Space
- emoji: 🌍
- colorFrom: yellow
- colorTo: indigo
- sdk: docker
- pinned: false
- license: mit
- ---
-
- This is a templated Space for [Shiny for Python](https://shiny.rstudio.com/py/).
-
-
- To get started with a new app do the following:
-
- 1) Install Shiny with `pip install shiny`
- 2) Create a new app with `shiny create .`
- 3) Then run the app with `shiny run --reload`
-
- To learn more about this framework please see the [Documentation](https://shiny.rstudio.com/py/docs/overview.html).
+ # DeepInfra-Wrapper
+
+ DeepInfra-Wrapper is a Python Flask project that provides a convenient, free interface to the DeepInfra API through reverse engineering. It can be hosted as a local or global server, allowing users to interact with the DeepInfra chat completion models using plain Python requests.
+
+ ## Features
+
+ - **Local and Global Server**: Run a local server, or expose it globally through a Cloudflare tunnel.
+
+ - **Chat Completion**: Easily generate chat completions by sending messages to the DeepInfra API.
+
+ - **Model Selection**: Access a variety of models for different use cases.
+
+ - **Streaming Support**: Enable real-time streaming for dynamic chat interactions.
+
+ ## Getting Started
+
+ ### Prerequisites
+
+ - Python 3.9 or higher (`app.py` uses `str.removeprefix`)
+ - Flask
+ - Flask-CORS
+ - Flask-Cloudflared
+ - Requests
+ - Fake User Agent
+
+ ### Installation
+
+ 1. Clone the repository:
+
+ ```bash
+ git clone https://github.com/Recentaly/DeepInfra-Wrapper.git
+ ```
+
+ 2. Install dependencies:
+
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+ 3. Run the Flask application:
+
+ ```bash
+ python app.py
+ ```
+
+ ## Configuration
+
+ Adjust the settings in `assets/config.json` to customize your DeepInfra-Wrapper setup.
+
+ ```json
+ {
+     "use_global": true
+ }
+ ```
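+
+ In addition to `use_global`, `app.py` also expects a `use_addons` key and optionally reads `DEBUG`, `HOST`, and `PORT`. A fuller config might look like the sketch below; the `DEBUG`, `HOST`, and `PORT` values shown are simply the defaults the app falls back to, and the `use_addons` value is only an example:
+
+ ```json
+ {
+     "use_global": true,
+     "use_addons": false,
+     "DEBUG": false,
+     "HOST": "0.0.0.0",
+     "PORT": 5000
+ }
+ ```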
+
+ ## Usage
+
+ ### Chat Completion
+
+ Send a POST request to `/chat/completions` with the following JSON payload (messages must be in OpenAI format):
+
+ ```json
+ {
+     "messages": [{"role": "user", "content": "Hello, World!"}],
+     "model": "meta-llama/Llama-2-70b-chat-hf",
+     "max_tokens": 150,
+     "top_p": 1,
+     "stream": true
+ }
+ ```
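+
+ For illustration, a streaming client using the `requests` library might look like the sketch below. It assumes the server is running locally on the default port 5000 (swap in your Cloudflare tunnel URL when `use_global` is enabled); the chunk format mirrors the OpenAI-style deltas that `app.py` itself parses:
+
+ ```python
+ import json
+ import requests
+
+ # assumed local address; replace with your tunnel URL if you use the global server
+ BASE_URL = "http://127.0.0.1:5000"
+
+ payload = {
+     "messages": [{"role": "user", "content": "Hello, World!"}],
+     "model": "meta-llama/Llama-2-70b-chat-hf",
+     "max_tokens": 150,
+     "top_p": 1,
+     "stream": True,
+ }
+
+ # stream the response as server-sent events and print the text as it arrives
+ with requests.post(f"{BASE_URL}/chat/completions", json=payload, stream=True) as response:
+     for line in response.iter_lines():
+         # skip keep-alive blanks and the final [DONE] marker
+         if not line or line == b"data: [DONE]":
+             continue
+         try:
+             chunk = json.loads(line.decode("utf-8").removeprefix("data: "))
+             print(chunk["choices"][0]["delta"].get("content", ""), end="", flush=True)
+         except (ValueError, KeyError, IndexError):
+             # ignore chunks that are not OpenAI-style JSON deltas
+             continue
+ ```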
+
+ ### Get Models
+
+ Retrieve the available models by sending a GET request to `/models`.
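+
+ For example (again assuming a local server on port 5000), the model list can be fetched in a couple of lines:
+
+ ```python
+ import requests
+
+ # assumes the Flask app is running locally on the default port
+ models = requests.get("http://127.0.0.1:5000/models").json()
+ print(models)
+ ```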
+
+ ### Check API Status
+
+ Verify that the API is online by sending a GET request to the root route `/`.
+
+ ## Error Handling
+
+ The API handles errors such as forbidden requests gracefully and returns meaningful error messages.
+
+ ## Google Colab
+
+ The server can also be run from [this Google Colab notebook](https://colab.research.google.com/drive/15sQ6sjZJYUincL3otypxfH96aCfLQ7HZ?usp=sharing).
+
+ ## License
+
+ This project is licensed under the [MIT License](LICENSE).
+
+ ## Acknowledgments
+
+ - Special thanks to the DeepInfra team for providing the chat completion models.
+
+ ## Contact
+
+ For issues and inquiries, please open an [issue](https://github.com/Recentaly/DeepInfra-Wrapper/issues).
app.py CHANGED
@@ -1,151 +1,197 @@
- from pathlib import Path
- from typing import List, Dict, Tuple
- import matplotlib.colors as mpl_colors
-
- import pandas as pd
- import seaborn as sns
- import shinyswatch
-
- from shiny import App, Inputs, Outputs, Session, reactive, render, req, ui
-
- sns.set_theme()
-
- www_dir = Path(__file__).parent.resolve() / "www"
-
- df = pd.read_csv(Path(__file__).parent / "penguins.csv", na_values="NA")
- numeric_cols: List[str] = df.select_dtypes(include=["float64"]).columns.tolist()
- species: List[str] = df["Species"].unique().tolist()
- species.sort()
-
- app_ui = ui.page_fillable(
-     shinyswatch.theme.minty(),
-     ui.layout_sidebar(
-         ui.sidebar(
-             # Artwork by @allison_horst
-             ui.input_selectize(
-                 "xvar",
-                 "X variable",
-                 numeric_cols,
-                 selected="Bill Length (mm)",
-             ),
-             ui.input_selectize(
-                 "yvar",
-                 "Y variable",
-                 numeric_cols,
-                 selected="Bill Depth (mm)",
-             ),
-             ui.input_checkbox_group(
-                 "species", "Filter by species", species, selected=species
-             ),
-             ui.hr(),
-             ui.input_switch("by_species", "Show species", value=True),
-             ui.input_switch("show_margins", "Show marginal plots", value=True),
-         ),
-         ui.output_ui("value_boxes"),
-         ui.output_plot("scatter", fill=True),
-         ui.help_text(
-             "Artwork by ",
-             ui.a("@allison_horst", href="https://twitter.com/allison_horst"),
-             class_="text-end",
-         ),
-     ),
- )
-
-
- def server(input: Inputs, output: Outputs, session: Session):
-     @reactive.Calc
-     def filtered_df() -> pd.DataFrame:
-         """Returns a Pandas data frame that includes only the desired rows"""
-
-         # This calculation "req"uires that at least one species is selected
-         req(len(input.species()) > 0)
-
-         # Filter the rows so we only include the desired species
-         return df[df["Species"].isin(input.species())]
-
-     @output
-     @render.plot
-     def scatter():
-         """Generates a plot for Shiny to display to the user"""
-
-         # The plotting function to use depends on whether margins are desired
-         plotfunc = sns.jointplot if input.show_margins() else sns.scatterplot
-
-         plotfunc(
-             data=filtered_df(),
-             x=input.xvar(),
-             y=input.yvar(),
-             palette=palette,
-             hue="Species" if input.by_species() else None,
-             hue_order=species,
-             legend=False,
-         )
-
-     @output
-     @render.ui
-     def value_boxes():
-         df = filtered_df()
-
-         def penguin_value_box(title: str, count: int, bgcol: str, showcase_img: str):
-             return ui.value_box(
-                 title,
-                 count,
-                 {"class_": "pt-1 pb-0"},
-                 showcase=ui.fill.as_fill_item(
-                     ui.tags.img(
-                         {"style": "object-fit:contain;"},
-                         src=showcase_img,
-                     )
-                 ),
-                 theme_color=None,
-                 style=f"background-color: {bgcol};",
-             )
-
-         if not input.by_species():
-             return penguin_value_box(
-                 "Penguins",
-                 len(df.index),
-                 bg_palette["default"],
-                 # Artwork by @allison_horst
-                 showcase_img="penguins.png",
-             )
-
-         value_boxes = [
-             penguin_value_box(
-                 name,
-                 len(df[df["Species"] == name]),
-                 bg_palette[name],
-                 # Artwork by @allison_horst
-                 showcase_img=f"{name}.png",
-             )
-             for name in species
-             # Only include boxes for _selected_ species
-             if name in input.species()
-         ]
-
-         return ui.layout_column_wrap(*value_boxes, width = 1 / len(value_boxes))
-
-
- # "darkorange", "purple", "cyan4"
- colors = [[255, 140, 0], [160, 32, 240], [0, 139, 139]]
- colors = [(r / 255.0, g / 255.0, b / 255.0) for r, g, b in colors]
-
- palette: Dict[str, Tuple[float, float, float]] = {
-     "Adelie": colors[0],
-     "Chinstrap": colors[1],
-     "Gentoo": colors[2],
-     "default": sns.color_palette()[0],  # type: ignore
- }
-
- bg_palette = {}
- # Use `sns.set_style("whitegrid")` to help find approx alpha value
- for name, col in palette.items():
-     # Adjusted n_colors until `axe` accessibility did not complain about color contrast
-     bg_palette[name] = mpl_colors.to_hex(sns.light_palette(col, n_colors=7)[1])  # type: ignore
-
-
- app = App(
-     app_ui,
-     server,
-     static_assets=str(www_dir),
- )
+ # ---------------------------------------- IMPORTS ---------------------------------------- #
+
+ # import flask and flask_cors to host the api
+ from flask import Flask, request, jsonify, render_template
+ from flask_cors import CORS
+
+ # import the api class
+ from assets.source import api, non_streamed_format
+
+ # import addons
+ from assets.source.addons import *  # 'create_cloudflare_tunnel', 'translate' and 'message_translation' are used from the addons
+
+ # logging module for debugging
+ import logging
+
+ # json module to parse json
+ from json import loads
+
+ # ---------------------------------------- CONFIGURE LOCAL SERVER ---------------------------------------- #
+
+ # create flask app
+ app = Flask(__name__)
+ app.template_folder = "assets/templates"
+
+ # enable cors
+ CORS(app)
+
+ # ---------------------------------------- READ FROM CONFIG FILE ---------------------------------------- #
+ with open("assets/config.json", "r") as f:
+     config_file = loads(f.read())
+
+ # copy constants over
+ DEBUG: bool = config_file.get("DEBUG", False)
+ PORT: int = config_file.get("PORT", 5000)
+ HOST: str = config_file.get("HOST", "0.0.0.0")
+
+ # check if the user wants to use a global server too
+ if config_file["use_global"]:
+
+     # create a cloudflare tunnel
+     create_cloudflare_tunnel(PORT)
+
+ # ---------------------------------------- LOGGING CONFIG ---------------------------------------- #
+
+ # set logging level
+ logging.basicConfig(level=logging.DEBUG, format='%(asctime)s %(message)s')
+
+ # ---------------------------------------- ROUTES ---------------------------------------- #
+
+ # chat generation route
+ @app.route("/chat/completions", methods=["POST"])
+ def chat():
+
+     # get request data
+     data = request.get_json()
+
+     # get messages (run through the addon translation if enabled)
+     messages = message_translation(data["messages"]) if config_file["use_addons"] else data["messages"]
+
+     # get model
+     model = translate(data["model"]) if config_file["use_addons"] else data["model"]
+
+     # get max tokens
+     max_tokens = data.get("max_tokens", 150)
+
+     # top p and top k
+     top_p = data.get("top_p", 0.99)
+     top_k = data.get("top_k", 50)
+
+     # temperature
+     temperature = data.get("temperature", 0.6)
+
+     # frequency penalty
+     frequency_penalty = data.get("frequency_penalty", 1)
+
+     # presence penalty
+     presence_penalty = data.get("presence_penalty", 1)
+
+     # streaming function. uses text/event-stream instead of application/json
+     def stream():
+
+         # generate the chat completion chunk by chunk
+         for chunk in api.chat(
+             messages,
+             model,
+             stream=True,
+             max_tokens=max_tokens,
+             top_p=top_p,
+             temperature=temperature,
+             frequency_penalty=frequency_penalty,
+             presence_penalty=presence_penalty,
+             top_k=top_k,
+         ):
+
+             # yield each chunk as a server-sent event
+             #print(chunk)
+             yield chunk + b'\n\n'
+
+         # in the end, return done
+         yield b'data: [DONE]'
+
+     # check if the user wants to stream
+     if data.get("stream"):
+
+         # log
+         logging.info(f"Streaming requested for model {model}\n")
+
+         # return stream
+         return app.response_class(stream(), mimetype='text/event-stream')
+
+     # otherwise stream as well, but collect everything into one full string
+     else:
+
+         # log
+         logging.info(f"Non-streaming requested for model {model}\n")
+
+         # pre-init
+         full: str = ""
+
+         # generate the chat completion chunk by chunk
+         for chunk in api.chat(
+             messages,
+             model,
+             stream=True,
+             max_tokens=max_tokens,
+             top_p=top_p,
+             temperature=temperature,
+             frequency_penalty=frequency_penalty,
+             presence_penalty=presence_penalty,
+             top_k=top_k,
+         ):
+
+             try:
+
+                 # append the text content of this chunk
+                 full += loads(chunk.decode("utf-8").removeprefix('data: '))["choices"][0]["delta"]["content"]
+
+             except (KeyError, IndexError, ValueError): pass  # skip chunks without text content (e.g. keep-alives)
+
+         # return the full completion in a non-streamed (OpenAI-style) response
+         return jsonify(non_streamed_format(model, full))
+
+
+ # route to get all models
+ @app.route("/models", methods=["GET"])
+ def get_models():
+
+     # return models
+     return jsonify(api.get_models())
+
+ # root route to check if the api is online
+ @app.route("/", methods=["GET"])
+ def root():
+
+     # return the index page
+     return render_template("index.html")
+
+ # ---------------------------------------- ERROR HANDLING ---------------------------------------- #
+ @app.errorhandler(403)
+ def forbidden(error):
+
+     # return 403
+     return jsonify(
+         {"status": False},
+         {'error': [
+             {'message': 'Something went wrong, the API was blocked from sending a request to the DeepInfra API. Please try again later.'},
+             {'type': 'forbidden'},
+             {'error': f'{error}'}
+         ]},
+         {'hint': 'please report issues on the github page'}
+     ), 403
+
+ @app.errorhandler(500)
+ def internal_server_error(error):
+
+     # return 500
+     return jsonify(
+         {"status": False},
+         {'error': [
+             {'message': 'Something went wrong, the API was unable to complete your request. Please try again later.'},
+             {'type': 'internal server error'},
+             {'error': f'{error}'}
+         ]},
+         {'hint': 'please report issues on the github page'}
+     ), 500
+
+ # ---------------------------------------- START API ---------------------------------------- #
+
+ # start the api
+ if __name__ == "__main__":
+
+     app.run(debug=DEBUG, port=PORT, host=HOST)
+
+ # Path: app.py
requirements.txt CHANGED
@@ -1,4 +1,4 @@
- shiny==0.10.2
- shinyswatch==0.6.1
- seaborn==0.12.2
- matplotlib==3.7.1
+ flask
+ flask_cors
+ fake_useragent
+ requests