.
diff --git a/g4f/.v1/README.md b/g4f/.v1/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..ce3ee7bea3622690fdc0437919f8c0c98f2db77d
--- /dev/null
+++ b/g4f/.v1/README.md
@@ -0,0 +1,255 @@
+**A major update is to come this week (statement written 14 Jun)**
+**You may check these out in the meanwhile**:
+
+- A community-made v2 prototype of gpt4free: https://gitler.moe/g4f/gpt4free
+- Discord bot with gpt-4 using poe.com: https://github.com/xtekky/gpt4free-discord
+
+______
+What can I do to contribute?
+You can reverse engineer a site from this list: [sites-to-reverse](https://github.com/xtekky/gpt4free/issues/40), and add it to [`./testing`](https://github.com/xtekky/gpt4free/tree/main/testing), or refactor it and add it to [`./gpt4free`](https://github.com/xtekky/gpt4free/tree/main/gpt4free).
+
+You may join our Discord (discord.gg/gpt4free) for further updates.
+
+
+
+
+## Legal Notice
+
+This repository is _not_ associated with or endorsed by providers of the APIs contained in this GitHub repository. This project is intended **for educational purposes only**. This is just a little personal project. Sites may contact me to improve their security or request the removal of their site from this repository.
+
+Please note the following:
+
+1. **Disclaimer**: The APIs, services, and trademarks mentioned in this repository belong to their respective owners. This project is _not_ claiming any right over them nor is it affiliated with or endorsed by any of the providers mentioned.
+
+2. **Responsibility**: The author of this repository is _not_ responsible for any consequences, damages, or losses arising from the use or misuse of this repository or the content provided by the third-party APIs. Users are solely responsible for their actions and any repercussions that may follow. We strongly recommend that users follow the TOS of each website.
+
+3. **Educational Purposes Only**: This repository and its content are provided strictly for educational purposes. By using the information and code provided, users acknowledge that they are using the APIs and models at their own risk and agree to comply with any applicable laws and regulations.
+
+4. **Indemnification**: Users agree to indemnify, defend, and hold harmless the author of this repository from and against any and all claims, liabilities, damages, losses, or expenses, including legal fees and costs, arising out of or in any way connected with their use or misuse of this repository, its content, or related third-party APIs.
+
+5. **Updates and Changes**: The author reserves the right to modify, update, or remove any content, information, or features in this repository at any time without prior notice. Users are responsible for regularly reviewing the content and any changes made to this repository.
+
+By using this repository or any code related to it, you agree to these terms. The author is not responsible for any copies, forks, or reuploads made by other users. This is the author's only account and repository. To prevent impersonation or irresponsible actions, please comply with the GNU GPL license this repository uses.
+
+
+
+
+Just APIs from some language model sites.
+
+
+# Related gpt4free projects
+
+
+
+
+## Table of Contents
+| Section | Description | Link | Status |
+| ------- | ----------- | ---- | ------ |
+| **To do list** | List of tasks to be done | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#todo) | - |
+| **Current Sites** | Current websites or platforms that can be used as APIs | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#current-sites) | - |
+| **Best Sites for gpt4** | Recommended websites or platforms for gpt4 | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#best-sites) | - |
+| **Streamlit GPT4Free GUI** | Web-based graphical user interface for interacting with gpt4free | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#streamlit-gpt4free-gui) | - |
+| **Docker** | Instructions on how to run gpt4free in a Docker container | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#docker-instructions) | - |
+| **ChatGPT clone** | A ChatGPT clone with new features and scalability | [![Link to Website](https://img.shields.io/badge/Link-Visit%20Site-blue)](https://chat.chatbot.sex/chat) | - |
+| **How to install** | Instructions on how to install gpt4free | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#install) | - |
+| **Usage Examples** | | | |
+| `theb` | Example usage for theb (gpt-3.5) | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/theb/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
+| `forefront` | Example usage for forefront (gpt-4) | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/forefront/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
+| `quora (poe)` | Example usage for quora | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/quora/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
+| `you` | Example usage for you | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/you/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
+| `deepai` | Example usage for DeepAI (gpt-3.5, with chat) | [![Link to File](https://img.shields.io/badge/Link-Go%20to%20File-blue)](gpt4free/deepai/README.md) | ![Active](https://img.shields.io/badge/Active-brightgreen) |
+| **Try it Out** | | | |
+| Google Colab Jupyter Notebook | Example usage for gpt4free | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/DanielShemesh/gpt4free-colab/blob/main/gpt4free.ipynb) | - |
+| replit Example (feel free to fork this repl) | Example usage for gpt4free | [![](https://img.shields.io/badge/Open%20in-Replit-1A1E27?logo=replit)](https://replit.com/@gpt4free/gpt4free-webui) | - |
+| **Legal Notice** | Legal notice or disclaimer | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#legal-notice) | - |
+| **Copyright** | Copyright information | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#copyright) | - |
+| **Star History** | Star History | [![Link to Section](https://img.shields.io/badge/Link-Go%20to%20Section-blue)](#star-history) | - |
+
+
+## To do list
+
+- [x] Add a GUI for the repo
+- [ ] Make a general package named `gpt4free`, instead of different folders
+- [ ] Live API status to know which providers are down and which can be used
+- [ ] Integrate more APIs from `./unfinished` as well as others from the lists
+- [ ] Make an API to use as a proxy for other projects
+- [ ] Make a pypi package
+
+## Current Sites
+
+| Website                                           | Model(s)                         |
+| ------------------------------------------------ | -------------------------------- |
+| [forefront.ai](https://chat.forefront.ai) | GPT-4/3.5 |
+| [poe.com](https://poe.com) | GPT-4/3.5 |
+| [writesonic.com](https://writesonic.com) | GPT-3.5 / Internet |
+| [t3nsor.com](https://t3nsor.com) | GPT-3.5 |
+| [you.com](https://you.com) | GPT-3.5 / Internet / good search |
+| [sqlchat.ai](https://sqlchat.ai) | GPT-3.5 |
+| [bard.google.com](https://bard.google.com) | custom / search |
+| [bing.com/chat](https://bing.com/chat) | GPT-4/3.5 |
+| [italygpt.it](https://italygpt.it) | GPT-3.5 |
+| [deepai.org](https://deepai.org/chat) | GPT-3.5 / chat support |
+
+
+## Best sites
+
+#### gpt-4
+
+- [`/forefront`](gpt4free/forefront/README.md)
+
+#### gpt-3.5
+
+- [`/you`](gpt4free/you/README.md)
+
+## Install
+
+Download or clone this GitHub repo and install the requirements with:
+
+```sh
+python3 -m venv venv
+. venv/bin/activate
+pip3 install -r requirements.txt
+```
+
+## Install ffmpeg
+```sh
+sudo apt-get install ffmpeg
+```
+
+## Connect a VPN if needed and get the proxy address (Optional)
+```sh
+echo "$http_proxy" # http://127.0.0.1:8889/
+```
+
+## Set proxy in gpt4free/you/__init__.py (Optional)
+```
+diff --git a/gpt4free/you/__init__.py b/gpt4free/you/__init__.py
+index 11847fb..59d1162 100644
+--- a/gpt4free/you/__init__.py
++++ b/gpt4free/you/__init__.py
+@@ -38,6 +38,7 @@ class Completion:
+ if chat is None:
+ chat = []
+
++ proxy = '127.0.0.1:8889'
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else {}
+
+ client = Session(client_identifier='chrome_108')
+```
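+
+Alternatively, because the hunk above shows that `Completion.create` already reads a `proxy` variable, you can usually pass the proxy at call time instead of patching the module. A minimal sketch, assuming your copy of `you.Completion.create` exposes a `proxy` keyword argument (the address below is just an example):
+
+```python
+from gpt4free import you
+
+# example local proxy address; replace with the value printed by `echo "$http_proxy"`
+response = you.Completion.create(prompt='hello world', proxy='127.0.0.1:8889')
+print(response.text)
+```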
+
+
+## To start gpt4free GUI
+
+##### Note: the Streamlit app collects heavy analytics even when running locally. This includes events for every page load and form submission, including metadata on queries (such as length), plus browser and client information including host IPs. All of this is transmitted to a third-party analytics service, Segment.com.
+
+Copy `streamlit_app.py` from `./gui` to the base folder, then run
+`streamlit run streamlit_app.py` or `python3 -m streamlit run streamlit_app.py`:
+
+```sh
+cp gui/streamlit_app.py .
+streamlit run streamlit_app.py
+```
+
+
+## Docker
+
+Build
+
+```
+docker build -t gpt4free:latest .
+```
+
+Run
+
+```
+docker run -p 8501:8501 gpt4free:latest
+```
+
+## Deploy using docker-compose
+
+Run the following:
+
+```
+docker-compose up --build -d
+```
+
+## ChatGPT clone
+
+> Currently implementing new features and trying to scale it; please be patient, as it may be unstable.
+> https://chat.g4f.ai/chat
+> This site was developed by me and includes **gpt-4/3.5**, **internet access** and **GPT jailbreaks** like DAN.
+> Run it locally here: https://github.com/xtekky/chatgpt-clone
+
+## Copyright:
+
+This program is licensed under the [GNU GPL v3](https://www.gnu.org/licenses/gpl-3.0.txt)
+
+Most code, with the exception of `quora/api.py` and `deepai/__init__.py` (by [ading2210](https://github.com/ading2210)), has been written by me, [xtekky](https://github.com/xtekky).
+
+### Copyright Notice:
+
+```
+xtekky/gpt4free: multiple reverse engineered language-model api's to decentralise the ai industry.
+Copyright (C) 2023 xtekky
+
+This program is free software: you can redistribute it and/or modify
+it under the terms of the GNU General Public License as published by
+the Free Software Foundation, either version 3 of the License, or
+(at your option) any later version.
+
+This program is distributed in the hope that it will be useful,
+but WITHOUT ANY WARRANTY; without even the implied warranty of
+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+GNU General Public License for more details.
+
+You should have received a copy of the GNU General Public License
+along with this program. If not, see <https://www.gnu.org/licenses/>.
+```
+
+
+## Star History
+
+
+
+
diff --git a/g4f/.v1/SECURITY.md b/g4f/.v1/SECURITY.md
new file mode 100644
index 0000000000000000000000000000000000000000..cbc69677a0ec6b0192f1bd61f3eccb7723f8827b
--- /dev/null
+++ b/g4f/.v1/SECURITY.md
@@ -0,0 +1,4 @@
+## Reporting a Vulnerability
+
+Please report (suspected) security vulnerabilities to https://t.me/xtekky. You will receive a response within 48 hours. If the issue is confirmed, we will release a patch as soon as possible, depending on complexity, but historically within a few days.
diff --git a/g4f/.v1/Singularity/gpt4free.sif b/g4f/.v1/Singularity/gpt4free.sif
new file mode 100644
index 0000000000000000000000000000000000000000..67bc12410080017fbf1679c3bada765cf2fd0c7d
--- /dev/null
+++ b/g4f/.v1/Singularity/gpt4free.sif
@@ -0,0 +1,15 @@
+Bootstrap: docker
+From: python:3.10-slim
+
+%post
+ apt-get update && apt-get install -y git
+ git clone https://github.com/xtekky/gpt4free.git
+ cd gpt4free
+ pip install --no-cache-dir -r requirements.txt
+ cp gui/streamlit_app.py .
+
+%expose
+ 8501
+
+%startscript
+ exec streamlit run streamlit_app.py
diff --git a/g4f/.v1/docker-compose.yaml b/g4f/.v1/docker-compose.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..8098f359cc8545671d755b5a5464876801cbe630
--- /dev/null
+++ b/g4f/.v1/docker-compose.yaml
@@ -0,0 +1,15 @@
+version: "3.9"
+
+services:
+ gpt4free:
+ build:
+ context: ./
+ dockerfile: Dockerfile
+ container_name: dc_gpt4free
+ # environment:
+ # - http_proxy=http://127.0.0.1:1080 # modify this for your proxy
+ # - https_proxy=http://127.0.0.1:1080 # modify this for your proxy
+ image: img_gpt4free
+ ports:
+ - 8501:8501
+ restart: always
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/README.md b/g4f/.v1/gpt4free/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..73e7fa09f1502c9a79f5324cabb51128cad13fbc
--- /dev/null
+++ b/g4f/.v1/gpt4free/README.md
@@ -0,0 +1,110 @@
+# gpt4free package
+
+### What is it?
+
+gpt4free is a Python package that provides free access to several language-model APIs.
+
+### Main Features
+
+- It's free to use
+- Easy access
+
+### Installation:
+
+```bash
+pip install gpt4free
+```
+
+#### Usage:
+
+```python
+import gpt4free
+from gpt4free import Provider, quora, forefront
+
+# usage You
+response = gpt4free.Completion.create(Provider.You, prompt='Write a poem on Lionel Messi')
+print(response)
+
+# usage Poe
+token = quora.Account.create(logging=False)
+response = gpt4free.Completion.create(Provider.Poe, prompt='Write a poem on Lionel Messi', token=token, model='ChatGPT')
+print(response)
+
+# usage forefront
+token = forefront.Account.create(logging=False)
+response = gpt4free.Completion.create(
+ Provider.ForeFront, prompt='Write a poem on Lionel Messi', model='gpt-4', token=token
+)
+print(response)
+print(f'END')
+
+# usage theb
+response = gpt4free.Completion.create(Provider.Theb, prompt='Write a poem on Lionel Messi')
+print(response)
+
+
+```
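+
+The package also exposes a `ChatCompletion` wrapper for providers that accept a full chat history in the OpenAI message format; in `gpt4free/__init__.py` only DeepAI is currently wired up. A minimal sketch:
+
+```python
+import gpt4free
+from gpt4free import Provider
+
+messages = [
+    {'role': 'system', 'content': 'You are a helpful assistant.'},
+    {'role': 'user', 'content': 'Write a poem on Lionel Messi'},
+]
+
+response = gpt4free.ChatCompletion.create(Provider.DeepAI, messages=messages)
+print(response)
+```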
+
+### Invocation Arguments
+
+The `gpt4free.Completion.create()` method has two required arguments:
+
+1. `provider`: an enum member selecting which provider to invoke
+2. `prompt`: the user input
+
+#### Keyword Arguments
+
+Some of the keyword arguments are optional, while others are required; a short usage sketch follows the list below.
+
+- You:
+ - `safe_search`: boolean - default value is `False`
+ - `include_links`: boolean - default value is `False`
+ - `detailed`: boolean - default value is `False`
+- Quora:
+ - `token`: str - this needs to be provided by the user
+ - `model`: str - default value is `gpt-4`.
+
+ (Available models: `['Sage', 'GPT-4', 'Claude+', 'Claude-instant', 'ChatGPT', 'Dragonfly', 'NeevaAI']`)
+- ForeFront:
+  - `token`: str - this needs to be provided by the user
+
+- Theb:
+ (no keyword arguments required)
+
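+A short sketch of how these keyword arguments are passed (the flag values are illustrative):
+
+```python
+import gpt4free
+from gpt4free import Provider, quora
+
+# You: optional boolean flags
+response = gpt4free.Completion.create(
+    Provider.You, prompt='Write a poem on Lionel Messi', safe_search=True, include_links=True, detailed=False
+)
+print(response)
+
+# Quora (Poe): token is required, model is optional
+token = quora.Account.create(logging=False)
+response = gpt4free.Completion.create(Provider.Poe, prompt='Write a poem on Lionel Messi', token=token, model='ChatGPT')
+print(response)
+```
+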
+#### Token generation of quora
+```python
+from gpt4free import quora
+
+token = quora.Account.create(logging=False)
+```
+
+#### Token generation of ForeFront
+```python
+from gpt4free import forefront
+
+token = forefront.Account.create(logging=False)
+```
+
+## Copyright:
+
+This program is licensed under the [GNU GPL v3](https://www.gnu.org/licenses/gpl-3.0.txt)
+
+### Copyright Notice:
+
+```
+xtekky/gpt4free: multiple reverse engineered language-model api's to decentralise the ai industry.
+Copyright (C) 2023 xtekky
+
+This program is free software: you can redistribute it and/or modify
+it under the terms of the GNU General Public License as published by
+the Free Software Foundation, either version 3 of the License, or
+(at your option) any later version.
+
+This program is distributed in the hope that it will be useful,
+but WITHOUT ANY WARRANTY; without even the implied warranty of
+MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+GNU General Public License for more details.
+
+You should have received a copy of the GNU General Public License
+along with this program. If not, see <https://www.gnu.org/licenses/>.
+```
diff --git a/g4f/.v1/gpt4free/__init__.py b/g4f/.v1/gpt4free/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..bcc03a3b93386abe1089181bb979de6b350dd554
--- /dev/null
+++ b/g4f/.v1/gpt4free/__init__.py
@@ -0,0 +1,103 @@
+from enum import Enum
+
+from gpt4free import forefront
+from gpt4free import quora
+from gpt4free import theb
+from gpt4free import usesless
+from gpt4free import you
+from gpt4free import aicolors
+from gpt4free import deepai
+
+
+class Provider(Enum):
+ """An enum representing different providers."""
+
+ You = "you"
+ Poe = "poe"
+ ForeFront = "fore_front"
+ Theb = "theb"
+ UseLess = "useless"
+ AiColors = "ai_colors"
+ DeepAI = "deepai"
+
+
+class Completion:
+ """This class will be used for invoking the given provider"""
+
+ @staticmethod
+ def create(provider: Provider, prompt: str, **kwargs) -> str:
+ """
+        Invokes the given provider with the given prompt and additional arguments and returns the string response
+
+ :param provider: an enum representing the provider to use while invoking
+ :param prompt: input provided by the user
+ :param kwargs: Additional keyword arguments to pass to the provider while invoking
+ :return: A string representing the response from the provider
+ """
+ if provider == Provider.Poe:
+ return Completion.__poe_service(prompt, **kwargs)
+ elif provider == Provider.You:
+ return Completion.__you_service(prompt, **kwargs)
+ elif provider == Provider.ForeFront:
+ return Completion.__fore_front_service(prompt, **kwargs)
+ elif provider == Provider.Theb:
+ return Completion.__theb_service(prompt, **kwargs)
+ elif provider == Provider.UseLess:
+ return Completion.__useless_service(prompt, **kwargs)
+ elif provider == Provider.AiColors:
+ return Completion.__ai_colors_service(prompt, **kwargs)
+ elif provider == Provider.DeepAI:
+ return Completion.__deepai_service(prompt, **kwargs)
+ else:
+ raise Exception("Provider not exist, Please try again")
+
+ @staticmethod
+ def __ai_colors_service(prompt: str):
+ return aicolors.Completion.create(prompt=prompt)
+
+ @staticmethod
+ def __useless_service(prompt: str, **kwargs) -> str:
+ return usesless.Completion.create(prompt=prompt, **kwargs)
+
+ @staticmethod
+ def __you_service(prompt: str, **kwargs) -> str:
+ return you.Completion.create(prompt, **kwargs).text
+
+ @staticmethod
+ def __poe_service(prompt: str, **kwargs) -> str:
+ return quora.Completion.create(prompt=prompt, **kwargs).text
+
+ @staticmethod
+ def __fore_front_service(prompt: str, **kwargs) -> str:
+ return forefront.Completion.create(prompt=prompt, **kwargs).text
+
+ @staticmethod
+ def __theb_service(prompt: str, **kwargs):
+ return "".join(theb.Completion.create(prompt=prompt))
+
+ @staticmethod
+ def __deepai_service(prompt: str, **kwargs):
+ return "".join(deepai.Completion.create(prompt=prompt))
+
+
+class ChatCompletion:
+ """This class is used to execute a chat completion for a specified provider"""
+
+ @staticmethod
+ def create(provider: Provider, messages: list, **kwargs) -> str:
+ """
+        Invokes the given provider with the given chat messages and additional arguments and returns the string response
+
+ :param provider: an enum representing the provider to use while invoking
+ :param messages: a list of chat messages, see the OpenAI docs for how to format this (https://platform.openai.com/docs/guides/chat/introduction)
+ :param kwargs: Additional keyword arguments to pass to the provider while invoking
+ :return: A string representing the response from the provider
+ """
+ if provider == Provider.DeepAI:
+ return ChatCompletion.__deepai_service(messages, **kwargs)
+ else:
+ raise Exception("Provider not exist, Please try again")
+
+ @staticmethod
+ def __deepai_service(messages: list, **kwargs):
+ return "".join(deepai.ChatCompletion.create(messages=messages))
diff --git a/g4f/.v1/gpt4free/aiassist/README.md b/g4f/.v1/gpt4free/aiassist/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..b61017841d3c52b8cd079e638b1fa35264aa15af
--- /dev/null
+++ b/g4f/.v1/gpt4free/aiassist/README.md
@@ -0,0 +1,19 @@
+aiassist.site
+
+### Example: `aiassist`
+
+```python
+from gpt4free import aiassist
+
+question1 = "Who won the world series in 2020?"
+req = aiassist.Completion.create(prompt=question1)
+answer = req["text"]
+message_id = req["parentMessageId"]
+
+question2 = "Where was it played?"
+req2 = aiassist.Completion.create(prompt=question2, parentMessageId=message_id)
+answer2 = req2["text"]
+
+print(answer)
+print(answer2)
+```
diff --git a/g4f/.v1/gpt4free/aiassist/__init__.py b/g4f/.v1/gpt4free/aiassist/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..95a9f08b259bbcabf7512dda0fe633d96686fd4d
--- /dev/null
+++ b/g4f/.v1/gpt4free/aiassist/__init__.py
@@ -0,0 +1,36 @@
+import urllib.request
+import json
+
+
+class Completion:
+ @staticmethod
+ def create(
+ systemMessage: str = "You are a helpful assistant",
+ prompt: str = "",
+ parentMessageId: str = "",
+ temperature: float = 0.8,
+ top_p: float = 1,
+ ):
+ json_data = {
+ "prompt": prompt,
+ "options": {"parentMessageId": parentMessageId},
+ "systemMessage": systemMessage,
+ "temperature": temperature,
+ "top_p": top_p,
+ }
+
+ url = "http://43.153.7.56:8080/api/chat-process"
+ headers = {"Content-type": "application/json"}
+
+ data = json.dumps(json_data).encode("utf-8")
+ req = urllib.request.Request(url, data=data, headers=headers)
+ response = urllib.request.urlopen(req)
+ content = response.read().decode()
+
+ return Completion.__load_json(content)
+
+ @classmethod
+ def __load_json(cls, content) -> dict:
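+        # The endpoint returns a newline-separated stream of partial JSON objects;
+        # only the last line (the complete response) is parsed here.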
+ split = content.rsplit("\n", 1)[1]
+ to_json = json.loads(split)
+ return to_json
diff --git a/g4f/.v1/gpt4free/aicolors/__init__.py b/g4f/.v1/gpt4free/aicolors/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..a69276b81076c8a25c30ed9c8ab45e09fb20aabf
--- /dev/null
+++ b/g4f/.v1/gpt4free/aicolors/__init__.py
@@ -0,0 +1,30 @@
+import fake_useragent
+import requests
+import json
+from .typings import AiColorsResponse
+
+
+class Completion:
+ @staticmethod
+ def create(
+ query: str = "",
+ ) -> AiColorsResponse:
+ headers = {
+ "authority": "jsuifmbqefnxytqwmaoy.functions.supabase.co",
+ "accept": "*/*",
+ "accept-language": "en-US,en;q=0.5",
+ "cache-control": "no-cache",
+ "sec-fetch-dest": "empty",
+ "sec-fetch-mode": "cors",
+ "sec-fetch-site": "same-origin",
+ "user-agent": fake_useragent.UserAgent().random,
+ }
+
+ json_data = {"query": query}
+
+ url = "https://jsuifmbqefnxytqwmaoy.functions.supabase.co/chatgpt"
+ request = requests.post(url, headers=headers, json=json_data, timeout=30)
+ data = request.json().get("text").get("content")
+ json_data = json.loads(data.replace("\n ", ""))
+
+ return AiColorsResponse(**json_data)
diff --git a/g4f/.v1/gpt4free/aicolors/typings/__init__.py b/g4f/.v1/gpt4free/aicolors/typings/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..8c4f29d147b0e09cbde3a369a78256e78a97c22e
--- /dev/null
+++ b/g4f/.v1/gpt4free/aicolors/typings/__init__.py
@@ -0,0 +1,9 @@
+from dataclasses import dataclass
+
+
+@dataclass
+class AiColorsResponse:
+ background: str
+ primary: str
+ accent: str
+ text: str
diff --git a/g4f/.v1/gpt4free/deepai/README.md b/g4f/.v1/gpt4free/deepai/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a287cdb7a4a3a52a17adb6d0a01e064fa21b3b54
--- /dev/null
+++ b/g4f/.v1/gpt4free/deepai/README.md
@@ -0,0 +1,26 @@
+# DeepAI Wrapper
+Written by [ading2210](https://github.com/ading2210/).
+
+## Examples:
+These functions are generators which yield strings containing the newly generated text.
+
+### Completion:
+```python
+from gpt4free import deepai
+
+for chunk in deepai.Completion.create("Who are you?"):
+ print(chunk, end="", flush=True)
+print()
+```
+
+### Chat Completion:
+Use the same format for the messages as you would for the [official OpenAI API](https://platform.openai.com/docs/guides/chat/introduction).
+```python
+messages = [
+ {"role": "system", "content": "You are a helpful assistant."},
+ {"role": "user", "content": "Who won the world series in 2020?"},
+ {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
+ {"role": "user", "content": "Where was it played?"}
+]
+for chunk in deepai.ChatCompletion.create(messages):
+ print(chunk, end="", flush=True)
+print()
+```
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/deepai/__init__.py b/g4f/.v1/gpt4free/deepai/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..a2fc6f5af4a85304b0e23ceb07bfe844fc907f23
--- /dev/null
+++ b/g4f/.v1/gpt4free/deepai/__init__.py
@@ -0,0 +1,46 @@
+import requests
+import json
+import hashlib
+import random
+import string
+from fake_useragent import UserAgent
+
+class ChatCompletion:
+ @classmethod
+ def md5(self, text):
+ return hashlib.md5(text.encode()).hexdigest()[::-1]
+
+ @classmethod
+ def get_api_key(self, user_agent):
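+        # DeepAI's guest "tryit" keys are generated client-side: a random numeric part
+        # plus a nested MD5 of the user agent mixed with that number (digest reversed by md5()).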
+ part1 = str(random.randint(0, 10**11))
+ part2 = self.md5(user_agent+self.md5(user_agent+self.md5(user_agent+part1+"x")))
+ return f"tryit-{part1}-{part2}"
+
+ @classmethod
+ def create(self, messages):
+ user_agent = UserAgent().random
+ api_key = self.get_api_key(user_agent)
+ headers = {
+ "api-key": api_key,
+ "user-agent": user_agent
+ }
+ files = {
+ "chat_style": (None, "chat"),
+ "chatHistory": (None, json.dumps(messages))
+ }
+
+ r = requests.post("https://api.deepai.org/chat_response", headers=headers, files=files, stream=True)
+
+ for chunk in r.iter_content(chunk_size=None):
+ r.raise_for_status()
+ yield chunk.decode()
+
+class Completion:
+ @classmethod
+ def create(self, prompt):
+ return ChatCompletion.create([
+ {
+ "role": "user",
+ "content": prompt
+ }
+ ])
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/forefront/README.md b/g4f/.v1/gpt4free/forefront/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..7a59fe8e44f90c44854278d5fd673e726684bfce
--- /dev/null
+++ b/g4f/.v1/gpt4free/forefront/README.md
@@ -0,0 +1,19 @@
+### Example: `forefront` (use like openai pypi package)
+
+```python
+from gpt4free import forefront
+
+
+# create an account
+account_data = forefront.Account.create(logging=False)
+
+# get a response
+for response in forefront.StreamingCompletion.create(
+ account_data=account_data,
+ prompt='hello world',
+ model='gpt-4'
+):
+ print(response.choices[0].text, end='')
+print("")
+
+```
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/forefront/__init__.py b/g4f/.v1/gpt4free/forefront/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..240ee0a46c05ca39133cfe71f4d5f55013a18961
--- /dev/null
+++ b/g4f/.v1/gpt4free/forefront/__init__.py
@@ -0,0 +1,214 @@
+import hashlib
+from base64 import b64encode
+from json import loads
+from re import findall
+from time import time, sleep
+from typing import Generator, Optional
+from uuid import uuid4
+
+from Crypto.Cipher import AES
+from Crypto.Random import get_random_bytes
+from fake_useragent import UserAgent
+from mailgw_temporary_email import Email
+from requests import post
+from tls_client import Session
+
+from .typing import ForeFrontResponse, AccountData
+
+
+class Account:
+ @staticmethod
+ def create(proxy: Optional[str] = None, logging: bool = False) -> AccountData:
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else False
+
+ start = time()
+
+ mail_client = Email()
+ mail_client.register()
+ mail_address = mail_client.address
+
+ client = Session(client_identifier='chrome110')
+ client.proxies = proxies
+ client.headers = {
+ 'origin': 'https://accounts.forefront.ai',
+ 'user-agent': UserAgent().random,
+ }
+
+ response = client.post(
+ 'https://clerk.forefront.ai/v1/client/sign_ups?_clerk_js_version=4.38.4',
+ data={'email_address': mail_address},
+ )
+
+ try:
+ trace_token = response.json()['response']['id']
+ if logging:
+ print(trace_token)
+ except KeyError:
+ raise RuntimeError('Failed to create account!')
+
+ response = client.post(
+ f'https://clerk.forefront.ai/v1/client/sign_ups/{trace_token}/prepare_verification?_clerk_js_version=4.38.4',
+ data={
+ 'strategy': 'email_link',
+ 'redirect_url': 'https://accounts.forefront.ai/sign-up/verify'
+ },
+ )
+
+ if logging:
+ print(response.text)
+
+ if 'sign_up_attempt' not in response.text:
+ raise RuntimeError('Failed to create account!')
+
+ while True:
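+            # Poll the disposable inbox until the Clerk verification mail arrives,
+            # then pull the sign-up confirmation link out of its body.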
+ sleep(5)
+ message_id = mail_client.message_list()[0]['id']
+ message = mail_client.message(message_id)
+ verification_url = findall(r'https:\/\/clerk\.forefront\.ai\/v1\/verify\?token=\w.+', message["text"])[0]
+ if verification_url:
+ break
+
+ if logging:
+ print(verification_url)
+ client.get(verification_url)
+
+ response = client.get('https://clerk.forefront.ai/v1/client?_clerk_js_version=4.38.4').json()
+ session_data = response['response']['sessions'][0]
+
+ user_id = session_data['user']['id']
+ session_id = session_data['id']
+ token = session_data['last_active_token']['jwt']
+
+ with open('accounts.txt', 'a') as f:
+ f.write(f'{mail_address}:{token}\n')
+
+ if logging:
+ print(time() - start)
+
+ return AccountData(token=token, user_id=user_id, session_id=session_id)
+
+
+class StreamingCompletion:
+ @staticmethod
+ def create(
+ prompt: str,
+ account_data: AccountData,
+ chat_id=None,
+ action_type='new',
+ default_persona='607e41fe-95be-497e-8e97-010a59b2e2c0', # default
+ model='gpt-4',
+ proxy=None
+ ) -> Generator[ForeFrontResponse, None, None]:
+ token = account_data.token
+ if not chat_id:
+ chat_id = str(uuid4())
+
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else None
+ base64_data = b64encode((account_data.user_id + default_persona + chat_id).encode()).decode()
+ encrypted_signature = StreamingCompletion.__encrypt(base64_data, account_data.session_id)
+
+ headers = {
+ 'authority': 'chat-server.tenant-forefront-default.knative.chi.coreweave.com',
+ 'accept': '*/*',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'origin': 'https://chat.forefront.ai',
+ 'pragma': 'no-cache',
+ 'referer': 'https://chat.forefront.ai/',
+ 'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"macOS"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'cross-site',
+ 'authorization': f"Bearer {token}",
+ 'X-Signature': encrypted_signature,
+ 'user-agent': UserAgent().random,
+ }
+
+ json_data = {
+ 'text': prompt,
+ 'action': action_type,
+ 'parentId': chat_id,
+ 'workspaceId': chat_id,
+ 'messagePersona': default_persona,
+ 'model': model,
+ }
+
+ for chunk in post(
+ 'https://streaming.tenant-forefront-default.knative.chi.coreweave.com/chat',
+ headers=headers,
+ proxies=proxies,
+ json=json_data,
+ stream=True,
+ ).iter_lines():
+ if b'finish_reason":null' in chunk:
+ data = loads(chunk.decode('utf-8').split('data: ')[1])
+ token = data['choices'][0]['delta'].get('content')
+
+ if token is not None:
+ yield ForeFrontResponse(
+ **{
+ 'id': chat_id,
+ 'object': 'text_completion',
+ 'created': int(time()),
+ 'text': token,
+ 'model': model,
+ 'choices': [{'text': token, 'index': 0, 'logprobs': None, 'finish_reason': 'stop'}],
+ 'usage': {
+ 'prompt_tokens': len(prompt),
+ 'completion_tokens': len(token),
+ 'total_tokens': len(prompt) + len(token),
+ },
+ }
+ )
+
+ @staticmethod
+ def __encrypt(data: str, key: str) -> str:
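+        # X-Signature scheme: AES-256-CBC keyed with SHA-256(session_id); the random IV
+        # is prepended to the ciphertext and everything is hex-encoded.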
+ hash_key = hashlib.sha256(key.encode()).digest()
+ iv = get_random_bytes(16)
+ cipher = AES.new(hash_key, AES.MODE_CBC, iv)
+ encrypted_data = cipher.encrypt(StreamingCompletion.__pad_data(data.encode()))
+ return iv.hex() + encrypted_data.hex()
+
+ @staticmethod
+ def __pad_data(data: bytes) -> bytes:
+ block_size = AES.block_size
+ padding_size = block_size - len(data) % block_size
+ padding = bytes([padding_size] * padding_size)
+ return data + padding
+
+
+class Completion:
+ @staticmethod
+ def create(
+ prompt: str,
+ account_data: AccountData,
+ chat_id=None,
+ action_type='new',
+ default_persona='607e41fe-95be-497e-8e97-010a59b2e2c0', # default
+ model='gpt-4',
+ proxy=None
+ ) -> ForeFrontResponse:
+ text = ''
+ final_response = None
+ for response in StreamingCompletion.create(
+ account_data=account_data,
+ chat_id=chat_id,
+ prompt=prompt,
+ action_type=action_type,
+ default_persona=default_persona,
+ model=model,
+ proxy=proxy
+ ):
+ if response:
+ final_response = response
+ text += response.text
+
+ if final_response:
+ final_response.text = text
+ else:
+ raise RuntimeError('Unable to get the response, Please try again')
+
+ return final_response
diff --git a/g4f/.v1/gpt4free/forefront/typing.py b/g4f/.v1/gpt4free/forefront/typing.py
new file mode 100644
index 0000000000000000000000000000000000000000..b572e2c252db380effc5863015ed78d9479a5bb4
--- /dev/null
+++ b/g4f/.v1/gpt4free/forefront/typing.py
@@ -0,0 +1,32 @@
+from typing import Any, List
+
+from pydantic import BaseModel
+
+
+class Choice(BaseModel):
+ text: str
+ index: int
+ logprobs: Any
+ finish_reason: str
+
+
+class Usage(BaseModel):
+ prompt_tokens: int
+ completion_tokens: int
+ total_tokens: int
+
+
+class ForeFrontResponse(BaseModel):
+ id: str
+ object: str
+ created: int
+ model: str
+ choices: List[Choice]
+ usage: Usage
+ text: str
+
+
+class AccountData(BaseModel):
+ token: str
+ user_id: str
+ session_id: str
diff --git a/g4f/.v1/gpt4free/gptworldAi/README.md b/g4f/.v1/gpt4free/gptworldAi/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a6b07f86e5752a5420f7b163694f49ade95cb743
--- /dev/null
+++ b/g4f/.v1/gpt4free/gptworldAi/README.md
@@ -0,0 +1,25 @@
+# gptworldAi
+Written by [hp_mzx](https://github.com/hpsj).
+
+## Examples:
+### Completion:
+```python
+from gpt4free import gptworldAi
+
+for chunk in gptworldAi.Completion.create("Who are you?", "127.0.0.1:7890"):
+ print(chunk, end="", flush=True)
+ print()
+```
+
+### Chat Completion:
+Support context
+```python
+message = []
+while True:
+    prompt = input("Enter your question: ")
+ message.append({"role": "user","content": prompt})
+ text = ""
+ for chunk in gptworldAi.ChatCompletion.create(message,'127.0.0.1:7890'):
+ text = text+chunk
+ print(chunk, end="", flush=True)
+ print()
+ message.append({"role": "assistant", "content": text})
+```
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/gptworldAi/__init__.py b/g4f/.v1/gpt4free/gptworldAi/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e7f76c61209fabf224698949764155ac53cc7a6b
--- /dev/null
+++ b/g4f/.v1/gpt4free/gptworldAi/__init__.py
@@ -0,0 +1,105 @@
+# -*- coding: utf-8 -*-
+"""
+@Time : 2023/5/23 13:37
+@Auth : Hp_mzx
+@File :__init__.py.py
+@IDE :PyCharm
+"""
+import json
+import uuid
+import random
+import binascii
+import requests
+import Crypto.Cipher.AES as AES
+from fake_useragent import UserAgent
+
+class ChatCompletion:
+ @staticmethod
+    def create(messages: list, proxy: str = None):
+ url = "https://chat.getgpt.world/api/chat/stream"
+ headers = {
+ "Content-Type": "application/json",
+ "Referer": "https://chat.getgpt.world/",
+ 'user-agent': UserAgent().random,
+ }
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else None
+ data = json.dumps({
+ "messages": messages,
+ "frequency_penalty": 0,
+ "max_tokens": 4000,
+ "model": "gpt-3.5-turbo",
+ "presence_penalty": 0,
+ "temperature": 1,
+ "top_p": 1,
+ "stream": True,
+ "uuid": str(uuid.uuid4())
+ })
+ signature = ChatCompletion.encrypt(data)
+ res = requests.post(url, headers=headers, data=json.dumps({"signature": signature}), proxies=proxies,stream=True)
+ for chunk in res.iter_content(chunk_size=None):
+ res.raise_for_status()
+ datas = chunk.decode('utf-8').split('data: ')
+ for data in datas:
+ if not data or "[DONE]" in data:
+ continue
+ data_json = json.loads(data)
+ content = data_json['choices'][0]['delta'].get('content')
+ if content:
+ yield content
+
+
+ @staticmethod
+ def random_token(e):
+ token = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789"
+ n = len(token)
+ return "".join([token[random.randint(0, n - 1)] for i in range(e)])
+
+ @staticmethod
+ def encrypt(e):
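+        # The request "signature" is the AES-CBC ciphertext (hex) of the JSON payload,
+        # with the random 16-char key and IV simply appended in plain text.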
+ t = ChatCompletion.random_token(16).encode('utf-8')
+ n = ChatCompletion.random_token(16).encode('utf-8')
+ r = e.encode('utf-8')
+ cipher = AES.new(t, AES.MODE_CBC, n)
+ ciphertext = cipher.encrypt(ChatCompletion.__pad_data(r))
+ return binascii.hexlify(ciphertext).decode('utf-8') + t.decode('utf-8') + n.decode('utf-8')
+
+ @staticmethod
+ def __pad_data(data: bytes) -> bytes:
+ block_size = AES.block_size
+ padding_size = block_size - len(data) % block_size
+ padding = bytes([padding_size] * padding_size)
+ return data + padding
+
+
+class Completion:
+ @staticmethod
+    def create(prompt: str, proxy: str = None):
+ return ChatCompletion.create([
+ {
+ "content": "You are ChatGPT, a large language model trained by OpenAI.\nCarefully heed the user's instructions. \nRespond using Markdown.",
+ "role": "system"
+ },
+ {"role": "user", "content": prompt}
+ ], proxy)
+
+
+if __name__ == '__main__':
+ # single completion
+ text = ""
+    for chunk in Completion.create("Who are you?", "127.0.0.1:7890"):
+ text = text + chunk
+ print(chunk, end="", flush=True)
+ print()
+
+
+ #chat completion
+ message = []
+ while True:
+        prompt = input("Enter your question: ")
+ message.append({"role": "user","content": prompt})
+ text = ""
+ for chunk in ChatCompletion.create(message,'127.0.0.1:7890'):
+ text = text+chunk
+ print(chunk, end="", flush=True)
+ print()
+ message.append({"role": "assistant", "content": text})
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/hpgptai/README.md b/g4f/.v1/gpt4free/hpgptai/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..2735902ffdf18106f8620dae7a30dd1d25bcf304
--- /dev/null
+++ b/g4f/.v1/gpt4free/hpgptai/README.md
@@ -0,0 +1,39 @@
+# HpgptAI
+Written by [hp_mzx](https://github.com/hpsj).
+
+## Examples:
+### Completion:
+```python
+from gpt4free import hpgptai
+
+res = hpgptai.Completion.create("Who are you?", "127.0.0.1:7890")
+print(res["reply"])
+```
+
+### Chat Completion:
+Support context
+```python
+messages = [
+ {
+ "content": "你是谁",
+ "html": "你是谁",
+ "id": hpgptai.ChatCompletion.randomStr(),
+ "role": "user",
+ "who": "User: ",
+ },
+ {
+ "content": "我是一位AI助手,专门为您提供各种服务和支持。我可以回答您的问题,帮助您解决问题,提供相关信息,并执行一些任务。请随时告诉我您需要什么帮助。",
+ "html": "我是一位AI助手,专门为您提供各种服务和支持。我可以回答您的问题,帮助您解决问题,提供相关信息,并执行一些任务。请随时告诉我您需要什么帮助。",
+ "id": hpgptai.ChatCompletion.randomStr(),
+ "role": "assistant",
+ "who": "AI: ",
+ },
+ {
+ "content": "我上一句问的是什么?",
+ "html": "我上一句问的是什么?",
+ "id": hpgptai.ChatCompletion.randomStr(),
+ "role": "user",
+ "who": "User: ",
+ },
+]
+res = hpgptai.ChatCompletion.create(messages,proxy="127.0.0.1:7890")
+print(res["reply"])
+```
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/hpgptai/__init__.py b/g4f/.v1/gpt4free/hpgptai/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..f5d1f0edc379b4a4bd0c18dc8665531c1e22fe91
--- /dev/null
+++ b/g4f/.v1/gpt4free/hpgptai/__init__.py
@@ -0,0 +1,103 @@
+# -*- coding: utf-8 -*-
+"""
+@Time : 2023/5/22 14:04
+@Auth : Hp_mzx
+@File :__init__.py.py
+@IDE :PyCharm
+"""
+import re
+import json
+import base64
+import random
+import string
+import requests
+from fake_useragent import UserAgent
+
+
+class ChatCompletion:
+ @staticmethod
+ def create(
+ messages: list,
+ context: str = "Converse as if you were an AI assistant. Be friendly, creative.",
+ restNonce: str = None,
+ proxy: str = None
+ ):
+ url = "https://chatgptlogin.ac/wp-json/ai-chatbot/v1/chat"
+ if not restNonce:
+ restNonce = ChatCompletion.get_restNonce(proxy)
+ headers = {
+ "Content-Type": "application/json",
+ "X-Wp-Nonce": restNonce
+ }
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else None
+ data = {
+ "env": "chatbot",
+ "session": "N/A",
+ "prompt": ChatCompletion.__build_prompt(context, messages),
+ "context": context,
+ "messages": messages,
+ "newMessage": messages[-1]["content"],
+ "userName": "User:
",
+ "aiName": "AI:
",
+ "model": "gpt-3.5-turbo",
+ "temperature": 0.8,
+ "maxTokens": 1024,
+ "maxResults": 1,
+ "apiKey": "",
+ "service": "openai",
+ "embeddingsIndex": "",
+ "stop": "",
+ "clientId": ChatCompletion.randomStr(),
+ }
+ res = requests.post(url=url, data=json.dumps(data), headers=headers, proxies=proxies)
+ if res.status_code == 200:
+ return res.json()
+ return res.text
+
+ @staticmethod
+ def randomStr():
+ return ''.join(random.choices(string.ascii_lowercase + string.digits, k=34))[:11]
+
+ @classmethod
+ def __build_prompt(cls, context: str, message: list, isCasuallyFineTuned=False, last=15):
+ prompt = context + '\n\n' if context else ''
+ message = message[-last:]
+ if isCasuallyFineTuned:
+ lastLine = message[-1]
+            prompt = lastLine["content"] + ""
+ return prompt
+ conversation = [x["who"] + x["content"] for x in message]
+ prompt += '\n'.join(conversation)
+ prompt += '\n' + "AI: "
+ return prompt
+
+ @classmethod
+ def get_restNonce(cls, proxy: str = None):
+ url = "https://chatgptlogin.ac/"
+ headers = {
+ "Referer": "https://chatgptlogin.ac/",
+ "User-Agent": UserAgent().random
+ }
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else None
+ res = requests.get(url, headers=headers, proxies=proxies)
+ src = re.search(
+ 'class="mwai-chat mwai-chatgpt">.*Send '
+
+
+def extract_formkey(html):
+    script_regex = r'<script>if\(.+\)throw new Error;(.+)</script>'
+    script_text = search(script_regex, html).group(1)
+ key_regex = r'var .="([0-9a-f]+)",'
+ key_text = search(key_regex, script_text).group(1)
+ cipher_regex = r'.\[(\d+)\]=.\[(\d+)\]'
+ cipher_pairs = findall(cipher_regex, script_text)
+
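+    # The page's obfuscated script scatters the formkey characters; each cipher pair
+    # records which character of the hex key string lands at which formkey position.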
+ formkey_list = [''] * len(cipher_pairs)
+ for pair in cipher_pairs:
+ formkey_index, key_index = map(int, pair)
+ formkey_list[formkey_index] = key_text[key_index]
+ formkey = ''.join(formkey_list)
+
+ return formkey
+
+
+class Choice(BaseModel):
+ text: str
+ index: int
+ logprobs: Any
+ finish_reason: str
+
+
+class Usage(BaseModel):
+ prompt_tokens: int
+ completion_tokens: int
+ total_tokens: int
+
+
+class PoeResponse(BaseModel):
+ id: int
+ object: str
+ created: int
+ model: str
+ choices: List[Choice]
+ usage: Usage
+ text: str
+
+
+class ModelResponse:
+ def __init__(self, json_response: dict) -> None:
+ self.id = json_response['data']['poeBotCreate']['bot']['id']
+ self.name = json_response['data']['poeBotCreate']['bot']['displayName']
+ self.limit = json_response['data']['poeBotCreate']['bot']['messageLimit']['dailyLimit']
+ self.deleted = json_response['data']['poeBotCreate']['bot']['deletionState']
+
+
+class Model:
+ @staticmethod
+ def create(
+ token: str,
+ model: str = 'gpt-3.5-turbo', # claude-instant
+        system_prompt: str = 'You are ChatGPT a large language model. Answer as concisely as possible',
+ description: str = 'gpt-3.5 language model',
+ handle: str = None,
+ ) -> ModelResponse:
+ if not handle:
+ handle = f'gptx{randint(1111111, 9999999)}'
+
+ client = Session()
+ client.cookies['p-b'] = token
+
+ formkey = extract_formkey(client.get('https://poe.com').text)
+ settings = client.get('https://poe.com/api/settings').json()
+
+ client.headers = {
+ 'host': 'poe.com',
+ 'origin': 'https://poe.com',
+ 'referer': 'https://poe.com/',
+ 'poe-formkey': formkey,
+ 'poe-tchannel': settings['tchannelData']['channel'],
+ 'user-agent': UserAgent().random,
+ 'connection': 'keep-alive',
+ 'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"macOS"',
+ 'content-type': 'application/json',
+ 'sec-fetch-site': 'same-origin',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-dest': 'empty',
+ 'accept': '*/*',
+ 'accept-encoding': 'gzip, deflate, br',
+ 'accept-language': 'en-GB,en-US;q=0.9,en;q=0.8',
+ }
+
+ payload = dumps(
+ separators=(',', ':'),
+ obj={
+ 'queryName': 'CreateBotMain_poeBotCreate_Mutation',
+ 'variables': {
+ 'model': MODELS[model],
+ 'handle': handle,
+ 'prompt': system_prompt,
+ 'isPromptPublic': True,
+ 'introduction': '',
+ 'description': description,
+ 'profilePictureUrl': 'https://qph.fs.quoracdn.net/main-qimg-24e0b480dcd946e1cc6728802c5128b6',
+ 'apiUrl': None,
+ 'apiKey': ''.join(choices(ascii_letters + digits, k=32)),
+ 'isApiBot': False,
+ 'hasLinkification': False,
+ 'hasMarkdownRendering': False,
+ 'hasSuggestedReplies': False,
+ 'isPrivateBot': False,
+ },
+ 'query': 'mutation CreateBotMain_poeBotCreate_Mutation(\n $model: String!\n $handle: String!\n $prompt: String!\n $isPromptPublic: Boolean!\n $introduction: String!\n $description: String!\n $profilePictureUrl: String\n $apiUrl: String\n $apiKey: String\n $isApiBot: Boolean\n $hasLinkification: Boolean\n $hasMarkdownRendering: Boolean\n $hasSuggestedReplies: Boolean\n $isPrivateBot: Boolean\n) {\n poeBotCreate(model: $model, handle: $handle, promptPlaintext: $prompt, isPromptPublic: $isPromptPublic, introduction: $introduction, description: $description, profilePicture: $profilePictureUrl, apiUrl: $apiUrl, apiKey: $apiKey, isApiBot: $isApiBot, hasLinkification: $hasLinkification, hasMarkdownRendering: $hasMarkdownRendering, hasSuggestedReplies: $hasSuggestedReplies, isPrivateBot: $isPrivateBot) {\n status\n bot {\n id\n ...BotHeader_bot\n }\n }\n}\n\nfragment BotHeader_bot on Bot {\n displayName\n messageLimit {\n dailyLimit\n }\n ...BotImage_bot\n ...BotLink_bot\n ...IdAnnotation_node\n ...botHelpers_useViewerCanAccessPrivateBot\n ...botHelpers_useDeletion_bot\n}\n\nfragment BotImage_bot on Bot {\n displayName\n ...botHelpers_useDeletion_bot\n ...BotImage_useProfileImage_bot\n}\n\nfragment BotImage_useProfileImage_bot on Bot {\n image {\n __typename\n ... on LocalBotImage {\n localName\n }\n ... on UrlBotImage {\n url\n }\n }\n ...botHelpers_useDeletion_bot\n}\n\nfragment BotLink_bot on Bot {\n displayName\n}\n\nfragment IdAnnotation_node on Node {\n __isNode: __typename\n id\n}\n\nfragment botHelpers_useDeletion_bot on Bot {\n deletionState\n}\n\nfragment botHelpers_useViewerCanAccessPrivateBot on Bot {\n isPrivateBot\n viewerIsCreator\n}\n',
+ },
+ )
+
+ base_string = payload + client.headers['poe-formkey'] + 'WpuLMiXEKKE98j56k'
+ client.headers['poe-tag-id'] = md5(base_string.encode()).hexdigest()
+
+ response = client.post('https://poe.com/api/gql_POST', data=payload)
+
+ if 'success' not in response.text:
+ raise Exception(
+ '''
+ Bot creation Failed
+ !! Important !!
+ Bot creation was not enabled on this account
+ please use: quora.Account.create with enable_bot_creation set to True
+ '''
+ )
+
+ return ModelResponse(response.json())
+
+
+class Account:
+ @staticmethod
+ def create(
+ proxy: Optional[str] = None,
+ logging: bool = False,
+ enable_bot_creation: bool = False,
+ ):
+ client = TLS(client_identifier='chrome110')
+ client.proxies = {'http': f'http://{proxy}', 'https': f'http://{proxy}'} if proxy else {}
+
+ mail_client = Emailnator()
+ mail_address = mail_client.get_mail()
+
+ if logging:
+ print('email', mail_address)
+
+ client.headers = {
+ 'authority': 'poe.com',
+ 'accept': '*/*',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'content-type': 'application/json',
+ 'origin': 'https://poe.com',
+ 'poe-tag-id': 'null',
+ 'referer': 'https://poe.com/login',
+ 'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"macOS"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
+ 'poe-formkey': extract_formkey(client.get('https://poe.com/login').text),
+ 'poe-tchannel': client.get('https://poe.com/api/settings').json()['tchannelData']['channel'],
+ }
+
+ token = reCaptchaV3(
+ 'https://www.recaptcha.net/recaptcha/enterprise/anchor?ar=1&k=6LflhEElAAAAAI_ewVwRWI9hsyV4mbZnYAslSvlG&co=aHR0cHM6Ly9wb2UuY29tOjQ0Mw..&hl=en&v=4PnKmGB9wRHh1i04o7YUICeI&size=invisible&cb=bi6ivxoskyal'
+ )
+ # token = solver.recaptcha(sitekey='6LflhEElAAAAAI_ewVwRWI9hsyV4mbZnYAslSvlG',
+ # url = 'https://poe.com/login?redirect_url=%2F',
+ # version = 'v3',
+ # enterprise = 1,
+ # invisible = 1,
+ # action = 'login',)['code']
+
+ payload = dumps(
+ separators=(',', ':'),
+ obj={
+ 'queryName': 'MainSignupLoginSection_sendVerificationCodeMutation_Mutation',
+ 'variables': {
+ 'emailAddress': mail_address,
+ 'phoneNumber': None,
+ 'recaptchaToken': token,
+ },
+ 'query': 'mutation MainSignupLoginSection_sendVerificationCodeMutation_Mutation(\n $emailAddress: String\n $phoneNumber: String\n $recaptchaToken: String\n) {\n sendVerificationCode(verificationReason: login, emailAddress: $emailAddress, phoneNumber: $phoneNumber, recaptchaToken: $recaptchaToken) {\n status\n errorMessage\n }\n}\n',
+ },
+ )
+
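+        # Poe verifies a request signature: poe-tag-id must be the MD5 of the exact
+        # payload string concatenated with the formkey and a hard-coded salt.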
+ base_string = payload + client.headers['poe-formkey'] + 'WpuLMiXEKKE98j56k'
+ client.headers['poe-tag-id'] = md5(base_string.encode()).hexdigest()
+
+ print(dumps(client.headers, indent=4))
+
+ response = client.post('https://poe.com/api/gql_POST', data=payload)
+
+ if 'automated_request_detected' in response.text:
+ print('please try using a proxy / wait for fix')
+
+ if 'Bad Request' in response.text:
+ if logging:
+ print('bad request, retrying...', response.json())
+ quit()
+
+ if logging:
+ print('send_code', response.json())
+
+ mail_content = mail_client.get_message()
+ mail_token = findall(r';">(\d{6,7})', mail_content)[0]
+
+ if logging:
+ print('code', mail_token)
+
+ payload = dumps(
+ separators=(',', ':'),
+ obj={
+ 'queryName': 'SignupOrLoginWithCodeSection_signupWithVerificationCodeMutation_Mutation',
+ 'variables': {
+ 'verificationCode': str(mail_token),
+ 'emailAddress': mail_address,
+ 'phoneNumber': None,
+ },
+ 'query': 'mutation SignupOrLoginWithCodeSection_signupWithVerificationCodeMutation_Mutation(\n $verificationCode: String!\n $emailAddress: String\n $phoneNumber: String\n) {\n signupWithVerificationCode(verificationCode: $verificationCode, emailAddress: $emailAddress, phoneNumber: $phoneNumber) {\n status\n errorMessage\n }\n}\n',
+ },
+ )
+
+ base_string = payload + client.headers['poe-formkey'] + 'WpuLMiXEKKE98j56k'
+ client.headers['poe-tag-id'] = md5(base_string.encode()).hexdigest()
+
+ response = client.post('https://poe.com/api/gql_POST', data=payload)
+ if logging:
+ print('verify_code', response.json())
+
+ def get(self):
+ cookies = open(Path(__file__).resolve().parent / 'cookies.txt', 'r').read().splitlines()
+ return choice(cookies)
+
+ @staticmethod
+ def delete(token: str, proxy: Optional[str] = None):
+ client = PoeClient(token, proxy=proxy)
+ client.delete_account()
+
+
+class StreamingCompletion:
+ @staticmethod
+ def create(
+ model: str = 'gpt-4',
+ custom_model: bool = None,
+ prompt: str = 'hello world',
+ token: str = '',
+ proxy: Optional[str] = None,
+ ) -> Generator[PoeResponse, None, None]:
+ _model = MODELS[model] if not custom_model else custom_model
+
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else False
+ client = PoeClient(token)
+ client.proxy = proxies
+
+ for chunk in client.send_message(_model, prompt):
+ yield PoeResponse(
+ **{
+ 'id': chunk['messageId'],
+ 'object': 'text_completion',
+ 'created': chunk['creationTime'],
+ 'model': _model,
+ 'text': chunk['text_new'],
+ 'choices': [
+ {
+ 'text': chunk['text_new'],
+ 'index': 0,
+ 'logprobs': None,
+ 'finish_reason': 'stop',
+ }
+ ],
+ 'usage': {
+ 'prompt_tokens': len(prompt),
+ 'completion_tokens': len(chunk['text_new']),
+ 'total_tokens': len(prompt) + len(chunk['text_new']),
+ },
+ }
+ )
+
+
+class Completion:
+ @staticmethod
+ def create(
+ model: str = 'gpt-4',
+ custom_model: str = None,
+ prompt: str = 'hello world',
+ token: str = '',
+ proxy: Optional[str] = None,
+ ) -> PoeResponse:
+ _model = MODELS[model] if not custom_model else custom_model
+
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else False
+ client = PoeClient(token)
+ client.proxy = proxies
+
+ chunk = None
+ for response in client.send_message(_model, prompt):
+ chunk = response
+
+ return PoeResponse(
+ **{
+ 'id': chunk['messageId'],
+ 'object': 'text_completion',
+ 'created': chunk['creationTime'],
+ 'model': _model,
+ 'text': chunk['text'],
+ 'choices': [
+ {
+ 'text': chunk['text'],
+ 'index': 0,
+ 'logprobs': None,
+ 'finish_reason': 'stop',
+ }
+ ],
+ 'usage': {
+ 'prompt_tokens': len(prompt),
+ 'completion_tokens': len(chunk['text']),
+ 'total_tokens': len(prompt) + len(chunk['text']),
+ },
+ }
+ )
+
+
+class Poe:
+ def __init__(
+ self,
+ model: str = 'ChatGPT',
+ driver: str = 'firefox',
+ download_driver: bool = False,
+ driver_path: Optional[str] = None,
+ cookie_path: str = './quora/cookie.json',
+ ):
+ # validating the model
+ if model and model not in MODELS:
+ raise RuntimeError('Sorry, the model you provided does not exist. Please check and try again.')
+ self.model = MODELS[model]
+ self.cookie_path = cookie_path
+ self.cookie = self.__load_cookie(driver, driver_path=driver_path)
+ self.client = PoeClient(self.cookie)
+
+ def __load_cookie(self, driver: str, driver_path: Optional[str] = None) -> str:
+ if (cookie_file := Path(self.cookie_path)).exists():
+ with cookie_file.open() as fp:
+ cookie = json.load(fp)
+ if datetime.fromtimestamp(cookie['expiry']) < datetime.now():
+ cookie = self.__register_and_get_cookie(driver, driver_path=driver_path)
+ else:
+ print('Loading the cookie from file')
+ else:
+ cookie = self.__register_and_get_cookie(driver, driver_path=driver_path)
+
+ return unquote(cookie['value'])
+
+ def __register_and_get_cookie(self, driver: str, driver_path: Optional[str] = None) -> dict:
+ mail_client = Emailnator()
+ mail_address = mail_client.get_mail()
+
+ driver = self.__resolve_driver(driver, driver_path=driver_path)
+ driver.get("https://www.poe.com")
+
+ # clicking use email button
+ driver.find_element(By.XPATH, '//button[contains(text(), "Use email")]').click()
+
+ email = WebDriverWait(driver, 30).until(EC.presence_of_element_located((By.XPATH, '//input[@type="email"]')))
+ email.send_keys(mail_address)
+ driver.find_element(By.XPATH, '//button[text()="Go"]').click()
+
+ code = findall(r';">(\d{6,7})', mail_client.get_message())[0]
+ print(code)
+
+ verification_code = WebDriverWait(driver, 30).until(
+ EC.presence_of_element_located((By.XPATH, '//input[@placeholder="Code"]'))
+ )
+ verification_code.send_keys(code)
+ verify_button = EC.presence_of_element_located((By.XPATH, '//button[text()="Verify"]'))
+ login_button = EC.presence_of_element_located((By.XPATH, '//button[text()="Log In"]'))
+
+ WebDriverWait(driver, 30).until(EC.any_of(verify_button, login_button)).click()
+
+ cookie = driver.get_cookie('p-b')
+
+ with open(self.cookie_path, 'w') as fw:
+ json.dump(cookie, fw)
+
+ driver.close()
+ return cookie
+
+ @staticmethod
+ def __resolve_driver(driver: str, driver_path: Optional[str] = None) -> Union[Firefox, Chrome]:
+ options = FirefoxOptions() if driver == 'firefox' else ChromeOptions()
+ options.add_argument('-headless')
+
+ if driver_path:
+ options.binary_location = driver_path
+ try:
+ return Firefox(options=options) if driver == 'firefox' else Chrome(options=options)
+ except Exception:
+ raise Exception(SELENIUM_WEB_DRIVER_ERROR_MSG)
+
+ def chat(self, message: str, model: Optional[str] = None) -> str:
+ if model and model not in MODELS:
+ raise RuntimeError('Sorry, the model you provided does not exist. Please check and try again.')
+ model = MODELS[model] if model else self.model
+ response = None
+ for chunk in self.client.send_message(model, message):
+ response = chunk['text']
+ return response
+
+ def create_bot(self, name: str, /, prompt: str = '', base_model: str = 'ChatGPT', description: str = '') -> None:
+ if base_model not in MODELS:
+ raise RuntimeError('Sorry, the base_model you provided does not exist. Please check and try again.')
+
+ response = self.client.create_bot(
+ handle=name,
+ prompt=prompt,
+ base_model=MODELS[base_model],
+ description=description,
+ )
+ print(f'Successfully created bot with name: {response["bot"]["displayName"]}')
+
+ def list_bots(self) -> list:
+ return list(self.client.bot_names.values())
+
+ def delete_account(self) -> None:
+ self.client.delete_account()
diff --git a/g4f/.v1/gpt4free/quora/api.py b/g4f/.v1/gpt4free/quora/api.py
new file mode 100644
index 0000000000000000000000000000000000000000..6402148940d9486c3a95365fee681ad08ae9134f
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/api.py
@@ -0,0 +1,558 @@
+# This file was taken from the repository poe-api https://github.com/ading2210/poe-api and is unmodified
+# This file is licensed under the GNU GPL v3 and written by @ading2210
+
+# license:
+# ading2210/poe-api: a reverse engineered Python API wrapper for Quora's Poe
+# Copyright (C) 2023 ading2210
+
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <https://www.gnu.org/licenses/>.
+
+import hashlib
+import json
+import logging
+import queue
+import random
+import re
+import threading
+import time
+import traceback
+from pathlib import Path
+from urllib.parse import urlparse
+
+import requests
+import requests.adapters
+import websocket
+
+parent_path = Path(__file__).resolve().parent
+queries_path = parent_path / "graphql"
+queries = {}
+
+logging.basicConfig()
+logger = logging.getLogger()
+
+user_agent = "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0"
+
+
+def load_queries():
+ for path in queries_path.iterdir():
+ if path.suffix != ".graphql":
+ continue
+ with open(path) as f:
+ queries[path.stem] = f.read()
+
+
+def generate_payload(query_name, variables):
+ return {"query": queries[query_name], "variables": variables}
+
+
+def retry_request(method, *args, **kwargs):
+ """Retry a request with 10 attempts by default, delay increases exponentially"""
+ max_attempts: int = kwargs.pop("max_attempts", 10)
+ delay = kwargs.pop("delay", 1)
+ url = args[0]
+
+ for attempt in range(1, max_attempts + 1):
+ try:
+ response = method(*args, **kwargs)
+ response.raise_for_status()
+ return response
+ except Exception as error:
+ logger.warning(
+ f"Attempt {attempt}/{max_attempts} failed with error: {error}. "
+ f"Retrying in {delay} seconds..."
+ )
+ time.sleep(delay)
+ delay *= 2
+ raise RuntimeError(f"Failed to download {url} after {max_attempts} attempts.")
+
+
+class Client:
+ gql_url = "https://poe.com/api/gql_POST"
+ gql_recv_url = "https://poe.com/api/receive_POST"
+ home_url = "https://poe.com"
+ settings_url = "https://poe.com/api/settings"
+
+ def __init__(self, token, proxy=None):
+ self.proxy = proxy
+ self.session = requests.Session()
+ self.adapter = requests.adapters.HTTPAdapter(pool_connections=100, pool_maxsize=100)
+ self.session.mount("http://", self.adapter)
+ self.session.mount("https://", self.adapter)
+
+ if proxy:
+ self.session.proxies = {"http": self.proxy, "https": self.proxy}
+ logger.info(f"Proxy enabled: {self.proxy}")
+
+ self.active_messages = {}
+ self.message_queues = {}
+
+ self.session.cookies.set("p-b", token, domain="poe.com")
+ self.headers = {
+ "User-Agent": user_agent,
+ "Referrer": "https://poe.com/",
+ "Origin": "https://poe.com",
+ }
+ self.session.headers.update(self.headers)
+
+ self.setup_connection()
+ self.connect_ws()
+
+ def setup_connection(self):
+        self.ws_domain = f"tch{random.randint(1, 1_000_000)}"  # randint requires integer bounds on newer Python versions
+ self.next_data = self.get_next_data(overwrite_vars=True)
+ self.channel = self.get_channel_data()
+ self.bots = self.get_bots(download_next_data=False)
+ self.bot_names = self.get_bot_names()
+
+ self.gql_headers = {
+ "poe-formkey": self.formkey,
+ "poe-tchannel": self.channel["channel"],
+ }
+ self.gql_headers = {**self.gql_headers, **self.headers}
+ self.subscribe()
+
+ def extract_formkey(self, html):
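+        # Poe obfuscates the formkey in an inline script: a hex string plus a list of
+        # index assignments that map its characters into the final key. The regexes
+        # below pull out both pieces and replay that mapping.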
+        script_regex = r'<script>if\(.+\)throw new Error;(.+)</script>'  # inline script carrying the obfuscated formkey
+ script_text = re.search(script_regex, html).group(1)
+ key_regex = r'var .="([0-9a-f]+)",'
+ key_text = re.search(key_regex, script_text).group(1)
+ cipher_regex = r".\[(\d+)\]=.\[(\d+)\]"
+ cipher_pairs = re.findall(cipher_regex, script_text)
+
+ formkey_list = [""] * len(cipher_pairs)
+ for pair in cipher_pairs:
+ formkey_index, key_index = map(int, pair)
+ formkey_list[formkey_index] = key_text[key_index]
+ formkey = "".join(formkey_list)
+
+ return formkey
+
+ def get_next_data(self, overwrite_vars=False):
+ logger.info("Downloading next_data...")
+
+ r = retry_request(self.session.get, self.home_url)
+        json_regex = r'<script id="__NEXT_DATA__" type="application\/json">(.+?)</script>'
+ json_text = re.search(json_regex, r.text).group(1)
+ next_data = json.loads(json_text)
+
+ if overwrite_vars:
+ self.formkey = self.extract_formkey(r.text)
+ self.viewer = next_data["props"]["pageProps"]["payload"]["viewer"]
+ self.next_data = next_data
+
+ return next_data
+
+ def get_bot(self, display_name):
+ url = f'https://poe.com/_next/data/{self.next_data["buildId"]}/{display_name}.json'
+
+ r = retry_request(self.session.get, url)
+
+ chat_data = r.json()["pageProps"]["payload"]["chatOfBotDisplayName"]
+ return chat_data
+
+ def get_bots(self, download_next_data=True):
+ logger.info("Downloading all bots...")
+ if download_next_data:
+ next_data = self.get_next_data(overwrite_vars=True)
+ else:
+ next_data = self.next_data
+
+        if "viewerBotList" not in self.viewer:
+ raise RuntimeError("Invalid token or no bots are available.")
+ bot_list = self.viewer["viewerBotList"]
+
+ threads = []
+ bots = {}
+
+ def get_bot_thread(bot):
+ chat_data = self.get_bot(bot["displayName"])
+ bots[chat_data["defaultBotObject"]["nickname"]] = chat_data
+
+ for bot in bot_list:
+ thread = threading.Thread(target=get_bot_thread, args=(bot,), daemon=True)
+ threads.append(thread)
+
+ for thread in threads:
+ thread.start()
+ for thread in threads:
+ thread.join()
+
+ self.bots = bots
+ self.bot_names = self.get_bot_names()
+ return bots
+
+ def get_bot_names(self):
+ bot_names = {}
+ for bot_nickname in self.bots:
+ bot_obj = self.bots[bot_nickname]["defaultBotObject"]
+ bot_names[bot_nickname] = bot_obj["displayName"]
+ return bot_names
+
+ def get_remaining_messages(self, chatbot):
+ chat_data = self.get_bot(self.bot_names[chatbot])
+ return chat_data["defaultBotObject"]["messageLimit"]["numMessagesRemaining"]
+
+ def get_channel_data(self, channel=None):
+ logger.info("Downloading channel data...")
+ r = retry_request(self.session.get, self.settings_url)
+ data = r.json()
+
+ return data["tchannelData"]
+
+ def get_websocket_url(self, channel=None):
+ if channel is None:
+ channel = self.channel
+ query = f'?min_seq={channel["minSeq"]}&channel={channel["channel"]}&hash={channel["channelHash"]}'
+ return f'wss://{self.ws_domain}.tch.{channel["baseHost"]}/up/{channel["boxName"]}/updates' + query
+
+ def send_query(self, query_name, variables):
+ for i in range(20):
+ json_data = generate_payload(query_name, variables)
+ payload = json.dumps(json_data, separators=(",", ":"))
+
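+            # The 'poe-tag-id' header is an md5 digest of the payload, the formkey
+            # and a hard-coded salt string.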
+ base_string = payload + self.gql_headers["poe-formkey"] + "WpuLMiXEKKE98j56k"
+
+ headers = {
+ "content-type": "application/json",
+ "poe-tag-id": hashlib.md5(base_string.encode()).hexdigest(),
+ }
+ headers = {**self.gql_headers, **headers}
+
+ r = retry_request(self.session.post, self.gql_url, data=payload, headers=headers)
+
+ data = r.json()
+ if data["data"] is None:
+                logger.warning(f'{query_name} returned an error: {data["errors"][0]["message"]} | Retrying ({i + 1}/20)')
+ time.sleep(2)
+ continue
+
+ return r.json()
+
+ raise RuntimeError(f"{query_name} failed too many times.")
+
+ def subscribe(self):
+ logger.info("Subscribing to mutations")
+ result = self.send_query(
+ "SubscriptionsMutation",
+ {
+ "subscriptions": [
+ {
+ "subscriptionName": "messageAdded",
+ "query": queries["MessageAddedSubscription"],
+ },
+ {
+ "subscriptionName": "viewerStateUpdated",
+ "query": queries["ViewerStateUpdatedSubscription"],
+ },
+ ]
+ },
+ )
+
+ def ws_run_thread(self):
+ kwargs = {}
+ if self.proxy:
+ proxy_parsed = urlparse(self.proxy)
+ kwargs = {
+ "proxy_type": proxy_parsed.scheme,
+ "http_proxy_host": proxy_parsed.hostname,
+ "http_proxy_port": proxy_parsed.port,
+ }
+
+ self.ws.run_forever(**kwargs)
+
+ def connect_ws(self):
+ self.ws_connected = False
+ self.ws = websocket.WebSocketApp(
+ self.get_websocket_url(),
+ header={"User-Agent": user_agent},
+ on_message=self.on_message,
+ on_open=self.on_ws_connect,
+ on_error=self.on_ws_error,
+ on_close=self.on_ws_close,
+ )
+ t = threading.Thread(target=self.ws_run_thread, daemon=True)
+ t.start()
+ while not self.ws_connected:
+ time.sleep(0.01)
+
+ def disconnect_ws(self):
+ if self.ws:
+ self.ws.close()
+ self.ws_connected = False
+
+ def on_ws_connect(self, ws):
+ self.ws_connected = True
+
+ def on_ws_close(self, ws, close_status_code, close_message):
+ self.ws_connected = False
+        logger.warning(f"Websocket closed with status {close_status_code}: {close_message}")
+
+ def on_ws_error(self, ws, error):
+ self.disconnect_ws()
+ self.connect_ws()
+
+ def on_message(self, ws, msg):
+ try:
+ data = json.loads(msg)
+
+            if "messages" not in data:
+ return
+
+ for message_str in data["messages"]:
+ message_data = json.loads(message_str)
+ if message_data["message_type"] != "subscriptionUpdate":
+ continue
+ message = message_data["payload"]["data"]["messageAdded"]
+
+ copied_dict = self.active_messages.copy()
+ for key, value in copied_dict.items():
+ # add the message to the appropriate queue
+ if value == message["messageId"] and key in self.message_queues:
+ self.message_queues[key].put(message)
+ return
+
+ # indicate that the response id is tied to the human message id
+ elif key != "pending" and value is None and message["state"] != "complete":
+ self.active_messages[key] = message["messageId"]
+ self.message_queues[key].put(message)
+ return
+
+ except Exception:
+ logger.error(traceback.format_exc())
+ self.disconnect_ws()
+ self.connect_ws()
+
+ def send_message(self, chatbot, message, with_chat_break=False, timeout=20):
+ # if there is another active message, wait until it has finished sending
+ while None in self.active_messages.values():
+ time.sleep(0.01)
+
+ # None indicates that a message is still in progress
+ self.active_messages["pending"] = None
+
+ logger.info(f"Sending message to {chatbot}: {message}")
+
+ # reconnect websocket
+ if not self.ws_connected:
+ self.disconnect_ws()
+ self.setup_connection()
+ self.connect_ws()
+
+ message_data = self.send_query(
+ "SendMessageMutation",
+ {
+ "bot": chatbot,
+ "query": message,
+ "chatId": self.bots[chatbot]["chatId"],
+ "source": None,
+ "withChatBreak": with_chat_break,
+ },
+ )
+ del self.active_messages["pending"]
+
+ if not message_data["data"]["messageEdgeCreate"]["message"]:
+ raise RuntimeError(f"Daily limit reached for {chatbot}.")
+ try:
+ human_message = message_data["data"]["messageEdgeCreate"]["message"]
+ human_message_id = human_message["node"]["messageId"]
+ except TypeError:
+ raise RuntimeError(f"An unknown error occurred. Raw response data: {message_data}")
+
+ # indicate that the current message is waiting for a response
+ self.active_messages[human_message_id] = None
+ self.message_queues[human_message_id] = queue.Queue()
+
+ last_text = ""
+ message_id = None
+ while True:
+ try:
+ message = self.message_queues[human_message_id].get(timeout=timeout)
+ except queue.Empty:
+ del self.active_messages[human_message_id]
+ del self.message_queues[human_message_id]
+ raise RuntimeError("Response timed out.")
+
+ # only break when the message is marked as complete
+ if message["state"] == "complete":
+ if last_text and message["messageId"] == message_id:
+ break
+ else:
+ continue
+
+ # update info about response
+ message["text_new"] = message["text"][len(last_text) :]
+ last_text = message["text"]
+ message_id = message["messageId"]
+
+ yield message
+
+ del self.active_messages[human_message_id]
+ del self.message_queues[human_message_id]
+
+ def send_chat_break(self, chatbot):
+ logger.info(f"Sending chat break to {chatbot}")
+ result = self.send_query("AddMessageBreakMutation", {"chatId": self.bots[chatbot]["chatId"]})
+ return result["data"]["messageBreakCreate"]["message"]
+
+ def get_message_history(self, chatbot, count=25, cursor=None):
+ logger.info(f"Downloading {count} messages from {chatbot}")
+
+ messages = []
+ if cursor is None:
+ chat_data = self.get_bot(self.bot_names[chatbot])
+ if not chat_data["messagesConnection"]["edges"]:
+ return []
+ messages = chat_data["messagesConnection"]["edges"][:count]
+ cursor = chat_data["messagesConnection"]["pageInfo"]["startCursor"]
+ count -= len(messages)
+
+ cursor = str(cursor)
+ if count > 50:
+ messages = self.get_message_history(chatbot, count=50, cursor=cursor) + messages
+ while count > 0:
+ count -= 50
+ new_cursor = messages[0]["cursor"]
+ new_messages = self.get_message_history(chatbot, min(50, count), cursor=new_cursor)
+ messages = new_messages + messages
+ return messages
+ elif count <= 0:
+ return messages
+
+ result = self.send_query(
+ "ChatListPaginationQuery",
+ {"count": count, "cursor": cursor, "id": self.bots[chatbot]["id"]},
+ )
+ query_messages = result["data"]["node"]["messagesConnection"]["edges"]
+ messages = query_messages + messages
+ return messages
+
+ def delete_message(self, message_ids):
+ logger.info(f"Deleting messages: {message_ids}")
+        if not isinstance(message_ids, list):
+ message_ids = [int(message_ids)]
+
+ result = self.send_query("DeleteMessageMutation", {"messageIds": message_ids})
+
+ def purge_conversation(self, chatbot, count=-1):
+ logger.info(f"Purging messages from {chatbot}")
+ last_messages = self.get_message_history(chatbot, count=50)[::-1]
+ while last_messages:
+ message_ids = []
+ for message in last_messages:
+ if count == 0:
+ break
+ count -= 1
+ message_ids.append(message["node"]["messageId"])
+
+ self.delete_message(message_ids)
+
+ if count == 0:
+ return
+ last_messages = self.get_message_history(chatbot, count=50)[::-1]
+        logger.info("No more messages left to delete.")
+
+ def create_bot(
+ self,
+ handle,
+ prompt="",
+ base_model="chinchilla",
+ description="",
+ intro_message="",
+ api_key=None,
+ api_bot=False,
+ api_url=None,
+ prompt_public=True,
+ pfp_url=None,
+ linkification=False,
+ markdown_rendering=True,
+ suggested_replies=False,
+ private=False,
+ ):
+ result = self.send_query(
+ "PoeBotCreateMutation",
+ {
+ "model": base_model,
+ "handle": handle,
+ "prompt": prompt,
+ "isPromptPublic": prompt_public,
+ "introduction": intro_message,
+ "description": description,
+ "profilePictureUrl": pfp_url,
+ "apiUrl": api_url,
+ "apiKey": api_key,
+ "isApiBot": api_bot,
+ "hasLinkification": linkification,
+ "hasMarkdownRendering": markdown_rendering,
+ "hasSuggestedReplies": suggested_replies,
+ "isPrivateBot": private,
+ },
+ )
+
+ data = result["data"]["poeBotCreate"]
+ if data["status"] != "success":
+ raise RuntimeError(f"Poe returned an error while trying to create a bot: {data['status']}")
+ self.get_bots()
+ return data
+
+ def edit_bot(
+ self,
+ bot_id,
+ handle,
+ prompt="",
+ base_model="chinchilla",
+ description="",
+ intro_message="",
+ api_key=None,
+ api_url=None,
+ private=False,
+ prompt_public=True,
+ pfp_url=None,
+ linkification=False,
+ markdown_rendering=True,
+ suggested_replies=False,
+ ):
+ result = self.send_query(
+ "PoeBotEditMutation",
+ {
+ "baseBot": base_model,
+ "botId": bot_id,
+ "handle": handle,
+ "prompt": prompt,
+ "isPromptPublic": prompt_public,
+ "introduction": intro_message,
+ "description": description,
+ "profilePictureUrl": pfp_url,
+ "apiUrl": api_url,
+ "apiKey": api_key,
+ "hasLinkification": linkification,
+ "hasMarkdownRendering": markdown_rendering,
+ "hasSuggestedReplies": suggested_replies,
+ "isPrivateBot": private,
+ },
+ )
+
+ data = result["data"]["poeBotEdit"]
+ if data["status"] != "success":
+ raise RuntimeError(f"Poe returned an error while trying to edit a bot: {data['status']}")
+ self.get_bots()
+ return data
+
+ def delete_account(self) -> None:
+ response = self.send_query('SettingsDeleteAccountButton_deleteAccountMutation_Mutation', {})
+ data = response['data']['deleteAccount']
+ if 'viewer' not in data:
+            raise RuntimeError('An error occurred while deleting the account. Please try again!')
+
+
+load_queries()
diff --git a/g4f/.v1/gpt4free/quora/backup-mail.py b/g4f/.v1/gpt4free/quora/backup-mail.py
new file mode 100644
index 0000000000000000000000000000000000000000..749149fd091f30fdae77d20c57cf6197d83874c9
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/backup-mail.py
@@ -0,0 +1,45 @@
+from json import loads
+from re import findall
+from time import sleep
+
+from requests import Session
+
+
+class Mail:
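+    # Backup disposable-mail client backed by etempmail.com; it serves the same role
+    # as Emailnator: fetch a temporary address and read the verification code from the inbox.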
+ def __init__(self) -> None:
+ self.client = Session()
+ self.client.post("https://etempmail.com/")
+ self.cookies = {'acceptcookie': 'true'}
+ self.cookies["ci_session"] = self.client.cookies.get_dict()["ci_session"]
+ self.email = None
+
+    def get_mail(self):
+        response = self.client.post("https://etempmail.com/getEmailAddress")
+        # cookies
+        data = loads(response.text)
+        self.cookies["lisansimo"] = data["recover_key"]
+        self.email = data["address"]
+        return self.email
+
+    def get_message(self):
+        print("Waiting for message...")
+        while True:
+            sleep(5)
+            response = self.client.post("https://etempmail.com/getInbox")
+            inbox = loads(response.text)
+            if len(inbox) == 1:
+                break
+
+        params = {
+            'id': '1',
+        }
+        response = self.client.post("https://etempmail.com/getInbox", params=params)
+        self.mail_context = loads(response.text)[0]["body"]
+        return self.mail_context
+
+ def get_verification_code(self):
+ message = self.mail_context
+ code = findall(r';">(\d{6,7})', message)[0]
+ print(f"Verification code: {code}")
+ return code
diff --git a/g4f/.v1/gpt4free/quora/cookies.txt b/g4f/.v1/gpt4free/quora/cookies.txt
new file mode 100644
index 0000000000000000000000000000000000000000..9cccf6ba295af6e067f192e1d608121497701790
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/cookies.txt
@@ -0,0 +1,30 @@
+SmPiNXZI9hBTuf3viz74PA==
+zw7RoKQfeEehiaelYMRWeA==
+NEttgJ_rRQdO05Tppx6hFw==
+3OnmC0r9njYdNWhWszdQJg==
+8hZKR7MxwUTEHvO45TEViw==
+Eea6BqK0AmosTKzoI3AAow==
+pUEbtxobN_QUSpLIR8RGww==
+9_dUWxKkHHhpQRSvCvBk2Q==
+UV45rvGwUwi2qV9QdIbMcw==
+cVIN0pK1Wx-F7zCdUxlYqA==
+UP2wQVds17VFHh6IfCQFrA==
+18eKr0ME2Tzifdfqat38Aw==
+FNgKEpc2r-XqWe0rHBfYpg==
+juCAh6kB0sUpXHvKik2woA==
+nBvuNYRLaE4xE4HuzBPiIQ==
+oyae3iClomSrk6RJywZ4iw==
+1Z27Ul8BTdNOhncT5H6wdg==
+wfUfJIlwQwUss8l-3kDt3w==
+f6Jw_Nr0PietpNCtOCXJTw==
+6Jc3yCs7XhDRNHa4ZML09g==
+3vy44sIy-ZlTMofFiFDttw==
+p9FbMGGiK1rShKgL3YWkDg==
+pw6LI5Op84lf4HOY7fn91A==
+QemKm6aothMvqcEgeKFDlQ==
+cceZzucA-CEHR0Gt6VLYLQ==
+JRRObMp2RHVn5u4730DPvQ==
+XNt0wLTjX7Z-EsRR3TJMIQ==
+csjjirAUKtT5HT1KZUq1kg==
+8qZdCatCPQZyS7jsO4hkdQ==
+esnUxcBhvH1DmCJTeld0qw==
diff --git a/g4f/.v1/gpt4free/quora/graphql/AddHumanMessageMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/AddHumanMessageMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..01e6bc8cd4446e824ed732746178c7be926c8a2e
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/AddHumanMessageMutation.graphql
@@ -0,0 +1,52 @@
+mutation AddHumanMessageMutation(
+ $chatId: BigInt!
+ $bot: String!
+ $query: String!
+ $source: MessageSource
+ $withChatBreak: Boolean! = false
+) {
+ messageCreateWithStatus(
+ chatId: $chatId
+ bot: $bot
+ query: $query
+ source: $source
+ withChatBreak: $withChatBreak
+ ) {
+ message {
+ id
+ __typename
+ messageId
+ text
+ linkifiedText
+ authorNickname
+ state
+ vote
+ voteReason
+ creationTime
+ suggestedReplies
+ chat {
+ id
+ shouldShowDisclaimer
+ }
+ }
+ messageLimit{
+ canSend
+ numMessagesRemaining
+ resetTime
+ shouldShowReminder
+ }
+ chatBreak {
+ id
+ __typename
+ messageId
+ text
+ linkifiedText
+ authorNickname
+ state
+ vote
+ voteReason
+ creationTime
+ suggestedReplies
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/AddMessageBreakMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/AddMessageBreakMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..b28d9903cbe70ec157f9d8c63c1da3f5f41cecec
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/AddMessageBreakMutation.graphql
@@ -0,0 +1,17 @@
+mutation AddMessageBreakMutation($chatId: BigInt!) {
+ messageBreakCreate(chatId: $chatId) {
+ message {
+ id
+ __typename
+ messageId
+ text
+ linkifiedText
+ authorNickname
+ state
+ vote
+ voteReason
+ creationTime
+ suggestedReplies
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/AutoSubscriptionMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/AutoSubscriptionMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..6cf7bf7429861145e355c8b14f81c6fbb1911e20
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/AutoSubscriptionMutation.graphql
@@ -0,0 +1,7 @@
+mutation AutoSubscriptionMutation($subscriptions: [AutoSubscriptionQuery!]!) {
+ autoSubscribe(subscriptions: $subscriptions) {
+ viewer {
+ id
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/BioFragment.graphql b/g4f/.v1/gpt4free/quora/graphql/BioFragment.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..c42180309effae889b4ea40b82140202af4fd1e9
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/BioFragment.graphql
@@ -0,0 +1,8 @@
+fragment BioFragment on Viewer {
+ id
+ poeUser {
+ id
+ uid
+ bio
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/ChatAddedSubscription.graphql b/g4f/.v1/gpt4free/quora/graphql/ChatAddedSubscription.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..664b107fa6831ed444919159664d52370777033e
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ChatAddedSubscription.graphql
@@ -0,0 +1,5 @@
+subscription ChatAddedSubscription {
+ chatAdded {
+ ...ChatFragment
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/ChatFragment.graphql b/g4f/.v1/gpt4free/quora/graphql/ChatFragment.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..605645ff3d4002049ac4654cddd171e3aed04d5c
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ChatFragment.graphql
@@ -0,0 +1,6 @@
+fragment ChatFragment on Chat {
+ id
+ chatId
+ defaultBotNickname
+ shouldShowDisclaimer
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/ChatListPaginationQuery.graphql b/g4f/.v1/gpt4free/quora/graphql/ChatListPaginationQuery.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..6d9ae8840405ed0b17a8a7fa79036c974103ced2
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ChatListPaginationQuery.graphql
@@ -0,0 +1,378 @@
+query ChatListPaginationQuery(
+ $count: Int = 5
+ $cursor: String
+ $id: ID!
+) {
+ node(id: $id) {
+ __typename
+ ...ChatPageMain_chat_1G22uz
+ id
+ }
+}
+
+fragment BotImage_bot on Bot {
+ displayName
+ ...botHelpers_useDeletion_bot
+ ...BotImage_useProfileImage_bot
+}
+
+fragment BotImage_useProfileImage_bot on Bot {
+ image {
+ __typename
+ ... on LocalBotImage {
+ localName
+ }
+ ... on UrlBotImage {
+ url
+ }
+ }
+ ...botHelpers_useDeletion_bot
+}
+
+fragment ChatMessageDownvotedButton_message on Message {
+ ...MessageFeedbackReasonModal_message
+ ...MessageFeedbackOtherModal_message
+}
+
+fragment ChatMessageDropdownMenu_message on Message {
+ id
+ messageId
+ vote
+ text
+ author
+ ...chatHelpers_isBotMessage
+}
+
+fragment ChatMessageFeedbackButtons_message on Message {
+ id
+ messageId
+ vote
+ voteReason
+ ...ChatMessageDownvotedButton_message
+}
+
+fragment ChatMessageInputView_chat on Chat {
+ id
+ chatId
+ defaultBotObject {
+ nickname
+ messageLimit {
+ dailyBalance
+ shouldShowRemainingMessageCount
+ }
+ hasClearContext
+ isDown
+ ...botHelpers_useDeletion_bot
+ id
+ }
+ shouldShowDisclaimer
+ ...chatHelpers_useSendMessage_chat
+ ...chatHelpers_useSendChatBreak_chat
+}
+
+fragment ChatMessageInputView_edges on MessageEdge {
+ node {
+ ...chatHelpers_isChatBreak
+ ...chatHelpers_isHumanMessage
+ state
+ text
+ id
+ }
+}
+
+fragment ChatMessageOverflowButton_message on Message {
+ text
+ ...ChatMessageDropdownMenu_message
+ ...chatHelpers_isBotMessage
+}
+
+fragment ChatMessageSuggestedReplies_SuggestedReplyButton_chat on Chat {
+ ...chatHelpers_useSendMessage_chat
+}
+
+fragment ChatMessageSuggestedReplies_SuggestedReplyButton_message on Message {
+ messageId
+}
+
+fragment ChatMessageSuggestedReplies_chat on Chat {
+ ...ChatWelcomeView_chat
+ ...ChatMessageSuggestedReplies_SuggestedReplyButton_chat
+ defaultBotObject {
+ hasWelcomeTopics
+ id
+ }
+}
+
+fragment ChatMessageSuggestedReplies_message on Message {
+ suggestedReplies
+ ...ChatMessageSuggestedReplies_SuggestedReplyButton_message
+}
+
+fragment ChatMessage_chat on Chat {
+ defaultBotObject {
+ hasWelcomeTopics
+ hasSuggestedReplies
+ disclaimerText
+ messageLimit {
+ ...ChatPageRateLimitedBanner_messageLimit
+ }
+ ...ChatPageDisclaimer_bot
+ id
+ }
+ ...ChatMessageSuggestedReplies_chat
+ ...ChatWelcomeView_chat
+}
+
+fragment ChatMessage_message on Message {
+ id
+ messageId
+ text
+ author
+ linkifiedText
+ state
+ contentType
+ ...ChatMessageSuggestedReplies_message
+ ...ChatMessageFeedbackButtons_message
+ ...ChatMessageOverflowButton_message
+ ...chatHelpers_isHumanMessage
+ ...chatHelpers_isBotMessage
+ ...chatHelpers_isChatBreak
+ ...chatHelpers_useTimeoutLevel
+ ...MarkdownLinkInner_message
+ ...IdAnnotation_node
+}
+
+fragment ChatMessagesView_chat on Chat {
+ ...ChatMessage_chat
+ ...ChatWelcomeView_chat
+ ...IdAnnotation_node
+ defaultBotObject {
+ hasWelcomeTopics
+ messageLimit {
+ ...ChatPageRateLimitedBanner_messageLimit
+ }
+ id
+ }
+}
+
+fragment ChatMessagesView_edges on MessageEdge {
+ node {
+ id
+ messageId
+ creationTime
+ ...ChatMessage_message
+ ...chatHelpers_isBotMessage
+ ...chatHelpers_isHumanMessage
+ ...chatHelpers_isChatBreak
+ }
+}
+
+fragment ChatPageDeleteFooter_chat on Chat {
+ ...MessageDeleteConfirmationModal_chat
+}
+
+fragment ChatPageDisclaimer_bot on Bot {
+ disclaimerText
+}
+
+fragment ChatPageMainFooter_chat on Chat {
+ defaultBotObject {
+ ...ChatPageMainFooter_useAccessMessage_bot
+ id
+ }
+ ...ChatMessageInputView_chat
+ ...ChatPageShareFooter_chat
+ ...ChatPageDeleteFooter_chat
+}
+
+fragment ChatPageMainFooter_edges on MessageEdge {
+ ...ChatMessageInputView_edges
+}
+
+fragment ChatPageMainFooter_useAccessMessage_bot on Bot {
+ ...botHelpers_useDeletion_bot
+ ...botHelpers_useViewerCanAccessPrivateBot
+}
+
+fragment ChatPageMain_chat_1G22uz on Chat {
+ id
+ chatId
+ ...ChatPageShareFooter_chat
+ ...ChatPageDeleteFooter_chat
+ ...ChatMessagesView_chat
+ ...MarkdownLinkInner_chat
+ ...chatHelpers_useUpdateStaleChat_chat
+ ...ChatSubscriptionPaywallContextWrapper_chat
+ ...ChatPageMainFooter_chat
+ messagesConnection(last: $count, before: $cursor) {
+ edges {
+ ...ChatMessagesView_edges
+ ...ChatPageMainFooter_edges
+ ...MarkdownLinkInner_edges
+ node {
+ ...chatHelpers_useUpdateStaleChat_message
+ id
+ __typename
+ }
+ cursor
+ id
+ }
+ pageInfo {
+ hasPreviousPage
+ startCursor
+ }
+ id
+ }
+}
+
+fragment ChatPageRateLimitedBanner_messageLimit on MessageLimit {
+ numMessagesRemaining
+}
+
+fragment ChatPageShareFooter_chat on Chat {
+ chatId
+}
+
+fragment ChatSubscriptionPaywallContextWrapper_chat on Chat {
+ defaultBotObject {
+ messageLimit {
+ numMessagesRemaining
+ shouldShowRemainingMessageCount
+ }
+ ...SubscriptionPaywallModal_bot
+ id
+ }
+}
+
+fragment ChatWelcomeView_ChatWelcomeButton_chat on Chat {
+ ...chatHelpers_useSendMessage_chat
+}
+
+fragment ChatWelcomeView_chat on Chat {
+ ...ChatWelcomeView_ChatWelcomeButton_chat
+ defaultBotObject {
+ displayName
+ id
+ }
+}
+
+fragment IdAnnotation_node on Node {
+ __isNode: __typename
+ id
+}
+
+fragment MarkdownLinkInner_chat on Chat {
+ id
+ chatId
+ defaultBotObject {
+ nickname
+ id
+ }
+ ...chatHelpers_useSendMessage_chat
+}
+
+fragment MarkdownLinkInner_edges on MessageEdge {
+ node {
+ state
+ id
+ }
+}
+
+fragment MarkdownLinkInner_message on Message {
+ messageId
+}
+
+fragment MessageDeleteConfirmationModal_chat on Chat {
+ id
+}
+
+fragment MessageFeedbackOtherModal_message on Message {
+ id
+ messageId
+}
+
+fragment MessageFeedbackReasonModal_message on Message {
+ id
+ messageId
+}
+
+fragment SubscriptionPaywallModal_bot on Bot {
+ displayName
+ messageLimit {
+ dailyLimit
+ numMessagesRemaining
+ shouldShowRemainingMessageCount
+ resetTime
+ }
+ ...BotImage_bot
+}
+
+fragment botHelpers_useDeletion_bot on Bot {
+ deletionState
+}
+
+fragment botHelpers_useViewerCanAccessPrivateBot on Bot {
+ isPrivateBot
+ viewerIsCreator
+}
+
+fragment chatHelpers_isBotMessage on Message {
+ ...chatHelpers_isHumanMessage
+ ...chatHelpers_isChatBreak
+}
+
+fragment chatHelpers_isChatBreak on Message {
+ author
+}
+
+fragment chatHelpers_isHumanMessage on Message {
+ author
+}
+
+fragment chatHelpers_useSendChatBreak_chat on Chat {
+ id
+ chatId
+ defaultBotObject {
+ nickname
+ introduction
+ model
+ id
+ }
+ shouldShowDisclaimer
+}
+
+fragment chatHelpers_useSendMessage_chat on Chat {
+ id
+ chatId
+ defaultBotObject {
+ id
+ nickname
+ }
+ shouldShowDisclaimer
+}
+
+fragment chatHelpers_useTimeoutLevel on Message {
+ id
+ state
+ text
+ messageId
+ chat {
+ chatId
+ defaultBotNickname
+ id
+ }
+}
+
+fragment chatHelpers_useUpdateStaleChat_chat on Chat {
+ chatId
+ defaultBotObject {
+ contextClearWindowSecs
+ id
+ }
+ ...chatHelpers_useSendChatBreak_chat
+}
+
+fragment chatHelpers_useUpdateStaleChat_message on Message {
+ creationTime
+ ...chatHelpers_isChatBreak
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/ChatPaginationQuery.graphql b/g4f/.v1/gpt4free/quora/graphql/ChatPaginationQuery.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..f2452cd6c2a8fb3b13e0a9914a2642705275e45d
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ChatPaginationQuery.graphql
@@ -0,0 +1,26 @@
+query ChatPaginationQuery($bot: String!, $before: String, $last: Int! = 10) {
+ chatOfBot(bot: $bot) {
+ id
+ __typename
+ messagesConnection(before: $before, last: $last) {
+ pageInfo {
+ hasPreviousPage
+ }
+ edges {
+ node {
+ id
+ __typename
+ messageId
+ text
+ linkifiedText
+ authorNickname
+ state
+ vote
+ voteReason
+ creationTime
+ suggestedReplies
+ }
+ }
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/ChatViewQuery.graphql b/g4f/.v1/gpt4free/quora/graphql/ChatViewQuery.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..c330107df0267a3235e2ac93f7fde58a93540bf1
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ChatViewQuery.graphql
@@ -0,0 +1,8 @@
+query ChatViewQuery($bot: String!) {
+ chatOfBot(bot: $bot) {
+ id
+ chatId
+ defaultBotNickname
+ shouldShowDisclaimer
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/DeleteHumanMessagesMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/DeleteHumanMessagesMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..42692c6e3bf35d6bc16e63b254aa01d7541994c8
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/DeleteHumanMessagesMutation.graphql
@@ -0,0 +1,7 @@
+mutation DeleteHumanMessagesMutation($messageIds: [BigInt!]!) {
+ messagesDelete(messageIds: $messageIds) {
+ viewer {
+ id
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/DeleteMessageMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/DeleteMessageMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..7b9e36d4de63a0a25cfbb27e73bf6742050c7dfd
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/DeleteMessageMutation.graphql
@@ -0,0 +1,7 @@
+mutation deleteMessageMutation(
+ $messageIds: [BigInt!]!
+) {
+ messagesDelete(messageIds: $messageIds) {
+ edgeIds
+ }
+}
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/quora/graphql/HandleFragment.graphql b/g4f/.v1/gpt4free/quora/graphql/HandleFragment.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..f53c484be3df99d43f940269059552fc6e8d914d
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/HandleFragment.graphql
@@ -0,0 +1,8 @@
+fragment HandleFragment on Viewer {
+ id
+ poeUser {
+ id
+ uid
+ handle
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/LoginWithVerificationCodeMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/LoginWithVerificationCodeMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..723b1f44ba4cb51196f35b6d0a4deddff9bf187f
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/LoginWithVerificationCodeMutation.graphql
@@ -0,0 +1,13 @@
+mutation LoginWithVerificationCodeMutation(
+ $verificationCode: String!
+ $emailAddress: String
+ $phoneNumber: String
+) {
+ loginWithVerificationCode(
+ verificationCode: $verificationCode
+ emailAddress: $emailAddress
+ phoneNumber: $phoneNumber
+ ) {
+ status
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/MessageAddedSubscription.graphql b/g4f/.v1/gpt4free/quora/graphql/MessageAddedSubscription.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..8dc9499c5ab5da8ff330bea0e1608afade29a414
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/MessageAddedSubscription.graphql
@@ -0,0 +1,100 @@
+subscription messageAdded (
+ $chatId: BigInt!
+) {
+ messageAdded(chatId: $chatId) {
+ id
+ messageId
+ creationTime
+ state
+ ...ChatMessage_message
+ ...chatHelpers_isBotMessage
+ }
+}
+
+fragment ChatMessageDownvotedButton_message on Message {
+ ...MessageFeedbackReasonModal_message
+ ...MessageFeedbackOtherModal_message
+}
+
+fragment ChatMessageDropdownMenu_message on Message {
+ id
+ messageId
+ vote
+ text
+ linkifiedText
+ ...chatHelpers_isBotMessage
+}
+
+fragment ChatMessageFeedbackButtons_message on Message {
+ id
+ messageId
+ vote
+ voteReason
+ ...ChatMessageDownvotedButton_message
+}
+
+fragment ChatMessageOverflowButton_message on Message {
+ text
+ ...ChatMessageDropdownMenu_message
+ ...chatHelpers_isBotMessage
+}
+
+fragment ChatMessageSuggestedReplies_SuggestedReplyButton_message on Message {
+ messageId
+}
+
+fragment ChatMessageSuggestedReplies_message on Message {
+ suggestedReplies
+ ...ChatMessageSuggestedReplies_SuggestedReplyButton_message
+}
+
+fragment ChatMessage_message on Message {
+ id
+ messageId
+ text
+ author
+ linkifiedText
+ state
+ ...ChatMessageSuggestedReplies_message
+ ...ChatMessageFeedbackButtons_message
+ ...ChatMessageOverflowButton_message
+ ...chatHelpers_isHumanMessage
+ ...chatHelpers_isBotMessage
+ ...chatHelpers_isChatBreak
+ ...chatHelpers_useTimeoutLevel
+ ...MarkdownLinkInner_message
+}
+
+fragment MarkdownLinkInner_message on Message {
+ messageId
+}
+
+fragment MessageFeedbackOtherModal_message on Message {
+ id
+ messageId
+}
+
+fragment MessageFeedbackReasonModal_message on Message {
+ id
+ messageId
+}
+
+fragment chatHelpers_isBotMessage on Message {
+ ...chatHelpers_isHumanMessage
+ ...chatHelpers_isChatBreak
+}
+
+fragment chatHelpers_isChatBreak on Message {
+ author
+}
+
+fragment chatHelpers_isHumanMessage on Message {
+ author
+}
+
+fragment chatHelpers_useTimeoutLevel on Message {
+ id
+ state
+ text
+ messageId
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/MessageDeletedSubscription.graphql b/g4f/.v1/gpt4free/quora/graphql/MessageDeletedSubscription.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..54c1c164ea8695d3389e93025b182e17a51f4a2d
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/MessageDeletedSubscription.graphql
@@ -0,0 +1,6 @@
+subscription MessageDeletedSubscription($chatId: BigInt!) {
+ messageDeleted(chatId: $chatId) {
+ id
+ messageId
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/MessageFragment.graphql b/g4f/.v1/gpt4free/quora/graphql/MessageFragment.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..cc8608117e506262eaf01ff442ab658baeb598c6
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/MessageFragment.graphql
@@ -0,0 +1,13 @@
+fragment MessageFragment on Message {
+ id
+ __typename
+ messageId
+ text
+ linkifiedText
+ authorNickname
+ state
+ vote
+ voteReason
+ creationTime
+ suggestedReplies
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/MessageRemoveVoteMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/MessageRemoveVoteMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..d5e6e610730b5a7542a714b4f5533fdda7df2872
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/MessageRemoveVoteMutation.graphql
@@ -0,0 +1,7 @@
+mutation MessageRemoveVoteMutation($messageId: BigInt!) {
+ messageRemoveVote(messageId: $messageId) {
+ message {
+ ...MessageFragment
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/MessageSetVoteMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/MessageSetVoteMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..76000df0eba23313fb10883d3ef1763c9e37674b
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/MessageSetVoteMutation.graphql
@@ -0,0 +1,7 @@
+mutation MessageSetVoteMutation($messageId: BigInt!, $voteType: VoteType!, $reason: String) {
+ messageSetVote(messageId: $messageId, voteType: $voteType, reason: $reason) {
+ message {
+ ...MessageFragment
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/PoeBotCreateMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/PoeBotCreateMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..971b4248a6365725a6fb6d7791f33f6a881ff2cb
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/PoeBotCreateMutation.graphql
@@ -0,0 +1,73 @@
+mutation CreateBotMain_poeBotCreate_Mutation(
+ $model: String!
+ $handle: String!
+ $prompt: String!
+ $isPromptPublic: Boolean!
+ $introduction: String!
+ $description: String!
+ $profilePictureUrl: String
+ $apiUrl: String
+ $apiKey: String
+ $isApiBot: Boolean
+ $hasLinkification: Boolean
+ $hasMarkdownRendering: Boolean
+ $hasSuggestedReplies: Boolean
+ $isPrivateBot: Boolean
+) {
+ poeBotCreate(model: $model, handle: $handle, promptPlaintext: $prompt, isPromptPublic: $isPromptPublic, introduction: $introduction, description: $description, profilePicture: $profilePictureUrl, apiUrl: $apiUrl, apiKey: $apiKey, isApiBot: $isApiBot, hasLinkification: $hasLinkification, hasMarkdownRendering: $hasMarkdownRendering, hasSuggestedReplies: $hasSuggestedReplies, isPrivateBot: $isPrivateBot) {
+ status
+ bot {
+ id
+ ...BotHeader_bot
+ }
+ }
+}
+
+fragment BotHeader_bot on Bot {
+ displayName
+ messageLimit {
+ dailyLimit
+ }
+ ...BotImage_bot
+ ...BotLink_bot
+ ...IdAnnotation_node
+ ...botHelpers_useViewerCanAccessPrivateBot
+ ...botHelpers_useDeletion_bot
+}
+
+fragment BotImage_bot on Bot {
+ displayName
+ ...botHelpers_useDeletion_bot
+ ...BotImage_useProfileImage_bot
+}
+
+fragment BotImage_useProfileImage_bot on Bot {
+ image {
+ __typename
+ ... on LocalBotImage {
+ localName
+ }
+ ... on UrlBotImage {
+ url
+ }
+ }
+ ...botHelpers_useDeletion_bot
+}
+
+fragment BotLink_bot on Bot {
+ displayName
+}
+
+fragment IdAnnotation_node on Node {
+ __isNode: __typename
+ id
+}
+
+fragment botHelpers_useDeletion_bot on Bot {
+ deletionState
+}
+
+fragment botHelpers_useViewerCanAccessPrivateBot on Bot {
+ isPrivateBot
+ viewerIsCreator
+}
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/quora/graphql/PoeBotEditMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/PoeBotEditMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..fdd309efd58fb0174ad6c64fe00565f58c8dcd0f
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/PoeBotEditMutation.graphql
@@ -0,0 +1,24 @@
+mutation EditBotMain_poeBotEdit_Mutation(
+ $botId: BigInt!
+ $handle: String!
+ $description: String!
+ $introduction: String!
+ $isPromptPublic: Boolean!
+ $baseBot: String!
+ $profilePictureUrl: String
+ $prompt: String!
+ $apiUrl: String
+ $apiKey: String
+ $hasLinkification: Boolean
+ $hasMarkdownRendering: Boolean
+ $hasSuggestedReplies: Boolean
+ $isPrivateBot: Boolean
+) {
+ poeBotEdit(botId: $botId, handle: $handle, description: $description, introduction: $introduction, isPromptPublic: $isPromptPublic, model: $baseBot, promptPlaintext: $prompt, profilePicture: $profilePictureUrl, apiUrl: $apiUrl, apiKey: $apiKey, hasLinkification: $hasLinkification, hasMarkdownRendering: $hasMarkdownRendering, hasSuggestedReplies: $hasSuggestedReplies, isPrivateBot: $isPrivateBot) {
+ status
+ bot {
+ handle
+ id
+ }
+ }
+}
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/quora/graphql/SendMessageMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/SendMessageMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..4b0a4383e7801120aaf09d2b7962aac55e99c71c
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SendMessageMutation.graphql
@@ -0,0 +1,40 @@
+mutation chatHelpers_sendMessageMutation_Mutation(
+ $chatId: BigInt!
+ $bot: String!
+ $query: String!
+ $source: MessageSource
+ $withChatBreak: Boolean!
+) {
+ messageEdgeCreate(chatId: $chatId, bot: $bot, query: $query, source: $source, withChatBreak: $withChatBreak) {
+ chatBreak {
+ cursor
+ node {
+ id
+ messageId
+ text
+ author
+ suggestedReplies
+ creationTime
+ state
+ }
+ id
+ }
+ message {
+ cursor
+ node {
+ id
+ messageId
+ text
+ author
+ suggestedReplies
+ creationTime
+ state
+ chat {
+ shouldShowDisclaimer
+ id
+ }
+ }
+ id
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/SendVerificationCodeForLoginMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/SendVerificationCodeForLoginMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..45af47991df889b057528da84f011ba9d7be0bb5
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SendVerificationCodeForLoginMutation.graphql
@@ -0,0 +1,12 @@
+mutation SendVerificationCodeForLoginMutation(
+ $emailAddress: String
+ $phoneNumber: String
+) {
+ sendVerificationCode(
+ verificationReason: login
+ emailAddress: $emailAddress
+ phoneNumber: $phoneNumber
+ ) {
+ status
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/SettingsDeleteAccountButton_deleteAccountMutation_Mutation.graphql b/g4f/.v1/gpt4free/quora/graphql/SettingsDeleteAccountButton_deleteAccountMutation_Mutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..0af5095012df7b0925f3248ff3cf2f6ea0970371
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SettingsDeleteAccountButton_deleteAccountMutation_Mutation.graphql
@@ -0,0 +1 @@
+mutation SettingsDeleteAccountButton_deleteAccountMutation_Mutation{ deleteAccount { viewer { uid id } }}
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/quora/graphql/ShareMessagesMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/ShareMessagesMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..92e80db576c84688b41ee32e1ac68569c7c7a574
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ShareMessagesMutation.graphql
@@ -0,0 +1,9 @@
+mutation ShareMessagesMutation(
+ $chatId: BigInt!
+ $messageIds: [BigInt!]!
+ $comment: String
+) {
+ messagesShare(chatId: $chatId, messageIds: $messageIds, comment: $comment) {
+ shareCode
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/SignupWithVerificationCodeMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/SignupWithVerificationCodeMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..06b2826fa40e3166f5fd7f88eb80d59c43b19bf6
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SignupWithVerificationCodeMutation.graphql
@@ -0,0 +1,13 @@
+mutation SignupWithVerificationCodeMutation(
+ $verificationCode: String!
+ $emailAddress: String
+ $phoneNumber: String
+) {
+ signupWithVerificationCode(
+ verificationCode: $verificationCode
+ emailAddress: $emailAddress
+ phoneNumber: $phoneNumber
+ ) {
+ status
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/StaleChatUpdateMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/StaleChatUpdateMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..de203d47c2cebdc7d7a40cb70e17d403fc2f5b34
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/StaleChatUpdateMutation.graphql
@@ -0,0 +1,7 @@
+mutation StaleChatUpdateMutation($chatId: BigInt!) {
+ staleChatUpdate(chatId: $chatId) {
+ message {
+ ...MessageFragment
+ }
+ }
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/SubscriptionsMutation.graphql b/g4f/.v1/gpt4free/quora/graphql/SubscriptionsMutation.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..b864bd60746ccaeb7a2ffa4547b6ad939af6ac15
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SubscriptionsMutation.graphql
@@ -0,0 +1,9 @@
+mutation subscriptionsMutation(
+ $subscriptions: [AutoSubscriptionQuery!]!
+) {
+ autoSubscribe(subscriptions: $subscriptions) {
+ viewer {
+ id
+ }
+ }
+}
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/quora/graphql/SummarizePlainPostQuery.graphql b/g4f/.v1/gpt4free/quora/graphql/SummarizePlainPostQuery.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..afa2a84c0ea5c87f2b7ba31ae372c8a13e7e9442
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SummarizePlainPostQuery.graphql
@@ -0,0 +1,3 @@
+query SummarizePlainPostQuery($comment: String!) {
+ summarizePlainPost(comment: $comment)
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/SummarizeQuotePostQuery.graphql b/g4f/.v1/gpt4free/quora/graphql/SummarizeQuotePostQuery.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..5147c3c5b13465f9eaca928260b3ef4c088de895
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SummarizeQuotePostQuery.graphql
@@ -0,0 +1,3 @@
+query SummarizeQuotePostQuery($comment: String, $quotedPostId: BigInt!) {
+ summarizeQuotePost(comment: $comment, quotedPostId: $quotedPostId)
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/SummarizeSharePostQuery.graphql b/g4f/.v1/gpt4free/quora/graphql/SummarizeSharePostQuery.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..cb4a623c4266757c8262c297a9e2f9ad210d6ff6
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/SummarizeSharePostQuery.graphql
@@ -0,0 +1,3 @@
+query SummarizeSharePostQuery($comment: String!, $chatId: BigInt!, $messageIds: [BigInt!]!) {
+ summarizeSharePost(comment: $comment, chatId: $chatId, messageIds: $messageIds)
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/UserSnippetFragment.graphql b/g4f/.v1/gpt4free/quora/graphql/UserSnippetFragment.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..17fc84264a50bbbea5b8fc6a3d6ad9e70153dc64
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/UserSnippetFragment.graphql
@@ -0,0 +1,14 @@
+fragment UserSnippetFragment on PoeUser {
+ id
+ uid
+ bio
+ handle
+ fullName
+ viewerIsFollowing
+ isPoeOnlyUser
+ profilePhotoURLTiny: profilePhotoUrl(size: tiny)
+ profilePhotoURLSmall: profilePhotoUrl(size: small)
+ profilePhotoURLMedium: profilePhotoUrl(size: medium)
+ profilePhotoURLLarge: profilePhotoUrl(size: large)
+ isFollowable
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/ViewerInfoQuery.graphql b/g4f/.v1/gpt4free/quora/graphql/ViewerInfoQuery.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..1ecaf9e8316e0272deae6eda78df4b86f37283fd
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ViewerInfoQuery.graphql
@@ -0,0 +1,21 @@
+query ViewerInfoQuery {
+ viewer {
+ id
+ uid
+ ...ViewerStateFragment
+ ...BioFragment
+ ...HandleFragment
+ hasCompletedMultiplayerNux
+ poeUser {
+ id
+ ...UserSnippetFragment
+ }
+ messageLimit{
+ canSend
+ numMessagesRemaining
+ resetTime
+ shouldShowReminder
+ }
+ }
+}
+
diff --git a/g4f/.v1/gpt4free/quora/graphql/ViewerStateFragment.graphql b/g4f/.v1/gpt4free/quora/graphql/ViewerStateFragment.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..3cd83e9c96fe4d3b49e41a566848f1b3233bf547
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ViewerStateFragment.graphql
@@ -0,0 +1,30 @@
+fragment ViewerStateFragment on Viewer {
+ id
+ __typename
+ iosMinSupportedVersion: integerGate(gateName: "poe_ios_min_supported_version")
+ iosMinEncouragedVersion: integerGate(
+ gateName: "poe_ios_min_encouraged_version"
+ )
+ macosMinSupportedVersion: integerGate(
+ gateName: "poe_macos_min_supported_version"
+ )
+ macosMinEncouragedVersion: integerGate(
+ gateName: "poe_macos_min_encouraged_version"
+ )
+ showPoeDebugPanel: booleanGate(gateName: "poe_show_debug_panel")
+ enableCommunityFeed: booleanGate(gateName: "enable_poe_shares_feed")
+ linkifyText: booleanGate(gateName: "poe_linkify_response")
+ enableSuggestedReplies: booleanGate(gateName: "poe_suggested_replies")
+ removeInviteLimit: booleanGate(gateName: "poe_remove_invite_limit")
+ enableInAppPurchases: booleanGate(gateName: "poe_enable_in_app_purchases")
+ availableBots {
+ nickname
+ displayName
+ profilePicture
+ isDown
+ disclaimer
+ subtitle
+ poweredBy
+ }
+}
+
diff --git a/g4f/.v1/gpt4free/quora/graphql/ViewerStateUpdatedSubscription.graphql b/g4f/.v1/gpt4free/quora/graphql/ViewerStateUpdatedSubscription.graphql
new file mode 100644
index 0000000000000000000000000000000000000000..03fc73d151594275aa8a87198c85311b475b5077
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/graphql/ViewerStateUpdatedSubscription.graphql
@@ -0,0 +1,43 @@
+subscription viewerStateUpdated {
+ viewerStateUpdated {
+ id
+ ...ChatPageBotSwitcher_viewer
+ }
+}
+
+fragment BotHeader_bot on Bot {
+ displayName
+ messageLimit {
+ dailyLimit
+ }
+ ...BotImage_bot
+}
+
+fragment BotImage_bot on Bot {
+ image {
+ __typename
+ ... on LocalBotImage {
+ localName
+ }
+ ... on UrlBotImage {
+ url
+ }
+ }
+ displayName
+}
+
+fragment BotLink_bot on Bot {
+ displayName
+}
+
+fragment ChatPageBotSwitcher_viewer on Viewer {
+ availableBots {
+ id
+ messageLimit {
+ dailyLimit
+ }
+ ...BotLink_bot
+ ...BotHeader_bot
+ }
+ allowUserCreatedBots: booleanGate(gateName: "enable_user_created_bots")
+}
diff --git a/g4f/.v1/gpt4free/quora/graphql/__init__.py b/g4f/.v1/gpt4free/quora/graphql/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/g4f/.v1/gpt4free/quora/mail.py b/g4f/.v1/gpt4free/quora/mail.py
new file mode 100644
index 0000000000000000000000000000000000000000..864d9568a541129d0358623ccf879f0dccc5caec
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/mail.py
@@ -0,0 +1,80 @@
+from json import loads
+from re import findall
+from time import sleep
+
+from fake_useragent import UserAgent
+from requests import Session
+
+
+class Emailnator:
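+    # Thin client for emailnator.com: generates a disposable address and polls the
+    # inbox so the Poe signup flow can read its verification code.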
+ def __init__(self) -> None:
+ self.client = Session()
+ self.client.get("https://www.emailnator.com/", timeout=6)
+ self.cookies = self.client.cookies.get_dict()
+
+ self.client.headers = {
+ "authority": "www.emailnator.com",
+ "origin": "https://www.emailnator.com",
+ "referer": "https://www.emailnator.com/",
+ "user-agent": UserAgent().random,
+ "x-xsrf-token": self.client.cookies.get("XSRF-TOKEN")[:-3] + "=",
+ }
+
+ self.email = None
+
+ def get_mail(self):
+ response = self.client.post(
+ "https://www.emailnator.com/generate-email",
+ json={
+ "email": [
+ "domain",
+ "plusGmail",
+ "dotGmail",
+ ]
+ },
+ )
+
+ self.email = loads(response.text)["email"][0]
+ return self.email
+
+ def get_message(self):
+ print("Waiting for message...")
+
+ while True:
+ sleep(2)
+ mail_token = self.client.post("https://www.emailnator.com/message-list", json={"email": self.email})
+
+ mail_token = loads(mail_token.text)["messageData"]
+
+ if len(mail_token) == 2:
+ print("Message received!")
+ print(mail_token[1]["messageID"])
+ break
+
+ mail_context = self.client.post(
+ "https://www.emailnator.com/message-list",
+ json={
+ "email": self.email,
+ "messageID": mail_token[1]["messageID"],
+ },
+ )
+
+ return mail_context.text
+
+ def get_verification_code(self):
+ message = self.get_message()
+ code = findall(r';">(\d{6,7})', message)[0]
+ print(f"Verification code: {code}")
+ return code
+
+ def clear_inbox(self):
+ print("Clearing inbox...")
+ self.client.post(
+ "https://www.emailnator.com/delete-all",
+ json={"email": self.email},
+ )
+ print("Inbox cleared!")
+
+ def __del__(self):
+ if self.email:
+ self.clear_inbox()
diff --git a/g4f/.v1/gpt4free/quora/tests/__init__.py b/g4f/.v1/gpt4free/quora/tests/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/g4f/.v1/gpt4free/quora/tests/test_api.py b/g4f/.v1/gpt4free/quora/tests/test_api.py
new file mode 100644
index 0000000000000000000000000000000000000000..2a4bb41b016429d13debe94c67a76cc6112f154c
--- /dev/null
+++ b/g4f/.v1/gpt4free/quora/tests/test_api.py
@@ -0,0 +1,38 @@
+import unittest
+import requests
+from unittest.mock import MagicMock
+from gpt4free.quora.api import retry_request
+
+
+class TestRetryRequest(unittest.TestCase):
+ def test_successful_request(self):
+ # Mock a successful request with a 200 status code
+ mock_response = MagicMock()
+ mock_response.status_code = 200
+ requests.get = MagicMock(return_value=mock_response)
+
+ # Call the function and assert that it returns the response
+ response = retry_request(requests.get, "http://example.com", max_attempts=3)
+ self.assertEqual(response.status_code, 200)
+
+ def test_exponential_backoff(self):
+ # Mock a failed request that succeeds after two retries
+ mock_response = MagicMock()
+ mock_response.status_code = 200
+ requests.get = MagicMock(side_effect=[requests.exceptions.RequestException] * 2 + [mock_response])
+
+ # Call the function and assert that it retries with exponential backoff
+ with self.assertLogs() as logs:
+ response = retry_request(requests.get, "http://example.com", max_attempts=3, delay=1)
+ self.assertEqual(response.status_code, 200)
+ self.assertGreaterEqual(len(logs.output), 2)
+ self.assertIn("Retrying in 1 seconds...", logs.output[0])
+ self.assertIn("Retrying in 2 seconds...", logs.output[1])
+
+ def test_too_many_attempts(self):
+ # Mock a failed request that never succeeds
+ requests.get = MagicMock(side_effect=requests.exceptions.RequestException)
+
+ # Call the function and assert that it raises an exception after the maximum number of attempts
+ with self.assertRaises(RuntimeError):
+ retry_request(requests.get, "http://example.com", max_attempts=3)
diff --git a/g4f/.v1/gpt4free/test.py b/g4f/.v1/gpt4free/test.py
new file mode 100644
index 0000000000000000000000000000000000000000..b2516748041b8bbc12afa910c0eab98e944c45ce
--- /dev/null
+++ b/g4f/.v1/gpt4free/test.py
@@ -0,0 +1,4 @@
+import forefront
+token = forefront.Account.create()
+response = forefront.Completion.create(token=token, prompt='Hello!')
+print(response)
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/theb/README.md b/g4f/.v1/gpt4free/theb/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a7af9dd802753ec410c18f7dcfe2fe3f52e044ba
--- /dev/null
+++ b/g4f/.v1/gpt4free/theb/README.md
@@ -0,0 +1,14 @@
+### Example: `theb` (use like openai pypi package)
+
+```python
+# import library
+from gpt4free import theb
+
+# simple streaming completion
+
+while True:
+ x = input()
+ for token in theb.Completion.create(x):
+ print(token, end='', flush=True)
+ print("")
+```
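+
+The same module also provides `Completion.get_response`, which collects the streamed tokens into a single string. A minimal non-streaming sketch (prompt text is only illustrative):
+
+```python
+from gpt4free import theb
+
+# blocking call: wait for the complete answer, then print it
+answer = theb.Completion.get_response('Explain streaming vs. blocking APIs in one sentence.')
+print(answer)
+```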
diff --git a/g4f/.v1/gpt4free/theb/__init__.py b/g4f/.v1/gpt4free/theb/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..0177194efbaf0e79c8ff62f4191ef8c3a5578a05
--- /dev/null
+++ b/g4f/.v1/gpt4free/theb/__init__.py
@@ -0,0 +1,76 @@
+from json import loads
+from queue import Queue, Empty
+from re import findall
+from threading import Thread
+from typing import Generator, Optional
+
+from curl_cffi import requests
+from fake_useragent import UserAgent
+
+
+class Completion:
+ # experimental
+ part1 = '{"role":"assistant","id":"chatcmpl'
+ part2 = '"},"index":0,"finish_reason":null}]}}'
+ regex = rf'{part1}(.*){part2}'
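+    # Each streamed chunk is a JSON object with a fixed prefix (part1) and suffix
+    # (part2); the regex captures the variable middle so create() can stitch the
+    # pieces back into parseable JSON.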
+
+ timer = None
+ message_queue = Queue()
+ stream_completed = False
+ last_msg_id = None
+
+ @staticmethod
+ def request(prompt: str, proxy: Optional[str] = None):
+ headers = {
+ 'authority': 'chatbot.theb.ai',
+ 'content-type': 'application/json',
+ 'origin': 'https://chatbot.theb.ai',
+ 'user-agent': UserAgent().random,
+ }
+
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else None
+
+ options = {}
+ if Completion.last_msg_id:
+ options['parentMessageId'] = Completion.last_msg_id
+
+ requests.post(
+ 'https://chatbot.theb.ai/api/chat-process',
+ headers=headers,
+ proxies=proxies,
+ content_callback=Completion.handle_stream_response,
+ json={'prompt': prompt, 'options': options},
+ timeout=100000
+ )
+
+ Completion.stream_completed = True
+
+ @staticmethod
+ def create(prompt: str, proxy: Optional[str] = None) -> Generator[str, None, None]:
+ Completion.stream_completed = False
+
+ Thread(target=Completion.request, args=[prompt, proxy]).start()
+
+ while not Completion.stream_completed or not Completion.message_queue.empty():
+ try:
+ message = Completion.message_queue.get(timeout=0.01)
+ for message in findall(Completion.regex, message):
+ message_json = loads(Completion.part1 + message + Completion.part2)
+ Completion.last_msg_id = message_json['id']
+ yield message_json['delta']
+
+ except Empty:
+ pass
+
+ @staticmethod
+ def handle_stream_response(response):
+        Completion.message_queue.put(response.decode(errors='replace'))
+
+ @staticmethod
+ def get_response(prompt: str, proxy: Optional[str] = None) -> str:
+ response_list = []
+ for message in Completion.create(prompt, proxy):
+ response_list.append(message)
+ return ''.join(response_list)
+
diff --git a/g4f/.v1/gpt4free/theb/theb_test.py b/g4f/.v1/gpt4free/theb/theb_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..c57d5c62f65ef9c7255b2de2dfefaa0b3fa3b18f
--- /dev/null
+++ b/g4f/.v1/gpt4free/theb/theb_test.py
@@ -0,0 +1,4 @@
+import theb
+
+for token in theb.Completion.create('hello world'):
+ print(token, end='', flush=True)
diff --git a/g4f/.v1/gpt4free/usesless/README.md b/g4f/.v1/gpt4free/usesless/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..7b2ea16953ffd12b048fa38c0d3f60907aacca30
--- /dev/null
+++ b/g4f/.v1/gpt4free/usesless/README.md
@@ -0,0 +1,33 @@
+ai.usesless.com
+
+### Example: `usesless`
+
+### Token generation
+This will create an `account.json` file containing your email and token in JSON format.
+
+```python
+from gpt4free import usesless
+
+
+token = usesless.Account.create(logging=True)
+print(token)
+```
+
+### Completion
+Insert the token from `account.json`:
+
+```python
+from gpt4free import usesless
+
+message_id = ""
+token = "<token from account.json>"  # or: usesless.Account.create(logging=True)
+while True:
+ prompt = input("Question: ")
+ if prompt == "!stop":
+ break
+
+ req = usesless.Completion.create(prompt=prompt, parentMessageId=message_id, token=token)
+
+ print(f"Answer: {req['text']}")
+ message_id = req["id"]
+```
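+
+Since `Account.create` saves the credentials to `account.json`, the token can also be read back from that file instead of being pasted in by hand. A minimal sketch, assuming the file layout described above:
+
+```python
+import json
+
+# read the token previously saved by usesless.Account.create()
+with open("account.json") as f:
+    token = json.load(f)["token"]
+```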
diff --git a/g4f/.v1/gpt4free/usesless/__init__.py b/g4f/.v1/gpt4free/usesless/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..00f7f75d0e33d106a514a98fdb20234fbf80d6c2
--- /dev/null
+++ b/g4f/.v1/gpt4free/usesless/__init__.py
@@ -0,0 +1,158 @@
+import string
+import time
+import re
+import json
+import requests
+import fake_useragent
+import random
+from password_generator import PasswordGenerator
+
+from .utils import create_email, check_email
+
+
+class Account:
+ @staticmethod
+ def create(logging: bool = False):
+ is_custom_domain = input(
+ "Do you want to use your custom domain name for temporary email? [Y/n]: "
+ ).upper()
+
+ if is_custom_domain == "Y":
+ mail_address = create_email(custom_domain=True, logging=logging)
+ elif is_custom_domain == "N":
+ mail_address = create_email(custom_domain=False, logging=logging)
+ else:
+ print("Please, enter either Y or N")
+ return
+
+ name = string.ascii_lowercase + string.digits
+ username = "".join(random.choice(name) for i in range(20))
+
+ pwo = PasswordGenerator()
+ pwo.minlen = 8
+ password = pwo.generate()
+
+ session = requests.Session()
+
+ register_url = "https://ai.usesless.com/api/cms/auth/local/register"
+ register_json = {
+ "username": username,
+ "password": password,
+ "email": mail_address,
+ }
+ headers = {
+ "authority": "ai.usesless.com",
+ "accept": "application/json, text/plain, */*",
+ "accept-language": "en-US,en;q=0.5",
+ "cache-control": "no-cache",
+ "sec-fetch-dest": "empty",
+ "sec-fetch-mode": "cors",
+ "sec-fetch-site": "same-origin",
+ "user-agent": fake_useragent.UserAgent().random,
+ }
+ register = session.post(register_url, json=register_json, headers=headers)
+ if logging:
+ if register.status_code == 200:
+ print("Registered successfully")
+ else:
+ print(register.status_code)
+ print(register.json())
+ print("There was a problem with account registration, try again")
+
+ if register.status_code != 200:
+ quit()
+
+ while True:
+ time.sleep(5)
+ messages = check_email(mail=mail_address, logging=logging)
+
+            # Check that `check_email()` returned at least one message.
+            if not messages:
+                # Mailbox still empty: loop again (the 5-second sleep happens at the top of the loop).
+                continue
+
+ message_text = messages[0]["content"]
+ verification_url = re.findall(
+ r"http:\/\/ai\.usesless\.com\/api\/cms\/auth\/email-confirmation\?confirmation=\w.+\w\w",
+ message_text,
+ )[0]
+ if verification_url:
+ break
+
+ session.get(verification_url)
+ login_json = {"identifier": mail_address, "password": password}
+ login_request = session.post(
+ url="https://ai.usesless.com/api/cms/auth/local", json=login_json
+ )
+
+ token = login_request.json()["jwt"]
+ if logging and token:
+ print(f"Token: {token}")
+
+ with open("account.json", "w") as file:
+ json.dump({"email": mail_address, "token": token}, file)
+ if logging:
+ print(
+ "\nNew account credentials has been successfully saved in 'account.json' file"
+ )
+
+ return token
+
+
+class Completion:
+ @staticmethod
+ def create(
+ token: str,
+ systemMessage: str = "You are a helpful assistant",
+ prompt: str = "",
+ parentMessageId: str = "",
+ presence_penalty: float = 1,
+ temperature: float = 1,
+ model: str = "gpt-3.5-turbo",
+ ):
+ headers = {
+ "authority": "ai.usesless.com",
+ "accept": "application/json, text/plain, */*",
+ "accept-language": "en-US,en;q=0.5",
+ "cache-control": "no-cache",
+ "sec-fetch-dest": "empty",
+ "sec-fetch-mode": "cors",
+ "sec-fetch-site": "same-origin",
+ "user-agent": fake_useragent.UserAgent().random,
+ "Authorization": f"Bearer {token}",
+ }
+
+ json_data = {
+ "openaiKey": "",
+ "prompt": prompt,
+ "options": {
+ "parentMessageId": parentMessageId,
+ "systemMessage": systemMessage,
+ "completionParams": {
+ "presence_penalty": presence_penalty,
+ "temperature": temperature,
+ "model": model,
+ },
+ },
+ }
+
+ url = "https://ai.usesless.com/api/chat-process"
+ request = requests.post(url, headers=headers, json=json_data)
+ request.encoding = request.apparent_encoding
+ content = request.content
+
+ response = Completion.__response_to_json(content)
+ return response
+
+
+ @classmethod
+    def __response_to_json(cls, text):
+ text = str(text.decode("utf-8"))
+
+ split_text = text.rsplit("\n", 1)
+ if len(split_text) > 1:
+ to_json = json.loads(split_text[1])
+ return to_json
+ else:
+ return None
+
diff --git a/g4f/.v1/gpt4free/usesless/account.json b/g4f/.v1/gpt4free/usesless/account.json
new file mode 100644
index 0000000000000000000000000000000000000000..53a210ac00b53aa1f81f36465d1f274a540f6b90
--- /dev/null
+++ b/g4f/.v1/gpt4free/usesless/account.json
@@ -0,0 +1 @@
+{"email": "enganese-test-email@1secmail.net", "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZCI6Mzg1MDEsImlhdCI6MTY4NTExMDgzOSwiZXhwIjoxNzE2NjY4NDM5fQ.jfEQOFWYQP2Xx4U-toorPg3nh31mxl3L0D2hRROmjZA"}
\ No newline at end of file
diff --git a/g4f/.v1/gpt4free/usesless/account_creation.py b/g4f/.v1/gpt4free/usesless/account_creation.py
new file mode 100644
index 0000000000000000000000000000000000000000..0581945372f5c08918e6cf5df2a528f52c93cc00
--- /dev/null
+++ b/g4f/.v1/gpt4free/usesless/account_creation.py
@@ -0,0 +1,3 @@
+import usesless
+
+usesless.Account.create(logging=True)
diff --git a/g4f/.v1/gpt4free/usesless/test.py b/g4f/.v1/gpt4free/usesless/test.py
new file mode 100644
index 0000000000000000000000000000000000000000..ade1e0c52cd7d2443051baa8bf8a02baa1a2cc94
--- /dev/null
+++ b/g4f/.v1/gpt4free/usesless/test.py
@@ -0,0 +1,10 @@
+# Fix by @enganese
+# Import Account class from __init__.py file
+from gpt4free import usesless
+
+# Create Account and enable logging to see all the log messages (it's very interesting, try it!)
+# New account credentials will be automatically saved in account.json file in such template: {"email": "username@1secmail.com", "token": "token here"}
+token = usesless.Account.create(logging=True)
+
+# Print the new token
+print(token)
diff --git a/g4f/.v1/gpt4free/usesless/utils/__init__.py b/g4f/.v1/gpt4free/usesless/utils/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..818c605d2d82680f2014ba8f1a3bb66d0ae741b1
--- /dev/null
+++ b/g4f/.v1/gpt4free/usesless/utils/__init__.py
@@ -0,0 +1,139 @@
+import requests
+import random
+import string
+import time
+import sys
+import re
+import os
+
+
+def check_email(mail, logging: bool = False):
+ username = mail.split("@")[0]
+ domain = mail.split("@")[1]
+ reqLink = f"https://www.1secmail.com/api/v1/?action=getMessages&login={username}&domain={domain}"
+ req = requests.get(reqLink)
+ req.encoding = req.apparent_encoding
+ req = req.json()
+
+ length = len(req)
+
+ if logging:
+ os.system("cls" if os.name == "nt" else "clear")
+ time.sleep(1)
+ print("Your temporary mail:", mail)
+
+    # Initialize up front so an empty mailbox returns [] instead of raising NameError.
+    messages = []
+
+    if logging and length == 0:
+        print(
+            "Mailbox is empty. Hold tight. Mailbox is refreshed automatically every 5 seconds.",
+        )
+    else:
+        id_list = []
+
+ for i in req:
+ for k, v in i.items():
+ if k == "id":
+ id_list.append(v)
+
+ x = "mails" if length > 1 else "mail"
+
+ if logging:
+ print(
+ f"Mailbox has {length} {x}. (Mailbox is refreshed automatically every 5 seconds.)"
+ )
+
+ for i in id_list:
+ msgRead = f"https://www.1secmail.com/api/v1/?action=readMessage&login={username}&domain={domain}&id={i}"
+ req = requests.get(msgRead)
+ req.encoding = req.apparent_encoding
+ req = req.json()
+
+ for k, v in req.items():
+ if k == "from":
+ sender = v
+ if k == "subject":
+ subject = v
+ if k == "date":
+ date = v
+ if k == "textBody":
+ content = v
+
+ if logging:
+ print(
+ "Sender:",
+ sender,
+ "\nTo:",
+ mail,
+ "\nSubject:",
+ subject,
+ "\nDate:",
+ date,
+ "\nContent:",
+ content,
+ "\n",
+ )
+ messages.append(
+ {
+ "sender": sender,
+ "to": mail,
+ "subject": subject,
+ "date": date,
+ "content": content,
+ }
+ )
+
+ if logging:
+ os.system("cls" if os.name == "nt" else "clear")
+ return messages
+
+
+def create_email(custom_domain: bool = False, logging: bool = False):
+ domainList = ["1secmail.com", "1secmail.net", "1secmail.org"]
+ domain = random.choice(domainList)
+ try:
+ if custom_domain:
+ custom_domain = input(
+ "\nIf you enter 'my-test-email' as your domain name, mail address will look like this: 'my-test-email@1secmail.com'"
+ "\nEnter the name that you wish to use as your domain name: "
+ )
+
+ newMail = f"https://www.1secmail.com/api/v1/?login={custom_domain}&domain={domain}"
+ reqMail = requests.get(newMail)
+ reqMail.encoding = reqMail.apparent_encoding
+
+ username = re.search(r"login=(.*)&", newMail).group(1)
+ domain = re.search(r"domain=(.*)", newMail).group(1)
+ mail = f"{username}@{domain}"
+
+ if logging:
+ print("\nYour temporary email was created successfully:", mail)
+ return mail
+
+ else:
+ name = string.ascii_lowercase + string.digits
+ random_username = "".join(random.choice(name) for i in range(10))
+ newMail = f"https://www.1secmail.com/api/v1/?login={random_username}&domain={domain}"
+
+ reqMail = requests.get(newMail)
+ reqMail.encoding = reqMail.apparent_encoding
+
+ username = re.search(r"login=(.*)&", newMail).group(1)
+ domain = re.search(r"domain=(.*)", newMail).group(1)
+ mail = f"{username}@{domain}"
+
+ if logging:
+ print("\nYour temporary email was created successfully:", mail)
+ return mail
+
+ except KeyboardInterrupt:
+ requests.post(
+ "https://www.1secmail.com/mailbox",
+ data={
+ "action": "deleteMailbox",
+ "login": f"{username}",
+ "domain": f"{domain}",
+ },
+ )
+ if logging:
+ print("\nKeyboard Interrupt Detected! \nTemporary mail was disposed!")
+ os.system("cls" if os.name == "nt" else "clear")
diff --git a/g4f/.v1/gpt4free/you/README.md b/g4f/.v1/gpt4free/you/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..e1917c6dc153a0aff2ab1e0ec5093cc55b5b77e1
--- /dev/null
+++ b/g4f/.v1/gpt4free/you/README.md
@@ -0,0 +1,38 @@
+### Example: `you` (use like openai pypi package)
+
+```python
+
+from gpt4free import you
+
+# simple request with links and details
+response = you.Completion.create(
+ prompt="hello world",
+ detailed=True,
+    include_links=True,
+)
+
+print(response.dict())
+
+# {
+#     "text": "...",
+#     "links": [...],
+#     "extra": {...}
+# }
+
+# chatbot
+
+chat = []
+
+while True:
+ prompt = input("You: ")
+ if prompt == 'q':
+ break
+ response = you.Completion.create(
+ prompt=prompt,
+ chat=chat)
+
+ print("Bot:", response.text)
+
+ chat.append({"question": prompt, "answer": response.text})
+```
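+
+`you.Completion.create` also accepts an optional `proxy` argument (see `you/__init__.py`). A minimal sketch, assuming an HTTP proxy is reachable at 127.0.0.1:8080 (the address is only an example):
+
+```python
+from gpt4free import you
+
+# route the request through an HTTP proxy given as "host:port"
+response = you.Completion.create(prompt="hello world", proxy="127.0.0.1:8080")
+print(response.text)
+```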
diff --git a/g4f/.v1/gpt4free/you/__init__.py b/g4f/.v1/gpt4free/you/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..11847fb5e5bf54caaf181e8bbe9e88b01f971a7c
--- /dev/null
+++ b/g4f/.v1/gpt4free/you/__init__.py
@@ -0,0 +1,127 @@
+import json
+import re
+from typing import Optional, List, Dict, Any
+from uuid import uuid4
+
+from fake_useragent import UserAgent
+from pydantic import BaseModel
+from requests import RequestException
+from retrying import retry
+from tls_client import Session
+from tls_client.response import Response
+
+
+class YouResponse(BaseModel):
+ text: Optional[str] = None
+ links: List[str] = []
+ extra: Dict[str, Any] = {}
+
+
+class Completion:
+ @staticmethod
+ def create(
+ prompt: str,
+ page: int = 1,
+ count: int = 10,
+ safe_search: str = 'Moderate',
+ on_shopping_page: bool = False,
+ mkt: str = '',
+ response_filter: str = 'WebPages,Translations,TimeZone,Computation,RelatedSearches',
+ domain: str = 'youchat',
+ query_trace_id: str = None,
+ chat: list = None,
+ include_links: bool = False,
+ detailed: bool = False,
+ debug: bool = False,
+ proxy: Optional[str] = None,
+ ) -> YouResponse:
+ if chat is None:
+ chat = []
+
+ proxies = {'http': 'http://' + proxy, 'https': 'http://' + proxy} if proxy else {}
+
+ client = Session(client_identifier='chrome_108')
+ client.headers = Completion.__get_headers()
+ client.proxies = proxies
+
+ params = {
+ 'q': prompt,
+ 'page': page,
+ 'count': count,
+ 'safeSearch': safe_search,
+ 'onShoppingPage': on_shopping_page,
+ 'mkt': mkt,
+ 'responseFilter': response_filter,
+ 'domain': domain,
+ 'queryTraceId': str(uuid4()) if query_trace_id is None else query_trace_id,
+            'chat': str(chat),  # list of {'question': ..., 'answer': ...} dicts
+ }
+
+ try:
+ response = Completion.__make_request(client, params)
+ except Exception:
+ return Completion.__get_failure_response()
+
+ if debug:
+ print('\n\n------------------\n\n')
+ print(response.text)
+ print('\n\n------------------\n\n')
+
+ you_chat_serp_results = re.search(
+ r'(?<=event: youChatSerpResults\ndata:)(.*\n)*?(?=event: )', response.text
+ ).group()
+ third_party_search_results = re.search(
+ r'(?<=event: thirdPartySearchResults\ndata:)(.*\n)*?(?=event: )', response.text
+ ).group()
+ # slots = findall(r"slots\ndata: (.*)\n\nevent", response.text)[0]
+
+ text = ''.join(re.findall(r'{\"youChatToken\": \"(.*?)\"}', response.text))
+
+ extra = {
+ 'youChatSerpResults': json.loads(you_chat_serp_results),
+ # 'slots' : loads(slots)
+ }
+
+ response = YouResponse(text=text.replace('\\n', '\n').replace('\\\\', '\\').replace('\\"', '"'))
+ if include_links:
+ response.links = json.loads(third_party_search_results)['search']['third_party_search_results']
+
+ if detailed:
+ response.extra = extra
+
+ return response
+
+ @staticmethod
+ def __get_headers() -> dict:
+ return {
+ 'authority': 'you.com',
+ 'accept': 'text/event-stream',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'cache-control': 'no-cache',
+ 'referer': 'https://you.com/search?q=who+are+you&tbm=youchat',
+ 'sec-ch-ua': '"Not_A Brand";v="99", "Google Chrome";v="109", "Chromium";v="109"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'cookie': f'safesearch_guest=Moderate; uuid_guest={str(uuid4())}',
+ 'user-agent': UserAgent().random,
+ }
+
+ @staticmethod
+ def __get_failure_response() -> YouResponse:
+ return YouResponse(text='Unable to fetch the response, Please try again.')
+
+ @staticmethod
+ @retry(
+ wait_fixed=5000,
+ stop_max_attempt_number=5,
+ retry_on_exception=lambda e: isinstance(e, RequestException),
+ )
+ def __make_request(client: Session, params: dict) -> Response:
+        response = client.get('https://you.com/api/streamingSearch', params=params)
+ if 'youChatToken' not in response.text:
+ print('retry')
+ raise RequestException('Unable to get the response from server')
+ return response
diff --git a/g4f/.v1/gui/README.md b/g4f/.v1/gui/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..c0406216bd922ab43eb2241496a816e7b747d0de
--- /dev/null
+++ b/g4f/.v1/gui/README.md
@@ -0,0 +1,78 @@
+# gpt4free gui
+
+This code provides a Graphical User Interface (GUI) for gpt4free. Users can ask questions and get answers from GPT-4 APIs, utilizing multiple API implementations. The project contains two different Streamlit applications: `streamlit_app.py` and `streamlit_chat_app.py`.
+
+In addition, a new GUI script implemented with PyWebIO has been added; it can be found in the pywebio-gui folder. If you run into errors with the Streamlit version, you can try the PyWebIO version instead.
+
+Installation
+------------
+
+1. Clone the repository.
+2. Install the required dependencies with: `pip install -r requirements.txt`.
+3. To use `streamlit_chat_app.py`, note that it depends on a pull request (PR #24) from the https://github.com/AI-Yash/st-chat/ repository, which may change in the future. The current dependency build can be found at https://github.com/AI-Yash/st-chat/archive/refs/pull/24/head.zip (see the install command below).
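+
+For example, the pinned st-chat build referenced in step 3 can be installed directly from that archive:
+
+```bash
+pip install https://github.com/AI-Yash/st-chat/archive/refs/pull/24/head.zip
+```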
+
+Analytics Disclaimer
+-----
+The Streamlit browser app collects substantial analytics even when running locally. This includes events for every page load and form submission, metadata about queries (such as their length), and browser and client information including host IPs. All of this is transmitted to a third-party analytics provider, Segment.com.
+
+Usage
+-----
+
+Choose one of the Streamlit applications to run:
+
+### streamlit\_app.py
+
+This application provides a simple interface for asking GPT-4 questions and receiving answers.
+
+To run the application:
+
+```bash
+streamlit run gui/streamlit_app.py
+```
+
+preview:
+
+### streamlit\_chat\_app.py
+
+This application provides a chat-like interface for asking GPT-4 questions and receiving answers. It supports multiple query methods, and users can select the desired API for their queries. The application also maintains a conversation history.
+
+To run the application:
+
+```bash
+streamlit run streamlit_chat_app.py
+```
+
+preview:
+
+Contributing
+------------
+
+Feel free to submit pull requests, report bugs, or request new features by opening issues on the GitHub repository.
+
+Bug
+----
+There is a bug in `streamlit_chat_app.py` that has not been pinpointed yet; it is probably simple, but there has not been time to track it down. Whenever you open a new conversation or access an old one, it only starts answering prompts after the second time you submit text in the input field. Other than that, everything seems to work as expected.
+
+License
+-------
+
+This project is licensed under the MIT License.
diff --git a/g4f/.v1/gui/__init__.py b/g4f/.v1/gui/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/g4f/.v1/gui/image1.png b/g4f/.v1/gui/image1.png
new file mode 100644
index 0000000000000000000000000000000000000000..fb84ae7fef5d4a65759fca381e3bf220aab44eaa
Binary files /dev/null and b/g4f/.v1/gui/image1.png differ
diff --git a/g4f/.v1/gui/image2.png b/g4f/.v1/gui/image2.png
new file mode 100644
index 0000000000000000000000000000000000000000..c5fc91c9ac32baf53529d355bc11a7fd37215905
Binary files /dev/null and b/g4f/.v1/gui/image2.png differ
diff --git a/g4f/.v1/gui/pywebio-gui/README.md b/g4f/.v1/gui/pywebio-gui/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..2b99c075d507dbf128a170d2975b1b22b393a70e
--- /dev/null
+++ b/g4f/.v1/gui/pywebio-gui/README.md
@@ -0,0 +1,24 @@
+# GUI with PyWebIO
+Simple, fast, and less error-prone. It only requires:
+```bash
+pip install gpt4free
+pip install pywebio
+```
+Double-clicking 'pywebio-usesless.py' (or running it with Python, as shown below) will start it.
+
+PS: Currently, only 'usesless' is implemented, and the GUI is expected to be updated infrequently, with a focus on stability.
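+
+Equivalently, it can be started from a terminal; per the `start_server(...)` call at the bottom of the script, it serves on port 8099 and tries to open a browser tab automatically:
+
+```bash
+python pywebio-usesless.py
+# then visit http://localhost:8099 if no browser tab opens
+```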
+
+↓ The same introduction in zh-Hans-CN follows below.
+
+# 使用pywebio实现的极简GUI
+简单,快捷,报错少
+只需要
+```bash
+pip install gpt4free
+pip install pywebio
+```
+
+双击pywebio-usesless.py即可运行
+
+ps:目前仅实现usesless,这个gui更新频率应该会比较少,目的是追求稳定
diff --git a/g4f/.v1/gui/pywebio-gui/pywebio-usesless.py b/g4f/.v1/gui/pywebio-gui/pywebio-usesless.py
new file mode 100644
index 0000000000000000000000000000000000000000..177fa7c86ebfd4b4f3b38033ee31863fec3df794
--- /dev/null
+++ b/g4f/.v1/gui/pywebio-gui/pywebio-usesless.py
@@ -0,0 +1,59 @@
+from gpt4free import usesless
+import time
+from pywebio import start_server,config
+from pywebio.input import *
+from pywebio.output import *
+from pywebio.session import local
+message_id = ""
+def status():
+ try:
+ req = usesless.Completion.create(prompt="hello", parentMessageId=message_id)
+ print(f"Answer: {req['text']}")
+ put_success(f"Answer: {req['text']}",scope="body")
+ except:
+ put_error("Program Error",scope="body")
+
+def ask(prompt):
+ req = usesless.Completion.create(prompt=prompt, parentMessageId=local.message_id)
+ rp=req['text']
+ local.message_id=req["id"]
+ print("AI:\n"+rp)
+ local.conversation.extend([
+ {"role": "user", "content": prompt},
+ {"role": "assistant", "content": rp}
+ ])
+ print(local.conversation)
+ return rp
+
+def msg():
+ while True:
+ text= input_group("You:",[textarea('You:',name='text',rows=3, placeholder='请输入问题')])
+ if not(bool(text)):
+ break
+ if not(bool(text["text"])):
+ continue
+ time.sleep(0.5)
+ put_code("You:"+text["text"],scope="body")
+ print("Question:"+text["text"])
+ with use_scope('foot'):
+ put_loading(color="info")
+ rp= ask(text["text"])
+ clear(scope="foot")
+ time.sleep(0.5)
+ put_markdown("Bot:\n"+rp,scope="body")
+ time.sleep(0.7)
+
+@config(title="AIchat",theme="dark")
+def main():
+ put_scope("heads")
+ with use_scope('heads'):
+ put_html("AI Chat
")
+ put_scope("body")
+ put_scope("foot")
+ status()
+ local.conversation=[]
+ local.message_id=""
+ msg()
+
+print("Click link to chat page")
+start_server(main, port=8099,allowed_origins="*",auto_open_webbrowser=True,debug=True)
diff --git a/g4f/.v1/gui/query_methods.py b/g4f/.v1/gui/query_methods.py
new file mode 100644
index 0000000000000000000000000000000000000000..2d6adacd3b394183c65ab596cc148c45de6b63c4
--- /dev/null
+++ b/g4f/.v1/gui/query_methods.py
@@ -0,0 +1,100 @@
+import os
+import sys
+from typing import Optional
+
+sys.path.append(os.path.join(os.path.dirname(__file__), os.path.pardir))
+
+from gpt4free import quora, forefront, theb, you
+import random
+
+
+def query_forefront(question: str, proxy: Optional[str] = None) -> str:
+ # create an account
+ token = forefront.Account.create(logging=False, proxy=proxy)
+
+ response = ""
+ # get a response
+ try:
+ return forefront.Completion.create(token=token, prompt='hello world', model='gpt-4', proxy=proxy).text
+ except Exception as e:
+ # Return error message if an exception occurs
+ return (
+ f'An error occurred: {e}. Please make sure you are using a valid cloudflare clearance token and user agent.'
+ )
+
+
+def query_quora(question: str, proxy: Optional[str] = None) -> str:
+ token = quora.Account.create(logging=False, enable_bot_creation=True, proxy=proxy)
+ return quora.Completion.create(model='gpt-4', prompt=question, token=token, proxy=proxy).text
+
+
+def query_theb(question: str, proxy: Optional[str] = None) -> str:
+ # Set cloudflare clearance cookie and get answer from GPT-4 model
+ response = ""
+ try:
+ return ''.join(theb.Completion.create(prompt=question, proxy=proxy))
+
+ except Exception as e:
+ # Return error message if an exception occurs
+ return (
+ f'An error occurred: {e}. Please make sure you are using a valid cloudflare clearance token and user agent.'
+ )
+
+
+def query_you(question: str, proxy: Optional[str] = None) -> str:
+ # Set cloudflare clearance cookie and get answer from GPT-4 model
+ try:
+ result = you.Completion.create(prompt=question, proxy=proxy)
+ return result.text
+
+ except Exception as e:
+ # Return error message if an exception occurs
+ return (
+ f'An error occurred: {e}. Please make sure you are using a valid cloudflare clearance token and user agent.'
+ )
+
+
+# Define a dictionary containing all query methods
+avail_query_methods = {
+ "Forefront": query_forefront,
+ "Poe": query_quora,
+ "Theb": query_theb,
+ "You": query_you,
+ # "Writesonic": query_writesonic,
+ # "T3nsor": query_t3nsor,
+ # "Phind": query_phind,
+ # "Ora": query_ora,
+}
+
+
+def query(user_input: str, selected_method: str = "Random", proxy: Optional[str] = None) -> str:
+ # If a specific query method is selected (not "Random") and the method is in the dictionary, try to call it
+ if selected_method != "Random" and selected_method in avail_query_methods:
+ try:
+ return avail_query_methods[selected_method](user_input, proxy=proxy)
+ except Exception as e:
+ print(f"Error with {selected_method}: {e}")
+ return "😵 Sorry, some error occurred please try again."
+
+ # Initialize variables for determining success and storing the result
+ success = False
+ result = "😵 Sorry, some error occurred please try again."
+ # Create a list of available query methods
+ query_methods_list = list(avail_query_methods.values())
+
+ # Continue trying different methods until a successful result is obtained or all methods have been tried
+ while not success and query_methods_list:
+ # Choose a random method from the list
+ chosen_query = random.choice(query_methods_list)
+ # Find the name of the chosen method
+ chosen_query_name = [k for k, v in avail_query_methods.items() if v == chosen_query][0]
+ try:
+ # Try to call the chosen method with the user input
+ result = chosen_query(user_input, proxy=proxy)
+ success = True
+ except Exception as e:
+ print(f"Error with {chosen_query_name}: {e}")
+ # Remove the failed method from the list of available methods
+ query_methods_list.remove(chosen_query)
+
+ return result
diff --git a/g4f/.v1/gui/streamlit_app.py b/g4f/.v1/gui/streamlit_app.py
new file mode 100644
index 0000000000000000000000000000000000000000..2dba0a7b672470f20aa163005d42894dd17df7c0
--- /dev/null
+++ b/g4f/.v1/gui/streamlit_app.py
@@ -0,0 +1,52 @@
+import os
+import sys
+
+sys.path.append(os.path.join(os.path.dirname(__file__), os.path.pardir))
+
+import streamlit as st
+from gpt4free import you
+
+
+def get_answer(question: str) -> str:
+ # Set cloudflare clearance cookie and get answer from GPT-4 model
+ try:
+ result = you.Completion.create(prompt=question)
+
+ return result.text
+
+ except Exception as e:
+ # Return error message if an exception occurs
+ return (
+ f'An error occurred: {e}. Please make sure you are using a valid cloudflare clearance token and user agent.'
+ )
+
+
+# Set page configuration and add header
+st.set_page_config(
+ page_title="gpt4freeGUI",
+ initial_sidebar_state="expanded",
+ page_icon="🧠",
+ menu_items={
+ 'Get Help': 'https://github.com/xtekky/gpt4free/blob/main/README.md',
+ 'Report a bug': "https://github.com/xtekky/gpt4free/issues",
+        'About': "### gpt4free GUI",
+ },
+)
+st.header('GPT4free GUI')
+
+# Add text area for user input and button to get answer
+question_text_area = st.text_area('🤖 Ask Any Question :', placeholder='Explain quantum computing in 50 words')
+if st.button('🧠 Think'):
+ answer = get_answer(question_text_area)
+ escaped = answer.encode('utf-8').decode('unicode-escape')
+ # Display answer
+ st.caption("Answer :")
+ st.markdown(escaped)
+
+# Hide Streamlit footer
+hide_streamlit_style = """
+
+ """
+st.markdown(hide_streamlit_style, unsafe_allow_html=True)
diff --git a/g4f/.v1/gui/streamlit_chat_app.py b/g4f/.v1/gui/streamlit_chat_app.py
new file mode 100644
index 0000000000000000000000000000000000000000..af3969e6fa682b0b237b0a7b8c228de83e7209ae
--- /dev/null
+++ b/g4f/.v1/gui/streamlit_chat_app.py
@@ -0,0 +1,156 @@
+import atexit
+import Levenshtein
+import os
+import sys
+
+sys.path.append(os.path.join(os.path.dirname(__file__), os.path.pardir))
+
+import streamlit as st
+from streamlit_chat import message
+from query_methods import query, avail_query_methods
+import pickle
+
+conversations_file = "conversations.pkl"
+
+def load_conversations():
+ try:
+ with open(conversations_file, "rb") as f:
+ return pickle.load(f)
+ except FileNotFoundError:
+ return []
+ except EOFError:
+ return []
+
+
+def save_conversations(conversations, current_conversation):
+ updated = False
+ for idx, conversation in enumerate(conversations):
+ if conversation == current_conversation:
+ conversations[idx] = current_conversation
+ updated = True
+ break
+ if not updated:
+ conversations.append(current_conversation)
+
+ temp_conversations_file = "temp_" + conversations_file
+ with open(temp_conversations_file, "wb") as f:
+ pickle.dump(conversations, f)
+
+ os.replace(temp_conversations_file, conversations_file)
+
+def delete_conversation(conversations, current_conversation):
+    # Drop the conversation from the list (if present) before persisting.
+    if current_conversation in conversations:
+        conversations.remove(current_conversation)
+
+ temp_conversations_file = "temp_" + conversations_file
+ with open(temp_conversations_file, "wb") as f:
+ pickle.dump(conversations, f)
+
+ os.replace(temp_conversations_file, conversations_file)
+
+def exit_handler():
+ print("Exiting, saving data...")
+ # Perform cleanup operations here, like saving data or closing open files.
+ save_conversations(st.session_state.conversations, st.session_state.current_conversation)
+
+
+# Register the exit_handler function to be called when the program is closing.
+atexit.register(exit_handler)
+
+st.header("Chat Placeholder")
+
+if 'conversations' not in st.session_state:
+ st.session_state['conversations'] = load_conversations()
+
+if 'input_text' not in st.session_state:
+ st.session_state['input_text'] = ''
+
+if 'selected_conversation' not in st.session_state:
+ st.session_state['selected_conversation'] = None
+
+if 'input_field_key' not in st.session_state:
+ st.session_state['input_field_key'] = 0
+
+if 'query_method' not in st.session_state:
+    st.session_state['query_method'] = "Random"
+
+if 'search_query' not in st.session_state:
+ st.session_state['search_query'] = ''
+
+# Initialize new conversation
+if 'current_conversation' not in st.session_state or st.session_state['current_conversation'] is None:
+ st.session_state['current_conversation'] = {'user_inputs': [], 'generated_responses': []}
+
+input_placeholder = st.empty()
+user_input = input_placeholder.text_input(
+ 'You:', value=st.session_state['input_text'], key=f'input_text_-1'#{st.session_state["input_field_key"]}
+)
+submit_button = st.button("Submit")
+
+if (user_input and user_input != st.session_state['input_text']) or submit_button:
+ output = query(user_input, st.session_state['query_method'])
+
+ escaped_output = output.encode('utf-8').decode('unicode-escape')
+
+ st.session_state['current_conversation']['user_inputs'].append(user_input)
+ st.session_state.current_conversation['generated_responses'].append(escaped_output)
+ save_conversations(st.session_state.conversations, st.session_state.current_conversation)
+ st.session_state['input_text'] = ''
+ st.session_state['input_field_key'] += 1 # Increment key value for new widget
+ user_input = input_placeholder.text_input(
+ 'You:', value=st.session_state['input_text'], key=f'input_text_{st.session_state["input_field_key"]}'
+ ) # Clear the input field
+
+# Add a button to create a new conversation
+if st.sidebar.button("New Conversation"):
+ st.session_state['selected_conversation'] = None
+ st.session_state['current_conversation'] = {'user_inputs': [], 'generated_responses': []}
+ st.session_state['input_field_key'] += 1 # Increment key value for new widget
+ st.session_state['query_method'] = st.sidebar.selectbox("Select API:", options=avail_query_methods, index=0)
+
+# Proxy
+st.session_state['proxy'] = st.sidebar.text_input("Proxy: ")
+
+# Searchbar
+search_query = st.sidebar.text_input("Search Conversations:", value=st.session_state.get('search_query', ''), key='search')
+
+if search_query:
+ filtered_conversations = []
+ indices = []
+ for idx, conversation in enumerate(st.session_state.conversations):
+ if search_query in conversation['user_inputs'][0]:
+ filtered_conversations.append(conversation)
+ indices.append(idx)
+
+ filtered_conversations = list(zip(indices, filtered_conversations))
+ conversations = sorted(filtered_conversations, key=lambda x: Levenshtein.distance(search_query, x[1]['user_inputs'][0]))
+
+ sidebar_header = f"Search Results ({len(conversations)})"
+else:
+ conversations = st.session_state.conversations
+ sidebar_header = "Conversation History"
+
+# Sidebar
+st.sidebar.header(sidebar_header)
+sidebar_col1, sidebar_col2 = st.sidebar.columns([5,1])
+for idx, conversation in enumerate(conversations):
+ if sidebar_col1.button(f"Conversation {idx + 1}: {conversation['user_inputs'][0]}", key=f"sidebar_btn_{idx}"):
+ st.session_state['selected_conversation'] = idx
+ st.session_state['current_conversation'] = conversation
+ if sidebar_col2.button('🗑️', key=f"sidebar_btn_delete_{idx}"):
+ if st.session_state['selected_conversation'] == idx:
+ st.session_state['selected_conversation'] = None
+ st.session_state['current_conversation'] = {'user_inputs': [], 'generated_responses': []}
+ delete_conversation(conversations, conversation)
+ st.experimental_rerun()
+if st.session_state['selected_conversation'] is not None:
+ conversation_to_display = conversations[st.session_state['selected_conversation']]
+else:
+ conversation_to_display = st.session_state.current_conversation
+
+if conversation_to_display['generated_responses']:
+ for i in range(len(conversation_to_display['generated_responses']) - 1, -1, -1):
+ message(conversation_to_display["generated_responses"][i], key=f"display_generated_{i}")
+ message(conversation_to_display['user_inputs'][i], is_user=True, key=f"display_user_{i}")
\ No newline at end of file
diff --git a/g4f/.v1/poetry.lock b/g4f/.v1/poetry.lock
new file mode 100644
index 0000000000000000000000000000000000000000..1612dffe43989377b66cb50b65d46d7aa89c1e7b
--- /dev/null
+++ b/g4f/.v1/poetry.lock
@@ -0,0 +1,1816 @@
+# This file is automatically @generated by Poetry 1.5.0 and should not be changed by hand.
+
+[[package]]
+name = "altair"
+version = "4.2.2"
+description = "Altair: A declarative statistical visualization library for Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "altair-4.2.2-py3-none-any.whl", hash = "sha256:8b45ebeaf8557f2d760c5c77b79f02ae12aee7c46c27c06014febab6f849bc87"},
+ {file = "altair-4.2.2.tar.gz", hash = "sha256:39399a267c49b30d102c10411e67ab26374156a84b1aeb9fcd15140429ba49c5"},
+]
+
+[package.dependencies]
+entrypoints = "*"
+jinja2 = "*"
+jsonschema = ">=3.0"
+numpy = "*"
+pandas = ">=0.18"
+toolz = "*"
+
+[package.extras]
+dev = ["black", "docutils", "flake8", "ipython", "m2r", "mistune (<2.0.0)", "pytest", "recommonmark", "sphinx", "vega-datasets"]
+
+[[package]]
+name = "async-generator"
+version = "1.10"
+description = "Async generators and context managers for Python 3.5+"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "async_generator-1.10-py3-none-any.whl", hash = "sha256:01c7bf666359b4967d2cda0000cc2e4af16a0ae098cbffcb8472fb9e8ad6585b"},
+ {file = "async_generator-1.10.tar.gz", hash = "sha256:6ebb3d106c12920aaae42ccb6f787ef5eefdcdd166ea3d628fa8476abe712144"},
+]
+
+[[package]]
+name = "attrs"
+version = "23.1.0"
+description = "Classes Without Boilerplate"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
+ {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
+]
+
+[package.extras]
+cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
+dev = ["attrs[docs,tests]", "pre-commit"]
+docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
+tests = ["attrs[tests-no-zope]", "zope-interface"]
+tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+
+[[package]]
+name = "blinker"
+version = "1.6.2"
+description = "Fast, simple object-to-object and broadcast signaling"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "blinker-1.6.2-py3-none-any.whl", hash = "sha256:c3d739772abb7bc2860abf5f2ec284223d9ad5c76da018234f6f50d6f31ab1f0"},
+ {file = "blinker-1.6.2.tar.gz", hash = "sha256:4afd3de66ef3a9f8067559fb7a1cbe555c17dcbe15971b05d1b625c3e7abe213"},
+]
+
+[[package]]
+name = "cachetools"
+version = "5.3.0"
+description = "Extensible memoizing collections and decorators"
+optional = false
+python-versions = "~=3.7"
+files = [
+ {file = "cachetools-5.3.0-py3-none-any.whl", hash = "sha256:429e1a1e845c008ea6c85aa35d4b98b65d6a9763eeef3e37e92728a12d1de9d4"},
+ {file = "cachetools-5.3.0.tar.gz", hash = "sha256:13dfddc7b8df938c21a940dfa6557ce6e94a2f1cdfa58eb90c805721d58f2c14"},
+]
+
+[[package]]
+name = "certifi"
+version = "2023.5.7"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.5.7-py3-none-any.whl", hash = "sha256:c6c2e98f5c7869efca1f8916fed228dd91539f9f1b444c314c06eef02980c716"},
+ {file = "certifi-2023.5.7.tar.gz", hash = "sha256:0f0d56dc5a6ad56fd4ba36484d6cc34451e1c6548c61daad8c320169f91eddc7"},
+]
+
+[[package]]
+name = "cffi"
+version = "1.15.1"
+description = "Foreign Function Interface for Python calling C code."
+optional = false
+python-versions = "*"
+files = [
+ {file = "cffi-1.15.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:a66d3508133af6e8548451b25058d5812812ec3798c886bf38ed24a98216fab2"},
+ {file = "cffi-1.15.1-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:470c103ae716238bbe698d67ad020e1db9d9dba34fa5a899b5e21577e6d52ed2"},
+ {file = "cffi-1.15.1-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:9ad5db27f9cabae298d151c85cf2bad1d359a1b9c686a275df03385758e2f914"},
+ {file = "cffi-1.15.1-cp27-cp27m-win32.whl", hash = "sha256:b3bbeb01c2b273cca1e1e0c5df57f12dce9a4dd331b4fa1635b8bec26350bde3"},
+ {file = "cffi-1.15.1-cp27-cp27m-win_amd64.whl", hash = "sha256:e00b098126fd45523dd056d2efba6c5a63b71ffe9f2bbe1a4fe1716e1d0c331e"},
+ {file = "cffi-1.15.1-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:d61f4695e6c866a23a21acab0509af1cdfd2c013cf256bbf5b6b5e2695827162"},
+ {file = "cffi-1.15.1-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:ed9cb427ba5504c1dc15ede7d516b84757c3e3d7868ccc85121d9310d27eed0b"},
+ {file = "cffi-1.15.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:39d39875251ca8f612b6f33e6b1195af86d1b3e60086068be9cc053aa4376e21"},
+ {file = "cffi-1.15.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:285d29981935eb726a4399badae8f0ffdff4f5050eaa6d0cfc3f64b857b77185"},
+ {file = "cffi-1.15.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3eb6971dcff08619f8d91607cfc726518b6fa2a9eba42856be181c6d0d9515fd"},
+ {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21157295583fe8943475029ed5abdcf71eb3911894724e360acff1d61c1d54bc"},
+ {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5635bd9cb9731e6d4a1132a498dd34f764034a8ce60cef4f5319c0541159392f"},
+ {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2012c72d854c2d03e45d06ae57f40d78e5770d252f195b93f581acf3ba44496e"},
+ {file = "cffi-1.15.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd86c085fae2efd48ac91dd7ccffcfc0571387fe1193d33b6394db7ef31fe2a4"},
+ {file = "cffi-1.15.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:fa6693661a4c91757f4412306191b6dc88c1703f780c8234035eac011922bc01"},
+ {file = "cffi-1.15.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:59c0b02d0a6c384d453fece7566d1c7e6b7bae4fc5874ef2ef46d56776d61c9e"},
+ {file = "cffi-1.15.1-cp310-cp310-win32.whl", hash = "sha256:cba9d6b9a7d64d4bd46167096fc9d2f835e25d7e4c121fb2ddfc6528fb0413b2"},
+ {file = "cffi-1.15.1-cp310-cp310-win_amd64.whl", hash = "sha256:ce4bcc037df4fc5e3d184794f27bdaab018943698f4ca31630bc7f84a7b69c6d"},
+ {file = "cffi-1.15.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3d08afd128ddaa624a48cf2b859afef385b720bb4b43df214f85616922e6a5ac"},
+ {file = "cffi-1.15.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:3799aecf2e17cf585d977b780ce79ff0dc9b78d799fc694221ce814c2c19db83"},
+ {file = "cffi-1.15.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a591fe9e525846e4d154205572a029f653ada1a78b93697f3b5a8f1f2bc055b9"},
+ {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3548db281cd7d2561c9ad9984681c95f7b0e38881201e157833a2342c30d5e8c"},
+ {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:91fc98adde3d7881af9b59ed0294046f3806221863722ba7d8d120c575314325"},
+ {file = "cffi-1.15.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:94411f22c3985acaec6f83c6df553f2dbe17b698cc7f8ae751ff2237d96b9e3c"},
+ {file = "cffi-1.15.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:03425bdae262c76aad70202debd780501fabeaca237cdfddc008987c0e0f59ef"},
+ {file = "cffi-1.15.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cc4d65aeeaa04136a12677d3dd0b1c0c94dc43abac5860ab33cceb42b801c1e8"},
+ {file = "cffi-1.15.1-cp311-cp311-win32.whl", hash = "sha256:a0f100c8912c114ff53e1202d0078b425bee3649ae34d7b070e9697f93c5d52d"},
+ {file = "cffi-1.15.1-cp311-cp311-win_amd64.whl", hash = "sha256:04ed324bda3cda42b9b695d51bb7d54b680b9719cfab04227cdd1e04e5de3104"},
+ {file = "cffi-1.15.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50a74364d85fd319352182ef59c5c790484a336f6db772c1a9231f1c3ed0cbd7"},
+ {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e263d77ee3dd201c3a142934a086a4450861778baaeeb45db4591ef65550b0a6"},
+ {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cec7d9412a9102bdc577382c3929b337320c4c4c4849f2c5cdd14d7368c5562d"},
+ {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4289fc34b2f5316fbb762d75362931e351941fa95fa18789191b33fc4cf9504a"},
+ {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:173379135477dc8cac4bc58f45db08ab45d228b3363adb7af79436135d028405"},
+ {file = "cffi-1.15.1-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:6975a3fac6bc83c4a65c9f9fcab9e47019a11d3d2cf7f3c0d03431bf145a941e"},
+ {file = "cffi-1.15.1-cp36-cp36m-win32.whl", hash = "sha256:2470043b93ff09bf8fb1d46d1cb756ce6132c54826661a32d4e4d132e1977adf"},
+ {file = "cffi-1.15.1-cp36-cp36m-win_amd64.whl", hash = "sha256:30d78fbc8ebf9c92c9b7823ee18eb92f2e6ef79b45ac84db507f52fbe3ec4497"},
+ {file = "cffi-1.15.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:198caafb44239b60e252492445da556afafc7d1e3ab7a1fb3f0584ef6d742375"},
+ {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5ef34d190326c3b1f822a5b7a45f6c4535e2f47ed06fec77d3d799c450b2651e"},
+ {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8102eaf27e1e448db915d08afa8b41d6c7ca7a04b7d73af6514df10a3e74bd82"},
+ {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5df2768244d19ab7f60546d0c7c63ce1581f7af8b5de3eb3004b9b6fc8a9f84b"},
+ {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8c4917bd7ad33e8eb21e9a5bbba979b49d9a97acb3a803092cbc1133e20343c"},
+ {file = "cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2642fe3142e4cc4af0799748233ad6da94c62a8bec3a6648bf8ee68b1c7426"},
+ {file = "cffi-1.15.1-cp37-cp37m-win32.whl", hash = "sha256:e229a521186c75c8ad9490854fd8bbdd9a0c9aa3a524326b55be83b54d4e0ad9"},
+ {file = "cffi-1.15.1-cp37-cp37m-win_amd64.whl", hash = "sha256:a0b71b1b8fbf2b96e41c4d990244165e2c9be83d54962a9a1d118fd8657d2045"},
+ {file = "cffi-1.15.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:320dab6e7cb2eacdf0e658569d2575c4dad258c0fcc794f46215e1e39f90f2c3"},
+ {file = "cffi-1.15.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e74c6b51a9ed6589199c787bf5f9875612ca4a8a0785fb2d4a84429badaf22a"},
+ {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5c84c68147988265e60416b57fc83425a78058853509c1b0629c180094904a5"},
+ {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3b926aa83d1edb5aa5b427b4053dc420ec295a08e40911296b9eb1b6170f6cca"},
+ {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:87c450779d0914f2861b8526e035c5e6da0a3199d8f1add1a665e1cbc6fc6d02"},
+ {file = "cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f2c9f67e9821cad2e5f480bc8d83b8742896f1242dba247911072d4fa94c192"},
+ {file = "cffi-1.15.1-cp38-cp38-win32.whl", hash = "sha256:8b7ee99e510d7b66cdb6c593f21c043c248537a32e0bedf02e01e9553a172314"},
+ {file = "cffi-1.15.1-cp38-cp38-win_amd64.whl", hash = "sha256:00a9ed42e88df81ffae7a8ab6d9356b371399b91dbdf0c3cb1e84c03a13aceb5"},
+ {file = "cffi-1.15.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:54a2db7b78338edd780e7ef7f9f6c442500fb0d41a5a4ea24fff1c929d5af585"},
+ {file = "cffi-1.15.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:fcd131dd944808b5bdb38e6f5b53013c5aa4f334c5cad0c72742f6eba4b73db0"},
+ {file = "cffi-1.15.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7473e861101c9e72452f9bf8acb984947aa1661a7704553a9f6e4baa5ba64415"},
+ {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6c9a799e985904922a4d207a94eae35c78ebae90e128f0c4e521ce339396be9d"},
+ {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3bcde07039e586f91b45c88f8583ea7cf7a0770df3a1649627bf598332cb6984"},
+ {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33ab79603146aace82c2427da5ca6e58f2b3f2fb5da893ceac0c42218a40be35"},
+ {file = "cffi-1.15.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5d598b938678ebf3c67377cdd45e09d431369c3b1a5b331058c338e201f12b27"},
+ {file = "cffi-1.15.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:db0fbb9c62743ce59a9ff687eb5f4afbe77e5e8403d6697f7446e5f609976f76"},
+ {file = "cffi-1.15.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:98d85c6a2bef81588d9227dde12db8a7f47f639f4a17c9ae08e773aa9c697bf3"},
+ {file = "cffi-1.15.1-cp39-cp39-win32.whl", hash = "sha256:40f4774f5a9d4f5e344f31a32b5096977b5d48560c5592e2f3d2c4374bd543ee"},
+ {file = "cffi-1.15.1-cp39-cp39-win_amd64.whl", hash = "sha256:70df4e3b545a17496c9b3f41f5115e69a4f2e77e94e1d2a8e1070bc0c38c8a3c"},
+ {file = "cffi-1.15.1.tar.gz", hash = "sha256:d400bfb9a37b1351253cb402671cea7e89bdecc294e8016a707f6d1d8ac934f9"},
+]
+
+[package.dependencies]
+pycparser = "*"
+
+[[package]]
+name = "charset-normalizer"
+version = "3.1.0"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.1.0.tar.gz", hash = "sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-win32.whl", hash = "sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448"},
+ {file = "charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-win32.whl", hash = "sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909"},
+ {file = "charset_normalizer-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-win32.whl", hash = "sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974"},
+ {file = "charset_normalizer-3.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-win32.whl", hash = "sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0"},
+ {file = "charset_normalizer-3.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-win32.whl", hash = "sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1"},
+ {file = "charset_normalizer-3.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b"},
+ {file = "charset_normalizer-3.1.0-py3-none-any.whl", hash = "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d"},
+]
+
+[[package]]
+name = "click"
+version = "8.1.3"
+description = "Composable command line interface toolkit"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
+ {file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"},
+]
+
+[package.dependencies]
+colorama = {version = "*", markers = "platform_system == \"Windows\""}
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+description = "Cross-platform colored terminal text."
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
+files = [
+ {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
+ {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
+]
+
+[[package]]
+name = "curl-cffi"
+version = "0.5.6"
+description = "libcurl ffi bindings for Python, with impersonation support"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "curl_cffi-0.5.6-cp37-abi3-macosx_10_9_x86_64.whl", hash = "sha256:d134cf4d78d070d6822e7f3fe29492d80b84af0befd6e2e7d5969b9ff37dc916"},
+ {file = "curl_cffi-0.5.6-cp37-abi3-macosx_11_0_arm64.whl", hash = "sha256:c5dcf9f6a128780f8b9aad81df4d091bbd3e1a51da6991ed594bd3fcdb7f867a"},
+ {file = "curl_cffi-0.5.6-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:00e8c83b0dfde19c3470e5a07350bc8124db11723ef4e6d346cd634bb30ebc42"},
+ {file = "curl_cffi-0.5.6-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7a9df9fabff038f1ac9e7e6f32b5edb5d8df8c2eec64f53f513de1766c17ffdb"},
+ {file = "curl_cffi-0.5.6-cp37-abi3-win_amd64.whl", hash = "sha256:dcadfe0bbba3955626db2834221e146b0b9909d1601e91a0051e9e5dabb8fb1e"},
+ {file = "curl_cffi-0.5.6.tar.gz", hash = "sha256:30eea55149bd66dbb11aa467e3b4e039085bfac38da7fb8ae694425d9b7061da"},
+]
+
+[package.dependencies]
+cffi = ">=1.12.0"
+
+[package.extras]
+build = ["cibuildwheel", "wheel"]
+dev = ["autoflake (==1.4)", "black (==22.8.0)", "coverage (==6.4.1)", "cryptography (==38.0.3)", "flake8 (==6.0.0)", "flake8-bugbear (==22.7.1)", "flake8-pie (==0.15.0)", "httpx (==0.23.1)", "isort (==5.10.1)", "mypy (==0.971)", "pytest (==7.1.2)", "pytest-asyncio (==0.19.0)", "pytest-trio (==0.7.0)", "trio (==0.21.0)", "trio-typing (==0.7.0)", "trustme (==0.9.0)", "types-certifi (==2021.10.8.2)", "uvicorn (==0.18.3)"]
+test = ["cryptography (==38.0.3)", "httpx (==0.23.1)", "pytest (==7.1.2)", "pytest-asyncio (==0.19.0)", "pytest-trio (==0.7.0)", "trio (==0.21.0)", "trio-typing (==0.7.0)", "trustme (==0.9.0)", "types-certifi (==2021.10.8.2)", "uvicorn (==0.18.3)"]
+
+[[package]]
+name = "decorator"
+version = "5.1.1"
+description = "Decorators for Humans"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "decorator-5.1.1-py3-none-any.whl", hash = "sha256:b8c3f85900b9dc423225913c5aace94729fe1fa9763b38939a95226f02d37186"},
+ {file = "decorator-5.1.1.tar.gz", hash = "sha256:637996211036b6385ef91435e4fae22989472f9d571faba8927ba8253acbc330"},
+]
+
+[[package]]
+name = "entrypoints"
+version = "0.4"
+description = "Discover and load entry points from installed packages."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "entrypoints-0.4-py3-none-any.whl", hash = "sha256:f174b5ff827504fd3cd97cc3f8649f3693f51538c7e4bdf3ef002c8429d42f9f"},
+ {file = "entrypoints-0.4.tar.gz", hash = "sha256:b706eddaa9218a19ebcd67b56818f05bb27589b1ca9e8d797b74affad4ccacd4"},
+]
+
+[[package]]
+name = "exceptiongroup"
+version = "1.1.1"
+description = "Backport of PEP 654 (exception groups)"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "exceptiongroup-1.1.1-py3-none-any.whl", hash = "sha256:232c37c63e4f682982c8b6459f33a8981039e5fb8756b2074364e5055c498c9e"},
+ {file = "exceptiongroup-1.1.1.tar.gz", hash = "sha256:d484c3090ba2889ae2928419117447a14daf3c1231d5e30d0aae34f354f01785"},
+]
+
+[package.extras]
+test = ["pytest (>=6)"]
+
+[[package]]
+name = "fake-useragent"
+version = "1.1.3"
+description = "Up-to-date simple useragent faker with real world database"
+optional = false
+python-versions = "*"
+files = [
+ {file = "fake-useragent-1.1.3.tar.gz", hash = "sha256:1c06f0aa7d6e4894b919b30b9c7ebd72ff497325191057fbb5df3d5db06b93fc"},
+ {file = "fake_useragent-1.1.3-py3-none-any.whl", hash = "sha256:695d3b1bf7d11d04ab0f971fb73b0ca8de98b78bbadfbc8bacbc9a48423f7531"},
+]
+
+[[package]]
+name = "gitdb"
+version = "4.0.10"
+description = "Git Object Database"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "gitdb-4.0.10-py3-none-any.whl", hash = "sha256:c286cf298426064079ed96a9e4a9d39e7f3e9bf15ba60701e95f5492f28415c7"},
+ {file = "gitdb-4.0.10.tar.gz", hash = "sha256:6eb990b69df4e15bad899ea868dc46572c3f75339735663b81de79b06f17eb9a"},
+]
+
+[package.dependencies]
+smmap = ">=3.0.1,<6"
+
+[[package]]
+name = "gitpython"
+version = "3.1.31"
+description = "GitPython is a Python library used to interact with Git repositories"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "GitPython-3.1.31-py3-none-any.whl", hash = "sha256:f04893614f6aa713a60cbbe1e6a97403ef633103cdd0ef5eb6efe0deb98dbe8d"},
+ {file = "GitPython-3.1.31.tar.gz", hash = "sha256:8ce3bcf69adfdf7c7d503e78fd3b1c492af782d58893b650adb2ac8912ddd573"},
+]
+
+[package.dependencies]
+gitdb = ">=4.0.1,<5"
+
+[[package]]
+name = "h11"
+version = "0.14.0"
+description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761"},
+ {file = "h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d"},
+]
+
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "importlib-metadata"
+version = "6.6.0"
+description = "Read metadata from Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "importlib_metadata-6.6.0-py3-none-any.whl", hash = "sha256:43dd286a2cd8995d5eaef7fee2066340423b818ed3fd70adf0bad5f1fac53fed"},
+ {file = "importlib_metadata-6.6.0.tar.gz", hash = "sha256:92501cdf9cc66ebd3e612f1b4f0c0765dfa42f0fa38ffb319b6bd84dd675d705"},
+]
+
+[package.dependencies]
+zipp = ">=0.5"
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
+perf = ["ipython"]
+testing = ["flake8 (<5)", "flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)", "pytest-perf (>=0.9.2)"]
+
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
+[[package]]
+name = "jsonschema"
+version = "4.17.3"
+description = "An implementation of JSON Schema validation for Python"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "jsonschema-4.17.3-py3-none-any.whl", hash = "sha256:a870ad254da1a8ca84b6a2905cac29d265f805acc57af304784962a2aa6508f6"},
+ {file = "jsonschema-4.17.3.tar.gz", hash = "sha256:0f864437ab8b6076ba6707453ef8f98a6a0d512a80e93f8abdb676f737ecb60d"},
+]
+
+[package.dependencies]
+attrs = ">=17.4.0"
+pyrsistent = ">=0.14.0,<0.17.0 || >0.17.0,<0.17.1 || >0.17.1,<0.17.2 || >0.17.2"
+
+[package.extras]
+format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
+format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
+
+[[package]]
+name = "levenshtein"
+version = "0.21.0"
+description = "Python extension for computing string edit distances and similarities."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "Levenshtein-0.21.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1f19fe25ea0dd845d0f48505e8947f6080728e10b7642ba0dad34e9b48c81130"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d23c647b03acbb5783f9bdfd51cfa5365d51f7df9f4029717a35eff5cc32bbcc"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:668ea30b311944c643f866ce5e45edf346f05e920075c0056f2ba7f74dde6071"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0f42b8dba2cce257cd34efd1ce9678d06f3248cb0bb2a92a5db8402e1e4a6f30"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8c27a5178ce322b56527a451185b4224217aa81955d9b0dad6f5a8de81ffe80f"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:92bf2370b01d7a4862abf411f8f60f39f064cebebce176e3e9ee14e744db8288"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:32dfda2e64d0c50553e47d0ab2956413970f940253351c196827ad46f17916d5"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f55623094b665d79a3b82ba77386ac34fa85049163edfe65387063e5127d4184"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:25576ad9c337ecb342306fe87166b54b2f49e713d4ff592c752cc98e0046296e"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:fae24c875c4ecc8c5f34a9715eb2a459743b4ca21d35c51819b640ee2f71cb51"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:4b2156f32e46d16b74a055ccb4f64ee3c64399372a6aaf1ee98f6dccfadecee1"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:04046878a57129da4e2352c032df7c1fceaa54870916d12772cad505ef998290"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:608beb1683508c3cdbfff669c1c872ea02b47965e1bbb8a630de548e2490f96a"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-win32.whl", hash = "sha256:cc36ba40027b4f8821155c9e3e0afadffccdccbe955556039d1d1169dfc659c9"},
+ {file = "Levenshtein-0.21.0-cp310-cp310-win_amd64.whl", hash = "sha256:80e67bd73a05592ecd52aede4afa8ea49575de70f9d5bfbe2c52ebd3541b20be"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:3305262cb85ff78ace9e2d8d2dfc029b34dc5f93aa2d24fd20b6ed723e2ad501"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:023ca95c833ca548280e444e9a4c34fdecb3be3851e96af95bad290ae0c708b9"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:8476862a5c3150b8d63a7475563a4bff6dc50bbc0447894eb6b6a116ced0809d"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0236c8ff4648c50ebd81ac3692430d2241b134936ac9d86d7ca32ba6ab4a4e63"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cfbc4ed7ee2965e305bf81388fea377b795dabc82ee07f04f31d1fb8677a885"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6338a47b6f8c7f1ee8b5636cc8b245ad2d1d0ee47f7bb6f33f38a522ef0219cc"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4dc79033140f82acaca40712a6d26ed190cc2dd403e104020a87c24f2771aa72"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88ccdc8dc20c16e8059ace00fb58d353346a04fd24c0733b009678b2554801d2"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:8c031cbe3685b0343f5cc2dcf2172fd21b82f8ccc5c487179a895009bf0e4ea8"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eab6c253983a6659e749f4c44fcc2215194c2e00bf7b1c5e90fe683ea3b7b00f"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:8bdbcd1570340b07549f71e8a5ba3f0a6d84408bf86c4051dc7b70a29ae342bb"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4357bf8146cbadb10016ad3a950bba16e042f79015362a575f966181d95b4bc7"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:be038321695267a8faa5ae1b1a83deb3748827f0b6f72471e0beed36afcbd72a"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-win32.whl", hash = "sha256:be87998ffcbb5fb0c37a76d100f63b4811f48527192677da0ec3624b49ab8a64"},
+ {file = "Levenshtein-0.21.0-cp311-cp311-win_amd64.whl", hash = "sha256:f873af54014cac12082c7f5ccec6bbbeb5b57f63466e7f9c61a34588621313fb"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:4ec2ef9836a34a3bb009a81e5efe4d9d43515455fb5f182c5d2cf8ae61c79496"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e748c2349719cb1bc90f802d9d7f07310633dcf166d468a5bd821f78ed17698"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:37a99d858fa1d88b1a917b4059a186becd728534e5e889d583086482356b7ca1"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:742b785c93d16c63289902607219c200bd2b6077dafc788073c74337cae382fb"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cefd5a668f6d7af1279aca10104b43882fdd83f9bdc68933ba5429257a628abe"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3e1723d515ab287b9b2c2e4a111894dc6b474f5d28826fff379647486cae98d2"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:3e22d31375d5fea5797c9b7aa0f8cc36579c31dcf5754e9931ca86c27d9011f8"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:31cb59d86a5f99147cd4a67ebced8d6df574b5d763dcb63c033a642e29568746"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:76d5d34a8e21de8073c66ae801f053520f946d499fa533fbba654712775f8132"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:46dab8c6e8fae563ca77acfaeb3824c4dd4b599996328b8a081b06f16befa6a0"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:ee62ec5882a857b252faffeb7867679f7e418052ca6bf7d6b56099f6498a2b0e"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-win32.whl", hash = "sha256:7e40a4bac848c9a8883225f926cfa7b2bc9f651e989a8b7006cdb596edc7ac9b"},
+ {file = "Levenshtein-0.21.0-cp36-cp36m-win_amd64.whl", hash = "sha256:709a727f58d31a5ee1e5e83b247972fe55ef0014f6222256c9692c5efa471785"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:01dd427cf72b4978b09558e3d36e3f92c8eef467e3eb4653c3fdccd8d70aaa08"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9ff1255c499fcb41ba37a578ad8c1b8dab5c44f78941b8e1c1d7fab5b5e831bc"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8aa92b05156dfa2e248c3743670d5deb41a45b5789416d5fa31be009f4f043ab"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d932cb21e40beb93cfc8973de7f25fbf25ba4a07d1dccac3b9ba977164cf9887"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8d4ba0df46bb41d660d77e7cc6b4d38c8d5b6f977d51c48ed1217db6a8474cde"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c8eaaa6f0df2838437d1d8739629486b145f7a3405d3ef0874301a9f5bc7dcd"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:6ede583155f24c8b2456a7720fbbfa5d9c1154ae04b4da3cf63368e2406ea099"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:a18c8e4d1aae3f9950797d049020c64a8a63cc8b4e43afcca91ec400bf6304c5"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:8cf87a5e2962431d7260dd81dc1ca0697f61aad81036145d3666f4c0d514ce3a"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:bd0bfa71b1441be359e99e77709885b79c22857bf9bb7f4e84c09e501f6c5fad"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:e9a6251818b9eb6d519bffd7a0b745f3a99b3e99563a4c9d3cad26e34f6ac880"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-win32.whl", hash = "sha256:8dd8ef4239b24fb1c9f0b536e48e55194d5966d351d349af23e67c9eb3875c68"},
+ {file = "Levenshtein-0.21.0-cp37-cp37m-win_amd64.whl", hash = "sha256:26c6fb012538a245d78adea786d2cfe3c1506b835762c1c523a4ed6b9e08dc0b"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:b0ba9723c7d67a61e160b3457259552f7d679d74aaa144b892eb68b7e2a5ebb6"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:426883be613d912495cf6ee2a776d2ab84aa6b3de5a8d82c43a994267ea6e0e3"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:5369827ace536c6df04e0e670d782999bc17bf9eb111e77435fdcdaecb10c2a3"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6ebabcf982ae161534f8729d13fe05eebc977b497ac34936551f97cf8b07dd9e"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:13e8a5b1b58de49befea555bb913dc394614f2d3553bc5b86bc672c69ef1a85a"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d647f1e0c30c7a73f70f4de7376ed7dafc2b856b67fe480d32a81af133edbaeb"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5378a8139ba61d7271c0f9350201259c11eb90bfed0ac45539c4aeaed3907230"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:df9b0f8f511270ad259c7bfba22ab6d5a0c33d81cd594461668e67cd80dd9052"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9485f2a5c88113410153256657072bc93b81bf5c8690d47e4cc3df58135dbadb"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:aa39bb773915e4df330d311bb6c100a8613e265cc50d5b25b015c8db824e1c47"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:de2dfd6498454c7d89036d56a53c0a01fd9bcf1c2970253e469b5e8bb938b69f"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:4515f9511cb91c66d254ee30154206aad76b57d8b25f64ba1402aad43efdb251"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:f622f542bd065ffec7d26b26d44d0c9a25c9c1295fd8ba6e4d77778e2293a12c"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-win32.whl", hash = "sha256:ee757fd36bad66ad8b961958840894021ecaad22194f65219a666432739393ff"},
+ {file = "Levenshtein-0.21.0-cp38-cp38-win_amd64.whl", hash = "sha256:457442911df185e28a32fd8b788b14ca22ab3a552256b556e7687173d5f18bc4"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b646ace5085a60d4f89b28c81301c9d9e8cd6a9bdda908181b2fa3dfac7fc10d"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:0cc3679978cd0250bf002963cf2e08855b93f70fa0fc9f74956115c343983fbb"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:84b55b732e311629a8308ad2778a0f9824e29e3c35987eb35610fc52eb6d4634"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a7adaabe07c5ceb6228332b9184f06eb9cda89c227d198a1b8a6f78c05b3c672"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac8b6266799645827980ab1af4e0bfae209c1f747a10bdf6e5da96a6ebe511a2"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c2d67220867d640e36931b3d63b8349369b485d52cf6f4a2635bec8da92d678"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb26e69fc6c12534fbaa1657efed3b6482f1a166ba8e31227fa6f6f062a59070"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a68b05614d25cc2a5fbcc4d2fd124be7668d075fd5ac3d82f292eec573157361"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b167b32b3e336c5ec5e0212f025587f9248344ae6e73ed668270eba5c6a506e5"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:04850a0719e503014acb3fee6d4ec7d7f170a2c7375ffbc5833c7256b7cd10ee"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:ce7e76c6341abb498368d42b8081f2f45c245ac2a221af6a0394349d41302c08"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:ec64b7b3fb95bc9c20c72548277794b81281a6ba9da85eda2c87324c218441ff"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:24843f28cbbdcbcfc18b08e7d3409dbaad7896fb7113442592fa978590a7bbf0"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-win32.whl", hash = "sha256:c290a7211f1b4f87c300df4424cc46b7379cead3b6f37fa8d3e7e6c6212ccd39"},
+ {file = "Levenshtein-0.21.0-cp39-cp39-win_amd64.whl", hash = "sha256:1fde464f937878e6f5c30c234b95ce2cb969331a175b3089367e077113428062"},
+ {file = "Levenshtein-0.21.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:66d303cd485710fe6d62108209219b7a695bdd10a722f4e86abdaf26f4bf2202"},
+ {file = "Levenshtein-0.21.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7bc550d0986ace95bde003b8a60e622449baf2bdf24d8412f7a50f401a289ec3"},
+ {file = "Levenshtein-0.21.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c6858cfd84568bc1df3ad545553b5c27af6ed3346973e8f4b57d23c318cf8f4"},
+ {file = "Levenshtein-0.21.0-pp37-pypy37_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7ce3f14a8e006fb7e3fc7bab965ab7da5817f48fc48d25cf735fcec8f1d2e39a"},
+ {file = "Levenshtein-0.21.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:024302c82d49fc1f1d044794997ef7aa9d01b509a9040e222480b64a01cd4b80"},
+ {file = "Levenshtein-0.21.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:e043b79e39f165026bc941c95582bfc4bfdd297a1de6f13ace0d0a7abf486288"},
+ {file = "Levenshtein-0.21.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8446f8da38857482ec0cfd616fe5e7dcd3695fd323cc65f37366a9ff6a31c9cb"},
+ {file = "Levenshtein-0.21.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:587ad51770de41eb491bea1bfb676abc7ff9a94dbec0e2bc51fc6a25abef99c4"},
+ {file = "Levenshtein-0.21.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8ac4ed77d3263eac7f9b6ed89d451644332aecd55cda921201e348803a1e5c57"},
+ {file = "Levenshtein-0.21.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:cf2dee0f8c71598f8be51e3feceb9142ac01576277b9e691e25740987761c86e"},
+ {file = "Levenshtein-0.21.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4bbceef2caba4b2ae613b0e853a7aaab990c1a13bddb9054ba1328a84bccdbf7"},
+ {file = "Levenshtein-0.21.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2290732763e3b75979888364b26acce79d72b8677441b5762a4e97b3630cc3d9"},
+ {file = "Levenshtein-0.21.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db7567997ffbc2feb999e30002a92461a76f17a596a142bdb463b5f7037f160c"},
+ {file = "Levenshtein-0.21.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c270487d60b33102efea73be6dcd5835f3ddc3dc06e77499f0963df6cba2ec71"},
+ {file = "Levenshtein-0.21.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:e2686c37d22faf27d02a19e83b55812d248b32b7ba3aa638e768d0ea032e1f3c"},
+ {file = "Levenshtein-0.21.0.tar.gz", hash = "sha256:545635d9e857711d049dcdb0b8609fb707b34b032517376c531ca159fcd46265"},
+]
+
+[package.dependencies]
+rapidfuzz = ">=2.3.0,<4.0.0"
+
+[[package]]
+name = "mailgw-temporary-email"
+version = "0.0.2"
+description = "10 Minute Temporary Email"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "MailGw Temporary Email-0.0.2.tar.gz", hash = "sha256:894f3e1a9b64f33bc720f5faaf69dd95cbf190c054036bdf202470e4f43e6bad"},
+ {file = "MailGw_Temporary_Email-0.0.2-py3-none-any.whl", hash = "sha256:961b476d8968a368e0399dc7f4ae01d62ab1844c586f19bfa1f9ba47763ec05e"},
+]
+
+[package.dependencies]
+requests = "*"
+
+[[package]]
+name = "markdown-it-py"
+version = "2.2.0"
+description = "Python port of markdown-it. Markdown parsing, done right!"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "markdown-it-py-2.2.0.tar.gz", hash = "sha256:7c9a5e412688bc771c67432cbfebcdd686c93ce6484913dccf06cb5a0bea35a1"},
+ {file = "markdown_it_py-2.2.0-py3-none-any.whl", hash = "sha256:5a35f8d1870171d9acc47b99612dc146129b631baf04970128b568f190d0cc30"},
+]
+
+[package.dependencies]
+mdurl = ">=0.1,<1.0"
+
+[package.extras]
+benchmarking = ["psutil", "pytest", "pytest-benchmark"]
+code-style = ["pre-commit (>=3.0,<4.0)"]
+compare = ["commonmark (>=0.9,<1.0)", "markdown (>=3.4,<4.0)", "mistletoe (>=1.0,<2.0)", "mistune (>=2.0,<3.0)", "panflute (>=2.3,<3.0)"]
+linkify = ["linkify-it-py (>=1,<3)"]
+plugins = ["mdit-py-plugins"]
+profiling = ["gprof2dot"]
+rtd = ["attrs", "myst-parser", "pyyaml", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinx_book_theme"]
+testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"]
+
+[[package]]
+name = "markupsafe"
+version = "2.1.2"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:665a36ae6f8f20a4676b53224e33d456a6f5a72657d9c83c2aa00765072f31f7"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:340bea174e9761308703ae988e982005aedf427de816d1afe98147668cc03036"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22152d00bf4a9c7c83960521fc558f55a1adbc0631fbb00a9471e097b19d72e1"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28057e985dace2f478e042eaa15606c7efccb700797660629da387eb289b9323"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca244fa73f50a800cf8c3ebf7fd93149ec37f5cb9596aa8873ae2c1d23498601"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d9d971ec1e79906046aa3ca266de79eac42f1dbf3612a05dc9368125952bd1a1"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7e007132af78ea9df29495dbf7b5824cb71648d7133cf7848a2a5dd00d36f9ff"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7313ce6a199651c4ed9d7e4cfb4aa56fe923b1adf9af3b420ee14e6d9a73df65"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-win32.whl", hash = "sha256:c4a549890a45f57f1ebf99c067a4ad0cb423a05544accaf2b065246827ed9603"},
+ {file = "MarkupSafe-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:835fb5e38fd89328e9c81067fd642b3593c33e1e17e2fdbf77f5676abb14a156"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2ec4f2d48ae59bbb9d1f9d7efb9236ab81429a764dedca114f5fdabbc3788013"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:608e7073dfa9e38a85d38474c082d4281f4ce276ac0010224eaba11e929dd53a"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65608c35bfb8a76763f37036547f7adfd09270fbdbf96608be2bead319728fcd"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2bfb563d0211ce16b63c7cb9395d2c682a23187f54c3d79bfec33e6705473c6"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:da25303d91526aac3672ee6d49a2f3db2d9502a4a60b55519feb1a4c7714e07d"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:9cad97ab29dfc3f0249b483412c85c8ef4766d96cdf9dcf5a1e3caa3f3661cf1"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:085fd3201e7b12809f9e6e9bc1e5c96a368c8523fad5afb02afe3c051ae4afcc"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1bea30e9bf331f3fef67e0a3877b2288593c98a21ccb2cf29b74c581a4eb3af0"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-win32.whl", hash = "sha256:7df70907e00c970c60b9ef2938d894a9381f38e6b9db73c5be35e59d92e06625"},
+ {file = "MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:e55e40ff0cc8cc5c07996915ad367fa47da6b3fc091fdadca7f5403239c5fec3"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a6e40afa7f45939ca356f348c8e23048e02cb109ced1eb8420961b2f40fb373a"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cf877ab4ed6e302ec1d04952ca358b381a882fbd9d1b07cccbfd61783561f98a"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63ba06c9941e46fa389d389644e2d8225e0e3e5ebcc4ff1ea8506dce646f8c8a"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f1cd098434e83e656abf198f103a8207a8187c0fc110306691a2e94a78d0abb2"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:55f44b440d491028addb3b88f72207d71eeebfb7b5dbf0643f7c023ae1fba619"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:a6f2fcca746e8d5910e18782f976489939d54a91f9411c32051b4aab2bd7c513"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:0b462104ba25f1ac006fdab8b6a01ebbfbce9ed37fd37fd4acd70c67c973e460"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-win32.whl", hash = "sha256:7668b52e102d0ed87cb082380a7e2e1e78737ddecdde129acadb0eccc5423859"},
+ {file = "MarkupSafe-2.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6d6607f98fcf17e534162f0709aaad3ab7a96032723d8ac8750ffe17ae5a0666"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a806db027852538d2ad7555b203300173dd1b77ba116de92da9afbc3a3be3eed"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a4abaec6ca3ad8660690236d11bfe28dfd707778e2442b45addd2f086d6ef094"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f03a532d7dee1bed20bc4884194a16160a2de9ffc6354b3878ec9682bb623c54"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4cf06cdc1dda95223e9d2d3c58d3b178aa5dacb35ee7e3bbac10e4e1faacb419"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:22731d79ed2eb25059ae3df1dfc9cb1546691cc41f4e3130fe6bfbc3ecbbecfa"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f8ffb705ffcf5ddd0e80b65ddf7bed7ee4f5a441ea7d3419e861a12eaf41af58"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8db032bf0ce9022a8e41a22598eefc802314e81b879ae093f36ce9ddf39ab1ba"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2298c859cfc5463f1b64bd55cb3e602528db6fa0f3cfd568d3605c50678f8f03"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-win32.whl", hash = "sha256:50c42830a633fa0cf9e7d27664637532791bfc31c731a87b202d2d8ac40c3ea2"},
+ {file = "MarkupSafe-2.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:bb06feb762bade6bf3c8b844462274db0c76acc95c52abe8dbed28ae3d44a147"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:99625a92da8229df6d44335e6fcc558a5037dd0a760e11d84be2260e6f37002f"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8bca7e26c1dd751236cfb0c6c72d4ad61d986e9a41bbf76cb445f69488b2a2bd"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40627dcf047dadb22cd25ea7ecfe9cbf3bbbad0482ee5920b582f3809c97654f"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40dfd3fefbef579ee058f139733ac336312663c6706d1163b82b3003fb1925c4"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:090376d812fb6ac5f171e5938e82e7f2d7adc2b629101cec0db8b267815c85e2"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2e7821bffe00aa6bd07a23913b7f4e01328c3d5cc0b40b36c0bd81d362faeb65"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:c0a33bc9f02c2b17c3ea382f91b4db0e6cde90b63b296422a939886a7a80de1c"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:b8526c6d437855442cdd3d87eede9c425c4445ea011ca38d937db299382e6fa3"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-win32.whl", hash = "sha256:137678c63c977754abe9086a3ec011e8fd985ab90631145dfb9294ad09c102a7"},
+ {file = "MarkupSafe-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:0576fe974b40a400449768941d5d0858cc624e3249dfd1e0c33674e5c7ca7aed"},
+ {file = "MarkupSafe-2.1.2.tar.gz", hash = "sha256:abcabc8c2b26036d62d4c746381a6f7cf60aafcc653198ad678306986b09450d"},
+]
+
+[[package]]
+name = "mdurl"
+version = "0.1.2"
+description = "Markdown URL utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8"},
+ {file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"},
+]
+
+[[package]]
+name = "names"
+version = "0.3.0"
+description = "Generate random names"
+optional = false
+python-versions = "*"
+files = [
+ {file = "names-0.3.0.tar.gz", hash = "sha256:726e46254f2ed03f1ffb5d941dae3bc67c35123941c29becd02d48d0caa2a671"},
+]
+
+[[package]]
+name = "numpy"
+version = "1.24.3"
+description = "Fundamental package for array computing in Python"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "numpy-1.24.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:3c1104d3c036fb81ab923f507536daedc718d0ad5a8707c6061cdfd6d184e570"},
+ {file = "numpy-1.24.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:202de8f38fc4a45a3eea4b63e2f376e5f2dc64ef0fa692838e31a808520efaf7"},
+ {file = "numpy-1.24.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8535303847b89aa6b0f00aa1dc62867b5a32923e4d1681a35b5eef2d9591a463"},
+ {file = "numpy-1.24.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d926b52ba1367f9acb76b0df6ed21f0b16a1ad87c6720a1121674e5cf63e2b6"},
+ {file = "numpy-1.24.3-cp310-cp310-win32.whl", hash = "sha256:f21c442fdd2805e91799fbe044a7b999b8571bb0ab0f7850d0cb9641a687092b"},
+ {file = "numpy-1.24.3-cp310-cp310-win_amd64.whl", hash = "sha256:ab5f23af8c16022663a652d3b25dcdc272ac3f83c3af4c02eb8b824e6b3ab9d7"},
+ {file = "numpy-1.24.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:9a7721ec204d3a237225db3e194c25268faf92e19338a35f3a224469cb6039a3"},
+ {file = "numpy-1.24.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d6cc757de514c00b24ae8cf5c876af2a7c3df189028d68c0cb4eaa9cd5afc2bf"},
+ {file = "numpy-1.24.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:76e3f4e85fc5d4fd311f6e9b794d0c00e7002ec122be271f2019d63376f1d385"},
+ {file = "numpy-1.24.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a1d3c026f57ceaad42f8231305d4653d5f05dc6332a730ae5c0bea3513de0950"},
+ {file = "numpy-1.24.3-cp311-cp311-win32.whl", hash = "sha256:c91c4afd8abc3908e00a44b2672718905b8611503f7ff87390cc0ac3423fb096"},
+ {file = "numpy-1.24.3-cp311-cp311-win_amd64.whl", hash = "sha256:5342cf6aad47943286afa6f1609cad9b4266a05e7f2ec408e2cf7aea7ff69d80"},
+ {file = "numpy-1.24.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7776ea65423ca6a15255ba1872d82d207bd1e09f6d0894ee4a64678dd2204078"},
+ {file = "numpy-1.24.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ae8d0be48d1b6ed82588934aaaa179875e7dc4f3d84da18d7eae6eb3f06c242c"},
+ {file = "numpy-1.24.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ecde0f8adef7dfdec993fd54b0f78183051b6580f606111a6d789cd14c61ea0c"},
+ {file = "numpy-1.24.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4749e053a29364d3452c034827102ee100986903263e89884922ef01a0a6fd2f"},
+ {file = "numpy-1.24.3-cp38-cp38-win32.whl", hash = "sha256:d933fabd8f6a319e8530d0de4fcc2e6a61917e0b0c271fded460032db42a0fe4"},
+ {file = "numpy-1.24.3-cp38-cp38-win_amd64.whl", hash = "sha256:56e48aec79ae238f6e4395886b5eaed058abb7231fb3361ddd7bfdf4eed54289"},
+ {file = "numpy-1.24.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:4719d5aefb5189f50887773699eaf94e7d1e02bf36c1a9d353d9f46703758ca4"},
+ {file = "numpy-1.24.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0ec87a7084caa559c36e0a2309e4ecb1baa03b687201d0a847c8b0ed476a7187"},
+ {file = "numpy-1.24.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea8282b9bcfe2b5e7d491d0bf7f3e2da29700cec05b49e64d6246923329f2b02"},
+ {file = "numpy-1.24.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:210461d87fb02a84ef243cac5e814aad2b7f4be953b32cb53327bb49fd77fbb4"},
+ {file = "numpy-1.24.3-cp39-cp39-win32.whl", hash = "sha256:784c6da1a07818491b0ffd63c6bbe5a33deaa0e25a20e1b3ea20cf0e43f8046c"},
+ {file = "numpy-1.24.3-cp39-cp39-win_amd64.whl", hash = "sha256:d5036197ecae68d7f491fcdb4df90082b0d4960ca6599ba2659957aafced7c17"},
+ {file = "numpy-1.24.3-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:352ee00c7f8387b44d19f4cada524586f07379c0d49270f87233983bc5087ca0"},
+ {file = "numpy-1.24.3-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7d6acc2e7524c9955e5c903160aa4ea083736fde7e91276b0e5d98e6332812"},
+ {file = "numpy-1.24.3-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:35400e6a8d102fd07c71ed7dcadd9eb62ee9a6e84ec159bd48c28235bbb0f8e4"},
+ {file = "numpy-1.24.3.tar.gz", hash = "sha256:ab344f1bf21f140adab8e47fdbc7c35a477dc01408791f8ba00d018dd0bc5155"},
+]
+
+[[package]]
+name = "outcome"
+version = "1.2.0"
+description = "Capture the outcome of Python function calls."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "outcome-1.2.0-py2.py3-none-any.whl", hash = "sha256:c4ab89a56575d6d38a05aa16daeaa333109c1f96167aba8901ab18b6b5e0f7f5"},
+ {file = "outcome-1.2.0.tar.gz", hash = "sha256:6f82bd3de45da303cf1f771ecafa1633750a358436a8bb60e06a1ceb745d2672"},
+]
+
+[package.dependencies]
+attrs = ">=19.2.0"
+
+[[package]]
+name = "packaging"
+version = "23.1"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.1-py3-none-any.whl", hash = "sha256:994793af429502c4ea2ebf6bf664629d07c1a9fe974af92966e4b8d2df7edc61"},
+ {file = "packaging-23.1.tar.gz", hash = "sha256:a392980d2b6cffa644431898be54b0045151319d1e7ec34f0cfed48767dd334f"},
+]
+
+[[package]]
+name = "pandas"
+version = "2.0.1"
+description = "Powerful data structures for data analysis, time series, and statistics"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "pandas-2.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:70a996a1d2432dadedbb638fe7d921c88b0cc4dd90374eab51bb33dc6c0c2a12"},
+ {file = "pandas-2.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:909a72b52175590debbf1d0c9e3e6bce2f1833c80c76d80bd1aa09188be768e5"},
+ {file = "pandas-2.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fe7914d8ddb2d54b900cec264c090b88d141a1eed605c9539a187dbc2547f022"},
+ {file = "pandas-2.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0a514ae436b23a92366fbad8365807fc0eed15ca219690b3445dcfa33597a5cc"},
+ {file = "pandas-2.0.1-cp310-cp310-win32.whl", hash = "sha256:12bd6618e3cc737c5200ecabbbb5eaba8ab645a4b0db508ceeb4004bb10b060e"},
+ {file = "pandas-2.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:2b6fe5f7ce1cba0e74188c8473c9091ead9b293ef0a6794939f8cc7947057abd"},
+ {file = "pandas-2.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:00959a04a1d7bbc63d75a768540fb20ecc9e65fd80744c930e23768345a362a7"},
+ {file = "pandas-2.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:af2449e9e984dfad39276b885271ba31c5e0204ffd9f21f287a245980b0e4091"},
+ {file = "pandas-2.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:910df06feaf9935d05247db6de452f6d59820e432c18a2919a92ffcd98f8f79b"},
+ {file = "pandas-2.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fa0067f2419f933101bdc6001bcea1d50812afbd367b30943417d67fbb99678"},
+ {file = "pandas-2.0.1-cp311-cp311-win32.whl", hash = "sha256:7b8395d335b08bc8b050590da264f94a439b4770ff16bb51798527f1dd840388"},
+ {file = "pandas-2.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:8db5a644d184a38e6ed40feeb12d410d7fcc36648443defe4707022da127fc35"},
+ {file = "pandas-2.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7bbf173d364130334e0159a9a034f573e8b44a05320995127cf676b85fd8ce86"},
+ {file = "pandas-2.0.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6c0853d487b6c868bf107a4b270a823746175b1932093b537b9b76c639fc6f7e"},
+ {file = "pandas-2.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f25e23a03f7ad7211ffa30cb181c3e5f6d96a8e4cb22898af462a7333f8a74eb"},
+ {file = "pandas-2.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e09a53a4fe8d6ae2149959a2d02e1ef2f4d2ceb285ac48f74b79798507e468b4"},
+ {file = "pandas-2.0.1-cp38-cp38-win32.whl", hash = "sha256:a2564629b3a47b6aa303e024e3d84e850d36746f7e804347f64229f8c87416ea"},
+ {file = "pandas-2.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:03e677c6bc9cfb7f93a8b617d44f6091613a5671ef2944818469be7b42114a00"},
+ {file = "pandas-2.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3d099ecaa5b9e977b55cd43cf842ec13b14afa1cfa51b7e1179d90b38c53ce6a"},
+ {file = "pandas-2.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a37ee35a3eb6ce523b2c064af6286c45ea1c7ff882d46e10d0945dbda7572753"},
+ {file = "pandas-2.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:320b180d125c3842c5da5889183b9a43da4ebba375ab2ef938f57bf267a3c684"},
+ {file = "pandas-2.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:18d22cb9043b6c6804529810f492ab09d638ddf625c5dea8529239607295cb59"},
+ {file = "pandas-2.0.1-cp39-cp39-win32.whl", hash = "sha256:90d1d365d77d287063c5e339f49b27bd99ef06d10a8843cf00b1a49326d492c1"},
+ {file = "pandas-2.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:99f7192d8b0e6daf8e0d0fd93baa40056684e4b4aaaef9ea78dff34168e1f2f0"},
+ {file = "pandas-2.0.1.tar.gz", hash = "sha256:19b8e5270da32b41ebf12f0e7165efa7024492e9513fb46fb631c5022ae5709d"},
+]
+
+[package.dependencies]
+numpy = [
+ {version = ">=1.21.0", markers = "python_version >= \"3.10\""},
+ {version = ">=1.23.2", markers = "python_version >= \"3.11\""},
+]
+python-dateutil = ">=2.8.2"
+pytz = ">=2020.1"
+tzdata = ">=2022.1"
+
+[package.extras]
+all = ["PyQt5 (>=5.15.1)", "SQLAlchemy (>=1.4.16)", "beautifulsoup4 (>=4.9.3)", "bottleneck (>=1.3.2)", "brotlipy (>=0.7.0)", "fastparquet (>=0.6.3)", "fsspec (>=2021.07.0)", "gcsfs (>=2021.07.0)", "html5lib (>=1.1)", "hypothesis (>=6.34.2)", "jinja2 (>=3.0.0)", "lxml (>=4.6.3)", "matplotlib (>=3.6.1)", "numba (>=0.53.1)", "numexpr (>=2.7.3)", "odfpy (>=1.4.1)", "openpyxl (>=3.0.7)", "pandas-gbq (>=0.15.0)", "psycopg2 (>=2.8.6)", "pyarrow (>=7.0.0)", "pymysql (>=1.0.2)", "pyreadstat (>=1.1.2)", "pytest (>=7.0.0)", "pytest-asyncio (>=0.17.0)", "pytest-xdist (>=2.2.0)", "python-snappy (>=0.6.0)", "pyxlsb (>=1.0.8)", "qtpy (>=2.2.0)", "s3fs (>=2021.08.0)", "scipy (>=1.7.1)", "tables (>=3.6.1)", "tabulate (>=0.8.9)", "xarray (>=0.21.0)", "xlrd (>=2.0.1)", "xlsxwriter (>=1.4.3)", "zstandard (>=0.15.2)"]
+aws = ["s3fs (>=2021.08.0)"]
+clipboard = ["PyQt5 (>=5.15.1)", "qtpy (>=2.2.0)"]
+compression = ["brotlipy (>=0.7.0)", "python-snappy (>=0.6.0)", "zstandard (>=0.15.2)"]
+computation = ["scipy (>=1.7.1)", "xarray (>=0.21.0)"]
+excel = ["odfpy (>=1.4.1)", "openpyxl (>=3.0.7)", "pyxlsb (>=1.0.8)", "xlrd (>=2.0.1)", "xlsxwriter (>=1.4.3)"]
+feather = ["pyarrow (>=7.0.0)"]
+fss = ["fsspec (>=2021.07.0)"]
+gcp = ["gcsfs (>=2021.07.0)", "pandas-gbq (>=0.15.0)"]
+hdf5 = ["tables (>=3.6.1)"]
+html = ["beautifulsoup4 (>=4.9.3)", "html5lib (>=1.1)", "lxml (>=4.6.3)"]
+mysql = ["SQLAlchemy (>=1.4.16)", "pymysql (>=1.0.2)"]
+output-formatting = ["jinja2 (>=3.0.0)", "tabulate (>=0.8.9)"]
+parquet = ["pyarrow (>=7.0.0)"]
+performance = ["bottleneck (>=1.3.2)", "numba (>=0.53.1)", "numexpr (>=2.7.1)"]
+plot = ["matplotlib (>=3.6.1)"]
+postgresql = ["SQLAlchemy (>=1.4.16)", "psycopg2 (>=2.8.6)"]
+spss = ["pyreadstat (>=1.1.2)"]
+sql-other = ["SQLAlchemy (>=1.4.16)"]
+test = ["hypothesis (>=6.34.2)", "pytest (>=7.0.0)", "pytest-asyncio (>=0.17.0)", "pytest-xdist (>=2.2.0)"]
+xml = ["lxml (>=4.6.3)"]
+
+[[package]]
+name = "pillow"
+version = "9.5.0"
+description = "Python Imaging Library (Fork)"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pillow-9.5.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:ace6ca218308447b9077c14ea4ef381ba0b67ee78d64046b3f19cf4e1139ad16"},
+ {file = "Pillow-9.5.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d3d403753c9d5adc04d4694d35cf0391f0f3d57c8e0030aac09d7678fa8030aa"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5ba1b81ee69573fe7124881762bb4cd2e4b6ed9dd28c9c60a632902fe8db8b38"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fe7e1c262d3392afcf5071df9afa574544f28eac825284596ac6db56e6d11062"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f36397bf3f7d7c6a3abdea815ecf6fd14e7fcd4418ab24bae01008d8d8ca15e"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:252a03f1bdddce077eff2354c3861bf437c892fb1832f75ce813ee94347aa9b5"},
+ {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:85ec677246533e27770b0de5cf0f9d6e4ec0c212a1f89dfc941b64b21226009d"},
+ {file = "Pillow-9.5.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b416f03d37d27290cb93597335a2f85ed446731200705b22bb927405320de903"},
+ {file = "Pillow-9.5.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:1781a624c229cb35a2ac31cc4a77e28cafc8900733a864870c49bfeedacd106a"},
+ {file = "Pillow-9.5.0-cp310-cp310-win32.whl", hash = "sha256:8507eda3cd0608a1f94f58c64817e83ec12fa93a9436938b191b80d9e4c0fc44"},
+ {file = "Pillow-9.5.0-cp310-cp310-win_amd64.whl", hash = "sha256:d3c6b54e304c60c4181da1c9dadf83e4a54fd266a99c70ba646a9baa626819eb"},
+ {file = "Pillow-9.5.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:7ec6f6ce99dab90b52da21cf0dc519e21095e332ff3b399a357c187b1a5eee32"},
+ {file = "Pillow-9.5.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:560737e70cb9c6255d6dcba3de6578a9e2ec4b573659943a5e7e4af13f298f5c"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:96e88745a55b88a7c64fa49bceff363a1a27d9a64e04019c2281049444a571e3"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d9c206c29b46cfd343ea7cdfe1232443072bbb270d6a46f59c259460db76779a"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cfcc2c53c06f2ccb8976fb5c71d448bdd0a07d26d8e07e321c103416444c7ad1"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:a0f9bb6c80e6efcde93ffc51256d5cfb2155ff8f78292f074f60f9e70b942d99"},
+ {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:8d935f924bbab8f0a9a28404422da8af4904e36d5c33fc6f677e4c4485515625"},
+ {file = "Pillow-9.5.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fed1e1cf6a42577953abbe8e6cf2fe2f566daebde7c34724ec8803c4c0cda579"},
+ {file = "Pillow-9.5.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c1170d6b195555644f0616fd6ed929dfcf6333b8675fcca044ae5ab110ded296"},
+ {file = "Pillow-9.5.0-cp311-cp311-win32.whl", hash = "sha256:54f7102ad31a3de5666827526e248c3530b3a33539dbda27c6843d19d72644ec"},
+ {file = "Pillow-9.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:cfa4561277f677ecf651e2b22dc43e8f5368b74a25a8f7d1d4a3a243e573f2d4"},
+ {file = "Pillow-9.5.0-cp311-cp311-win_arm64.whl", hash = "sha256:965e4a05ef364e7b973dd17fc765f42233415974d773e82144c9bbaaaea5d089"},
+ {file = "Pillow-9.5.0-cp312-cp312-win32.whl", hash = "sha256:22baf0c3cf0c7f26e82d6e1adf118027afb325e703922c8dfc1d5d0156bb2eeb"},
+ {file = "Pillow-9.5.0-cp312-cp312-win_amd64.whl", hash = "sha256:432b975c009cf649420615388561c0ce7cc31ce9b2e374db659ee4f7d57a1f8b"},
+ {file = "Pillow-9.5.0-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:5d4ebf8e1db4441a55c509c4baa7a0587a0210f7cd25fcfe74dbbce7a4bd1906"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:375f6e5ee9620a271acb6820b3d1e94ffa8e741c0601db4c0c4d3cb0a9c224bf"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:99eb6cafb6ba90e436684e08dad8be1637efb71c4f2180ee6b8f940739406e78"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2dfaaf10b6172697b9bceb9a3bd7b951819d1ca339a5ef294d1f1ac6d7f63270"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:763782b2e03e45e2c77d7779875f4432e25121ef002a41829d8868700d119392"},
+ {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:35f6e77122a0c0762268216315bf239cf52b88865bba522999dc38f1c52b9b47"},
+ {file = "Pillow-9.5.0-cp37-cp37m-win32.whl", hash = "sha256:aca1c196f407ec7cf04dcbb15d19a43c507a81f7ffc45b690899d6a76ac9fda7"},
+ {file = "Pillow-9.5.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322724c0032af6692456cd6ed554bb85f8149214d97398bb80613b04e33769f6"},
+ {file = "Pillow-9.5.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:a0aa9417994d91301056f3d0038af1199eb7adc86e646a36b9e050b06f526597"},
+ {file = "Pillow-9.5.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f8286396b351785801a976b1e85ea88e937712ee2c3ac653710a4a57a8da5d9c"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c830a02caeb789633863b466b9de10c015bded434deb3ec87c768e53752ad22a"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fbd359831c1657d69bb81f0db962905ee05e5e9451913b18b831febfe0519082"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8fc330c3370a81bbf3f88557097d1ea26cd8b019d6433aa59f71195f5ddebbf"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:7002d0797a3e4193c7cdee3198d7c14f92c0836d6b4a3f3046a64bd1ce8df2bf"},
+ {file = "Pillow-9.5.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:229e2c79c00e85989a34b5981a2b67aa079fd08c903f0aaead522a1d68d79e51"},
+ {file = "Pillow-9.5.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9adf58f5d64e474bed00d69bcd86ec4bcaa4123bfa70a65ce72e424bfb88ed96"},
+ {file = "Pillow-9.5.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:662da1f3f89a302cc22faa9f14a262c2e3951f9dbc9617609a47521c69dd9f8f"},
+ {file = "Pillow-9.5.0-cp38-cp38-win32.whl", hash = "sha256:6608ff3bf781eee0cd14d0901a2b9cc3d3834516532e3bd673a0a204dc8615fc"},
+ {file = "Pillow-9.5.0-cp38-cp38-win_amd64.whl", hash = "sha256:e49eb4e95ff6fd7c0c402508894b1ef0e01b99a44320ba7d8ecbabefddcc5569"},
+ {file = "Pillow-9.5.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:482877592e927fd263028c105b36272398e3e1be3269efda09f6ba21fd83ec66"},
+ {file = "Pillow-9.5.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3ded42b9ad70e5f1754fb7c2e2d6465a9c842e41d178f262e08b8c85ed8a1d8e"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c446d2245ba29820d405315083d55299a796695d747efceb5717a8b450324115"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8aca1152d93dcc27dc55395604dcfc55bed5f25ef4c98716a928bacba90d33a3"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:608488bdcbdb4ba7837461442b90ea6f3079397ddc968c31265c1e056964f1ef"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:60037a8db8750e474af7ffc9faa9b5859e6c6d0a50e55c45576bf28be7419705"},
+ {file = "Pillow-9.5.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:07999f5834bdc404c442146942a2ecadd1cb6292f5229f4ed3b31e0a108746b1"},
+ {file = "Pillow-9.5.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a127ae76092974abfbfa38ca2d12cbeddcdeac0fb71f9627cc1135bedaf9d51a"},
+ {file = "Pillow-9.5.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:489f8389261e5ed43ac8ff7b453162af39c3e8abd730af8363587ba64bb2e865"},
+ {file = "Pillow-9.5.0-cp39-cp39-win32.whl", hash = "sha256:9b1af95c3a967bf1da94f253e56b6286b50af23392a886720f563c547e48e964"},
+ {file = "Pillow-9.5.0-cp39-cp39-win_amd64.whl", hash = "sha256:77165c4a5e7d5a284f10a6efaa39a0ae8ba839da344f20b111d62cc932fa4e5d"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:833b86a98e0ede388fa29363159c9b1a294b0905b5128baf01db683672f230f5"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aaf305d6d40bd9632198c766fb64f0c1a83ca5b667f16c1e79e1661ab5060140"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0852ddb76d85f127c135b6dd1f0bb88dbb9ee990d2cd9aa9e28526c93e794fba"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:91ec6fe47b5eb5a9968c79ad9ed78c342b1f97a091677ba0e012701add857829"},
+ {file = "Pillow-9.5.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:cb841572862f629b99725ebaec3287fc6d275be9b14443ea746c1dd325053cbd"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:c380b27d041209b849ed246b111b7c166ba36d7933ec6e41175fd15ab9eb1572"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7c9af5a3b406a50e313467e3565fc99929717f780164fe6fbb7704edba0cebbe"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5671583eab84af046a397d6d0ba25343c00cd50bce03787948e0fff01d4fd9b1"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:84a6f19ce086c1bf894644b43cd129702f781ba5751ca8572f08aa40ef0ab7b7"},
+ {file = "Pillow-9.5.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:1e7723bd90ef94eda669a3c2c19d549874dd5badaeefabefd26053304abe5799"},
+ {file = "Pillow-9.5.0.tar.gz", hash = "sha256:bf548479d336726d7a0eceb6e767e179fbde37833ae42794602631a070d630f1"},
+]
+
+[package.extras]
+docs = ["furo", "olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinx-removed-in", "sphinxext-opengraph"]
+tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout"]
+
+[[package]]
+name = "protobuf"
+version = "3.20.3"
+description = "Protocol Buffers"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "protobuf-3.20.3-cp310-cp310-manylinux2014_aarch64.whl", hash = "sha256:f4bd856d702e5b0d96a00ec6b307b0f51c1982c2bf9c0052cf9019e9a544ba99"},
+ {file = "protobuf-3.20.3-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:9aae4406ea63d825636cc11ffb34ad3379335803216ee3a856787bcf5ccc751e"},
+ {file = "protobuf-3.20.3-cp310-cp310-win32.whl", hash = "sha256:28545383d61f55b57cf4df63eebd9827754fd2dc25f80c5253f9184235db242c"},
+ {file = "protobuf-3.20.3-cp310-cp310-win_amd64.whl", hash = "sha256:67a3598f0a2dcbc58d02dd1928544e7d88f764b47d4a286202913f0b2801c2e7"},
+ {file = "protobuf-3.20.3-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:899dc660cd599d7352d6f10d83c95df430a38b410c1b66b407a6b29265d66469"},
+ {file = "protobuf-3.20.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e64857f395505ebf3d2569935506ae0dfc4a15cb80dc25261176c784662cdcc4"},
+ {file = "protobuf-3.20.3-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:d9e4432ff660d67d775c66ac42a67cf2453c27cb4d738fc22cb53b5d84c135d4"},
+ {file = "protobuf-3.20.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:74480f79a023f90dc6e18febbf7b8bac7508420f2006fabd512013c0c238f454"},
+ {file = "protobuf-3.20.3-cp37-cp37m-win32.whl", hash = "sha256:b6cc7ba72a8850621bfec987cb72623e703b7fe2b9127a161ce61e61558ad905"},
+ {file = "protobuf-3.20.3-cp37-cp37m-win_amd64.whl", hash = "sha256:8c0c984a1b8fef4086329ff8dd19ac77576b384079247c770f29cc8ce3afa06c"},
+ {file = "protobuf-3.20.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:de78575669dddf6099a8a0f46a27e82a1783c557ccc38ee620ed8cc96d3be7d7"},
+ {file = "protobuf-3.20.3-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:f4c42102bc82a51108e449cbb32b19b180022941c727bac0cfd50170341f16ee"},
+ {file = "protobuf-3.20.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:44246bab5dd4b7fbd3c0c80b6f16686808fab0e4aca819ade6e8d294a29c7050"},
+ {file = "protobuf-3.20.3-cp38-cp38-win32.whl", hash = "sha256:c02ce36ec760252242a33967d51c289fd0e1c0e6e5cc9397e2279177716add86"},
+ {file = "protobuf-3.20.3-cp38-cp38-win_amd64.whl", hash = "sha256:447d43819997825d4e71bf5769d869b968ce96848b6479397e29fc24c4a5dfe9"},
+ {file = "protobuf-3.20.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:398a9e0c3eaceb34ec1aee71894ca3299605fa8e761544934378bbc6c97de23b"},
+ {file = "protobuf-3.20.3-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:bf01b5720be110540be4286e791db73f84a2b721072a3711efff6c324cdf074b"},
+ {file = "protobuf-3.20.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:daa564862dd0d39c00f8086f88700fdbe8bc717e993a21e90711acfed02f2402"},
+ {file = "protobuf-3.20.3-cp39-cp39-win32.whl", hash = "sha256:819559cafa1a373b7096a482b504ae8a857c89593cf3a25af743ac9ecbd23480"},
+ {file = "protobuf-3.20.3-cp39-cp39-win_amd64.whl", hash = "sha256:03038ac1cfbc41aa21f6afcbcd357281d7521b4157926f30ebecc8d4ea59dcb7"},
+ {file = "protobuf-3.20.3-py2.py3-none-any.whl", hash = "sha256:a7ca6d488aa8ff7f329d4c545b2dbad8ac31464f1d8b1c87ad1346717731e4db"},
+ {file = "protobuf-3.20.3.tar.gz", hash = "sha256:2e3427429c9cffebf259491be0af70189607f365c2f41c7c3764af6f337105f2"},
+]
+
+[[package]]
+name = "pyarrow"
+version = "12.0.0"
+description = "Python library for Apache Arrow"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "pyarrow-12.0.0-cp310-cp310-macosx_10_14_x86_64.whl", hash = "sha256:3b97649c8a9a09e1d8dc76513054f1331bd9ece78ee39365e6bf6bc7503c1e94"},
+ {file = "pyarrow-12.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bc4ea634dacb03936f50fcf59574a8e727f90c17c24527e488d8ceb52ae284de"},
+ {file = "pyarrow-12.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1d568acfca3faa565d663e53ee34173be8e23a95f78f2abfdad198010ec8f745"},
+ {file = "pyarrow-12.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1b50bb9a82dca38a002d7cbd802a16b1af0f8c50ed2ec94a319f5f2afc047ee9"},
+ {file = "pyarrow-12.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:3d1733b1ea086b3c101427d0e57e2be3eb964686e83c2363862a887bb5c41fa8"},
+ {file = "pyarrow-12.0.0-cp311-cp311-macosx_10_14_x86_64.whl", hash = "sha256:a7cd32fe77f967fe08228bc100433273020e58dd6caced12627bcc0a7675a513"},
+ {file = "pyarrow-12.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:92fb031e6777847f5c9b01eaa5aa0c9033e853ee80117dce895f116d8b0c3ca3"},
+ {file = "pyarrow-12.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:280289ebfd4ac3570f6b776515baa01e4dcbf17122c401e4b7170a27c4be63fd"},
+ {file = "pyarrow-12.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:272f147d4f8387bec95f17bb58dcfc7bc7278bb93e01cb7b08a0e93a8921e18e"},
+ {file = "pyarrow-12.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:0846ace49998825eda4722f8d7f83fa05601c832549c9087ea49d6d5397d8cec"},
+ {file = "pyarrow-12.0.0-cp37-cp37m-macosx_10_14_x86_64.whl", hash = "sha256:993287136369aca60005ee7d64130f9466489c4f7425f5c284315b0a5401ccd9"},
+ {file = "pyarrow-12.0.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7a7b6a765ee4f88efd7d8348d9a1f804487d60799d0428b6ddf3344eaef37282"},
+ {file = "pyarrow-12.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a1c4fce253d5bdc8d62f11cfa3da5b0b34b562c04ce84abb8bd7447e63c2b327"},
+ {file = "pyarrow-12.0.0-cp37-cp37m-win_amd64.whl", hash = "sha256:e6be4d85707fc8e7a221c8ab86a40449ce62559ce25c94321df7c8500245888f"},
+ {file = "pyarrow-12.0.0-cp38-cp38-macosx_10_14_x86_64.whl", hash = "sha256:ea830d9f66bfb82d30b5794642f83dd0e4a718846462d22328981e9eb149cba8"},
+ {file = "pyarrow-12.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7b5b9f60d9ef756db59bec8d90e4576b7df57861e6a3d6a8bf99538f68ca15b3"},
+ {file = "pyarrow-12.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b99e559d27db36ad3a33868a475f03e3129430fc065accc839ef4daa12c6dab6"},
+ {file = "pyarrow-12.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5b0810864a593b89877120972d1f7af1d1c9389876dbed92b962ed81492d3ffc"},
+ {file = "pyarrow-12.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:23a77d97f4d101ddfe81b9c2ee03a177f0e590a7e68af15eafa06e8f3cf05976"},
+ {file = "pyarrow-12.0.0-cp39-cp39-macosx_10_14_x86_64.whl", hash = "sha256:2cc63e746221cddb9001f7281dee95fd658085dd5b717b076950e1ccc607059c"},
+ {file = "pyarrow-12.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:d8c26912607e26c2991826bbaf3cf2b9c8c3e17566598c193b492f058b40d3a4"},
+ {file = "pyarrow-12.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0d8b90efc290e99a81d06015f3a46601c259ecc81ffb6d8ce288c91bd1b868c9"},
+ {file = "pyarrow-12.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2466be046b81863be24db370dffd30a2e7894b4f9823fb60ef0a733c31ac6256"},
+ {file = "pyarrow-12.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:0e36425b1c1cbf5447718b3f1751bf86c58f2b3ad299f996cd9b1aa040967656"},
+ {file = "pyarrow-12.0.0.tar.gz", hash = "sha256:19c812d303610ab5d664b7b1de4051ae23565f9f94d04cbea9e50569746ae1ee"},
+]
+
+[package.dependencies]
+numpy = ">=1.16.6"
+
+[[package]]
+name = "pycparser"
+version = "2.21"
+description = "C parser in Python"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "pycparser-2.21-py2.py3-none-any.whl", hash = "sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9"},
+ {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
+]
+
+[[package]]
+name = "pycryptodome"
+version = "3.18.0"
+description = "Cryptographic library for Python"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "pycryptodome-3.18.0-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:d1497a8cd4728db0e0da3c304856cb37c0c4e3d0b36fcbabcc1600f18504fc54"},
+ {file = "pycryptodome-3.18.0-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:928078c530da78ff08e10eb6cada6e0dff386bf3d9fa9871b4bbc9fbc1efe024"},
+ {file = "pycryptodome-3.18.0-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:157c9b5ba5e21b375f052ca78152dd309a09ed04703fd3721dce3ff8ecced148"},
+ {file = "pycryptodome-3.18.0-cp27-cp27m-manylinux2014_aarch64.whl", hash = "sha256:d20082bdac9218649f6abe0b885927be25a917e29ae0502eaf2b53f1233ce0c2"},
+ {file = "pycryptodome-3.18.0-cp27-cp27m-musllinux_1_1_aarch64.whl", hash = "sha256:e8ad74044e5f5d2456c11ed4cfd3e34b8d4898c0cb201c4038fe41458a82ea27"},
+ {file = "pycryptodome-3.18.0-cp27-cp27m-win32.whl", hash = "sha256:62a1e8847fabb5213ccde38915563140a5b338f0d0a0d363f996b51e4a6165cf"},
+ {file = "pycryptodome-3.18.0-cp27-cp27m-win_amd64.whl", hash = "sha256:16bfd98dbe472c263ed2821284118d899c76968db1a6665ade0c46805e6b29a4"},
+ {file = "pycryptodome-3.18.0-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:7a3d22c8ee63de22336679e021c7f2386f7fc465477d59675caa0e5706387944"},
+ {file = "pycryptodome-3.18.0-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:78d863476e6bad2a592645072cc489bb90320972115d8995bcfbee2f8b209918"},
+ {file = "pycryptodome-3.18.0-cp27-cp27mu-manylinux2014_aarch64.whl", hash = "sha256:b6a610f8bfe67eab980d6236fdc73bfcdae23c9ed5548192bb2d530e8a92780e"},
+ {file = "pycryptodome-3.18.0-cp27-cp27mu-musllinux_1_1_aarch64.whl", hash = "sha256:422c89fd8df8a3bee09fb8d52aaa1e996120eafa565437392b781abec2a56e14"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-macosx_10_9_universal2.whl", hash = "sha256:9ad6f09f670c466aac94a40798e0e8d1ef2aa04589c29faa5b9b97566611d1d1"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-macosx_10_9_x86_64.whl", hash = "sha256:53aee6be8b9b6da25ccd9028caf17dcdce3604f2c7862f5167777b707fbfb6cb"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-manylinux2014_aarch64.whl", hash = "sha256:10da29526a2a927c7d64b8f34592f461d92ae55fc97981aab5bbcde8cb465bb6"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f21efb8438971aa16924790e1c3dba3a33164eb4000106a55baaed522c261acf"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4944defabe2ace4803f99543445c27dd1edbe86d7d4edb87b256476a91e9ffa4"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:51eae079ddb9c5f10376b4131be9589a6554f6fd84f7f655180937f611cd99a2"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-musllinux_1_1_i686.whl", hash = "sha256:83c75952dcf4a4cebaa850fa257d7a860644c70a7cd54262c237c9f2be26f76e"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:957b221d062d5752716923d14e0926f47670e95fead9d240fa4d4862214b9b2f"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-win32.whl", hash = "sha256:795bd1e4258a2c689c0b1f13ce9684fa0dd4c0e08680dcf597cf9516ed6bc0f3"},
+ {file = "pycryptodome-3.18.0-cp35-abi3-win_amd64.whl", hash = "sha256:b1d9701d10303eec8d0bd33fa54d44e67b8be74ab449052a8372f12a66f93fb9"},
+ {file = "pycryptodome-3.18.0-pp27-pypy_73-manylinux2010_x86_64.whl", hash = "sha256:cb1be4d5af7f355e7d41d36d8eec156ef1382a88638e8032215c215b82a4b8ec"},
+ {file = "pycryptodome-3.18.0-pp27-pypy_73-win32.whl", hash = "sha256:fc0a73f4db1e31d4a6d71b672a48f3af458f548059aa05e83022d5f61aac9c08"},
+ {file = "pycryptodome-3.18.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:f022a4fd2a5263a5c483a2bb165f9cb27f2be06f2f477113783efe3fe2ad887b"},
+ {file = "pycryptodome-3.18.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:363dd6f21f848301c2dcdeb3c8ae5f0dee2286a5e952a0f04954b82076f23825"},
+ {file = "pycryptodome-3.18.0-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:12600268763e6fec3cefe4c2dcdf79bde08d0b6dc1813887e789e495cb9f3403"},
+ {file = "pycryptodome-3.18.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:4604816adebd4faf8810782f137f8426bf45fee97d8427fa8e1e49ea78a52e2c"},
+ {file = "pycryptodome-3.18.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:01489bbdf709d993f3058e2996f8f40fee3f0ea4d995002e5968965fa2fe89fb"},
+ {file = "pycryptodome-3.18.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3811e31e1ac3069988f7a1c9ee7331b942e605dfc0f27330a9ea5997e965efb2"},
+ {file = "pycryptodome-3.18.0-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6f4b967bb11baea9128ec88c3d02f55a3e338361f5e4934f5240afcb667fdaec"},
+ {file = "pycryptodome-3.18.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:9c8eda4f260072f7dbe42f473906c659dcbadd5ae6159dfb49af4da1293ae380"},
+ {file = "pycryptodome-3.18.0.tar.gz", hash = "sha256:c9adee653fc882d98956e33ca2c1fb582e23a8af7ac82fee75bd6113c55a0413"},
+]
+
+[[package]]
+name = "pydantic"
+version = "1.10.8"
+description = "Data validation and settings management using python type hints"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "pydantic-1.10.8-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:1243d28e9b05003a89d72e7915fdb26ffd1d39bdd39b00b7dbe4afae4b557f9d"},
+ {file = "pydantic-1.10.8-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c0ab53b609c11dfc0c060d94335993cc2b95b2150e25583bec37a49b2d6c6c3f"},
+ {file = "pydantic-1.10.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f9613fadad06b4f3bc5db2653ce2f22e0de84a7c6c293909b48f6ed37b83c61f"},
+ {file = "pydantic-1.10.8-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:df7800cb1984d8f6e249351139667a8c50a379009271ee6236138a22a0c0f319"},
+ {file = "pydantic-1.10.8-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:0c6fafa0965b539d7aab0a673a046466d23b86e4b0e8019d25fd53f4df62c277"},
+ {file = "pydantic-1.10.8-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e82d4566fcd527eae8b244fa952d99f2ca3172b7e97add0b43e2d97ee77f81ab"},
+ {file = "pydantic-1.10.8-cp310-cp310-win_amd64.whl", hash = "sha256:ab523c31e22943713d80d8d342d23b6f6ac4b792a1e54064a8d0cf78fd64e800"},
+ {file = "pydantic-1.10.8-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:666bdf6066bf6dbc107b30d034615d2627e2121506c555f73f90b54a463d1f33"},
+ {file = "pydantic-1.10.8-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:35db5301b82e8661fa9c505c800d0990bc14e9f36f98932bb1d248c0ac5cada5"},
+ {file = "pydantic-1.10.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f90c1e29f447557e9e26afb1c4dbf8768a10cc676e3781b6a577841ade126b85"},
+ {file = "pydantic-1.10.8-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:93e766b4a8226e0708ef243e843105bf124e21331694367f95f4e3b4a92bbb3f"},
+ {file = "pydantic-1.10.8-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:88f195f582851e8db960b4a94c3e3ad25692c1c1539e2552f3df7a9e972ef60e"},
+ {file = "pydantic-1.10.8-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:34d327c81e68a1ecb52fe9c8d50c8a9b3e90d3c8ad991bfc8f953fb477d42fb4"},
+ {file = "pydantic-1.10.8-cp311-cp311-win_amd64.whl", hash = "sha256:d532bf00f381bd6bc62cabc7d1372096b75a33bc197a312b03f5838b4fb84edd"},
+ {file = "pydantic-1.10.8-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:7d5b8641c24886d764a74ec541d2fc2c7fb19f6da2a4001e6d580ba4a38f7878"},
+ {file = "pydantic-1.10.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b1f6cb446470b7ddf86c2e57cd119a24959af2b01e552f60705910663af09a4"},
+ {file = "pydantic-1.10.8-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c33b60054b2136aef8cf190cd4c52a3daa20b2263917c49adad20eaf381e823b"},
+ {file = "pydantic-1.10.8-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1952526ba40b220b912cdc43c1c32bcf4a58e3f192fa313ee665916b26befb68"},
+ {file = "pydantic-1.10.8-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:bb14388ec45a7a0dc429e87def6396f9e73c8c77818c927b6a60706603d5f2ea"},
+ {file = "pydantic-1.10.8-cp37-cp37m-win_amd64.whl", hash = "sha256:16f8c3e33af1e9bb16c7a91fc7d5fa9fe27298e9f299cff6cb744d89d573d62c"},
+ {file = "pydantic-1.10.8-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1ced8375969673929809d7f36ad322934c35de4af3b5e5b09ec967c21f9f7887"},
+ {file = "pydantic-1.10.8-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:93e6bcfccbd831894a6a434b0aeb1947f9e70b7468f274154d03d71fabb1d7c6"},
+ {file = "pydantic-1.10.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:191ba419b605f897ede9892f6c56fb182f40a15d309ef0142212200a10af4c18"},
+ {file = "pydantic-1.10.8-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:052d8654cb65174d6f9490cc9b9a200083a82cf5c3c5d3985db765757eb3b375"},
+ {file = "pydantic-1.10.8-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ceb6a23bf1ba4b837d0cfe378329ad3f351b5897c8d4914ce95b85fba96da5a1"},
+ {file = "pydantic-1.10.8-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f2e754d5566f050954727c77f094e01793bcb5725b663bf628fa6743a5a9108"},
+ {file = "pydantic-1.10.8-cp38-cp38-win_amd64.whl", hash = "sha256:6a82d6cda82258efca32b40040228ecf43a548671cb174a1e81477195ed3ed56"},
+ {file = "pydantic-1.10.8-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3e59417ba8a17265e632af99cc5f35ec309de5980c440c255ab1ca3ae96a3e0e"},
+ {file = "pydantic-1.10.8-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:84d80219c3f8d4cad44575e18404099c76851bc924ce5ab1c4c8bb5e2a2227d0"},
+ {file = "pydantic-1.10.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e4148e635994d57d834be1182a44bdb07dd867fa3c2d1b37002000646cc5459"},
+ {file = "pydantic-1.10.8-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:12f7b0bf8553e310e530e9f3a2f5734c68699f42218bf3568ef49cd9b0e44df4"},
+ {file = "pydantic-1.10.8-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:42aa0c4b5c3025483240a25b09f3c09a189481ddda2ea3a831a9d25f444e03c1"},
+ {file = "pydantic-1.10.8-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:17aef11cc1b997f9d574b91909fed40761e13fac438d72b81f902226a69dac01"},
+ {file = "pydantic-1.10.8-cp39-cp39-win_amd64.whl", hash = "sha256:66a703d1983c675a6e0fed8953b0971c44dba48a929a2000a493c3772eb61a5a"},
+ {file = "pydantic-1.10.8-py3-none-any.whl", hash = "sha256:7456eb22ed9aaa24ff3e7b4757da20d9e5ce2a81018c1b3ebd81a0b88a18f3b2"},
+ {file = "pydantic-1.10.8.tar.gz", hash = "sha256:1410275520dfa70effadf4c21811d755e7ef9bb1f1d077a21958153a92c8d9ca"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.2.0"
+
+[package.extras]
+dotenv = ["python-dotenv (>=0.10.4)"]
+email = ["email-validator (>=1.0.3)"]
+
+[[package]]
+name = "pydeck"
+version = "0.8.0"
+description = "Widget for deck.gl maps"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "pydeck-0.8.0-py2.py3-none-any.whl", hash = "sha256:a8fa7757c6f24bba033af39db3147cb020eef44012ba7e60d954de187f9ed4d5"},
+ {file = "pydeck-0.8.0.tar.gz", hash = "sha256:07edde833f7cfcef6749124351195aa7dcd24663d4909fd7898dbd0b6fbc01ec"},
+]
+
+[package.dependencies]
+jinja2 = ">=2.10.1"
+numpy = ">=1.16.4"
+
+[package.extras]
+carto = ["pydeck-carto"]
+jupyter = ["ipykernel (>=5.1.2)", "ipython (>=5.8.0)", "ipywidgets (>=7,<8)", "traitlets (>=4.3.2)"]
+
+[[package]]
+name = "pydub"
+version = "0.25.1"
+description = "Manipulate audio with an simple and easy high level interface"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pydub-0.25.1-py2.py3-none-any.whl", hash = "sha256:65617e33033874b59d87db603aa1ed450633288aefead953b30bded59cb599a6"},
+ {file = "pydub-0.25.1.tar.gz", hash = "sha256:980a33ce9949cab2a569606b65674d748ecbca4f0796887fd6f46173a7b0d30f"},
+]
+
+[[package]]
+name = "pygments"
+version = "2.15.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.15.1-py3-none-any.whl", hash = "sha256:db2db3deb4b4179f399a09054b023b6a586b76499d36965813c71aa8ed7b5fd1"},
+ {file = "Pygments-2.15.1.tar.gz", hash = "sha256:8ace4d3c1dd481894b2005f560ead0f9f19ee64fe983366be1a21e171d12775c"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
+[[package]]
+name = "pymailtm"
+version = "1.1.1"
+description = "A python web api wrapper and command line client for mail.tm."
+optional = false
+python-versions = ">=3.7,<4.0"
+files = [
+ {file = "pymailtm-1.1.1-py3-none-any.whl", hash = "sha256:e42865ad24de4fc3629e453278f21a5e3e9c438e3f05c51fdaaedef0e943c87a"},
+ {file = "pymailtm-1.1.1.tar.gz", hash = "sha256:2ba80a89de878ae622904570e1f52e02c792b3f6dbd26b8d4821c36114797f7a"},
+]
+
+[package.dependencies]
+pyperclip = ">=1.8.2,<2.0.0"
+random-username = ">=1.0.2,<2.0.0"
+requests = ">=2.28.1,<3.0.0"
+
+[[package]]
+name = "pympler"
+version = "1.0.1"
+description = "A development tool to measure, monitor and analyze the memory behavior of Python objects."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "Pympler-1.0.1-py3-none-any.whl", hash = "sha256:d260dda9ae781e1eab6ea15bacb84015849833ba5555f141d2d9b7b7473b307d"},
+ {file = "Pympler-1.0.1.tar.gz", hash = "sha256:993f1a3599ca3f4fcd7160c7545ad06310c9e12f70174ae7ae8d4e25f6c5d3fa"},
+]
+
+[[package]]
+name = "pypasser"
+version = "0.0.5"
+description = "Bypassing reCaptcha V3 by sending HTTP requests & solving reCaptcha V2 using speech to text."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "PyPasser-0.0.5.tar.gz", hash = "sha256:72b0ded34edcfa885a13ecc825c5a058503b68521ab87294205d7ff5cd569515"},
+]
+
+[package.dependencies]
+pydub = "0.25.1"
+PySocks = "1.7.1"
+requests = ">=2.25.1,<3.0"
+selenium = "*"
+SpeechRecognition = "3.8.1"
+
+[[package]]
+name = "pyperclip"
+version = "1.8.2"
+description = "A cross-platform clipboard module for Python. (Only handles plain text for now.)"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyperclip-1.8.2.tar.gz", hash = "sha256:105254a8b04934f0bc84e9c24eb360a591aaf6535c9def5f29d92af107a9bf57"},
+]
+
+[[package]]
+name = "pyrsistent"
+version = "0.19.3"
+description = "Persistent/Functional/Immutable data structures"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "pyrsistent-0.19.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:20460ac0ea439a3e79caa1dbd560344b64ed75e85d8703943e0b66c2a6150e4a"},
+ {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4c18264cb84b5e68e7085a43723f9e4c1fd1d935ab240ce02c0324a8e01ccb64"},
+ {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b774f9288dda8d425adb6544e5903f1fb6c273ab3128a355c6b972b7df39dcf"},
+ {file = "pyrsistent-0.19.3-cp310-cp310-win32.whl", hash = "sha256:5a474fb80f5e0d6c9394d8db0fc19e90fa540b82ee52dba7d246a7791712f74a"},
+ {file = "pyrsistent-0.19.3-cp310-cp310-win_amd64.whl", hash = "sha256:49c32f216c17148695ca0e02a5c521e28a4ee6c5089f97e34fe24163113722da"},
+ {file = "pyrsistent-0.19.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f0774bf48631f3a20471dd7c5989657b639fd2d285b861237ea9e82c36a415a9"},
+ {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ab2204234c0ecd8b9368dbd6a53e83c3d4f3cab10ecaf6d0e772f456c442393"},
+ {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e42296a09e83028b3476f7073fcb69ffebac0e66dbbfd1bd847d61f74db30f19"},
+ {file = "pyrsistent-0.19.3-cp311-cp311-win32.whl", hash = "sha256:64220c429e42a7150f4bfd280f6f4bb2850f95956bde93c6fda1b70507af6ef3"},
+ {file = "pyrsistent-0.19.3-cp311-cp311-win_amd64.whl", hash = "sha256:016ad1afadf318eb7911baa24b049909f7f3bb2c5b1ed7b6a8f21db21ea3faa8"},
+ {file = "pyrsistent-0.19.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c4db1bd596fefd66b296a3d5d943c94f4fac5bcd13e99bffe2ba6a759d959a28"},
+ {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aeda827381f5e5d65cced3024126529ddc4289d944f75e090572c77ceb19adbf"},
+ {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:42ac0b2f44607eb92ae88609eda931a4f0dfa03038c44c772e07f43e738bcac9"},
+ {file = "pyrsistent-0.19.3-cp37-cp37m-win32.whl", hash = "sha256:e8f2b814a3dc6225964fa03d8582c6e0b6650d68a232df41e3cc1b66a5d2f8d1"},
+ {file = "pyrsistent-0.19.3-cp37-cp37m-win_amd64.whl", hash = "sha256:c9bb60a40a0ab9aba40a59f68214eed5a29c6274c83b2cc206a359c4a89fa41b"},
+ {file = "pyrsistent-0.19.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a2471f3f8693101975b1ff85ffd19bb7ca7dd7c38f8a81701f67d6b4f97b87d8"},
+ {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc5d149f31706762c1f8bda2e8c4f8fead6e80312e3692619a75301d3dbb819a"},
+ {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3311cb4237a341aa52ab8448c27e3a9931e2ee09561ad150ba94e4cfd3fc888c"},
+ {file = "pyrsistent-0.19.3-cp38-cp38-win32.whl", hash = "sha256:f0e7c4b2f77593871e918be000b96c8107da48444d57005b6a6bc61fb4331b2c"},
+ {file = "pyrsistent-0.19.3-cp38-cp38-win_amd64.whl", hash = "sha256:c147257a92374fde8498491f53ffa8f4822cd70c0d85037e09028e478cababb7"},
+ {file = "pyrsistent-0.19.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b735e538f74ec31378f5a1e3886a26d2ca6351106b4dfde376a26fc32a044edc"},
+ {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99abb85579e2165bd8522f0c0138864da97847875ecbd45f3e7e2af569bfc6f2"},
+ {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3a8cb235fa6d3fd7aae6a4f1429bbb1fec1577d978098da1252f0489937786f3"},
+ {file = "pyrsistent-0.19.3-cp39-cp39-win32.whl", hash = "sha256:c74bed51f9b41c48366a286395c67f4e894374306b197e62810e0fdaf2364da2"},
+ {file = "pyrsistent-0.19.3-cp39-cp39-win_amd64.whl", hash = "sha256:878433581fc23e906d947a6814336eee031a00e6defba224234169ae3d3d6a98"},
+ {file = "pyrsistent-0.19.3-py3-none-any.whl", hash = "sha256:ccf0d6bd208f8111179f0c26fdf84ed7c3891982f2edaeae7422575f47e66b64"},
+ {file = "pyrsistent-0.19.3.tar.gz", hash = "sha256:1a2994773706bbb4995c31a97bc94f1418314923bd1048c6d964837040376440"},
+]
+
+[[package]]
+name = "pysocks"
+version = "1.7.1"
+description = "A Python SOCKS client module. See https://github.com/Anorov/PySocks for more information."
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "PySocks-1.7.1-py27-none-any.whl", hash = "sha256:08e69f092cc6dbe92a0fdd16eeb9b9ffbc13cadfe5ca4c7bd92ffb078b293299"},
+ {file = "PySocks-1.7.1-py3-none-any.whl", hash = "sha256:2725bd0a9925919b9b51739eea5f9e2bae91e83288108a9ad338b2e3a4435ee5"},
+ {file = "PySocks-1.7.1.tar.gz", hash = "sha256:3f8804571ebe159c380ac6de37643bb4685970655d3bba243530d6558b799aa0"},
+]
+
+[[package]]
+name = "python-dateutil"
+version = "2.8.2"
+description = "Extensions to the standard Python datetime module"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
+files = [
+ {file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
+ {file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
+]
+
+[package.dependencies]
+six = ">=1.5"
+
+[[package]]
+name = "pytz"
+version = "2023.3"
+description = "World timezone definitions, modern and historical"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pytz-2023.3-py2.py3-none-any.whl", hash = "sha256:a151b3abb88eda1d4e34a9814df37de2a80e301e68ba0fd856fb9b46bfbbbffb"},
+ {file = "pytz-2023.3.tar.gz", hash = "sha256:1d8ce29db189191fb55338ee6d0387d82ab59f3d00eac103412d64e0ebd0c588"},
+]
+
+[[package]]
+name = "random-password-generator"
+version = "2.2.0"
+description = "Simple and custom random password generator for python"
+optional = false
+python-versions = "*"
+files = [
+ {file = "random-password-generator-2.2.0.tar.gz", hash = "sha256:d8a8e2c0420fdd2c096bc7948f62701cb36761aea42e59c9a504d02f0e359d43"},
+ {file = "random_password_generator-2.2.0-py3-none-any.whl", hash = "sha256:da9ce21a9b6a99dfda940596fe417a55470a2e0343fae56dea785f431780d8a9"},
+]
+
+[[package]]
+name = "random-username"
+version = "1.0.2"
+description = "Randomly generate compelling usernames."
+optional = false
+python-versions = "*"
+files = [
+ {file = "random-username-1.0.2.tar.gz", hash = "sha256:5fdc0604b5d1bdfe4acf4cd7491a9de1caf41bbdd890f646b434e09ae2a1b7ce"},
+ {file = "random_username-1.0.2-py3-none-any.whl", hash = "sha256:2536feb63fecde7e01ede4a541aadb6f0b58794a7ab327ca5369d2a4b7664c06"},
+]
+
+[[package]]
+name = "rapidfuzz"
+version = "3.0.0"
+description = "rapid fuzzy string matching"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "rapidfuzz-3.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3ab635a544243e1924508bfc3f294c28bdced6d74388ac25041d3dabcaefab75"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:cf8b1b29028dc1bc6a5654f22425ee6d3967bbd44bc3a117be0f43b03300f928"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:2bbf9aad86b70283362dc2db3cacb7dcde0ffe6027f54feb0ccb23cf87b6aa11"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9a9b7d22e46ada4e6a1f1404c267f3f023b44594929913d855f14bc5fb11b53d"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7c81891e570e50d0afe43f722f426b0bd602d3c5315f0f290514511d9520b1e6"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:abc2f05c1f30b9533cb9b85d73c28d93aa99c7ae2992df04c1704fcaf248b59c"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:339b94c536ab9c1b1bac245fb6814df3ba104603d2c1a97f8fb41922357bd772"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8883267df996b42494f40d533ef3a3fea247531d137773a649fb851747ae12c8"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:59c6c9d3ca7d84c5878a74d142816350a3bdfb51e4d10ac104afd396168481f6"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:5759bb0fd13ee030626e8bbb5b644092a043817fb192335ff4c481402b1edd0e"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:1fe6ea9300f347fd3352c755aa04d71a2786afa008d1af1a35830e6a44e7fd5f"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:481c0389c3b26cd2aa498b6924ca6e9a1d1dd5b15ad5f009d273292949e47e24"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:658be4cabcc229f52a902f5e87205e1b9c29c66e463a267c8d8f237acde56002"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-win32.whl", hash = "sha256:7f8d89b16b4752deeb66dd321548c4cfa59819982d43d2ae7ab5d6e0f15bee94"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:2132d7724c03dd322035cf1b0c23ca9e7e449ec2b7225040a2ca2fa3f1a3bbfa"},
+ {file = "rapidfuzz-3.0.0-cp310-cp310-win_arm64.whl", hash = "sha256:0bf1953a4c32ce6e2f3ed1897d0a8dbbbf19456ef0a8e37bae26e007d9fb5096"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:526df35d07083480e751f9679fd1f3e8a0819e8a13586e3860db5b65549a408a"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:88c9e93508128168708aae3ef98eeb422a88204d81ac4492fcea1e1162a6af74"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:562caa39652c72156574fcf60ce7adf29964a031a57ae977c180947e00425b4a"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a149c944f3c349f6279a8cc4cbfa3e80cc2baaec9d983359698aa792faa44653"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c40a626050e37c2f74e2ba00538578d3c4a6baa171d08ed5091b6a03512ac4a"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6ee35eddeddb5f5750d2a9cc55894926969fa0bac80bbe57211ae6fd0d34b39f"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c94fe53da481d8580e6410f3e7e4ba4e9c5786cad1f289fbb6c9c9585c6d78e1"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4032713943c32fff97d863e6618162923e3b3c31917d437126d9fcf7e33c83d2"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:dc1a39d1cc8e679c7240b2d1ed8366cf740ab8429cc9b582ebd94a5c6ededbe5"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:f213bb5d4cd0b1fddf209bafe2d2896320a737fbded3a567d454e54875e4d9cc"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:c9ca5f4ae767605cefa5156f5fa8561bee61849f9b2ccfb165d7087b4f9af90c"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:938cc766d0ce9eca95549593b6ca7ff86a2917b9e68c1989ad95485aed0f49dd"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:87eb7e9fb49265c33bda0417cc74c474a891cae60735fbbd75d79a106483888e"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-win32.whl", hash = "sha256:3f51d35521f86e767d3e640d0ab42908d01c3e05cf54ac1f0b547f3f602800f1"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:4ef3a6aa07b996c789c8d5ab99ed0563d551d32fa9330fd0f52ba28d20fcb662"},
+ {file = "rapidfuzz-3.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:7eea0ca53da78040a6b7bb041af8051c52efa7facc6f18dce33e679f2decaf62"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:6dfd138dcc0920b71c1d1bc017413b032286a1f33488613dce9e254c454abaf2"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b81d15da16e97c288c645eb642d8a08d0ab98b827efb2682cab282a45893efe"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ff8835a3ba17f3baf3838f2612e3758d7b1ca09eb16c9a382df3bec5bb9bda3"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:82a4ea0742b9e375d4856714ef59241007765edbce34fd2f7d76c552ed93a7d2"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f5767b0e8220b6f8afcc1fe77529e5678470f9e94a1cfc9e29f5b0721dc1496c"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a8c372dc278d1d5ced7cdc99ad8cc1d3de0b245e769a6b327c462b98873e5d4"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:5d8664d6f844ea9018b4866e8a8dbf49c87f703668b1b3265de83aa3c9941272"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:fa098429af4e17fb5bacb0c39f1f8349891356ba7ca540521515b5708fec4a76"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:738ae2d59ab254c4f173b40b00a9c1f092697949284c59e0879e6e3beb337a69"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:1a5eb2d1844f375f34e3c83b1a03094293d894472fdd1a095cf35e4dfa2ecf01"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:24d77723bb20030a91b096326d14b673d2ff1f0c7bbc64ed519992ed8eb5b869"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-win32.whl", hash = "sha256:b85dfb6f0c353c4b37499529f9831620a7bdc61c375e07f8c38b595f93e906e5"},
+ {file = "rapidfuzz-3.0.0-cp37-cp37m-win_amd64.whl", hash = "sha256:54fb70b667b882f9939bc6f581957fcb47fec2e7ad652259835c80e9e30230c9"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:4e1da3dce34d742567da0722b9c8dc2b51554ab5a22fdaf763b60209445a7b37"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c50825de43442c4625a2ca1d948c911d116cf9007ad7f29cd27561c99b16947c"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:69270e0cd850984f562b2239d07cde2213e5c3642cd8d550d5ac9a0fcd0882df"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bf5ddf48f63895f949355a1c4d643e0a531c9317d52901f80d5a6299d967b766"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dd2ad4160d8ad9a2abdad1c765fd29e4d9b6b8ce6a707ee48eb2869e7dff0f89"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc39af05cdf89be426d96fce579c812948a324b022fb11dfea1e99e180d4f68b"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d25555297466ab5ded3875913fc0bfa78b89b0a32d79bd65ffbd32ae71f07c2d"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:76da6c8972acd58c31efdd09c4c85263ba3b4394d2c2929be4c171a22293bab3"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2b50f2d429d81a65910f5ee9b14e172e300a09b8b2ecb91d3e4efc5d2583915c"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:f00a8b3d0b21884ea972d5430797b1a25b9d2c715b3eaf03366903aac5d8398c"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:e058ecfd8edb04b221d1b2d005f17be932075a16f75b100b275de1d3d220da5f"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:35e21f0718fd1c1853f8f433f2f84f618f6d4a6d9d96bb7c42e39797be600b58"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d128f615da9a198cd9b33be658a0c26fabe06a6d28fa4652953853e8d174c2c6"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-win32.whl", hash = "sha256:bc593306faa6c73e50cb31b81efbb580957272b14c5cf6bcf0233adf8a7e118d"},
+ {file = "rapidfuzz-3.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:0f7041c550b69d35675e04dc3f0690d0c26499039e942a0b1604c6547951e6fc"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:351df02889ac3da9f3f7b10e6812740927cfab9def453079da94f83697b03f2f"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:35506d04429440333224c3711dfdb4195d34eff733cb48648d0c89a9b99faf14"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:d24054843f4cfbb86df608ec1209e6a29b0d2635230577a94e38a9cfa3880d18"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4a94fe0a42da816e4a6279ac4c23e4ba6de86a529b61b08d5e8e2633b29c781b"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:41e98b3cebfa3e720186eeab37e6c0565895edf848fd958c34ab94c39e743311"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ea5e25378591a698ae5076c582a0135db2cb43270fb2866737ab4cb6fcc34474"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6bf090b9b4ec4df5f0899bbc4055b8b173b33169186d4de1dd3d9c609bd330a2"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f241ca0bcbfcbf90bb48bb1c8dbc1fddc205bee5520f898b994adda3d3f150a"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:f80986d3c8d55b848d679084231a35273320f658e64f0d86d725bb360e6cd2c4"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:62c760748b1253e08ab5138855e8f8d2c25a7cb5a0bfad74bb66db63c27d8a50"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:8b5d78052e840191b9c7556eb3bd4fe52435e58bd979c75298b65262368dd1fa"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:fc764665ba19b923696eae6912a2f0fc52bdd7db6c53be178d1dd70eb72f2f68"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:62aac81ef17bab9f664828b9087d0afe5a94ed48396b0456a2503b68e3d567f2"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-win32.whl", hash = "sha256:aeb855b62bc351884a672b8f87875c552492d9199c78f33cc8650c283fd7504d"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:a0e0d8798b7048c9db4e139bafb21792013fb043df07bfaf0d3dc9e1df2be5e6"},
+ {file = "rapidfuzz-3.0.0-cp39-cp39-win_arm64.whl", hash = "sha256:ebf96d236a52058c010f354256e8de4274621a7f8b5a15dffa54d9b6a1c7e6e8"},
+ {file = "rapidfuzz-3.0.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:fbe4b8305cb427b49d70c182a01c91fd85112e0573193a1f9e4fbcec35ea3eff"},
+ {file = "rapidfuzz-3.0.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:db57085c9dbd0b1005d6ad905c610920c49d0752f522d2f34477b13cba24e1d1"},
+ {file = "rapidfuzz-3.0.0-pp37-pypy37_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e334225a97824d9f75f5cf8e949e129bc183f0762f4c9b7a127d1809461bdc55"},
+ {file = "rapidfuzz-3.0.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a0c22a36b611a2d53fada2cb03b276135d08c2703039078ce985d7cc42734fd7"},
+ {file = "rapidfuzz-3.0.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:05130d9d33c4770116037de9f131e488825165105588cc7143f77733c5b25a6f"},
+ {file = "rapidfuzz-3.0.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:9c263840bda0f532714ecd66f1f82ed3d3460f45e79e8a907f4df8eaafd93d31"},
+ {file = "rapidfuzz-3.0.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6dbaa605c0f81208efbf166afb23f73b0f3847a1a966bec828f4167f61d0ca4b"},
+ {file = "rapidfuzz-3.0.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3ba129c3c8202e8bef0d9964b8798913905ad1dc6293e94d7a02d87cdbef2544"},
+ {file = "rapidfuzz-3.0.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e85b4f4aeb994c841478e98a9e05bcb7ed8ead084d93bd2ca0683dc5e93b1c36"},
+ {file = "rapidfuzz-3.0.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:284f216f0500cd977830f107da5c3f96e91356dc7993512efc414dbd55679d51"},
+ {file = "rapidfuzz-3.0.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:494b613a3e730e08df1c7c14e45c303a0f5c8a701162bfc8ac9079585837de43"},
+ {file = "rapidfuzz-3.0.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3a5670d5475777450fbc70ed8de8d4e3f7c69230a8b539f45bda358a6f9699f2"},
+ {file = "rapidfuzz-3.0.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:14e9924108a26f2f58aa8cb90f1a106398fa43e359fa5a96b0f328c7bb7f76da"},
+ {file = "rapidfuzz-3.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:832d953b5f1462eba5a0830ea7df11b784f090ba5409fc92bccb856d2539b618"},
+ {file = "rapidfuzz-3.0.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:5fdcc1ce830bf46fbc098c8b6eb3201a8299476153bae7a5d5e86576f7228d0a"},
+ {file = "rapidfuzz-3.0.0.tar.gz", hash = "sha256:4c1d895d16f62e9ac88d303eb918d90a390bd712055c849e01c558b7ae0fa908"},
+]
+
+[package.extras]
+full = ["numpy"]
+
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
+[[package]]
+name = "retrying"
+version = "1.3.4"
+description = "Retrying"
+optional = false
+python-versions = "*"
+files = [
+ {file = "retrying-1.3.4-py3-none-any.whl", hash = "sha256:8cc4d43cb8e1125e0ff3344e9de678fefd85db3b750b81b2240dc0183af37b35"},
+ {file = "retrying-1.3.4.tar.gz", hash = "sha256:345da8c5765bd982b1d1915deb9102fd3d1f7ad16bd84a9700b85f64d24e8f3e"},
+]
+
+[package.dependencies]
+six = ">=1.7.0"
+
+[[package]]
+name = "rich"
+version = "13.3.5"
+description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal"
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "rich-13.3.5-py3-none-any.whl", hash = "sha256:69cdf53799e63f38b95b9bf9c875f8c90e78dd62b2f00c13a911c7a3b9fa4704"},
+ {file = "rich-13.3.5.tar.gz", hash = "sha256:2d11b9b8dd03868f09b4fffadc84a6a8cda574e40dc90821bd845720ebb8e89c"},
+]
+
+[package.dependencies]
+markdown-it-py = ">=2.2.0,<3.0.0"
+pygments = ">=2.13.0,<3.0.0"
+
+[package.extras]
+jupyter = ["ipywidgets (>=7.5.1,<9)"]
+
+[[package]]
+name = "selenium"
+version = "4.9.1"
+description = ""
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "selenium-4.9.1-py3-none-any.whl", hash = "sha256:82aedaa85d55bc861f4c89ff9609e82f6c958e2e1e3da3ffcc36703f21d3ee16"},
+ {file = "selenium-4.9.1.tar.gz", hash = "sha256:3444f4376321530c36ce8355b6b357d8cf4a7d588ce5cf772183465930bbed0e"},
+]
+
+[package.dependencies]
+certifi = ">=2021.10.8"
+trio = ">=0.17,<1.0"
+trio-websocket = ">=0.9,<1.0"
+urllib3 = {version = ">=1.26,<3", extras = ["socks"]}
+
+[[package]]
+name = "six"
+version = "1.16.0"
+description = "Python 2 and 3 compatibility utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
+files = [
+ {file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
+ {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
+]
+
+[[package]]
+name = "smmap"
+version = "5.0.0"
+description = "A pure Python implementation of a sliding window memory map manager"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "smmap-5.0.0-py3-none-any.whl", hash = "sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94"},
+ {file = "smmap-5.0.0.tar.gz", hash = "sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936"},
+]
+
+[[package]]
+name = "sniffio"
+version = "1.3.0"
+description = "Sniff out which async library your code is running under"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "sniffio-1.3.0-py3-none-any.whl", hash = "sha256:eecefdce1e5bbfb7ad2eeaabf7c1eeb404d7757c379bd1f7e5cce9d8bf425384"},
+ {file = "sniffio-1.3.0.tar.gz", hash = "sha256:e60305c5e5d314f5389259b7f22aaa33d8f7dee49763119234af3755c55b9101"},
+]
+
+[[package]]
+name = "sortedcontainers"
+version = "2.4.0"
+description = "Sorted Containers -- Sorted List, Sorted Dict, Sorted Set"
+optional = false
+python-versions = "*"
+files = [
+ {file = "sortedcontainers-2.4.0-py2.py3-none-any.whl", hash = "sha256:a163dcaede0f1c021485e957a39245190e74249897e2ae4b2aa38595db237ee0"},
+ {file = "sortedcontainers-2.4.0.tar.gz", hash = "sha256:25caa5a06cc30b6b83d11423433f65d1f9d76c4c6a0c90e3379eaa43b9bfdb88"},
+]
+
+[[package]]
+name = "speechrecognition"
+version = "3.8.1"
+description = "Library for performing speech recognition, with support for several engines and APIs, online and offline."
+optional = false
+python-versions = "*"
+files = [
+ {file = "SpeechRecognition-3.8.1-py2.py3-none-any.whl", hash = "sha256:4d8f73a0c05ec70331c3bacaa89ecc06dfa8d9aba0899276664cda06ab597e8e"},
+]
+
+[[package]]
+name = "streamlit"
+version = "1.22.0"
+description = "A faster way to build and share data apps"
+optional = false
+python-versions = ">=3.7, !=3.9.7"
+files = [
+ {file = "streamlit-1.22.0-py2.py3-none-any.whl", hash = "sha256:520dd9b9e6efb559b5a9a22feadb48b1e6f0340ec83da3514810059fdecd4167"},
+ {file = "streamlit-1.22.0.tar.gz", hash = "sha256:5bef9bf8deef32814d9565c9df48331e6357eb0b90dabc3ec4f53c44fb34fc73"},
+]
+
+[package.dependencies]
+altair = ">=3.2.0,<5"
+blinker = ">=1.0.0"
+cachetools = ">=4.0"
+click = ">=7.0"
+gitpython = "!=3.1.19"
+importlib-metadata = ">=1.4"
+numpy = "*"
+packaging = ">=14.1"
+pandas = ">=0.25,<3"
+pillow = ">=6.2.0"
+protobuf = ">=3.12,<4"
+pyarrow = ">=4.0"
+pydeck = ">=0.1.dev5"
+pympler = ">=0.9"
+python-dateutil = "*"
+requests = ">=2.4"
+rich = ">=10.11.0"
+tenacity = ">=8.0.0,<9"
+toml = "*"
+tornado = ">=6.0.3"
+typing-extensions = ">=3.10.0.0"
+tzlocal = ">=1.1"
+validators = ">=0.2"
+watchdog = {version = "*", markers = "platform_system != \"Darwin\""}
+
+[package.extras]
+snowflake = ["snowflake-snowpark-python"]
+
+[[package]]
+name = "streamlit-chat"
+version = "0.0.2.2"
+description = ""
+optional = false
+python-versions = ">=3.8"
+files = []
+develop = false
+
+[package.dependencies]
+streamlit = ">=0.63"
+
+[package.source]
+type = "git"
+url = "https://github.com/AI-Yash/st-chat"
+reference = "ffb0583"
+resolved_reference = "ffb0583f59972adadb52164a285e8395e8a17ef7"
+
+[[package]]
+name = "tenacity"
+version = "8.2.2"
+description = "Retry code until it succeeds"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "tenacity-8.2.2-py3-none-any.whl", hash = "sha256:2f277afb21b851637e8f52e6a613ff08734c347dc19ade928e519d7d2d8569b0"},
+ {file = "tenacity-8.2.2.tar.gz", hash = "sha256:43af037822bd0029025877f3b2d97cc4d7bb0c2991000a3d59d71517c5c969e0"},
+]
+
+[package.extras]
+doc = ["reno", "sphinx", "tornado (>=4.5)"]
+
+[[package]]
+name = "tls-client"
+version = "0.2.1"
+description = "Advanced Python HTTP Client."
+optional = false
+python-versions = "*"
+files = [
+ {file = "tls_client-0.2.1-py3-none-any.whl", hash = "sha256:124a710952b979d5e20b4e2b7879b7958d6e48a259d0f5b83101055eb173f0bd"},
+ {file = "tls_client-0.2.1.tar.gz", hash = "sha256:473fb4c671d9d4ca6b818548ab6e955640dd589767bfce520830c5618c2f2e2b"},
+]
+
+[[package]]
+name = "toml"
+version = "0.10.2"
+description = "Python Library for Tom's Obvious, Minimal Language"
+optional = false
+python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
+files = [
+ {file = "toml-0.10.2-py2.py3-none-any.whl", hash = "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b"},
+ {file = "toml-0.10.2.tar.gz", hash = "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"},
+]
+
+[[package]]
+name = "toolz"
+version = "0.12.0"
+description = "List processing tools and functional utilities"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "toolz-0.12.0-py3-none-any.whl", hash = "sha256:2059bd4148deb1884bb0eb770a3cde70e7f954cfbbdc2285f1f2de01fd21eb6f"},
+ {file = "toolz-0.12.0.tar.gz", hash = "sha256:88c570861c440ee3f2f6037c4654613228ff40c93a6c25e0eba70d17282c6194"},
+]
+
+[[package]]
+name = "tornado"
+version = "6.3.2"
+description = "Tornado is a Python web framework and asynchronous networking library, originally developed at FriendFeed."
+optional = false
+python-versions = ">= 3.8"
+files = [
+ {file = "tornado-6.3.2-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:c367ab6c0393d71171123ca5515c61ff62fe09024fa6bf299cd1339dc9456829"},
+ {file = "tornado-6.3.2-cp38-abi3-macosx_10_9_x86_64.whl", hash = "sha256:b46a6ab20f5c7c1cb949c72c1994a4585d2eaa0be4853f50a03b5031e964fc7c"},
+ {file = "tornado-6.3.2-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c2de14066c4a38b4ecbbcd55c5cc4b5340eb04f1c5e81da7451ef555859c833f"},
+ {file = "tornado-6.3.2-cp38-abi3-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:05615096845cf50a895026f749195bf0b10b8909f9be672f50b0fe69cba368e4"},
+ {file = "tornado-6.3.2-cp38-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5b17b1cf5f8354efa3d37c6e28fdfd9c1c1e5122f2cb56dac121ac61baa47cbe"},
+ {file = "tornado-6.3.2-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:29e71c847a35f6e10ca3b5c2990a52ce38b233019d8e858b755ea6ce4dcdd19d"},
+ {file = "tornado-6.3.2-cp38-abi3-musllinux_1_1_i686.whl", hash = "sha256:834ae7540ad3a83199a8da8f9f2d383e3c3d5130a328889e4cc991acc81e87a0"},
+ {file = "tornado-6.3.2-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:6a0848f1aea0d196a7c4f6772197cbe2abc4266f836b0aac76947872cd29b411"},
+ {file = "tornado-6.3.2-cp38-abi3-win32.whl", hash = "sha256:7efcbcc30b7c654eb6a8c9c9da787a851c18f8ccd4a5a3a95b05c7accfa068d2"},
+ {file = "tornado-6.3.2-cp38-abi3-win_amd64.whl", hash = "sha256:0c325e66c8123c606eea33084976c832aa4e766b7dff8aedd7587ea44a604cdf"},
+ {file = "tornado-6.3.2.tar.gz", hash = "sha256:4b927c4f19b71e627b13f3db2324e4ae660527143f9e1f2e2fb404f3a187e2ba"},
+]
+
+[[package]]
+name = "trio"
+version = "0.22.0"
+description = "A friendly Python library for async concurrency and I/O"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "trio-0.22.0-py3-none-any.whl", hash = "sha256:f1dd0780a89bfc880c7c7994519cb53f62aacb2c25ff487001c0052bd721cdf0"},
+ {file = "trio-0.22.0.tar.gz", hash = "sha256:ce68f1c5400a47b137c5a4de72c7c901bd4e7a24fbdebfe9b41de8c6c04eaacf"},
+]
+
+[package.dependencies]
+async-generator = ">=1.9"
+attrs = ">=19.2.0"
+cffi = {version = ">=1.14", markers = "os_name == \"nt\" and implementation_name != \"pypy\""}
+exceptiongroup = {version = ">=1.0.0rc9", markers = "python_version < \"3.11\""}
+idna = "*"
+outcome = "*"
+sniffio = "*"
+sortedcontainers = "*"
+
+[[package]]
+name = "trio-websocket"
+version = "0.10.2"
+description = "WebSocket library for Trio"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "trio-websocket-0.10.2.tar.gz", hash = "sha256:af13e9393f9051111300287947ec595d601758ce3d165328e7d36325135a8d62"},
+ {file = "trio_websocket-0.10.2-py3-none-any.whl", hash = "sha256:0908435e4eecc49d830ae1c4d6c47b978a75f00594a2be2104d58b61a04cdb53"},
+]
+
+[package.dependencies]
+exceptiongroup = "*"
+trio = ">=0.11"
+wsproto = ">=0.14"
+
+[[package]]
+name = "twocaptcha"
+version = "0.0.1"
+description = "2Captcha Python3 API Wrapper"
+optional = false
+python-versions = "*"
+files = [
+ {file = "TwoCaptcha-0.0.1.tar.gz", hash = "sha256:fd04127de71ca4bd31c22add84a5bcb7c683cf9ee5bf503ca14a8f372ac76a0e"},
+]
+
+[package.dependencies]
+requests = "*"
+
+[[package]]
+name = "typing-extensions"
+version = "4.6.2"
+description = "Backported and Experimental Type Hints for Python 3.7+"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "typing_extensions-4.6.2-py3-none-any.whl", hash = "sha256:3a8b36f13dd5fdc5d1b16fe317f5668545de77fa0b8e02006381fd49d731ab98"},
+ {file = "typing_extensions-4.6.2.tar.gz", hash = "sha256:06006244c70ac8ee83fa8282cb188f697b8db25bc8b4df07be1873c43897060c"},
+]
+
+[[package]]
+name = "tzdata"
+version = "2023.3"
+description = "Provider of IANA time zone data"
+optional = false
+python-versions = ">=2"
+files = [
+ {file = "tzdata-2023.3-py2.py3-none-any.whl", hash = "sha256:7e65763eef3120314099b6939b5546db7adce1e7d6f2e179e3df563c70511eda"},
+ {file = "tzdata-2023.3.tar.gz", hash = "sha256:11ef1e08e54acb0d4f95bdb1be05da659673de4acbd21bf9c69e94cc5e907a3a"},
+]
+
+[[package]]
+name = "tzlocal"
+version = "5.0.1"
+description = "tzinfo object for the local timezone"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "tzlocal-5.0.1-py3-none-any.whl", hash = "sha256:f3596e180296aaf2dbd97d124fe76ae3a0e3d32b258447de7b939b3fd4be992f"},
+ {file = "tzlocal-5.0.1.tar.gz", hash = "sha256:46eb99ad4bdb71f3f72b7d24f4267753e240944ecfc16f25d2719ba89827a803"},
+]
+
+[package.dependencies]
+tzdata = {version = "*", markers = "platform_system == \"Windows\""}
+
+[package.extras]
+devenv = ["black", "check-manifest", "flake8", "pyroma", "pytest (>=4.3)", "pytest-cov", "pytest-mock (>=3.3)", "zest.releaser"]
+
+[[package]]
+name = "urllib3"
+version = "2.0.2"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.2-py3-none-any.whl", hash = "sha256:d055c2f9d38dc53c808f6fdc8eab7360b6fdbbde02340ed25cfbcd817c62469e"},
+ {file = "urllib3-2.0.2.tar.gz", hash = "sha256:61717a1095d7e155cdb737ac7bb2f4324a858a1e2e6466f6d03ff630ca68d3cc"},
+]
+
+[package.dependencies]
+pysocks = {version = ">=1.5.6,<1.5.7 || >1.5.7,<2.0", optional = true, markers = "extra == \"socks\""}
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
+[[package]]
+name = "validators"
+version = "0.20.0"
+description = "Python Data Validation for Humans™."
+optional = false
+python-versions = ">=3.4"
+files = [
+ {file = "validators-0.20.0.tar.gz", hash = "sha256:24148ce4e64100a2d5e267233e23e7afeb55316b47d30faae7eb6e7292bc226a"},
+]
+
+[package.dependencies]
+decorator = ">=3.4.0"
+
+[package.extras]
+test = ["flake8 (>=2.4.0)", "isort (>=4.2.2)", "pytest (>=2.2.3)"]
+
+[[package]]
+name = "watchdog"
+version = "3.0.0"
+description = "Filesystem events monitoring"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "watchdog-3.0.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:336adfc6f5cc4e037d52db31194f7581ff744b67382eb6021c868322e32eef41"},
+ {file = "watchdog-3.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a70a8dcde91be523c35b2bf96196edc5730edb347e374c7de7cd20c43ed95397"},
+ {file = "watchdog-3.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:adfdeab2da79ea2f76f87eb42a3ab1966a5313e5a69a0213a3cc06ef692b0e96"},
+ {file = "watchdog-3.0.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2b57a1e730af3156d13b7fdddfc23dea6487fceca29fc75c5a868beed29177ae"},
+ {file = "watchdog-3.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:7ade88d0d778b1b222adebcc0927428f883db07017618a5e684fd03b83342bd9"},
+ {file = "watchdog-3.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7e447d172af52ad204d19982739aa2346245cc5ba6f579d16dac4bfec226d2e7"},
+ {file = "watchdog-3.0.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:9fac43a7466eb73e64a9940ac9ed6369baa39b3bf221ae23493a9ec4d0022674"},
+ {file = "watchdog-3.0.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:8ae9cda41fa114e28faf86cb137d751a17ffd0316d1c34ccf2235e8a84365c7f"},
+ {file = "watchdog-3.0.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:25f70b4aa53bd743729c7475d7ec41093a580528b100e9a8c5b5efe8899592fc"},
+ {file = "watchdog-3.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4f94069eb16657d2c6faada4624c39464f65c05606af50bb7902e036e3219be3"},
+ {file = "watchdog-3.0.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7c5f84b5194c24dd573fa6472685b2a27cc5a17fe5f7b6fd40345378ca6812e3"},
+ {file = "watchdog-3.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3aa7f6a12e831ddfe78cdd4f8996af9cf334fd6346531b16cec61c3b3c0d8da0"},
+ {file = "watchdog-3.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:233b5817932685d39a7896b1090353fc8efc1ef99c9c054e46c8002561252fb8"},
+ {file = "watchdog-3.0.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:13bbbb462ee42ec3c5723e1205be8ced776f05b100e4737518c67c8325cf6100"},
+ {file = "watchdog-3.0.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:8f3ceecd20d71067c7fd4c9e832d4e22584318983cabc013dbf3f70ea95de346"},
+ {file = "watchdog-3.0.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:c9d8c8ec7efb887333cf71e328e39cffbf771d8f8f95d308ea4125bf5f90ba64"},
+ {file = "watchdog-3.0.0-py3-none-manylinux2014_aarch64.whl", hash = "sha256:0e06ab8858a76e1219e68c7573dfeba9dd1c0219476c5a44d5333b01d7e1743a"},
+ {file = "watchdog-3.0.0-py3-none-manylinux2014_armv7l.whl", hash = "sha256:d00e6be486affb5781468457b21a6cbe848c33ef43f9ea4a73b4882e5f188a44"},
+ {file = "watchdog-3.0.0-py3-none-manylinux2014_i686.whl", hash = "sha256:c07253088265c363d1ddf4b3cdb808d59a0468ecd017770ed716991620b8f77a"},
+ {file = "watchdog-3.0.0-py3-none-manylinux2014_ppc64.whl", hash = "sha256:5113334cf8cf0ac8cd45e1f8309a603291b614191c9add34d33075727a967709"},
+ {file = "watchdog-3.0.0-py3-none-manylinux2014_ppc64le.whl", hash = "sha256:51f90f73b4697bac9c9a78394c3acbbd331ccd3655c11be1a15ae6fe289a8c83"},
+ {file = "watchdog-3.0.0-py3-none-manylinux2014_s390x.whl", hash = "sha256:ba07e92756c97e3aca0912b5cbc4e5ad802f4557212788e72a72a47ff376950d"},
+ {file = "watchdog-3.0.0-py3-none-manylinux2014_x86_64.whl", hash = "sha256:d429c2430c93b7903914e4db9a966c7f2b068dd2ebdd2fa9b9ce094c7d459f33"},
+ {file = "watchdog-3.0.0-py3-none-win32.whl", hash = "sha256:3ed7c71a9dccfe838c2f0b6314ed0d9b22e77d268c67e015450a29036a81f60f"},
+ {file = "watchdog-3.0.0-py3-none-win_amd64.whl", hash = "sha256:4c9956d27be0bb08fc5f30d9d0179a855436e655f046d288e2bcc11adfae893c"},
+ {file = "watchdog-3.0.0-py3-none-win_ia64.whl", hash = "sha256:5d9f3a10e02d7371cd929b5d8f11e87d4bad890212ed3901f9b4d68767bee759"},
+ {file = "watchdog-3.0.0.tar.gz", hash = "sha256:4d98a320595da7a7c5a18fc48cb633c2e73cda78f93cac2ef42d42bf609a33f9"},
+]
+
+[package.extras]
+watchmedo = ["PyYAML (>=3.10)"]
+
+[[package]]
+name = "websocket-client"
+version = "1.5.2"
+description = "WebSocket client for Python with low level API options"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "websocket-client-1.5.2.tar.gz", hash = "sha256:c7d67c13b928645f259d9b847ab5b57fd2d127213ca41ebd880de1f553b7c23b"},
+ {file = "websocket_client-1.5.2-py3-none-any.whl", hash = "sha256:f8c64e28cd700e7ba1f04350d66422b6833b82a796b525a51e740b8cc8dab4b1"},
+]
+
+[package.extras]
+docs = ["Sphinx (>=3.4)", "sphinx-rtd-theme (>=0.5)"]
+optional = ["python-socks", "wsaccel"]
+test = ["websockets"]
+
+[[package]]
+name = "wsproto"
+version = "1.2.0"
+description = "WebSockets state-machine based protocol implementation"
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "wsproto-1.2.0-py3-none-any.whl", hash = "sha256:b9acddd652b585d75b20477888c56642fdade28bdfd3579aa24a4d2c037dd736"},
+ {file = "wsproto-1.2.0.tar.gz", hash = "sha256:ad565f26ecb92588a3e43bc3d96164de84cd9902482b130d0ddbaa9664a85065"},
+]
+
+[package.dependencies]
+h11 = ">=0.9.0,<1"
+
+[[package]]
+name = "zipp"
+version = "3.15.0"
+description = "Backport of pathlib-compatible object wrapper for zip files"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "zipp-3.15.0-py3-none-any.whl", hash = "sha256:48904fc76a60e542af151aded95726c1a5c34ed43ab4134b597665c86d7ad556"},
+ {file = "zipp-3.15.0.tar.gz", hash = "sha256:112929ad649da941c23de50f356a2b5570c954b65150642bccdd66bf194d224b"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"]
+testing = ["big-O", "flake8 (<5)", "jaraco.functools", "jaraco.itertools", "more-itertools", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)"]
+
+[metadata]
+lock-version = "2.0"
+python-versions = "^3.10"
+content-hash = "6c2e0d495c885f4e60f1496221b1e2dc1d0a512e2247b5babc767b1fed703c7e"
diff --git a/g4f/.v1/pyproject.toml b/g4f/.v1/pyproject.toml
new file mode 100644
index 0000000000000000000000000000000000000000..7281f67975a9ff95341e5904c2f88b45f17e76ce
--- /dev/null
+++ b/g4f/.v1/pyproject.toml
@@ -0,0 +1,37 @@
+[tool.poetry]
+name = "gpt4free"
+version = "0.1.0"
+description = ""
+authors = []
+license = "GPL-3.0"
+readme = "README.md"
+packages = [{ include = "gpt4free" }]
+exclude = ["**/*.txt"]
+
+[tool.poetry.dependencies]
+python = "^3.10"
+
+colorama = "^0.4.6"
+curl-cffi = "^0.5.5"
+fake-useragent = "^1.1.3"
+Levenshtein = "*"
+mailgw_temporary_email = "*"
+names = "^0.3.0"
+pycryptodome = "*"
+pydantic = "^1.10.7"
+pypasser = "^0.0.5"
+pymailtm = "*"
+random-password-generator = "*"
+requests = "^2.29.0"
+retrying = "*"
+selenium = "^4.9.0"
+streamlit-chat = { git = "https://github.com/AI-Yash/st-chat", rev = "ffb0583" }
+streamlit = "^1.21.0"
+tls-client = "^0.2"
+twocaptcha = "^0.0.1"
+websocket-client = "^1.5.1"
+
+
+[build-system]
+requires = ["poetry-core"]
+build-backend = "poetry.core.masonry.api"
diff --git a/g4f/.v1/requirements.txt b/g4f/.v1/requirements.txt
new file mode 100644
index 0000000000000000000000000000000000000000..3a1f815b17ce3ea1a3475c192e2586d65c573164
--- /dev/null
+++ b/g4f/.v1/requirements.txt
@@ -0,0 +1,21 @@
+websocket-client
+requests
+tls-client
+pypasser
+names
+colorama
+curl_cffi
+streamlit
+selenium
+fake-useragent
+twocaptcha
+https://github.com/AI-Yash/st-chat/archive/refs/pull/24/head.zip
+pydantic
+pymailtm
+Levenshtein
+retrying
+mailgw_temporary_email
+pycryptodome
+random-password-generator
+numpy>=1.22.2 # not directly required, pinned by Snyk to avoid a vulnerability
+tornado>=6.3.2 # not directly required, pinned by Snyk to avoid a vulnerability
\ No newline at end of file
diff --git a/g4f/.v1/testing/aiassistest.py b/g4f/.v1/testing/aiassistest.py
new file mode 100644
index 0000000000000000000000000000000000000000..57a34f1580ac3dc135ac025dd74236cbedbeb3c7
--- /dev/null
+++ b/g4f/.v1/testing/aiassistest.py
@@ -0,0 +1,13 @@
+import aiassist
+
+question1 = "Who won the world series in 2020?"
+req = aiassist.Completion.create(prompt=question1)
+answer = req["text"]
+message_id = req["parentMessageId"]
+
+question2 = "Where was it played?"
+req2 = aiassist.Completion.create(prompt=question2, parentMessageId=message_id)
+answer2 = req2["text"]
+
+print(answer)
+print(answer2)
diff --git a/g4f/.v1/testing/aicolors_test.py b/g4f/.v1/testing/aicolors_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..853f7e4560e9003b3a1d76119243027f06ec577c
--- /dev/null
+++ b/g4f/.v1/testing/aicolors_test.py
@@ -0,0 +1,6 @@
+from gpt4free import aicolors
+
+prompt = "Light green color"
+req = aicolors.Completion.create(prompt=prompt)
+
+print(req)
diff --git a/g4f/.v1/testing/deepai_test.py b/g4f/.v1/testing/deepai_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..474f663ecff8e48b17fc1cdae1ea6e0a89f03c0b
--- /dev/null
+++ b/g4f/.v1/testing/deepai_test.py
@@ -0,0 +1,18 @@
+from gpt4free import deepai
+
+# single completion
+for chunk in deepai.Completion.create("Write a list of possible vacation destinations:"):
+ print(chunk, end="", flush=True)
+print()
+
+# chat completion
+print("==============")
+messages = [  # taken from the OpenAI docs
+ {"role": "system", "content": "You are a helpful assistant."},
+ {"role": "user", "content": "Who won the world series in 2020?"},
+ {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
+ {"role": "user", "content": "Where was it played?"}
+]
+for chunk in deepai.ChatCompletion.create(messages):
+ print(chunk, end="", flush=True)
+print()
\ No newline at end of file
diff --git a/g4f/.v1/testing/forefront_test.py b/g4f/.v1/testing/forefront_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..b7b5c57c1e86016687611e2260078b5c800bec71
--- /dev/null
+++ b/g4f/.v1/testing/forefront_test.py
@@ -0,0 +1,9 @@
+from gpt4free import forefront
+
+# create an account
+token = forefront.Account.create(logging=True)
+print(token)
+
+# get a response
+for response in forefront.StreamingCompletion.create(token=token, prompt='hello world', model='gpt-4'):
+ print(response.text, end='')
diff --git a/g4f/.v1/testing/gptworldai_test.py b/g4f/.v1/testing/gptworldai_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..3dfb32ce17b645e21991d07124421b3dc11cbfb1
--- /dev/null
+++ b/g4f/.v1/testing/gptworldai_test.py
@@ -0,0 +1,18 @@
+import gptworldAi
+
+# single completion
+for chunk in gptworldAi.Completion.create("Who are you?", "127.0.0.1:7890"):
+ print(chunk, end="", flush=True)
+print()
+
+# chat completion
+message = []
+while True:
+    prompt = input("Question: ")
+ message.append({"role": "user", "content": prompt})
+ text = ""
+ for chunk in gptworldAi.ChatCompletion.create(message, '127.0.0.1:7890'):
+ text = text + chunk
+ print(chunk, end="", flush=True)
+ print()
+ message.append({"role": "assistant", "content": text})
diff --git a/g4f/.v1/testing/hpgptai_test.py b/g4f/.v1/testing/hpgptai_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..cdd146dd381346d689266ce05b6fa9e12f574b1b
--- /dev/null
+++ b/g4f/.v1/testing/hpgptai_test.py
@@ -0,0 +1,41 @@
+import hpgptai
+
+# single completion
+res = hpgptai.Completion.create("Who are you?", "127.0.0.1:7890")
+print(res["reply"])
+
+
+# chat completion
+messages = [
+ {
+ "content": "你是谁",
+ "html": "你是谁",
+ "id": hpgptai.ChatCompletion.randomStr(),
+ "role": "user",
+ "who": "User: ",
+ },
+ {
+ "content": "我是一位AI助手,专门为您提供各种服务和支持。我可以回答您的问题,帮助您解决问题,提供相关信息,并执行一些任务。请随时告诉我您需要什么帮助。",
+ "html": "我是一位AI助手,专门为您提供各种服务和支持。我可以回答您的问题,帮助您解决问题,提供相关信息,并执行一些任务。请随时告诉我您需要什么帮助。",
+ "id": hpgptai.ChatCompletion.randomStr(),
+ "role": "assistant",
+ "who": "AI: ",
+ },
+ {
+ "content": "我上一句问的是什么?",
+ "html": "我上一句问的是什么?",
+ "id": hpgptai.ChatCompletion.randomStr(),
+ "role": "user",
+ "who": "User: ",
+ },
+]
+res = hpgptai.ChatCompletion.create(messages, proxy="127.0.0.1:7890")
+print(res["reply"])
diff --git a/g4f/.v1/testing/italygpt2_test.py b/g4f/.v1/testing/italygpt2_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..0494c8a2bfcef5107f65f368116470050afbe9ef
--- /dev/null
+++ b/g4f/.v1/testing/italygpt2_test.py
@@ -0,0 +1,4 @@
+from gpt4free import italygpt2
+account_data=italygpt2.Account.create()
+for chunk in italygpt2.Completion.create(account_data=account_data,prompt="Who are you?"):
+ print(chunk, end="", flush=True)
\ No newline at end of file
diff --git a/g4f/.v1/testing/openaihosted_test.py b/g4f/.v1/testing/openaihosted_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..d5a79e5209bcf234b7be7bc9aa4ad6fe2864f62f
--- /dev/null
+++ b/g4f/.v1/testing/openaihosted_test.py
@@ -0,0 +1,14 @@
+import openaihosted
+
+messages = [{"role": "system", "content": "You are a helpful assistant."}]
+while True:
+ question = input("Question: ")
+ if question == "!stop":
+ break
+
+ messages.append({"role": "user", "content": question})
+ request = openaihosted.Completion.create(messages=messages)
+
+ response = request["responses"]
+ messages.append({"role": "assistant", "content": response})
+ print(f"Answer: {response}")
diff --git a/g4f/.v1/testing/poe_account_create_test.py b/g4f/.v1/testing/poe_account_create_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..41ae5a33fb7fb41b12ca022c28b333573e8dab73
--- /dev/null
+++ b/g4f/.v1/testing/poe_account_create_test.py
@@ -0,0 +1,109 @@
+from hashlib import md5
+from json import dumps
+from re import findall
+from typing import Optional
+
+from tls_client import Session as TLS
+from twocaptcha import TwoCaptcha
+
+from gpt4free.quora import extract_formkey
+from gpt4free.quora.mail import Emailnator
+
+solver = TwoCaptcha('')
+
+
+class Account:
+ @staticmethod
+ def create(proxy: Optional[str] = None, logging: bool = False, enable_bot_creation: bool = False):
+ client = TLS(client_identifier='chrome110')
+ client.proxies = {'http': f'http://{proxy}', 'https': f'http://{proxy}'} if proxy else None
+
+ mail_client = Emailnator()
+ mail_address = mail_client.get_mail()
+
+ if logging:
+ print('email', mail_address)
+
+ client.headers = {
+ 'authority': 'poe.com',
+ 'accept': '*/*',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'content-type': 'application/json',
+ 'origin': 'https://poe.com',
+ 'poe-formkey': 'null',
+ 'poe-tag-id': 'null',
+ 'poe-tchannel': 'null',
+ 'referer': 'https://poe.com/login',
+ 'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"macOS"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
+ }
+
+ client.headers["poe-formkey"] = extract_formkey(client.get('https://poe.com/login').text)
+ client.headers["poe-tchannel"] = client.get('https://poe.com/api/settings').json()['tchannelData']['channel']
+
+ # token = reCaptchaV3('https://www.recaptcha.net/recaptcha/enterprise/anchor?ar=1&k=6LflhEElAAAAAI_ewVwRWI9hsyV4mbZnYAslSvlG&co=aHR0cHM6Ly9wb2UuY29tOjQ0Mw..&hl=en&v=4PnKmGB9wRHh1i04o7YUICeI&size=invisible&cb=bi6ivxoskyal')
+ token = solver.recaptcha(
+ sitekey='6LflhEElAAAAAI_ewVwRWI9hsyV4mbZnYAslSvlG',
+ url='https://poe.com/login?redirect_url=%2F',
+ version='v3',
+ enterprise=1,
+ invisible=1,
+ action='login',
+ )['code']
+
+ payload = dumps(
+ separators=(',', ':'),
+ obj={
+ 'queryName': 'MainSignupLoginSection_sendVerificationCodeMutation_Mutation',
+ 'variables': {'emailAddress': mail_address, 'phoneNumber': None, 'recaptchaToken': token},
+ 'query': 'mutation MainSignupLoginSection_sendVerificationCodeMutation_Mutation(\n $emailAddress: String\n $phoneNumber: String\n $recaptchaToken: String\n) {\n sendVerificationCode(verificationReason: login, emailAddress: $emailAddress, phoneNumber: $phoneNumber, recaptchaToken: $recaptchaToken) {\n status\n errorMessage\n }\n}\n',
+ },
+ )
+
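+        # poe-tag-id appears to be an md5 of (payload + formkey + salt) and is
+        # recomputed for every gql_POST request.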
+ base_string = payload + client.headers["poe-formkey"] + 'WpuLMiXEKKE98j56k'
+ client.headers["poe-tag-id"] = md5(base_string.encode()).hexdigest()
+
+ print(dumps(client.headers, indent=4))
+
+ response = client.post('https://poe.com/api/gql_POST', data=payload)
+
+ if 'automated_request_detected' in response.text:
+ print('please try using a proxy / wait for fix')
+
+ if 'Bad Request' in response.text:
+ if logging:
+                print('bad request, exiting', response.json())
+ quit()
+
+ if logging:
+ print('send_code', response.json())
+
+ mail_content = mail_client.get_message()
+ mail_token = findall(r';">(\d{6,7})', mail_content)[0]
+
+ if logging:
+ print('code', mail_token)
+
+ payload = dumps(
+ separators=(',', ':'),
+ obj={
+ "queryName": "SignupOrLoginWithCodeSection_signupWithVerificationCodeMutation_Mutation",
+ "variables": {"verificationCode": str(mail_token), "emailAddress": mail_address, "phoneNumber": None},
+ "query": "mutation SignupOrLoginWithCodeSection_signupWithVerificationCodeMutation_Mutation(\n $verificationCode: String!\n $emailAddress: String\n $phoneNumber: String\n) {\n signupWithVerificationCode(verificationCode: $verificationCode, emailAddress: $emailAddress, phoneNumber: $phoneNumber) {\n status\n errorMessage\n }\n}\n",
+ },
+ )
+
+ base_string = payload + client.headers["poe-formkey"] + 'WpuLMiXEKKE98j56k'
+ client.headers["poe-tag-id"] = md5(base_string.encode()).hexdigest()
+
+ response = client.post('https://poe.com/api/gql_POST', data=payload)
+ if logging:
+ print('verify_code', response.json())
+
+
+Account.create(proxy='', logging=True)
diff --git a/g4f/.v1/testing/poe_test.py b/g4f/.v1/testing/poe_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..6edc030c3fc6d85c2cb8a27e8637391fbeac8c3f
--- /dev/null
+++ b/g4f/.v1/testing/poe_test.py
@@ -0,0 +1,13 @@
+from time import sleep
+
+from gpt4free import quora
+
+token = quora.Account.create(proxy=None, logging=True)
+print('token', token)
+
+sleep(2)
+
+for response in quora.StreamingCompletion.create(model='ChatGPT', prompt='hello world', token=token):
+ print(response.text, flush=True)
+
+quora.Account.delete(token)
diff --git a/g4f/.v1/testing/quora_test_2.py b/g4f/.v1/testing/quora_test_2.py
new file mode 100644
index 0000000000000000000000000000000000000000..297ca7a1b6b13559cc3f7f19e9ae6899e723f49b
--- /dev/null
+++ b/g4f/.v1/testing/quora_test_2.py
@@ -0,0 +1,12 @@
+from gpt4free import quora
+
+token = quora.Account.create(logging=True, enable_bot_creation=True)
+
+model = quora.Model.create(
+    token=token, model='ChatGPT',  # or 'claude-instant-v1.0'
+    system_prompt='you are ChatGPT a large language model ...'
+)
+
+print(model.name)
+
+for response in quora.StreamingCompletion.create(custom_model=model.name, prompt='hello world', token=token):
+ print(response.text)
diff --git a/g4f/.v1/testing/sqlchat_test.py b/g4f/.v1/testing/sqlchat_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..1db71be2e8abccc16cdbfc1b78a8d3e9adbf2122
--- /dev/null
+++ b/g4f/.v1/testing/sqlchat_test.py
@@ -0,0 +1,4 @@
+import sqlchat
+
+for response in sqlchat.StreamCompletion.create(prompt='write python code to reverse a string', messages=[]):
+ print(response.completion.choices[0].text, end='')
diff --git a/g4f/.v1/testing/t3nsor_test.py b/g4f/.v1/testing/t3nsor_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..6d36400d01bdae0e7506d53fa9132ef9e53286ab
--- /dev/null
+++ b/g4f/.v1/testing/t3nsor_test.py
@@ -0,0 +1,4 @@
+import t3nsor
+
+for response in t3nsor.StreamCompletion.create(prompt='write python code to reverse a string', messages=[]):
+ print(response.completion.choices[0].text)
diff --git a/g4f/.v1/testing/test_main.py b/g4f/.v1/testing/test_main.py
new file mode 100644
index 0000000000000000000000000000000000000000..7c28f1d2b7e8c6da449ac6e47358881de7ea4fe5
--- /dev/null
+++ b/g4f/.v1/testing/test_main.py
@@ -0,0 +1,27 @@
+import gpt4free
+from gpt4free import Provider, quora, forefront
+
+# usage You
+response = gpt4free.Completion.create(Provider.You, prompt='Write a poem on Lionel Messi')
+print(response)
+
+# usage Poe
+token = quora.Account.create(logging=False)
+response = gpt4free.Completion.create(Provider.Poe, prompt='Write a poem on Lionel Messi', token=token, model='ChatGPT')
+print(response)
+
+# usage forefront
+token = forefront.Account.create(logging=False)
+response = gpt4free.Completion.create(
+ Provider.ForeFront, prompt='Write a poem on Lionel Messi', model='gpt-4', token=token
+)
+print(response)
+print('END')
+
+# usage theb
+response = gpt4free.Completion.create(Provider.Theb, prompt='Write a poem on Lionel Messi')
+print(response)
+
+# usage cocalc
+response = gpt4free.Completion.create(Provider.CoCalc, prompt='Write a poem on Lionel Messi', cookie_input='')
+print(response)
diff --git a/g4f/.v1/testing/theb_test.py b/g4f/.v1/testing/theb_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..5fa80908c401a98afe362c261344c0b8624f94e9
--- /dev/null
+++ b/g4f/.v1/testing/theb_test.py
@@ -0,0 +1,5 @@
+from gpt4free import theb
+
+for token in theb.Completion.create('hello world'):
+    print(token, end='', flush=True)
+print()
diff --git a/g4f/.v1/testing/useless_test.py b/g4f/.v1/testing/useless_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..47c92386ae925c79aec64891281041cd693077d5
--- /dev/null
+++ b/g4f/.v1/testing/useless_test.py
@@ -0,0 +1,25 @@
+from gpt4free import usesless
+
+message_id = ""
+while True:
+ prompt = input("Question: ")
+ if prompt == "!stop":
+ break
+
+ req = usesless.Completion.create(prompt=prompt, parentMessageId=message_id)
+
+ print(f"Answer: {req['text']}")
+ message_id = req["id"]
+
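+# Same test again, but routed through the unified gpt4free interface instead of
+# calling the usesless module directly.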
+import gpt4free
+
+message_id = ""
+while True:
+ prompt = input("Question: ")
+ if prompt == "!stop":
+ break
+
+ req = gpt4free.Completion.create(provider=gpt4free.Provider.UseLess, prompt=prompt, parentMessageId=message_id)
+
+ print(f"Answer: {req['text']}")
+ message_id = req["id"]
diff --git a/g4f/.v1/testing/usesless_test.py b/g4f/.v1/testing/usesless_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..e2e35547b0553611f321ab571f6af57def748807
--- /dev/null
+++ b/g4f/.v1/testing/usesless_test.py
@@ -0,0 +1,13 @@
+import usesless
+
+question1 = "Who won the world series in 2020?"
+req = usesless.Completion.create(prompt=question1)
+answer = req["text"]
+message_id = req["parentMessageId"]
+
+question2 = "Where was it played?"
+req2 = usesless.Completion.create(prompt=question2, parentMessageId=message_id)
+answer2 = req2["text"]
+
+print(answer)
+print(answer2)
diff --git a/g4f/.v1/testing/writesonic_test.py b/g4f/.v1/testing/writesonic_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..aff847f09f44b3569414e849dcf17c39e7b43b96
--- /dev/null
+++ b/g4f/.v1/testing/writesonic_test.py
@@ -0,0 +1,35 @@
+import writesonic
+
+# create account (3-4s)
+account = writesonic.Account.create(logging=True)
+
+# with logging:
+# 2023-04-06 21:50:25 INFO __main__ -> register success : '{"id":"51aa0809-3053-44f7-922a...' (2s)
+# 2023-04-06 21:50:25 INFO __main__ -> id : '51aa0809-3053-44f7-922a-2b85d8d07edf'
+# 2023-04-06 21:50:25 INFO __main__ -> token : 'eyJhbGciOiJIUzI1NiIsInR5cCI6Ik...'
+# 2023-04-06 21:50:28 INFO __main__ -> got key : '194158c4-d249-4be0-82c6-5049e869533c' (2s)
+
+# simple completion
+response = writesonic.Completion.create(api_key=account.key, prompt='hello world')
+
+print(response.completion.choices[0].text) # Hello! How may I assist you today?
+
+# conversation
+
+response = writesonic.Completion.create(
+ api_key=account.key,
+ prompt='what is my name ?',
+ enable_memory=True,
+ history_data=[{'is_sent': True, 'message': 'my name is Tekky'}, {'is_sent': False, 'message': 'hello Tekky'}],
+)
+
+print(response.completion.choices[0].text) # Your name is Tekky.
+
+# enable internet
+
+response = writesonic.Completion.create(
+    api_key=account.key, prompt='who won the Qatar World Cup?', enable_google_results=True
+)
+
+print(response.completion.choices[0].text) # Argentina won the 2022 FIFA World Cup tournament held in Qatar ...
diff --git a/g4f/.v1/testing/you_test.py b/g4f/.v1/testing/you_test.py
new file mode 100644
index 0000000000000000000000000000000000000000..1e9f620507a3bb4ff5e546cf693cfe3764ac437f
--- /dev/null
+++ b/g4f/.v1/testing/you_test.py
@@ -0,0 +1,27 @@
+from gpt4free import you
+
+# simple request with links and details
+response = you.Completion.create(prompt="hello world", detailed=True, include_links=True)
+
+print(response)
+
+# {
+# "response": "...",
+# "links": [...],
+# "extra": {...},
+# "slots": {...}
+# }
+
+# chatbot
+
+chat = []
+
+while True:
+ prompt = input("You: ")
+
+ response = you.Completion.create(prompt=prompt, chat=chat)
+
+ print("Bot:", response.text)
+
+ chat.append({"question": prompt, "answer": response.text})
diff --git a/g4f/.v1/unfinished/bard/README.md b/g4f/.v1/unfinished/bard/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..67e8645ced188f048308ad80accee8ef900ef6ef
--- /dev/null
+++ b/g4f/.v1/unfinished/bard/README.md
@@ -0,0 +1,2 @@
+to do:
+- code refactoring
\ No newline at end of file
diff --git a/g4f/.v1/unfinished/bard/__init__.py b/g4f/.v1/unfinished/bard/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..f1d68b9281f7462f2f80a9b14d4c05795c05898d
--- /dev/null
+++ b/g4f/.v1/unfinished/bard/__init__.py
@@ -0,0 +1,93 @@
+from json import dumps, loads
+from os import getenv
+from random import randint
+from re import search
+from urllib.parse import urlencode
+
+from bard.typings import BardResponse
+from dotenv import load_dotenv
+from requests import Session
+
+load_dotenv()
+token = getenv('1psid')
+proxy = getenv('proxy')
+
+temperatures = {
+ 0: "Generate text strictly following known patterns, with no creativity.",
+ 0.1: "Produce text adhering closely to established patterns, allowing minimal creativity.",
+ 0.2: "Create text with modest deviations from familiar patterns, injecting a slight creative touch.",
+ 0.3: "Craft text with a mild level of creativity, deviating somewhat from common patterns.",
+ 0.4: "Formulate text balancing creativity and recognizable patterns for coherent results.",
+ 0.5: "Generate text with a moderate level of creativity, allowing for a mix of familiarity and novelty.",
+ 0.6: "Compose text with an increased emphasis on creativity, while partially maintaining familiar patterns.",
+ 0.7: "Produce text favoring creativity over typical patterns for more original results.",
+ 0.8: "Create text heavily focused on creativity, with limited concern for familiar patterns.",
+ 0.9: "Craft text with a strong emphasis on unique and inventive ideas, largely ignoring established patterns.",
+ 1: "Generate text with maximum creativity, disregarding any constraints of known patterns or structures."
+}
+
+
+class Completion:
+ def create(
+ prompt: str = 'hello world',
+            temperature: float = None,
+ conversation_id: str = '',
+ response_id: str = '',
+ choice_id: str = '') -> BardResponse:
+
+ if temperature:
+ prompt = f'''settings: follow these settings for your response: [temperature: {temperature} - {temperatures[temperature]}] | prompt : {prompt}'''
+
+ client = Session()
+ client.proxies = {
+ 'http': f'http://{proxy}',
+ 'https': f'http://{proxy}'} if proxy else None
+
+ client.headers = {
+ 'authority': 'bard.google.com',
+ 'content-type': 'application/x-www-form-urlencoded;charset=UTF-8',
+ 'origin': 'https://bard.google.com',
+ 'referer': 'https://bard.google.com/',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36',
+ 'x-same-domain': '1',
+ 'cookie': f'__Secure-1PSID={token}'
+ }
+
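+        # Bard embeds a per-session SNlM0e value in the homepage HTML; it has to be
+        # echoed back as the 'at' form field of every StreamGenerate request.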
+ snlm0e = search(r'SNlM0e\":\"(.*?)\"',
+ client.get('https://bard.google.com/').text).group(1)
+
+ params = urlencode({
+ 'bl': 'boq_assistant-bard-web-server_20230326.21_p0',
+ '_reqid': randint(1111, 9999),
+ 'rt': 'c',
+ })
+
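+        # f.req is a doubly JSON-encoded blob: the inner array carries the prompt and
+        # the (conversation_id, response_id, choice_id) triple used to resume a chat.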
+ response = client.post(
+ f'https://bard.google.com/_/BardChatUi/data/assistant.lamda.BardFrontendService/StreamGenerate?{params}',
+ data={
+ 'at': snlm0e,
+ 'f.req': dumps([None, dumps([
+ [prompt],
+ None,
+ [conversation_id, response_id, choice_id],
+ ])])
+ }
+ )
+
+ chat_data = loads(response.content.splitlines()[3])[0][2]
+        if not chat_data:
+            print('error, retrying')
+            return Completion.create(prompt, temperature,
+                                     conversation_id, response_id, choice_id)
+
+ json_chat_data = loads(chat_data)
+ results = {
+ 'content': json_chat_data[0][0],
+ 'conversation_id': json_chat_data[1][0],
+ 'response_id': json_chat_data[1][1],
+ 'factualityQueries': json_chat_data[3],
+ 'textQuery': json_chat_data[2][0] if json_chat_data[2] is not None else '',
+ 'choices': [{'id': i[0], 'content': i[1]} for i in json_chat_data[4]],
+ }
+
+ return BardResponse(results)
diff --git a/g4f/.v1/unfinished/bard/typings.py b/g4f/.v1/unfinished/bard/typings.py
new file mode 100644
index 0000000000000000000000000000000000000000..75b73bf9e5228ec3f636d184df6f5dd07b8dcd91
--- /dev/null
+++ b/g4f/.v1/unfinished/bard/typings.py
@@ -0,0 +1,54 @@
+from typing import Dict, List, Union
+
+
+class BardResponse:
+ def __init__(self, json_dict: Dict[str, Union[str, List]]) -> None:
+ """
+ Initialize a BardResponse object.
+
+ :param json_dict: A dictionary containing the JSON response data.
+ """
+ self.json = json_dict
+
+ self.content = json_dict.get('content')
+ self.conversation_id = json_dict.get('conversation_id')
+ self.response_id = json_dict.get('response_id')
+ self.factuality_queries = json_dict.get('factualityQueries', [])
+ self.text_query = json_dict.get('textQuery', [])
+ self.choices = [self.BardChoice(choice)
+ for choice in json_dict.get('choices', [])]
+
+ def __repr__(self) -> str:
+ """
+ Return a string representation of the BardResponse object.
+
+ :return: A string representation of the BardResponse object.
+ """
+ return f"BardResponse(conversation_id={self.conversation_id}, response_id={self.response_id}, content={self.content})"
+
+ def filter_choices(self, keyword: str) -> List['BardChoice']:
+ """
+ Filter the choices based on a keyword.
+
+ :param keyword: The keyword to filter choices by.
+ :return: A list of filtered BardChoice objects.
+ """
+ return [choice for choice in self.choices if keyword.lower() in choice.content.lower()]
+
+ class BardChoice:
+ def __init__(self, choice_dict: Dict[str, str]) -> None:
+ """
+ Initialize a BardChoice object.
+
+ :param choice_dict: A dictionary containing the choice data.
+ """
+ self.id = choice_dict.get('id')
+ self.content = choice_dict.get('content')[0]
+
+ def __repr__(self) -> str:
+ """
+ Return a string representation of the BardChoice object.
+
+ :return: A string representation of the BardChoice object.
+ """
+ return f"BardChoice(id={self.id}, content={self.content})"
diff --git a/g4f/.v1/unfinished/bing/README.md b/g4f/.v1/unfinished/bing/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..67e8645ced188f048308ad80accee8ef900ef6ef
--- /dev/null
+++ b/g4f/.v1/unfinished/bing/README.md
@@ -0,0 +1,2 @@
+to do:
+- code refactoring
\ No newline at end of file
diff --git a/g4f/.v1/unfinished/bing/__ini__.py b/g4f/.v1/unfinished/bing/__ini__.py
new file mode 100644
index 0000000000000000000000000000000000000000..1e4fd149dd2371c54989bf3b6e034fd60e156213
--- /dev/null
+++ b/g4f/.v1/unfinished/bing/__ini__.py
@@ -0,0 +1,108 @@
+# Import necessary libraries
+import asyncio
+from json import dumps, loads
+from ssl import create_default_context
+
+import websockets
+from browser_cookie3 import edge
+from certifi import where
+from requests import get
+
+# Set up SSL context
+ssl_context = create_default_context()
+ssl_context.load_verify_locations(where())
+
+
+def format(msg: dict) -> str:
+ """Format message as JSON string with delimiter."""
+ return dumps(msg) + '\x1e'
+
+
+def get_token():
+ """Retrieve token from browser cookies."""
+ cookies = {c.name: c.value for c in edge(domain_name='bing.com')}
+ return cookies['_U']
+
+
+class AsyncCompletion:
+ async def create(
+ prompt: str = 'hello world',
+ optionSets: list = [
+ 'deepleo',
+ 'enable_debug_commands',
+ 'disable_emoji_spoken_text',
+ 'enablemm',
+ 'h3relaxedimg'
+ ],
+ token: str = get_token()):
+ """Create a connection to Bing AI and send the prompt."""
+
+ # Send create request
+ create = get('https://edgeservices.bing.com/edgesvc/turing/conversation/create',
+ headers={
+ 'host': 'edgeservices.bing.com',
+ 'authority': 'edgeservices.bing.com',
+ 'cookie': f'_U={token}',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36 Edg/110.0.1587.69',
+ }
+ )
+
+ # Extract conversation data
+ conversationId = create.json()['conversationId']
+ clientId = create.json()['clientId']
+ conversationSignature = create.json()['conversationSignature']
+
+ # Connect to WebSocket
+ wss = await websockets.connect('wss://sydney.bing.com/sydney/ChatHub', max_size=None, ssl=ssl_context,
+ extra_headers={
+ # Add necessary headers
+ }
+ )
+
+ # Send JSON protocol version
+ await wss.send(format({'protocol': 'json', 'version': 1}))
+ await wss.recv()
+
+ # Define message structure
+ struct = {
+ # Add necessary message structure
+ }
+
+ # Send message
+ await wss.send(format(struct))
+
+ # Process responses
+ base_string = ''
+ final = False
+ while not final:
+ objects = str(await wss.recv()).split('\x1e')
+ for obj in objects:
+ if obj is None or obj == '':
+ continue
+
+ response = loads(obj)
+            if response.get('type') == 1 and response['arguments'][0].get('messages'):
+ response_text = response['arguments'][0]['messages'][0]['adaptiveCards'][0]['body'][0].get(
+ 'text')
+
+ yield (response_text.replace(base_string, ''))
+ base_string = response_text
+
+ elif response.get('type') == 2:
+ final = True
+
+ await wss.close()
+
+
+async def run():
+ """Run the async completion and print the result."""
+ async for value in AsyncCompletion.create(
+ prompt='summarize cinderella with each word beginning with a consecutive letter of the alphabet, a-z',
+ optionSets=[
+ "galileo",
+ ]
+ ):
+ print(value, end='', flush=True)
+
+
+asyncio.run(run())
diff --git a/g4f/.v1/unfinished/chatpdf/__init__.py b/g4f/.v1/unfinished/chatpdf/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..30dc1d3e60365e97957dbfe6d702b1d5b2e39d01
--- /dev/null
+++ b/g4f/.v1/unfinished/chatpdf/__init__.py
@@ -0,0 +1,82 @@
+import requests
+import json
+
+from queue import Queue, Empty
+from threading import Thread
+from json import loads
+from re import findall
+
+
+class Completion:
+    # Shared streaming state; the chunk-parsing helpers referenced in create()
+    # (regex, part1, part2) are still TODO and need to be defined before this runs.
+    stream_completed = False
+    message_queue = Queue()
+
+ def request(prompt: str):
+ '''TODO: some sort of authentication + upload PDF from URL or local file
+ Then you should get the atoken and chat ID
+ '''
+
+ token = "your_token_here"
+ chat_id = "your_chat_id_here"
+
+ url = "https://chat-pr4yueoqha-ue.a.run.app/"
+
+ payload = json.dumps({
+ "v": 2,
+ "chatSession": {
+ "type": "join",
+ "chatId": chat_id
+ },
+ "history": [
+ {
+ "id": "VNsSyJIq_0",
+ "author": "p_if2GPSfyN8hjDoA7unYe",
+ "msg": "",
+ "time": 1682672009270
+ },
+ {
+ "id": "Zk8DRUtx_6",
+ "author": "uplaceholder",
+ "msg": prompt,
+ "time": 1682672181339
+ }
+ ]
+ })
+
+ # TODO: fix headers, use random user-agent, streaming response, etc
+ headers = {
+ 'authority': 'chat-pr4yueoqha-ue.a.run.app',
+ 'accept': '*/*',
+ 'accept-language': 'en-US,en;q=0.9',
+ 'atoken': token,
+ 'content-type': 'application/json',
+ 'origin': 'https://www.chatpdf.com',
+ 'referer': 'https://www.chatpdf.com/',
+ 'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'cross-site',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36'
+ }
+
+ response = requests.request(
+ "POST", url, headers=headers, data=payload).text
+ Completion.stream_completed = True
+ return {'response': response}
+
+ @staticmethod
+ def create(prompt: str):
+ Thread(target=Completion.request, args=[prompt]).start()
+
+        while not Completion.stream_completed or not Completion.message_queue.empty():
+ try:
+ message = Completion.message_queue.get(timeout=0.01)
+ for message in findall(Completion.regex, message):
+ yield loads(Completion.part1 + message + Completion.part2)['delta']
+
+ except Empty:
+ pass
+
+ @staticmethod
+ def handle_stream_response(response):
+ Completion.message_queue.put(response.decode())
diff --git a/g4f/.v1/unfinished/gptbz/README.md b/g4f/.v1/unfinished/gptbz/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..05bc2770e0f5b20407b49e54870df4c09902886d
--- /dev/null
+++ b/g4f/.v1/unfinished/gptbz/README.md
@@ -0,0 +1,4 @@
+https://chat.gpt.bz
+
+to do:
+- code refactoring
\ No newline at end of file
diff --git a/g4f/.v1/unfinished/gptbz/__init__.py b/g4f/.v1/unfinished/gptbz/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e95d5716960fdf1862744534f1a22522efd864a8
--- /dev/null
+++ b/g4f/.v1/unfinished/gptbz/__init__.py
@@ -0,0 +1,46 @@
+from json import dumps, loads
+
+import websockets
+
+
+# Define the asynchronous function to test the WebSocket connection
+
+
+async def test():
+ # Establish a WebSocket connection with the specified URL
+ async with websockets.connect('wss://chatgpt.func.icu/conversation+ws') as wss:
+
+ # Prepare the message payload as a JSON object
+ payload = {
+ 'content_type': 'text',
+ 'engine': 'chat-gpt',
+ 'parts': ['hello world'],
+ 'options': {}
+ }
+
+ # Send the payload to the WebSocket server
+ await wss.send(dumps(obj=payload, separators=(',', ':')))
+
+ # Initialize a variable to track the end of the conversation
+ ended = None
+
+ # Continuously receive and process messages until the conversation ends
+ while not ended:
+ try:
+ # Receive and parse the JSON response from the server
+ response = await wss.recv()
+ json_response = loads(response)
+
+ # Print the entire JSON response
+ print(json_response)
+
+ # Check for the end of the conversation
+ ended = json_response.get('eof')
+
+ # If the conversation has not ended, print the received message
+ if not ended:
+ print(json_response['content']['parts'][0])
+
+ # Handle cases when the connection is closed by the server
+ except websockets.ConnectionClosed:
+ break
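+
+
+# Example usage (assumption: this module is executed directly):
+# import asyncio
+# asyncio.run(test())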
diff --git a/g4f/.v1/unfinished/openprompt/README.md b/g4f/.v1/unfinished/openprompt/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..489d054aeb56c15d30d3cda1e8ef350c7ff3167a
--- /dev/null
+++ b/g4f/.v1/unfinished/openprompt/README.md
@@ -0,0 +1,5 @@
+https://openprompt.co/
+
+to do:
+- finish integrating email client
+- code refactoring
\ No newline at end of file
diff --git a/g4f/.v1/unfinished/openprompt/create.py b/g4f/.v1/unfinished/openprompt/create.py
new file mode 100644
index 0000000000000000000000000000000000000000..c968c162dbd4e44e4f29e5dfbf270c28963cb97b
--- /dev/null
+++ b/g4f/.v1/unfinished/openprompt/create.py
@@ -0,0 +1,64 @@
+from json import dumps
+# from mail import MailClient
+from re import findall
+
+from requests import post, get
+
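+# Grab a disposable mailbox from developermail.com: the page exposes the address in
+# a mailto: link and the mailbox id in a cookie.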
+html = get('https://developermail.com/mail/')
+print(html.cookies.get('mailboxId'))
+email = findall(r'mailto:(.*)">', html.text)[0]
+
+headers = {
+ 'apikey': 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6InVzanNtdWZ1emRjcnJjZXVobnlqIiwicm9sZSI6ImFub24iLCJpYXQiOjE2NzgyODYyMzYsImV4cCI6MTk5Mzg2MjIzNn0.2MQ9Lkh-gPqQwV08inIgqozfbYm5jdYWtf-rn-wfQ7U',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
+ 'x-client-info': '@supabase/auth-helpers-nextjs@0.5.6',
+}
+
+json_data = {
+ 'email': email,
+ 'password': 'T4xyt4Yn6WWQ4NC',
+ 'data': {},
+ 'gotrue_meta_security': {},
+}
+
+response = post('https://usjsmufuzdcrrceuhnyj.supabase.co/auth/v1/signup', headers=headers, json=json_data)
+print(response.json())
+
+# email_link = None
+# while not email_link:
+# sleep(1)
+
+# mails = mailbox.getmails()
+# print(mails)
+
+
+quit()
+
+url = input("Enter the url: ")
+response = get(url, allow_redirects=False)
+
+# https://openprompt.co/#access_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk0ODcxLCJzdWIiOiI4NWNkNTNiNC1lZTUwLTRiMDQtOGJhNS0wNTUyNjk4ODliZDIiLCJlbWFpbCI6ImNsc2J5emdqcGhiQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTAwNzF9XSwic2Vzc2lvbl9pZCI6ImY4MTg1YTM5LTkxYzgtNGFmMy1iNzAxLTdhY2MwY2MwMGNlNSJ9.UvcTfpyIM1TdzM8ZV6UAPWfa0rgNq4AiqeD0INy6zV8&expires_in=604800&refresh_token=_Zp8uXIA2InTDKYgo8TCqA&token_type=bearer&type=signup
+
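+# The confirmation link redirects back to openprompt.co with access_token and
+# refresh_token in the URL fragment; both are needed to build the supabase cookie.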
+redirect = response.headers.get('location')
+access_token = redirect.split('&')[0].split('=')[1]
+refresh_token = redirect.split('&')[2].split('=')[1]
+
+supabase_auth_token = dumps([access_token, refresh_token, None, None, None], separators=(',', ':'))
+print(supabase_auth_token)
+
+cookies = {
+ 'supabase-auth-token': supabase_auth_token
+}
+
+json_data = {
+ 'messages': [
+ {
+ 'role': 'user',
+ 'content': 'how do I reverse a string in python?'
+ }
+ ]
+}
+
+response = post('https://openprompt.co/api/chat2', cookies=cookies, json=json_data, stream=True)
+for chunk in response.iter_content(chunk_size=1024):
+ print(chunk)
diff --git a/g4f/.v1/unfinished/openprompt/mail.py b/g4f/.v1/unfinished/openprompt/mail.py
new file mode 100644
index 0000000000000000000000000000000000000000..1130e7df1d309fdf35e13380cf81a1d162d14ff5
--- /dev/null
+++ b/g4f/.v1/unfinished/openprompt/mail.py
@@ -0,0 +1,111 @@
+import email
+
+import requests
+
+
+class MailClient:
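+    """Thin client for the developermail.com temporary-mailbox REST API."""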
+
+ def __init__(self):
+ self.username = None
+ self.token = None
+ self.raw = None
+ self.mailids = None
+ self.mails = None
+ self.mail = None
+
+ def create(self, force=False):
+ headers = {
+ 'accept': 'application/json',
+ }
+
+ if self.username:
+ pass
+ else:
+ self.response = requests.put(
+ 'https://www.developermail.com/api/v1/mailbox', headers=headers)
+ self.response = self.response.json()
+ self.username = self.response['result']['name']
+ self.token = self.response['result']['token']
+
+ return {'username': self.username, 'token': self.token}
+
+ def destroy(self):
+ headers = {
+ 'accept': 'application/json',
+ 'X-MailboxToken': self.token,
+ }
+ self.response = requests.delete(
+ f'https://www.developermail.com/api/v1/mailbox/{self.username}', headers=headers)
+ self.response = self.response.json()
+ self.username = None
+ self.token = None
+ return self.response
+
+ def newtoken(self):
+ headers = {
+ 'accept': 'application/json',
+ 'X-MailboxToken': self.token,
+ }
+ self.response = requests.put(
+ f'https://www.developermail.com/api/v1/mailbox/{self.username}/token', headers=headers)
+ self.response = self.response.json()
+ self.token = self.response['result']['token']
+ return {'username': self.username, 'token': self.token}
+
+ def getmailids(self):
+ headers = {
+ 'accept': 'application/json',
+ 'X-MailboxToken': self.token,
+ }
+
+ self.response = requests.get(
+ f'https://www.developermail.com/api/v1/mailbox/{self.username}', headers=headers)
+ self.response = self.response.json()
+ self.mailids = self.response['result']
+ return self.mailids
+
+ def getmails(self, mailids: list = None):
+ headers = {
+ 'accept': 'application/json',
+ 'X-MailboxToken': self.token,
+ 'Content-Type': 'application/json',
+ }
+
+ if mailids is None:
+ mailids = self.mailids
+
+        # The request body is sent as JSON (see Content-Type above); str(list) would
+        # produce single-quoted, invalid JSON, so let requests serialize the ids.
+        self.response = requests.post(
+            f'https://www.developermail.com/api/v1/mailbox/{self.username}/messages', headers=headers,
+            json=mailids)
+ self.response = self.response.json()
+ self.mails = self.response['result']
+ return self.mails
+
+ def getmail(self, mailid: str, raw=False):
+ headers = {
+ 'accept': 'application/json',
+ 'X-MailboxToken': self.token,
+ }
+ self.response = requests.get(
+ f'https://www.developermail.com/api/v1/mailbox/{self.username}/messages/{mailid}', headers=headers)
+ self.response = self.response.json()
+ self.mail = self.response['result']
+ if raw is False:
+ self.mail = email.message_from_string(self.mail)
+ return self.mail
+
+ def delmail(self, mailid: str):
+ headers = {
+ 'accept': 'application/json',
+ 'X-MailboxToken': self.token,
+ }
+ self.response = requests.delete(
+ f'https://www.developermail.com/api/v1/mailbox/{self.username}/messages/{mailid}', headers=headers)
+ self.response = self.response.json()
+ return self.response
+
+
+client = MailClient()
+client.create()
+client.getmailids()
+print(client.getmails())
diff --git a/g4f/.v1/unfinished/openprompt/main.py b/g4f/.v1/unfinished/openprompt/main.py
new file mode 100644
index 0000000000000000000000000000000000000000..e68a3b638950e2b486b7aa37dbd596679d5ce6af
--- /dev/null
+++ b/g4f/.v1/unfinished/openprompt/main.py
@@ -0,0 +1,36 @@
+import requests
+
+cookies = {
+ 'supabase-auth-token': '["eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk1NzQyLCJzdWIiOiJlOGExOTdiNS03YTAxLTQ3MmEtODQ5My1mNGUzNTNjMzIwNWUiLCJlbWFpbCI6InFlY3RncHZhamlibGNjQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTA5NDJ9XSwic2Vzc2lvbl9pZCI6IjIwNTg5MmE5LWU5YTAtNDk2Yi1hN2FjLWEyMWVkMTkwZDA4NCJ9.o7UgHpiJMfa6W-UKCSCnAncIfeOeiHz-51sBmokg0MA","RtPKeb7KMMC9Dn2fZOfiHA",null,null,null]',
+}
+
+headers = {
+ 'authority': 'openprompt.co',
+ 'accept': '*/*',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'content-type': 'application/json',
+ # 'cookie': 'supabase-auth-token=%5B%22eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjkzMjQ4LCJzdWIiOiJlODQwNTZkNC0xZWJhLTQwZDktOWU1Mi1jMTc4MTUwN2VmNzgiLCJlbWFpbCI6InNia2didGJnZHB2bHB0ZUBidWdmb28uY29tIiwicGhvbmUiOiIiLCJhcHBfbWV0YWRhdGEiOnsicHJvdmlkZXIiOiJlbWFpbCIsInByb3ZpZGVycyI6WyJlbWFpbCJdfSwidXNlcl9tZXRhZGF0YSI6e30sInJvbGUiOiJhdXRoZW50aWNhdGVkIiwiYWFsIjoiYWFsMSIsImFtciI6W3sibWV0aG9kIjoib3RwIiwidGltZXN0YW1wIjoxNjgxNjg4NDQ4fV0sInNlc3Npb25faWQiOiJiNDhlMmU3NS04NzlhLTQxZmEtYjQ4MS01OWY0OTgxMzg3YWQifQ.5-3E7WvMMVkXewD1qA26Rv4OFSTT82wYUBXNGcYaYfQ%22%2C%22u5TGGMMeT3zZA0agm5HGuA%22%2Cnull%2Cnull%2Cnull%5D',
+ 'origin': 'https://openprompt.co',
+ 'referer': 'https://openprompt.co/ChatGPT',
+ 'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"macOS"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
+}
+
+json_data = {
+ 'messages': [
+ {
+ 'role': 'user',
+ 'content': 'hello world',
+ },
+ ],
+}
+
+response = requests.post('https://openprompt.co/api/chat2', cookies=cookies, headers=headers, json=json_data,
+ stream=True)
+for chunk in response.iter_content(chunk_size=1024):
+ print(chunk)
diff --git a/g4f/.v1/unfinished/openprompt/test.py b/g4f/.v1/unfinished/openprompt/test.py
new file mode 100644
index 0000000000000000000000000000000000000000..65319cb60b47350453224eb2d9b0f6095c7629c1
--- /dev/null
+++ b/g4f/.v1/unfinished/openprompt/test.py
@@ -0,0 +1,6 @@
+access_token = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk0ODcxLCJzdWIiOiI4NWNkNTNiNC1lZTUwLTRiMDQtOGJhNS0wNTUyNjk4ODliZDIiLCJlbWFpbCI6ImNsc2J5emdqcGhiQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTAwNzF9XSwic2Vzc2lvbl9pZCI6ImY4MTg1YTM5LTkxYzgtNGFmMy1iNzAxLTdhY2MwY2MwMGNlNSJ9.UvcTfpyIM1TdzM8ZV6UAPWfa0rgNq4AiqeD0INy6zV'
+supabase_auth_token = '%5B%22eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk0ODcxLCJzdWIiOiI4NWNkNTNiNC1lZTUwLTRiMDQtOGJhNS0wNTUyNjk4ODliZDIiLCJlbWFpbCI6ImNsc2J5emdqcGhiQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTAwNzF9XSwic2Vzc2lvbl9pZCI6ImY4MTg1YTM5LTkxYzgtNGFmMy1iNzAxLTdhY2MwY2MwMGNlNSJ9.UvcTfpyIM1TdzM8ZV6UAPWfa0rgNq4AiqeD0INy6zV8%22%2C%22_Zp8uXIA2InTDKYgo8TCqA%22%2Cnull%2Cnull%2Cnull%5D'
+
+idk = [
+ "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiJhdXRoZW50aWNhdGVkIiwiZXhwIjoxNjgyMjk0ODcxLCJzdWIiOiI4NWNkNTNiNC1lZTUwLTRiMDQtOGJhNS0wNTUyNjk4ODliZDIiLCJlbWFpbCI6ImNsc2J5emdqcGhiQGJ1Z2Zvby5jb20iLCJwaG9uZSI6IiIsImFwcF9tZXRhZGF0YSI6eyJwcm92aWRlciI6ImVtYWlsIiwicHJvdmlkZXJzIjpbImVtYWlsIl19LCJ1c2VyX21ldGFkYXRhIjp7fSwicm9sZSI6ImF1dGhlbnRpY2F0ZWQiLCJhYWwiOiJhYWwxIiwiYW1yIjpbeyJtZXRob2QiOiJvdHAiLCJ0aW1lc3RhbXAiOjE2ODE2OTAwNzF9XSwic2Vzc2lvbl9pZCI6ImY4MTg1YTM5LTkxYzgtNGFmMy1iNzAxLTdhY2MwY2MwMGNlNSJ9.UvcTfpyIM1TdzM8ZV6UAPWfa0rgNq4AiqeD0INy6zV8",
+ "_Zp8uXIA2InTDKYgo8TCqA", None, None, None]
diff --git a/g4f/.v1/unfinished/t3nsor/README.md b/g4f/.v1/unfinished/t3nsor/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..2790bf6e5fb5ab314395757168c26c956e0395fe
--- /dev/null
+++ b/g4f/.v1/unfinished/t3nsor/README.md
@@ -0,0 +1,44 @@
+### note: currently patched
+
+### Example: `t3nsor` (use like the openai pypi package)
+
+```python
+# Import t3nsor
+import t3nsor
+
+# t3nsor.Completion.create
+# t3nsor.StreamCompletion.create
+
+[...]
+
+```
+
+#### Example Chatbot
+```python
+messages = []
+
+while True:
+ user = input('you: ')
+
+ t3nsor_cmpl = t3nsor.Completion.create(
+ prompt = user,
+ messages = messages
+ )
+
+ print('gpt:', t3nsor_cmpl.completion.choices[0].text)
+
+ messages.extend([
+ {'role': 'user', 'content': user },
+ {'role': 'assistant', 'content': t3nsor_cmpl.completion.choices[0].text}
+ ])
+```
+
+#### Streaming Response:
+
+```python
+for response in t3nsor.StreamCompletion.create(
+ prompt = 'write python code to reverse a string',
+ messages = []):
+
+ print(response.completion.choices[0].text)
+```
diff --git a/g4f/.v1/unfinished/t3nsor/__init__.py b/g4f/.v1/unfinished/t3nsor/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..9b588e982231638b8aafb5491e28fb0236f133e0
--- /dev/null
+++ b/g4f/.v1/unfinished/t3nsor/__init__.py
@@ -0,0 +1,136 @@
+from time import time
+
+from requests import post
+
+headers = {
+ 'authority': 'www.t3nsor.tech',
+ 'accept': '*/*',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'origin': 'https://www.t3nsor.tech',
+ 'pragma': 'no-cache',
+ 'referer': 'https://www.t3nsor.tech/',
+ 'sec-ch-ua': '"Chromium";v="112", "Google Chrome";v="112", "Not:A-Brand";v="99"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"macOS"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Safari/537.36',
+}
+
+
+class T3nsorResponse:
+ class Completion:
+ class Choices:
+ def __init__(self, choice: dict) -> None:
+ self.text = choice['text']
+ self.content = self.text.encode()
+ self.index = choice['index']
+ self.logprobs = choice['logprobs']
+ self.finish_reason = choice['finish_reason']
+
+ def __repr__(self) -> str:
+ return f'''<__main__.APIResponse.Completion.Choices(\n text = {self.text.encode()},\n index = {self.index},\n logprobs = {self.logprobs},\n finish_reason = {self.finish_reason})object at 0x1337>'''
+
+ def __init__(self, choices: dict) -> None:
+ self.choices = [self.Choices(choice) for choice in choices]
+
+ class Usage:
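+        # t3nsor only reports character counts, so they are surfaced here under the
+        # familiar *_tokens attribute names.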
+ def __init__(self, usage_dict: dict) -> None:
+ self.prompt_tokens = usage_dict['prompt_chars']
+ self.completion_tokens = usage_dict['completion_chars']
+ self.total_tokens = usage_dict['total_chars']
+
+ def __repr__(self):
+ return f'''<__main__.APIResponse.Usage(\n prompt_tokens = {self.prompt_tokens},\n completion_tokens = {self.completion_tokens},\n total_tokens = {self.total_tokens})object at 0x1337>'''
+
+ def __init__(self, response_dict: dict) -> None:
+ self.response_dict = response_dict
+ self.id = response_dict['id']
+ self.object = response_dict['object']
+ self.created = response_dict['created']
+ self.model = response_dict['model']
+ self.completion = self.Completion(response_dict['choices'])
+ self.usage = self.Usage(response_dict['usage'])
+
+ def json(self) -> dict:
+ return self.response_dict
+
+
+class Completion:
+ model = {
+ 'model': {
+ 'id': 'gpt-3.5-turbo',
+ 'name': 'Default (GPT-3.5)'
+ }
+ }
+
+ def create(
+ prompt: str = 'hello world',
+ messages: list = []) -> T3nsorResponse:
+ response = post('https://www.t3nsor.tech/api/chat', headers=headers, json=Completion.model | {
+ 'messages': messages,
+ 'key': '',
+ 'prompt': prompt
+ })
+
+ return T3nsorResponse({
+ 'id': f'cmpl-1337-{int(time())}',
+ 'object': 'text_completion',
+ 'created': int(time()),
+ 'model': Completion.model,
+ 'choices': [{
+ 'text': response.text,
+ 'index': 0,
+ 'logprobs': None,
+ 'finish_reason': 'stop'
+ }],
+ 'usage': {
+ 'prompt_chars': len(prompt),
+ 'completion_chars': len(response.text),
+ 'total_chars': len(prompt) + len(response.text)
+ }
+ })
+
+
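+# Same endpoint as Completion, but reads the HTTP response in chunks and
+# yields one T3nsorResponse per chunk to emulate streaming.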
+class StreamCompletion:
+ model = {
+ 'model': {
+ 'id': 'gpt-3.5-turbo',
+ 'name': 'Default (GPT-3.5)'
+ }
+ }
+
+ @staticmethod
+ def create(
+ prompt: str = 'hello world',
+ messages: list = []) -> T3nsorResponse:
+ print('The t3nsor API is down; this may not work. Consider using another module.')
+
+ response = post('https://www.t3nsor.tech/api/chat', headers=headers, stream=True, json=Completion.model | {
+ 'messages': messages,
+ 'key': '',
+ 'prompt': prompt
+ })
+
+ for chunk in response.iter_content(chunk_size=2046):
+ yield T3nsorResponse({
+ 'id': f'cmpl-1337-{int(time())}',
+ 'object': 'text_completion',
+ 'created': int(time()),
+ 'model': Completion.model,
+
+ 'choices': [{
+ 'text': chunk.decode(),
+ 'index': 0,
+ 'logprobs': None,
+ 'finish_reason': 'stop'
+ }],
+
+ 'usage': {
+ 'prompt_chars': len(prompt),
+ 'completion_chars': len(chunk.decode()),
+ 'total_chars': len(prompt) + len(chunk.decode())
+ }
+ })
diff --git a/g4f/Provider/Provider.py b/g4f/Provider/Provider.py
index 185df846e446c7e7dc63f0bf98572403ef647cae..12c23333f87185e5fa0ae8f368540c816ab079f8 100644
--- a/g4f/Provider/Provider.py
+++ b/g4f/Provider/Provider.py
@@ -4,9 +4,12 @@ from ..typing import sha256, Dict, get_type_hints
url = None
model = None
supports_stream = False
+needs_auth = False
def _create_completion(model: str, messages: list, stream: bool, **kwargs):
return
+
params = f'g4f.Providers.{os.path.basename(__file__)[:-3]} supports: ' + \
- '(%s)' % ', '.join([f"{name}: {get_type_hints(_create_completion)[name].__name__}" for name in _create_completion.__code__.co_varnames[:_create_completion.__code__.co_argcount]])
\ No newline at end of file
+ '(%s)' % ', '.join(
+ [f"{name}: {get_type_hints(_create_completion)[name].__name__}" for name in _create_completion.__code__.co_varnames[:_create_completion.__code__.co_argcount]])
diff --git a/g4f/Provider/Providers/Aichat.py b/g4f/Provider/Providers/Aichat.py
new file mode 100644
index 0000000000000000000000000000000000000000..e4fde8c309f9ba0666c5edce35d777fa8fcf8d41
--- /dev/null
+++ b/g4f/Provider/Providers/Aichat.py
@@ -0,0 +1,44 @@
+import os, requests
+from ...typing import sha256, Dict, get_type_hints
+
+url = 'https://chat-gpt.org/chat'
+model = ['gpt-3.5-turbo']
+supports_stream = False
+needs_auth = False
+
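+# Flattens the message history into a plain 'role: content' transcript and
+# posts it to chat-gpt.org's /api/text endpoint; the reply is not streamed.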
+def _create_completion(model: str, messages: list, stream: bool, **kwargs):
+ base = ''
+ for message in messages:
+ base += '%s: %s\n' % (message['role'], message['content'])
+ base += 'assistant:'
+
+
+ headers = {
+ 'authority': 'chat-gpt.org',
+ 'accept': '*/*',
+ 'cache-control': 'no-cache',
+ 'content-type': 'application/json',
+ 'origin': 'https://chat-gpt.org',
+ 'pragma': 'no-cache',
+ 'referer': 'https://chat-gpt.org/chat',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"macOS"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/113.0.0.0 Safari/537.36',
+ }
+
+ json_data = {
+ 'message':base,
+ 'temperature': 1,
+ 'presence_penalty': 0,
+ 'top_p': 1,
+ 'frequency_penalty': 0
+ }
+
+ response = requests.post('https://chat-gpt.org/api/text', headers=headers, json=json_data)
+ yield response.json()['message']
+
+params = f'g4f.Providers.{os.path.basename(__file__)[:-3]} supports: ' + \
+ '(%s)' % ', '.join([f"{name}: {get_type_hints(_create_completion)[name].__name__}" for name in _create_completion.__code__.co_varnames[:_create_completion.__code__.co_argcount]])
\ No newline at end of file
diff --git a/g4f/Provider/Providers/Ails.py b/g4f/Provider/Providers/Ails.py
new file mode 100644
index 0000000000000000000000000000000000000000..1a14b2e9aec50328b7b21d5980bd67c5eaee2b3a
--- /dev/null
+++ b/g4f/Provider/Providers/Ails.py
@@ -0,0 +1,91 @@
+import os
+import time
+import json
+import uuid
+import random
+import hashlib
+import requests
+
+from ...typing import sha256, Dict, get_type_hints
+from datetime import datetime
+
+url: str = 'https://ai.ls'
+model: str = 'gpt-3.5-turbo'
+supports_stream = True
+needs_auth = False
+
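+# Request-signing helpers mirroring the ai.ls web client: hash() builds the
+# sha256 signature the API expects from 'timestamp:message:key:length' (the
+# secretKey bytearray it defines is unused), and format_timestamp() nudges the
+# millisecond timestamp so its last digit is odd, as the original client
+# appears to do.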
+class Utils:
+ def hash(json_data: Dict[str, str]) -> sha256:
+
+ secretKey: bytearray = bytearray([79, 86, 98, 105, 91, 84, 80, 78, 123, 83,
+ 35, 41, 99, 123, 51, 54, 37, 57, 63, 103, 59, 117, 115, 108, 41, 67, 76])
+
+ base_string: str = '%s:%s:%s:%s' % (
+ json_data['t'],
+ json_data['m'],
+ 'WI,2rU#_r:r~aF4aJ36[.Z(/8Rv93Rf',
+ len(json_data['m'])
+ )
+
+ return hashlib.sha256(base_string.encode()).hexdigest()
+
+ def format_timestamp(timestamp: int) -> str:
+
+ e = timestamp
+ n = e % 10
+ r = n + 1 if n % 2 == 0 else n
+ return str(e - n + r)
+
+
+def _create_completion(model: str, messages: list, temperature: float = 0.6, stream: bool = False, **kwargs):
+
+ headers = {
+ 'authority': 'api.caipacity.com',
+ 'accept': '*/*',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'authorization': 'Bearer free',
+ 'client-id': str(uuid.uuid4()),
+ 'client-v': '0.1.217',
+ 'content-type': 'application/json',
+ 'origin': 'https://ai.ls',
+ 'referer': 'https://ai.ls/',
+ 'sec-ch-ua': '"Not.A/Brand";v="8", "Chromium";v="114", "Google Chrome";v="114"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'cross-site',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36',
+ }
+
+ params = {
+ 'full': 'false',
+ }
+
+ timestamp = Utils.format_timestamp(int(time.time() * 1000))
+
+ sig = {
+ 'd': datetime.now().strftime('%Y-%m-%d'),
+ 't': timestamp,
+ 's': Utils.hash({
+ 't': timestamp,
+ 'm': messages[-1]['content']})}
+
+ json_data = json.dumps(separators=(',', ':'), obj={
+ 'model': 'gpt-3.5-turbo',
+ 'temperature': 0.6,
+ 'stream': True,
+ 'messages': messages} | sig)
+
+ response = requests.post('https://api.caipacity.com/v1/chat/completions',
+ headers=headers, data=json_data, stream=True)
+
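+ # The endpoint streams OpenAI-style SSE lines; strip the 'data: ' prefix
+ # and yield each delta's content as it arrives.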
+ for token in response.iter_lines():
+ if b'content' in token:
+ completion_chunk = json.loads(token.decode().replace('data: ', ''))
+ token = completion_chunk['choices'][0]['delta'].get('content')
+ if token is not None:
+ yield token
+
+params = f'g4f.Providers.{os.path.basename(__file__)[:-3]} supports: ' + \
+ '(%s)' % ', '.join([f"{name}: {get_type_hints(_create_completion)[name].__name__}" for name in _create_completion.__code__.co_varnames[:_create_completion.__code__.co_argcount]])
\ No newline at end of file
diff --git a/g4f/Provider/Providers/Bard.py b/g4f/Provider/Providers/Bard.py
new file mode 100644
index 0000000000000000000000000000000000000000..4c37c4b719430031fce41ce49946f0e6ac93d155
--- /dev/null
+++ b/g4f/Provider/Providers/Bard.py
@@ -0,0 +1,74 @@
+import os, requests, json, browser_cookie3, re, random
+from ...typing import sha256, Dict, get_type_hints
+
+url = 'https://bard.google.com'
+model = ['Palm2']
+supports_stream = False
+needs_auth = True
+
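+# Requires a logged-in Google session: the __Secure-1PSID cookie is read from
+# the local Chrome profile via browser_cookie3 and the SNlM0e token is scraped
+# from the Bard homepage before the chat request is posted.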
+def _create_completion(model: str, messages: list, stream: bool, **kwargs):
+ psid = {cookie.name: cookie.value for cookie in browser_cookie3.chrome(
+ domain_name='.google.com')}['__Secure-1PSID']
+
+ formatted = '\n'.join([
+ '%s: %s' % (message['role'], message['content']) for message in messages
+ ])
+ prompt = f'{formatted}\nAssistant:'
+
+ proxy = kwargs.get('proxy', False)
+ if not proxy:
+ print('Warning: no proxy was given. Google Bard is blocked in many countries, so this may not work.')
+
+ snlm0e = None
+ conversation_id = None
+ response_id = None
+ choice_id = None
+
+ client = requests.Session()
+ client.proxies = {
+ 'http': f'http://{proxy}',
+ 'https': f'http://{proxy}'} if proxy else None
+
+ client.headers = {
+ 'authority': 'bard.google.com',
+ 'content-type': 'application/x-www-form-urlencoded;charset=UTF-8',
+ 'origin': 'https://bard.google.com',
+ 'referer': 'https://bard.google.com/',
+ 'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36',
+ 'x-same-domain': '1',
+ 'cookie': f'__Secure-1PSID={psid}'
+ }
+
+ snlm0e = re.search(r'SNlM0e\":\"(.*?)\"',
+ client.get('https://bard.google.com/').text).group(1) if not snlm0e else snlm0e
+
+ params = {
+ 'bl': 'boq_assistant-bard-web-server_20230326.21_p0',
+ '_reqid': random.randint(1111, 9999),
+ 'rt': 'c'
+ }
+
+ data = {
+ 'at': snlm0e,
+ 'f.req': json.dumps([None, json.dumps([[prompt], None, [conversation_id, response_id, choice_id]])])}
+
+ intents = '.'.join([
+ 'assistant',
+ 'lamda',
+ 'BardFrontendService'
+ ])
+
+ response = client.post(f'https://bard.google.com/_/BardChatUi/data/{intents}/StreamGenerate',
+ data=data, params=params)
+
+ chat_data = json.loads(response.content.splitlines()[3])[0][2]
+ if chat_data:
+ json_chat_data = json.loads(chat_data)
+
+ yield json_chat_data[0][0]
+
+ else:
+ yield 'error'
+
+params = f'g4f.Providers.{os.path.basename(__file__)[:-3]} supports: ' + \
+ '(%s)' % ', '.join([f"{name}: {get_type_hints(_create_completion)[name].__name__}" for name in _create_completion.__code__.co_varnames[:_create_completion.__code__.co_argcount]])
\ No newline at end of file
diff --git a/g4f/Provider/Providers/Bing.py b/g4f/Provider/Providers/Bing.py
new file mode 100644
index 0000000000000000000000000000000000000000..2ec2cf054ef208682f585fe182a589d9907e1304
--- /dev/null
+++ b/g4f/Provider/Providers/Bing.py
@@ -0,0 +1,351 @@
+import os
+import json
+import random
+import uuid
+import ssl
+import certifi
+import aiohttp
+import asyncio
+
+import requests
+from ...typing import sha256, Dict, get_type_hints
+
+url = 'https://bing.com/chat'
+model = ['gpt-4']
+supports_stream = True
+needs_auth = False
+
+ssl_context = ssl.create_default_context()
+ssl_context.load_verify_locations(certifi.where())
+
+
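+# Flag presets sent with every chat request; _create_completion below uses the
+# 'jailbreak' preset.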
+class optionsSets:
+ optionSet: dict = {
+ 'tone': str,
+ 'optionsSets': list
+ }
+
+ jailbreak: dict = {
+ "optionsSets": [
+ 'saharasugg',
+ 'enablenewsfc',
+ 'clgalileo',
+ 'gencontentv3',
+ "nlu_direct_response_filter",
+ "deepleo",
+ "disable_emoji_spoken_text",
+ "responsible_ai_policy_235",
+ "enablemm",
+ "h3precise"
+ # "harmonyv3",
+ "dtappid",
+ "cricinfo",
+ "cricinfov2",
+ "dv3sugg",
+ "nojbfedge"
+ ]
+ }
+
+
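+# Constants used to mimic the Bing web client: the record delimiter, a random
+# US x-forwarded-for address, allowed message types, slice ids and a fixed
+# Los Angeles location hint.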
+class Defaults:
+ delimiter = '\x1e'
+ ip_address = f'13.{random.randint(104, 107)}.{random.randint(0, 255)}.{random.randint(0, 255)}'
+
+ allowedMessageTypes = [
+ 'Chat',
+ 'Disengaged',
+ 'AdsQuery',
+ 'SemanticSerp',
+ 'GenerateContentQuery',
+ 'SearchQuery',
+ 'ActionRequest',
+ 'Context',
+ 'Progress',
+ 'AdsQuery',
+ 'SemanticSerp'
+ ]
+
+ sliceIds = [
+
+ # "222dtappid",
+ # "225cricinfo",
+ # "224locals0"
+
+ 'winmuid3tf',
+ 'osbsdusgreccf',
+ 'ttstmout',
+ 'crchatrev',
+ 'winlongmsgtf',
+ 'ctrlworkpay',
+ 'norespwtf',
+ 'tempcacheread',
+ 'temptacache',
+ '505scss0',
+ '508jbcars0',
+ '515enbotdets0',
+ '5082tsports',
+ '515vaoprvs',
+ '424dagslnv1s0',
+ 'kcimgattcf',
+ '427startpms0'
+ ]
+
+ location = {
+ 'locale': 'en-US',
+ 'market': 'en-US',
+ 'region': 'US',
+ 'locationHints': [
+ {
+ 'country': 'United States',
+ 'state': 'California',
+ 'city': 'Los Angeles',
+ 'timezoneoffset': 8,
+ 'countryConfidence': 8,
+ 'Center': {
+ 'Latitude': 34.0536909,
+ 'Longitude': -118.242766
+ },
+ 'RegionType': 2,
+ 'SourceType': 1
+ }
+ ],
+ }
+
+
+def _format(msg: dict) -> str:
+ return json.dumps(msg, ensure_ascii=False) + Defaults.delimiter
+
+
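+# Asks bing.com for a fresh conversation (id, client id and signature),
+# retrying up to five times before giving up.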
+async def create_conversation():
+ for _ in range(5):
+ create = requests.get('https://www.bing.com/turing/conversation/create',
+ headers={
+ 'authority': 'edgeservices.bing.com',
+ 'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7',
+ 'accept-language': 'en-US,en;q=0.9',
+ 'cache-control': 'max-age=0',
+ 'sec-ch-ua': '"Chromium";v="110", "Not A(Brand";v="24", "Microsoft Edge";v="110"',
+ 'sec-ch-ua-arch': '"x86"',
+ 'sec-ch-ua-bitness': '"64"',
+ 'sec-ch-ua-full-version': '"110.0.1587.69"',
+ 'sec-ch-ua-full-version-list': '"Chromium";v="110.0.5481.192", "Not A(Brand";v="24.0.0.0", "Microsoft Edge";v="110.0.1587.69"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-model': '""',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-ch-ua-platform-version': '"15.0.0"',
+ 'sec-fetch-dest': 'document',
+ 'sec-fetch-mode': 'navigate',
+ 'sec-fetch-site': 'none',
+ 'sec-fetch-user': '?1',
+ 'upgrade-insecure-requests': '1',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36 Edg/110.0.1587.69',
+ 'x-edge-shopping-flag': '1',
+ 'x-forwarded-for': Defaults.ip_address
+ })
+
+ conversationId = create.json().get('conversationId')
+ clientId = create.json().get('clientId')
+ conversationSignature = create.json().get('conversationSignature')
+
+ if conversationId and clientId and conversationSignature:
+ return conversationId, clientId, conversationSignature
+
+ if _ == 4:
+ raise Exception('Failed to create conversation.')
+
+
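+# Opens the Sydney ChatHub websocket, sends the JSON protocol handshake, posts
+# the chat request (optionally with earlier messages attached as WebPage
+# context) and yields newly received text from type-1 updates until the final
+# type-2 message arrives.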
+async def stream_generate(prompt: str, mode: optionsSets.optionSet = optionsSets.jailbreak, context: bool or str = False):
+ timeout = aiohttp.ClientTimeout(total=900)
+ session = aiohttp.ClientSession(timeout=timeout)
+
+ conversationId, clientId, conversationSignature = await create_conversation()
+
+ wss = await session.ws_connect('wss://sydney.bing.com/sydney/ChatHub', ssl=ssl_context, autoping=False,
+ headers={
+ 'accept': 'application/json',
+ 'accept-language': 'en-US,en;q=0.9',
+ 'content-type': 'application/json',
+ 'sec-ch-ua': '"Not_A Brand";v="99", "Microsoft Edge";v="110", "Chromium";v="110"',
+ 'sec-ch-ua-arch': '"x86"',
+ 'sec-ch-ua-bitness': '"64"',
+ 'sec-ch-ua-full-version': '"109.0.1518.78"',
+ 'sec-ch-ua-full-version-list': '"Chromium";v="110.0.5481.192", "Not A(Brand";v="24.0.0.0", "Microsoft Edge";v="110.0.1587.69"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-model': '',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-ch-ua-platform-version': '"15.0.0"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'x-ms-client-request-id': str(uuid.uuid4()),
+ 'x-ms-useragent': 'azsdk-js-api-client-factory/1.0.0-beta.1 core-rest-pipeline/1.10.0 OS/Win32',
+ 'Referer': 'https://www.bing.com/search?q=Bing+AI&showconv=1&FORM=hpcodx',
+ 'Referrer-Policy': 'origin-when-cross-origin',
+ 'x-forwarded-for': Defaults.ip_address
+ })
+
+ await wss.send_str(_format({'protocol': 'json', 'version': 1}))
+ await wss.receive(timeout=900)
+
+ struct = {
+ 'arguments': [
+ {
+ **mode,
+ 'source': 'cib',
+ 'allowedMessageTypes': Defaults.allowedMessageTypes,
+ 'sliceIds': Defaults.sliceIds,
+ 'traceId': os.urandom(16).hex(),
+ 'isStartOfSession': True,
+ 'message': Defaults.location | {
+ 'author': 'user',
+ 'inputMethod': 'Keyboard',
+ 'text': prompt,
+ 'messageType': 'Chat'
+ },
+ 'conversationSignature': conversationSignature,
+ 'participant': {
+ 'id': clientId
+ },
+ 'conversationId': conversationId
+ }
+ ],
+ 'invocationId': '0',
+ 'target': 'chat',
+ 'type': 4
+ }
+
+ if context:
+ struct['arguments'][0]['previousMessages'] = [
+ {
+ "author": "user",
+ "description": context,
+ "contextType": "WebPage",
+ "messageType": "Context",
+ "messageId": "discover-web--page-ping-mriduna-----"
+ }
+ ]
+
+ await wss.send_str(_format(struct))
+
+ final = False
+ draw = False
+ resp_txt = ''
+ result_text = ''
+ resp_txt_no_link = ''
+ cache_text = ''
+
+ while not final:
+ msg = await wss.receive(timeout=900)
+ objects = msg.data.split(Defaults.delimiter)
+
+ for obj in objects:
+ if obj is None or not obj:
+ continue
+
+ response = json.loads(obj)
+ if response.get('type') == 1 and response['arguments'][0].get('messages',):
+ if not draw:
+ if (response['arguments'][0]['messages'][0]['contentOrigin'] != 'Apology') and not draw:
+ resp_txt = result_text + \
+ response['arguments'][0]['messages'][0]['adaptiveCards'][0]['body'][0].get(
+ 'text', '')
+ resp_txt_no_link = result_text + \
+ response['arguments'][0]['messages'][0].get(
+ 'text', '')
+
+ if response['arguments'][0]['messages'][0].get('messageType',):
+ resp_txt = (
+ resp_txt
+ + response['arguments'][0]['messages'][0]['adaptiveCards'][0]['body'][0]['inlines'][0].get('text')
+ + '\n'
+ )
+ result_text = (
+ result_text
+ + response['arguments'][0]['messages'][0]['adaptiveCards'][0]['body'][0]['inlines'][0].get('text')
+ + '\n'
+ )
+
+ if cache_text.endswith(' '):
+ final = True
+ if wss and not wss.closed:
+ await wss.close()
+ if session and not session.closed:
+ await session.close()
+
+ yield (resp_txt.replace(cache_text, ''))
+ cache_text = resp_txt
+
+ elif response.get('type') == 2:
+ if response['item']['result'].get('error'):
+ if wss and not wss.closed:
+ await wss.close()
+ if session and not session.closed:
+ await session.close()
+
+ raise Exception(
+ f"{response['item']['result']['value']}: {response['item']['result']['message']}")
+
+ if draw:
+ cache = response['item']['messages'][1]['adaptiveCards'][0]['body'][0]['text']
+ response['item']['messages'][1]['adaptiveCards'][0]['body'][0]['text'] = (
+ cache + resp_txt)
+
+ if (response['item']['messages'][-1]['contentOrigin'] == 'Apology' and resp_txt):
+ response['item']['messages'][-1]['text'] = resp_txt_no_link
+ response['item']['messages'][-1]['adaptiveCards'][0]['body'][0]['text'] = resp_txt
+
+ # print('Preserved the message from being deleted', file=sys.stderr)
+
+ final = True
+ if wss and not wss.closed:
+ await wss.close()
+ if session and not session.closed:
+ await session.close()
+
+
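+# Drives the async generator on a private event loop so the provider can expose
+# a plain synchronous generator to the rest of g4f.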
+def run(generator):
+ loop = asyncio.new_event_loop()
+ asyncio.set_event_loop(loop)
+ gen = generator.__aiter__()
+
+ while True:
+ try:
+ next_val = loop.run_until_complete(gen.__anext__())
+ yield next_val
+
+ except StopAsyncIteration:
+ break
+ #print('Done')
+
+
+
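+# Flattens earlier messages into Bing's '[role](#message)' context format,
+# which stream_generate attaches as a previous WebPage message.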
+def convert(messages):
+ context = ""
+
+ for message in messages:
+ context += "[%s](#message)\n%s\n\n" % (message['role'],
+ message['content'])
+
+ return context
+
+
+def _create_completion(model: str, messages: list, stream: bool, **kwargs):
+ if len(messages) < 2:
+ prompt = messages[0]['content']
+ context = False
+
+ else:
+ prompt = messages[-1]['content']
+ context = convert(messages[:-1])
+
+ response = run(stream_generate(prompt, optionsSets.jailbreak, context))
+ for token in response:
+ yield (token)
+
+ #print('Done')
+
+
+params = f'g4f.Providers.{os.path.basename(__file__)[:-3]} supports: ' + \
+ '(%s)' % ', '.join(
+ [f"{name}: {get_type_hints(_create_completion)[name].__name__}" for name in _create_completion.__code__.co_varnames[:_create_completion.__code__.co_argcount]])
diff --git a/g4f/Provider/Providers/ChatgptAi.py b/g4f/Provider/Providers/ChatgptAi.py
new file mode 100644
index 0000000000000000000000000000000000000000..00d4cf6f6bfb6435de9978900829662b26f12047
--- /dev/null
+++ b/g4f/Provider/Providers/ChatgptAi.py
@@ -0,0 +1,51 @@
+import os
+import requests, re
+from ...typing import sha256, Dict, get_type_hints
+
+url = 'https://chatgpt.ai/gpt-4/'
+model = ['gpt-4']
+supports_stream = False
+needs_auth = False
+
+def _create_completion(model: str, messages: list, stream: bool, **kwargs):
+ chat = ''
+ for message in messages:
+ chat += '%s: %s\n' % (message['role'], message['content'])
+ chat += 'assistant: '
+
+ response = requests.get('https://chatgpt.ai/gpt-4/')
+
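+ # The page embeds a WordPress AI-chat widget; scrape its nonce, post id and
+ # bot id from the data-* attributes so the admin-ajax.php call is accepted.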
+ nonce, post_id, _, bot_id = re.findall(r'data-nonce="(.*)"\n data-post-id="(.*)"\n data-url="(.*)"\n data-bot-id="(.*)"\n data-width', response.text)[0]
+
+ headers = {
+ 'authority': 'chatgpt.ai',
+ 'accept': '*/*',
+ 'accept-language': 'en,fr-FR;q=0.9,fr;q=0.8,es-ES;q=0.7,es;q=0.6,en-US;q=0.5,am;q=0.4,de;q=0.3',
+ 'cache-control': 'no-cache',
+ 'origin': 'https://chatgpt.ai',
+ 'pragma': 'no-cache',
+ 'referer': 'https://chatgpt.ai/gpt-4/',
+ 'sec-ch-ua': '"Not.A/Brand";v="8", "Chromium";v="114", "Google Chrome";v="114"',
+ 'sec-ch-ua-mobile': '?0',
+ 'sec-ch-ua-platform': '"Windows"',
+ 'sec-fetch-dest': 'empty',
+ 'sec-fetch-mode': 'cors',
+ 'sec-fetch-site': 'same-origin',
+ 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36',
+ }
+ data = {
+ '_wpnonce': nonce,
+ 'post_id': post_id,
+ 'url': 'https://chatgpt.ai/gpt-4',
+ 'action': 'wpaicg_chat_shortcode_message',
+ 'message': chat,
+ 'bot_id': bot_id
+ }
+
+ response = requests.post('https://chatgpt.ai/wp-admin/admin-ajax.php',
+ headers=headers, data=data)
+
+ yield (response.json()['data'])
+
+params = f'g4f.Providers.{os.path.basename(__file__)[:-3]} supports: ' + \
+ '(%s)' % ', '.join([f"{name}: {get_type_hints(_create_completion)[name].__name__}" for name in _create_completion.__code__.co_varnames[:_create_completion.__code__.co_argcount]])
\ No newline at end of file
diff --git a/g4f/Provider/Providers/ChatgptLogin.py b/g4f/Provider/Providers/ChatgptLogin.py
new file mode 100644
index 0000000000000000000000000000000000000000..9551d15dd5121c4b42f80d0ba547a10f0868563b
--- /dev/null
+++ b/g4f/Provider/Providers/ChatgptLogin.py
@@ -0,0 +1,96 @@
+import os
+from ...typing import sha256, Dict, get_type_hints
+import requests
+import re
+import base64
+
+url = 'https://chatgptlogin.ac'
+model = ['gpt-3.5-turbo']
+supports_stream = False
+needs_auth = False
+
+
+def _create_completion(model: str, messages: list, stream: bool, **kwargs):
+ def get_nonce():
+ res = requests.get('https://chatgptlogin.ac/use-chatgpt-free/', headers={
+ "Referer": "https://chatgptlogin.ac/use-chatgpt-free/",
+ "User-Agent": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36'
+ })
+
+ src = re.search(r'class="mwai-chat mwai-chatgpt">.*Send