---
title: Eshop Chat
emoji: 🦀
colorFrom: indigo
colorTo: gray
sdk: docker
pinned: false
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference

### How to run on your localhost? (For Mac users)

- Install the Docker Desktop for Mac app so the Docker engine is available to VS Code.
- Install VS Code.
- Open the project in VS Code.
- Create a `.env` file on your local machine only (DO NOT ADD OR PUSH IT TO GIT).
- Add your OpenAI API key to the `.env` file, e.g. `OPENAI_API_KEY=sk-WSISHD232UDDUDJHHQQTT` (a sketch of how the key can be loaded appears at the end of this README).
- Press CMD+SHIFT+P and select `Dev Containers: Rebuild Container`. This creates a Docker container environment and runs the code inside the container, so you don't have to install any project-specific packages on your local machine.
- Use the menu Terminal > New Terminal to open a shell, then run the Chainlit server with the following command (a minimal example of a Chainlit entry point is sketched at the end of this README):
  - `chainlit run app.py --port 7860`
- If everything works, you can access the application at http://localhost:7860/. Note that port 7860 is required by Hugging Face to make the application accessible via the Hugging Face URL (after `git push`).

### BestBuy Datasets

- https://github.com/BestBuyAPIs/open-data-set

### This app only uses the first 100 products' information

- It uses the products.json data to create normalized categories.
- A Google Colab notebook is used to explore the data: https://colab.research.google.com/drive/17iPtcrdEHnqcECF6xakGB0HaWG33PzFw?usp=sharing
- It generates a product spec from the product and category features, which is then used to create vector embeddings and save them in a vector store (Chroma). A sketch of this pipeline is shown below.
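The `.env` file is only read locally. A minimal sketch of loading the key, assuming the `python-dotenv` package is available (the actual app may load its configuration differently):

```python
# Sketch: read OPENAI_API_KEY from the local .env file.
import os

from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the project root
openai_api_key = os.environ["OPENAI_API_KEY"]
```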
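For orientation, this is a minimal sketch of the kind of entry point `chainlit run app.py --port 7860` expects, using the standard Chainlit decorators. The real `app.py` in this repo does more (retrieval against the vector store, calls to the OpenAI API), so treat this only as an illustration:

```python
# Minimal Chainlit app sketch (not the repo's actual app.py).
import chainlit as cl


@cl.on_chat_start
async def start():
    # Greet the user when a new chat session opens.
    await cl.Message(content="Hi! Ask me about BestBuy products.").send()


@cl.on_message
async def main(message: cl.Message):
    # Placeholder reply; the real app would query the Chroma vector store
    # and an OpenAI model here before answering.
    await cl.Message(content=f"You asked: {message.content}").send()
```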
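Below is a sketch of the data pipeline described above: load products.json, keep only the first 100 products, build a short "product spec" string per product from its name, categories, and description, and store embeddings in a Chroma collection. The field names (`sku`, `name`, `category`, `description`), the spec format, and the OpenAI embedding setup are assumptions for illustration, not the repo's exact code:

```python
# Sketch: first 100 products from products.json -> product spec -> Chroma.
import json
import os

import chromadb
from chromadb.utils import embedding_functions

with open("products.json") as f:
    products = json.load(f)[:100]  # only the first 100 products are used


def product_spec(p: dict) -> str:
    # Assumed BestBuy fields: "name", "category" (list of {"id", "name"}),
    # "description". The real spec format in this repo may differ.
    categories = ", ".join(c["name"] for c in p.get("category", []))
    return f"{p.get('name', '')}. Categories: {categories}. {p.get('description', '')}"


# Assumed embedding model; the app may use a different one.
openai_ef = embedding_functions.OpenAIEmbeddingFunction(
    api_key=os.environ["OPENAI_API_KEY"],
    model_name="text-embedding-ada-002",
)

client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("products", embedding_function=openai_ef)

# Chroma computes and stores the embeddings for the supplied documents.
collection.add(
    ids=[str(p["sku"]) for p in products],
    documents=[product_spec(p) for p in products],
)
```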