barunsaha committed
Commit 223053f
1 Parent(s): 4184417

Add description on how to use the offline mode of SlideDeck AI

Files changed (1): README.md (+24, -0)
README.md CHANGED
@@ -47,6 +47,8 @@ Different LLMs offer different styles of content generation. Use one of the foll

  The Mistral models do not mandatorily require an access token. However, you are encouraged to get and use your own Hugging Face access token.

+ In addition, offline LLMs provided by Ollama can be used. Read below to learn more.
+

  # Icons

@@ -62,6 +64,28 @@ To run this project by yourself, you need to provide the `HUGGINGFACEHUB_API_TOK
  for example, in a `.env` file. Alternatively, you can provide the access token in the app's user interface itself (UI). For other LLM providers, the API key can only be specified in the UI. For image search, the `PEXEL_API_KEY` should be made available as an environment variable.
  Visit the respective websites to obtain the API keys.

+ ## Offline LLMs Using Ollama
+
+ SlideDeck AI allows the use of offline LLMs to generate the contents of the slide decks. This is typically suitable for individuals or organizations who would like to use self-hosted LLMs, for example, because of privacy concerns.
+
+ Offline LLMs are made available via Ollama. Therefore, a prerequisite here is to have [Ollama installed](https://ollama.com/download) on the system and the desired [LLM](https://ollama.com/search) pulled locally.
+
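For example, pulling a model before first use could look like this — a sketch, where the model name `mistral` is only an illustration, not a requirement of SlideDeck AI:

```bash
ollama pull mistral   # Download the model into the local Ollama store
ollama list           # Confirm the model now appears among the local models
```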
+ In addition, the `RUN_IN_OFFLINE_MODE` environment variable needs to be set to `True` to enable the offline mode. This can be done, for example, using a `.env` file or from the terminal. The typical steps to use SlideDeck AI in offline mode (in a `bash` shell) are as follows:
+
+ ```bash
+ ollama list  # View locally available LLMs
+ export RUN_IN_OFFLINE_MODE=True  # Enable the offline mode to use Ollama
+ git clone https://github.com/barun-saha/slide-deck-ai.git
+ cd slide-deck-ai
+ streamlit run ./app.py  # Run the application
+ ```
+
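As a plain `bash` usage note (shell behavior, not something specific to SlideDeck AI), the variable can also be set for a single run instead of being exported:

```bash
# Set RUN_IN_OFFLINE_MODE only for this one invocation of the app
RUN_IN_OFFLINE_MODE=True streamlit run ./app.py
```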
+ The `.env` file should be created inside the `slide-deck-ai` directory.
+
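A minimal sketch of such a `.env` file, assuming only the offline mode is being configured (the Pexels key, described earlier for image search, is shown commented out as a placeholder):

```bash
# .env — placed inside the slide-deck-ai directory
RUN_IN_OFFLINE_MODE=True
# PEXEL_API_KEY=<your-pexels-api-key>
```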
+ The UI is similar to the online mode. However, rather than selecting an LLM from a list, one has to type the name of the Ollama model to be used in a text box. No API key is asked for here.
+
+ Finally, the focus is on using offline LLMs, not on going completely offline. Internet connectivity would still be required to fetch images from Pexels.
+

  # Live Demo