In the offline mode of SlideDeck AI, one can use an offline LLM via Ollama to generate the slide deck. The name of the offline LLM needs to be specified in the text input box every time the app launches. This could become a repetitive task.
To address this, introduce a new environment variable, `DEFAULT_OFFLINE_LLM`, which holds the name of the Ollama model to use. When this variable is set, use its value as the default of the `llm_provider_to_use` text input box (under offline mode) in `app.py`.
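A minimal sketch of the idea, assuming the offline model name is collected with `st.text_input` in `app.py` (the widget label and key below are illustrative and may not match the actual code):

```python
import os

import streamlit as st

# Read the new environment variable; fall back to an empty string when unset,
# so the current behavior is preserved if the variable is not defined.
default_offline_llm = os.getenv('DEFAULT_OFFLINE_LLM', '')

# Pre-fill the model name field in offline mode; the label and key here
# are placeholders and should match the existing widget in app.py.
llm_provider_to_use = st.text_input(
    'Enter the name of the Ollama model to use',
    value=default_offline_llm,
    key='llm_provider_to_use',
)
```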
How to test the changes?
- Launch the app in offline mode. The model name field should be empty. Specify a model name and generate a slide deck.
- Set `DEFAULT_OFFLINE_LLM` to a valid, locally available Ollama model name. Launch the app in offline mode. The model name field should contain the default model name. Generate a slide deck using this LLM.
- Launch the app in online mode. No change should appear in this mode. Generate a slide deck using any LLM.
It would be best to have a GPU for testing with offline LLMs. If you do not have one, try using a small model such as Gemma 3 1B.
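The second scenario above can also be checked programmatically. Below is a rough sketch using Streamlit's `AppTest` harness; the widget key `llm_provider_to_use` is an assumption, and any steps needed to switch the app into offline mode must be matched to the actual `app.py`:

```python
import os

from streamlit.testing.v1 import AppTest

# Assumption: the offline-mode model field uses the key 'llm_provider_to_use'.
os.environ['DEFAULT_OFFLINE_LLM'] = 'gemma3:1b'  # any locally pulled Ollama model

at = AppTest.from_file('app.py', default_timeout=30)
at.run()

# Depending on how offline mode is toggled in app.py, extra interaction
# (e.g., selecting a radio option and calling at.run() again) may be needed here.

# The model name field should be pre-filled with the env var's value.
assert at.text_input(key='llm_provider_to_use').value == 'gemma3:1b'
```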