Set default offline LLM name via env #115

@barun-saha

Description

In the offline mode of SlideDeck AI, one can use an offline LLM via Ollama to generate the slide deck. Currently, the name of the offline LLM must be typed into the text input box every time the app launches, which quickly becomes repetitive.

To address this, introduce a new environment variable, DEFAULT_OFFLINE_LLM, which holds the name of the Ollama model to use. When this variable is set, use its value as the default for the llm_provider_to_use text input box (under offline mode) in app.py.
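A minimal sketch of the idea, assuming the app uses Streamlit's `st.text_input` for the model name box (the helper name `get_default_offline_llm` is hypothetical, not an existing function in the repo):

```python
import os

# Hypothetical helper mirroring the proposed behavior: the default value
# of the model-name text input comes from DEFAULT_OFFLINE_LLM when set.
def get_default_offline_llm() -> str:
    """Return the Ollama model name from the env, or '' to leave the box empty."""
    return os.getenv('DEFAULT_OFFLINE_LLM', '').strip()

# In app.py (offline mode), the text input could then be initialized as:
# llm_provider_to_use = st.text_input(
#     'Model name', value=get_default_offline_llm()
# )
```

With the variable unset, the helper returns an empty string, so the behavior of the input box is unchanged.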

How to test the changes?

  1. Launch the app in offline mode with DEFAULT_OFFLINE_LLM unset. The model name field should be empty. Specify a model name and generate a slide deck.
  2. Set DEFAULT_OFFLINE_LLM to a valid, locally available Ollama model name. Launch the app in offline mode. The model name field should contain the default model name. Generate a slide deck using this LLM.
  3. Launch the app in online mode. No change should appear in this mode. Generate a slide deck using any LLM.

It would be best to have a GPU for testing with offline LLMs. If you do not have one, try using a small model such as Gemma 3 1B.
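For test step 2, the variable can be set before launching the app. A sketch of such a session, assuming Ollama is installed, the app entry point is app.py, and Gemma 3 1B is available under the tag gemma3:1b (the exact tag may differ):

```shell
export DEFAULT_OFFLINE_LLM=gemma3:1b     # model name to prefill in the input box
echo "Default offline LLM: $DEFAULT_OFFLINE_LLM"
# streamlit run app.py                   # then launch the app in offline mode
```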
