AskTube logo

Discover high-signal YouTube videos for any learning goal, then chat with a local LLM grounded on the video transcript.

AskTube demo

✨ Features

  • Query refinement
    Turn a short learning goal into a sharp YouTube search query using your local Ollama model.
  • YouTube search
    Fetch relevant videos via the YouTube Data API v3.
  • Transcript fetch
    Prefers manual transcripts, then auto-generated, then translated-to-English.
  • Grounded chat
    Ask questions about the selected video; answers are based on its transcript.
  • Simple, responsive UI
    Built with Streamlit.
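The transcript preference order above (manual → auto-generated → translated-to-English) can be sketched as a small selection routine. This is a hypothetical helper written for illustration, not AskTube's actual code, and it works over plain dicts rather than the real transcript library's objects:

```python
# Sketch of the transcript preference order: manually created English first,
# then auto-generated English, then anything translatable to English.
# The dict shape and function name are assumptions for illustration.

def pick_transcript(available):
    """available: list of dicts like
    {"kind": "manual" | "generated", "lang": "en", "translatable": bool}
    Returns (chosen entry, how it was obtained)."""
    # 1) Manually created English transcript
    for t in available:
        if t["kind"] == "manual" and t["lang"] == "en":
            return t, "manual"
    # 2) Auto-generated English transcript
    for t in available:
        if t["kind"] == "generated" and t["lang"] == "en":
            return t, "generated"
    # 3) Any transcript YouTube can translate to English
    for t in available:
        if t["translatable"]:
            return t, "translated-to-en"
    return None, "unavailable"

tracks = [
    {"kind": "generated", "lang": "en", "translatable": True},
    {"kind": "manual", "lang": "fr", "translatable": True},
]
chosen, how = pick_transcript(tracks)
print(how)  # prints "generated": no manual English track exists
```

In the real app the same ordering would be applied to whatever transcript tracks YouTube reports for the selected video.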

🚀 Quick Start

1. Prerequisites

  • Python 3.11+
  • An Ollama server running locally or reachable over HTTP
  • A YouTube Data API v3 key

2. Clone & install

git clone https://github.com/Anishrkhadka/asktube.git
cd asktube
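Then install the Python dependencies. The exact requirements file is an assumption (check the repository for the actual file name); a typical setup looks like:

```shell
# Create an isolated environment and install dependencies
# (requirements.txt is assumed -- see the repo for the actual file)
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```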

3. Configure environment

Create a .env file in the project root:

# Required: YouTube Data API key
YOUTUBE_API_KEY=YOUR_API_KEY_HERE

# Optional: Ollama host (defaults to http://localhost:11434)
OLLAMA_HOST=http://localhost:11434
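For reference, the optional/required split above behaves like this minimal sketch (standard-library only; how the app actually loads its configuration may differ):

```python
import os

# YOUTUBE_API_KEY has no default: the app cannot search YouTube without it.
youtube_key = os.getenv("YOUTUBE_API_KEY")

# OLLAMA_HOST falls back to the documented default when unset.
ollama_host = os.getenv("OLLAMA_HOST", "http://localhost:11434")

print(ollama_host)
```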

4. Ensure a model is available in Ollama

The app defaults to gemma3:12b and falls back to whichever model tags the Ollama server reports if that model is not available. Pull at least one model:

ollama pull gemma3:12b
# or another compatible model, e.g. llama3.1:8b, mistral:7b
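The "default with fallback to discovered tags" behaviour can be sketched as below. `GET /api/tags` is Ollama's real endpoint for listing pulled models; the helper names and the fallback choice (first discovered tag) are assumptions for illustration:

```python
# Sketch: prefer the default model, else fall back to a discovered tag.
import json
import urllib.request

DEFAULT_MODEL = "gemma3:12b"

def list_models(host="http://localhost:11434"):
    """Ask the Ollama server which models it has pulled."""
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def choose_model(available, preferred=DEFAULT_MODEL):
    """Use the preferred model if pulled, else the first discovered tag."""
    if preferred in available:
        return preferred
    return available[0] if available else None

print(choose_model(["llama3.1:8b", "mistral:7b"]))  # prints "llama3.1:8b"
```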

🐳 Docker

Build and run:

docker compose up --build

🎯 Usage

  1. In the text area, describe what you want to learn.
  2. Choose number of videos, model, sort order, and duration filter.
  3. Click Find videos to fetch results.
  4. Pick a video from the sidebar. Transcript loads automatically (if available).
  5. Use the chat box to ask questions grounded on the transcript.
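Step 5 is the "grounded chat" from the feature list: the transcript is injected into the prompt so answers stay tied to the video. A minimal sketch, assuming Ollama's `/api/chat` endpoint (real) with illustrative prompt wording and truncation limit (not AskTube's actual code):

```python
# Hedged sketch of transcript-grounded chat via Ollama's /api/chat.
import json
import urllib.request

MAX_CHARS = 12_000  # crude context cap; a real app might chunk instead

def build_messages(transcript, question):
    """Ground the model: system message carries the transcript."""
    system = (
        "Answer using only the YouTube transcript below. "
        "If the transcript does not contain the answer, say so.\n\n"
        + transcript[:MAX_CHARS]
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

def ask(transcript, question, model="gemma3:12b",
        host="http://localhost:11434"):
    """Send one non-streaming chat request and return the reply text."""
    body = json.dumps({
        "model": model,
        "messages": build_messages(transcript, question),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        f"{host}/api/chat", data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Keeping the transcript in the system message (rather than the user turn) lets the user's chat history stay short while every answer remains anchored to the same source text.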

🔑 Environment Notes

  • YouTube API key → Create in Google Cloud Console (enable YouTube Data API v3, create API key).
  • Ollama → Install Ollama, run ollama serve, and pull a model with ollama pull <model>.

📄 License

MIT License © 2025 AskTube
