Welcome to the LLM Projects repository — a growing collection of hands-on applications powered by Large Language Models (LLMs) like Google Gemini 2.0 Flash and Meta's LLaMA 3.2. This repo showcases real-world use cases built with Streamlit, focused on automating content creation and enhancing productivity with AI.
Whether you're a student, developer, or AI enthusiast, these projects demonstrate how to integrate cutting-edge LLMs into useful tools like blog generators, SQL query builders, and data analysis assistants.
Generate SEO-friendly blog posts with just a title and keywords. Built using Gemini 2.0 Flash and optionally powered by LLaMA 3.2 for experimentation with open-source models.
Features:
- Input-based article generation
- Multi-model support (Gemini & LLaMA)
- SEO keyword handling
- Markdown export
- Beautiful, responsive UI
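As a rough sketch of how a generator like this can assemble its request (the function name and prompt wording here are illustrative, not the app's actual code):

```python
def build_blog_prompt(title: str, keywords: list[str]) -> str:
    """Combine the user's title and SEO keywords into a single model prompt."""
    keyword_line = ", ".join(keywords)
    return (
        f"Write an SEO-friendly blog post titled '{title}'.\n"
        f"Naturally incorporate these keywords: {keyword_line}.\n"
        "Return the article as Markdown with headings and a short introduction."
    )

# The prompt is then sent to the selected backend, e.g. (requires an API key):
# import google.generativeai as genai
# genai.configure(api_key=GEMINI_API_KEY)
# model = genai.GenerativeModel("gemini-2.0-flash")
# article_md = model.generate_content(build_blog_prompt(title, keywords)).text
```

Keeping prompt construction separate from the model call is what makes the multi-model (Gemini/LLaMA) switch straightforward.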
Turn plain English into fully-formed SQL queries! This tool supports schema context, dialect customization, and offers AI-generated explanations and sample outputs.
Features:
- Converts natural language to SQL queries
- Supports Gemini 2.0 and LLaMA 3.2
- Accepts optional DB schema and dialect
- Returns SQL, explanation, and sample output
- Clean Streamlit interface
Upload a CSV file and interact with your dataset through natural language questions using either Google Gemini or LLaMA 3.2 via Ollama. No embeddings or external databases required.
Features:
- Upload and preview CSV files
- Ask questions about your data
- Gemini or local LLaMA backend support
- No vector store needed
- Session-based Q&A memory
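Because no embeddings or vector store are involved, the app can simply inline a small preview of the dataset into the prompt. A sketch of that idea (names are illustrative; in the app the columns and rows would come from the pandas DataFrame):

```python
def build_csv_prompt(columns: list[str], sample_rows: list[list],
                     question: str) -> str:
    """Inline a plain-text preview of the CSV into the prompt --
    no embeddings or external database required."""
    header = ", ".join(columns)
    preview = "\n".join(", ".join(map(str, row)) for row in sample_rows)
    return (
        f"You are a data analysis assistant. The CSV has columns: {header}\n"
        f"First rows:\n{preview}\n\n"
        f"Question: {question}"
    )
```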
Extract and summarize website content using Google Gemini API or LLaMA 3.2 (via Ollama). Scrapes raw text, removes unnecessary elements, and outputs clean markdown summaries.
Features:
- Extracts title and main text from a URL
- Removes scripts, styles, and irrelevant tags
- Summarizes using Gemini or LLaMA 3.2
- Works with Jupyter or standalone scripts
- No need for embeddings or RAG setup
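The extract-and-clean step can be approximated with only the Python standard library; this is a simplified stand-in for the project's actual parsing code, showing how scripts and styles get dropped while the title and body text are kept:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the page title and visible text, skipping script/style blocks."""
    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.title = ""
        self.chunks = []
        self._in_title = False
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1
        elif tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._skip_depth == 0 and data.strip():
            self.chunks.append(data.strip())

def extract_text(html: str) -> tuple[str, str]:
    parser = TextExtractor()
    parser.feed(html)
    parser.close()
    return parser.title, " ".join(parser.chunks)
```

The cleaned text is then handed to Gemini or LLaMA 3.2 with a summarization prompt.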
Summarize content from uploaded PDF and Word documents in seconds using Google Gemini 1.5 Flash and LangChain. The app creates a vector-based knowledge base and queries the content for an intelligent summary.
Features:
- Upload `.pdf` or `.docx` files
- Automatically extracts and cleans text
- Uses FAISS and LangChain for context-aware summarization
- Gemini 1.5 Flash as the LLM backend
- Clean and responsive Streamlit interface
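Before the extracted text can be indexed with FAISS, it has to be split into overlapping chunks. A minimal sketch of that step (the chunk sizes here are illustrative defaults, not the app's configured values):

```python
def chunk_text(text: str, chunk_size: int = 1000, overlap: int = 100) -> list[str]:
    """Split text into fixed-size chunks that overlap, so context spanning
    a chunk boundary is not lost when chunks are embedded separately."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

Each chunk is then embedded and stored in the FAISS index, and the summarization query retrieves the most relevant chunks for Gemini 1.5 Flash.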
```bash
git clone https://github.com/MoustafaMohamed01/llm-projects.git
cd llm-projects
```
For example, to use the AI SQL Query Generator:

```bash
cd ai-sql-query-generator
pip install -r requirements.txt
```
Create a file named `api_key.py` inside the subproject folder:

```python
GEMINI_API_KEY = "your_google_gemini_api_key"
```
Each app has its own `requirements.txt`, but common dependencies include:

```
streamlit
google-generativeai
pandas
requests
```

(`json` is part of the Python standard library and does not need to be installed.) Install them globally or per-project as needed.
Created by Moustafa Mohamed - feel free to reach out!
- GitHub: MoustafaMohamed01
- LinkedIn: Moustafa Mohamed
- Kaggle: moustafamohamed01