
🧠 MindHive Technical Assessment – McDonald's Locator

📋 Overview

This project is a full-stack application built for MindHive's technical assessment. It scrapes McDonald's Malaysia outlets located in Kuala Lumpur, stores the data in a database, enriches it with geographical coordinates, exposes a REST API, visualizes the outlets on an interactive map, and adds basic chatbot functionality.

📦 Tech Stack

| Layer | Technology Used | Reason |
| --- | --- | --- |
| Web Scraping | Playwright | Robust against dynamic, JS-heavy websites |
| Backend API | FastAPI, SQLAlchemy, PostgreSQL | Fast, async-ready backend with ORM support |
| Frontend | React, TanStack Router, Leaflet.js, Vite | Lightweight and modern stack with interactive map support |
| AI Chatbot | OpenRouter (OpenAI-compatible), Custom NLP | For processing natural language outlet-related queries |

📌 Features

✅ Part 1 – Web Scraping

  • Scrapes outlets only in Kuala Lumpur
  • Extracts:
    • Outlet name
    • Address
    • Facilities
    • Waze link
  • Handles pagination

✅ Part 2 – Geocoding

  • Retrieves latitude and longitude for each outlet from the JSON data embedded in the McDonald's Malaysia website

✅ Part 3 – API Development

  • Built with FastAPI
  • Serves outlet data via REST endpoints
  • Supports filtering via query params
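
A minimal sketch of how these endpoints might be declared in FastAPI; the route paths match the API table further down, but the Outlet model, the get_db dependency, and the facility query parameter are illustrative assumptions rather than the project's actual names.

```python
# Sketch only: import paths, the Outlet model, and get_db are assumed names.
from typing import Optional

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session

from app.models import Outlet      # assumed import path
from app.database import get_db    # assumed session dependency

router = APIRouter(prefix="/api/outlets")


@router.get("/")
def list_outlets(facility: Optional[str] = None, db: Session = Depends(get_db)):
    """List outlets, optionally filtered by a facility tag (e.g. ?facility=24 Hours)."""
    query = db.query(Outlet)
    if facility:
        # Postgres ARRAY containment: keep outlets whose facilities include the tag.
        query = query.filter(Outlet.facilities.contains([facility]))
    return query.all()


@router.get("/{outlet_id}")
def get_outlet(outlet_id: str, db: Session = Depends(get_db)):
    """Retrieve a single outlet by its ID."""
    outlet = db.get(Outlet, outlet_id)
    if outlet is None:
        raise HTTPException(status_code=404, detail="Outlet not found")
    return outlet
```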

✅ Part 4 – Frontend Map

  • Fetches outlet data from the backend
  • Displays each outlet as a marker on a Leaflet.js map

✅ Part 5 – Chatbot

  • Basic query support for:
    • "Which outlets in KL operate 24 hours?"
    • "Which outlets allow birthday parties?"
  • Uses OpenRouter (or OpenAI-compatible API) + metadata filtering

📁 Project Structure

.
├── client/                     # React + Leaflet frontend
├── server/                     # FastAPI backend
│   ├── app/
│   │   ├── scraping/           # Playwright scraper
│   │   ├── models/             # SQLAlchemy models
│   │   ├── services/           # Geocoding and business logic
│   │   └── api/                # FastAPI routes
│   └── main.py                 # FastAPI entry point
├── data/                       # Optional data exports
├── README.md                   # Project documentation
└── requirements.txt

⚙️ Setup Instructions

🔧 Backend Setup

  1. Clone and create a virtual environment

    git clone https://github.com/yusufnuru/mcd-locator.git
    cd mcd-locator/server
    python -m venv .venv && source .venv/bin/activate
    pip install -r requirements.txt
  2. Configure environment by creating .env:

    DATABASE_URL=postgresql://username:password@localhost:5432/mcd
    OPENAI_API_KEY=your_openrouter_or_openai_key
  3. Run migrations

    alembic upgrade head
  4. Scrape and save the outlets

    python scripts/scrape_and_save.py
  5. Start backend

    uvicorn app.main:app --reload

🌍 Frontend Setup

  1. Configure environment by creating client/.env:

    VITE_API_BASE_URL=your_vite_api_base_url
  2. Install dependencies and start frontend

    cd client
    pnpm install
    pnpm run dev

🚀 API Endpoints

| Endpoint | Description |
| --- | --- |
| GET /api/outlets | List all outlets |
| GET /api/outlets/{id} | Retrieve outlet by ID |
| POST /api/chat/ | Ask natural language questions |
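
For illustration, the endpoints could be exercised as follows; the base URL and the chat request's message field are assumptions about the payload shape, not documented behaviour.

```python
# Hypothetical client usage; base URL and the "message" payload key are assumptions.
import requests

BASE = "http://localhost:8000"

outlets = requests.get(f"{BASE}/api/outlets").json()
first = requests.get(f"{BASE}/api/outlets/{outlets[0]['id']}").json()

reply = requests.post(
    f"{BASE}/api/chat/",
    json={"message": "Which outlets in KL operate 24 hours?"},
).json()
print(reply)
```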

🧠 Technical Decisions

✅ Web Scraping

Used Playwright to simulate user filtering by "Kuala Lumpur" and handle dynamic pagination.
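
A condensed sketch of that flow using Playwright's sync API. The URL, selectors, and filter interaction are assumptions for illustration (the real scraper lives in server/app/scraping/), and pagination handling is omitted for brevity.

```python
# Illustrative only: URL, selectors, and page interactions are assumed.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://www.mcdonalds.com.my/locate-us")  # assumed locator page

    # Filter the outlet list to Kuala Lumpur the way a user would.
    page.select_option("select#states", label="Kuala Lumpur")  # assumed selector
    page.click("button#search-now")                            # assumed selector
    page.wait_for_load_state("networkidle")

    outlets = []
    for card in page.query_selector_all(".outlet-card"):       # assumed selector
        outlets.append({
            "name": card.query_selector(".name").inner_text(),
            "address": card.query_selector(".address").inner_text(),
        })

    browser.close()

print(f"Scraped {len(outlets)} Kuala Lumpur outlets")
```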

✅ Geocoding

Latitude and longitude are scraped directly from the JSON data on the McDonald's Malaysia website, so no external geocoding service is required.
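
A sketch of the enrichment step, assuming each outlet's JSON record has already been captured during scraping; the "lat"/"lng" field names are assumptions about the site's embedded data, not documented keys.

```python
# Assumed field names ("lat", "lng"); the real keys depend on the site's JSON.
import json


def extract_coordinates(raw_record: str) -> tuple[float | None, float | None]:
    """Pull latitude/longitude out of one outlet's JSON blob, if present."""
    data = json.loads(raw_record)
    lat, lng = data.get("lat"), data.get("lng")
    return (float(lat), float(lng)) if lat and lng else (None, None)


# Example with a made-up record shaped like the assumed data:
lat, lng = extract_coordinates(
    '{"name": "McDonald\'s Bukit Bintang", "lat": "3.1466", "lng": "101.7111"}'
)
print(lat, lng)
```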

✅ Database Schema

Outlets (
  id UUID PRIMARY KEY,
  name TEXT,
  address TEXT,
  waze_link TEXT,
  telephone TEXT,
  google_maps_url TEXT,
  facilities ARRAY,
  latitude FLOAT,
  longitude FLOAT
)

This schema supports chatbot queries and future features like filtering.
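
One way this table could be expressed as a SQLAlchemy 2.0 model; the class and module layout are illustrative, but the columns mirror the schema above.

```python
# Illustrative model declaration matching the schema above.
import uuid

from sqlalchemy import Float, Text
from sqlalchemy.dialects.postgresql import ARRAY, UUID
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Outlet(Base):
    __tablename__ = "outlets"

    id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True), primary_key=True, default=uuid.uuid4
    )
    name: Mapped[str] = mapped_column(Text)
    address: Mapped[str | None] = mapped_column(Text)
    waze_link: Mapped[str | None] = mapped_column(Text)
    telephone: Mapped[str | None] = mapped_column(Text)
    google_maps_url: Mapped[str | None] = mapped_column(Text)
    facilities: Mapped[list[str] | None] = mapped_column(ARRAY(Text))
    latitude: Mapped[float | None] = mapped_column(Float)
    longitude: Mapped[float | None] = mapped_column(Float)
```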

✅ Chatbot

Basic parsing of keywords from queries (e.g. "24 hours", "birthday party") combined with LLM support via OpenRouter.
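
A minimal sketch of that flow: keyword matching narrows the outlet list, then an OpenRouter call through the OpenAI-compatible client phrases the answer. The keyword map, model name, and prompt are illustrative assumptions, not the project's actual values.

```python
# Sketch only: keyword map, model name, and prompt are assumed.
import os

from openai import OpenAI

KEYWORD_TO_FACILITY = {          # assumed mapping from query phrases to facility tags
    "24 hours": "24 Hours",
    "birthday": "Birthday Party",
}


def match_facility(question: str) -> str | None:
    """Return the facility tag implied by the question, if any."""
    q = question.lower()
    for keyword, facility in KEYWORD_TO_FACILITY.items():
        if keyword in q:
            return facility
    return None


def answer(question: str, outlets: list[dict]) -> str:
    facility = match_facility(question)
    matches = (
        [o for o in outlets if facility in (o.get("facilities") or [])]
        if facility else outlets
    )

    # OpenRouter exposes an OpenAI-compatible API, so only the base_url changes.
    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENAI_API_KEY"],
    )
    completion = client.chat.completions.create(
        model="openai/gpt-4o-mini",  # assumed model choice
        messages=[
            {"role": "system", "content": "Answer using only the outlet list provided."},
            {"role": "user", "content": f"Outlets: {matches}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content
```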

❗ Known Limitations

  • No 5KM catchment area or radius intersection logic
  • No unit/integration tests added
  • Chatbot currently relies on metadata + simple LLM logic — could be improved with RAG or semantic search

📬 Submission

GitHub Repo: https://github.com/yusufnuru/mcd-locator

Submit to:

📌 Final Notes

The solution reflects real-world decision-making under uncertainty, balancing technical constraints with practical outcomes. It is modular and extensible, and shows how to integrate web scraping, geospatial data, a backend API, frontend maps, and AI interaction.