
Event Quote Parser

A Next.js tool for event planners that parses hotel quote emails (HTML/plain text) and file uploads (PDF) to extract key financial data:

  • Total Quote
  • Guestroom Total
  • Meeting Room Total
  • Food & Beverage Total
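
The extraction step can be sketched as a label-plus-amount pass over the email text. This is a simplified illustration only; the repo's real parsers live in apps/web/lib and also handle HTML and PDF, and the label patterns and function names below are assumptions:

```typescript
// Hypothetical sketch: pull labeled dollar amounts out of plain-text quote
// content. Labels, thresholds, and names are assumptions for illustration.

type QuoteTotals = {
  totalQuote?: number;
  guestroomTotal?: number;
  meetingRoomTotal?: number;
  foodBeverageTotal?: number;
};

// Map each output field to a label pattern that might precede an amount.
const LABELS: Record<keyof QuoteTotals, RegExp> = {
  totalQuote: /total\s+quote/i,
  guestroomTotal: /guest\s*room(s)?\s+total/i,
  meetingRoomTotal: /meeting\s+room(s)?\s+total/i,
  foodBeverageTotal: /food\s*(&|and)\s*beverage\s+total/i,
};

// Parse "$12,345.00" into 12345 by stripping currency symbols and separators.
function parseAmount(raw: string): number {
  return Number(raw.replace(/[$,\s]/g, ""));
}

function extractTotals(text: string): QuoteTotals {
  const out: QuoteTotals = {};
  for (const line of text.split(/\r?\n/)) {
    // Look for a trailing dollar amount on the line.
    const amountMatch = line.match(/\$?\s*([\d,]+(?:\.\d{2})?)\s*$/);
    if (!amountMatch) continue;
    for (const [field, label] of Object.entries(LABELS)) {
      if (label.test(line)) {
        out[field as keyof QuoteTotals] = parseAmount(amountMatch[1]);
      }
    }
  }
  return out;
}
```

A line like "Guestroom Total: $12,345.00" would populate guestroomTotal; lines without a recognized label are ignored.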

Frontend: React + Tailwind (Next.js App Router). Backend: Next.js API routes. Database: Supabase (PostgreSQL).


Prerequisites

  • Node.js 18+ and npm 9+
  • A Supabase account and project
  • macOS/Linux/WSL recommended (Windows works too)

Check versions:

node -v
npm -v

1) Create a Supabase project

  1. Go to the Supabase dashboard and create a new project.
  2. In Project Settings → Database, note your database region and password.
  3. In Project Settings → API, copy:
    • Project URL (used as NEXT_PUBLIC_SUPABASE_URL below)
    • service_role key (a server-only secret; keep it private)

Run the database migration (SQL)

Paste the SQL below into the Supabase SQL Editor and run it. This creates two tables in the default public schema, which matches the code in this repo.

-- Create extension for UUID generation (usually available by default)
create extension if not exists pgcrypto;

-- Raw inputs (pastes/uploads)
create table if not exists public.ingest_sources (
  id uuid primary key default gen_random_uuid(),
  created_at timestamptz not null default now(),
  source_kind text not null check (source_kind in ('paste', 'upload')),
  original_filename text,
  content_type text,
  content_text text,
  content_html text
);

-- Parsed results linked to an ingest row
create table if not exists public.parsed_results (
  id uuid primary key default gen_random_uuid(),
  created_at timestamptz not null default now(),
  ingest_id uuid not null references public.ingest_sources(id) on delete cascade,
  total_quote numeric,
  guestroom_total numeric,
  meeting_room_total numeric,
  food_beverage_total numeric,
  currency text
);

-- Helpful index for joins/analytics
create index if not exists idx_parsed_results_ingest_id on public.parsed_results(ingest_id);
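
In application code, rows from these two tables map to shapes like the following (a TypeScript sketch; the field names mirror the SQL above, while the type names themselves are assumptions):

```typescript
// Row shapes mirroring the migration above (type names are assumptions).
type IngestSource = {
  id: string;                        // uuid
  created_at: string;                // timestamptz, an ISO string over the wire
  source_kind: "paste" | "upload";   // enforced by the CHECK constraint
  original_filename: string | null;
  content_type: string | null;
  content_text: string | null;
  content_html: string | null;
};

type ParsedResult = {
  id: string;
  created_at: string;
  ingest_id: string;                 // FK -> ingest_sources.id (cascade delete)
  total_quote: number | null;
  guestroom_total: number | null;
  meeting_room_total: number | null;
  food_beverage_total: number | null;
  currency: string | null;
};
```

Each parsed_results row carries its ingest_id, so deleting an ingest row also removes its parsed output.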

2) Configure environment variables

Create an environment file for the web app at apps/web/.env.local and set:

NEXT_PUBLIC_SUPABASE_URL=your_supabase_project_url
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key  # server-only secret

Notes:

  • NEXT_PUBLIC_SUPABASE_URL is safe to expose to the browser.
  • SUPABASE_SERVICE_ROLE_KEY is a sensitive server secret. It is only used in API routes and is not exposed to the client.
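
A server-side route would typically fail fast if either value is missing. A minimal sketch (requireEnv is a hypothetical helper, not part of the repo):

```typescript
// Sketch: validate server config before constructing a Supabase client.
// requireEnv is a hypothetical helper name, not from this repo.
function requireEnv(
  name: string,
  env: Record<string, string | undefined>
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// In an API route (server-side only), both values would be read like:
//   const url = requireEnv("NEXT_PUBLIC_SUPABASE_URL", process.env);
//   const serviceKey = requireEnv("SUPABASE_SERVICE_ROLE_KEY", process.env);
```

Throwing at startup surfaces a misconfigured .env.local immediately instead of as a failed insert later.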

3) Install dependencies

From the repository root:

npm install

Optional but recommended (for headless rendering of proposal pages when needed):

# Install Playwright browsers used by the scraper when SPA pages require JS rendering
npx playwright install chromium

4) Run in development

From the repository root:

npm run dev

Then open http://localhost:3000.

You can paste email HTML or plain text into the left panel, upload PDF files on the right, and click Parse.


5) Build and run in production mode

npm run build
npm run start

The server will listen on http://localhost:3000.


Notes on scraping (optional)

  • If the parser cannot find totals directly in the email content, it may try to fetch proposal URLs found in the content.
  • Regular HTTP fetch is used first; if the page appears to be a JavaScript SPA, a headless Chromium render is attempted (requires the Playwright step above).
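
The SPA check can be sketched as a heuristic over the fetched HTML: a JavaScript shell typically ships scripts plus a near-empty mount node. The function name, root ids, and threshold below are assumptions for illustration, not the repo's actual logic:

```typescript
// Hypothetical heuristic: decide whether a fetched page needs a headless
// render. Root ids and the text-length threshold are assumptions.
function looksLikeSpa(html: string): boolean {
  // Strip script/style bodies, then tags, to estimate server-rendered text.
  const visibleText = html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, "")
    .trim();
  // Common SPA mount points: <div id="root">, <div id="app">, <div id="__next">.
  const hasSpaRoot = /<div[^>]+id=["'](root|app|__next)["']/i.test(html);
  // A mount node with almost no visible text suggests the content is
  // rendered client-side, so fall back to headless Chromium.
  return hasSpaRoot && visibleText.length < 200;
}
```

Pages that fail this check can be parsed from the plain fetch; only the SPA-like ones pay the cost of launching Chromium.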

Troubleshooting:

  • If you see errors about launching Chromium, run npx playwright install chromium.
  • If Supabase inserts fail, re-check that the tables exist in the public schema and that env vars are set correctly.

Monorepo commands

From the repo root (these proxy to apps/web):

npm run dev     # start Next.js dev server
npm run build   # build for production
npm run start   # start production server
npm run lint    # run lints

Or directly inside apps/web:

cd apps/web
npm run dev

Project structure

  • apps/web/app — Next.js App Router UI and API routes
  • apps/web/lib — core parsers (text, HTML, PDF), scraper, Supabase client
  • supabase/migrations — SQL migration(s)
  • supabase/ — seeds/sql/types (if used)
