Course materials and assignments for "Introduction to NLP" (course code: CS7.401.S22) at IIIT Hyderabad. Note: if you clone this repository or use it as a reference, please consider starring it.
This repository implements N-gram language modeling with Kneser-Ney and Witten-Bell smoothing, built on top of an in-house tokenizer. It also includes an LSTM-based neural language model and computes perplexity scores to compare the statistical and neural models (a smoothing sketch follows below).
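As a rough illustration of the smoothing side, here is a minimal sketch of interpolated Witten-Bell estimation. It is not the repository's actual code; the function names, count structures, and the uniform back-off at the unigram level are assumptions made for the example.

```python
from collections import Counter, defaultdict

def train_counts(tokens, n):
    """Collect n-gram statistics from a token list (illustrative helper)."""
    ngram_counts = Counter()      # counts of (history, word) pairs
    context_counts = Counter()    # counts of each history
    followers = defaultdict(set)  # distinct words observed after each history
    for order in range(1, n + 1):
        for i in range(len(tokens) - order + 1):
            history = tuple(tokens[i:i + order - 1])
            word = tokens[i + order - 1]
            ngram_counts[(history, word)] += 1
            context_counts[history] += 1
            followers[history].add(word)
    return ngram_counts, context_counts, followers

def witten_bell(word, history, ngram_counts, context_counts, followers, vocab_size):
    """Interpolated Witten-Bell estimate of P(word | history).

    P_WB(w | h) = lambda(h) * P_ML(w | h) + (1 - lambda(h)) * P_WB(w | h'),
    with lambda(h) = c(h) / (c(h) + T(h)), where T(h) is the number of
    distinct word types seen after h. Backs off to a uniform 1/V floor.
    """
    if not history:
        c = ngram_counts[((), word)]
        total = context_counts[()]
        n_types = len(followers[()])
        lam = total / (total + n_types) if total else 0.0
        mle = c / total if total else 0.0
        return lam * mle + (1 - lam) / vocab_size
    c_h = context_counts[history]
    if c_h == 0:
        # Unseen history: fall back to the lower-order estimate directly.
        return witten_bell(word, history[1:], ngram_counts, context_counts,
                           followers, vocab_size)
    c_hw = ngram_counts[(history, word)]
    n_types = len(followers[history])
    lam = c_h / (c_h + n_types)
    lower = witten_bell(word, history[1:], ngram_counts, context_counts,
                        followers, vocab_size)
    return lam * (c_hw / c_h) + (1 - lam) * lower
```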
Both model families were evaluated using sentence-level and corpus-level perplexity scores on multiple text corpora to analyze the performance differences between the statistical and neural approaches.
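For reference, corpus-level perplexity is the exponential of the average negative log-probability per token. The sketch below shows one way to compute it given any conditional probability function (smoothed n-gram or neural); it is illustrative only, and the trigram-sized history window and `prob_fn` interface are assumptions, not the repository's interface.

```python
import math

def corpus_perplexity(sentences, prob_fn):
    """Compute exp(-(1/N) * sum of log P(w_i | history)) over all tokens.

    `sentences` is a list of token lists; `prob_fn(word, history)` returns
    P(word | history) under the model being evaluated (assumed interface).
    """
    log_prob_sum = 0.0
    token_count = 0
    for sentence in sentences:
        for i, word in enumerate(sentence):
            history = tuple(sentence[max(0, i - 2):i])  # trigram-sized context, as an example
            p = prob_fn(word, history)
            log_prob_sum += math.log(max(p, 1e-12))     # floor to avoid log(0)
            token_count += 1
    return math.exp(-log_prob_sum / token_count)
```

Sentence-level perplexity follows the same formula restricted to a single sentence's tokens, which is how per-sentence scores can be reported alongside the corpus-level number.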