A from-scratch implementation of the K-Nearest Neighbors (KNN) algorithm for classification, demonstrated on the classic Iris dataset.
This project includes:
- Complete KNN implementation with Euclidean, Manhattan, and cosine distance metrics
- Data visualization using Seaborn and Matplotlib
- Model evaluation with accuracy scoring
- Clean, production-ready code with input validation
- 🧮 Three distance metrics supported (see the sketch after this list):
  - Euclidean distance
  - Manhattan distance
  - Cosine similarity (properly normalized)
- 📊 Interactive visualizations of dataset features
- 🔍 Comprehensive data exploration
- ⚙️ Configurable hyperparameters (see the usage example below):
  - k-value (number of neighbors)
  - Task type (classification/regression)
  - Distance metric
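For reference, the three supported metrics correspond to the following computations. This is a minimal NumPy sketch; the function names here are illustrative and may not match the repository's code exactly:

```python
import numpy as np

def euclidean_distance(a, b):
    # Straight-line (L2) distance between two feature vectors.
    return np.sqrt(np.sum((np.asarray(a, float) - np.asarray(b, float)) ** 2))

def manhattan_distance(a, b):
    # City-block (L1) distance: sum of absolute coordinate differences.
    return np.sum(np.abs(np.asarray(a, float) - np.asarray(b, float)))

def cosine_distance(a, b):
    # 1 - cosine similarity; both vectors are normalized by their L2 norms,
    # so only the angle between them matters, not their magnitudes.
    a, b = np.asarray(a, float), np.asarray(b, float)
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
```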
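The hyperparameters can be combined along the lines below. This is a self-contained sketch of how such a configurable KNN might look and be used on Iris; the class name `KNN` and the `k`, `task`, and `metric` parameter names are assumptions for illustration, so check the source for the actual API:

```python
import numpy as np
from collections import Counter
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Vectorized distances from one query point x to every training row in X.
METRICS = {
    "euclidean": lambda X, x: np.sqrt(np.sum((X - x) ** 2, axis=1)),
    "manhattan": lambda X, x: np.sum(np.abs(X - x), axis=1),
    "cosine": lambda X, x: 1.0 - (X @ x) / (np.linalg.norm(X, axis=1) * np.linalg.norm(x)),
}

class KNN:
    """Illustrative KNN with configurable k, task type, and distance metric."""

    def __init__(self, k=5, task="classification", metric="euclidean"):
        self.k, self.task, self.metric = k, task, metric

    def fit(self, X, y):
        # KNN is a lazy learner: "training" just stores the data.
        self.X, self.y = np.asarray(X, dtype=float), np.asarray(y)
        return self

    def predict(self, X):
        dist = METRICS[self.metric]
        preds = []
        for x in np.asarray(X, dtype=float):
            nearest = self.y[np.argsort(dist(self.X, x))[: self.k]]
            if self.task == "classification":
                preds.append(Counter(nearest).most_common(1)[0][0])  # majority vote
            else:
                preds.append(float(np.mean(nearest)))  # average for regression
        return np.array(preds)

# Example: classify Iris flowers with k = 5 and Euclidean distance.
X_train, X_test, y_train, y_test = train_test_split(
    *load_iris(return_X_y=True), test_size=0.2, random_state=42
)
model = KNN(k=5, task="classification", metric="euclidean").fit(X_train, y_train)
accuracy = np.mean(model.predict(X_test) == y_test)
print(f"Test accuracy: {accuracy:.3f}")
```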