"Are you a human, or have the aliens taken over?" 👽 Inspired by the Men in Black series, AIH is a fun and quirky project designed to determine if you're truly human or if you're just a cleverly disguised extraterrestrial. With a mix of modern AI techniques, this tool can assess not only your human status but also your age, gender, whether you're alive, or if you might just be a brilliant spoof. Don't worry, I've got you covered!
- Transfer Learning (MobileNetV3): Using the MobileNetV3 model for efficient image classification.
- FastAPI: Backend for handling requests and serving the model efficiently.
- DVC (Data Version Control): For managing model data and tracking changes.
- MLflow: To manage the entire machine learning lifecycle including experimentation, model tracking, and deployment.
- YOLO (You Only Look Once): For object detection, especially detecting faces to evaluate human-like features.
- Dagshub: Cloud-based model and data management system.
- Jinja: For dynamically generating HTML templates to serve results.
- Uvicorn: ASGI server for running the FastAPI app.
- Model Training: Fine-tune a MobileNetV3 model using a dataset of human images. The model is trained to classify various features like age, gender, and whether the input appears human.
- FastAPI Backend: FastAPI provides a lightweight API that takes an image as input and returns predictions (human status, age, gender, etc.). It's fast and super efficient.
- Model Management in the Cloud:
  - The model is managed entirely in the cloud via Dagshub.
  - Every time the FastAPI server restarts, the model is fetched from the cloud and loaded for prediction.
  - DVC handles version control, managing and tracking data and model changes.
  - MLflow tracks the machine learning lifecycle, from experimentation to deployment.
- Prediction Process: When you upload an image, YOLO detects and locates the face, then the MobileNetV3 model classifies it (human status, age, gender, spoof). The result is served via a friendly UI built with Jinja templates.
- Human Detection: Models that will tell you if you're human or just pretending to be one!
- Age & Gender Prediction: Get an estimate of your age and gender based on the image you provide.
- Spoof Detection: Techniques to detect spoofed faces and avatars.
- Interactive UI: Dynamic, fun, and interactive web interface built with Jinja templates.
git clone https://github.com/RijoSLal/AIH.git
cd AIH
pip install -r requirements.txt
uvicorn service:app --reload
Open your browser and go to http://127.0.0.1:8000/ to interact with the app!
To train the model, follow the steps below:
- Train the Model: Run model.ipynb to train and fine-tune the MobileNetV3 model on your dataset:
jupyter notebook model.ipynb
The model is trained using transfer learning, leveraging MobileNetV3's pre-trained weights.
- Register the Model: Once training is complete, run model_registration.ipynb to register the trained model:
jupyter notebook model_registration.ipynb
This notebook uploads the model to Dagshub for cloud management and registers it for future use, ensuring it's available whenever the server restarts.
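In outline, the two notebooks do something like the following. This is a sketch under assumptions: the backbone variant, class count, hyperparameters, tracking URI, and registered model name are placeholders, not the project's actual values.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)

def build_model(num_classes: int) -> tf.keras.Model:
    """MobileNetV3-Small backbone, frozen, with a fresh classification head."""
    base = tf.keras.applications.MobileNetV3Small(
        input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet"
    )
    base.trainable = False  # transfer learning: keep pre-trained weights fixed
    inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
    x = base(inputs, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    import mlflow

    model = build_model(num_classes=2)  # e.g. human vs. spoof (assumed labels)
    # model.fit(train_ds, validation_data=val_ds, epochs=10)

    # Register the trained model on Dagshub's MLflow server (placeholder URI).
    mlflow.set_tracking_uri("https://dagshub.com/your-username/your-repo-name.mlflow")
    with mlflow.start_run():
        mlflow.tensorflow.log_model(model, artifact_path="model",
                                    registered_model_name="AIH")
```

Freezing the backbone and training only the new head is the standard first pass; the notebook may additionally unfreeze top layers for fine-tuning.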
The model and data are managed in the cloud via Dagshub, ensuring that:
- Version Control: All models and datasets are versioned using DVC.
- Tracking: The entire machine learning workflow, including experimentation, is tracked using MLflow.
- Cloud-based Deployment: Whenever the FastAPI server restarts, the model is automatically fetched from Dagshub and deployed without any additional setup.
💡 Change Your Dagshub Repo Name: If you fork or rename this project, don't forget to update the Dagshub remote URL in .dvc/config and the MLflow tracking URI:
Example:
dvc remote modify origin url https://dagshub.com/your-username/your-repo-name.git
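If you configure MLflow through environment variables, the corresponding change looks like this (username and token are placeholders):

```shell
export MLFLOW_TRACKING_URI=https://dagshub.com/your-username/your-repo-name.mlflow
export MLFLOW_TRACKING_USERNAME=your-username
export MLFLOW_TRACKING_PASSWORD=your-dagshub-token
```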
- Run the FastAPI Server: Once the model is registered, start the server with Uvicorn to serve the application:
uvicorn main:app --reload
- Access the API: Open your browser and go to http://127.0.0.1:8000/ to interact with the fun human-detection tool.
Since this is a solo project, contributions are welcome but not expected. However, if you think of any fun ideas, bug fixes, or improvements, feel free to fork the repo and submit a pull request!
This project is licensed under the MIT License - see the LICENSE file for details.
No alien or spoofed face will get past my detectors! 😉