A Python-based tool for detecting adult content in images using machine learning APIs and computer vision techniques. This tool is designed for content moderation, parental controls, and maintaining safe digital environments.
This tool is intended for legitimate purposes such as:
- Content moderation for websites and platforms
- Parental control systems
- Automated content filtering for educational environments
- Personal digital hygiene and safety
Please use responsibly and in accordance with local laws and regulations.
- Multi-API Detection: Integrates with multiple content moderation APIs for high accuracy
- Fallback Detection: Basic computer vision-based detection when APIs are unavailable
- Safe Deletion: Moves flagged content to trash instead of permanent deletion
- Image Validation: Supports multiple image formats with validation
- User Confirmation: Requires user approval before any file operations
- Cross-Platform: Works on Windows, macOS, and Linux
- Python 3.7+
- OpenCV
- PIL/Pillow
- NumPy
- Requests
- Clone the repository: `git clone https://github.com/developeranveshraman/Adult-Image-Detector.git`, then `cd Adult-Image-Detector`
- Install required packages:
  - For regular use: `pip install -r requirements.txt`
  - For development: `pip install -r requirements-dev.txt`
  - Or install manually: `pip install opencv-python pillow requests numpy`
- Optional (for Windows trash functionality): `pip install winshell`
For optimal accuracy, configure API keys from supported services:
- ModerateContent API
  - Sign up at moderatecontent.com
  - Get your API key
  - Replace `YOUR_MODERATECONTENT_API_KEY` in the code
- Sightengine API
  - Sign up at sightengine.com
  - Get your API user and secret
  - Replace `YOUR_SIGHTENGINE_USER` and `YOUR_SIGHTENGINE_SECRET` in the code
You can set API keys as environment variables:
export MODERATECONTENT_API_KEY="your_api_key_here"
export SIGHTENGINE_API_USER="your_user_here"
export SIGHTENGINE_API_SECRET="your_secret_here"
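A minimal sketch of how those variables could be picked up at runtime, assuming you wire the values in yourself; whether the shipped script reads them automatically depends on how it is configured:

```python
import os

# Read the keys exported above; os.environ.get returns None for unset variables.
moderatecontent_key = os.environ.get("MODERATECONTENT_API_KEY")
sightengine_user = os.environ.get("SIGHTENGINE_API_USER")
sightengine_secret = os.environ.get("SIGHTENGINE_API_SECRET")

if not moderatecontent_key and not (sightengine_user and sightengine_secret):
    print("No API credentials found; only the basic skin-detection fallback will be available.")
```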
python index.py
The program will prompt you to enter image paths for analysis.
from adult_content_detector import AdultContentDetector
detector = AdultContentDetector()
is_adult_content = detector.process_image("path/to/image.jpg")
if is_adult_content:
    print("Adult content detected!")
else:
    print("Image is safe.")
import os
from adult_content_detector import AdultContentDetector
detector = AdultContentDetector()
image_folder = "path/to/images"
for filename in os.listdir(image_folder):
if filename.lower().endswith(('.jpg', '.jpeg', '.png', '.bmp')):
image_path = os.path.join(image_folder, filename)
detector.process_image(image_path)
- JPEG (.jpg, .jpeg)
- PNG (.png)
- BMP (.bmp)
- GIF (.gif)
- TIFF (.tiff)
- WebP (.webp)
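Validation can be approximated with Pillow: check the extension against the list above and confirm the file actually parses. The helper below is an illustrative sketch, not the tool's built-in validator:

```python
from pathlib import Path
from PIL import Image

SUPPORTED_EXTENSIONS = {".jpg", ".jpeg", ".png", ".bmp", ".gif", ".tiff", ".webp"}

def is_valid_image(path: str) -> bool:
    """Return True if the file has a supported extension and Pillow can parse it."""
    if Path(path).suffix.lower() not in SUPPORTED_EXTENSIONS:
        return False
    try:
        with Image.open(path) as img:
            img.verify()  # raises an exception for truncated or corrupted files
        return True
    except Exception:
        return False
```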
- API-based detection (primary):
  - ModerateContent API: Advanced ML models for content classification
  - Sightengine API: Specialized in nudity and adult content detection
  - High accuracy and reliability
- Basic computer vision detection (fallback):
  - Skin-tone detection using the HSV color space (see the sketch below)
  - Used when APIs are unavailable
  - Lower accuracy; use with caution
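The fallback heuristic works roughly like this: convert the image to HSV, count the pixels that fall inside a rough skin-colour range, and flag the image if that fraction is high. The bounds and function name below are illustrative, not the exact values used by this project:

```python
import cv2
import numpy as np

def skin_ratio(image_path: str) -> float:
    """Return the fraction of pixels falling inside a rough HSV skin-tone range."""
    image = cv2.imread(image_path)
    if image is None:
        raise ValueError("Invalid image file")
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    # Rough skin-tone bounds; real values vary widely with lighting and skin tone,
    # which is why this fallback produces many false positives.
    lower = np.array([0, 48, 80], dtype=np.uint8)
    upper = np.array([20, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    return float(np.count_nonzero(mask)) / mask.size

# Flag the image if more than ~30% of pixels look like skin (cf. skin_threshold below).
if skin_ratio("path/to/image.jpg") > 0.3:
    print("High skin ratio detected (low-confidence signal only).")
```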
- Local Processing: Basic detection runs entirely on your machine
- API Communication: Images are sent to external services only when API detection is configured and enabled
- Safe Deletion: Flagged files are moved to the trash, not permanently deleted (see the sketch below)
- No Data Storage: This tool stores no images or results
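A rough sketch of trash-based deletion, using the optional `winshell` package on Windows (mentioned in the installation steps) and the third-party `send2trash` package elsewhere; neither call is taken from this project's source, and the helper name is illustrative:

```python
import sys

def move_to_trash(path: str) -> None:
    """Send a flagged file to the recycle bin / trash instead of deleting it permanently."""
    if sys.platform.startswith("win"):
        import winshell  # optional dependency from the installation section
        winshell.delete_file(path, allow_undo=True, no_confirm=True, silent=True)
    else:
        from send2trash import send2trash  # assumed extra dependency on macOS/Linux
        send2trash(path)

# Mirror the user-confirmation behaviour: never touch a file without explicit approval.
target = "path/to/flagged_image.jpg"
if input(f"Move {target} to trash? [y/N] ").strip().lower() == "y":
    move_to_trash(target)
```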
| Parameter | Description | Default |
|---|---|---|
| `use_api` | Enable API-based detection | True |
| `skin_threshold` | Skin detection sensitivity (0.0-1.0) | 0.3 |
| `adult_threshold` | API confidence threshold (0.0-1.0) | 0.5 |
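For example, these settings might be adjusted on a detector instance as shown below; the attribute names are assumed to mirror the table, so check the source before relying on them:

```python
from adult_content_detector import AdultContentDetector

detector = AdultContentDetector()

# Attribute names assumed to match the configuration table above.
detector.use_api = True          # prefer API-based detection when credentials are configured
detector.skin_threshold = 0.3    # fallback mode: fraction of skin-tone pixels needed to flag
detector.adult_threshold = 0.5   # minimum API confidence required to treat an image as adult

detector.process_image("path/to/image.jpg")
```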
- "Invalid image file" error
  - Ensure the file is a supported image format
  - Check if the file is corrupted
- API errors
  - Verify your API keys are correct
  - Check your internet connection
  - Ensure you haven't exceeded API limits
- Permission errors during deletion
  - Run with appropriate permissions
  - Check if the file is in use by another program
Enable debug output:
detector = AdultContentDetector()
detector.debug = True
- API-based detection: 95-99% accuracy (depending on service)
- Basic skin detection: 60-70% accuracy (many false positives)
- Recommendation: Always use API-based detection for production use
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
# Install development dependencies
pip install -r requirements-dev.txt
# Run tests
python -m pytest tests/
# Run linting
flake8 adult_content_detector.py
This project is licensed under the MIT License - see the LICENSE file for details.
This software is provided for educational and legitimate content moderation purposes only. Users are responsible for:
- Complying with local laws and regulations
- Respecting privacy and consent requirements
- Using the tool ethically and responsibly
- Not using it for harassment or illegal activities
The developers are not responsible for misuse of this software.
- Issues: GitHub Issues
- Email: support@anveshraman.rf.gd
⭐ If this project helped you, please consider giving it a star!