Control your mouse cursor using just your eyes and blinks — no hands needed! This project uses MediaPipe's FaceMesh to track eye landmarks and PyAutoGUI to move the mouse and simulate clicks.
This project aims to:
- Enable hands-free mouse control for accessibility and convenience.
- Provide left-click functionality via blink detection.
- Serve as a base for gesture-based human-computer interaction systems.
How it works:
- MediaPipe FaceMesh detects 468 facial landmarks.
- Iris movement is tracked to move the mouse pointer on screen.
- A left-eye blink is detected by measuring the vertical distance between the eyelid landmarks (FaceMesh indices 145 and 159).
- A calibration phase during the first few seconds measures your eye-open distance, improving blink-detection accuracy.
- Mouse movement is smoothed to reduce jitter.
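The per-frame logic above can be sketched in plain Python. This is a minimal, illustrative sketch assuming normalized (0..1) landmark coordinates as produced by MediaPipe FaceMesh; the function names, the 0.5 blink threshold, and the smoothing factor `alpha` are assumptions, not values taken from the project's script.

```python
# Illustrative sketch of the per-frame logic. All names, thresholds,
# and defaults here are assumptions, not the project's actual values.

def eyelid_gap(upper_y, lower_y):
    """Vertical distance between the upper and lower eyelid landmarks
    (e.g. FaceMesh indices 159 and 145 for the left eye)."""
    return abs(lower_y - upper_y)

def is_blink(gap, open_gap, threshold=0.5):
    """Count the eye as closed when the eyelid gap drops below a
    fraction of the calibrated eye-open distance."""
    return gap < threshold * open_gap

def smooth(prev, target, alpha=0.2):
    """Exponential moving average to damp cursor jitter before the
    position is handed to pyautogui.moveTo()."""
    return prev + alpha * (target - prev)

def to_screen(x_norm, y_norm, screen_w=1920, screen_h=1080):
    """Map a normalized iris position to pixel coordinates."""
    return x_norm * screen_w, y_norm * screen_h
```

In the real script, these steps run inside an OpenCV capture loop: each webcam frame is passed to FaceMesh, the iris landmark drives `to_screen`/`smooth`, and a detected blink triggers a PyAutoGUI click.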
Getting started:
- Make sure you have Python 3.x installed, then run `pip install opencv-python mediapipe pyautogui`.
- Download and run the main Python script, `mouse_tracking_using_eyeball.py`: `python mouse_tracking_using_eyeball.py`
- Follow the on-screen instructions:
  a. Keep your eyes open during the first few seconds (for calibration).
  b. Move your eyes to control the cursor.
  c. Blink your left eye to perform a mouse click.
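The calibration step (keeping your eyes open for the first few seconds) can be sketched as averaging the eyelid gap over an initial batch of frames. The frame count and function name below are illustrative assumptions, not taken from the project's script.

```python
# Sketch of the calibration phase: average the eyelid gap over the
# first frames (eyes open) to get a per-user baseline. The constant
# and names are illustrative assumptions.

CALIBRATION_FRAMES = 30  # roughly "a few seconds" at webcam frame rates

def update_baseline(samples, gap):
    """Collect eyelid-gap samples each frame; return the baseline once
    enough frames have been seen, else None (still calibrating)."""
    samples.append(gap)
    if len(samples) < CALIBRATION_FRAMES:
        return None
    return sum(samples) / len(samples)
```

Once `update_baseline` returns a value, the script can switch from calibration to tracking mode and compare each frame's gap against that baseline.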
Key benefits:
- Accessible for users with physical disabilities.
- Hardware-free — uses just your webcam.
- Custom calibration adapts to each user.
- Extensible — easily add gestures or voice commands.
Planned enhancements:
- Right-eye blink for right-click.
- Eye-based scrolling or drag-and-drop.
- Gesture-based UI for menu navigation.
- On-screen overlay for gaze zones.
- Voice or sound feedback.
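The first planned enhancement, right-eye blink for right-click, could be layered on the existing blink detection by classifying the state of both eyes per frame. This is a hypothetical sketch of one way to do it, not part of the current script; the function name and the decision to ignore both-eyes-closed frames are assumptions.

```python
# Hypothetical extension: map per-eye blink states to click actions.
# Ignoring frames where both eyes are closed filters out natural
# blinks, which close both eyes at once.

def classify_blink(left_closed, right_closed):
    """Return the click action for one frame's eye states, or None."""
    if left_closed and right_closed:
        return None  # likely a natural blink; do nothing
    if left_closed:
        return "left_click"   # e.g. pyautogui.click()
    if right_closed:
        return "right_click"  # e.g. pyautogui.click(button="right")
    return None
```

A debounce (requiring the eye to stay closed for a few consecutive frames) would likely be needed in practice to avoid spurious clicks.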