This project is a gesture and speech-based control system for Microsoft PowerPoint presentations using the Microsoft Kinect Sensor and C#. It enables presenters to interact with slides through hand gestures or voice commands—eliminating the need for physical remotes or keyboard shortcuts. Built on the Kinect for Windows SDK v1.7, it tracks user movement and recognizes specific gestures to trigger slide navigation, while also providing real-time visual feedback and multi-application support.
- Project Summary
- Features
- Project Structure
- Technology Stack
- Requirements
- Installation & Setup
- Usage Instructions
- Project Walkthrough & Learning Guide
- Components & Code Examples
- Gesture Controls
- Speech Recognition
- Limitations
- Keywords
- License
- Screenshots
- Gesture-based slide navigation: Move forward or backward in a PowerPoint presentation by extending your right or left arm.
- Real-time visual feedback: Application window shows your tracked head and hand positions with animated circles.
- Speech recognition: Control the UI and visual feedback with voice commands.
- Multi-application support: Gestures send key events to any foreground application (not limited to PowerPoint).
- Easy setup and operation: Step-by-step instructions for quick setup.
- Educational resource: Clear code and project structure for students to learn about Kinect, C#, and human-computer interaction.
The repository typically contains the following folders and files:
Gesture-Recognition-Control-Powerpoint-Presentation--Microsoft-Kinect-Sensor/
├── src/ # Main application source code (C#)
│ ├── KinectPowerPointControl/
│ │ ├── App.xaml.cs
│ │ ├── MainWindow.xaml
│ │ ├── MainWindow.xaml.cs
│ │ └── Properties/
│ │ └── AssemblyInfo.cs
│ └── ...
├── bin/ # Compiled binaries after build
├── Resources/ # Supporting files, assets, and images
├── README.md # Project documentation (this file)
├── *.sln / *.csproj # Visual Studio solution/project files
└── ...
Note: Some folders or file names might differ based on local organization.
- Programming Language: C#
- Framework: .NET Framework (Windows)
- Hardware: Microsoft Kinect Sensor (v1)
- SDK: Kinect for Windows SDK v1.7
- Optional: Microsoft Office PowerPoint (for controlling slides)
- Microsoft Kinect Sensor for Windows
- Kinect for Windows SDK v1.7
- Windows PC (compatible with Kinect SDK)
- Optional: Microsoft Office PowerPoint
- Install the Kinect SDK:
  - Download and install Kinect for Windows SDK v1.7.
  - Plug in your Kinect Sensor and ensure it is recognized by Windows.
- Clone this repository:
  git clone https://github.com/arnobt78/Gesture-Recognition-Control-Powerpoint-Presentation--Microsoft-Kinect-Sensor.git
  cd Gesture-Recognition-Control-Powerpoint-Presentation--Microsoft-Kinect-Sensor
- Open the project in Visual Studio:
  - Open the .sln file from the root directory.
- Build the solution:
  - Restore any missing NuGet packages if prompted.
  - Build the solution (Build > Build Solution).
- Run the application:
  - Connect your Kinect sensor.
  - Set up your PowerPoint presentation (optional).
  - Start the application from Visual Studio or by running the compiled binary.
- Stand in front of Kinect: Position yourself at least five feet away from the sensor.
- Track your movements: The application window will display your image and three tracking circles (head, left hand, right hand).
- Control slides with gestures:
- Right arm extended: "Next Slide" (sends right arrow key)
- Left arm extended: "Previous Slide" (sends left arrow key)
- Present your PowerPoint: Make PowerPoint the foreground application. Gestures will control slide navigation.
- Visual feedback: Circles grow and change color when your hand exceeds the threshold (45 cm from the head). Only one gesture is activated at a time.
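The snippet below is a minimal sketch of how that circle feedback could be implemented in WPF; the method name, sizes, and colors are illustrative assumptions, not the project's exact values:

// Sketch: highlight a hand circle once it crosses the 45 cm threshold.
// Assumes WPF Ellipse elements defined in MainWindow.xaml (names are hypothetical).
using System.Windows.Media;
using System.Windows.Shapes;

private const double GestureThreshold = 0.45; // metres from the head

private void UpdateHandCircle(Ellipse circle, double distanceFromHead)
{
    bool active = distanceFromHead > GestureThreshold;
    circle.Width = circle.Height = active ? 60 : 40;         // grow when active
    circle.Fill = active ? Brushes.Red : Brushes.LightGreen; // change color
}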
- The application initializes the Kinect sensor and subscribes to skeleton and audio streams.
- The main logic is contained in the MainWindow.xaml.cs file.
- Kinect provides body/skeleton tracking data, including head and hand positions.
- The UI visualizes these positions with animated ellipses.
- By monitoring the distance between your head and each hand, the app detects “right arm out” or “left arm out” gestures.
- When the threshold is exceeded, the app sends a keyboard event (right/left arrow).
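As a rough illustration, the core of this check can be expressed with the SDK v1.7 skeleton API as follows; the use of full 3D distance and the 0.45 m value (the 45 cm threshold mentioned above) are assumptions, and the actual code may compare positions differently:

// Sketch: detect "arm out" gestures from skeleton joints (Kinect SDK v1.7).
using System;
using System.Windows.Forms; // SendKeys
using Microsoft.Kinect;

static double Distance(Joint a, Joint b)
{
    double dx = a.Position.X - b.Position.X;
    double dy = a.Position.Y - b.Position.Y;
    double dz = a.Position.Z - b.Position.Z;
    return Math.Sqrt(dx * dx + dy * dy + dz * dz);
}

static void CheckGesture(Skeleton skeleton)
{
    Joint head = skeleton.Joints[JointType.Head];
    const double threshold = 0.45; // ~45 cm from the head

    if (Distance(head, skeleton.Joints[JointType.HandRight]) > threshold)
        SendKeys.SendWait("{RIGHT}"); // next slide
    else if (Distance(head, skeleton.Joints[JointType.HandLeft]) > threshold)
        SendKeys.SendWait("{LEFT}");  // previous slide
}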
- After a 4-second delay from startup, the app listens for specific voice commands:
computer show window
computer hide window
computer show circles
computer hide circles
- These commands let you control the visual feedback interface.
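A minimal sketch of how these four phrases could be wired up with the Microsoft.Speech runtime that ships with the Kinect SDK; the engine selection and audio plumbing are simplified assumptions:

// Sketch: grammar for the four voice commands (Microsoft.Speech runtime).
using Microsoft.Speech.Recognition;

var commands = new Choices(
    "computer show window", "computer hide window",
    "computer show circles", "computer hide circles");

var engine = new SpeechRecognitionEngine(); // real code selects the Kinect recognizer
engine.LoadGrammar(new Grammar(new GrammarBuilder(commands)));
engine.SpeechRecognized += (s, e) =>
{
    // e.Result.Text holds the recognized phrase; toggle window/circle visibility here
};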
- The arrow key events are sent to any foreground window, so gestures work in PowerPoint or even Notepad.
- The project is intended as a teaching resource for:
- Gesture recognition basics
- Integrating hardware sensors
- Real-time data visualization
- Combining speech and gesture inputs
- C# and .NET event-driven programming
// Initialize the Kinect sensor (Kinect for Windows SDK v1.7)
using System.Linq;
using System.Windows.Forms; // SendKeys
using Microsoft.Kinect;

KinectSensor sensor = KinectSensor.KinectSensors
    .FirstOrDefault(k => k.Status == KinectStatus.Connected);
sensor.SkeletonStream.Enable(); // required, or SkeletonFrameReady never fires
sensor.Start();

// Skeleton frame event handler
sensor.SkeletonFrameReady += (s, e) =>
{
    // Track skeletons, extract hand and head positions
    // Calculate distance from hands to head
    // If right hand distance > threshold -> send Key.Right
    // If left hand distance > threshold -> send Key.Left
    // Update UI circles with position and color feedback
};

// Speech recognition initialization
// On recognized command, update UI visibility as required

// Key event sending (for slide control)
SendKeys.SendWait("{RIGHT}"); // or "{LEFT}"
Refer to MainWindow.xaml.cs for the complete implementation.
- Right Arm Out: Advances to the next slide (Right Arrow).
- Left Arm Out: Moves to the previous slide (Left Arrow).
- Gesture Activation: A gesture triggers only when the hand moves beyond a certain threshold from the head position. Each gesture fires only once per activation; see the latch sketch after the tip below.
Tip: Gestures also work in other applications. For example, you can move the cursor in Notepad using gestures.
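The "fires once per activation" behavior is essentially a latch: trigger once when the threshold is crossed, then re-arm only after the hand drops back. A minimal sketch (variable and method names are hypothetical):

// Sketch: fire a gesture once, then re-arm only after the arm is lowered.
using System.Windows.Forms; // SendKeys

bool rightGestureActive = false;

void OnRightHandDistance(double distance, double threshold)
{
    if (distance > threshold && !rightGestureActive)
    {
        rightGestureActive = true;     // latch: trigger exactly once
        SendKeys.SendWait("{RIGHT}");
    }
    else if (distance <= threshold)
    {
        rightGestureActive = false;    // re-arm once the hand returns
    }
}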
The application supports the following voice commands:
computer show window
computer hide window
computer show circles
computer hide circles
There is a four-second delay after the program starts before speech recognition is activated.
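One way to implement such a delay, sketched here with a WPF DispatcherTimer and the engine from the speech sketch above (the actual project may do this differently):

// Sketch: start listening only after a 4-second startup delay.
using System;
using System.Windows.Threading;

var startupDelay = new DispatcherTimer { Interval = TimeSpan.FromSeconds(4) };
startupDelay.Tick += (s, e) =>
{
    startupDelay.Stop();
    engine.RecognizeAsync(RecognizeMode.Multiple); // begin continuous recognition
};
startupDelay.Start();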
- Embedded video activation: Cannot start embedded videos in PowerPoint directly. Use animations to trigger videos with arrow keys.
- Gesture sensitivity: Gestures are based on the distance between your head and hands. Unintended movements (e.g., bending down, stretching) may trigger gestures accidentally.
- Kinect SDK version: Only works with Kinect SDK v1.7 (may not be compatible with newer hardware or SDKs).
Gesture Recognition, Kinect, PowerPoint, C#, Kinect SDK, Human-Computer Interaction, Presentation Control, Speech Recognition, Touchless UI, Computer Vision
This project is for research and demonstration purposes. Please refer to the LICENSE file (if provided) for usage details.
Look above for example screenshots of the application in use.
This project demonstrates the integration of gesture and speech recognition for practical, real-world control of presentation software. By leveraging the Microsoft Kinect Sensor and C#, it provides a touchless, intuitive interface for presenters. The code and architecture are designed for learning purposes, making it an excellent resource to study human-computer interaction, hardware integration, and modern C# programming practices. Experiment with the code, try extending the gesture or speech logic, and explore new ways to build interactive applications!