
Gesture Recognition is a project to control Microsoft PowerPoint while presenting on a large screen, using the Microsoft Kinect Sensor, C#, and the Kinect SDK. Using their right or left hand, the presenter can move to the next or previous slide. They can also zoom in or out by making a fist, and play or pause the presentation by showing all or no fingers.


Gesture Recognition Control for PowerPoint Presentation using Microsoft Kinect Sensor

(Screenshots of the application in use, captured 2024-09-09.)


Project Summary

This project is a gesture and speech-based control system for Microsoft PowerPoint presentations using the Microsoft Kinect Sensor and C#. It enables presenters to interact with slides through hand gestures or voice commands—eliminating the need for physical remotes or keyboard shortcuts. Built on the Kinect for Windows SDK v1.7, it tracks user movement and recognizes specific gestures to trigger slide navigation, while also providing real-time visual feedback and multi-application support.


Table of Contents

  • Features
  • Project Structure
  • Technology Stack
  • Requirements
  • Installation & Setup
  • Usage Instructions
  • Project Walkthrough & Learning Guide
  • Components & Code Examples
  • Gesture Controls
  • Speech Recognition
  • Limitations
  • Keywords
  • License
  • Screenshots
  • Conclusion


Features

  • Gesture-based slide navigation: Move forward or backward in a PowerPoint presentation by extending your right or left arm.
  • Real-time visual feedback: Application window shows your tracked head and hand positions with animated circles.
  • Speech recognition: Control the UI and visual feedback with voice commands.
  • Multi-application support: Gestures send key events to any foreground application (not limited to PowerPoint).
  • Easy setup and operation: Step-by-step instructions for quick setup.
  • Educational resource: Clear code and project structure for students to learn about Kinect, C#, and human-computer interaction.

Project Structure

The repository typically contains the following folders and files:

Gesture-Recognition-Control-Powerpoint-Presentation--Microsoft-Kinect-Sensor/
├── src/                        # Main application source code (C#)
│   ├── KinectPowerPointControl/
│   │   ├── App.xaml.cs
│   │   ├── MainWindow.xaml
│   │   ├── MainWindow.xaml.cs
│   │   └── Properties/
│   │       └── AssemblyInfo.cs
│   └── ...
├── bin/                        # Compiled binaries after build
├── Resources/                  # Supporting files, assets, and images
├── README.md                   # Project documentation (this file)
├── *.sln / *.csproj            # Visual Studio solution/project files
└── ...

Note: Some folders or file names might differ based on local organization.


Technology Stack

  • Programming Language: C#
  • Framework: .NET Framework (Windows)
  • Hardware: Microsoft Kinect Sensor (v1)
  • SDK: Kinect for Windows SDK v1.7
  • Optional: Microsoft Office PowerPoint (for controlling slides)

Requirements

  • Microsoft Kinect Sensor for Windows
  • Kinect for Windows SDK v1.7
  • Windows PC (compatible with Kinect SDK)
  • Optional: Microsoft Office PowerPoint

Installation & Setup

  1. Install the Kinect SDK: Download and install the Kinect for Windows SDK v1.7 from Microsoft.

  2. Clone this repository:

    git clone https://github.com/arnobt78/Gesture-Recognition-Control-Powerpoint-Presentation--Microsoft-Kinect-Sensor.git
    cd Gesture-Recognition-Control-Powerpoint-Presentation--Microsoft-Kinect-Sensor
  3. Open the project in Visual Studio:

    • Open the .sln file from the root directory.
  4. Build the solution:

    • Restore any missing NuGet packages if prompted.
    • Build the solution (Build > Build Solution).
  5. Run the application:

    • Connect your Kinect sensor.
    • Set your PowerPoint presentation (optional).
    • Start the application from Visual Studio or by running the compiled binary.

Usage Instructions

  1. Stand in front of Kinect: Position yourself at least five feet away from the sensor.
  2. Track your movements: The application window will display your image and three tracking circles (head, left hand, right hand).
  3. Control slides with gestures:
    • Right arm extended: "Next Slide" (sends right arrow key)
    • Left arm extended: "Previous Slide" (sends left arrow key)
  4. Present your PowerPoint: Make PowerPoint the foreground application. Gestures will control slide navigation.
  5. Visual feedback: Circles grow and change color when your hand moves beyond the threshold (45 cm from the head). Only one gesture is activated at a time.

Project Walkthrough & Learning Guide

1. Kinect Initialization

  • The application initializes the Kinect sensor and subscribes to skeleton and audio streams.
  • The main logic is contained in the MainWindow.xaml.cs file.
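With SDK v1.7, this initialization can be sketched as follows (illustrative only; the handler name OnSkeletonFrameReady is a placeholder, and the real wiring lives in MainWindow.xaml.cs):

```csharp
using System.Linq;
using Microsoft.Kinect;

// Pick the first connected sensor and enable the streams this app uses.
KinectSensor sensor = KinectSensor.KinectSensors
    .FirstOrDefault(k => k.Status == KinectStatus.Connected);

if (sensor != null)
{
    sensor.SkeletonStream.Enable();                    // head/hand joint tracking
    sensor.SkeletonFrameReady += OnSkeletonFrameReady; // per-frame gesture checks
    sensor.Start();

    // The audio stream feeds the optional speech recognizer.
    System.IO.Stream audioStream = sensor.AudioSource.Start();
}
```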

2. Skeleton Tracking

  • Kinect provides body/skeleton tracking data, including head and hand positions.
  • The UI visualizes these positions with animated ellipses.
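As a sketch, a tracked joint can be mapped into image coordinates and used to position a WPF ellipse (headEllipse is an illustrative name; the project's own visuals may differ):

```csharp
// Map the head joint from skeleton space (meters) into color-image pixels,
// then center a WPF Ellipse on that point within a Canvas.
Joint head = skeleton.Joints[JointType.Head];
ColorImagePoint point = sensor.CoordinateMapper.MapSkeletonPointToColorPoint(
    head.Position, ColorImageFormat.RgbResolution640x480Fps30);

Canvas.SetLeft(headEllipse, point.X - headEllipse.Width / 2);
Canvas.SetTop(headEllipse, point.Y - headEllipse.Height / 2);
```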

3. Gesture Recognition Logic

  • By monitoring the distance between your head and each hand, the app detects “right arm out” or “left arm out” gestures.
  • When the threshold is exceeded, the app sends a keyboard event (right/left arrow).
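The core test, together with the once-per-activation latch described under Gesture Controls below, can be sketched like this (illustrative names; the 45 cm threshold comes from the usage notes above):

```csharp
const float Threshold = 0.45f; // 45 cm, in skeleton-space meters
bool rightGestureActive = false;

void CheckGestures(Skeleton skeleton)
{
    SkeletonPoint head = skeleton.Joints[JointType.Head].Position;
    SkeletonPoint rightHand = skeleton.Joints[JointType.HandRight].Position;

    // "Right arm out": hand extended sideways past the head by the threshold.
    bool extended = rightHand.X > head.X + Threshold;

    if (extended && !rightGestureActive)
    {
        rightGestureActive = true;                          // latch: fire once
        System.Windows.Forms.SendKeys.SendWait("{RIGHT}");  // next slide
    }
    else if (!extended)
    {
        rightGestureActive = false;                         // re-arm the gesture
    }
}
```

The left-arm case mirrors this with JointType.HandLeft, a subtraction on X, and "{LEFT}".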

4. Speech Recognition (Optional)

  • After a 4-second delay from startup, the app listens for specific voice commands:
    • computer show window
    • computer hide window
    • computer show circles
    • computer hide circles
  • These commands let you control the visual feedback interface.
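One way to wire up these phrases with .NET's managed speech API (a sketch; the project may instead use the Kinect-specific speech assembly fed by the sensor's audio stream):

```csharp
using System.Speech.Recognition;

var recognizer = new SpeechRecognitionEngine();

// Restrict the grammar to exactly the four supported phrases.
var commands = new Choices(
    "computer show window", "computer hide window",
    "computer show circles", "computer hide circles");
recognizer.LoadGrammar(new Grammar(new GrammarBuilder(commands)));

recognizer.SpeechRecognized += (s, e) =>
{
    // e.Result.Text is one of the four phrases above; toggle the
    // window or circle visibility accordingly.
};

recognizer.SetInputToDefaultAudioDevice();
recognizer.RecognizeAsync(RecognizeMode.Multiple);
```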

5. Multi-Application Support

  • The arrow key events are sent to any foreground window, so gestures work in PowerPoint or even Notepad.

6. Educational Focus

  • The project is intended as a teaching resource for:
    • Gesture recognition basics
    • Integrating hardware sensors
    • Real-time data visualization
    • Combining speech and gesture inputs
    • C# and .NET event-driven programming

Components & Code Examples

Main Logical Flow (Pseudo/Example)

// Initialize the Kinect sensor and enable skeleton tracking
KinectSensor sensor = KinectSensor.KinectSensors[0];
sensor.SkeletonStream.Enable();
sensor.Start();

// Skeleton frame event handler
sensor.SkeletonFrameReady += (s, e) => {
    // Track skeletons and extract the hand and head positions
    // Calculate the distance from each hand to the head
    // If the right-hand distance exceeds the threshold -> send Key.Right
    // If the left-hand distance exceeds the threshold -> send Key.Left
    // Update the UI circles with position and color feedback
};

// Speech recognition initialization
// On a recognized command, update the UI visibility as required

// Key event sending (for slide control)
SendKeys.SendWait("{RIGHT}");  // or "{LEFT}"

Refer to MainWindow.xaml.cs for the complete implementation.


Gesture Controls

  • Right Arm Out: Advances to the next slide (Right Arrow)
  • Left Arm Out: Moves to the previous slide (Left Arrow)
  • Gesture Activation: Gesture triggers only when the hand moves beyond a certain threshold from the head position. Each gesture only fires once per activation.

Tip: Gestures also work in other applications. For example, you can move the cursor in Notepad using gestures.


Speech Recognition

The application supports the following voice commands:

  • computer show window
  • computer hide window
  • computer show circles
  • computer hide circles

There is a four-second delay after the program starts before speech recognition is activated.


Limitations

  1. Embedded video activation: Cannot start embedded videos in PowerPoint directly. Use animations to trigger videos with arrow keys.
  2. Gesture sensitivity: Gestures are based on the distance between your head and hands. Unintended movements (e.g., bending down, stretching) may trigger gestures accidentally.
  3. Kinect SDK version: Only works with Kinect SDK v1.7 (may not be compatible with newer hardware or SDKs).

Keywords

Gesture Recognition, Kinect, PowerPoint, C#, Kinect SDK, Human-Computer Interaction, Presentation Control, Speech Recognition, Touchless UI, Computer Vision


License

This project is for research and demonstration purposes. Please refer to the LICENSE file (if provided) for usage details.


Screenshots

Look above for example screenshots of the application in use.


Conclusion

This project demonstrates the integration of gesture and speech recognition for practical, real-world control of presentation software. By leveraging the Microsoft Kinect Sensor and C#, it provides a touchless, intuitive interface for presenters. The code and architecture are designed for learning purposes, making it an excellent resource to study human-computer interaction, hardware integration, and modern C# programming practices. Experiment with the code, try extending the gesture or speech logic, and explore new ways to build interactive applications!

