
This project showcases my skills in robotics and computer vision, as I create a web-based control interface for a robotic arm using Flask and React, and implement hand gesture recognition using OpenCV. The robotic arm can execute commands based on the user's hand gestures, such as picking up and dropping objects.


Robotic Arm Controlled by Hand Gestures

In this end-to-end proof-of-concept project, I present a gesture-controlled robotic arm that uses real-time computer vision to track the x, y, and z coordinates of the user's hand. As an integral part of my professional portfolio, this system exemplifies my skills in developing user-friendly applications for advanced computer vision and control tasks.

Robot arm controlled by hand gestures

Hardware

The hardware components used in this project include:

  • OAK-D Lite stereo camera for obtaining 3D coordinates
  • STM32F407 Cortex-M4 development board for controlling the servos
  • Three MG996R servos
  • Three KY66 servos
  • CH341T serial adapter module for UART USB communication

Software

The software components consist of a Flask app for handling back-end functionalities and a React app for the user interface. The Flask app receives commands from the user interface and sends them to the STM32 development board, which in turn controls the servo motors.

Firmware

The firmware running on the STM32 development board plays a crucial role in translating the command string sent by the Flask app into actions performed by the robotic arm's servo motors. This section details the key aspects of the STM32 code and its role in the project.

The STM32 code is responsible for:
  1. Serial communication: The STM32 development board receives commands from the Flask app via a serial interface. The code parses the received command strings and extracts the necessary information for controlling the servo motors.
  2. Servo motor control: The STM32 code converts the angles received from the command strings into corresponding PWM signals to control the servo motors. The code ensures smooth and accurate movement of the robotic arm's base, arm, and claw.
  3. Safety mechanisms: The firmware incorporates safety mechanisms to prevent the robotic arm from exceeding its operational limits or moving too quickly. These safeguards protect both the user and the hardware from potential damage.
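The firmware itself is written in C, but the command-handling logic can be sketched in Python. This is an illustrative sketch only: the helper names, the 500-2500 µs servo pulse range, and the tick arithmetic are assumptions based on the 50 Hz / ARR = 39999 timer configuration described below, not code taken from the project.

```python
# Illustrative Python sketch of the firmware's command handling.
# The real firmware is C on the STM32; the names and the 500-2500 us
# pulse range here are assumptions, not taken from the source.

def parse_command(cmd: str) -> dict:
    """Parse a command like '#BASE1-90,ARM2-45,ARM3-135\\n' into joint angles."""
    body = cmd.strip().lstrip("#")
    angles = {}
    for part in body.split(","):
        joint, _, angle = part.rpartition("-")
        angles[joint] = int(angle)
    return angles

def angle_to_ccr(angle: int, arr: int = 39999) -> int:
    """Convert a joint angle (0-180 deg) to a timer compare (CCR) value.

    Assumes a 50 Hz PWM period (20 ms spread over ARR+1 = 40000 ticks,
    i.e. 0.5 us per tick) and a 500-2500 us servo pulse range.
    """
    pulse_us = 500 + (angle / 180) * 2000  # 0 deg -> 500 us, 180 deg -> 2500 us
    ticks_per_us = (arr + 1) / 20000       # 40000 ticks / 20000 us = 2
    return round(pulse_us * ticks_per_us)

print(parse_command("#BASE1-90,ARM2-45,ARM3-135\n"))
print(angle_to_ccr(90))  # 3000 ticks = a 1500 us centre pulse
```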
Timer and UART Settings

The main initialisation routines are:

  • SystemClock_Config: Configures the system clock
  • MX_GPIO_Init: Initialises the GPIO pins
  • MX_TIM2_Init: Initialises Timer 2
  • MX_TIM3_Init: Initialises Timer 3
  • MX_UART4_Init: Initialises UART 4

Flashing the STM32 Firmware

  1. Compile the provided source code using STM32CubeIDE.
  2. Flash the compiled code to the STM32F4XX microcontroller.
  3. Connect the servo motors to the respective GPIO pins mentioned in the code.
  4. Power up the system and establish a UART connection to communicate with the robotic arm.
  5. Send commands in the following format:
    Example: #BASE1-90,ARM2-45,ARM3-135\n
    The robotic arm will move the joints to the specified angles.
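On the host side, building and sending such a command can be sketched as below. The helper name, serial port path, and baud rate are illustrative assumptions; only the command format itself comes from the steps above.

```python
# Host-side sketch: build a command string in the format the firmware
# expects and send it over the UART adapter.

def build_command(base: int, arm2: int, arm3: int) -> str:
    """Return a command like '#BASE1-90,ARM2-45,ARM3-135\\n'."""
    return f"#BASE1-{base},ARM2-{arm2},ARM3-{arm3}\n"

cmd = build_command(90, 45, 135)
print(repr(cmd))  # '#BASE1-90,ARM2-45,ARM3-135\n'

# With pyserial, the bytes would be written to the CH341T adapter, e.g.:
#   import serial
#   ser = serial.Serial("/dev/ttyUSB0", 115200)  # port/baud are assumptions
#   ser.write(cmd.encode())
```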

PWM Pin Clock Configurations

To achieve a 50 Hz PWM frequency from the 84 MHz APB1 timer clock, the prescaler (PSC) and auto-reload register (ARR) are set as follows:

Timer frequency = APB1 timer clock / (PSC + 1) / (ARR + 1)
                = 84,000,000 Hz / (41 + 1) / (39999 + 1)
                = 50 Hz

With PSC = 41 and ARR = 39999, the timer ticks at 2 MHz and overflows every 40,000 ticks, giving the 20 ms (50 Hz) period that standard hobby servos expect.
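The arithmetic can be checked numerically:

```python
# Verify the 50 Hz timer configuration: f = clk / (PSC + 1) / (ARR + 1)
APB1_TIMER_CLK = 84_000_000  # Hz
PSC = 41
ARR = 39999

freq = APB1_TIMER_CLK / (PSC + 1) / (ARR + 1)
print(freq)  # 50.0
```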

STM32 clock configuration diagram
STM32 pinout configuration diagram

Flask App

The Flask app serves the React app and handles the following routes:

  • /: Serves the main index.html file
  • /<path:path>: Serves static files
  • /command: Receives command data via a POST request, processes the command, and sends the appropriate serial data
  • /computer_vision_video_feed: Provides a video feed from the computer vision module
  • /robot_arm_video_feed: Provides a video feed from the robot arm
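A minimal sketch of the /command route is shown below. The JSON field names and the echoed response are assumptions; in the real app the command string is written to the serial port rather than just returned.

```python
# Sketch of the /command route. Field names and the response body are
# assumptions; the real app forwards the command to the STM32 over serial.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/command", methods=["POST"])
def command():
    data = request.get_json()
    cmd = f"#BASE1-{data['base']},ARM2-{data['arm2']},ARM3-{data['arm3']}\n"
    # serial_port.write(cmd.encode())  # real app: forward to the STM32
    return jsonify({"sent": cmd})

# Exercise the route without running a server:
client = app.test_client()
resp = client.post("/command", json={"base": 90, "arm2": 45, "arm3": 135})
print(resp.get_json())
```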

React App

The React app serves as the user interface, allowing users to interact with and control the robotic arm.

Hand Detection and Gesture Recognition

When a hand is detected in the video feed, the system recognises the hand's gesture (either "FIST" or "FIVE") and extracts the x, y, and z coordinates of the hand. These values are then converted into angles for controlling the robotic arm's servos. The robot's base, arm, and claw movements are determined by the hand's yaw, y-coordinate, and z-coordinate, respectively. The command string is generated based on these values and sent to the STM32 development board, which moves the robotic arm accordingly.
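The mapping described above, from yaw, y, and z to base, arm, and claw angles, can be sketched as follows. The input coordinate ranges and the 0-180 degree servo limits are illustrative assumptions, not values from the project.

```python
# Sketch of mapping hand pose to servo angles. The input ranges and
# 0-180 degree servo limits are illustrative assumptions.

def scale(value, in_min, in_max, out_min=0, out_max=180):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    value = max(in_min, min(in_max, value))
    span = (value - in_min) / (in_max - in_min)
    return round(out_min + span * (out_max - out_min))

def hand_to_command(yaw_deg, y_mm, z_mm):
    base = scale(yaw_deg, -90, 90)   # hand yaw -> base rotation
    arm = scale(y_mm, 0, 400)        # hand height -> arm angle
    claw = scale(z_mm, 200, 800)     # hand depth -> claw angle
    return f"#BASE1-{base},ARM2-{arm},ARM3-{claw}\n"

print(hand_to_command(0, 200, 500))  # '#BASE1-90,ARM2-90,ARM3-90\n'
```

Clamping the inputs before scaling mirrors the firmware's safety limits: an out-of-range hand position can never command an angle outside the servo's travel.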
