KinestheticInteraction
Requirements: Ubuntu 14.04 Trusty, ROS Indigo

Dependencies:
- ROS
- HLPR_Manipulation
To interact with the robot, the KinestheticInteraction abstract class connects speech commands to basic robot actions. Specifically, it supports the following commands:
- Mic check
  - Internal: `HEAR_CHECK`
  - Speech command: "Can you hear me?"
  - Robot response: "I heard ya!"
- Greeting
  - Internal: `GREETING`
  - Speech command: "Hello"
  - Robot response: "Hello!"
- Open/Close hand
  - Internal: `OPEN_HAND`, `CLOSE_HAND`
  - Speech command: "Open/Close your hand"
  - Robot response: "Ok" and the gripper opens/closes
- Start gravity compensation
  - Internal: `START_GC`
  - Speech command: "Release your arm"
  - Robot response: "Ok" and the arm can now move
- End gravity compensation
  - Internal: `END_GC`
  - Speech command: "Hold your arm"
  - Robot response: "Ok" and the arm will now hold itself
- Demo start/stop
- Trajectory start/stop
- Keyframe
The last three commands (demo start/stop, trajectory start/stop, and keyframe) do not move the robot directly. Instead, the class declares abstract methods that are called when these commands are heard, so whatever you implement in your extended class is what happens during these commands; see the sketch below.
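As a rough illustration, the subclass below fills in those hooks with logging only. The module path and the callback names (`demonstration_start`, `demonstration_end`, `trajectory_start`, `trajectory_end`, `keyframe`) are assumptions here, so consult `basic_kinesthetic_interaction.py` for the actual abstract methods you must implement.

```python
#!/usr/bin/env python
# Minimal sketch of a KinestheticInteraction subclass.
# NOTE: the import path and the callback names below are illustrative
# assumptions; see basic_kinesthetic_interaction.py for the real interface.
import rospy
from hlpr_kinesthetic_interaction.kinesthetic_interaction import KinestheticInteraction  # assumed path


class BasicKinestheticInteraction(KinestheticInteraction):
    """Decides what the non-motion commands (demo, trajectory, keyframe) do."""

    def demonstration_start(self, cmd):
        rospy.loginfo("Demo started: e.g. begin recording keyframes")

    def demonstration_end(self, cmd):
        rospy.loginfo("Demo ended: e.g. save the recorded demonstration")

    def trajectory_start(self, cmd):
        rospy.loginfo("Trajectory recording started")

    def trajectory_end(self, cmd):
        rospy.loginfo("Trajectory recording stopped")

    def keyframe(self, cmd):
        rospy.loginfo("Keyframe recorded at the current arm pose")


if __name__ == "__main__":
    rospy.init_node("basic_kinesthetic_interaction")
    interaction = BasicKinestheticInteraction()
    rospy.spin()
```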
To activate the class, you need to extend the abstract KinestheticInteraction class and then call the `kinesthetic_interaction` service. An example of how to extend the class can be seen in `basic_kinesthetic_interaction.py`, and a working script that makes the service call can be found in `activate_kinesthetic_interaction`.
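For reference, a hedged sketch of such a service call is shown below. The service type (`KinestheticInteract`) and its boolean request field are assumptions, so check `activate_kinesthetic_interaction` and the package's srv definitions for the actual call.

```python
#!/usr/bin/env python
# Minimal sketch of toggling kinesthetic interaction via the service.
# NOTE: the service type and its boolean request field are assumptions;
# see activate_kinesthetic_interaction for the working script.
import rospy
from hlpr_kinesthetic_interaction.srv import KinestheticInteract  # assumed srv type


def activate(enable=True):
    rospy.wait_for_service("kinesthetic_interaction")
    try:
        toggle = rospy.ServiceProxy("kinesthetic_interaction", KinestheticInteract)
        response = toggle(enable)  # assumed: True starts listening for the speech commands
        rospy.loginfo("Kinesthetic interaction toggled: %s", response)
    except rospy.ServiceException as exc:
        rospy.logerr("Service call failed: %s", exc)


if __name__ == "__main__":
    rospy.init_node("activate_kinesthetic_interaction_example")
    activate(True)
```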