path1 = planner.PlanToConfiguration(robot, goal)
path2 = planner.PlanToBasePose(robot, goal_pose)
```
## Perception Pipeline

Recently, support has been added for a few perception routines. The general structure is intended
to mirror that of the planning pipeline, but it is somewhat less encapsulated than planning
from the user's perspective.

There is a `prpy.perception.base.PerceptionModule` class which is extended by every perception
routine. Every routine provides some common perception methods, which are annotated with
`@PerceptionMethod`. Here is an example call (as run from a typical `herbpy` console):

```python
from prpy.perception.apriltags import ApriltagsModule
from prpy.util import FindCatkinResource

# FindCatkinResource resolves a path relative to an installed catkin package.
adetector = ApriltagsModule(marker_topic='/apriltags_kinect2/marker_array',
                            marker_data_path=FindCatkinResource('pr_ordata',
                                                                'data/objects/tag_data.json'),
                            kinbody_path=FindCatkinResource('pr_ordata', 'data/objects'),
                            destination_frame='/map',
                            detection_frame='/head/kinect2_rgb_optical_frame')
detected_objects = adetector.DetectObjects(robot)
```
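A new perception routine would follow the same pattern: subclass `PerceptionModule` and decorate
its detection entry points with `@PerceptionMethod`. The sketch below is a hypothetical example,
assuming both names are importable from `prpy.perception.base`; the class name, constructor
arguments, and detection logic are placeholders rather than part of prpy.

```python
from prpy.perception.base import PerceptionModule, PerceptionMethod


class MyDetectorModule(PerceptionModule):
    """Hypothetical detector illustrating the PerceptionModule pattern."""

    def __init__(self, detection_frame='/head/kinect2_rgb_optical_frame',
                 destination_frame='/map'):
        # TF frames used to transform raw detections into world coordinates.
        self.detection_frame = detection_frame
        self.destination_frame = destination_frame

    @PerceptionMethod
    def DetectObjects(self, robot, **kwargs):
        # A real module would query its detector, add the resulting KinBodies
        # to robot.GetEnv(), and return them; this placeholder returns nothing.
        detected_kinbodies = []
        return detected_kinbodies
```

Once constructed, such a module would be called the same way as the AprilTags example above,
e.g. `mydetector.DetectObjects(robot)`.
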
### Perception Methods

Currently, the following perception routines are supported:

- `AprilTags`
- `VNCC`: Vectorized Normalized Cross Correlation
- `SimTrack`
- `BlockDetector`
- `ROCK`: Robust Object Constellation and Kinematic Pose

## Environment Cloning