@@ -16,7 +16,7 @@ camera interface.

### Installation Instructions

- In order to use a depth sensor with OpenCV you should do the following steps:
+ In order to use the Astra camera's depth sensor with OpenCV you should do the following steps:

-# Download the latest version of Orbbec OpenNI SDK (from here < https://orbbec3d.com/develop/ > ).
   Unzip the archive, choose the build according to your operating system and follow installation
@@ -72,29 +72,32 @@ In order to use a depth sensor with OpenCV you should do the following steps:

### Code

- The Astra Pro camera has two sensors -- a depth sensor and a color sensor. The depth sensors
+ The Astra Pro camera has two sensors -- a depth sensor and a color sensor. The depth sensor
can be read using the OpenNI interface with @ref cv::VideoCapture class. The video stream is
- not available through OpenNI API and is only provided through the regular camera interface.
+ not available through OpenNI API and is only provided via the regular camera interface.
So, to get both depth and color frames, two @ref cv::VideoCapture objects should be created:

@snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Open streams

- The first object will use the Video4Linux2 interface to access the color sensor. The second one
- is using OpenNI2 API to retrieve depth data.
+ The first object will use the OpenNI2 API to retrieve depth data. The second one uses the
+ Video4Linux2 interface to access the color sensor. Note that the example above assumes that
+ the Astra camera is the first camera in the system. If you have more than one camera connected,
+ you may need to explicitly set the proper camera number.
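For reference, a minimal sketch of what opening these two streams could look like. The "Open streams" snippet lives in orbbec_astra.cpp; cv::CAP_OPENNI2_ASTRA and the camera index 0 below are assumptions that may need adjusting for your OpenCV build and camera setup:

@code{.cpp}
#include <opencv2/videoio.hpp>
#include <iostream>

int main()
{
    // Depth data comes through the OpenNI2 backend. cv::CAP_OPENNI2_ASTRA is assumed
    // to be available in this OpenCV build; cv::CAP_OPENNI2 is a possible fallback.
    cv::VideoCapture depthStream(cv::CAP_OPENNI2_ASTRA);

    // Color data comes through the regular camera interface (Video4Linux2 here).
    // Index 0 assumes the Astra is the first camera in the system.
    cv::VideoCapture colorStream(0, cv::CAP_V4L2);

    if (!depthStream.isOpened() || !colorStream.isOpened())
    {
        std::cerr << "Could not open the depth and/or color stream" << std::endl;
        return 1;
    }
    return 0;
}
@endcode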
Before using the created VideoCapture objects you may want to set up stream parameters by setting
objects' properties. The most important parameters are frame width, frame height and fps.
- For this example, we’ll configure width and height of both streams to VGA resolution as that’s
- the maximum resolution available for both sensors and we’d like both stream parameters to be the same:
+ For this example, we’ll configure width and height of both streams to VGA resolution, which is
+ the maximum resolution available for both sensors, and we’d like both stream parameters to be the
+ same for easier color-to-depth data registration:
@snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Setup streams
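A rough sketch of such a setup, reusing the hypothetical depthStream and colorStream objects from the sketch above (the 30 fps value is only an example, not taken from the snippet):

@code{.cpp}
// Request VGA resolution on both streams; the backend may pick the closest
// supported mode, so these values are requests rather than guarantees.
colorStream.set(cv::CAP_PROP_FRAME_WIDTH,  640);
colorStream.set(cv::CAP_PROP_FRAME_HEIGHT, 480);
depthStream.set(cv::CAP_PROP_FRAME_WIDTH,  640);
depthStream.set(cv::CAP_PROP_FRAME_HEIGHT, 480);
depthStream.set(cv::CAP_PROP_FPS,          30); // example frame rate
@endcode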
- For setting and getting some property of sensor data generators use @ref cv::VideoCapture::set and
+ For setting and retrieving some property of sensor data generators use @ref cv::VideoCapture::set and
@ref cv::VideoCapture::get methods respectively, e.g. :
@snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Get properties
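Continuing the same hypothetical fragment, reading a value back might look like this:

@code{.cpp}
// get() returns the value the stream actually uses, which may differ from
// what was requested with set().
double w = colorStream.get(cv::CAP_PROP_FRAME_WIDTH);
double h = colorStream.get(cv::CAP_PROP_FRAME_HEIGHT);
std::cout << "Color stream runs at " << w << "x" << h << std::endl;
@endcode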
- The following properties of cameras available through OpenNI interfaces are supported for the depth
+ The following properties of cameras available through the OpenNI interface are supported for the depth
generator:

- @ref cv::CAP_PROP_FRAME_WIDTH -- Frame width in pixels.
@@ -113,7 +116,7 @@ generator:
- @ref cv::CAP_PROP_OPENNI_FRAME_MAX_DEPTH -- A maximum supported depth of the camera in mm.
- @ref cv::CAP_PROP_OPENNI_BASELINE -- Baseline value in mm.
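For illustration, the depth-generator properties listed above could be queried like this (a hypothetical fragment, again using the depthStream object from the earlier sketches):

@code{.cpp}
double maxDepth = depthStream.get(cv::CAP_PROP_OPENNI_FRAME_MAX_DEPTH); // in mm
double baseline = depthStream.get(cv::CAP_PROP_OPENNI_BASELINE);        // in mm
double fps      = depthStream.get(cv::CAP_PROP_FPS);
std::cout << "max depth: " << maxDepth << " mm, baseline: " << baseline
          << " mm, fps: " << fps << std::endl;
@endcode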
- After the VideoCapture objects are set up you can start reading frames from them.
+ After the VideoCapture objects have been set up, you can start reading frames from them.

@note
OpenCV's VideoCapture provides synchronous API, so you have to grab frames in a new thread
@@ -138,11 +141,12 @@ VideoCapture can retrieve the following data:
-# data given from the color sensor is a regular BGR image (CV_8UC3).
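As a minimal, single-threaded illustration of grabbing and retrieving these data (the real sample does this from dedicated reading threads, as described below; cv::CAP_OPENNI_DEPTH_MAP selects the depth map among the OpenNI outputs):

@code{.cpp}
cv::Mat depthMap, colorFrame;

if (depthStream.grab() && depthStream.retrieve(depthMap, cv::CAP_OPENNI_DEPTH_MAP))
{
    // depthMap is CV_16UC1, depth values in millimeters
}
if (colorStream.grab() && colorStream.retrieve(colorFrame))
{
    // colorFrame is a regular CV_8UC3 BGR image
}
@endcode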
- When new data are available a reading thread notifies the main thread using a condition variable.
- A frame is stored in the ordered list -- the first frame is the latest one. As depth and color frames
- are read from independent sources two video streams may become out of sync even when both streams
- are set up for the same frame rate. A post-synchronization procedure can be applied to the streams
- to combine depth and color frames into pairs. The sample code below demonstrates this procedure:
+ When new data are available, each reading thread notifies the main thread using a condition variable.
+ A frame is stored in the ordered list -- the first frame in the list is the earliest captured,
+ the last frame is the latest captured. As depth and color frames are read from independent sources
+ two video streams may become out of sync even when both streams are set up for the same frame rate.
+ A post-synchronization procedure can be applied to the streams to combine depth and color frames into
+ pairs. The sample code below demonstrates this procedure:
@snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Pair frames
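Since the "Pair frames" snippet lives in orbbec_astra.cpp, the sketch below outlines one way such a post-synchronization step could be written; the data structures, the notification scheme and the 30 ms tolerance are assumptions rather than the sample's actual code:

@code{.cpp}
#include <opencv2/core.hpp>
#include <condition_variable>
#include <cstdint>
#include <cstdlib>
#include <list>
#include <mutex>

// A captured frame together with its timestamp (in milliseconds), as a reading
// thread would store it before notifying the main thread via dataReady.
struct FrameWithTime
{
    int64_t timestampMs;
    cv::Mat frame;
};

std::mutex listsMutex;
std::condition_variable dataReady;       // notified by the reading threads
std::list<FrameWithTime> depthFrames;    // earliest captured frame at the front
std::list<FrameWithTime> colorFrames;

// Wait until both lists contain data, then emit all depth/color pairs whose
// timestamps are close enough; frames without a close match are dropped.
void pairFrames(int64_t maxTimeDiffMs = 30)
{
    std::unique_lock<std::mutex> lock(listsMutex);
    dataReady.wait(lock, [] { return !depthFrames.empty() && !colorFrames.empty(); });

    while (!depthFrames.empty() && !colorFrames.empty())
    {
        FrameWithTime d = depthFrames.front();
        FrameWithTime c = colorFrames.front();

        if (std::abs(d.timestampMs - c.timestampMs) <= maxTimeDiffMs)
        {
            // The frames form a synchronized pair -- hand them to further processing.
            depthFrames.pop_front();
            colorFrames.pop_front();
        }
        else if (d.timestampMs < c.timestampMs)
        {
            depthFrames.pop_front();     // depth frame too old to be matched, drop it
        }
        else
        {
            colorFrames.pop_front();     // color frame too old to be matched, drop it
        }
    }
}
@endcode

In such a scheme the main thread would call a function like pairFrames() in a loop while the two reading threads keep grabbing frames from depthStream and colorStream and appending them to the lists.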