
Commit a3de457

Merge pull request opencv#18971 from GArik:orbbec
2 parents 7d7d907 + 38a4eaf

File tree: 2 files changed, 74 additions & 54 deletions

2 files changed

+74
-54
lines changed

doc/tutorials/videoio/orbbec-astra/orbbec_astra.markdown

Lines changed: 29 additions & 16 deletions
@@ -9,8 +9,8 @@ Using Orbbec Astra 3D cameras {#tutorial_orbbec_astra}
 
 This tutorial is devoted to the Astra Series of Orbbec 3D cameras (https://orbbec3d.com/product-astra-pro/).
 That cameras have a depth sensor in addition to a common color sensor. The depth sensors can be read using
-the OpenNI interface with @ref cv::VideoCapture class. The video stream is provided through the regular camera
-interface.
+the open source OpenNI API with @ref cv::VideoCapture class. The video stream is provided through the regular
+camera interface.
 
 ### Installation Instructions
 
@@ -70,15 +70,20 @@ In order to use a depth sensor with OpenCV you should do the following steps:
 
 ### Code
 
-To get both depth and color frames, two @ref cv::VideoCapture objects should be created:
+The Astra Pro camera has two sensors -- a depth sensor and a color sensor. The depth sensors
+can be read using the OpenNI interface with @ref cv::VideoCapture class. The video stream is
+not available through OpenNI API and is only provided through the regular camera interface.
+So, to get both depth and color frames, two @ref cv::VideoCapture objects should be created:
 
 @snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Open streams
 
-The first object will use the regular Video4Linux2 interface to access the color sensor. The second one
+The first object will use the Video4Linux2 interface to access the color sensor. The second one
 is using OpenNI2 API to retrieve depth data.
 
-Before using the created VideoCapture objects you may want to setup stream parameters by setting
-objects' properties. The most important parameters are frame width, frame height and fps:
+Before using the created VideoCapture objects you may want to set up stream parameters by setting
+objects' properties. The most important parameters are frame width, frame height and fps.
+For this example, we’ll configure width and height of both streams to VGA resolution as that’s
+the maximum resolution available for both sensors and we’d like both stream parameters to be the same:
 
 @snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Setup streams
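
As a side note, the "Open streams" and "Setup streams" snippets referenced above boil down to something like the following minimal sketch. The device index 0, the CAP_V4L2 and CAP_OPENNI2 backend constants and the hard-coded VGA size are assumptions made here for illustration, not lines taken from the sample:

```cpp
#include <opencv2/videoio.hpp>
#include <iostream>

int main()
{
    using namespace cv;

    // Color sensor through the regular camera interface (V4L2 on Linux);
    // device index 0 is an assumption for this sketch.
    VideoCapture colorStream(0, CAP_V4L2);

    // Depth sensor through the OpenNI2 backend.
    VideoCapture depthStream(CAP_OPENNI2);

    if (!colorStream.isOpened() || !depthStream.isOpened())
    {
        std::cerr << "ERROR: failed to open color or depth stream" << std::endl;
        return 1;
    }

    // Configure both streams to VGA resolution so their parameters match.
    colorStream.set(CAP_PROP_FRAME_WIDTH, 640);
    colorStream.set(CAP_PROP_FRAME_HEIGHT, 480);
    depthStream.set(CAP_PROP_FRAME_WIDTH, 640);
    depthStream.set(CAP_PROP_FRAME_HEIGHT, 480);

    return 0;
}
```

Opening the color sensor through the regular camera backend and the depth sensor through OpenNI2 is the split the tutorial describes; only the concrete constants may differ from the sample.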

@@ -113,8 +118,9 @@ After the VideoCapture objects are set up you can start reading frames from them
 to avoid one stream blocking while another stream is being read. VideoCapture is not a
 thread-safe class, so you need to be careful to avoid any possible deadlocks or data races.
 
-Example implementation that gets frames from each sensor in a new thread and stores them
-in a list along with their timestamps:
+As there are two video sources that should be read simultaneously, it’s necessary to create two
+threads to avoid blocking. Example implementation that gets frames from each sensor in a new thread
+and stores them in a list along with their timestamps:
 
 @snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Read streams
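
As a rough illustration of that pattern (a sketch under the assumption that a Frame struct and shared mtx, dataReady and isFinish objects exist as declared below; this is not the sample's actual "Read streams" snippet), a depth reader could look like this, with a symmetric thread filling colorFrames from the color stream:

```cpp
#include <opencv2/videoio.hpp>
#include <atomic>
#include <condition_variable>
#include <cstdint>
#include <list>
#include <mutex>

// A captured image together with the moment it was read.
struct Frame
{
    int64_t timestamp;
    cv::Mat frame;
};

std::list<Frame> depthFrames, colorFrames; // frames kept in arrival order
std::mutex mtx;                            // guards both frame lists
std::condition_variable dataReady;         // wakes the main (pairing) thread
std::atomic<bool> isFinish(false);         // set to stop all threads

// Body of the depth-reader thread.
void readDepthFrames(cv::VideoCapture& depthStream)
{
    const std::size_t maxFrames = 64;
    while (!isFinish)
    {
        if (!depthStream.grab())
            break;

        Frame f;
        f.timestamp = cv::getTickCount();
        depthStream.retrieve(f.frame, cv::CAP_OPENNI_DEPTH_MAP);
        if (f.frame.empty())
            break;

        {
            std::lock_guard<std::mutex> lk(mtx);
            if (depthFrames.size() >= maxFrames)
                depthFrames.pop_front(); // drop the oldest frame when the list is full
            depthFrames.push_back(f);
        }
        dataReady.notify_one(); // tell the main thread new data is available
    }
}
```

Such a function would be started with std::thread depthReader(readDepthFrames, std::ref(depthStream)); and joined once isFinish is set.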

@@ -130,17 +136,24 @@ VideoCapture can retrieve the following data:
 
 -# data given from the color sensor is a regular BGR image (CV_8UC3).
 
-When new data is available a reading thread notifies the main thread. A frame is stored in the
-ordered list -- the first frame is the latest one:
+When new data are available a reading thread notifies the main thread using a condition variable.
+A frame is stored in the ordered list -- the first frame is the latest one. As depth and color frames
+are read from independent sources two video streams may become out of sync even when both streams
+are set up for the same frame rate. A post-synchronization procedure can be applied to the streams
+to combine depth and color frames into pairs. The sample code below demonstrates this procedure:
 
-@snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Show color frame
+@snippetlineno samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp Pair frames
 
-Depth frames can be picked the same way from the `depthFrames` list.
+In the code snippet above the execution is blocked until there are some frames in both frame lists.
+When there are new frames, their timestamps are being checked -- if they differ more than a half of
+the frame period then one of the frames is dropped. If timestamps are close enough, then two frames
+are paired. Now, we have two frames: one containing color information and another one -- depth information.
+In the example above retrieved frames are simply shown with cv::imshow function, but you can insert
+any other processing code here.
 
-After that, you'll have two frames: one containing color information and another one -- depth
-information. In the sample images below you can see the color frame and the depth frame showing
-the same scene. Looking at the color frame it's hard to distinguish plant leaves from leaves painted
-on a wall, but the depth data makes it easy.
+In the sample images below you can see the color frame and the depth frame representing the same scene.
+Looking at the color frame it's hard to distinguish plant leaves from leaves painted on a wall,
+but the depth data makes it easy.
 
 ![Color frame](images/astra_color.jpg)
 ![Depth frame](images/astra_depth.png)
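
To put a number on that threshold: with both streams set to 30 fps the frame period is 1/30 s, roughly 33.3 ms, so the maximum allowed timestamp difference is about 16.7 ms. A small sketch of the computation, assuming timestamps measured in nanoseconds as the constant in the sample suggests:

```cpp
#include <cstdint>

// Half of the frame period is the maximum allowed timestamp difference.
// At 30 fps: 1e9 ns / (2 * 30) is about 16.7 million ns, i.e. roughly 16.7 ms.
const double fps = 30.0; // illustrative value; the sample queries CAP_PROP_FPS instead
const int64_t maxTdiff = static_cast<int64_t>(1000000000 / (2 * fps));
```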

samples/cpp/tutorial_code/videoio/orbbec_astra/orbbec_astra.cpp

Lines changed: 45 additions & 38 deletions
@@ -69,7 +69,6 @@ int main()
     //! [Read streams]
     // Create two lists to store frames
     std::list<Frame> depthFrames, colorFrames;
-    std::mutex depthFramesMtx, colorFramesMtx;
     const std::size_t maxFrames = 64;
 
     // Synchronization objects
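
The declarations behind the "Synchronization objects" comment are outside this hunk. Judging from how the rest of the diff uses them, they are presumably along these lines (a sketch, not the actual lines of the sample); the point of the change above is that one shared mutex now protects both lists instead of one mutex per list:

```cpp
#include <atomic>
#include <condition_variable>
#include <mutex>

// Synchronization objects shared by the reader threads and the pairing loop
std::mutex mtx;                     // single mutex guarding both frame lists
std::condition_variable dataReady;  // readers signal that a new frame was stored
std::atomic<bool> isFinish(false);  // set on Esc to stop readers and the pairing loop
```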
@@ -90,16 +89,14 @@ int main()
                 Frame f;
                 f.timestamp = cv::getTickCount();
                 depthStream.retrieve(f.frame, CAP_OPENNI_DEPTH_MAP);
-                //depthStream.retrieve(f.frame, CAP_OPENNI_DISPARITY_MAP);
-                //depthStream.retrieve(f.frame, CAP_OPENNI_IR_IMAGE);
                 if (f.frame.empty())
                 {
                     cerr << "ERROR: Failed to decode frame from depth stream" << endl;
                     break;
                 }
 
                 {
-                    std::lock_guard<std::mutex> lk(depthFramesMtx);
+                    std::lock_guard<std::mutex> lk(mtx);
                     if (depthFrames.size() >= maxFrames)
                         depthFrames.pop_front();
                     depthFrames.push_back(f);
@@ -127,7 +124,7 @@ int main()
                 }
 
                 {
-                    std::lock_guard<std::mutex> lk(colorFramesMtx);
+                    std::lock_guard<std::mutex> lk(mtx);
                     if (colorFrames.size() >= maxFrames)
                         colorFrames.pop_front();
                     colorFrames.push_back(f);
@@ -138,56 +135,66 @@ int main()
     });
     //! [Read streams]
 
-    while (true)
+    //! [Pair frames]
+    // Pair depth and color frames
+    while (!isFinish)
     {
         std::unique_lock<std::mutex> lk(mtx);
-        while (depthFrames.empty() && colorFrames.empty())
+        while (!isFinish && (depthFrames.empty() || colorFrames.empty()))
             dataReady.wait(lk);
 
-        depthFramesMtx.lock();
-        if (depthFrames.empty())
-        {
-            depthFramesMtx.unlock();
-        }
-        else
+        while (!depthFrames.empty() && !colorFrames.empty())
         {
+            if (!lk.owns_lock())
+                lk.lock();
+
             // Get a frame from the list
-            Mat depthMap = depthFrames.front().frame;
+            Frame depthFrame = depthFrames.front();
+            int64 depthT = depthFrame.timestamp;
+
+            // Get a frame from the list
+            Frame colorFrame = colorFrames.front();
+            int64 colorT = colorFrame.timestamp;
+
+            // Half of frame period is a maximum time diff between frames
+            const int64 maxTdiff = int64(1000000000 / (2 * colorStream.get(CAP_PROP_FPS)));
+            if (depthT + maxTdiff < colorT)
+            {
+                depthFrames.pop_front();
+                continue;
+            }
+            else if (colorT + maxTdiff < depthT)
+            {
+                colorFrames.pop_front();
+                continue;
+            }
             depthFrames.pop_front();
-            depthFramesMtx.unlock();
+            colorFrames.pop_front();
+            lk.unlock();
 
+            //! [Show frames]
             // Show depth frame
             Mat d8, dColor;
-            depthMap.convertTo(d8, CV_8U, 255.0 / 2500);
+            depthFrame.frame.convertTo(d8, CV_8U, 255.0 / 2500);
             applyColorMap(d8, dColor, COLORMAP_OCEAN);
             imshow("Depth (colored)", dColor);
-        }
-
-        //! [Show color frame]
-        colorFramesMtx.lock();
-        if (colorFrames.empty())
-        {
-            colorFramesMtx.unlock();
-        }
-        else
-        {
-            // Get a frame from the list
-            Mat colorFrame = colorFrames.front().frame;
-            colorFrames.pop_front();
-            colorFramesMtx.unlock();
 
             // Show color frame
-            imshow("Color", colorFrame);
-        }
-        //! [Show color frame]
+            imshow("Color", colorFrame.frame);
+            //! [Show frames]
 
-        // Exit on Esc key press
-        int key = waitKey(1);
-        if (key == 27) // ESC
-            break;
+            // Exit on Esc key press
+            int key = waitKey(1);
+            if (key == 27) // ESC
+            {
+                isFinish = true;
+                break;
+            }
+        }
     }
+    //! [Pair frames]
 
-    isFinish = true;
+    dataReady.notify_one();
     depthReader.join();
     colorReader.join();
