Capture the Camera H264 frame ("/dev/video0") and send it to the WebRTC channel #583
-
Hope you are doing great! I am working on a WebRTC-based video chat application, and after exploring many options I came across the libdatachannel library. I have gone through multiple examples, run them on an Ubuntu machine, and read the API documentation. I am now confident about this library, so please confirm whether I can use it for the following requirement. Your early response will be highly appreciated.

My requirement is to develop a native WebRTC-based video chat application that can read H264 frames from the camera (/dev/video0) and send that media over the WebRTC channel. Kindly note: my embedded device has no browser support, but it has a 4-core CPU, 2 GB RAM, hardware codecs, an HDMI display, USB camera support, etc.

I have gone through the streamer example code (C++) and learned that it reads video and audio data from files (my requirement is to read the same from a UVC camera) and sends them over the WebRTC channel, and on the other side a browser (HTML, JS) is able to play the video. Similarly, in the media example a browser (HTML, JS) reads frames from the camera and sends them over the WebRTC channel, and the media example (C++) on the other side forwards the frames over UDP; I was able to play that video with GStreamer.

However, I couldn't find any C++ example that reads data from a camera and sends it over the WebRTC channel. Do we have any C++ APIs in this library that can read live data from a camera and send it over the WebRTC channel? I don't have a browser option.

Kindly note: I am not worried about the receiver side, because I can leverage the media example and play the stream on my local display using GStreamer. The only blocker is that I couldn't find any C++ APIs that can handle camera input for WebRTC media handling.

Regards,
Replies: 3 comments
-
Hi, libdatachannel is a network library; media capture is out of its scope and must be achieved with other software. This question is quite similar to the one I answered here. You can actually use the same approach as in the media example with an external gstreamer pipeline, but in reverse: gstreamer captures from the webcam and sends RTP packets to a local UDP socket, from which you read the packets and forward them into a track.
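To make the reverse pipeline concrete, the capture side might look like the sketch below. The port number (6000), payload type (96), and encoder element are assumptions to adapt to your setup; a UVC camera that already outputs H264 could use a `video/x-h264` caps filter with `h264parse` instead of re-encoding with `x264enc`.

```shell
# Sketch: capture from the UVC camera, encode to H264, and send RTP to a
# local UDP port. Port 6000 and payload type 96 are arbitrary choices and
# must match whatever the C++ side reads and announces in its SDP.
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert \
    ! x264enc tune=zerolatency bitrate=1000 key-int-max=30 \
    ! rtph264pay pt=96 mtu=1200 \
    ! udpsink host=127.0.0.1 port=6000
```

Keeping the RTP MTU small (here 1200) matters because the packets are forwarded as-is into SRTP, which adds its own overhead.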
-
@paullouisageneau Many thanks for your clear answer. I will try the suggested way and get back to you.
-
I ran into this early in my project and took the advice @paullouisageneau gave. After a lot of trial and error I finally got it stable. Check out my solution here, where I have libdatachannel and gstreamer working together.
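For anyone following the same route, the C++ forwarding side can be sketched roughly as below, modeled on libdatachannel's media example but in the sending direction. This is a sketch, not a complete program: signaling and peer connection setup are omitted, and the port (6000), payload type (96), and SSRC value are assumptions that must match the gstreamer pipeline.

```cpp
// Sketch: read RTP packets (sent by gstreamer to UDP port 6000) and forward
// them into a libdatachannel video track. Assumes a rtc::PeerConnection `pc`
// whose signaling is handled elsewhere.
#include <rtc/rtc.hpp>

#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

#include <memory>

std::shared_ptr<rtc::Track> addCameraTrack(rtc::PeerConnection &pc, rtc::SSRC ssrc) {
    rtc::Description::Video media("video", rtc::Description::Direction::SendOnly);
    media.addH264Codec(96);            // payload type 96, matching rtph264pay pt=96
    media.addSSRC(ssrc, "video-send"); // announce the SSRC we will rewrite onto packets
    return pc.addTrack(media);
}

void forwardRtp(std::shared_ptr<rtc::Track> track, rtc::SSRC ssrc) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr = {};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(6000);       // gstreamer's udpsink port
    bind(sock, reinterpret_cast<sockaddr *>(&addr), sizeof(addr));

    char buffer[2048];
    while (true) {
        ssize_t len = recv(sock, buffer, sizeof(buffer), 0);
        if (len < static_cast<ssize_t>(sizeof(rtc::RtpHeader)) || !track->isOpen())
            continue;
        // Rewrite the SSRC so it matches the one announced in the SDP.
        auto *rtp = reinterpret_cast<rtc::RtpHeader *>(buffer);
        rtp->setSsrc(ssrc);
        track->send(reinterpret_cast<const std::byte *>(buffer),
                    static_cast<size_t>(len));
    }
    close(sock);
}
```

The key design point, as in the media example, is that libdatachannel never touches the camera: gstreamer produces ready-made RTP, and the application only rewrites the SSRC and hands the packets to the track.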