-
I've only had a quick look on my phone, but it looks like you're encoding H.264 with the preset ultrafast, which won't be supported by most browsers (if any).
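For what it's worth, here is a hedged sketch of an encoder configuration commonly used when the target is browser decoding: constrained baseline profile, yuv420p, no B-frames, and parameter sets kept in-band. The values are a generic starting point, not the settings from the attached project:

```cpp
// Sketch: configuring libx264 via FFmpeg for output browsers can typically decode.
// Values are a common WebRTC-friendly starting point; error handling is trimmed.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
}

AVCodecContext *makeBrowserFriendlyEncoder(int width, int height, int fps) {
    const AVCodec *codec = avcodec_find_encoder_by_name("libx264");
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    ctx->width = width;
    ctx->height = height;
    ctx->pix_fmt = AV_PIX_FMT_YUV420P;       // browsers expect 4:2:0 chroma
    ctx->time_base = AVRational{1, fps};
    ctx->max_b_frames = 0;                   // no B-frames for low-latency streaming
    av_opt_set(ctx->priv_data, "profile", "baseline", 0);   // widely supported profile
    av_opt_set(ctx->priv_data, "preset", "ultrafast", 0);   // speed preset
    av_opt_set(ctx->priv_data, "tune", "zerolatency", 0);   // low-latency tuning
    // Repeat SPS/PPS with every keyframe so a decoder can join mid-stream.
    // Note: do not set AV_CODEC_FLAG_GLOBAL_HEADER, which would move them to extradata.
    av_opt_set(ctx->priv_data, "x264-params", "repeat-headers=1", 0);
    avcodec_open2(ctx, codec, nullptr);
    return ctx;
}
```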
-
I'm still struggling with this. I've simplified my use case to eliminate variables by only trying to stream a prerecorded .mp4 file. I've confirmed that the video is playable by the browser by putting the file into a <video> element directly. I'm demuxing the file with FFmpeg and collecting the resulting AVPackets in a vector.
I'm then indexing into that vector and sending the packets to libdatachannel from a sendFrame function, along the lines of the sketch below.
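This is a minimal sketch of that kind of send path, assuming a recent libdatachannel where an H264RtpPacketizer can be set directly as the track's media handler (older versions wrap it in an H264PacketizationHandler), roughly as in the library's streamer example. The track setup, globals, and the ptsSeconds parameter are illustrative, not the project's actual code:

```cpp
// Sketch (not the project's actual code): push demuxed AVPackets to a
// libdatachannel video track.
#include <rtc/rtc.hpp>
#include <cstddef>
#include <cstdint>
#include <memory>

extern "C" {
#include <libavcodec/avcodec.h>
}

static std::shared_ptr<rtc::Track> gTrack;
static std::shared_ptr<rtc::RtpPacketizationConfig> gRtpConfig;

void setupVideoTrack(const std::shared_ptr<rtc::PeerConnection> &pc) {
    const uint32_t ssrc = 42;          // illustrative SSRC and payload type
    const uint8_t payloadType = 96;

    rtc::Description::Video media("video", rtc::Description::Direction::SendOnly);
    media.addH264Codec(payloadType);
    media.addSSRC(ssrc, "video-send");
    gTrack = pc->addTrack(media);

    gRtpConfig = std::make_shared<rtc::RtpPacketizationConfig>(
        ssrc, "video-send", payloadType, rtc::H264RtpPacketizer::defaultClockRate);

    // Packets demuxed from an .mp4 are AVCC (4-byte length prefixes), so
    // Separator::Length matches; Annex B input would use LongStartSequence.
    auto packetizer = std::make_shared<rtc::H264RtpPacketizer>(
        rtc::H264RtpPacketizer::Separator::Length, gRtpConfig);
    gTrack->setMediaHandler(packetizer);
}

void sendFrame(const AVPacket *pkt, double ptsSeconds) {
    if (!gTrack || !gTrack->isOpen())
        return;

    // RTP timestamps run at 90 kHz, relative to the random start timestamp.
    gRtpConfig->timestamp = gRtpConfig->startTimestamp +
        static_cast<uint32_t>(ptsSeconds * rtc::H264RtpPacketizer::defaultClockRate);

    gTrack->send(reinterpret_cast<const std::byte *>(pkt->data),
                 static_cast<size_t>(pkt->size));
}
```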
I just seem to be missing something here. The handshake is happening, the frames are getting sent, but no video is ever shown. Am I handling the NALUs wrong? Is there some kind of diagnostic log on the browser side beyond the console or chrome://webrtc-internals? I don't even know what the problem is.
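One thing worth checking on the NALU side, offered as a possibility rather than a confirmed diagnosis: packets demuxed from an .mp4 are in AVCC form (each NAL unit prefixed by a 4-byte length), and the SPS/PPS parameter sets sit in the stream's extradata rather than in the packets, so the browser's decoder may never receive them. A common FFmpeg-side workaround is the h264_mp4toannexb bitstream filter, which rewrites packets to Annex B start codes and injects the parameter sets in-band (the packetizer separator would then be a start-sequence variant rather than Length). A rough sketch with illustrative names:

```cpp
// Sketch: convert demuxed MP4 (AVCC) packets to Annex B with in-band SPS/PPS
// using FFmpeg's h264_mp4toannexb bitstream filter. Error handling is trimmed.
#include <functional>

extern "C" {
#include <libavcodec/avcodec.h>
#include <libavcodec/bsf.h>
#include <libavformat/avformat.h>
}

AVBSFContext *makeAnnexBFilter(const AVStream *videoStream) {
    const AVBitStreamFilter *filter = av_bsf_get_by_name("h264_mp4toannexb");
    AVBSFContext *bsf = nullptr;
    av_bsf_alloc(filter, &bsf);
    avcodec_parameters_copy(bsf->par_in, videoStream->codecpar); // carries the SPS/PPS extradata
    bsf->time_base_in = videoStream->time_base;
    av_bsf_init(bsf);
    return bsf;
}

// For each demuxed packet: run it through the filter, then hand the result to
// whatever pushes it onto the track (sendToTrack is a stand-in name).
void filterAndSend(AVBSFContext *bsf, AVPacket *pkt,
                   const std::function<void(const AVPacket *)> &sendToTrack) {
    av_bsf_send_packet(bsf, pkt); // the filter takes ownership of pkt's contents
    AVPacket *out = av_packet_alloc();
    while (av_bsf_receive_packet(bsf, out) == 0) {
        sendToTrack(out);         // now Annex B: start codes plus in-band SPS/PPS
        av_packet_unref(out);
    }
    av_packet_free(&out);
}
```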
-
I'm trying to stream the graphical output of a VTK window through FFmpeg and libdatachannel to a web browser. I've gotten it to the point where I can write the graphical output to an MP4 file with the H.264 codec. I've adapted the libdatachannel streamer example into a single-threaded implementation with copy/paste SDP exchange with the browser for now, taking the AVPackets and sending them to libdatachannel instead of to the MP4 file. I can get the peer connection up and connected, and I can send frames across, but nothing shows up in the <video> element. Looking at chrome://webrtc-internals, I can see that we're receiving frames and bytes across the transport, but no frames get decoded on the RTP connection. What am I doing wrong?

Attachment: graphics_stream_poc.tar.gz
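For reference, a minimal sketch of what the copy/paste SDP exchange can look like on the native side with libdatachannel, assuming the track is added before the local description is created. The STUN server and the addVideoTrack helper are placeholders, not the code from the attached archive:

```cpp
// Sketch: manual (copy/paste) SDP exchange with libdatachannel.
#include <rtc/rtc.hpp>
#include <iostream>
#include <memory>
#include <string>

int main() {
    rtc::Configuration config;
    config.iceServers.emplace_back("stun:stun.l.google.com:19302"); // placeholder STUN server

    auto pc = std::make_shared<rtc::PeerConnection>(config);

    pc->onGatheringStateChange([&pc](rtc::PeerConnection::GatheringState state) {
        if (state == rtc::PeerConnection::GatheringState::Complete) {
            // Print the full offer (with gathered candidates) to paste into the browser page.
            std::cout << std::string(pc->localDescription().value()) << std::endl;
        }
    });

    // addVideoTrack(pc);  // hypothetical helper: adds the H.264 track and packetizer.
    // The track must exist before setLocalDescription() so the offer has a media section.

    pc->setLocalDescription(); // creates the offer and starts ICE gathering

    // Paste the browser's answer into stdin, terminated by an empty line.
    std::string sdp, line;
    while (std::getline(std::cin, line) && !line.empty())
        sdp += line + "\r\n";
    pc->setRemoteDescription(rtc::Description(sdp, "answer"));

    // Keep the process alive while frames are pushed on the track elsewhere.
    std::cin.get();
    return 0;
}
```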