Send WebRTC video over RTMP to server for live broadcasting?
I am implementing live broadcasting in my Android app. I am using WebRTC for real-time video chat. Now I would like to broadcast the live chat to many people, and I would like to use MPEG-DASH for that. The video stream can be sent to the server over RTMP and then broadcast using MPEG-DASH.
So I would like to know how to capture the media stream of both the local and the remote user and then send it over RTMP. I have a working prototype for sending camera-captured video to the server over RTMP, but I don't know how to send the same media stream that WebRTC is using to the server. Possible solutions:
- Record/capture the screen of the live chat and then send it to the server over RTMP.
- Make the server a peer in the WebRTC session, manipulate the stream, and broadcast it via MPEG-DASH.
I would prefer to do this on the client side. Is there any other way to do this? Thanks.
Solution 1:[1]
You should use a WebRTC SFU to forward packets to the apps, and to convert WebRTC to RTMP. It works like this:
android app --WebRTC--> Server -+--WebRTC--> android app
                                |
                                +--RTMP--> live streaming platform
                                |
                                +--HLS/DASH--> player
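The RTMP-to-DASH leg of the diagram above can be sketched with a stock FFmpeg command. This is only an illustration: the RTMP URL and output path are hypothetical placeholders, and in practice the SFU itself (or a media server in front of the player) would do this packaging.

```shell
# Pull the RTMP stream produced by the server (URL is a placeholder)
# and remux it, without re-encoding, into MPEG-DASH segments plus
# an MPD manifest that a DASH player can fetch.
ffmpeg -i rtmp://example.com/live/stream \
       -c copy -f dash -seg_duration 4 \
       /var/www/dash/stream.mpd
```

Because `-c copy` avoids re-encoding, this adds little latency, but it requires the RTMP stream's codecs (typically H.264/AAC) to already be DASH-compatible.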
Because WebRTC is always encrypted with DTLS, you should convert the stream on an SFU server rather than trying to hack the stream on the client.
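One open-source SFU that can bridge WebRTC to RTMP is SRS. The fragment below is a minimal sketch of what such a server configuration can look like; the exact directive names and defaults are assumptions and should be checked against the SRS documentation for your version:

```
# Illustrative SRS-style config (directive names are assumptions;
# verify against your SRS version's documentation).
listen              1935;       # RTMP ingest/egress
http_server {
    enabled         on;
    listen          8080;       # serves HLS/DASH segments to players
}
rtc_server {
    enabled         on;
    listen          8000;       # WebRTC media over UDP
}
vhost __defaultVhost__ {
    rtc {
        enabled     on;
        rtc_to_rtmp on;         # bridge a WebRTC publish into RTMP
    }
    dash {
        enabled     on;         # package the RTMP stream as MPEG-DASH
    }
}
```

With a setup like this, the Android app publishes once over WebRTC; the server then fans it out to other WebRTC peers and simultaneously repackages it for RTMP and DASH consumers.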
Please read more in this post.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Winlin |
