RTMP FFmpeg muxing example
I'm trying to use the FFmpeg muxing example to send content over RTMP. When I modify the example to stream to a YouTube channel, the live result isn't right:
av_log_set_level(AV_LOG_DEBUG);
avformat_alloc_output_context2(&oc, NULL, "flv", "rtmp://x.rtmp.youtube.com/live2/KEY");
The log seems fine:
[flv @ 0000016CDFB97640] intra_quant_bias = 0 inter_quant_bias = -64
[SWR @ 0000016CDFBD5140] Using s16p internally between filters
Output #0, flv, to 'rtmp://x.rtmp.youtube.com/live2/KEY':
Stream #0:0, 0, 1/30: Video: flv1, 1 reference frame, yuv420p, 256x144 (0x0), 0/1, q=2-31, 400 kb/s, 30 tbn
Stream #0:1, 0, 1/44100: Audio: adpcm_swf, 44100 Hz, stereo, s16, 352 kb/s
[rtmp @ 0000016CDFB63300] No default whitelist set
[tcp @ 0000016CDFB7E780] No default whitelist set
[tcp @ 0000016CDFB7E780] Original list of addresses:
[tcp @ 0000016CDFB7E780] Address 142.250.178.172 port 1935
[tcp @ 0000016CDFB7E780] Interleaved list of addresses:
[tcp @ 0000016CDFB7E780] Address 142.250.178.172 port 1935
[tcp @ 0000016CDFB7E780] Starting connection attempt to 142.250.178.172 port 1935
[tcp @ 0000016CDFB7E780] Successfully connected to 142.250.178.172 port 1935
[rtmp @ 0000016CDFB63300] Handshaking...
[rtmp @ 0000016CDFB63300] Type answer 3
[rtmp @ 0000016CDFB63300] Server version 4.0.0.1
[rtmp @ 0000016CDFB63300] Proto = rtmp, path = /live2/KEY, app = live2, fname = KEY
[rtmp @ 0000016CDFB63300] Window acknowledgement size = 2500000
[rtmp @ 0000016CDFB63300] Max sent, unacked = 57000000
[rtmp @ 0000016CDFB63300] Releasing stream...
[rtmp @ 0000016CDFB63300] FCPublish stream...
[rtmp @ 0000016CDFB63300] Creating stream...
[rtmp @ 0000016CDFB63300] Sending publish command for 'KEY'
On YouTube, the issue is as follows: after about 1 minute 20 seconds of streaming, the timeline already shows a length of roughly 10 minutes.
My guess is that YouTube is receiving more than 25 frames per second and doesn't discard the excess. The timeline therefore grows, producing a video in which all the frames are 1/25 of a second apart, which doesn't correspond to the actual streaming duration. I can provide the PTS/DTS values if needed.
I'm not sure how to make the AVCodecContext wait.
Any help?
Solution 1:[1]
From this RTMP FFmpeg example, which is a little old, I used av_usleep:
// inside write_frame
/* Rescale output packet timestamps from the codec to the stream timebase. */
av_packet_rescale_ts(pkt, c->time_base, st->time_base);
pkt->stream_index = st->index;

/* Throttle: sleep until the packet's timestamp catches up with the wall
 * clock, so packets go out in real time rather than as fast as possible.
 * start_time is av_gettime() captured before the first packet is written. */
AVRational time_base = st->time_base;
AVRational time_base_q = { 1, AV_TIME_BASE };
int64_t pts_time = av_rescale_q(pkt->dts, time_base, time_base_q);
int64_t now_time = av_gettime() - start_time;
if (pts_time > now_time) {
    av_usleep(pts_time - now_time);
}

/* Write the compressed frame to the media file. */
log_packet(fmt_ctx, pkt);
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | pierre tardif |
