How to record realtime video from a webcam to a JPEG image sequence using FFmpeg?
Using the info provided in this question, I am able to record realtime footage from the built-in webcam on a MacBook. The fact that I am using macOS shouldn't matter, since it only affects the video source, but I am mentioning it just in case:
ffmpeg -y -f avfoundation -framerate 30 -i "0" -preset ultrafast -update 1 -r 100 output.jpg
Then, I am using a custom C++ program to continuously send the entire output.jpg as a UDP packet (it is guaranteed to fit in the packet size limit).
The problem with this approach is a race condition between the C++ program and ffmpeg: the program can read output.jpg while ffmpeg is partway through overwriting it, so a torn or partially written frame may be sent.
In order to avoid it, I would like to use a circular buffer between them. By this I mean that ffmpeg would write consecutive frames to output1.jpg, output2.jpg, output3.jpg, ..., outputN.jpg and then start over from output1.jpg. The buffer is circular so that the disk doesn't fill up with incoming frames.
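One way to get the circular file sequence without extra code may be ffmpeg's segment muxer, whose `-segment_wrap` option reuses output names once the index reaches the limit. A sketch under assumptions, reusing the avfoundation input from the command above; the ring size (10) and the per-frame segment duration are illustrative and would need tuning:

```shell
# Sketch: encode each frame as its own "segment" and wrap after 10 files,
# so ffmpeg keeps cycling through output0.jpg ... output9.jpg.
ffmpeg -y -f avfoundation -framerate 30 -i "0" \
  -c:v mjpeg -f segment -segment_time 0.01 \
  -segment_format image2 -segment_wrap 10 "output%d.jpg"
```

Note this still leaves a window where the reader can open the file currently being rewritten, so the reader would need some staleness check (e.g. skipping the most recently modified file).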
I would like to ask how I could achieve that, or whether a better approach would solve the problem. I was also thinking of writing consecutive frames to an image sequence without looping (i.e. output1.jpg, output2.jpg, output3.jpg, ...) and having the C++ program delete each file after it has been read and sent. The program should be able to keep up with ffmpeg.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
