Embed custom metadata in a video stream
I have an optical system that provides a UDP video stream.
From device specification FAQ:
Both a standalone metadata (KLV) stream and compressed video (H.264) with metadata (KLV) are available on the Ethernet link. Compressed video and metadata are coupled in the same stream, compliant with the STANAG 4609 standard. Each encoded video stream is encapsulated with the associated metadata within an MPEG-TS single program stream over Ethernet UDP/IP. The video and metadata are synchronized through the use of timestamps.
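As background on the KLV framing mentioned above, here is a minimal sketch of how a KLV (Key-Length-Value) triplet is packed and unpacked: a 16-byte Universal Label key, a BER-encoded length, then the value bytes. The UL constant shown is illustrative; consult ST 0601 for the authoritative key of the UAS Datalink Local Set.

```python
# Illustrative UL (check ST 0601 for the authoritative value).
UAS_LOCAL_SET_UL = bytes.fromhex("060E2B34020B01010E01030101000000")

def ber_length(n: int) -> bytes:
    """BER length: short form for n < 128, long form (0x80 | byte count) otherwise."""
    if n < 128:
        return bytes([n])
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(body)]) + body

def klv_pack(key: bytes, value: bytes) -> bytes:
    """Frame a value as a single KLV triplet."""
    return key + ber_length(len(value)) + value

def klv_unpack(buf: bytes):
    """Return (key, value) for one KLV triplet with a short- or long-form length."""
    key, i = buf[:16], 16
    first = buf[i]
    if first < 128:
        length, i = first, i + 1
    else:
        n = first & 0x7F
        length, i = int.from_bytes(buf[i + 1:i + 1 + n], "big"), i + 1 + n
    return key, buf[i:i + length]

triplet = klv_pack(UAS_LOCAL_SET_UL, b"hello")
key, value = klv_unpack(triplet)
print(key == UAS_LOCAL_SET_UL, value)  # → True b'hello'
```

Inside the local set, individual items (velocity, coordinates, etc.) are themselves tag-length-value encoded, so the same framing idea nests one level down.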
There are also other devices that provide data about the state of an aircraft (velocity, coordinates, etc.). This data should be displayed on a client GUI alongside the video, and of course it has to be synchronized with the current video frame.
One of the approaches I thought of is to embed this data into the video stream, but I am not sure whether that is possible, or whether I should use a different protocol (other than UDP) for this purpose.
Is it possible/reasonable to use such an approach? Is the ffmpeg library suitable in this case? If not, what are other ways to synchronize data with a video frame? Latency is crucial, and bandwidth is limited to 2-5 Mbps.
It seems to be possible using FFmpeg: an AVPacket can carry additional data via the function av_packet_add_side_data, which takes a preallocated buffer, its size, and an AVPacketSideDataType.
However, I am not sure for now which enum value of AVPacketSideDataType can be used for custom user-provided binary data.
Something similar that might be used for my needs:
Solution 1:
The quote sounds like you have a transport stream containing two elementary streams (the H.264 video in one, and the KLV data in another). The transport stream is sent over UDP (or TCP, or is just a file, whatever you want - it's mostly independent of the transport).
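To make the "two elementary streams in one transport stream" idea concrete, here is a minimal sketch of MPEG-TS demuxing (standard framing, not device-specific): each TS packet is 188 bytes, starts with sync byte 0x47, and carries a 13-bit PID identifying its elementary stream. Splitting the UDP payload by PID is how the video and the KLV metadata are separated.

```python
def parse_ts_header(packet: bytes):
    """Parse the 4-byte MPEG-TS packet header; return (pid, payload_unit_start)."""
    if len(packet) != 188 or packet[0] != 0x47:
        raise ValueError("not a valid 188-byte MPEG-TS packet")
    payload_unit_start = bool(packet[1] & 0x40)          # PUSI flag
    pid = ((packet[1] & 0x1F) << 8) | packet[2]          # 13-bit PID
    return pid, payload_unit_start

# Synthetic packet: sync byte, PUSI set, PID 0x101 (hypothetical video PID).
pkt = bytes([0x47, 0x41, 0x01, 0x10]) + bytes(184)
print(parse_ts_header(pkt))  # → (257, True)
```

In practice you would read the PAT/PMT tables to learn which PID carries H.264 and which carries the KLV stream, rather than hard-coding PIDs.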
There is a discussion of implementing this kind of thing in the Motion Imagery Handbook (which you can download from the MISB part of the NSG Registry at https://nsgreg.nga.mil/misb.jsp - it's towards the bottom of the Non-cited Standards Documents table) and in detail in ST 1402 (which you can find in the same table). I'm avoiding providing direct links because the versions change - just look for whatever is current.
The short version is that you can embed the timestamp in the video (see ST 0603 and ST 0604), and then correlate that to the metadata timestamp (Precision Time Stamp, see ST 0601). You don't want to do that at the AVPacket level though. Instead, you need to put side data into AVFrame, with the AV_FRAME_DATA_SEI_UNREGISTERED key (https://ffmpeg.org/doxygen/trunk/group__lavu__frame.html#ggae01fa7e427274293aacdf2adc17076bca4f2dcaee18e5ffed8ff4ab1cc3b326aa). You will need a fairly recent FFmpeg version.
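The correlation step above can be sketched as follows (assumptions: both the video frames and the metadata records carry microsecond Precision Time Stamps, per ST 0601/0603/0604; the function name is illustrative). Given a frame's timestamp and a sorted list of metadata timestamps, pick the metadata record nearest in time:

```python
import bisect

def nearest_metadata(frame_ts_us: int, meta_ts_us: list) -> int:
    """Return the index of the metadata timestamp closest to the frame's."""
    i = bisect.bisect_left(meta_ts_us, frame_ts_us)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(meta_ts_us)]
    return min(candidates, key=lambda j: abs(meta_ts_us[j] - frame_ts_us))

meta = [1_000_000, 1_033_366, 1_066_733]   # e.g. metadata at roughly 30 Hz
print(nearest_metadata(1_040_000, meta))   # → 1
```

The binary search keeps the lookup cheap even with a long metadata history, which matters when latency is a constraint.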
Note: if all you want to do is see the UDP data stream - video on one side, and decoded KLV on the other, then you might like to check out the jMISB Viewer application: https://github.com/WestRidgeSystems/jmisb It also provides an example of encoding (generator example). Disclaimer: I contribute to the project.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
