Dear all
I have a camera that emits an H.264 stream via RTSP.
This stream follows two paths: the first stores the stream as raw H.264 (this is handled by an external application) and the second performs some real-time video analysis.
The second path generates events (like moving objects) which should be stored in a database and later displayed as an overlay on the recorded videos coming from path one.
As the recording and the video analysis might be slightly shifted in time (say a random number of frames is dropped before the external application starts recording), I cannot rely on the frame number. I would also like to avoid relying on NTP to synchronize timestamps, because even a difference of a few milliseconds might lead to a bad user experience with the overlaid info.
I am not an expert in H.264, and so far I've used OpenCV to decode frames, but I've read that H.264 streams are packed into NAL units. Are these units numbered in some way, and if so, can this info be extracted using GStreamer? If a counter were naturally embedded in the H.264 stream and could be read out, I think that would be the best synchronization option.
Before I start editing OpenCV's source code or changing the video analysis application to use pure GStreamer, I need to know whether this can be achieved in some way.
Thanks in advance for your help!
On Mon, Apr 29, 2019 at 4:55 AM, Andressio <[hidden email]> wrote:
> Are these units numbered in some way, and if so, can this info be extracted using GStreamer?

Not really. In general you should avoid storing raw H.264 for this purpose. Instead use a container, like streamable Matroska or MPEG-TS.
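For example, a recording pipeline along these lines would keep per-frame timestamps in the container instead of throwing them away. This is just an untested sketch using the GStreamer Python bindings; the RTSP URL, the latency value and the output filename are placeholders:

# Sketch: record the camera's RTSP H.264 stream into a streamable Matroska
# file instead of raw .h264, so per-frame timestamps end up in the container.
# "rtsp://camera/stream" and "recording.mkv" are placeholders.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

pipeline = Gst.parse_launch(
    "rtspsrc location=rtsp://camera/stream latency=200 ! "
    "rtph264depay ! h264parse ! "
    "matroskamux streamable=true ! "
    "filesink location=recording.mkv"
)
pipeline.set_state(Gst.State.PLAYING)

try:
    GLib.MainLoop().run()                        # record until interrupted
except KeyboardInterrupt:
    pipeline.send_event(Gst.Event.new_eos())     # let the muxer finalize the file
    bus = pipeline.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS)
    pipeline.set_state(Gst.State.NULL)

mpegtsmux should work similarly if MPEG-TS is preferred over Matroska.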
If I understand correctly, you are saying that if I encapsulate the H.264 stream in, for example, a Matroska container, I will be able to retrieve specific frames. I guess this can be done in the recording application, but there would still be no way to obtain a frame reference in the analytics application. To solve this sync problem, the emitting camera would have to be able to tag or mark frames. Aren't even the keyframes numbered?
On Monday, April 29, 2019 at 10:14 -0500, Andressio wrote:
> I guess this can be done in the recording application, but there would still be no way to obtain a frame reference in the analytics application.

It's the container's (Matroska, TS, ISO MP4, RTP, etc.) job to store metadata about frames. I don't have a good view of your application design, but if you do this well, the metadata (usually a timestamp or sequence number) should travel through. Saving raw H.264 to disk strips off any such information; you'd be lucky even to figure out the original framerate.

Nicolas
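For illustration, a rough (untested) sketch of how the analytics branch could read those timestamps: an appsink delivers each decoded frame together with its buffer PTS, which can be stored alongside every detected event. The RTSP URL is a placeholder, and avdec_h264 comes from gst-libav:

# Sketch: decode the same RTSP stream on the analytics side and read each
# frame's presentation timestamp (buffer PTS) at an appsink, so detected
# events can be stored keyed by timestamp rather than by frame number.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch(
    "rtspsrc location=rtsp://camera/stream latency=200 ! "
    "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! "
    "video/x-raw,format=BGR ! "
    "appsink name=sink sync=false max-buffers=2 drop=true"
)
appsink = pipeline.get_by_name("sink")
pipeline.set_state(Gst.State.PLAYING)

while True:
    sample = appsink.emit("pull-sample")         # blocks; returns None on EOS
    if sample is None:
        break
    buf = sample.get_buffer()
    pts_ns = buf.pts                             # presentation timestamp in nanoseconds
    # ... run the motion analysis on the frame, then store the event
    #     in the database keyed by pts_ns ...
    print("frame at %.3f s" % (pts_ns / Gst.SECOND))

pipeline.set_state(Gst.State.NULL)

Whether these timestamps line up with the ones written into the recorded file depends on both applications deriving them from the same source (e.g. the camera's RTP/RTCP timing) rather than from local wall-clock time; that is the part that needs to be done carefully.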