Hi,

I am streaming video over RTSP, and I have access to both the server and the client. I want to embed a timestamp in each frame so that I can read it back from the client pipeline. So far I have tried the following options:

1. Set the PTS of each frame manually at the server. But the absolute PTS value is not preserved on the client, so this doesn't work.
2. Write into the GstBuffer metadata. But GstBuffer metadata is not preserved beyond the server, so the client cannot access it.
3. Replace the first pixel of each frame buffer with the timestamp. The client can read the first pixel data and recover the timestamp. Although this works, it's not an ideal solution.

Please help me find a way to encode/embed the timestamp into the frame at the server so that I can retrieve it from the client pipeline.

Thanks.

--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
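For reference, option 3 boils down to overwriting the first few bytes of the raw frame with a serialized timestamp. A minimal sketch in plain Python (no GStreamer; the 8-byte big-endian nanosecond layout is an assumption, and as noted later in the thread this only survives raw or lossless video, since a lossy encoder will distort the pixel values):

```python
import struct

TS_BYTES = 8  # a 64-bit timestamp overwrites the first 8 bytes (first pixels)

def embed_timestamp(frame: bytearray, ts_ns: int) -> None:
    """Overwrite the start of a raw frame buffer with a big-endian timestamp."""
    frame[:TS_BYTES] = struct.pack(">Q", ts_ns)

def extract_timestamp(frame: bytes) -> int:
    """Read the timestamp back out of the start of a raw frame buffer."""
    return struct.unpack(">Q", frame[:TS_BYTES])[0]
```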
It would help to know what you are trying to accomplish by embedding a custom time code, because there may be a workaround you are not aware of. For instance, to synchronize several players you can use a GStreamer network clock.

Remember that RTSP is a standard protocol, and GStreamer will convert what you are doing so that it plays nicely with clients that might not be GStreamer. One approach I can think of is to send a data stream alongside the video, so you would have video + (maybe audio) + data, i.e. three channels. That way you could use all of the facilities GStreamer and RTSP provide to keep those data frames (roughly) in sync with the video. To accomplish this, I would use an appsrc in your RTSP server. Granted, I have never tried this before, so your mileage may vary.

You might also run into issues with #3 if you are encoding your frames, as the encoder may destroy or distort that information.
Thanks Michael for your reply. I am trying to send two parallel streams: one video stream and one metadata stream. The metadata stream contains metadata for each video frame, for example per-frame inference results. Both streams have the same framerate.

To implement this, I planned to send the metadata through normal sockets and the video stream through the GStreamer RTSP server. To sync each video frame with its metadata at the client, I am planning to use a timestamp, which is why I want to send timestamps along with the video frames. If you can suggest a better way or an example of how to do this, it would be really helpful.
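Whichever transport carries the metadata, the client still has to pair each decoded frame with the metadata record whose timestamp is closest. A sketch of that pairing step in plain Python (function name and tolerance are illustrative; records are assumed sorted by timestamp, as they would be arriving in order from a socket):

```python
import bisect

def match_metadata(frame_ts_ns, metadata_records, tolerance_ns=8_000_000):
    """Return the payload of the metadata record whose timestamp is nearest
    to frame_ts_ns, or None if nothing falls within the tolerance.
    metadata_records: list of (ts_ns, payload) tuples, sorted by ts_ns."""
    timestamps = [ts for ts, _ in metadata_records]
    i = bisect.bisect_left(timestamps, frame_ts_ns)
    # The nearest record is either just before or just after the insert point.
    candidates = []
    if i > 0:
        candidates.append(metadata_records[i - 1])
    if i < len(metadata_records):
        candidates.append(metadata_records[i])
    best = min(candidates, key=lambda r: abs(r[0] - frame_ts_ns), default=None)
    if best is not None and abs(best[0] - frame_ts_ns) <= tolerance_ns:
        return best[1]
    return None
```

The tolerance guards against drift between the two transports: if no record lands within (here) 8 ms of the frame, the frame is treated as having no metadata rather than being paired with the wrong record.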
The best thing I can think of is a variant on your idea: employ a second stream that carries your metadata in a binary format, loaded through an appsrc, and then use rtpvrawpay to payload it for the RTSP server. I am not sure what the "proper" way to send this kind of information would be. The drawback is that you would need metadata for every frame, but the two streams should at least stay fairly well synchronized. I have seen data streams like this in the wild, I just haven't had the need to implement one myself, so there is probably a cleaner implementation.
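For the binary-format metadata stream suggested above, one simple framing that could be pushed into an appsrc (or sent over a plain socket) is a fixed header of timestamp plus payload length, followed by the payload. The field layout here is purely an assumption for illustration, not an established format:

```python
import struct

HEADER = ">QI"  # big-endian: 8-byte timestamp (ns), 4-byte payload length
HEADER_SIZE = struct.calcsize(HEADER)

def pack_record(ts_ns: int, payload: bytes) -> bytes:
    """Serialize one per-frame metadata record: timestamp + length + payload."""
    return struct.pack(HEADER, ts_ns, len(payload)) + payload

def unpack_record(buf: bytes):
    """Parse a record produced by pack_record; returns (ts_ns, payload)."""
    ts_ns, length = struct.unpack_from(HEADER, buf)
    return ts_ns, buf[HEADER_SIZE:HEADER_SIZE + length]
```

Because every record carries its own timestamp, the client can decode the stream independently of the video and pair records to frames afterwards.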