Hi,

Is it possible to get the RTP timestamps from the buffer in the appsink element? I have read that if the RtpBin is in 'buffer-none' mode, the PTS should contain the RTP timestamp. The PTS timestamp is 8 bytes, whereas the RTP timestamp is 4 bytes? When I run my program, the PTS timestamps are not the same as the ones I see in Wireshark on the incoming packets.

Best Regards,
Frederik
_______________________________________________ gstreamer-devel mailing list [hidden email] https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
Can you share the code, for a better understanding of the problem?
I used to do this to set the video PTS:

    GstBuffer *gstbuf;
    GstMapInfo map = {0};
    GstFlowReturn ret;

    gstbuf->pts = thiz->timestamp;
    thiz->timestamp += 33333333; /* 33 ms per frame, in ns */
adi
In reply to this post by Frederik Leth
On Thu, 2018-02-01 at 11:09 +0100, Frederik Leth wrote:
Hi Frederik,

> Is it possible to get the RTP timestamps from the buffer in the
> appsink element?
> I have read that if the RtpBin is in 'buffer-none' the PTS should
> contain the RTP timestamp.
> PTS timestamp is 8 bytes whereas RTP timestamp is 4 bytes?
> When I run my program the PTS timestamps are not the same as I see
> in Wireshark on the incoming packets.

What is the data you are getting in the appsink? Is it RTP packets, or are you talking about the depayloaded/decoded data and want to know what the original RTP timestamp was?

If it's RTP packets, you could use gst_rtp_buffer_map() + gst_rtp_buffer_get_timestamp().

Cheers
 -Tim

--
Tim Müller, Centricular Ltd - http://www.centricular.com
The data is raw data; in this case it's depayloaded and decoded audio or
video.

My setup is: a device delivers an audio and a video stream (on separate SSRCs). The device creates timestamps for both audio and video, so the two sources are in sync when leaving the device. The receiver is a GStreamer pipeline, which decodes etc. Each stream ends in its own appsink. An application consumes both appsinks, and at this point I need to sync the two streams. That is why I want to maintain the RTP timestamps created on the producer (device).

Does this make sense?
In reply to this post by devil coder
In my reply to Tim, you can read how the pipelines are set up.
Here's the appsink code:

    static GstFlowReturn on_new_buffer(GstElement* object, gpointer user_data)
    {
        GstAppSink* app_sink = (GstAppSink*)object;
        GstSample* sample = gst_app_sink_pull_sample(app_sink);
        if (sample) {
            GstBuffer* buffer = gst_sample_get_buffer(sample);
            /* this timestamp is not the same as the RTP timestamp */
            GstClockTime timestamp = GST_BUFFER_PTS(buffer);

            GstMapInfo map;
            gst_buffer_map(buffer, &map, GST_MAP_READ);

            PipeContext* context = (PipeContext*)user_data;
            context->PublishAppSink(map.data, map.size, timestamp);

            gst_buffer_unmap(buffer, &map);
            gst_sample_unref(sample);
        }
        return GST_FLOW_OK;
    }