Hi all,
I'm currently looking at syncing two independent RTP sources (one video, one audio), both of which are recorded by our application to be played back later, in sync. I understand that part of the RTP/RTCP spec is that RTCP sender reports, sent periodically (e.g. every few seconds), contain an NTP timestamp which correlates NTP time to RTP timestamps, in theory allowing accurate synchronisation between multiple sources.

My question is: do any of the GStreamer RTP-related elements take account of the NTP timestamps in sender reports and adjust buffer timestamps accordingly? Which element(s) do RTCP processing?

My video source is an RTSP/MPEG-4 Part 2 network encoder, and the relevant gst-launch equivalent of my pipeline looks something like:

> gst-launch uridecodebin uri="rtsp://someurl" ! queue ! decodebin2 ....

in case that can help somebody direct me... The audio source is received by something other than a GStreamer pipeline, by the way.

Thanks in advance,
Jono

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel |
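[For background, the SR mapping works like this: a sender report pairs a 64-bit NTP timestamp (32.32 fixed point, epoch 1900) with the RTP timestamp its media clock read at the same instant, so any later RTP timestamp can be converted to wallclock time. A minimal Python sketch of that arithmetic, for illustration only (not GStreamer code; function names are made up here):]

```python
NTP_UNIX_OFFSET = 2208988800  # seconds between the NTP epoch (1900) and Unix epoch (1970)

def ntp64_to_unix(ntp_ts):
    """Convert a 64-bit NTP timestamp (32.32 fixed point) to Unix seconds."""
    seconds = ntp_ts >> 32
    fraction = (ntp_ts & 0xFFFFFFFF) / 2**32
    return seconds - NTP_UNIX_OFFSET + fraction

def rtp_to_unix(rtp_ts, sr_rtp_ts, sr_ntp_ts, clock_rate):
    """Map an RTP timestamp to Unix time using the (NTP, RTP) pair from a sender report."""
    # RTP timestamps are 32-bit and wrap; take the signed difference modulo 2**32
    diff = (rtp_ts - sr_rtp_ts) & 0xFFFFFFFF
    if diff >= 2**31:
        diff -= 2**32
    return ntp64_to_unix(sr_ntp_ts) + diff / clock_rate
```

[E.g. with a 90 kHz video clock, an RTP timestamp 180000 ticks past the SR's RTP timestamp is 2 seconds past the SR's NTP time.]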
On Wed, 2008-12-17 at 10:46 +0900, Jon Burgess wrote:
> My question is: do any of the gstreamer rtp related elements take
> account of the ntp timestamps in sender reports and adjust buffer
> timestamps accordingly? Which element(s) do RTCP processing?

gstrtpbin is what you want.

--
Olivier Crête
[hidden email]
Collabora Ltd |
Hi, gstreamer-devel:
"gst_rtp_bin_associate" in gstrtpbin.c does what you want. But I think it needs improvements. :)
Eric Zhang
2008/12/17 Olivier Crête <[hidden email]>
Thanks Olivier and Eric... From what I can tell, this bit of code handles synchronisation between RTP streams within the same session; in other words, all streams handled by one RTP bin will be synced.

But my case is slightly different, because one stream (video) will be received by a GStreamer pipeline, while the other (audio) will be received by legacy code. So my thought for syncing them (either when playing live, or when playing back recorded data) would be to have GStreamer timestamp the buffers with something like a Unix (wallclock) timestamp, have the legacy code do the same, and then things should stay synced. Would this work, or would timestamping like this upset GStreamer?

Also, I'm using a uridecodebin - would that use a gstrtpbin internally for an RTSP source?

Jono |
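[One caveat with the idea above: GStreamer buffer timestamps are normally relative to the segment/running time, so stamping raw Unix-epoch values directly into buffers may confuse downstream elements. An alternative that achieves the same alignment is to keep ordinary per-stream timestamps and record a single wallclock anchor for each stream; syncing then becomes arithmetic on the shared Unix timeline. A hypothetical sketch (the class and the anchor values are invented for illustration):]

```python
class WallclockMapper:
    """Map a stream's local buffer timestamps onto a shared Unix timeline.

    Record one (unix_time, stream_time) anchor per stream at capture time,
    then align streams on the common wallclock axis at playback.
    """
    def __init__(self, anchor_unix, anchor_stream):
        self.anchor_unix = anchor_unix      # wallclock seconds at the anchor
        self.anchor_stream = anchor_stream  # stream-local seconds at the anchor

    def to_unix(self, stream_time):
        return self.anchor_unix + (stream_time - self.anchor_stream)

# e.g. the legacy audio capture started 250 ms after the GStreamer video capture
video = WallclockMapper(anchor_unix=1_229_500_000.00, anchor_stream=0.0)
audio = WallclockMapper(anchor_unix=1_229_500_000.25, anchor_stream=0.0)
skew = audio.to_unix(0.0) - video.to_unix(0.0)  # 0.25 s: delay video by this to align
```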
Hi, gstreamer-devel:
1. It seems you will have to write some code yourself, since you mentioned you have non-GStreamer code handling the audio data. You can read up on the RTP lip-synchronisation algorithm and implement it yourself; "gst_rtp_bin_associate" is a good reference as well.

2. uridecodebin will select rtspsrc and autoplug the other elements when you play an RTSP stream. I skimmed the source code of uridecodebin and noticed a function named "gen_source_element", which calls "gst_element_make_from_uri".

Eric Zhang

2008/12/18 Jon Burgess <[hidden email]>
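[The association step mentioned above can be sketched as: use each stream's SR (NTP, RTP) pair to work out the wallclock time at which that stream's RTP clock read zero, then offset every stream relative to the earliest one. A much-simplified, hypothetical Python sketch of the idea (the real gst_rtp_bin_associate also deals with timestamp wraparound and re-association as new reports arrive):]

```python
def associate(streams):
    """Compute per-stream playback offsets (in seconds) from sender reports.

    streams: dict name -> (ntp_time_s, rtp_ts, clock_rate), where ntp_time_s
    and rtp_ts come from the same RTCP sender report for that stream.
    Returns dict name -> offset to add to that stream's local time so all
    streams share a common timeline.
    """
    # wallclock time at which each stream's RTP timestamp was zero
    bases = {name: ntp - rtp / rate for name, (ntp, rtp, rate) in streams.items()}
    ref = min(bases.values())  # earliest stream is the reference
    return {name: base - ref for name, base in bases.items()}

# 90 kHz video SR and 8 kHz audio SR taken at the same wallclock instant:
offsets = associate({
    "video": (100.0, 900000, 90000),  # RTP clock started at wallclock 90.0
    "audio": (100.0, 76000, 8000),    # RTP clock started at wallclock 90.5
})
```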
Hi, gstreamer-devel:
Also, gstrtpbin will be used by uridecodebin, because gstrtpbin is used inside rtspsrc, which is a source element that handles RTSP media and inherits from GstBin.

Eric Zhang
2008/12/19 Eric Zhang <[hidden email]> |