killerrats
rtspsrc
 | queue | rtph264depay | h264parse | avdec_h264 | d3dvideosink sync=false
 | queue | rtpjitterbuffer | rtpmp4gdepay | aacparse | faad | directsoundsink sync=false

I am capturing the video and audio, but when I watch the capture the audio is about a second behind the video. I have tried the following:

- put sync=true on both the audio and video sinks, which didn't seem to work
- put an rtpjitterbuffer on the video side as well, which didn't seem to work
- fiddled with the latency on the rtpjitterbuffers, which didn't work either

The only other thing I could think of is to use GST_BUFFER_TIMESTAMP. I am trying to figure out how I could do that with this pipeline. Any ideas?
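(For reference, a minimal gst-launch-1.0 sketch of such a receive pipeline with sync=true on both sinks; the RTSP URL is a placeholder, and aacparse/directsoundsink are assumed to be the elements meant above:)

  # receive H264 video and AAC audio from an RTSP source; both sinks sync to the pipeline clock
  gst-launch-1.0 rtspsrc location=rtsp://<camera-ip>/stream name=src \
      src. ! queue ! rtph264depay ! h264parse ! avdec_h264 ! d3dvideosink sync=true \
      src. ! queue ! rtpmp4gdepay ! aacparse ! faad ! directsoundsink sync=true

As far as I know, rtspsrc already runs its own rtpjitterbuffer per stream internally, so adding another one downstream shouldn't be necessary.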
------------------------------
Gstreamer 1.16.2, Windows
killerrats
I used sync=true on both the audio and video sinks and kept the jitter buffer. The audio and video are now synchronized, but playback seems to be about a second late.
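(If the streams are in sync but everything is a fixed second behind, the delay may simply be the configured receive latency. rtspsrc has a "latency" property in milliseconds, defaulting to 2000 if I remember correctly, which can be lowered; a sketch with a placeholder URL and an arbitrary 300 ms value:)

  # same pipeline as above, only the jitterbuffer latency is reduced via rtspsrc
  gst-launch-1.0 rtspsrc location=rtsp://<camera-ip>/stream latency=300 name=src \
      src. ! queue ! rtph264depay ! h264parse ! avdec_h264 ! d3dvideosink sync=true \
      src. ! queue ! rtpmp4gdepay ! aacparse ! faad ! directsoundsink sync=true

Too low a value may cause stuttering on a jittery network, so it is a trade-off.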
------------------------------
Gstreamer 1.16.2, Windows
On Fri, 2017-06-16 at 09:02 -0700, killerrats wrote:
> rtspsrc
>  | queue | rtph264depay | h264parse | avdec_h264 | d3dvideosink sync=false
>  | queue | rtpjitterbuffer | rtpmp4gdepay | aacparse | faad | directsoundsink sync=false
>
> I am capturing the video and audio but when I see the capture the audio is
> behind about a second from the video. I have done the following scenarios:
>
> - put sync=true on both audio and video but didn't seem to work
> - put rtpjitterbuffer on the video side and didn't seem to work.
> - tried fiddling with the latency on the rtpjitterbuffers, it doesn't work
>
> the only thing I could think of is to use the GST_BUFFER_TIMESTAMP. I am
> trying to figure out how I could do this with this pipeline. Any ideas?

packets come?

Generally, without sync=true you have no chance of any synchronization at all, as the name suggests. If that doesn't work, you have to debug that first.

Then, for synchronizing two RTP streams relative to each other, the easiest way is to use RTCP. There is no information in separate RTP streams that tells you how they should be synchronized to each other (the RTP timestamps have random offsets), and RTCP allows you to get the relation between the RTP timestamps of the two streams (basically, it allows you to get the offset).

This could be done by using rtpbin (which also already includes the rtpjitterbuffer) on the sender and receiver side. It allows you to create RTCP for all streams passing through it, and by default it synchronizes the streams to each other.

Alternatively there are also various other ways to synchronize RTP streams, but that then all depends on the details of your setup.

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com
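(As an illustration of the rtpbin approach described above, here is a receiver-side sketch for just the H264 video session over plain RTP/UDP with an RTCP channel. The ports, the sender address and the payload type are made-up placeholders, and an audio stream would be added as a second session, recv_rtp_sink_1 and so on, in the same way. With rtspsrc this shouldn't be needed, since as far as I know it already wraps rtpbin and exchanges RTCP by default.)

  # RTP video in on port 5000, RTCP in on 5001, RTCP reports sent back to the sender on 5005
  gst-launch-1.0 -v rtpbin name=rtpbin \
      udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96" ! rtpbin.recv_rtp_sink_0 \
      rtpbin. ! rtph264depay ! h264parse ! avdec_h264 ! d3dvideosink sync=true \
      udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
      rtpbin.send_rtcp_src_0 ! udpsink host=<sender-ip> port=5005 sync=false async=false

For the inter-stream synchronization to work, the sender has to emit RTCP sender reports as well, which rtpbin on the sending side does for you.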