Hi,
I am trying to synchronize two pipelines connected via
proxysrc/proxysink.
Pipeline #1 reads a WAV file and sends it to the network as an RTP
stream. It looks like this (I have omitted some conversion elements, the
RTCP sockets, and the part used for receiving RTP):
filesrc ! wavparse ! tee ! queue ! mulawenc ! rtppcmupay ! rtpbin !
udpsink
When I create this pipeline, I also request a second src pad on the tee
and add a ghost pad for it. I do not connect this pad anywhere at this
point.
About one second later I create the 2nd pipeline. It should receive the
RTP stream, interleave it with the stream coming from the second tee
output of the 1st pipeline, and save the result to another WAV file. The
receiving part of this pipeline looks like this (again omitting some
conversion elements, the RTCP sockets, and the part used for sending
RTP):
udpsrc ! rtpbin ! decodebin ! queue ! audiointerleave ! queue ! wavenc !
filesink sync=true
I connect audiointerleave to both decodebin and the tee (via
proxysink/proxysrc/queue) after decodebin creates its src pad.
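For concreteness, the bridging and clock setup is roughly as follows. This is a paraphrased sketch from memory (following the proxysrc element documentation), not my exact code; element and variable names are illustrative:

```c
/* Sketch: bridging the two pipelines with proxysink/proxysrc.
 * The proxy pair is linked through the "proxysink" property on
 * proxysrc, as shown in the proxysrc documentation. */
GstElement *psink = gst_element_factory_make ("proxysink", "psink");
GstElement *psrc  = gst_element_factory_make ("proxysrc",  "psrc");
g_object_set (psrc, "proxysink", psink, NULL);

/* psink sits in pipeline1, downstream of the tee's second branch;
 * psrc ! queue ! audiointerleave sits in pipeline2. */

/* Both pipelines share the system clock with base time 0: */
GstClock *clock = gst_system_clock_obtain ();
gst_pipeline_use_clock (GST_PIPELINE (pipeline1), clock);
gst_element_set_base_time (GST_ELEMENT (pipeline1), 0);
gst_pipeline_use_clock (GST_PIPELINE (pipeline2), clock);
gst_element_set_base_time (GST_ELEMENT (pipeline2), 0);
gst_object_unref (clock);
```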
For both pipelines I set the clock to the system clock and the base time
to 0, as in the example in the proxysrc element description. However,
for some reason the voice in the two channels of the recorded file is
offset by ~950 ms. When I create the 2nd pipeline about 2 seconds after
the 1st one, the offset also grows to ~2 s, i.e. it tracks the gap
between the pipeline creations. The real network delay (measured with
Wireshark) is only ~35 ms. So clearly some synchronization is missing
here.
What do I have to change to make this work as expected?
Regards,
Daniel
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel