synchronizing multiple pipelines

synchronizing multiple pipelines

Attila RS
I'm working on an app that needs to dynamically add/remove outputs without affecting already-active outputs. Below is a simple example.

1. Receive an RTP stream with audio/video and play back both.
2. Stop the audio output associated with the above stream.
3. Receive a 2nd RTP stream with audio and play back the audio.
4. Stop the 2nd RTP stream.
5. Reconnect the audio output to the first RTP stream without affecting the video.

To be able to dynamically add/remove outputs, I have separated the input (udpsrc/depay/demux), the video output (decode/vidsink), and the audio output (decode/audsink) into 3 separate pipelines and linked them together with ghost pads. This seems to work well for the controls above, but I have 2 problems.

1. Synchronization is a mess. It looks like each pipeline is using a different clock, and as a result audio/video is out of sync.
2. Adding a new output pipeline to a running input pipeline results in out-of-sync clocks, so the newly added output thinks it is too far ahead/behind and just dumps all data instead of outputting it. If I disable sync, the newly added pipeline starts playing fine, but of course it is out of sync.

I think the second problem can probably be improved or eliminated by forcing a new segment event when linking the output, but the clocks still need to be synchronized between the pipelines for a/v to sync correctly and play back smoothly.

Any thoughts on how to better handle these problems? Maybe forcing the output pipelines to use the input pipeline's clock? Or perhaps there is a way to maintain the control I need without using 3 different pipelines?

Thanks for the help.

Attila

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

Re: synchronizing multiple pipelines

Stefan Sauer

You want to have one pipeline, so that you share *one* clock. You can
dynamically link and unlink elements. Read part-blocking.txt in the source
tree under gstreamer/docs/design. You need to use pad blocking to ensure
no dataflow is happening on the unlinked pads. If a sink is not
connected, it's probably a good idea to pause it (and lock its
state). If you don't use the audio sink, you need to trigger clock
re-selection (I don't know how off the top of my head - maybe you can check
what playbin2 does if you unset the audio flag). When connecting a new
upstream branch to a sink, you need to resend the newsegment event to
the sink so that it 'knows' what's playing.

I know that all this is a bit complicated. Making this easier on the
framework level would be nice ...

Stefan





Re: synchronizing multiple pipelines

Attila RS


On Mon, Apr 19, 2010 at 5:21 AM, Stefan Kost <[hidden email]> wrote:

I guess what I will do then is, instead of creating the 3 pipelines, create 3 bins (one for the input elements, one for the audio elements, and one for the video elements) and put them all in the same pipeline. That way I can still easily control the 3 components individually, and they will all share the same clock. Then I'll just have to send the appropriate events when adding/removing from the pipeline. Thanks.
 




------------------------------------------------------------------------------
Download Intel&#174; Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel

Re: synchronizing multiple pipelines

Attila RS


On Mon, Apr 19, 2010 at 9:03 AM, Attila RS <[hidden email]> wrote:


 
Making most of these changes was fairly straightforward, and I can add/remove sources as needed.

I then re-enabled sync on the a/v sinks and started getting a lot of stuttering and dropped frames, along with "There may be a timestamping problem, or this computer is too slow" from the video sink, and complaints from the audio sink as well. CPU usage would also climb to around 90%. I looked around a bit and saw others complaining about this too. I recently updated all of my GStreamer components to the latest revisions (0.10.23 -> 0.10.28 for gstreamer and gst-base). I went back to the old versions and didn't see the problem any more (playback wasn't perfectly smooth, but it was close, and there were no more errors). Does anyone know which version introduced this problem and which components it is associated with? I even saw the problem with a simple mp3 decode pipeline (udpsrc -> rtpmpadepay -> mad -> alsasink).





