Re: Synchronised RTP Client


Re: Synchronised RTP Client

Jonathan Thorpe
Hi All,

Sebastian - thank you for pointing me in the right direction with this earlier. It was very helpful, and I have finally had a chance to put some code together.

I have written an RTSP server in Python, trying to re-implement the code in test-netclock.c / test-netclock-client.c, but I still have a few problems with synchronisation. I must be close.

My launch string for the factory on the server is as follows:

udpsrc uri=udp://127.0.0.1:6001 caps="application/x-rtp, media=(string)audio, clock-rate=(int)44100, channels=(int)2, format=(string)S16LE" ! rtpL16depay ! audioconvert ! audioresample ! opusenc bitrate=96000 inband-fec=1 ! rtpopuspay name=pay0

It captures raw audio over RTP on port 6001 and converts it to Opus, which is then sent out to the clients (all of which should play in sync).

For this factory, I'm also setting:

factory.set_shared(True)
factory.set_retransmission_time(5000)
factory.set_clock(clock)
factory.set_profiles(GstRtsp.RTSPProfile.AVPF)

where clock is a clock that I want to run on the server and expose through a network time provider so the clients can sync to it:

clock = Gst.SystemClock.obtain()
clock_provider = GstNet.NetTimeProvider.new(clock, None, 8555)

On my clients, with sync=true on the audio sink (autoaudiosink, resolving to ALSA), I get about two seconds of audio followed by silence and a lot of this:

audiobasesink gstaudiobasesink.c:1807:gst_audio_base_sink_get_alignment:<autoaudiosink0-actual-sink-alsa> Unexpected discontinuity in audio timestamps of +1012391:15:11.969041666, resyncing

audiobasesink gstaudiobasesink.c:1512:gst_audio_base_sink_skew_slaving:<autoaudiosink0-actual-sink-alsa> correct clock skew -0:00:00.020016938 < -+0:00:00.020000000

My client contains the following (location is the server URI; latency is usually configured as 1000 or 5000 ms):

self.pipeline = Gst.parse_launch('rtspsrc location=%s latency=%s buffer-mode=synced ntp-time-source=clock-time ntp-sync=1 do-rtcp=true ! rtpopusdepay name=pay0 ! opusdec ! audioconvert ! autoaudiosink sync=true async=false' % (src, latency))

# make a clock slaving to the network
self.clock = GstNet.NetClientClock.new('clock0', clock_ip, clock_port, 0)

# Wait for initial clock sync
self.clock.wait_for_sync(Gst.CLOCK_TIME_NONE)

self.pipeline.use_clock(self.clock)
self.pipeline.set_start_time(Gst.CLOCK_TIME_NONE)
self.pipeline.set_latency(latency * Gst.MSECOND)

self.pipeline.set_state(Gst.State.PLAYING)

# Wait until error or EOS.
self.bus = self.pipeline.get_bus()
self.bus.add_signal_watch()
self.bus.connect('message', self.bus_handler)

The intention for this setup is to:
1. Buffer 1-5 seconds of audio.
2. Within this buffer period, allow for retransmits.
3. Otherwise, keep the clients in sync with the server's clock using a NetClientClock.

This configuration plays audio fine if I change to "sync=false" on the audio sink, but that also breaks synchronisation.

Any pointers as to what I've done wrong? I'm currently running GStreamer out of git.

Kind Regards,
Jonathan


 
Message: 5
Date: Mon, 25 Apr 2016 09:24:53 +0300
From: Sebastian Dröge <[hidden email]>
To: Discussion of the development of and with GStreamer
        <[hidden email]>
Subject: Re: Synchronised RTP Clients
Message-ID: <[hidden email]>
Content-Type: text/plain; charset="utf-8"

On Mo, 2016-04-25 at 13:19 +1000, Jonathan Thorpe wrote:

> I have clearly chosen a very complex project to get started with
> GStreamer on, but would be very grateful if someone could assist with
> helping establish pipelines (Client and Server) that will achieve
> this.

Take a look at my presentation from last year's GStreamer conference:
https://gstconf.ubicast.tv/videos/synchronised-multi-room-media-playback-and-distributed-live-media-processing-and-mixing-with-gstreamer/

Also these two example applications are basically doing exactly what
you want, just with the GStreamer netclock instead of PTP. But
exchanging that is a matter of changing a couple of lines.

https://cgit.freedesktop.org/gstreamer/gst-rtsp-server/tree/examples/test-netclock.c
https://cgit.freedesktop.org/gstreamer/gst-rtsp-server/tree/examples/test-netclock-client.c

The examples use RTSP and gst-rtsp-server, but there's nothing RTSP-specific in how all this works; RTSP is just a simple way to exchange the SDP between sender and clients.

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com


------------------------------

Message: 6
Date: Mon, 25 Apr 2016 09:20:48 +0100
From: Andy Robinson <[hidden email]>
To: [hidden email]
Subject: Re: How to query the GstSegment "rate"?
Message-ID: <[hidden email]>
Content-Type: text/plain; charset=utf-8; format=flowed

On 25/04/16 07:21, Sebastian Dröge wrote:
> On Sa, 2016-04-23 at 14:55 +0100, Andy Robinson wrote:
>> I imagine there must be some way of querying the
>> pipeline for the rate once it is in paused state, but how?
>
> You could try the SEGMENT query, if something in your pipeline is
> answering it then it will contain the rate.
>
> But the bigger question is why you need to query the rate and don't
> know it already. In the end it was your code that was setting that
> exact rate via a seek :)
>
>
> Also what's the bigger picture here? Why do you want to convert from
> stream time to the scaled stream time by rate (which is not exactly the
> running time in general, but maybe you actually want the running
> time?)?
>

Even if I don't do any seek, a Segment event goes down the pipeline and
it has a rate of 0.5 - I know this from putting diagnostics in the pipeline.

The file is here if you are interested:
http://www.seventhstring.com/other2/JAttendraiShort50.mov
It plays ok in Parole Media Player on Linux, or QuickTime on Mac. It's
60 secs long and was produced, using QuickTime, by slowing down a 30 sec
clip to half speed.

When I play it in my app, I find that when I want to seek using
gst_element_seek_simple within this video I must use the pre-slowdown
times, e.g. if I want to seek to the 40th second of the 60 second video,
I must actually seek to a time of 20 secs, and also must make a similar
adjustment to the values returned by gst_element_query_position.

It seems to me that I need to get that 0.5 rate number and use it as a
multiplier when I call gst_element_seek_simple but if there is a better
way, please advise me.
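In the meantime, the workaround described above can be captured in two small helpers (hypothetical names; the 0.5 would come from the SEGMENT query or the segment event):

```python
def display_to_seek_time(display_seconds, rate):
    """Map a position in the slowed-down video (what the user sees) to the
    value gst_element_seek_simple apparently expects for this file."""
    return display_seconds * rate

def queried_to_display_time(queried_seconds, rate):
    """Map gst_element_query_position output back to the displayed position."""
    return queried_seconds / rate
```

With rate 0.5, seeking to the 40th second of the 60-second video issues a 20-second seek, matching the behaviour observed above.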

Of course, maybe the file is simply erroneous or illegal, though it does play OK in some (not all) players.

Regards,
Andy Robinson, Seventh String Software, www.seventhstring.com


------------------------------

Message: 7
Date: Mon, 25 Apr 2016 08:28:23 +0200
From: marc dingemans <[hidden email]>
To: [hidden email]
Subject: muliple gst_element_seek for creating new file
Message-ID:
        <CAN2V=[hidden email]>
Content-Type: text/plain; charset="utf-8"

Hi

I want to extract some video parts from an MKV file and, based on these parts, create one new MKV file.

This is done with multiple gst_element_seek calls, each with a time range. When an EOS is received, the next gst_element_seek is started, until done. The destination file is generated, but cannot be played with VLC. The seek is executed on the "matroskademux" element (I have also tried the pipeline) with the GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_SNAP_BEFORE flags.

Pipeline used:
filesrc ! matroskademux ! h264parse ! matroskamux ! filesink

When I use the following pipeline instead, everything works fine:
filesrc ! matroskademux ! h264parse ! avdec_h264 ! autovideosink

Why does the seek appear to work for on-screen playback but not when writing to a file? Must I use another approach to put parts of one file into a new one?

Thanks for your help

------------------------------

Subject: Digest Footer

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel


------------------------------

End of gstreamer-devel Digest, Vol 63, Issue 76
***********************************************



Re: Synchronised RTP Client

Jonathan Thorpe
Doh! Looks like I might have missed this:

In my factory subclass:

self.set_media_gtype(MyRTSPMediaType)


Which should reference:

class MyRTSPMediaType(GstRtspServer.RTSPMedia):
    def do_setup_rtpbin(self, rtpbin):
        rtpbin.set_property('ntp-time-source', 'clock-time')
        return rtpbin

And suddenly my timestamp issues seem to go away :)

On 29 May 2016 at 23:07, Jonathan Thorpe <[hidden email]> wrote:




Re: Synchronised RTP Client

Sebastian Dröge-3
On Mo, 2016-05-30 at 19:08 +1000, Jonathan Thorpe wrote:

> Doh! Looks like I might have missed this:
>
> In my factory subclass:
> self.set_media_gtype(MyRTSPMediaType)
>
> Which should reference:
>
> class MyRTSPMediaType(GstRtspServer.RTSPMedia):
>    def do_setup_rtpbin(self, rtpbin):
>       rtpbin.set_property('ntp-time-source', 'clock-time')
>       return(rtpbin)
>
> And suddenly my timestamp issues seem to go away :)
Good to hear that it works for you now :)

Is the synchronization also working accurately enough for you?

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com

Re: Synchronised RTP Client

Jonathan Thorpe
Hi Sebastian,

Not having tested it too scientifically, it appears to be performing well. Using the netclock seems to be more than sufficient; the next step was to look at PTP, but I'll see how I go with this for the moment.

Kind Regards,
Jonathan

> And suddenly my timestamp issues seem to go away :)

Good to hear that it works for you now :)

Is the synchronization also working accurately enough for you?

--
Sebastian Dröge, Centricular Ltd · http://www.centricular.com