How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)


Jake Zhang
Greetings,

I have been working on this for a while but still have had no luck getting it to work. My pipeline can be simplified as below:

pipeline1:
v4l2src -> tee -> x264enc -> appsink1
               -> appsink2
RTSP server pipeline2: 
appsrc -> omxh264enc -> rtph264pay

Pipeline 1 has other processing units, and I need to pull buffers from appsink2 and push them into the appsrc of the RTSP server.
My RTSP server implementation is very similar to test-appsrc.c from gst-rtsp-server 1.6.4.

The buffers I get from appsink already have their own PTS, and to get RTSP pipeline2 running I have to re-stamp the buffer PTS in exactly the same way as test-appsrc.c (set ctx->timestamp=0 and increment by the buffer duration); otherwise, the RTSP server will not start.

I have gone through the GStreamer manual and have a basic understanding of running time, base time, and stream time.
I have tried the following:
1. Set the base time of pipeline2 to be the same as pipeline1's.
2. Set the start time of pipeline2 to GST_CLOCK_TIME_NONE.
3. Made sure the two pipelines use the same clock.
4. Checked the segment event on the appsrc of pipeline2 (its value is below) and tried generating a new segment for the appsrc sink pad, but I have a hard time knowing how to set the right base and start values.
//appsrc0:src segment: rate 1 format 3, start: 0:00:00.000000000, stop: 99:99:99.999999999, time: 0:00:00.000000000 base: 0:00:00.000000000

Why do I have to re-stamp the PTS of the buffers?
Since pipeline2's running time = clock time - base time of pipeline1, pipeline2 should be able to process buffers with their original PTS, right?

How can I retain the original buffer PTS in pipeline2? I guess the segment event is the right direction to go, but as I said, I have not figured out how to set up the segment so that pipeline2 is happy with the original PTS.

Thanks in advance for any comments here.

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Nicolas Dufresne-4
On Thursday, 16 June 2016 at 17:05 -0400, Jake Zhang wrote:

> [...]
> The buffer I got from appsink already has their own PTS and to get
> RTSP pipeline2 running, I have to re-stamp the buffer PTS the exactly
> same way as test-appsrc.c (set ctx->timestamp=0 and increment based
> of buffer duration) otherwise, the RTSP server will not get running. 
You don't. You can use push_sample(), so the segment is passed along. Then,
on the first buffer, you can compute that buffer's running time
(gst_segment_to_running_time()) and set the appsrc pad offset to
-running_time_first_pts. Careful: if you add audio, you should use the same
offset for all streams, otherwise you will break A/V sync.

regards,
Nicolas
Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Jake Zhang
Hi Nicolas, setting the offset on the appsrc pad solved my issue. I really appreciate your input.

Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Sebastian Dröge-3
On Sat, 2016-06-18 at 11:27 -0400, Nicolas Dufresne wrote:

> You don't, you can use push_sample(), so the segment is passed. And
> then, on the first buffer, you can compute the running time of this
> buffer (gst_segment_to_running_time()), and set the appsrc pad offset
> to -running_time_first_pts. Careful, that if you add audio, you
> should use the same offset for all stream, otherwise you will break
> a/v sync.

push_sample() does not pass on the segment in appsrc. It only handles
the contained buffer and caps currently, no other information from the
sample.

--

Sebastian Dröge, Centricular Ltd · http://www.centricular.com
Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Nicolas Dufresne-5
On Tuesday, 21 June 2016 at 09:30 +0300, Sebastian Dröge wrote:
> push_sample() does not pass on the segment in appsrc. It only handles
> the contained buffer and caps currently, no other information from the
> sample.

Sounds like a bug to me ;-P But you are probably lucky enough to have
the same segment (0 to infinity) on both sides.

Nicolas
Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Sebastian Dröge-3
On Tue, 2016-06-21 at 11:53 -0400, Nicolas Dufresne wrote:
> On Tuesday, 21 June 2016 at 09:30 +0300, Sebastian Dröge wrote:
> > push_sample() does not pass on the segment in appsrc. It only handles
> > the contained buffer and caps currently, no other information from
> > the sample.
>
> Sounds like a bug to me ;-P But you are probably lucky enough to have
> the same segment (0 to infinity) on both sides.

Yes we're missing some API in general to properly configure custom
segments in basesrc. The seamless segment API is not that.

--

Sebastian Dröge, Centricular Ltd · http://www.centricular.com
Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Jake Zhang
Thanks, Sebastian and Nicolas. All of this information is exactly what I need. Somehow, setting min/max latency on appsrc does not solve the issue; I will need to dig into this more. Setting the pad running-time offset solved my issue.
Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Jake Zhang
Hi Nicolas,

This is an interesting question now. After setting the appsrc pad offset to the first PTS, appsrc generates a segment event with offset = -PTS of the first buffer. This causes an issue when I use mpegtsmux and rtpmp2tpay in the RTSP server, since mpegtsmux always uses the pipeline running time to re-stamp the output buffers; see 'mpegtsmux_clip_inc_running_time()' in mpegtsmux.c. So my PTS is still being re-stamped by the RTSP pipeline. Any advice?

Here is the RTSP server part of my launch string:
  launch_stream << "appsrc is-live=true name=appsrc0 do-timestamp=true block=false ! "
                << "mpegtsmux name=rtsp_mux0 alignment=7 ! "
                << "video/mpegts, systemstream=true, packet-size=188 ! rtpmp2tpay perfect-rtptime=false timestamp-offset=0 name=pay0";

 Thanks
-Jake

Re: How to synchronize buffer timestamps across two GStreamer pipelines (appsink, appsrc, rtspserver)

Nicolas Dufresne-5
On Sunday, 26 June 2016 at 19:17 -0400, Jake Zhang wrote:
> This is interesting question now. So after setting the appsrc pad
> offset as the first PTS, appsrc will generate a segment event with
> offset = - pts of first buffer. This is causing an issue if I am
> using mpegtsmux and rtpmp2tpay in rtsp server as mpegts mux is always
> using pipeline running time to re-stamp the output buffer. See the
> 'mpegtsmux_clip_inc_running_time()' in mpegtsmux.c. So my PTS is
> still being re-stamped by RTSP pipeline. Any advice? 

All muxers transform the relative timestamps into running time. Can you
explain what issue you are having?

>
> Here is my rtsp server part of launch string:
>   launch_stream << "appsrc is-live=true name=appsrc0 do-
> timestamp=true block=false ! "
>                   << "mpegtsmux name=rtsp_mux0 alignment=7  ! "
>                   << "video/mpegts, systemstream=true, packet-
> size=188 !  rtpmp2tpay perfect-rtptime=false timestamp-offset=0
> name=pay0";