How to correctly timestamp buffers in an appsrc element

How to correctly timestamp buffers in an appsrc element

J. Krieg
Hello,

I’m currently working on an application to display live TV using
GStreamer on a Raspberry Pi 2B.

For this I use two appsrc elements (one for video and one for audio),
which read PES packets in two separate threads directly from the
V4L DVB demux device ‘/dev/dvb/adapter0/demux0’.
My current test pipelines are:

Video
  V4L DVB demux (DMX_OUT_TAP) -> appsrc ! h264parse ! v4l2h264dec !
queue ! kmssink
Audio
  V4L DVB demux (DMX_OUT_TAP) -> appsrc ! mpegaudioparse !
mpg123audiodec ! queue ! alsasink

I managed to get this working without timestamping the buffers at all
in either appsrc element, but then video and audio aren't synchronous.

I tried to timestamp the buffers as described in
https://gstreamer.freedesktop.org/documentation/application-development/advanced/pipeline-manipulation.html?gi-language=c#inserting-data-with-appsrc
but when I do this I get slightly stuttering video and extremely
stuttering or no audio.

What I'm also struggling with is the following statement from the link above:
"In live mode, you should timestamp the buffers with the pipeline
running-time when the first byte of the buffer was captured before
feeding them to appsrc."

But according to my tests the pipeline only changes its state from
PAUSED to PLAYING (the only state in which the pipeline clock is
available) after some captured buffers have already been fed into the
pipeline.
So how can the very first buffers be timestamped with the running time
before they are pushed into the pipeline, which is still in the PAUSED
state, so that video and audio become synchronous?

What am I doing wrong?
Any help or pointing in the right direction would be really appreciated.

Thanks,
Joerg

Code:
// Create a new empty buffer
gbuffer = gst_buffer_new_allocate (NULL, rc, NULL);

// Timestamp the buffer with the current pipeline running time
if (((CustomData *) data)->pipelineclock) {
    pipeline_clock_time =
        gst_clock_get_time (((CustomData *) data)->pipelineclock);
    pipeline_running_time = pipeline_clock_time - g_pipeline_base_time;
    GST_BUFFER_PTS (gbuffer) = pipeline_running_time;
    // Uses the gap since the previous buffer as this buffer's duration
    GST_BUFFER_DURATION (gbuffer) =
        pipeline_running_time - g_last_pipeline_running_time_a;
    g_last_pipeline_running_time_a = pipeline_running_time;
    // GstClockTime is a guint64, so use G_GUINT64_FORMAT instead of %lld
    printf ("*** DEBUG *** dmx_read_a | pipeline running timestamp for "
        "audio in ns: %" G_GUINT64_FORMAT "\n", pipeline_running_time);
} else {
    printf ("*** DEBUG *** dmx_read_a | Sorry, pipelineclock NOT available...\n");
    GST_BUFFER_PTS (gbuffer) = GST_CLOCK_TIME_NONE;
}

// Fill data into the buffer
bc = gst_buffer_fill (gbuffer, 0, buf, rc);

// Push the buffer into the appsrc
g_signal_emit_by_name (((CustomData *) data)->aappsrc, "push-buffer",
    gbuffer, &rb);

// Free the buffer now that we are done with it
gst_buffer_unref (gbuffer);
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: How to correctly timestamp buffers in an appsrc element

J. Krieg
Hello,

No ideas?
Could anyone help, please?
Unfortunately I can't figure this out by myself.

Thank you very much.

Best Regards,
Joerg


Re: How to correctly timestamp buffers in an appsrc element

Nicolas Dufresne-5
On Thursday, 26 November 2020 at 21:55 +0100, J. Krieg wrote:
> Hello,
>
> No ideas?
> Could anyone help please?
> Unfortunately I can’t figure out this by myself.

By default, appsrc uses an open segment (start=0, end=infinity). That means your
timestamps must match the running-time. The running time can be obtained like
this:

  clock = gst_pipeline_get_clock (pipeline);
  if (clock) {
    time_now = gst_clock_get_time (clock);
    rt_time = time_now - gst_element_get_base_time (GST_ELEMENT (pipeline));
    gst_object_unref (clock);  /* gst_pipeline_get_clock() returns a reference */
  } else {
    rt_time = GST_CLOCK_TIME_NONE; /* or 0 depending on your use case */
  }
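An alternative worth noting (a sketch, not from this reply; the property names come from the stock GstAppSrc/GstBaseSrc API, and `appsrc` is assumed to be the element created for each stream) is to let appsrc stamp buffers with the running time itself:

```c
/* Configuration fragment: let appsrc apply running-time timestamps to
 * incoming buffers itself, instead of computing them in application code.
 * "is-live" and "do-timestamp" are GstBaseSrc properties,
 * "format" is a GstAppSrc property. */
g_object_set (G_OBJECT (appsrc),
    "is-live", TRUE,
    "format", GST_FORMAT_TIME,
    "do-timestamp", TRUE,
    NULL);
```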

If you have raw audio data, it might be easier to calculate the timestamps
based on the data length, starting from zero. Or if your data isn't live, you
might also calculate the timestamps from the framerate of the video (again
starting from 0).




Re: How to correctly timestamp buffers in an appsrc element

J. Krieg
Hello Nicolas,

Thanks for your response.

I did as you suggested, and also took a step back to focus only on the
video part in order to get smooth playback.
But I still get stuttering video output.

I see the following log lines in the debug output:
0:00:03.200317080  2248   0xdf1630 DEBUG           videodecoder
gstvideodecoder.c:1575:gst_video_decoder_src_event:<vdecoder> received
event 48641, qos
0:00:03.201372809  2248   0xdf1630 DEBUG           videodecoder
gstvideodecoder.c:1476:gst_video_decoder_src_event_default:<vdecoder>
received event 48641, qos
0:00:03.202372548  2248   0xdf1630 DEBUG           videodecoder
gstvideodecoder.c:1547:gst_video_decoder_src_event_default:<vdecoder>
got QoS 0:00:01.740711589, +0:00:00.068395216, 0.77583
0:00:03.221023165  2248   0xdf1690 DEBUG           videodecoder
gstvideodecoder.c:3408:gst_video_decoder_decode_frame:<vdecoder>
decoder frame list getting long: 12 frames,possible internal leaking?
0:00:03.225905403  2248 0x74e02a60 DEBUG           videodecoder
gstvideodecoder.c:3155:gst_video_decoder_clip_and_push_buf:<vdecoder>
Dropping frame due to QoS. start:0:00:01.828595512
deadline:0:00:01.828595512 earliest_time:0:00:01.897502021
0:00:03.244647270  2248   0xdf1690 DEBUG           videodecoder
gstvideodecoder.c:3408:gst_video_decoder_decode_frame:<vdecoder>
decoder frame list getting long: 12 frames,possible internal leaking?
0:00:03.249153987  2248 0x74e02a60 DEBUG           videodecoder
gstvideodecoder.c:2667:gst_video_decoder_prepare_finish_frame:<vdecoder>
sync timestamp 0:00:01.848735183 diff -0:00:01.848735183
0:00:03.249393674  2248 0x74e02a60 DEBUG           videodecoder
gstvideodecoder.c:3155:gst_video_decoder_clip_and_push_buf:<vdecoder>
Dropping frame due to QoS. start:0:00:01.848735183
deadline:0:00:01.848735183 earliest_time:0:00:01.897502021

I'm new to GStreamer; what does 'Dropping frame due to QoS' mean?
I assume this is the cause of the stuttering video playback. How can
it be fixed?

An additional question which is unclear to me: which is the correct
GStreamer plugin to take the H.264 PES packets provided by the appsrc
and produce an elementary stream (ES) for the decoder?
Is it h264parse?

Thanks for any help,
Joerg
