timestamps on a live h264 source

timestamps on a live h264 source

PALFFY Daniel

Hi,

I'm developing a GStreamer source element for a video grabber card that
can output raw YUV or H.264. In raw mode the source works fine without
setting GST_BUFFER_OFFSET, GST_BUFFER_OFFSET_END, GST_BUFFER_TIMESTAMP and
GST_BUFFER_DURATION, but for live H.264 playback I can't find a working
combination.

The example pipeline looks like this:
gst-launch mysource ! "video/x-h264,framerate=25/1" ! ffdec_h264 ! xvimagesink

The card delivers each frame as a separate buffer; in the current
configuration a group consists of one SPS, one PPS, one I-frame and
14 P-frames, each pushed in its own GstBuffer.

When I don't set anything, the pipeline consumes all grabbed frames but
displays only the first one (or maybe the first few).

If I set all the values to what I believe is correct (a serial number
incrementing from 0 in OFFSET, OFFSET+1 in OFFSET_END, a hardware-generated
timestamp in TIMESTAMP, and a DURATION of 0 for SPS/PPS buffers and
GST_SECOND/framerate for I/P frames), the pipeline only takes and displays
the first four frames and then stalls.
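
In code, that is roughly the following (a sketch only; frame_count,
hw_timestamp, fps_n/fps_d and is_config_frame stand in for the
grabber-specific parts):

  /* GStreamer 0.10 buffer metadata, as described above */
  GST_BUFFER_OFFSET (buf)     = frame_count;
  GST_BUFFER_OFFSET_END (buf) = frame_count + 1;
  GST_BUFFER_TIMESTAMP (buf)  = hw_timestamp;
  GST_BUFFER_DURATION (buf)   = is_config_frame ? 0 :
      gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n);
  frame_count++;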

If I treat the SPS/PPS buffers as normal frames, give them the same
duration as I/P frames and increment the timestamp accordingly, the buffer
in my element slowly fills up, because the decoder consumes frames more
slowly than they are produced.

When saving the stream to a file and playing back from there, everything
works fine.

What would be the correct values for the timestamps in this case? Or do I
have to implement a clock-capable element?

--
Dani
  ...and Linux for all.


Re: timestamps on a live h264 source

Farkas Levente
Hi,
is there anybody who has the required knowledge about GStreamer,
encoding and H.264 to answer this question? It's also a big problem
for us :-(

On 09/08/2009 04:28 PM, PALFFY Daniel wrote:

> [...]



--
   Levente                               "Si vis pacem para bellum!"


Re: timestamps on a live h264 source

Wim Taymans
On Thu, 2009-09-10 at 10:50 +0200, Farkas Levente wrote:
> [...]

You are probably dealing with a live source and so you should use the
running-time of the pipeline to timestamp outgoing buffers.
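
Something like this in the source's create/push path, assuming the element
already has its clock and base-time set by the pipeline (src, buf and
fps_n/fps_d are just placeholders here):

  /* running-time = clock-time - base-time */
  GstClock *clock = gst_element_get_clock (GST_ELEMENT (src));

  if (clock != NULL) {
    GstClockTime now = gst_clock_get_time (clock);
    GstClockTime base_time = gst_element_get_base_time (GST_ELEMENT (src));

    GST_BUFFER_TIMESTAMP (buf) = now - base_time;
    GST_BUFFER_DURATION (buf) =
        gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n);

    gst_object_unref (clock);
  }

If the source derives from GstBaseSrc, marking it live
(gst_base_src_set_live) and enabling the do-timestamp property should give
you much the same timestamps without doing this by hand.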

Why don't we talk about it on IRC? Much easier.

Wim

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel