Buffering stream problem


arsen1y
Hello.
I have a custom USB device that outputs an MPEG-TS stream containing H.264 video and audio.
The parameters are: video 656x495 at 30 fps, audio 32 kHz at 128 kbps.
I want to extract only the video from that stream and render it to a framebuffer
(an HDMI display in my case).

The device is connected to my Raspberry Pi 3, which also has an HDMI display attached.

The problem is that I don't understand how to buffer my stream properly and how to
tune the latencies.

I've built this pipeline:
appsrc->tsdemux->h264parse->omxh264dec->videoconvert->fbdevsink
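
For reference, here is a stripped-down sketch of how I build and link it in C
(error handling omitted; only the element names are taken from my real code):

#include <gst/gst.h>

static GstElement *pipeline, *appsrc, *demux, *parse, *dec, *conv, *sink;

static void
build_pipeline (void)
{
  pipeline = gst_pipeline_new ("usb-video");
  appsrc   = gst_element_factory_make ("appsrc",       "src");
  demux    = gst_element_factory_make ("tsdemux",      "demux");
  parse    = gst_element_factory_make ("h264parse",    "parse");
  dec      = gst_element_factory_make ("omxh264dec",   "dec");
  conv     = gst_element_factory_make ("videoconvert", "conv");
  sink     = gst_element_factory_make ("fbdevsink",    "sink");

  gst_bin_add_many (GST_BIN (pipeline),
                    appsrc, demux, parse, dec, conv, sink, NULL);

  /* appsrc -> tsdemux can be linked statically... */
  gst_element_link (appsrc, demux);

  /* ...but tsdemux only creates its video src pad at runtime, so
   * tsdemux -> h264parse is linked in a "pad-added" callback (below). */
  gst_element_link_many (parse, dec, conv, sink, NULL);
}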

I've created "need-data" and "enough-data" callbacks for appsrc.
In "need-data" callback I push buffer, collected from my usb device to
appsrc with "push-buffer" signal.
I've linked dynamically "tsdemux" and "h264parse"
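
This is roughly what my callbacks look like (simplified; read_usb_chunk() is just
a placeholder for my actual USB reading code, whose chunk size varies from read to
read):

/* Placeholder for my USB reading code: fills dst and returns the number
 * of bytes read, which is different on every call. */
extern gsize read_usb_chunk (guint8 *dst, gsize maxlen);

static void
on_need_data (GstElement *src, guint unused_size, gpointer user_data)
{
  guint8 data[4096];
  gsize  len = read_usb_chunk (data, sizeof (data));

  GstBuffer *buf = gst_buffer_new_allocate (NULL, len, NULL);
  gst_buffer_fill (buf, 0, data, len);

  GstFlowReturn ret;
  g_signal_emit_by_name (src, "push-buffer", buf, &ret);
  gst_buffer_unref (buf);   /* push-buffer does not take ownership */
}

/* tsdemux creates its video src pad only once it has seen the stream,
 * so tsdemux -> h264parse is linked here. */
static void
on_pad_added (GstElement *demux_elem, GstPad *new_pad, gpointer user_data)
{
  GstElement *parser  = GST_ELEMENT (user_data);
  GstPad     *sinkpad = gst_element_get_static_pad (parser, "sink");

  if (!gst_pad_is_linked (sinkpad))
    gst_pad_link (new_pad, sinkpad);

  gst_object_unref (sinkpad);
}

static void
connect_callbacks (void)
{
  g_signal_connect (appsrc, "need-data", G_CALLBACK (on_need_data), NULL);
  g_signal_connect (demux,  "pad-added", G_CALLBACK (on_pad_added), parse);
}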

If I understand correctly, data flows through my pipeline the following way:
1) When "need-data" is called, my application pushes data collected from the
USB device into appsrc.
2) This data flows through the "tsdemux" demuxer, which outputs only the video
elementary stream on its src pad.
3) "h264parse" then parses this video into frames and sends them to
"omxh264dec".
4) The decoder outputs raw video (video/x-raw) and sends it to "videoconvert",
which converts it to a format my framebuffer accepts.

But the data buffers I read from my USB device are not the same size on every
read.
So, if I understand correctly, I have to use buffering somewhere, but I don't
understand where to insert it in my pipeline, or how to tune it properly.
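
My current guess is something like the sketch below: a queue between tsdemux and
h264parse plus limits on appsrc's internal queue. The element and property names
are real, but the limit values are ones I made up, and I don't know whether this
is the right place for buffering at all:

/* My guess only: hold up to ~200 ms of demuxed video in a queue and cap
 * appsrc's internal queue so my USB reader gets throttled by the pipeline.
 * The numeric limits here are arbitrary. */
static GstElement *
make_buffering_guess (void)
{
  GstElement *queue = gst_element_factory_make ("queue", "vqueue");

  g_object_set (queue,
                "max-size-buffers", 0,   /* 0 = no limit on buffer count */
                "max-size-bytes",   0,   /* 0 = no limit on bytes        */
                "max-size-time",    (guint64) (200 * GST_MSECOND),
                NULL);

  /* Limit appsrc's own queue and make "push-buffer" block when it is full. */
  g_object_set (appsrc,
                "max-bytes", (guint64) (512 * 1024),
                "block",     TRUE,
                "is-live",   TRUE,
                "format",    GST_FORMAT_TIME,
                NULL);

  /* The queue would then be added to the pipeline and linked as
   * tsdemux -> queue -> h264parse in the pad-added callback. */
  return queue;
}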

Can someone help me?
And please correct me if I've got the data flow wrong.
Thank you.



