Hello,

I’m currently developing an application to display live TV using GStreamer on a Raspberry Pi 2B. For this I use two appsrc elements (one for video and one for audio), which are fed from the V4L DVB demux device ‘/dev/dvb/adapter0/demux0’.

My pipelines are:
Video:
    V4L DVB demux (DMX_OUT_TAP) -> appsrc ! h264parse ! v4l2h264dec ! queue ! kmssink

Audio:
    V4L DVB demux (DMX_OUT_TAP) -> appsrc ! mpegaudioparse ! mpg123audiodec ! queue ! alsasink
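Written as gst_parse_launch descriptions, the GStreamer part of the two pipelines looks roughly like this (a sketch only: the element names `vsrc`/`asrc` and the `is-live`/`format` properties shown here are placeholders, not necessarily what my application actually sets):

```
appsrc name=vsrc is-live=true format=time ! h264parse ! v4l2h264dec ! queue ! kmssink
appsrc name=asrc is-live=true format=time ! mpegaudioparse ! mpg123audiodec ! queue ! alsasink
```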
I’ve created a thread which reads the output of the two demux devices (video and audio) and pushes it into the corresponding appsrc element of each pipeline:
    uint8_t vbuf[65536];
    uint8_t abuf[4096];
    ssize_t vrc, arc;
    GstBuffer *gvbuffer, *gabuffer;
    GstFlowReturn vrb, arb;

    /* Read data from the demux devices */
    vrc = read(vdfd, vbuf, sizeof(vbuf));
    arc = read(adfd, abuf, sizeof(abuf));

    /* Create new empty buffers and fill the data into them */
    gvbuffer = gst_buffer_new_allocate(NULL, vrc, NULL);
    gst_buffer_fill(gvbuffer, 0, vbuf, vrc);
    gabuffer = gst_buffer_new_allocate(NULL, arc, NULL);
    gst_buffer_fill(gabuffer, 0, abuf, arc);

    /* Push the buffers into the appsrc elements */
    g_signal_emit_by_name(((CustomData *)data)->vappsrc, "push-buffer", gvbuffer, &vrb);
    g_signal_emit_by_name(((CustomData *)data)->aappsrc, "push-buffer", gabuffer, &arb);

    /* Free the buffers now that we are done with them */
    gst_buffer_unref(gvbuffer);
    gst_buffer_unref(gabuffer);
This works quite well and I get both video and audio. The only thing that doesn’t work as expected is that video and audio are not in sync.

What am I missing here, and how do I get video and audio synchronized?
Thanks for any advice.

Best regards,
Joerg