Decode AVPacket

Decode AVPacket

fre deric
Hi,
my goal is to demux video in libav and decode it in GStreamer.

My approach is to take AVPackets from the video stream in a first thread
and send them to a GStreamer pipeline in a second thread. The important
parts of the code are:

-- THREAD 1 --
// Take the compressed data from the AVPacket.
img_data = (guchar *) packet.data;
size = packet.size;
// Create a GstBuffer of the same size and copy the data in.
buffer = gst_buffer_new_allocate(NULL, size, NULL);
gst_buffer_map(buffer, &map, GST_MAP_WRITE);
memcpy(map.data, img_data, size);
gst_buffer_unmap(buffer, &map);
// Send the buffer to the appsrc element in the pipeline.
gstret = gst_app_src_push_buffer(GST_APP_SRC(app_source), buffer);

-- THREAD 2 --
// Caps for the appsrc element
const gchar *video_caps = "video/x-theora, width=1920, height=1080, framerate=30/1";
// Pipeline description
string = g_strdup_printf("appsrc name=testsource caps=\"%s\" ! theoradec ! videoconvert ! autovideosink", video_caps);

However, I get this error:
"
ERROR from element theoradec0: Could not decode stream.
Debugging info: gsttheoradec.c(812): theora_handle_data_packet ():
/GstPipeline:pipeline0/GstTheoraDec:theoradec0: no header sent yet
"
where theoradec0 is the name of the theoradec element.

Why is the theoradec element not able to decode the video stream? Does "no
header sent yet" mean that I am sending data to the pipeline in the wrong
format?

Tested video file:
container: ogg
codec: Theora
dim: 1920x1080
framerate: 30fps


I also tested a version where the AVPacket is decoded to an AVFrame by
libav and the raw frame is then sent to the GStreamer pipeline; that works:

-- THREAD 1 --
// Decode (avcodec_decode_video2 is the old, now-deprecated API).
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
img_data = (guchar *) pFrame->data[0];
// ... then the same way as before.

-- THREAD 2 --
const gchar *video_caps = "video/x-raw, format=BGR, width=1920, height=1080, framerate=30/1";
string = g_strdup_printf("appsrc name=testsource caps=\"%s\" ! videoconvert ! autovideosink", video_caps);

Thank you.





_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Decode AVPacket

Edward Hervey-4
Hi,

  Any reason why you don't just use the existing GStreamer demuxer elements?
That would solve a ton of issues.
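For reference, the all-GStreamer version of the same playback can be tried from the command line (the file name is a placeholder); oggdemux forwards the Theora header packets itself, so theoradec gets what it needs:

```shell
# Demux and decode the Ogg/Theora file entirely inside GStreamer.
gst-launch-1.0 filesrc location=video.ogg ! oggdemux ! theoradec ! \
    videoconvert ! autovideosink
```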

  BR,

    Edward

On Wed, 2020-12-30 at 12:17 -0600, fre deric wrote:

> Hi,
> my goal is to demux video in libav and decode it in GStreamer.
> [...]


Re: Decode AVPacket

Sebastian Dröge-3
In reply to this post by fre deric
On Wed, 2020-12-30 at 12:17 -0600, fre deric wrote:
> Hi,
> my goal is to demux video in libav and decode it in GStreamer.
>
> My approach is to take AVPacket from the video stream in the first thread
> and send it to GStreamer pipeline in the second thread. Important parts of
> code are here: [...]

Apart from what Edward already replied, the code in gst-libav should be helpful to you.
Specifically, everything in this file:

-- 
Sebastian Dröge, Centricular Ltd · https://www.centricular.com


Re: Decode AVPacket

fre deric
In reply to this post by Edward Hervey-4
Hi,
I need to read the video file with libav.
So you suggest demuxing the video stream in the GStreamer pipeline, i.e.
having a pipeline like this:

appsrc ! decodebin ! videoconvert ! autovideosink
or
appsrc ! oggdemux ! theoradec ! videoconvert ! autovideosink

However, that also means I cannot send AVPackets. Should I send the
AVStream? How would I split the AVStream, and how should the appsrc caps be
set for such data? Thank you for your time.




--
Sent from: http://gstreamer-devel.966125.n4.nabble.com/
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
Reply | Threaded
Open this post in threaded view
|

Re: Decode AVPacket

fre deric
In reply to this post by Sebastian Dröge-3
Thank you!
gst-libav helped me a lot. The final problem was in the video caps: they
also required a "codec_data" parameter.


