How to capture a yuv frame from h.264 stream


How to capture a yuv frame from h.264 stream

glenne
I am successfully receiving a stream on Android via the following pipeline,
in the hope of getting minimal latency:

rtspsrc location=rtsp://10.0.1.155:8554/test latency=0 ! rtpjitterbuffer
drop-on-latency=false latency=1 ! rtph264depay ! avdec_h264 ! videoconvert !
autovideosink sync=false

1. I would like to grab a yuv frame periodically to do some AR processing.
What is the best way to grab a frame for processing?

2. Do these options make sense and are there suggestions for improvement?
Do hw h.264 encoders automatically get utilized if present?


--
Glenn




Re: How to capture a yuv frame from h.264 stream

Michael MacIntosh
> 1. I would like to grab a yuv frame periodically to do some AR processing.
> What is the best way to grab a frame for processing?
Most sinks have a property called "last-sample"; however, autovideosink is
a bin, and you would need to iterate through it at runtime to find the
actual sink.

For instance, you might use:

GstIterator *sink_iterator = gst_bin_iterate_sinks (GST_BIN (your_pipeline));
GValue item = G_VALUE_INIT;

while (gst_iterator_next (sink_iterator, &item) == GST_ITERATOR_OK)
{
    /* you only have one sink here */
}

Once you have the actual video sink, you can use
g_object_get (sink, "last-sample", &samp, NULL); to get your sample for
processing.
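
To then get at the raw YUV data, the returned sample can be mapped with the
GstVideoFrame helpers. A minimal sketch, assuming the negotiated format is a
planar YUV layout such as I420 (process_yuv_sample is just a hypothetical
helper name, not something from the pipeline above):

#include <gst/video/video.h>

/* Hypothetical helper: map a sample obtained via "last-sample" and read its
 * YUV planes. g_object_get returned a reference, so the sample is unreffed
 * here once we are done with it. */
static void
process_yuv_sample (GstSample *sample)
{
  GstCaps *caps = gst_sample_get_caps (sample);
  GstBuffer *buffer = gst_sample_get_buffer (sample);
  GstVideoInfo info;
  GstVideoFrame frame;

  if (gst_video_info_from_caps (&info, caps) &&
      gst_video_frame_map (&frame, &info, buffer, GST_MAP_READ)) {
    /* For I420: plane 0 = Y, plane 1 = U, plane 2 = V */
    guint8 *y_plane = GST_VIDEO_FRAME_PLANE_DATA (&frame, 0);
    gint y_stride = GST_VIDEO_FRAME_PLANE_STRIDE (&frame, 0);

    /* ... run the AR processing on the mapped planes here ... */

    gst_video_frame_unmap (&frame);
  }
  gst_sample_unref (sample);
}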

> 2. Do these options make sense and are there suggestions for improvement?
> Do hw h.264 encoders automatically get utilized if present?

Hardware H.264 encoders do not get used automatically; as far as I know
there are no bins that autoplug encoders (I could be wrong on this).
However, if you use decodebin and want to use a hardware H.264 decoder, it
should be autoplugged, because hardware decoders generally have a high rank
(if they are working).
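
For example, an untested variant of your original pipeline that keeps the
depayloader and only lets decodebin pick the decoder would be:

rtspsrc location=rtsp://10.0.1.155:8554/test latency=0 ! rtpjitterbuffer
drop-on-latency=false latency=1 ! rtph264depay ! decodebin ! videoconvert !
autovideosink sync=false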

Hope that helps!




Re: How to capture a yuv frame from h.264 stream

glenne
This is very helpful.  I will look into the enable-last-buffer option of
autovideosink.  Is appsink an option for this as well?

Regarding hardware decoding, are you suggesting that replacing 'rtph264depay
! avdec_h264' with 'decodebin' would use hardware in some cases?  Is any
additional configuration needed on my end for this to happen?




Re: How to capture a yuv frame from h.264 stream

glenne
In reply to this post by Michael MacIntosh
The iteration code ends up with an item of type GValue.  How do I take that
and end up with a sink reference I can pass to g_object_get?




Re: How to capture a yuv frame from h.264 stream

glenne
Here is my current iteration code and an attempt to get an image.  My stream
is "videotestsrc ! autovideosink enable-last-buffer=true"

Unfortunately, I always get NULL for from_sample after calling g_object_get.
Do I have an error in my use of item or does autovideosink not pass on the
enable-last-buffer setting to the sink?


static void take_pic(CustomData *data) {
    GstIterator *sink_iterator = gst_bin_iterate_sinks (GST_BIN (data->pipeline));
    GValue item = G_VALUE_INIT;
    GstSample *from_sample = NULL;

    while (gst_iterator_next (sink_iterator, &item) == GST_ITERATOR_OK)
    {
    }

    GstElement *sink = GST_ELEMENT (g_value_get_object (&item));
    g_object_get (sink, "last-sample", &from_sample, (char*) NULL);
    if (from_sample == NULL) {
        GST_ERROR ("Error getting last sample from sink");
        return;
    }

    GError *err = NULL;
    GstCaps *caps = gst_caps_from_string ("image/jpeg");
    GstSample *to_sample = gst_video_convert_sample (from_sample, caps,
                                                     GST_CLOCK_TIME_NONE, &err);

    if (to_sample == NULL) {
        GST_ERROR ("Error converting video sample");
        return;
    }

    /* ... use to_sample ... */
}




Re: How to capture a yuv frame from h.264 stream

Michael MacIntosh
Hello,

You could try changing your while loop to an if and wrapping your
last-sample code inside it; you could be getting a null item because the
loop only exits once gst_iterator_next() returns GST_ITERATOR_DONE, which
leaves the item empty.  You could also verify whether sink is NULL.  Are you
getting any error output?
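
A minimal sketch of that while-to-if change, assuming the pipeline's only
sink is the one you want (and noting, as earlier in the thread, that the
element found may itself be a bin such as autovideosink, in which case the
same iteration would have to be repeated on it to reach the real sink):

GstIterator *sink_iterator = gst_bin_iterate_sinks (GST_BIN (data->pipeline));
GValue item = G_VALUE_INIT;
GstSample *from_sample = NULL;

/* Take the first sink instead of looping until GST_ITERATOR_DONE, which
 * leaves 'item' empty by the time the while loop exits. */
if (gst_iterator_next (sink_iterator, &item) == GST_ITERATOR_OK) {
    GstElement *sink = GST_ELEMENT (g_value_get_object (&item));
    if (sink != NULL)
        g_object_get (sink, "last-sample", &from_sample, NULL);
    g_value_reset (&item);
}
g_value_unset (&item);
gst_iterator_free (sink_iterator);

If from_sample comes back non-NULL, the rest of take_pic() can stay as it is.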

As far as I know, autovideosink doesn't have an enable-last-buffer
property (just looking at it through gst-inspect).  However,
enable-last-sample defaults to TRUE for anything inheriting from basesink
(unless I am mistaken).

Cheers,
Michael.


