Grabbing an image frame from a running pipeline on demand


Grabbing an image frame from a running pipeline on demand

Charlie Turner
Hi all,

My high-level problem is that I'm generating a stream, and at certain
points in time, I would like to take a screenshot from the camera,
ensuring I don't drop any frames in my stream while this happens. More
details follow.

I currently have a pipeline that generates MPEG-TS streams, for
completeness it looks like this,

mpegtsmux name=mux ! multifilesink next-file=5 max-file-duration=5 location=%05d.ts \
  v4l2src device=/dev/video0 ! video/x-raw,width=1280,height=720 ! videoconvert ! \
      x264enc tune=zerolatency cabac=false speed-preset=ultrafast name=x264enc ! queue ! mux. \
  autoaudiosrc ! audioconvert ! audioresample ! lamemp3enc cbr=true target=bitrate ! \
      mpegaudioparse ! mux.

This pipeline is always running in my app. This point will be
important in a moment.

What I need to do is at an application-determined point in time, save
one frame from v4l2src device as a JPEG.

What I'd love to be able to do is just launch a separate pipeline
thusly,

v4l2src device=/dev/video0 num-buffers=1 ! jpegenc ! filesink location="thumbnail.jpg"

Unfortunately, my hardware drivers only allow me to open the video
device once (don't ask :))

So what options do I have,

   1) I first considered a data probe attached to the src of my
     camera. What it does is try to map a buffer from the received
     pad, and if that is successful, save a JPEG image using the raw
     map.data. This feels really nasty, and I doubt I'm making good
     use of GStreamer here. The other problem with this approach is
     that I need a user-defined hook to capture a frame. I don't want
     to save every frame, so to support this case, it seems like I
     need to have a global somewhere that my app can set, such that
     when the variable is true it enables capturing a frame, and when
     false, the probe just passes the data along.
   2) Another possibility is using the dynamic pipeline support. But no
     matter how I looked at that, I couldn't convince myself I would
     not drop any frames.
   3) I considered using a combination of the 'tee' and 'valve'
      elements. Similar to option 1), it seems like there would have to
      be some global state somewhere that the user code could set to
      turn the valve on. The problem here is I just want a frame, I
      would have to figure out some way of knowing that just one frame
      got encoded, and then turn the valve off. This seems like a bad
      approach.

Could anyone give me a pointer on what a sensible approach to this
issue might be in the GStreamer framework? I've had a good rummage
through the excellent docs, but I feel like I'm missing something.

Thanks for your time,


--
Kind regards,
Charlie Turner

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Grabbing an image frame from a running pipeline on demand

Michael Gruner
Hi Charlie

We have dealt with similar requirements in the past. After trying several approaches, we developed an open source solution that might fit your needs: GstInterpipes.

GstInterpipes was designed so that no buffers are lost while you take your snapshot at any time. I believe there is an example of that precise use case in the wiki.

Hope it helps!
Michael

Michael Gruner <[hidden email]>
Embedded Linux and GStreamer solutions
RidgeRun Engineering

On Jun 8, 2016, at 10:02, Charlie Turner <[hidden email]> wrote:

> [original message quoted in full; trimmed]

Re: Grabbing an image frame from a running pipeline on demand

Nicolas Dufresne-4
In reply to this post by Charlie Turner
Le mercredi 08 juin 2016 à 17:02 +0100, Charlie Turner a écrit :
>
> v4l2src device=/dev/video0 num-buffers=1 ! jpegenc ! filesink
> location="thumbnail.jpg"

You could:

v4l2src ! tee name=t
  t. ! queue ! some-pipeline
  t. ! queue ! fakesink

And another pipeline
  
  appsrc ! jpegenc ! filesink ...

At any point, you can read the "last-sample" property from the fakesink
and push that sample into the appsrc (followed by an EOS).

Nicolas

Re: Grabbing an image frame from a running pipeline on demand

Charlie Turner
Hi Michael & Nicolas,

Thanks for the pointers. The fakesink approach appeals more to me because it seems simpler, but I'm having a very elementary problem with it!

If I run a "tee'd" pipeline like this,

 gst-launch-1.0 -e v4l2src  device=/dev/video0 ! tee name=t    \
     t. ! queue ! x264enc ! filesink location=fakesink_test.raw     \
     t. ! queue ! fakesink

I see the following output in my terminal,

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
  C-c C-chandling interrupt.             # Pressed C-c first time
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
  C-c C-chandling interrupt.            # Pressed C-c second time, something seems to have gone wrong :(
Interrupt: Stopping pipeline ...
Interrupt while waiting for EOS - stopping pipeline...
Execution ended after 0:00:03.109734115
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

And when I look at the "fakesink_test.raw" file, I see that it's zero bytes. Something appears to be getting blocked in the above minimal example.

If I remove the x264enc element, the pipeline works as I expect it to, producing a suitably large file,

$ gst-launch-1.0 -e v4l2src  device=/dev/video0 ! tee name=t       t. ! queue ! filesink location=fakesink_test.raw         t. ! queue ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
  C-c C-chandling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 0:00:01.653899816
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Any idea what I've done wrong in the first pipeline? I appear to be following the same pattern as the example in the tee docs.

Thanks in advance,

On 8 June 2016 at 18:57, Nicolas Dufresne <[hidden email]> wrote:

> [quoted reply trimmed]



--
Kind regards,
Charlie Turner


Re: Grabbing an image frame from a running pipeline on demand

Tim Müller
On Thu, 2016-06-09 at 11:23 +0100, Charlie Turner wrote:

Hi Charlie,

> If I run a "tee'd" pipeline like this,
>
>  gst-launch-1.0 -e v4l2src  device=/dev/video0 ! tee name=t    \
>      t. ! queue ! x264enc ! filesink location=fakesink_test.raw     \
>      t. ! queue ! fakesink
>
> I see the following output in my terminal,
>  (snip)
> And when I look at the "fakesink_test.raw" file, I see that it's zero bytes. Something appears to be getting blocked in the above minimal example.
>
> If I remove the x264enc element, the pipeline works as I expect it to, producing a suitably large file,

The problem is that x264enc with default settings consumes about 3 seconds of video before outputting anything, but a queue's default capacity is only ~1 second. This means the queue on the fakesink branch fills up and blocks, so the x264enc branch never receives enough data for x264enc to output a buffer (and let the pipeline as a whole preroll).

You can work around this by making the queue before fakesink unlimited in size:

  ! queue max-size-bytes=0 max-size-time=0 max-size-buffers=0 ! fakesink

or by configuring x264enc differently, e.g.

  x264enc tune=zerolatency

(other settings will also help; this is just the easiest, but it will affect quality)

Cheers
 -Tim

-- 
Tim Müller, Centricular Ltd - http://www.centricular.com


Re: Grabbing an image frame from a running pipeline on demand

Charlie Turner
Thanks Tim! That got me unblocked. May I ask hopefully one final question? :)

I am now trying to add the "appsrc name=thumbnailer ! jpegenc ! filesink location=thumb.jpg " pipeline to my now working pipeline with the fakesink.

The following launch line works fine, 

$ gst-launch-1.0 -e v4l2src device=/dev/video1 ! tee name=t t. ! queue ! videoconvert ! videorate ! x264enc ! mpegtsmux ! filesink location=test_stream.ts t. ! queue max-size-bytes=0 max-size-time=0 max-size-buffers=0 ! fakesink name=snapshotter   appsrc name=thumbnailer ! jpegenc ! filesink location=thumb.jpg
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
  C-c C-chandling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 0:00:03.865664400
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

But actually I'm trying to replicate this in a C program, and despite launching the exact same pipeline, it hangs indefinitely before getting into the PLAYING state (sometimes I get a libjpeg EOS error). I'll attach the whole program just in case this snippet doesn't describe enough context, but here's what I'm doing,

int main(...) {
  ...
  descr =
    g_strdup_printf ("v4l2src device=/dev/video1 ! tee name=t "
    "t. ! queue ! videoconvert ! videorate ! x264enc ! mpegtsmux ! filesink location=test_stream.ts "
    "t. ! queue max-size-bytes=0 max-size-time=0 max-size-buffers=0 ! fakesink name=snapshotter  "
    " appsrc name=thumbnailer ! jpegenc ! filesink location=thumb.jpg ");

  printf("launch line: %s\n", descr);
  pipeline = gst_parse_launch (descr, &error);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* wait until it's up and running or failed    << THIS CALL HANGS SOMETIMES >> */
  if (gst_element_get_state (pipeline, NULL, NULL, -1) == GST_STATE_CHANGE_FAILURE) {
    g_error ("Failed to go into PLAYING state");
  }

  g_print ("Running ...\n");
  g_main_loop_run (loop);
  ...
}

$ gcc fakesink_test.c $(pkg-config --cflags gtk+-3.0 gstreamer-1.0) -o fake_sink_test $(pkg-config --libs gstreamer-1.0 gtk+-3.0)
charlie:[gstreamer]$ ./fake_sink_test
launch line: v4l2src device=/dev/video1 ! tee name=t t. ! queue ! videoconvert ! videorate ! x264enc ! mpegtsmux ! filesink location=test_stream.ts t. ! queue max-size-bytes=0 max-size-time=0 max-size-buffers=0 ! fakesink name=snapshotter   appsrc name=thumbnailer ! jpegenc ! filesink location=thumb.jpg 
// tumbleweed
  C-c C-c


If I copy and paste the printf output labelled "launch line" above into gst-launch, I don't see a stall. There must be something wrong with my state management, but I can't think what else I should do.

Thanks very much for your help,


On 9 June 2016 at 11:48, Tim Müller <[hidden email]> wrote:

> [quoted reply trimmed]




--
Kind regards,
Charlie Turner


Attachment: fakesink_test.c (2K)