encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

constantine.elster
Hi devs,

I'm trying to construct a pipeline that captures frames from a USB camera (YUV), encodes them with the HW encoder, and saves the result to a file. My setup is an i.MX6 board running Ubuntu 18.04 on a 4.20 mainline kernel.

When I try a software encoder it works, albeit with very high (100%) CPU usage. The working pipeline based on software plugins:
gst-launch-1.0 -v v4l2src device="/dev/video2" num-buffers=200 ! "video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)45/1, colorimetry=bt709" ! videoconvert ! x264enc ! mp4mux ! filesink location=aha.mp4

My attempt to replace the software-based plugins with HW-based ones and efficient memory management:
gst-launch-1.0 -v v4l2src device="/dev/video9" num-buffers=200 ! "video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)45/1, colorimetry=bt709" ! v4l2video8convert output-io-mode=dmabuf-import ! v4l2h264enc output-io-mode=dmabuf-import ! mp4mux ! filesink location=aha.mp4

I get the following error: "WARNING: erroneous pipeline: could not link v4l2h264enc0 to mp4mux0"

I would appreciate any ideas on how to understand what's wrong, how to debug it, and how to make it work.

Thank you very much,
  -- Constantine.




Re: encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

Milian Wolff
On Sunday, 19 January 2020 11:06:55 CET, Constantine Elster wrote:

> I get the following error: "WARNING: erroneous pipeline: could not link
> v4l2h264enc0 to mp4mux0"
To debug, I suggest you compare the SRC pad caps from `gst-inspect-1.0 v4l2h264enc` with
the SINK pad caps from `gst-inspect-1.0 mp4mux`. My guess is that you may be able to fix
the issue by adding an `h264parse` element in between to fix the alignment,
since `mp4mux` requires `au` alignment, whereas the encoder may output `nal`-aligned
frames?
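
For example, something like this (a sketch; the grep filters are just an assumption about how the gst-inspect-1.0 output is laid out on your version):

   gst-inspect-1.0 v4l2h264enc | grep -A 8  "SRC template"
   gst-inspect-1.0 mp4mux      | grep -A 20 "SINK template"

If the encoder only advertises alignment=nal or stream-format=byte-stream while mp4mux wants alignment=au and stream-format=avc, an `h264parse` in between should bridge the two.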

Good luck

--
Milian Wolff | [hidden email] | Senior Software Engineer
KDAB (Deutschland) GmbH, a KDAB Group company
Tel: +49-30-521325470
KDAB - The Qt, C++ and OpenGL Experts

Re: encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

constantine.elster
Thank you very much, Milian! It looks better with h264parse. I'm still getting an error, though now a different one.

I added h264parse to the pipeline:
gst-launch-1.0 -vvv v4l2src device="/dev/video9" num-buffers=200 ! "video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1, colorimetry=bt709" ! v4l2video8convert output-io-mode=dmabuf-import ! v4l2h264enc output-io-mode=dmabuf-import ! h264parse ! mp4mux ! filesink location=aha.mp4

Output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/v4l2video8convert:v4l2video8convert0.GstPad:src: caps = video/x-raw, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)I420, width=(int)640, height=(int)480, colorimetry=(string)bt709
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)baseline, level=(string)4, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)baseline, level=(string)4, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
Redistribute latency...
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:sink: caps = video/x-raw, framerate=(fraction)60/1, interlace-mode=(string)progressive, format=(string)I420, width=(int)640, height=(int)480, colorimetry=(string)bt709
/GstPipeline:pipeline0/v4l2video8convert:v4l2video8convert0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)60/1, colorimetry=(string)bt709, interlace-mode=(string)progressive
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason error (-5)

Execution ended after 0:00:00.436501205
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Any hint on debugging would be appreciated!
Thank you,
  -- Constantine.






Re: encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

Nicolas Dufresne-5


On Sun, 19 Jan 2020 at 06:15, Constantine Elster <[hidden email]> wrote:
I get the following error: "WARNING: erroneous pipeline: could not link v4l2h264enc0 to mp4mux0"

While reading this, I was worried you had hit a much more serious issue. Short story: you need h264parse between the encoder and the muxer.

The HW encoder produces H.264 in Annex B format (with start codes), but ISO MP4 requires AVCC format; the h264parse element can negotiate and convert this for you.
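
A minimal sketch of the fix, reusing the device path and a shortened version of the caps from your earlier command (adjust both for your setup):

   gst-launch-1.0 -v v4l2src device=/dev/video9 num-buffers=200 ! \
       "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" ! \
       v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=aha.mp4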



Re: encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

Nicolas Dufresne-5
On Sunday, 19 January 2020 at 15:12 +0200, Constantine Elster wrote:

> ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
> Additional debug info:
> gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
> streaming stopped, reason error (-5)

It looks like the real error was not propagated (let me know which version you are
using; Ubuntu tends to be far behind, or to only cherry-pick fixes rather than
track stable branches).

To find out more, set the GST_DEBUG="v4l2*:7" environment variable. That will give you a lot
more detail. From experience, I suspect that output-io-mode=dmabuf-import is
faulty on the v4l2convert element (on newer GStreamer we now have a fixed name
for the converter too).

The i.MX6 platform (and i.MX8 too) does not include an IOMMU. On the other
side, USB cameras (like UVC) produce scattered memory (virtual memory). That
makes importing dmabufs from UVC into v4l2convert impossible in this case. You
may want to try the other way around and set io-mode=dmabuf-import on the
v4l2src element. If that does not work, you'll have no choice but to let the
pipeline make a copy (the default setting). Normally, I suggest first making it
work without touching these advanced settings, and then tweaking to gain more
performance.
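
For example, something along these lines (a sketch; the GST_DEBUG syntax and the io-mode property are standard v4l2src/GStreamer, but whether the driver accepts the import is exactly what you would be testing, and the caps are shortened from your command):

   # Capture verbose v4l2 logs (stderr) while trying dmabuf import on the source side
   GST_DEBUG="v4l2*:7" gst-launch-1.0 -v \
       v4l2src device=/dev/video9 io-mode=dmabuf-import num-buffers=200 ! \
       "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" ! \
       v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux ! \
       filesink location=aha.mp4 2> v4l2-debug.log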


Re: encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

constantine.elster
Hi Nicolas,

Thank you very much!! 

I removed output-io-mode=dmabuf-import from v4l2video8convert and v4l2h264enc elements and got the pipeline working!
This pipeline worked for me: gst-launch-1.0 -vvv v4l2src device="/dev/video9" num-buffers=1000 ! "video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)45/1, colorimetry=bt709" ! v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=aha.mp4

I get about 15% CPU usage with the above pipeline, though I get a few dropped frames in the output file.

So, as a next step I want to perform a few optimizations.
1) I set io-mode=dmabuf-import on the v4l2src element, though I get this error:
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to allocate required memory.
Additional debug info:
gstv4l2src.c(658): gst_v4l2src_decide_allocation (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Buffer pool activation failed


So, I presume zero-copy is not possible?

2) I would like to drop frames as close as possible to the v4l2 source, to achieve 10 fps at the input of the encoder. However, my v4l2 source accepts only 45 and 60 fps configurations. Is there an elegant way to reduce the frame rate like this?

3) Any other recommendations I can pursue to optimize the pipeline?

Thank you very very much,
  -- Constantine.



Re: encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

Nicolas Dufresne-5
On Monday, 20 January 2020 at 20:13 +0200, Constantine Elster wrote:

> Hi Nicolas,
>
> Thank you very much!!
>
> I removed output-io-mode=dmabuf-import from v4l2video8convert and v4l2h264enc
> elements and got the pipeline working!
> This pipeline worked for me: gst-launch-1.0 -vvv v4l2src device="/dev/video9"
> num-buffers=1000 ! "video/x-raw, format=(string)UYVY, width=(int)640,
> height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)45/1,
> colorimetry=bt709" ! v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux !
> filesink location=aha.mp4
>
> I get about 15% CPU usage with the above pipeline, though I get a few dropped
> frames in the output file.

That might go away with a 'queue' between v4l2src and v4l2video8convert.
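
For example (a sketch based on the pipeline quoted above, with the caps shortened; the default queue settings should be a reasonable starting point):

   gst-launch-1.0 -v v4l2src device=/dev/video9 num-buffers=1000 ! \
       "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" ! \
       queue ! v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=aha.mp4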

>
> So, as a next step I want to perform a few optimizations.
> 1) I set the io-mode=dmabuf-import on the v4l2src element though I get this
> error:
> ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to
> allocate required memory.
> Additional debug info:
> gstv4l2src.c(658): gst_v4l2src_decide_allocation ():
> /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
> Buffer pool activation failed
>
> So, I presume zero-copy is not possible?

I guess not, indeed. It would be nice to learn more (GST_DEBUG="v4l2*:7"). Though I
know the UVC driver is not flexible at all. Also, the buffer-importation code isn't safe
in older releases of GStreamer. On the good side, the copy is made with a
virtually allocated buffer as the source, so it is fairly fast.

>
> 2) I would like to drop frames as close as possible to the v4l2 source, to
> achieve 10 fps at the input of the encoder. However, my v4l2 source accepts only 45
> and 60 fps configurations. Is there an elegant way to reduce the frame rate
> like this?

I'd use "videorate drop-only=0 ! <caps filter with framerate=10/1>".
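
As a sketch, dropped into the working pipeline right after the source caps (the 10/1 caps filter and its placement are assumptions about where you want the rate reduced, and the caps are shortened):

   gst-launch-1.0 -v v4l2src device=/dev/video9 ! \
       "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" ! \
       videorate ! "video/x-raw, framerate=10/1" ! \
       queue ! v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=aha.mp4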

>
> 3) Any other recommendations I can pursue to optimize the pipeline?

You can try to enable zero-copy between v4l2video8convert and v4l2h264enc. Two
options:

   v4l2video8convert capture-io-mode=dmabuf-import ! v4l2h264enc
   v4l2video8convert ! v4l2h264enc output-io-mode=dmabuf-import
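
For instance, the second option dropped into the full pipeline would look like this (a sketch with shortened caps; whether the importation actually works depends on your driver, so treat it as something to try):

   gst-launch-1.0 -v v4l2src device=/dev/video9 num-buffers=1000 ! \
       "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" ! \
       queue ! v4l2video8convert ! v4l2h264enc output-io-mode=dmabuf-import ! \
       h264parse ! mp4mux ! filesink location=aha.mp4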


Re: encoding pipeline from v4l2 usb camera with v4l2video8convert and v4l2h264enc

constantine.elster
Thank you very much Nicolas!

The videorate element worked for me. Also, "v4l2video8convert ! v4l2h264enc output-io-mode=dmabuf-import" helped to reduce the CPU usage by 2-3%.

I think I am ok with what I have for now, thank you very much!

Having said that, I can proceed further with GST_DEBUG="v4l2*:7" to see whether zero-copy is possible with my USB camera source.
Please advise if you are okay with that. Which exact log lines would be interesting to observe?
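
I was thinking of something like this (a sketch; GST_DEBUG_FILE is the standard GStreamer way to redirect the debug log, and the rest reuses the working pipeline with the source-side import enabled and the caps shortened):

   GST_DEBUG="v4l2*:7" GST_DEBUG_FILE=v4l2-import.log \
       gst-launch-1.0 v4l2src device=/dev/video9 io-mode=dmabuf-import num-buffers=100 ! \
       "video/x-raw, format=UYVY, width=640, height=480, framerate=45/1" ! \
       queue ! v4l2video8convert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=aha.mp4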

Thank you very much,
  -- Constantine.


