Unable to use opengl elements on Raspberry Pi - gstreamer 1.8, openmax 1.0


Adam Langley
Hi,

I'm running Raspbian Jessie Lite, with GStreamer 1.8 that I've compiled myself, and gst-omx 1.0.
I followed these instructions to disable X and enable EGL, but I am still having an issue.
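
As far as I can tell, the GL backend can also be selected explicitly via environment variables before launching the pipeline. This is only a sketch of what I mean: GST_GL_PLATFORM, GST_GL_WINDOW and GST_GL_API are the gstreamer-gl selection variables, and dispmanx is my assumption for the right window backend on a Pi without X.

# Hypothetical backend selection: EGL platform, dispmanx window, GLES2 API
export GST_GL_PLATFORM=egl
export GST_GL_WINDOW=dispmanx
export GST_GL_API=gles2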


My goal is to decode H.264, run it through some OpenGL transform filters, and then re-encode it, at 720p@24fps.

This is my pipeline (approximately):
gst-launch-1.0 -e -vvv  uvch264src do-timestamp=1 is-live=1 device=/dev/video0 name=src auto-start=true src.vidsrc     ! h264parse ! omxh264dec ! glupload ! identity silent=0 ! gldownload ! fakesink
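
As a side note, a stripped-down GL-only pipeline should show whether GL context creation works at all, independently of the camera and decoder. This is just a sketch, assuming gltestsrc and glimagesink are present in this build:

# Hypothetical sanity check: push a test pattern through the GL sink without X
gst-launch-1.0 gltestsrc num-buffers=100 ! glimagesink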

And this is the error I'm getting from that pipeline:

Setting pipeline to PAUSED ...

/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1

/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video0

/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1

/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video0

/GstPipeline:pipeline0/GstUvcH264Src:src: ready-for-capture = false

/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1

/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video0

/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter0: caps = "video/x-h264\,\ width\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ height\=\(int\)\[\ 1\,\ 2147483647\ \]\,\ framerate\=\(fraction\)\[\ 0/1\,\ 2147483647/1\ \]"

Pipeline is live and does not need PREROLL ...

/GstPipeline:pipeline0/GstIdentity:identity0: last-message = event   ******* (identity0:sink) E (type: stream-start (10254), GstEventStreamStart, stream-id=(string)2b7f5679e9f0cee0776e7c9d96ffc3b4682b26f7b8931798b4254fc99bdbde9b, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE, group-id=(uint)1;) 0xcaadb8

Got context from element 'gluploadelement0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayEGL\)\ gldisplayegl0";

Setting pipeline to PLAYING ...

New clock: GstSystemClock

/GstPipeline:pipeline0/GstUvcH264Src:src: fixed-framerate = false

/GstPipeline:pipeline0/GstUvcH264Src:src: rate-control = Constant bit rate

/GstPipeline:pipeline0/GstUvcH264Src:src: average-bitrate = 2000000

/GstPipeline:pipeline0/GstUvcH264Src:src: peak-bitrate = 2000000

/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0.GstPad:src: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1"

/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vidsrc: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1"

/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ parsed\=\(boolean\)true"

/GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ parsed\=\(boolean\)true"

/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1"

/GstPipeline:pipeline0/GstUvcH264Src:src.GstGhostPad:vidsrc.GstProxyPad:proxypad2: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1"

/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ parsed\=\(boolean\)true\,\ profile\=\(string\)high\,\ level\=\(string\)4"

/GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:sink: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ parsed\=\(boolean\)true\,\ profile\=\(string\)high\,\ level\=\(string\)4"

/GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ chroma-site\=\(string\)mpeg2\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1"

/GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0.GstPad:src: caps = "video/x-raw\(memory:GLMemory\)\,\ format\=\(string\)I420\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ chroma-site\=\(string\)mpeg2\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ texture-target\=\(string\)2D"

/GstPipeline:pipeline0/GstIdentity:identity0: last-message = event   ******* (identity0:sink) E (type: caps (12814), GstEventCaps, caps=(GstCaps)"video/x-raw\(memory:GLMemory\)\,\ format\=\(string\)I420\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ chroma-site\=\(string\)mpeg2\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ texture-target\=\(string\)2D";) 0x71104110

/GstPipeline:pipeline0/GstIdentity:identity0.GstPad:src: caps = "video/x-raw\(memory:GLMemory\)\,\ format\=\(string\)I420\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ chroma-site\=\(string\)mpeg2\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ texture-target\=\(string\)2D"

/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = "video/x-raw\(memory:GLMemory\)\,\ format\=\(string\)I420\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ chroma-site\=\(string\)mpeg2\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ texture-target\=\(string\)2D"

/GstPipeline:pipeline0/GstIdentity:identity0.GstPad:sink: caps = "video/x-raw\(memory:GLMemory\)\,\ format\=\(string\)I420\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ chroma-site\=\(string\)mpeg2\,\ colorimetry\=\(string\)2:4:7:1\,\ framerate\=\(fraction\)30/1\,\ texture-target\=\(string\)2D"

ERROR: from element /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0: Can't open display

Additional debug info:

gstglbasefilter.c(445): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0

EOS on shutdown enabled -- waiting for EOS after Error

Waiting for EOS...

ERROR: from element /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0: Can't open display

Additional debug info:

gstglbasefilter.c(445): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0

ERROR: from element /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0: Can't open display

Additional debug info:

gstglbasefilter.c(445): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0

ERROR: from element /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0: Can't open display

Additional debug info:

gstglbasefilter.c(445): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0

ERROR: from element /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0: Can't open display

Additional debug info:

gstglbasefilter.c(445): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0

ERROR: from element /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0: Internal data stream error.

Additional debug info:

gstomxvideodec.c(1670): gst_omx_video_dec_loop (): /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0:

stream stopped, reason not-negotiated

ERROR: from element /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0: Can't open display

Additional debug info:

gstglbasefilter.c(445): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0

/GstPipeline:pipeline0/GstIdentity:identity0: last-message = event   ******* (identity0:sink) E (type: eos (28174), ) 0x711041a0

ERROR: from element /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0: Can't open display

Additional debug info:

gstglbasefilter.c(445): gst_gl_base_filter_decide_allocation (): /GstPipeline:pipeline0/GstGLUploadElement:gluploadelement0

^Chandling interrupt.

Interrupt: Stopping pipeline ...

Interrupt while waiting for EOS - stopping pipeline...

Execution ended after 0:00:07.260136528

Setting pipeline to PAUSED ...

Setting pipeline to READY ...

Setting pipeline to NULL ...


Freeing pipeline ...



Final short question: if I am using the OpenMAX components simply to transcode H.264 from one bitrate to another (i.e. h264 -> omxh264dec -> omxh264enc), would forcing the data to go through GLMemory provide a performance improvement?
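
For concreteness, the direct transcode path I have in mind looks roughly like this. It is only a sketch: I am assuming omxh264enc exposes control-rate and target-bitrate properties in this build, and in.h264/out.h264 are placeholder files.

# Hypothetical direct transcode, no GL memory in the path
gst-launch-1.0 filesrc location=in.h264 ! h264parse ! omxh264dec ! \
    omxh264enc control-rate=variable target-bitrate=1000000 ! \
    h264parse ! filesink location=out.h264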

Thanks

Adam

