Hi,
I am using the pipeline below:

gst-launch-1.0 filesrc location=<input_file> ! h264parse ! omxh264dec ! kmssink

After inserting some debug messages (after gst_is_kms_memory() in the kmssink code), I found that omxh264dec does not use the downstream (i.e. kmssink) buffer pool.

However, when I use the pipeline below, the kmssink buffer pool is used:

gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)NV12, width=176, height=144, framerate=30/1' ! kmssink

QUESTIONS:
1. Any ideas on what changes need to be made in gst-omx so that it uses the kmssink buffer pool? Any examples of omxh264dec using a downstream buffer pool would also be helpful.
2. I think that if the downstream element can provide a buffer pool and the OMX component uses it, that should improve the performance of the overall pipeline. Is the omxvideodec code not designed to handle this, or am I missing something?

Kindly let me know if there is anything wrong with my understanding, as I am a GStreamer newbie. Thanks in advance.

Best Regards,
Devarsh
On 01/15/17 at 05:06am, Devarsh Thakkar wrote:
> Hi,
>
> I am using the pipeline below:
>
> gst-launch-1.0 filesrc location=<input_file> ! h264parse ! omxh264dec ! kmssink
>
> After inserting some debug messages (after gst_is_kms_memory() in the kmssink code), I found that omxh264dec does not use the downstream (i.e. kmssink) buffer pool.
>
> QUESTIONS:
> Any ideas on what changes need to be made in gst-omx so that it uses the kmssink buffer pool? Any examples of omxh264dec using a downstream buffer pool would also be helpful.

I guess, if I recall correctly, that this will depend on your hardware/platform: whether the decoding subsystem can use an external memory area (in the case of kmssink, a DRM dumb buffer). Normally it cannot, and the media subsystem needs a special memory area, hence a memcpy is unavoidable.

The special case, if you look at the gst-omx code, is EGL: the video decoder will export an EGLImage if downstream supports GstGLMemory (EGL).

You could add another special case: DMABuf. If your platform can export DMABufs, you could use GstDMABufAllocator, and kmssink can import DMABufs.

vmjl

> --
> View this message in context: http://gstreamer-devel.966125.n4.nabble.com/Not-able-to-use-downstream-kmssink-bufferpool-in-omx-decoder-component-tp4681455.html
> Sent from the GStreamer-devel mailing list archive at Nabble.com.
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
On Sunday, January 15, 2017 at 05:06 -0800, Devarsh Thakkar wrote:
> Hi,
>
> I am using the pipeline below:
>
> gst-launch-1.0 filesrc location=<input_file> ! h264parse ! omxh264dec ! kmssink
>
> After inserting some debug messages (after gst_is_kms_memory() in the kmssink code), I found that omxh264dec does not use the downstream (i.e. kmssink) buffer pool.

That is correct. This was never implemented, as no one had access to an OMX component that supported doing so. A component that supports it would implement the UseBuffer method on the output port. If your component supports it, and the color format and strides are compatible, then it would be a great opportunity to enhance gst-omx.

> However, when I use the pipeline below, the kmssink buffer pool is used:
>
> gst-launch-1.0 videotestsrc ! 'video/x-raw, format=(string)NV12, width=176, height=144, framerate=30/1' ! kmssink
>
> QUESTIONS:
> Any ideas on what changes need to be made in gst-omx so that it uses the kmssink buffer pool? Any examples of omxh264dec using a downstream buffer pool would also be helpful.
>
> I think that if the downstream element can provide a buffer pool and the OMX component uses it, that should improve the performance of the overall pipeline. Is the omxvideodec code not designed to handle this, or am I missing something?