New to gst - need some help:
What is the best way to pass metadata from one element to another in a GStreamer pipeline?

Example use case: in a HW-accelerated transcoding session, the OMX video decoder's output buffer is sent to the OMX video encoder. The decoder's output lives in physically contiguous memory identified by a file descriptor, and the encoder needs this file descriptor to derive the physical address of the decoder's output buffer. The OMX buffer header provides a platform-private field to carry platform-specific data.

GStreamer seems to define buffer metadata, but what is the best place to embed this information? Does GStreamer have any platform-private/reserved fields (similar to the OMX buffer header's platform-private data) to carry such information?
http://www.gstreamer.net/data/doc/gstreamer/head/gstreamer/html/gstreamer-GstBuffer.html#GstBufferFlag

Another question: how does a buffer from one element get passed to another element when the elements are in different processes? Or does GStreamer assume all elements run in the same process? That cannot be a practical scenario, as the display usually resides in a different process.
[hidden email] wrote:
> What is the best way to pass metadata from one element to another in a
> GStreamer pipeline? [...]
> Another question: how does a buffer from one element get passed to another
> element when the elements are in different processes? Or does GStreamer
> assume all elements run in the same process?

All GStreamer elements run in the same process. The display server might run in a separate process, but that's independent of GStreamer (i.e. the API is X11, not Gst).

If you only want to pass custom data between two adjacent elements in the pipeline, you have two options:

1) Provide a GstBuffer sub-type, like GstOmxBuffer. Then it should be easy to add any information that you want, but you should check that the GType is the right one (see the sketch after this message).

2) Add a custom field to the GstCaps of the buffer.

However, I don't see why you need an fd for contiguous memory. On the OMAP3 platform I have a simple sink element that provides framebuffer memory (which is contiguous), and the video decoder element mmaps that memory. At the kernel level the dspbridge driver is able to identify this memory as VM_IO, and the mmap operation is very fast. IOW, everything happens behind the scenes, at the kernel level.

Are we talking about Linux?

--
Felipe Contreras
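To make option 1) concrete, here is a minimal sketch of such a GstBuffer sub-type against the GStreamer 0.10 API that was current at the time of this thread, modeled on the GstNetBuffer pattern from gst-plugins-base. GstFdBuffer, its fields, and the helper functions are hypothetical names, not an existing type:

#include <gst/gst.h>

/* Hypothetical GstBuffer sub-type carrying the decoder's fd. */
typedef struct {
  GstBuffer buffer;   /* parent instance; must be the first member */
  int fd;             /* descriptor identifying the contiguous region */
  guint offset;       /* where this frame starts within that region */
} GstFdBuffer;

typedef struct {
  GstBufferClass buffer_class;
} GstFdBufferClass;

static void
gst_fd_buffer_instance_init (GTypeInstance *instance, gpointer g_class)
{
  ((GstFdBuffer *) instance)->fd = -1;
}

static GType
gst_fd_buffer_get_type (void)
{
  static GType type = 0;

  if (G_UNLIKELY (type == 0)) {
    static const GTypeInfo info = {
      sizeof (GstFdBufferClass),
      NULL, NULL, NULL, NULL, NULL,
      sizeof (GstFdBuffer),
      0,
      gst_fd_buffer_instance_init,
      NULL
    };
    type = g_type_register_static (GST_TYPE_BUFFER, "GstFdBuffer", &info, 0);
  }
  return type;
}

/* Producer side: wrap the decoder's output in the sub-type. */
static GstBuffer *
wrap_decoder_output (int dec_fd, guint offset)
{
  GstFdBuffer *buf =
      (GstFdBuffer *) gst_mini_object_new (gst_fd_buffer_get_type ());

  buf->fd = dec_fd;
  buf->offset = offset;
  return GST_BUFFER (buf);
}

/* Consumer side: check the GType before touching the extra fields. */
static int
get_buffer_fd (GstBuffer *buf)
{
  if (G_TYPE_CHECK_INSTANCE_TYPE (buf, gst_fd_buffer_get_type ()))
    return ((GstFdBuffer *) buf)->fd;
  return -1;   /* plain GstBuffer from some other element */
}

For option 2), data that is per-stream rather than per-buffer could instead be attached with something like gst_caps_set_simple (caps, "mem-fd", G_TYPE_INT, fd, NULL), where "mem-fd" is likewise a made-up field name.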
Thanks for the response. Yes, I am referring to Linux. Can you give a more detailed call flow of what happens in user space and kernel space? How does the sink element pass a memory region to the decoder element in user space? Does the OMX decoder element need an fd, offset, and length to mmap framebuffer memory into user space, or does this all happen in kernel space without any intervention from user space or the framework? This applies to use cases like a camera v4l2 source sending data to an OMX video encoder in the GStreamer framework: if the camera and the OMX video encoder need to share the same physical memory, additional information about the memory region (like a buffer identifier or fd) needs to be exchanged between the user-space elements in GStreamer.
On Tue, Jun 29, 2010 at 3:38 AM, Felipe Contreras <[hidden email]> wrote:
> Are we talking about Linux?
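For the fd/offset/length part of the question, here is a minimal sketch of what mmap'ing framebuffer memory from user space looks like on Linux. The device path is illustrative and error handling is trimmed; the caller would check the result against MAP_FAILED:

#include <fcntl.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/fb.h>

/* Map a framebuffer into user space: all that is needed is an fd,
 * a length, and an offset, exactly as the question says. */
static void *
map_framebuffer (const char *dev, size_t *len)
{
  struct fb_fix_screeninfo finfo;
  int fd;

  fd = open (dev, O_RDWR);   /* e.g. "/dev/fb0" */
  if (fd < 0 || ioctl (fd, FBIOGET_FSCREENINFO, &finfo) < 0)
    return NULL;

  *len = finfo.smem_len;     /* length of the framebuffer memory */

  /* fd + length + offset 0 -> user-space view of contiguous memory */
  return mmap (NULL, *len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
}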
On 06/29/2010 04:30 PM, hd d wrote:
> How does the sink element pass a memory region to the decoder element in
> user space? Does the OMX decoder element need an fd, offset, and length to
> mmap framebuffer memory into user space [...]?

Typically the decoder would not care too much about *where* the sink got the memory, but would just use pad_alloc() to let the video sink element allocate the memory that the sink thinks the decoder should decode into. Some video sinks (for example, v4l2sink or omapfbsink) will mmap buffers into user space and then pass these to the decoder. But normally the decoder doesn't need to know which fd was mmap'd, the offset, etc.

If your decoder requires physically contiguous memory, with no possibility to remap virtually contiguous memory, then it possibly gets complicated. (Not to mention, various useful elements for debugging, like filesink, won't work too well.)

Btw, v4l2src (camera/encode) might be a bad example, since what is in gst-plugins-good doesn't support pad_alloc() / USERPTR, so it allocates its own buffers. But there are other camera elements which do a similar thing to what you describe: doing pad_alloc() to allocate buffers from the display, which then get sent to both the encoder and the sink element.

BR,
-R
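A rough sketch of the pad_alloc() pattern described above, against the GStreamer 0.10 API; MyDec, out_size, and decode_frame_into() are hypothetical stand-ins for the decoder's own code:

#include <gst/gst.h>

typedef struct {
  GstPad *srcpad;   /* hypothetical decoder element state */
} MyDec;

static void
decode_frame_into (MyDec *dec, guint8 *dst)
{
  /* hypothetical: run the HW decoder, writing the frame into dst */
}

static GstFlowReturn
push_one_frame (MyDec *dec, guint out_size)
{
  GstBuffer *outbuf = NULL;
  GstFlowReturn ret;

  /* Ask downstream to allocate the output buffer; a video sink such
   * as v4l2sink/omapfbsink can hand back mmap'd framebuffer memory. */
  ret = gst_pad_alloc_buffer (dec->srcpad, GST_BUFFER_OFFSET_NONE,
      out_size, GST_PAD_CAPS (dec->srcpad), &outbuf);
  if (ret != GST_FLOW_OK)
    return ret;

  /* Decode straight into the downstream-provided memory; the decoder
   * never sees which fd/offset the sink used to map it. */
  decode_frame_into (dec, GST_BUFFER_DATA (outbuf));

  return gst_pad_push (dec->srcpad, outbuf);
}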