Sending GLMemory buffers into VAAPI plugins

Sending GLMemory buffers into VAAPI plugins

Matt Fischer
Hi, I hope this is the right place for VAAPI questions, I didn't see a dedicated mailing list for it.  Please redirect me if I missed it. :)

I'm trying to set up a GStreamer configuration where I produce GL buffers, and then send them off to a VAAPI encoder.  Currently there's no direct way to do that without a software download in between, so I've been trying to figure out how to add that support.  I've hacked something together that seems to work, based largely on the code which is already present for importing dmabuf buffers.  However there are still several pieces of it that aren't right, and I'm not exactly sure how to proceed, so I was hoping somebody here could give me some guidance.
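To make the context concrete, the kind of setup I'm running today looks roughly like the sketch below (untested as written here; the element names, in particular the encoder, are just what I'd expect to use and may differ with your gstreamer-vaapi version):

/* Illustrative only: gltestsrc/gldownload/vaapih264enc are assumed
 * element names; the point is the extra gldownload hop. */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GError *error = NULL;
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* Today the GL buffers have to take a round trip through system
   * memory (gldownload) before the VAAPI encoder will accept them. */
  pipeline = gst_parse_launch (
      "gltestsrc num-buffers=300 ! gldownload ! videoconvert ! "
      "vaapih264enc ! fakesink", &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    return 1;
  }

  /* The goal of the patch is to make the gldownload/videoconvert hop
   * unnecessary, i.e. feed GLMemory straight into the encoder. */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg != NULL)
    gst_message_unref (msg);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}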

I've attached a copy of my patch as it currently stands.  Essentially I've just added a new block in gst_vaapi_plugin_base_get_input_buffer() which can detect GLMemory buffers, and which uses gst_vaapi_texture_new_wrapped() to get the texture into a GstVaapiSurfaceProxy.  Here are the parts I still haven't completely worked out:

1. Right now I have to fake the chroma_type so that is_chroma_type_supported() in gstvaapiencoder.c doesn't reject the surface.  Is this check actually necessary when creating a VASurface the way I'm doing?  The colors in the encoded video seem to be right, even though it's an ARGB texture.

2. I had to punch a hole through GstVaapiTexture to get at the GstVaapiSurface that is created in the EGL backend.  That doesn't generalize to the other texture backends...is there a better way for me to turn a GstVaapiTexture into a GstVaapiSurface that would be more portable?

3. I'm creating a new GstVaapiTexture every time a buffer comes in, which obviously isn't ideal.  I'd like to cache these things somehow...maybe a new meta on the incoming GLMemory buffer or something like that?

4. I had to add code to gstvaapipluginbase.c to ensure that we have a GL context before creating the VAAPI display.  I'm not exactly sure how this plays with the existing code that tried to get a GL context during decide_allocation for GLTextureUploadMeta.  Would that still be necessary, or could it just use the same mechanism that I added, which queries for the context like other GL plugins do?

I'd appreciate any feedback you can provide.

Thanks,
Matt


Attachment: 0001-TEMP-GL-VAAPI.patch (11K)

Re: Sending GLMemory buffers into VAAPI plugins

Hyunjun Ko
> Hi, I hope this is the right place for VAAPI questions, I didn't see a
> dedicated mailing list for it.  Please redirect me if I missed it. :)
You've chosen the right place. :)

> I'm trying to set up a GStreamer configuration where I produce GL buffers,
> and then send them off to a VAAPI encoder.  Currently there's no direct way
> to do that without a software download in between, so I've been trying to
> figure out how to add that support.  I've hacked something together that
> seems to work, based largely on the code which is already present for
> importing dmabuf buffers.  However there are still several pieces of it
> that aren't right, and I'm not exactly sure how to proceed, so I was hoping
> somebody here could give me some guidance.

> I've attached a copy of my patch as it currently stands.  Essentially I've
> just added a new block in gst_vaapi_plugin_base_get_input_buffer() which
> can detect GLMemory buffers, and which uses gst_vaapi_texture_new_wrapped()
> to get the texture into a GstVaapiSurfaceProxy.  Here are the parts I still
> haven't completely worked out:

An interesting approach. I've tested your patch with some hacks/modifications on top of master and can confirm it works.

> 1. Right now I have to fake the chroma_type so that
> is_chroma_type_supported() in gstvaapiencoder.c doesn't reject the
> surface.  Is this check actually necessary when creating a VASurface the
> way I'm doing?  The colors in the encoded video seem to be right, even
> though it's an ARGB texture.
The check is necessary in general, but in this particular case I'm not sure.

> 2. I had to punch a hole through GstVaapiTexture to get at the
> GstVaapiSurface that is created in the EGL backend.  That doesn't
> generalize to the other texture backends...is there a better way for me to
> turn a GstVaapiTexture into a GstVaapiSurface that would be more portable?
You could define a get_surface() vmethod in the base class, GstVaapiTexture, and implement it in each descendant (GstVaapiTextureEGL/GLX).
But GstVaapiTextureGLX doesn't have its own surface, because it hasn't been necessary for its current usage, which means that for GLX you would also have to implement creating a surface and putting the texture into it.
(Of course, the APIs for that already exist.)
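Roughly what I have in mind is the following (just an untested sketch; the exact struct layout, existing vfuncs and macros should be taken from gstvaapitexture_priv.h, and the names here are only illustrative):

/* 1) Add a vfunc slot to the base class structure
 *    (sketch -- keep the real existing members as they are): */
struct _GstVaapiTextureClass {
  /*< private >*/
  GstVaapiObjectClass parent_class;

  /* ... existing vfuncs ... */

  /* new: return (or lazily create) the VA surface backing the texture */
  GstVaapiSurface * (*get_surface) (GstVaapiTexture * texture);
};

/* 2) Public dispatcher, e.g. in gstvaapitexture.c: */
GstVaapiSurface *
gst_vaapi_texture_get_surface (GstVaapiTexture * texture)
{
  const GstVaapiTextureClass *klass;

  g_return_val_if_fail (texture != NULL, NULL);

  klass = GST_VAAPI_TEXTURE_GET_CLASS (texture);
  if (klass->get_surface == NULL)
    return NULL;                /* backend has no implementation yet */

  return klass->get_surface (texture);
}

/* 3) GstVaapiTextureEGL can simply return the surface it already creates;
 *    GstVaapiTextureGLX would need to create a surface and copy the
 *    texture contents into it using the existing APIs. */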

> 3. I'm creating a new GstVaapiTexture every time a buffer comes in, which
> obviously isn't ideal.  I'd like to cache these things somehow...maybe a
> new meta on the incoming GLMemory buffer or something like that?
There's already a hash-map mechanism for this inside gstreamer-vaapi, GstVaapiTextureMap, so you shouldn't need to invent a new caching scheme.
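Usage would be something along these lines (untested; the function names are from memory, so please check them against gstvaapitexturemap.h):

/* Sketch: cache one GstVaapiTexture per GL texture id in a
 * GstVaapiTextureMap instead of wrapping a new one per buffer. */
static GstVaapiTexture *
lookup_or_wrap_texture (GstVaapiTextureMap * map, GstVaapiDisplay * display,
    guint tex_id, guint tex_target, guint tex_format,
    guint width, guint height)
{
  GstVaapiTexture *texture;

  texture = gst_vaapi_texture_map_lookup (map, tex_id);
  if (texture != NULL)
    return texture;             /* already wrapped for this GL texture */

  texture = gst_vaapi_texture_new_wrapped (display, tex_id, tex_target,
      tex_format, width, height);
  if (texture == NULL)
    return NULL;

  /* the map keeps its own reference, keyed by the GL texture id */
  gst_vaapi_texture_map_add (map, texture, tex_id);
  return texture;
}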

> 4. I had to add code to gstvaapipluginbase.c to ensure that we have a GL
> context before creating the VAAPI display.  I'm not exactly sure how this
> plays with the existing code that tried to get a GL context during
> decide_allocation for GLTextureUploadMeta.  Would that still be necessary,
> or could it just use the same mechanism that I added, which queries for the
> context like other GL plugins do?
Just make your plugin handle the context query (gst.gl.local_context) and share the GL context, the same way glimagesink does, so that the VAAPI plugin can create the proper GstVaapiDisplay. I believe it would work without your added code, using gst_gl_handle_set_context()/gst_gl_ensure_element_data().
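For example, something like this to fetch the local GL context from a neighbouring GL element (again only a sketch; the query name and the "context" structure field are what the GL plugins use, but please double-check against gstglutils.c):

/* Sketch: obtain the local GstGLContext by querying a peer for
 * "gst.gl.local_context", as the GL plugins do. */
#include <gst/gl/gl.h>

static GstGLContext *
query_gl_local_context (GstPad * pad)
{
  GstGLContext *gl_context = NULL;
  GstQuery *query;

  query = gst_query_new_context ("gst.gl.local_context");
  if (gst_pad_peer_query (pad, query)) {
    GstContext *context = NULL;

    gst_query_parse_context (query, &context);
    if (context != NULL) {
      const GstStructure *s = gst_context_get_structure (context);

      /* the GL element stores its GstGLContext in the "context" field */
      gst_structure_get (s, "context", GST_TYPE_GL_CONTEXT, &gl_context,
          NULL);
    }
  }
  gst_query_unref (query);

  /* caller owns a ref on gl_context (may be NULL); it can then be used
   * when creating the matching GstVaapiDisplay (EGL). */
  return gl_context;
}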

Re: Sending GLMemory buffers into VAAPI plugins

Victor Jaquez
In reply to this post by Matt Fischer

On 01/04/17 at 11:08am, Matt Fischer wrote:
> Hi, I hope this is the right place for VAAPI questions, I didn't see a
> dedicated mailing list for it.  Please redirect me if I missed it. :)

Gee! This is really nice. Thanks!

To follow this development, the GStreamer community uses Bugzilla. Please open a bug here:

https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer&component=gstreamer-vaapi

And upload your patch there so we can review the code and discuss it.

>
> I'm trying to set up a GStreamer configuration where I produce GL buffers,
> and then send them off to a VAAPI encoder.  Currently there's no direct way
> to do that without a software download in between, so I've been trying to
> figure out how to add that support.  I've hacked something together that
> seems to work, based largely on the code which is already present for
> importing dmabuf buffers.  However there are still several pieces of it
> that aren't right, and I'm not exactly sure how to proceed, so I was hoping
> somebody here could give me some guidance.
>
> I've attached a copy of my patch as it currently stands.  Essentially I've
> just added a new block in gst_vaapi_plugin_base_get_input_buffer() which
> can detect GLMemory buffers, and which uses gst_vaapi_texture_new_wrapped()
> to get the texture into a GstVaapiSurfaceProxy.  Here are the parts I still
> haven't completely worked out:
>
> 1. Right now I have to fake the chroma_type so that
> is_chroma_type_supported() in gstvaapiencoder.c doesn't reject the
> surface.  Is this check actually necessary when creating a VASurface the
> way I'm doing?  The colors in the encoded video seem to be right, even
> though it's an ARGB texture.
>
> 2. I had to punch a hole through GstVaapiTexture to get at the
> GstVaapiSurface that is created in the EGL backend.  That doesn't
> generalize to the other texture backends...is there a better way for me to
> turn a GstVaapiTexture into a GstVaapiSurface that would be more portable?
>
> 3. I'm creating a new GstVaapiTexture every time a buffer comes in, which
> obviously isn't ideal.  I'd like to cache these things somehow...maybe a
> new meta on the incoming GLMemory buffer or something like that?
>
> 4. I had to add code to gstvaapipluginbase.c to ensure that we have a GL
> context before creating the VAAPI display.  I'm not exactly sure how this
> plays with the existing code that tried to get a GL context during
> decide_allocation for GLTextureUploadMeta.  Would that still be necessary,
> or could it just use the same mechanism that I added, which queries for the
> context like other GL plugins do?
>
> I'd appreciate any feedback you can provide.

Besides what Hyunjun has already said, I wonder: since you're only importing
EGLImages anyway, why not extract the dmabuf from the EGLImage and use that to
create the VASurface?

vmjl

>
> Thanks,
> Matt


Re: Sending GLMemory buffers into VAAPI plugins

Matt Fischer
In reply to this post by Hyunjun Ko
Thanks for your feedback.  Since it sounds like this is fairly close to a workable solution, I've put the patch up on Bugzilla as https://bugzilla.gnome.org/show_bug.cgi?id=776927.  Hopefully we can work out the remaining issues there.

Re: Sending GLMemory buffers into VAAPI plugins

Matt Fischer
In reply to this post by Victor Jaquez
I was just making use of the APIs that are already there, since I don't know this code very well.  Since GstVaapiTextureEGL already creates a surface internally (by creating an EGLImage, exporting it to a DRM handle, and then calling gst_vaapi_surface_new_with_gem_buf_handle()), I just used that code.  It looks like it would be possible to export the EGLImage to a dmabuf handle instead of a DRM handle and import that into the surface with gst_vaapi_surface_new_with_dma_buf_handle(), but I'm not sure I see how that would improve anything over the existing mechanism.
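For reference, the dmabuf route would presumably look something like this on the EGL side (untested sketch; it relies on the EGL_MESA_image_dma_buf_export extension, assumes a single-plane image, and leaves the actual VA import as a comment since I haven't tried that part):

/* Sketch: export an EGLImage as a dmabuf fd via
 * EGL_MESA_image_dma_buf_export.  Returns the fd, or -1 on failure. */
#include <EGL/egl.h>
#include <EGL/eglext.h>

static int
export_egl_image_to_dmabuf (EGLDisplay dpy, EGLImageKHR image,
    int *out_fourcc, EGLint *out_stride, EGLint *out_offset)
{
  PFNEGLEXPORTDMABUFIMAGEQUERYMESAPROC query_image;
  PFNEGLEXPORTDMABUFIMAGEMESAPROC export_image;
  int fourcc = 0, num_planes = 0;
  EGLuint64KHR modifiers = 0;
  int fd = -1;

  query_image = (PFNEGLEXPORTDMABUFIMAGEQUERYMESAPROC)
      eglGetProcAddress ("eglExportDMABUFImageQueryMESA");
  export_image = (PFNEGLEXPORTDMABUFIMAGEMESAPROC)
      eglGetProcAddress ("eglExportDMABUFImageMESA");
  if (query_image == NULL || export_image == NULL)
    return -1;

  if (!query_image (dpy, image, &fourcc, &num_planes, &modifiers))
    return -1;
  if (num_planes != 1)          /* keep the sketch single-plane */
    return -1;

  if (!export_image (dpy, image, &fd, out_stride, out_offset))
    return -1;

  *out_fourcc = fourcc;
  /* The fd (plus fourcc/stride/offset) could then be handed to
   * gst_vaapi_surface_new_with_dma_buf_handle() or equivalent. */
  return fd;
}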