OpenGL framebuffer source


OpenGL framebuffer source

arizonausa
Hi All,

I am looking for a way to use GStreamer to grab the framebuffer, convert its color space from RGBA to I420, and then use an OMX-based encoder to produce H.264 or H.265.

I can't find any OpenGL-based source plugins that can grab the framebuffer into an FBO and save the converted I420 data to an OpenGL texture.

Does anyone have an idea for my case? Thanks.

Kevin    

Re: OpenGL framebuffer source

Matthew Waters
On 15/07/16 02:21, arizonausa wrote:
> Hi All,
>
> I am looking for a way to use GStreamer to grab the framebuffer, convert
> its color space from RGBA to I420, and then use an OMX-based encoder to
> produce H.264 or H.265.

This all depends on what exactly you mean by 'framebuffer'.  Inside
OpenGL, framebuffers don't actually hold any data; they are just
container objects for textures/renderbuffers that you can render
from/to.  You can download textures but not renderbuffers.
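
For illustration, a minimal sketch (not from the original message) of
downloading an RGBA texture through a temporary texture-backed FBO with
glReadPixels; it assumes a current GL context and that 'tex' already holds
the rendered frame:

  /* Sketch: copy an RGBA texture's contents to CPU memory via a
   * texture-backed FBO.  'out' must point to width * height * 4 bytes. */
  #include <GLES2/gl2.h>            /* or the desktop GL 3.x headers */

  static void
  read_back_texture (GLuint tex, int width, int height, unsigned char *out)
  {
    GLuint fbo;

    glGenFramebuffers (1, &fbo);
    glBindFramebuffer (GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D (GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
        GL_TEXTURE_2D, tex, 0);

    if (glCheckFramebufferStatus (GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE)
      glReadPixels (0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, out);

    glBindFramebuffer (GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers (1, &fbo);
  }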

The other interpretation of framebuffer is a /dev/fb-like device, and I'm
not entirely sure what facilities are available for reading that back.
This is not in the OpenGL domain though, so you'll need to consult your
platform's documentation/samples/etc.

Cheers
-Matt

> I can't find any OpenGL-based source plugins that can grab the framebuffer
> into an FBO and save the converted I420 data to an OpenGL texture.
>
> Does anyone have an idea for my case? Thanks.
>
> Kevin



Re: OpenGL framebuffer source

arizonausa
Hi Matt,

Thanks for your answer, and sorry for the unclear question. What we
actually do is use OpenGL to alpha-blend two 2D textures, one in RGBA
format and the other in YUV. We also use a fragment shader to do the
colorspace conversion for the YUV texture first, and then alpha-blend the
two textures before rendering to the screen. Right now, in addition to
rendering to the screen, I also want to grab the pixel data produced by the
fragment shader into a framebuffer object (which should be in RGBA format),
resample that blended pixel data into YUV420, and then route the YUV420
data back to the CPU for further processing.
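
For illustration, a rough sketch (not the actual shader from this setup) of
that kind of fragment shader, written as a GLES2 GLSL string in C; the
sampler names and the BT.601 conversion coefficients are assumptions:

  /* Hypothetical fragment shader: converts planar YUV (one sampler per
   * plane) to RGB with BT.601 coefficients, then alpha-blends an RGBA
   * overlay texture on top. */
  static const char *blend_frag_src =
      "precision mediump float;\n"
      "varying vec2 v_texcoord;\n"
      "uniform sampler2D u_rgba;  /* RGBA overlay */\n"
      "uniform sampler2D u_y;     /* Y plane */\n"
      "uniform sampler2D u_u;     /* U plane */\n"
      "uniform sampler2D u_v;     /* V plane */\n"
      "void main () {\n"
      "  float y = texture2D (u_y, v_texcoord).r;\n"
      "  float u = texture2D (u_u, v_texcoord).r - 0.5;\n"
      "  float v = texture2D (u_v, v_texcoord).r - 0.5;\n"
      "  vec3 video = vec3 (y + 1.402 * v,\n"
      "                     y - 0.344 * u - 0.714 * v,\n"
      "                     y + 1.772 * u);\n"
      "  vec4 overlay = texture2D (u_rgba, v_texcoord);\n"
      "  gl_FragColor = vec4 (mix (video, overlay.rgb, overlay.a), 1.0);\n"
      "}\n";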

So the question is: are there any GStreamer plugins that can save the pixel
data produced by my fragment shader and then do the shader-based resampling
and further processing? Thank you.

Kevin

Re: OpenGL framebuffer source

Matthew Waters
On 16/07/16 02:50, arizonausa wrote:

> Hi Matt,
>
> Thanks for your answer, and sorry for the unclear question. What we
> actually do is use OpenGL to alpha-blend two 2D textures, one in RGBA
> format and the other in YUV. We also use a fragment shader to do the
> colorspace conversion for the YUV texture first, and then alpha-blend the
> two textures before rendering to the screen. Right now, in addition to
> rendering to the screen, I also want to grab the pixel data produced by
> the fragment shader into a framebuffer object (which should be in RGBA
> format), resample that blended pixel data into YUV420, and then route the
> YUV420 data back to the CPU for further processing.
>
> So the question is: are there any GStreamer plugins that can save the
> pixel data produced by my fragment shader and then do the shader-based
> resampling and further processing? Thank you.
No, there is no element that "can save the pixel data after [the]
fragment shader", as that data is only available in some resource
(texture/renderbuffer/frontbuffer/backbuffer).

You can render the fragment shader output to a texture-backed FBO, convert
it back to YUV, and download it yourself easily enough, but there is no
ready-made element that captures it for you mid-render.  You can also blit
from the FBO to the screen.

There are also elements to perform all of this for you, such as
glcolorconvert, glvideomixer/gloverlay, glupload, gldownload, glshader, etc.
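
For example, a rough, untested sketch of how those elements could be strung
together with gst_parse_launch(); the two videotestsrc branches stand in
for the real RGBA overlay and YUV video sources, and "omxh264enc" is an
assumption, since the actual OMX encoder element name depends on the
platform and gst-omx build:

  /* Sketch only: blend two sources on the GPU with glvideomixer, convert
   * to I420 in GL, download to system memory and feed an OMX H.264
   * encoder. */
  #include <gst/gst.h>

  int
  main (int argc, char **argv)
  {
    GError *error = NULL;
    GstElement *pipeline;
    GstBus *bus;
    GstMessage *msg;

    gst_init (&argc, &argv);

    pipeline = gst_parse_launch (
        "glvideomixer name=mix ! glcolorconvert ! gldownload ! "
        "video/x-raw,format=I420 ! omxh264enc ! h264parse ! "
        "filesink location=blended.h264 "
        "videotestsrc is-live=true ! glupload ! glcolorconvert ! mix. "
        "videotestsrc is-live=true pattern=ball ! glupload ! glcolorconvert ! mix.",
        &error);
    if (pipeline == NULL) {
      g_printerr ("Failed to build pipeline: %s\n", error->message);
      g_clear_error (&error);
      return 1;
    }

    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    /* Run until an error or EOS shows up on the bus. */
    bus = gst_element_get_bus (pipeline);
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
        GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg != NULL)
      gst_message_unref (msg);

    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (bus);
    gst_object_unref (pipeline);
    return 0;
  }

Replacing the videotestsrc branches with the application's own RGBA and YUV
inputs (for instance via appsrc) keeps the same GL element chain.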

Cheers
-Matt

> Kevin
>
>
> Matthew Waters wrote
>> On 15/07/16 02:21, arizonausa wrote:
>>> Hi All,
>>>
>>> I am looking for a way to use GStreamer to grab the framebuffer,
>>> convert its color space from RGBA to I420, and then use an OMX-based
>>> encoder to produce H.264 or H.265.
>> This all depends on what exactly you mean by 'framebuffer'.  Inside
>> OpenGL, framebuffers don't actually hold any data; they are just
>> container objects for textures/renderbuffers that you can render
>> from/to.  You can download textures but not renderbuffers.
>>
>> The other interpretation of framebuffer is a /dev/fb-like device, and I'm
>> not entirely sure what facilities are available for reading that back.
>> This is not in the OpenGL domain though, so you'll need to consult your
>> platform's documentation/samples/etc.
>>
>> Cheers
>> -Matt
>>
>>> I can't find any OpenGL-based source plugins that can grab the
>>> framebuffer into an FBO and save the converted I420 data to an OpenGL
>>> texture.
>>>
>>> Does anyone have an idea for my case? Thanks.
>>>
>>> Kevin
