Streaming an OpenGL texture over a network


Streaming an OpenGL texture over a network

GStreamer-devel mailing list
Hi all,

I want to stream an OpenGL texture over a network.

I'm struggling with getting the texture into a GStreamer pipeline. I've seen other users resort to using appsink and appsrc elements, but I want to keep the texture in the GPU and do encoding and decoding there.

I've looked at the website and the source (in gst-plugins-base) for the OpenGL plugin documentation. However, there seems to be no documentation beyond pads and signals, making it very difficult to learn how to use the plugin.

Any pointers to relevant docs/resources would be much appreciated, and working code would be too good to be true ;)

Thanks!

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Streaming an OpenGL texture over a network

It's not clear exactly what you're trying to do. But you can use glshader for GLSL shading, keep the same memory type (no need for gldownload etc.) into a hardware encoder like NVENC (AVC or HEVC), and then packetize the encoded byte stream using the corresponding RTP payloaders.
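A rough gst-launch-1.0 sketch of that path (untested here: `nvh264enc` assumes an NVIDIA GPU with the nvcodec/nvenc plugin built, the host/port values are placeholders, and you would replace gltestsrc with your own source):

```shell
# Sender: produce frames in GL memory, keep them on the GPU into the
# hardware encoder, then packetize as RTP over UDP.
gst-launch-1.0 gltestsrc ! glcolorconvert ! \
    "video/x-raw(memory:GLMemory)" ! \
    nvh264enc ! h264parse ! rtph264pay pt=96 ! \
    udpsink host=127.0.0.1 port=5000

# Receiver: depayload and decode, then upload into a GL texture for display.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! \
    rtph264depay ! h264parse ! avdec_h264 ! glupload ! glcolorconvert ! glimagesink
```

Whether the encoder accepts GLMemory caps directly depends on how your plugins were built; `gst-inspect-1.0 nvh264enc` will show the supported sink caps.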

Regards,

Yu

On Mon, 14 Jun 2021, 15:40 Edward Anon via gstreamer-devel, <[hidden email]> wrote:

Re: Streaming an OpenGL texture over a network

Hi Yu,

I am doing some real-time computation on multiple synced video streams.

I am currently using OpenGL compute shaders to do this. Hence, the current frame of each stream is bound to a texture and passed to the shader. The shader also needs access to some extra information between invocations so I need to swap textures in and out between calls.  I want to stream the output of this shader to another machine which will load that information into an OpenGL texture and render it to the screen. I can't stream directly to the screen as I am using OpenGL to render a rudimentary GUI on top of this. I'm open to alternative solutions on this end of the pipeline. However, for other projects, I will need to stream into a texture and use that later so I thought it would be a good thing to learn now.

I've looked at the GStreamer docs for GstGLShader and I'm not sure how I would set something like that up. There don't seem to be any signals to change the bound textures between invocations. Also, I'm a total noob with GStreamer so this would be my first foray into the OpenGL plugin. Do you have anything closer to a user guide than the quick reference the docs and gst-inspect provide? I have a lot more questions and don't want to waste your time.

If the above is not possible I can modify my code so that the shader does not need to change the textures it has bound between invocations.

Thanks

On Tue, 15 Jun 2021, 20:43 Yu You, <[hidden email]> wrote:

Re: Streaming an OpenGL texture over a network

The best resource for most of this kind of thing is the OpenGL elements themselves.

The important thing here is that you need to create a GstGLMemory (which wraps an OpenGL texture) in order to push GstBuffers into a GStreamer pipeline using e.g. appsrc.  If you want to reuse textures, then you will also need a GstGLBufferPool.  For wrapping an external texture into a GstGLMemory, you need to create a GstGLVideoAllocationParams using gst_gl_video_allocation_params_new_wrapped_gl_handle() and then use either gst_gl_base_memory_alloc() or gst_gl_memory_setup_buffer().  See e.g. https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/gst-libs/gst/gl/gstglbufferpool.c#L281 and GstGLBufferPool's creation of priv->gl_params.

You can also create other types of GL textures, such as EGLImage-backed resources, if you match the allocator and the allocation params.  See e.g. https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/gst-libs/gst/gl/gstglupload.c#L1334, https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/gst-libs/gst/gl/gstglupload.c#L677, or some of the other callers of gst_gl_video_allocation_params_new_wrapped_gl_handle().

Cheers
-Matt

On 16/6/21 6:45 am, Edward Anon via gstreamer-devel wrote:


