Hi Wladimir,
On Fri 14 Dec 2007 11:45, "Wladimir van der Laan" <[hidden email]> writes:

> Hello,
>
> I have created a GPU-accelerated plugin to playback Dirac video
> streams.

Rocking!

> Is there infrastructure in place to pass the output of a plugin as a
> GL texture, for direct rendering? Or some other recommended way to do
> this?

Since no one else mentioned this in this thread IIRC, David Schleef has
been doing things like this; check his weblog:

http://www.schleef.org/blog/2007/12/25/opengl-in-gstreamer/

Eventually, when glimagesink gets updated, there will be no need for
gldownload; the texture can be rendered directly.

> I realize I've probably written the first video rendering plugin for
> Linux that is accelerated on graphics hardware, so this might get
> interesting.

There have been GPU-based filters before, and direct-path hardware
decoders, but not decoders on the GPU, I don't think. You rock :-)

Cheers,

Andy
--
http://wingolog.org/

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/gstreamer-devel
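[For reference, a minimal pipeline along the lines of the one on that blog post might look like the sketch below. It assumes the gst-plugins-gl elements (glupload, glimagesink) are installed; treat it as a hedged illustration of the upload-to-texture path, not a tested command.]

```shell
# Upload raw frames into a GL texture and render them directly;
# gldownload would only be needed to bring frames back to system
# memory for non-GL sinks.
gst-launch-0.10 videotestsrc ! glupload ! glimagesink
```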
On Thu, 27 Dec 2007 at 14:14 -0500, Andy Wingo wrote:
> Eventually when glimagesink gets updated there will be no need for
> gldownload, the texture can be rendered directly.

and debugging, so I wouldn't say it's something temporary. Or maybe you
were just referring to the pipeline David Schleef posted on the blog,
and the specific case of Wladimir's decoder?

Simon Holm Thøgersen
On Dec 27, 2007 11:14 AM, Andy Wingo <[hidden email]> wrote:
> There have been GPU-based filters before, and direct-path hardware
> decoders, but not decoders on the GPU I don't think. You rock :-)

In the case of direct rendering (decoding and display both being done
on the GPU), how does video clipping work? That is, how does the GPU
know what portion of the video to display on-screen? Is the overlay
color used as a mask, or does someone maintain a list of clipping
rectangles?

Thanks,
Vinay
On Thu, Dec 27, 2007 at 07:58:08PM -0800, Vinay Reddy wrote:
> In case of direct rendering (decoding and display, both being done
> from the GPU), how does video clipping work? As in, how does the GPU
> know what portion of the video to display on-screen? Is the overlay
> color used as a mask or does someone maintain a list of clipping
> rectangles?

It's no different than any other OpenGL application.

dave...
> It's no different than any other OpenGL application.
Okay, cool... But existing applications would have to do some
additional work to get subtitles and other on-screen display stuff to
work right, wouldn't they? (As in, rendering all text, animations,
etc. into the OpenGL context.)

Vinay
Hi
To the writer of the accelerated decoder plugin:

* did you use GL shaders for implementing the decoding algorithm?
* will you commit the code so that I can peek at the implementation?

I'm wondering if (M)JPEG decoding could be done using the GPU as well;
my project involves decoding a lot of parallel MJPEG streams, and it's
heavy in the end (even running on a quad-core machine); I can handle
about 6 streams at best. The JPEG algorithm is quite simple, so
hopefully it may be quick to do.

Thanks for any info/code about how to develop a GPU-accelerated
decoder.

Flo
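[A note on why JPEG maps well to a GPU: after entropy decoding, every 8x8 block is dequantized and inverse-DCT'd independently, so a shader or CUDA kernel can process one block per thread. Below is a naive pure-Python sketch of the 2-D DCT pair at the core of that step — the slow O(N^4) reference form, shown only to make the per-block math concrete; a real decoder would use a fast factored IDCT, and entropy decoding would still need handling separately.]

```python
import math

N = 8  # JPEG operates on 8x8 blocks

def _c(k):
    # Orthonormal DCT scale factor
    return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)

def dct2(block):
    """Forward 2-D DCT-II of an 8x8 block (list of lists)."""
    return [[_c(u) * _c(v) * sum(
        block[x][y]
        * math.cos((2 * x + 1) * u * math.pi / (2 * N))
        * math.cos((2 * y + 1) * v * math.pi / (2 * N))
        for x in range(N) for y in range(N))
        for v in range(N)] for u in range(N)]

def idct2(coeffs):
    """Inverse 2-D DCT: the per-block work that maps naturally to one
    GPU thread (or fragment) per 8x8 block."""
    return [[sum(
        _c(u) * _c(v) * coeffs[u][v]
        * math.cos((2 * x + 1) * u * math.pi / (2 * N))
        * math.cos((2 * y + 1) * v * math.pi / (2 * N))
        for u in range(N) for v in range(N))
        for y in range(N)] for x in range(N)]

# Round-trip check on a ramp block
block = [[x + y for y in range(N)] for x in range(N)]
rec = idct2(dct2(block))
assert all(abs(rec[x][y] - block[x][y]) < 1e-9
           for x in range(N) for y in range(N))
```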
On Fri, Jan 18, 2008 at 05:51:25PM +0100, Florent wrote:
> To the writer of the accelerated decoder plugin:
> * did you use GL shaders for implementing the decoding algorithm ?
> * will you commit the code so that i can peek at the implementation ?

Perhaps you're talking about schroedinger? The code is at

  git://diracvideo.schleef.org/git/schroedinger.git

in the cuda branch. It uses CUDA, not OpenGL.

dave...