Hi,

I am working on multimedia application development using the GStreamer framework. There are many cases where a window is invisible (for example, another window is in full screen) while its XvImageSink keeps calling gst_xvimagesink_show_frame for video rendering. I am wondering whether we could add an optimization that detects the "Obscured" X event and reduces the rendering frequency to a very low value, like 1 fps, while the window is obscured. This way we could save precious resources such as CPU and power for most applications, with minimal impact on the end user.

Does that approach make sense? What do you think? Thanks a lot for your feedback and comments.

Best Regards
Yongnian
On Mon, Dec 13, 2010 at 10:55:14AM +0800, Le, Yongnian wrote:
> [...] I am wondering whether we could add an optimization that detects
> the "Obscured" X event and reduces the rendering frequency to a very low
> value, like 1 fps, while the window is obscured. [...]

Right now, the UI toolkit (Gtk+, Qt) already keeps track of windows being mapped or unmapped, and in some cases also the obscured or partially-obscured state. So it would be straightforward to extend the GstVideoSink base class with a property that encourages degraded rendering.

It is tempting to put the obscurity detection directly in the video sink; however, leaving this to the application allows for the possibility that the app is obscuring the video on purpose, yet doesn't want the quality degraded. Perhaps both methods are useful.

In general, however, rendering is a tiny fraction of the cost of playing video, so it would be good to also push this information upstream, either as part of a QoS event (e.g., proportion=0.0 for video that will be obscured), or (better) an extension of the QoS event. That way, video decoders can avoid decoding the video.

And, for bonus points, as there is a trend in the industry toward stream switching for video over the internet, the QoS information could be used to select a stream with a low video bit rate, or an audio-only stream.

David
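For illustration, a sketch of that application-side approach using GTK+ 2's visibility notifications, assuming a hypothetical "degraded-rendering" boolean property on the sink (the property name is made up here; the GDK signal and constants are standard):

#include <gtk/gtk.h>
#include <gst/gst.h>

static gboolean
on_visibility_notify (GtkWidget *widget, GdkEventVisibility *event,
    gpointer user_data)
{
  GstElement *videosink = GST_ELEMENT (user_data);
  gboolean obscured = (event->state == GDK_VISIBILITY_FULLY_OBSCURED);

  /* "degraded-rendering" is a made-up property name; the point is only
   * that the application, not the sink, decides when to degrade */
  g_object_set (videosink, "degraded-rendering", obscured, NULL);

  return FALSE;
}

/* during setup, something like:
 *
 *   gtk_widget_add_events (video_widget, GDK_VISIBILITY_NOTIFY_MASK);
 *   g_signal_connect (video_widget, "visibility-notify-event",
 *       G_CALLBACK (on_visibility_notify), videosink);
 */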
Thanks for the suggestions. I have moved the obscurity detection to the application layer (Gtk/Qt) and let the application inform the video sink, through a property, when degraded rendering is acceptable.
Currently I am designing the event propagation mechanism. I feel we should avoid a subtype of the QoS event, otherwise all related handlers would need to be updated for such a special case (proportion=0.0). But I am not sure whether we should extend the QoS event or add a special new event called "OBSCURED". If we extend the QoS event, it seems a little ungraceful and not very general, since it only leads to degraded quality and only applies to video. What do you think? (A sketch of option 2 is shown below.)

1. event extension: GStreamer::Event::QOS::OBSCURED
2. new event:       GStreamer::Event::OBSCURED

/* original event types */
GStreamer::Event::FlushStart
GStreamer::Event::FlushStop
GStreamer::Event::EOS
GStreamer::Event::NewSegment
GStreamer::Event::Tag
GStreamer::Event::BufferSize
GStreamer::Event::QOS
GStreamer::Event::Seek
GStreamer::Event::Navigation
GStreamer::Event::Custom::UP
GStreamer::Event::Custom::DS
GStreamer::Event::Custom::DS::OOB
GStreamer::Event::Custom::Both
GStreamer::Event::Custom::Both::OOB

Best Regards
Yongnian
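For illustration, a minimal sketch of option 2 that reuses the existing custom-event mechanism instead of adding a new GstEventType; the structure name "application/x-obscured" and its boolean field are hypothetical, not an agreed API:

#include <gst/gst.h>

static gboolean
send_obscured_event (GstElement *videosink, gboolean obscured)
{
  GstStructure *s;
  GstEvent *event;

  /* hypothetical structure name and field */
  s = gst_structure_new ("application/x-obscured",
      "obscured", G_TYPE_BOOLEAN, obscured, NULL);

  /* a custom upstream event travels from the sink toward the decoder,
   * which could skip decoding while the window is obscured */
  event = gst_event_new_custom (GST_EVENT_CUSTOM_UPSTREAM, s);

  return gst_element_send_event (videosink, event);
}

Elements that do not recognize the structure simply forward the event, so existing handlers would not need to change.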