Accelerated subtitle rendering in 1.0

Accelerated subtitle rendering in 1.0

Arnaud Vrac
Hi all,

I'm trying to implement proper subtitle rendering with 1.0 and my
hardware elements. I'm trying to get the following pipeline to work (I
removed the queues for simplicity):

gst-launch filesrc location=file.mkv ! matroskademux name=dmx
    textoverlay name=overlay
    dmx.video_0 ! hwvideodec ! overlay.
    dmx.subtitle_0 ! ssaparse ! overlay.
    overlay. ! tee name=t
        t. ! hwvideosink plane=1
        t. ! overlayrender ! hwimagesink plane=2

As you can see, I want to avoid blending the subtitles directly onto
the video, since mapping the video data is costly. Instead I want to
render the video directly on one hardware plane and the subtitles on
another. Here is what happens in this example pipeline:

1/ Composition metadata is added to video buffers by the textoverlay element
2/ The buffers are sent to both:
  - a video sink that renders the video metadata on one plane
  - an image sink that renders the composition metadata on another
plane. The composition metadata is rendered to an ARGB surface by the
generic overlayrender element in this pipeline.

First, what do you think of this pipeline? Would it fall under the
use cases that are (or will be) handled by subtitleoverlay? My goal
is to make this work with playbin.

Second, here are the problems I have with this pipeline:
 - Allocation queries are not forwarded by the tee element, so I had
to hack the textoverlay element to always attach composition metadata
to the video buffers.
 - I have to force async=false on the hwimagesink element, otherwise
the pipeline will not preroll. Maybe I need to do something special
with GAP events or the GAP flag on buffers?
 - The video API only has blending functions, but in this case my
overlayrender element needs to render the composition metadata
directly without blending, since the allocated frame to render to is
already cleared. It would also be nice to have a function that fills
a frame with any color in any pixel format (for clearing, for example).

Thanks for your help.

--
Arnaud Vrac
_______________________________________________
gstreamer-devel mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel

Re: Accelerated subtitle rendering in 1.0

Tim-Philipp Müller
On Mon, 2012-08-06 at 20:51 +0200, Arnaud Vrac wrote:

Hi,

> 1/ Composition metadata is added to video buffers by the textoverlay element
> 2/ The buffers are sent to both:
>   - a video sink that renders the video metadata on one plane
>   - an image sink that renders the composition metadata on another
> plane. The composition metadata is rendered to an ARGB surface by the
> generic overlayrender element in this pipeline.
>
> First, what do you think of this pipeline? Would it fall under the
> use cases that are (or will be) handled by subtitleoverlay? My goal
> is to make this work with playbin.

If you want to make it work well with playbin, it would be much easier
if you made a hwsink element that takes care of both the video data and
the subtitle data in one go, i.e. it would render the video data to
plane 1 and upload the subtitle composition data to plane 2 as needed.


> Second, here are the problems I have with this pipeline:
>  - Allocation queries are not forwarded by the tee element, so I had
> to hack the textoverlay element to always attach composition metadata
> to the video buffers.
>  - I have to force async=false on the hwimagesink element, otherwise
> the pipeline will not preroll. Maybe I need to do something special
> with GAP events or the GAP flag on buffers ?

You need a queue in each branch after tee (at least until we fix tee to
include them..).
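
For reference, the original pipeline with a queue in each branch would
look something like this (a sketch only; hwvideodec, hwvideosink,
hwimagesink and overlayrender are the hardware-specific elements from
the original post, not stock GStreamer elements):

```
gst-launch filesrc location=file.mkv ! matroskademux name=dmx
    textoverlay name=overlay
    dmx.video_0 ! queue ! hwvideodec ! overlay.
    dmx.subtitle_0 ! queue ! ssaparse ! overlay.
    overlay. ! tee name=t
        t. ! queue ! hwvideosink plane=1
        t. ! queue ! overlayrender ! hwimagesink plane=2
```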

>  - The video API only has blending functions, but in this case my
> overlayrender element needs to render the composition metadata
> directly without blending, since the allocated frame to render to is
> cleared. It would also be nice to have a function to fill a frame with
> any color in any pixel format (for clearing for example).

What pixel format do you need for your API?

The blending funcs are mostly meant as utility functions for overlay
elements that blend the subtitles on top of raw video data directly.

For your use case, the thought was that just getting the ARGB pixels
with

 gst_video_overlay_rectangle_get_pixels_argb() or
 gst_video_overlay_rectangle_get_pixels_unscaled_argb()

would usually be enough. Is your overlay plane in a different format
then?
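
A rough sketch of how an element might consume those (pseudocode
against the 1.0 overlay composition API; error handling, buffer
mapping and the exact flags argument omitted):

```
/* comp is the GstVideoOverlayComposition taken from the buffer's meta */
n = gst_video_overlay_composition_n_rectangles (comp);
for (i = 0; i < n; i++) {
  rect = gst_video_overlay_composition_get_rectangle (comp, i);
  gst_video_overlay_rectangle_get_render_rectangle (rect, &x, &y, &w, &h);
  pixels = gst_video_overlay_rectangle_get_pixels_unscaled_argb (rect, flags);
  /* map 'pixels' and upload the ARGB data to the plane at (x, y, w, h) */
}
```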

Cheers
 -Tim


Re: Accelerated subtitle rendering in 1.0

Arnaud Vrac
On Mon, Aug 6, 2012 at 10:02 PM, Tim-Philipp Müller <[hidden email]> wrote:

> On Mon, 2012-08-06 at 20:51 +0200, Arnaud Vrac wrote:
>
> Hi,
>
>> 1/ Composition metadata is added to video buffers by the textoverlay element
>> 2/ The buffers are sent to both:
>>   - a video sink that renders the video metadata on one plane
>>   - an image sink that renders the composition metadata on another
>> plane. The composition metadata is rendered to an ARGB surface by the
>> generic overlayrender element in this pipeline.
>>
>> First, what do you think of this pipeline? Would it fall under the
>> use cases that are (or will be) handled by subtitleoverlay? My goal
>> is to make this work with playbin.
>
> If you want to make it work well with playbin, it would be much easier
> if you made a hwsink element that takes care of both the video data and
> the subtitle data in one go, i.e. it would render the video data to
> plane 1 and upload the subtitle composition data to plane 2 as needed.
>

OK, I wanted to avoid that so I could switch the overlay renderer
from a low-level one that blits directly to the plane to a composited
renderer (Wayland or X11). I guess your solution is much simpler.

>
>> Second, here are the problems I have with this pipeline:
>>  - Allocation queries are not forwarded by the tee element, so I had
>> to hack the textoverlay element to always attach composition metadata
>> to the video buffers.
>>  - I have to force async=false on the hwimagesink element, otherwise
>> the pipeline will not preroll. Maybe I need to do something special
>> with GAP events or the GAP flag on buffers ?
>
> You need a queue in each branch after tee (at least until we fix tee to
> include them..).

I did do that; as I said, I only removed the queues from the pipeline
for simplicity. However, it still stalls. I guess this won't happen
if I have a single element for rendering.

>
>>  - The video API only has blending functions, but in this case my
>> overlayrender element needs to render the composition metadata
>> directly without blending, since the allocated frame to render to is
>> cleared. It would also be nice to have a function to fill a frame with
>> any color in any pixel format (for clearing for example).
>
> What pixel format do you need for your API ?
>
> The blending funcs are mostly meant as utility functions for overlay
> elements that blend the subtitles on top of raw video data directly.
>
> For your use case, the thought was that just getting the ARGB pixels
> with
>
>  gst_video_overlay_rectangle_get_pixels_argb() or
>  gst_video_overlay_rectangle_get_pixels_unscaled_argb()
>
> would usually be enough. Is your overlay plane in a different format
> then?

ARGB is fine; however, I still need to blend all the rectangles into
a single transparent framebuffer, which means I have to use a library
like pixman. That's OK, but it wouldn't take much code to add a
gst_video_overlay_composition_render function that does not blend
with the destination surface pixels. I think it would be a good
addition to the public API.

Thanks for your comments !

--
Arnaud Vrac

Re: Accelerated subtitle rendering in 1.0

Tim-Philipp Müller
On Mon, 2012-08-06 at 22:41 +0200, Arnaud Vrac wrote:

> I did do that, as I said I removed the queues from the pipeline for
> simplicity. However it still stalls. I guess this won't happen if I
> have a single element for rendering.

Ah, sorry, missed that bit.

Your overlay element would probably need to generate a GAP event if an
input buffer does not have an overlay composition attached. This should
then also preroll the overlay sink. Alternatively, you can set the
overlay sink to async=false.
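
A pseudocode sketch of the GAP-event approach (assuming the incoming
buffer's timestamps are used for the gap):

```
/* in the overlay branch, when a video buffer carries no composition
 * meta, tell the downstream image sink there is nothing to show */
if (!gst_buffer_get_video_overlay_composition_meta (buf)) {
  gst_pad_push_event (srcpad,
      gst_event_new_gap (GST_BUFFER_PTS (buf), GST_BUFFER_DURATION (buf)));
}
```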


> ARGB is fine, however I still need to blend all the rectangles in a
> single transparent framebuffer, meaning that I have to use a library
> like pixman. It's ok, but it wouldn't add much code to get a
> gst_video_overlay_composition_render function that does not blend with
> the destination surface pixels. I think it would be a good addition to
> the public API.

Ah, right, that would be

http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-subtitle-overlays.txt#n423

then, I guess?

Feel free to provide a patch for that :)

Cheers
 -Tim


Re: Accelerated subtitle rendering in 1.0

Arnaud Vrac
On Mon, Aug 6, 2012 at 11:01 PM, Tim-Philipp Müller <[hidden email]> wrote:

> On Mon, 2012-08-06 at 22:41 +0200, Arnaud Vrac wrote:
>
>> I did do that, as I said I removed the queues from the pipeline for
>> simplicity. However it still stalls. I guess this won't happen if I
>> have a single element for rendering.
>
> Ah, sorry, missed that bit.
>
> Your overlay element would probably need to generate a GAP event if an
> input buffer does not have an overlay composition attached. This should
> then also preroll the overlay sink. Alternatively, you can set the
> overlay sink to async=false.
>
Generating a GAP event does not trigger preroll; instead I send an
empty buffer with the GAP flag set, and it now works.
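
The workaround sketched in pseudocode (an empty buffer, flagged as a
gap, pushed in place of real overlay data):

```
buf = gst_buffer_new ();
GST_BUFFER_PTS (buf) = pts;
GST_BUFFER_DURATION (buf) = duration;
GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_GAP);
gst_pad_push (srcpad, buf);
```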

>
>> ARGB is fine, however I still need to blend all the rectangles in a
>> single transparent framebuffer, meaning that I have to use a library
>> like pixman. It's ok, but it wouldn't add much code to get a
>> gst_video_overlay_composition_render function that does not blend with
>> the destination surface pixels. I think it would be a good addition to
>> the public API.
>
> Ah, right that would be
>
> http://cgit.freedesktop.org/gstreamer/gst-plugins-base/tree/docs/design/draft-subtitle-overlays.txt#n423
>
> then I guess ?
>
> Feel free to provide a patch for that :)
Actually, the gst_video_blend function is buggy: blending the overlay
on top of a transparent frame should be the same as copying the
overlay onto the frame, but the destination alpha is not modified in
the current implementation. I have attached a patch that properly
implements the 'over' operator for blending.
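
As a standalone illustration (plain C, one ARGB pixel with 8-bit
straight-alpha channels; this is a sketch of the operator being
described, not the attached patch code):

```c
#include <stdint.h>

typedef struct { uint8_t a, r, g, b; } argb_px;

/* Porter-Duff 'over' for straight (non-premultiplied) alpha.
 * The output alpha must be computed as well, so that blending onto a
 * fully transparent destination reduces to a plain copy of the source. */
static argb_px
blend_over (argb_px s, argb_px d)
{
  argb_px o;
  unsigned as = s.a, ad = d.a;
  unsigned ao = as * 255 + ad * (255 - as);   /* output alpha scaled by 255 */

  o.a = (uint8_t) ((ao + 127) / 255);
  if (ao == 0) {
    o.r = o.g = o.b = 0;                      /* fully transparent result */
    return o;
  }
  o.r = (uint8_t) ((s.r * as * 255 + d.r * ad * (255 - as) + ao / 2) / ao);
  o.g = (uint8_t) ((s.g * as * 255 + d.g * ad * (255 - as) + ao / 2) / ao);
  o.b = (uint8_t) ((s.b * as * 255 + d.b * ad * (255 - as) + ao / 2) / ao);
  return o;
}
```

With this, blending any source over a fully transparent destination
returns the source unchanged, which is exactly the property the
unpatched code was missing.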

With this patch I can render the overlay directly using the
gst_video_overlay_composition_blend function and cache the result
until the overlay seqnum changes.
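
A pseudocode sketch of that caching scheme, keyed on the composition's
sequence number (clear_frame is a placeholder for whatever fills the
frame with transparent black):

```
seq = gst_video_overlay_composition_get_seqnum (comp);
if (seq != cached_seq) {
  /* composition changed: clear the ARGB frame, blend the rectangles in */
  clear_frame (&argb_frame);
  gst_video_overlay_composition_blend (comp, &argb_frame);
  cached_seq = seq;
}
/* upload the cached ARGB frame to the overlay plane */
```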

--
Arnaud Vrac


Attachment: video_blend_over.patch (7K)