NVDEC and 10-bit HEVC decode


NVDEC and 10-bit HEVC decode

Samuel Hurst
Hi all,

I find myself experimenting with hardware accelerated HEVC decoding
under GStreamer, and I'm trying to use the nvdec plugin from plugins-bad.

When playing 8-bit HEVC content, playback is fine. However, when I try
playing back 10-bit HEVC content, it comes out as a garbled green mess
as shown in the screenshots in this Dropbox folder:

https://www.dropbox.com/sh/07wzec9lcj689tl/AADo8JuZDNXqCfkYktCcZmEJa?dl=0

I don't see anything in the log that immediately strikes me as the
reason it's not decoding properly (the log is also in the Dropbox link).
There's a WARN line where it's moaning about CUDA_ERROR_INVALID_HANDLE,
but it does this for the 8-bit streams as well and they work fine.

The public DASH MPDs I'm using to test with are available here:

8-bit:
http://testassets.dashif.org/#testvector/details/586fb3879ae9045678eacd3c

10-bit:
http://testassets.dashif.org/#testvector/details/586fb3879ae9045678eacd3f

If anyone has any ideas about what's going wrong, I'd love to hear from you.

For reference, I've tried running on the NVidia 390.48, 390.59 and
396.26 drivers, as well as on CUDA 9.1 and 9.2 and it's broken across
the board. I'm using a GT 1030 GPU on a Fedora 28 install, but it's also
broken on Debian 9. It works fine when I use the libav software decoder,
but that makes my CPU cry.

Best Regards,
Sam

Re: NVDEC and 10-bit HEVC decode

Nicolas Dufresne-5
On Monday 18 June 2018 at 18:08 +0100, Samuel Hurst wrote:

> When playing 8-bit HEVC content, playback is fine. However, when I try
> playing back 10-bit HEVC content, it comes out as a garbled green mess
> as shown in the screenshots in this Dropbox folder:
> [...]
When I look at this decoder, I see no code to support 10-bit decoding.
This suggests that it outputs 10-bit data as if it were 8-bit NV12.
Please file a bug: the decoder should either fail or support this;
right now it is incorrect.

https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer,component=gst-plugins-bad


Re: NVDEC and 10-bit HEVC decode

Matthew Waters
Currently nvdec doesn't work for 10 (or 12) bit decoding and always
assumes 8-bit mode is used, mostly for legacy reasons: the nvdec API
didn't have support for higher bit depths when the code was written.

Changing that requires adding 10-bit pixel formats (actually 10-bit in
16-bit) to libgstgl and plumbing that through for the decoder to output.
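
For illustration only, that would mean something along these lines; the
enum names and values below are hypothetical additions, not the current
GstGLFormat API:

/* Hypothetical 16-bit texture formats for gst-libs/gst/gl/gstglformat.h,
 * expressed with their GL sized-internal-format values. */
typedef enum {
  GST_GL_R16  = 0x822A,   /* GL_R16:  one 16-bit channel, e.g. a Y plane   */
  GST_GL_RG16 = 0x822C    /* GL_RG16: two 16-bit channels, e.g. a UV plane */
} GstGLFormat16Sketch;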

Cheers
-Matt

On 19/06/18 03:08, Samuel Hurst wrote:

> When playing 8-bit HEVC content, playback is fine. However, when I try
> playing back 10-bit HEVC content, it comes out as a garbled green mess
> as shown in the screenshots in this Dropbox folder:
> [...]




Re: NVDEC and 10-bit HEVC decode

Samuel Hurst
On 18/06/18 18:36, Nicolas Dufresne wrote:
> When I look at this decoder, I see no code to support 10-bit decoding.
> This suggests that it outputs 10-bit data as if it were 8-bit NV12.
> Please file a bug: the decoder should either fail or support this;
> right now it is incorrect.

As requested, I've created a new bug for this:
https://bugzilla.gnome.org/show_bug.cgi?id=796629

On 19/06/18 01:55, Matthew Waters wrote:
> Currently nvdec doesn't work for 10 (or 12) bit decoding and always
> assumes 8-bit mode is used, mostly for legacy reasons: the nvdec API
> didn't have support for higher bit depths when the code was written.
>
> Changing that requires adding 10-bit pixel formats (actually 10-bit in
> 16-bit) to libgstgl and plumbing that through for the decoder to output.

Should I add a related ticket to gst-plugins-base to get this change
into gstgl?


-Sam

Re: NVDEC and 10-bit HEVC decode

Matthew Waters
On 19/06/18 19:04, Samuel Hurst wrote:

> [...]
>
> Should I add a related ticket to gst-plugins-base to get this change
> into gstgl?
That's probably mostly covered by
https://bugzilla.gnome.org/show_bug.cgi?id=703347.  It could be extended
to explicitly mention 12/16-bit though.

Cheers
-Matt


Re: NVDEC and 10-bit HEVC decode

Samuel Hurst
On 19/06/18 10:04, Samuel Hurst wrote:

> [...]
> On 19/06/18 01:55, Matthew Waters wrote:
>> Currently nvdec doesn't work for 10 (or 12) bit decoding and always
>> assumes 8-bit mode is used, mostly for legacy reasons: the nvdec API
>> didn't have support for higher bit depths when the code was written.
>>
>> Changing that requires adding 10-bit pixel formats (actually 10-bit in
>> 16-bit) to libgstgl and plumbing that through for the decoder to output.

I've been trying to work on this through the week, but I seem to have
just gotten myself tied into knots and I'm now lost. I'd appreciate some
advice here.

I've managed to detect when the NVidia decoder is outputting video at a
bit depth other than 8. In parser_sequence_callback, if
format->bit_depth_luma_minus8 is non-zero, I set the CUDA output format
to cudaVideoSurfaceFormat_P016 (the only other format). According to
the source I've seen, internally this corresponds to a set of 16-bit
values; for 10-bit video the least significant bits are packed as zero.
Later, in handle_pending_frames, I'm setting the GstVideoFormat to
GST_VIDEO_FORMAT_P010_10BE before it goes into
gst_video_decoder_set_output_state. It now attempts to negotiate the new
format downstream, which is good.
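
In other words, something along these lines (again a sketch; how the
element stores its input state and dimensions is approximated here):

#include <gst/video/video.h>
#include <gst/video/gstvideodecoder.h>

/* Advertise a 10-bit output state once the parser reports >8-bit content. */
static void
negotiate_high_bit_depth (GstVideoDecoder * decoder,
    GstVideoCodecState * input_state, guint width, guint height)
{
  GstVideoCodecState *out;

  out = gst_video_decoder_set_output_state (decoder,
      GST_VIDEO_FORMAT_P010_10BE, width, height, input_state);
  gst_video_codec_state_unref (out);
}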

I've then, for want of a better word, bodged support for P010_10BE into
the GL elements so it negotiates fine. At this point the video output is
no longer scrambled green; it's now scrambled yellow and purple, and it
looks like pixels aren't being put in the right places. So it's possibly
still treating the stream as 8-bit when trying to convert.

Is what I'm seeing a consequence of the GL elements trying to read those
internal buffers as 8-bit when trying to do the YUV->RGB conversion, and
I need to force it to understand them in a different way? I've tried
adding GST_GL_RGB10 to GstGLFormat, and I've tried plugging that into
various places around in the gst-libs GL code but it's started exploding
in quite varied ways so I've pulled all those changes back out.

I'm sadly not that well versed in OpenGL so I might be trying all the
wrong things, but I'm willing to try and fix it if someone can offer me
some pointers as to where to look and in general what needs doing.
Ticket 703347 says to refer back to a summary, but I can't seem to find
one. I'm assuming this has been swallowed by bugzilla somewhere. It also
mentions shaders, so is this something that needs to be done in the GLSL
stuff in glcolorconvert?


Best Regards,
Sam
_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
Reply | Threaded
Open this post in threaded view
|

Re: NVDEC and 10-bit HEVC decode

Matthew Waters
On 23/06/18 02:57, Samuel Hurst wrote:

> [...]
> I'm sadly not that well versed in OpenGL so I might be trying all the
> wrong things, but I'm willing to try and fix it if someone can offer
> me some pointers as to where to look and in general what needs doing.
> Ticket 703347 says to refer back to a summary, but I can't seem to
> find one. I'm assuming this has been swallowed by bugzilla somewhere.
> It also mentions shaders, so is this something that needs to be done
> in the GLSL stuff in glcolorconvert?
Almost :)

You want to add 16-bit formats to GstGL rather than 10/12-bit formats. 
Either that or you need to convert in the decoder to 10/12-bit.

Here's a similar commit for adding ARGB64 support with RGBA16 textures:
https://cgit.freedesktop.org/gstreamer/gst-plugins-base/commit/gst-libs/gst/gl?id=3cfff727b19d450898dbe7931c53ea05bc2a9ac3
In which I've just noticed a bug: it possibly adds the YUY2/UYVY
formats twice!

Cheers
-Matt




Re: NVDEC and 10-bit HEVC decode

Samuel Hurst
On 23/06/18 01:16, Matthew Waters wrote:
> Almost :)
>
> You want to add 16-bit formats to GstGL rather than 10/12-bit formats.
> Either that or you need to convert in the decoder to 10/12-bit.

So should I instead create a new GST_VIDEO_FORMAT_P016, which is then
converted to GST_GL_RGB16? How do I tie the new format to be 16-bit per
component instead of 8-bit, or is this just automagically decided in the
GL core?

What else would I need to do to add this to the GST video format code?
Presumably I'd need to add a pack/unpack routine into
gst-libs/gst/video/video-format.c? Hopefully, this should just be a case
of copying the NV12 routines and making them work for 16-bit variables?

I'm guessing something like this:
https://cgit.freedesktop.org/gstreamer/gst-plugins-base/commit/?id=388b48511e90e008138e1842640b76934bd891dc
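
Concretely, I imagine something along these lines, sitting next to the
other unpack routines in video-format.c (a rough, untested sketch that
copies the shape of the existing NV12 unpack; it ignores the pack flags
and odd x offsets, and assumes the usual 16-bit AYUV unpack format as
the destination):

static void
unpack_P016 (const GstVideoFormatInfo * info, GstVideoPackFlags flags,
    gpointer dest, const gpointer data[GST_VIDEO_MAX_PLANES],
    const gint stride[GST_VIDEO_MAX_PLANES], gint x, gint y, gint width)
{
  /* Y plane: one 16-bit sample per pixel */
  const guint16 *sy =
      (const guint16 *) ((const guint8 *) data[0] + y * stride[0]);
  /* UV plane: interleaved 16-bit Cb/Cr pairs, subsampled 2x2 */
  const guint16 *suv =
      (const guint16 *) ((const guint8 *) data[1] + (y >> 1) * stride[1]);
  guint16 *d = dest;            /* A, Y, U, V per output pixel */
  gint i;

  for (i = 0; i < width; i++) {
    gint j = x + i;

    d[i * 4 + 0] = 0xffff;                /* A  */
    d[i * 4 + 1] = sy[j];                 /* Y  */
    d[i * 4 + 2] = suv[(j >> 1) * 2];     /* Cb */
    d[i * 4 + 3] = suv[(j >> 1) * 2 + 1]; /* Cr */
  }
}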

> Here's a similar commit for adding ARGB64 support with RGBA16 textures:
> https://cgit.freedesktop.org/gstreamer/gst-plugins-base/commit/gst-libs/gst/gl?id=3cfff727b19d450898dbe7931c53ea05bc2a9ac3
> In which I've just noticed a bug: it possibly adds the YUY2/UYVY
> formats twice!

Thanks, I was messing around in the right bits then :)

Re: NVDEC and 10-bit HEVC decode

Matthew Waters
On 25/06/18 20:47, Samuel Hurst wrote:

> On 23/06/18 01:16, Matthew Waters wrote:
>> Almost :)
>>
>> You want to add 16-bit formats to GstGL rather than 10/12-bit formats.
>> Either that or you need to convert in the decoder to 10/12-bit.
>
> So should I instead create a new GST_VIDEO_FORMAT_P016, which is then
> converted to GST_GL_RGB16? How do I tie the new format to be 16-bit
> per component instead of 8-bit, or is this just automagically decided
> in the GL core?
8-bit vs 16-bit is decided by the format one creates the texture with
and by the data/parameters passed to upload. By default the unsized
formats are 8-bit (RGBA, RGB, RG, RED), and there are different sized
formats that can be used (RGBA8, RGBA16, etc.). The other variable is
the type, which can be bytes (8-bit), shorts (16-bit), packed
(UNSIGNED_SHORT_565), etc., and which also indicates how the data is
split up. There is a table in the OpenGL specifications that outlines
which (unsized format, type, sized format) combinations are valid (and
which can be modified by GL extensions). GStreamer mostly keeps the
formats (unsized+type vs sized) fairly well aligned (same number of
components, equal pixel strides, etc.) so the GL driver doesn't need to
convert.
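
For example (desktop GL; the sizes and data pointer are placeholders), a
16-bit single-channel plane would be uploaded with something like:

/* Sized format GL_R16, unsized format GL_RED, type GL_UNSIGNED_SHORT.
 * A P016-style interleaved UV plane would use GL_RG16 / GL_RG instead. */
glTexImage2D (GL_TEXTURE_2D, 0, GL_R16, width, height, 0,
    GL_RED, GL_UNSIGNED_SHORT, plane_data);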

> What else would I need to do to add this to the GST video format code?
> Presumably I'd need to add a pack/unpack routine into
> gst-libs/gst/video/video-format.c? Hopefully, this should just be a
> case of copying the NV12 routines and making them work for 16-bit
> variables?
>
> I'm guessing something like this:
> https://cgit.freedesktop.org/gstreamer/gst-plugins-base/commit/?id=388b48511e90e008138e1842640b76934bd891dc

You might be able to find an easier one, but the structure would be the
same as that, yes. Creating an NV12 format for 16-bit values would be
the way to go.


