Caps for full-range YUV data?


Caps for full-range YUV data?

Gruenke, Matt
If I want to represent YUV data with a range of 0-255 per sample, is there any combination of caps I can use to do so?  Normally, the range in use is [16, 235] for luma and [16, 240] for chroma.

BTW, MPEG-4 supports both video_range types, depending on how the stream was encoded (16-235 being the default & the only one supported by a recent version of ffmpeg I looked at).  JPEG, on the other hand, is specified always to be full-range (although some video sources use 16-235, regardless).

Regards,
Matt Gruenke


Re: Caps for full-range YUV data?

David Schleef-2
On Thu, Jul 28, 2011 at 05:35:39AM -0400, Gruenke, Matt wrote:
> If I want to represent YUV data with a range of 0-255 per sample, is there any combination of caps I can use to do so?  Normally, the range in use is [16, 235] for luma and [16, 240] for chroma.

video/x-raw-yuv,color-matrix=jpeg

Unfortunately, very few elements support it.  It's "on my list" to
add support in videotestsrc and colorspace, but don't wait for me
if you need it.  It wouldn't be difficult to add.
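
If it helps, this is roughly how you'd request it from code (untested
sketch; whether negotiation actually succeeds depends on the elements
involved, and as I said, support is thin right now):

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *src, *filter, *sink;
  GstCaps *caps;

  gst_init (&argc, &argv);

  src = gst_element_factory_make ("videotestsrc", "src");
  filter = gst_element_factory_make ("capsfilter", "filter");
  sink = gst_element_factory_make ("fakesink", "sink");

  /* Request full-range (JFIF-style) YUV via the color-matrix field. */
  caps = gst_caps_from_string (
      "video/x-raw-yuv, format=(fourcc)I420, color-matrix=(string)jpeg");
  g_object_set (filter, "caps", caps, NULL);
  gst_caps_unref (caps);

  pipeline = gst_pipeline_new ("pipe");
  gst_bin_add_many (GST_BIN (pipeline), src, filter, sink, NULL);
  gst_element_link_many (src, filter, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  /* ... run a main loop here, then shut down and unref the pipeline ... */

  return 0;
}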

> BTW, MPEG-4 supports both video_range types, depending on how the stream was encoded (16-235 being the default & the only one supported by a recent version of ffmpeg I looked at).  JPEG, on the other hand, is specified always to be full-range (although some video sources use 16-235, regardless).

In video, pretty much nobody does full-range components, because
there aren't any underlying video standards for it.  It's JFIF, not
JPEG, that specifies colorspace (full-range, of course).  Container
formats (mainly quicktime) are supposed to specify video-range or
full-range for JPEG video, but often don't.  This creates a giant
mess.



David


RE: Caps for full-range YUV data?

Gruenke, Matt
On Thursday, July 28, 2011 at 17:51 -0400, David Schleef wrote:

> In video, pretty much nobody does full-range components, because
> there aren't any underlying video standards for it.

When it comes to conventional video sources (e.g. broadcast, authoring,
and post-production systems; consumer & professional digital video
camcorders), I think you're probably right.  However, if you look at
slightly less conventional sources, such as network cameras (aka IP
cameras) and digital still cameras, you might find that full-range video
is a bit more common.

Conversely, I know that some older MJPEG-based digital video capture &
editing systems used what I refer to as the '601' range (alluding to the
venerable component digital video standard, ITU-R Rec. BT.601).


> video/x-raw-yuv,color-matrix=jpeg

Regarding your caps, I'm concerned that the property name 'color-matrix'
invites conflating the video range with the color space.
MPEG-4 (ISO/IEC 14496) part 2 (Video) and part 10 (AVC/H.264) both treat
these as distinct & independent concepts.  In part 2, the relevant
fields are called video_range and matrix_coefficients (there's also
transfer_characteristics (i.e. gamma) and colour_primaries).  In part
10, four fields analogous to these exist, although the equivalent of
video_range is instead called the video_full_range_flag.

In DirectShow (and elsewhere), Microsoft also implemented enum
MFNominalRange (http://msdn.microsoft.com/en-us/library/ms705659) as an
independent field (MF_MT_VIDEO_NOMINAL_RANGE), such as in
VIDEOINFOHEADER2 (http://msdn.microsoft.com/en-us/library/bb970322).
Interestingly, it can indicate two further ranges, as well.


Therefore, for both practical reasons and the precedent set by these
standards & implementations, I feel a separate 'video-range' property
would be warranted.  Did you consider this option, or what's the
rationale behind 'color-matrix'?
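
To make it concrete (purely hypothetical caps - neither the field name
nor the values below exist today), I'm picturing something like:

video/x-raw-yuv, color-matrix=(string)sdtv, video-range=(string)broadcast
video/x-raw-yuv, color-matrix=(string)sdtv, video-range=(string)full

so that the matrix and the excursion could vary independently.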


BTW, thank you for videotestsrc!  It's surprisingly useful!  :)
(Hint to others: try changing the 'pattern' property.)


Regards,
Matt Gruenke


Re: Caps for full-range YUV data?

David Schleef-2
On Thu, Jul 28, 2011 at 11:37:07PM -0400, Gruenke, Matt wrote:
> Therefore, for both practical reasons and the precedent set by these
> standards & implementations, I feel a separate 'video-range' property
> would be warranted.  Did you consider this option, or what's the
> rationale behind 'color-matrix'?

Yes, I considered several options, including adding fields for transfer
characteristics, color primaries, etc.  It turns out that none of these
are relevant, because nobody uses them for real work.  Everyone sticks
to the two standards, BT.601 and BT.709, which we've conveniently labeled
as "sdtv" and "hdtv".  From those two values, you can infer the matrix,
offsets, excursions, color primaries, and transfer characteristic.  It's
also extensible, so "jpeg" can be easily added, for data in standard JFIF
matrixed form and sRGB primaries and transfer characteristic.
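
For reference, this is roughly the table those labels boil down to
(sketch only; coefficients straight from BT.601, BT.709 and JFIF):

/* What each "color-matrix" label implies for 8-bit data. */
typedef struct {
  const char *label;
  double kr, kb;        /* Y' = kr*R' + (1 - kr - kb)*G' + kb*B' */
  int y_min, y_max;     /* luma excursion */
  int c_min, c_max;     /* chroma excursion */
} MatrixInfo;

static const MatrixInfo matrices[] = {
  { "sdtv", 0.299,  0.114,  16, 235, 16, 240 },  /* BT.601, video range */
  { "hdtv", 0.2126, 0.0722, 16, 235, 16, 240 },  /* BT.709, video range */
  { "jpeg", 0.299,  0.114,   0, 255,  0, 255 },  /* JFIF: 601 weights, full range */
};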

Technically, there are different color primaries between NTSC and PAL
for sdtv, but the difference is so slight that it's almost completely
invisible.  Even the difference in primaries between SD and HD is
minor and nearly invisible.

In the unlikely event that someone needs transfer characteristic, etc,
fields in the future, we can easily expand to that too.

Over the years, I've heard a number of people ask "But 255 > 219, isn't
full-range YUV more precise?"  The answer is no, because it's almost
always compressed (not to mention noisy), in which case the precision
is entirely dependent on the codec bitrate.  And if you're doing
uncompressed and care about precision, you'd use 10-bit video.
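
To put numbers on it (back-of-the-envelope; 10-bit video range is
64..940):

#include <stdio.h>

int
main (void)
{
  printf ("8-bit full range  : 255 steps, %.3f%% per step\n", 100.0 / 255);
  printf ("8-bit video range : 219 steps, %.3f%% per step\n", 100.0 / 219);
  printf ("10-bit video range: 876 steps, %.3f%% per step\n", 100.0 / 876);
  return 0;
}

The gap between the first two is a rounding error compared to what
compression does to the signal.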



David


RE: Caps for full-range YUV data?

Gruenke, Matt
Thanks for the prompt reply!


On Friday, July 29, 2011 02:02 -0400, David Schleef wrote:

> Over the years, I've heard a number of people ask "But 255 > 219,
> isn't full-range YUV more precise?"  The answer is no, because
> it's almost always compressed (not to mention noisy), in which
> case the precision is entirely dependent on the codec bitrate.
> And if you're doing uncompressed and care about precision, you'd
> use 10-bit video.

While you are correct that this is typically not an issue in practice,
I can point out two exceptions I've personally encountered during my
career.

First and foremost to my current interests, some analytic algorithms &
processing are sensitive to quantization.  The issue is not so much
whether the quantization is above the noise floor, but rather that its
aggregate structure can cause problems.

Years ago, I was involved in adding video effects to a popular nonlinear
video editing system (which was based on MJPEG compression).  Users were
quick to point out cases where "banding" (i.e. contours visible on
smooth gradients) was readily apparent.  It turned out that some of the
effects (especially blurs and certain color effects) made the
quantization introduced by rescaling between full- and 601-range readily
visible.  The effects had originally been written for a system which
used full-range.  In some cases, the effects could function effectively
as-is on 601-range video; in other cases, their source code had to be
altered for them to work properly.
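
For anyone curious, here's a quick way to see how much gets thrown away
in that rescale (illustrative, untested):

#include <stdio.h>
#include <math.h>

/* Count full-range 8-bit luma codes that collapse when rescaled to
 * 601 range (16..235): these collapsed codes are what effects like
 * blurs can smear into visible contours on smooth gradients. */
int
main (void)
{
  int collapsed = 0;

  for (int y = 0; y < 255; y++) {
    int a = 16 + (int) lround (y * 219.0 / 255.0);
    int b = 16 + (int) lround ((y + 1) * 219.0 / 255.0);

    if (a == b)
      collapsed++;
  }
  printf ("%d of 256 full-range codes are lost going to 601 range\n",
      collapsed);
  return 0;
}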


>> Did you consider this option, or what's the
>> rationale behind 'color-matrix'?

> Yes, I considered several options, including adding fields for
> transfer characteristics, color primaries, etc.

I guess part of my quibble with 'color-matrix' is that if you didn't
explain it to me, I'd still be thinking it determined only the specific
colorspace, using values like sdtv, hdtv, and jpeg as merely shorthand
for the default matrices used by those formats.  For such a broad cap, I
think a name like 'video-signal-type' would be more self-explanatory.


> It turns out that none of these are relevant, because
> nobody uses them for real work.

I don't mean to be glib, but there's certainly such a thing as a
self-fulfilling prophecy - or, to put it another way: if you don't build
it, it certainly won't get used.

Speaking for myself, we actually do use that stuff (in our old,
proprietary pipeline that we're moving away from), because we are trying
to analyze video content.  Since errors can accumulate, compound, and
change the characteristics of the data, it can pay off to get the
details right in each stage of processing.  Certain folks in video
editing, post-production, and mastering might care about precision,
artifacts, and processing-induced generation-loss.  Videophiles might
notice problems in certain corner cases, or in an A-B comparison with a
product using another video engine.  These are the kinds of folks who
can really help push the technology forward if you can get/keep them on
board.

I say this because it seems a shame to have spent so much time and
energy on making GStreamer so general, abstract, and flexible, only to
stop short of accurately capturing media signal semantics.  But that's just
my perspective, and I realize it doesn't count for much, as I'm not
exactly a primary contributor.


I'll conclude with one idea.  It seems like MPEG-4 tried to do something
similar to what you're after.  The fields I mentioned in my previous
message are all optional, including one I didn't mention: video_format.
This can take values such as NTSC, PAL, SECAM, and MAC.  The spec
doesn't indicate how this should be used, but you can imagine they
considered having it switch the defaults in the other fields.

What I'm getting at is that if you want to make it easy for the majority
of users, one idea might be to have some special utility functions (for
developers) & properties in capsfilter & capssetter (for command-line
users) for applying or filtering by certain video signal defaults (e.g.
SDTV/HDTV or NTSC/PAL/HDTV/JPEG).  Perhaps an approach like that might
keep life simple for the majority of users & developers, while enabling
those developers focused on specialized applications or the low-level
minutiae of video processing to get the details right.
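
For the developer-facing half, I'm imagining something along these lines
(hypothetical function, obviously - nothing like it exists today):

#include <gst/gst.h>

/* Map a named signal preset to fully-specified caps, so most callers
 * never have to think about ranges or matrices explicitly. */
static GstCaps *
video_signal_caps (const gchar * preset)
{
  if (g_strcmp0 (preset, "ntsc") == 0 || g_strcmp0 (preset, "pal") == 0 ||
      g_strcmp0 (preset, "sdtv") == 0)
    return gst_caps_from_string ("video/x-raw-yuv, color-matrix=(string)sdtv");
  if (g_strcmp0 (preset, "hdtv") == 0)
    return gst_caps_from_string ("video/x-raw-yuv, color-matrix=(string)hdtv");
  if (g_strcmp0 (preset, "jpeg") == 0)
    return gst_caps_from_string ("video/x-raw-yuv, color-matrix=(string)jpeg");
  return NULL;
}

The command-line half could then just be a property on capsfilter or
capssetter that accepts the same preset names.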


Thanks again for your quick and precise answers to my questions.  I
also appreciate your focus on ease of use & the core user community.  I
just hope we can find ways to accomplish this without limiting
applicability to problems in high-end, industrial, professional, and
scientific applications.


Regards,
Matt Gruenke
