Quote:
Originally Posted by kfbkfb
My idea to minimize the Digital Video compression/data reduction artifacts:
Do an (original) frame by (compressed) frame comparison/subtraction
(make sure the difference between the original and compressed
Digital Video is low, perhaps some sort of algorithm that takes into
account the color and detail sensitivity of the human eye when
determining the difference).
This method would tend to minimize poor "encodes" and could even
be quantified in terms of a "maximum difference" number printed on the
disc package.
Kirk Bayne
This is exactly what a good encode using modern codecs already does.
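The frame-by-frame comparison part of the idea already exists as objective quality metrics such as PSNR, SSIM and VMAF, which are routinely run against the source to check how far an encode has drifted; SSIM and VMAF are the ones that try to weight the difference for human sensitivity. As a rough sketch of the simplest of these (PSNR) in Python, assuming both versions have already been decoded into 8-bit numpy arrays (the helper names here are purely illustrative, not from any particular tool):

```python
# Minimal sketch of per-frame original-vs-encoded comparison using PSNR.
# Assumes frames are already decoded into matching 8-bit numpy arrays;
# the decoding step and function names are hypothetical.
import numpy as np

def psnr(original: np.ndarray, encoded: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit frames, in dB."""
    mse = np.mean((original.astype(np.float64) - encoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # frames are identical
    return 10 * np.log10((255.0 ** 2) / mse)

def worst_frame_psnr(original_frames, encoded_frames) -> float:
    """The 'maximum difference' number from the quote: the worst per-frame score."""
    return min(psnr(o, e) for o, e in zip(original_frames, encoded_frames))
```

The worst-frame figure is essentially the "maximum difference number" kfbkfb suggests printing on the package, though nobody actually puts it on the box.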
4:2:0 chroma subsampling works because the human eye doesn't resolve colour in as much detail as brightness, so the trick, in basic terms, is to keep one full-resolution luma channel and store the chroma information at lower resolution. It can cause visible artefacts, which I've noticed especially in deep reds on lower-resolution formats like DVD and even, on occasion, Blu-ray, though the Oppo 203 has impressed me with its chroma upsampling more than any other player I've owned, which makes this less obvious.
https://en.wikipedia.org/wiki/Chroma_subsampling
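For illustration, here is a toy Python sketch of what 4:2:0 subsampling does to a frame that has already been converted to YCbCr, assuming the planes are numpy arrays with even dimensions; real codecs use proper resampling filters rather than this crude block averaging:

```python
# Toy 4:2:0 chroma subsampling: luma stays at full resolution while each
# chroma plane is halved in both directions (one sample per 2x2 block).
# Assumes 8-bit numpy planes with even width and height.
import numpy as np

def subsample_420(y: np.ndarray, cb: np.ndarray, cr: np.ndarray):
    def half(plane: np.ndarray) -> np.ndarray:
        h, w = plane.shape
        # Average each 2x2 block into a single sample.
        return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)).astype(np.uint8)
    return y, half(cb), half(cr)  # luma untouched, chroma at quarter the samples
```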
Two or more passes help minimise artefacts by better judging where the bitrate needs to be prioritised, and AVC and HEVC are pretty good at hiding artefacts where the human eye tends not to see them.
https://en.wikipedia.org/wiki/Variab...-pass_encoding
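A crude sketch of why the extra pass helps, assuming the first pass has produced a per-frame "complexity" score (the numbers below are invented purely for illustration): the second pass can then share a fixed bit budget in proportion to that complexity instead of spending it evenly across easy and difficult frames.

```python
# Toy two-pass rate control: pass one measures how hard each frame is to
# compress, pass two splits a fixed budget in proportion to that measure.
# Complexity values here are made up for illustration only.
def allocate_bits(complexities, total_bits):
    total_complexity = sum(complexities)
    return [total_bits * c / total_complexity for c in complexities]

first_pass = [1.0, 4.0, 0.5, 2.5]           # e.g. static shot, explosion, fade, pan
print(allocate_bits(first_pass, 8_000_000))  # busy frames get more of the budget
```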
Blind encodes are not as good as those overseen by a discerning eye; a human can tweak further to ensure an encode comes through as invisibly as possible. This is what differentiates, let's say, a bog-standard Shout encode from an Arrow one by David MacKenzie, and it's why raw bitrate alone is not always a reliable judge of quality.
Even studio masters often have to employ compression due to the sheer size of uncompressed 2K and 4K video, though nowhere near as much compression as the consumer formats use.
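To put "sheer size" into numbers, a quick back-of-the-envelope calculation, assuming 4K at 24 fps with 10-bit 4:4:4 samples (master formats vary, so treat the exact figures as illustrative):

```python
# Rough data rate of uncompressed 4K video under the stated assumptions.
width, height = 3840, 2160
bits_per_sample = 10
samples_per_pixel = 3   # 4:4:4 - one Y, one Cb and one Cr sample per pixel
fps = 24

bytes_per_frame = width * height * samples_per_pixel * bits_per_sample / 8
bytes_per_second = bytes_per_frame * fps
print(f"{bytes_per_second / 1e6:.0f} MB/s, "
      f"{bytes_per_second * 3600 / 1e12:.1f} TB per hour")
# Roughly 750 MB/s, about 2.7 TB per hour, before any compression at all.
```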