News H.266 VVC Video Codec Promises to Cut File Sizes in Half

mwestall

Commendable
Oct 23, 2017
MKV to MP4 isn't transcoding, encoding or any sort of coding, it's just demuxing/remuxing into a new container. An encode is from the uncompressed or lightly compressed original into a more compressed version. Which takes hours.
Interns writing copy today?
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
MKV to MP4 isn't transcoding, encoding or any sort of coding, it's just demuxing/remuxing into a new container. An encode is from the uncompressed or lightly compressed original into a more compressed version. Which takes hours.
Interns writing copy today?
I assume someone must have already edited this to correct the error? But while MKV and MP4 are just containers, often you'll transcode to a lower bitrate in the process -- which is the real point. So if you have an MP4 captured at 50 Mbps and you want to convert that to a 16 Mbps MP4 (e.g., for YouTube), you're doing a transcode. You could get better quality (with a slower transcode) going to a ~10 Mbps H.265 format, and now apparently a 5 Mbps H.266 format will also deliver approximately the same quality.
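The arithmetic behind those sizes is easy to sanity-check; a rough sketch in Python (video stream only, container overhead and audio ignored):

```python
def video_size_gb(bitrate_mbps, minutes):
    """Approximate file size in GB for a given video bitrate.

    Bitrate is in megabits per second; 8 bits per byte,
    1000-based units as streaming bitrates are typically quoted.
    """
    seconds = minutes * 60
    megabits = bitrate_mbps * seconds
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

# A 90-minute video at the bitrates discussed above:
for mbps in (50, 16, 10, 5):
    print(f"{mbps:2d} Mbps -> {video_size_gb(mbps, 90):.1f} GB")
```

At these durations the bitrate dominates everything else, which is why halving the bitrate at equal quality is such a big deal for storage and streaming.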

I'm super curious to see what sort of hardware requirements will exist for decoding and encoding of H.266. I remember when H.264 was brand new and couldn't be decoded at anything close to real-time on most PCs. Then the hardware and software caught up. Then H.265 did it all again, and now H.266 looks to be repeating things. Of course, lots of stuff is still in H.264 because it's 'universally' supported these days. Plenty of PCs and smartphones still have issues with H.265 decoding.
 

InvalidError

Titan
Moderator
An encode is from the uncompressed or lightly compressed original into a more compressed version.
Although one would typically want to re-encode from as close to the original source as possible for the highest quality, a 3rd- or 4th-generation re-encode for whatever purpose (e.g., squeezing hours of video onto on-board storage for the kids' tablets) is still an encode too.

I'm super curious to see what sort of hardware requirements will exist for decoding and encoding of H.266.
I'm not particularly worried about decoding, I suspect most of the math is just more refined variants of existing algorithms and current hardware will be able to accelerate most of it just like how old hardware could accelerate most of h264 and h265 before full-blown hardware decoders became mainstream. Encoders is where the real challenge is.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
I'm not particularly worried about decoding, I suspect most of the math is just more refined variants of existing algorithms and current hardware will be able to accelerate most of it just like how old hardware could accelerate most of h264 and h265 before full-blown hardware decoders became mainstream. Encoders is where the real challenge is.
I haven't looked to see what's being done, but while the math may be similar in some aspects, there are usually some very computationally intensive sections. I'm sure most modern PCs with a good GPU and CPU will be fine decoding H.266. The real question is going to be stuff like laptops with integrated graphics. If the decoding complexity is four times higher (which isn't too unrealistic), anything prior to Ice Lake might come up short. I guess we'll see, and most likely H.266 won't see widespread use for many years -- at least if it's anything like HEVC/H.265. Most streaming videos are still using H.264 (or VP9 or some other codec) rather than H.265 AFAICT.
 

InvalidError

Titan
Moderator
The real question is going to be stuff like laptops with integrated graphics. If the decoding complexity is four times higher (which isn't too unrealistic), anything prior to Ice Lake might come up short.
I did a h265/4k decode test on my i5-3470 a while ago and could simultaneously decode five files in software before things broke down. I think almost anything newer with quad-cores should be fine with h266, especially if any sort of (I)GPU acceleration is available.
 

rickstockton

Prominent
Oct 27, 2018
This appears to be a big improvement in compression, while claiming to maintain high visual quality. But will the license terms be 'free' to use and re-implement? If it's not as "open" as AV1, then people like me will not be able to use it. Some content creators have extremely limited ability to pay for software and hardware upgrades. And some kind of VAAPI-equivalent hardware encoding is nearly mandatory for those of us without gigantic server farms of CPUs to encode content.

I am personally retired with limited income, and I create and transcode videos for a non-profit. I currently depend on ffmpeg as my primary encoder. If the software for encoding videos in this algorithm isn't at least as efficient as VP9, while remaining encodable on a desktop computer with a mid-grade video card, then I won't be able to go there.
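For reference, a two-pass VP9 encode with ffmpeg can be scripted; this sketch only builds the command lines, and assumes an ffmpeg with libvpx-vp9 and libopus is on the PATH (filenames and bitrate are placeholders):

```python
import subprocess

def vp9_two_pass(src, dst, bitrate="2M"):
    """Build the two ffmpeg invocations for a two-pass VP9 encode.

    Two-pass encoding analyzes the video first, then spends bits
    where they matter most -- better quality per byte than one pass.
    """
    first = ["ffmpeg", "-y", "-i", src, "-c:v", "libvpx-vp9",
             "-b:v", bitrate, "-pass", "1", "-an", "-f", "null", "-"]
    second = ["ffmpeg", "-y", "-i", src, "-c:v", "libvpx-vp9",
              "-b:v", bitrate, "-pass", "2", "-c:a", "libopus", dst]
    return first, second

# To actually run both passes:
# for cmd in vp9_two_pass("input.mov", "output.webm"):
#     subprocess.run(cmd, check=True)
```

The first pass discards output (`-f null -`) and just writes the analysis log that the second pass consumes.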
 

Makaveli

Distinguished
Jan 15, 2001
This is an interesting topic for sure.

My place has a mix of hardware; my plasma TV can decode H.264 but not H.265.

I have an Android box that can do both since it's newer.

I would love to see the comparison of all 3 codecs in the future.
 

Kamen Rider Blade

Honorable
Dec 2, 2013
H.266 has the technical side down along with the compression ratio IMO.

The Licensing is the more interesting part of the story at the moment.

Especially with the free codec options offering better compression ratios than H.264, and video quality closer to H.265's, even if they don't quite match the best of H.265.

I just hope they get the licensing aspect of the primary H.266 standard down pat, and that it doesn't turn into an entire <Mod Edit> like H.265.

And we need all the browsers to jointly work on implementing JPEG XT, and to start getting JPEG XT into all the image editors / viewers.

That can dramatically cut Image file sizes while retaining quality.
 

nofanneeded

Prominent
Sep 29, 2019
This is great news for TV channels all moving to 4K in the near future, when you need low bandwidth to broadcast. It also helps satellite channels, where fast internet is hard to find while the satellite link is low speed, and ships at sea as well.
 
I'm not particularly worried about decoding, I suspect most of the math is just more refined variants of existing algorithms and current hardware will be able to accelerate most of it just like how old hardware could accelerate most of h264 and h265 before full-blown hardware decoders became mainstream. Encoders is where the real challenge is.
If the format can offer similar image quality to H.265 at half the file size, I suspect one of the other formats would already be doing something similar, unless there was some catch. My guess is that they are not because the decoding performance demands become far higher. That, or the image quality isn't actually comparable at that level of compression.

And we need all the browsers to jointly work on putting JPEG XT into implementation and start getting JPEG XT into all the Image Editors / Viewers.

That can dramatically cut Image file sizes while retaining quality.
While support for more efficient image formats might be nice, there's arguably much less of a need there. Today's internet connections are mostly fast enough, and server costs are cheap enough, where the bandwidth required for downloading images is generally not much of a concern. The same goes for storage, where hundreds of high-resolution images can be stored for a few cents. With video, the file sizes tend to be far higher, so there can be much more benefit from moving to a new format.

And most importantly, the standard JPEG format is pretty much universally supported in software and on computing devices stretching back a couple decades. And for lossless web images, PNG is widely supported, at least for software and devices from the last decade or so. Software doesn't tend to change overnight to support the newest image formats, and if a format isn't already widely in use, there's even less incentive for developers to devote resources to supporting it. The PNG format came out in the mid-90s, as a much improved and open alternative to GIF, but didn't really receive full, proper support in all major web browsers until close to 15 years later, despite there not really being any alternative for lossless web images.

JPEG XT offers some additional features like HDR support, but HDR-capable hardware is still not all that widespread, and HDR images are relatively rare. So there arguably isn't that much need for those new features at the moment. And of course, there have been lots of other file formats pitched as being "JPEG successors" over the years that haven't really taken off. JPEG 2000, JPEG XR, WebP, HEIF and so on. And even if a format gets implemented properly in all major web browsers, most will likely stick to the older formats for some time, as there are lots of older devices that won't be compatible.
 

dalauder

Splendid
This is an interesting topic for sure.

My place has a mix of hardware; my plasma TV can decode H.264 but not H.265.

I have an Android box that can do both since it's newer.

I would love to see the comparison of all 3 codecs in the future.
I really do think there needs to be more mention of H264 when talking about H266. I'm pretty sure it's more in use than H265 today and that H265 will never be the dominant container.

In terms of quality, isn't H264 mostly supposed to do better anyway, with H265 making concessions for certain situations (movement, dark scenes, etc.)? Full disclosure: I read some article comparing H264 to H265 a few months back that I don't remember clearly, and settled on encoding my videos in H264.
 

dalauder

Splendid
While support for more efficient image formats might be nice, there's arguably much less of a need there. Today's internet connections are mostly fast enough, and server costs are cheap enough, where the bandwidth required for downloading images is generally not much of a concern. The same goes for storage, where hundreds of high-resolution images can be stored for a few cents. With video, the file sizes tend to be far higher, so there can be much more benefit from moving to a new format.

And most importantly, the standard JPEG format is pretty much universally supported in software and on computing devices stretching back a couple decades. And for lossless web images, PNG is widely supported, at least for software and devices from the last decade or so. Software doesn't tend to change overnight to support the newest image formats, and if a format isn't already widely in use, there's even less incentive for developers to devote resources to supporting it. The PNG format came out in the mid-90s, as a much improved and open alternative to GIF, but didn't really receive full, proper support in all major web browsers until close to 15 years later, despite there not really being any alternative for lossless web images.

JPEG XT offers some additional features like HDR support, but HDR-capable hardware is still not all that widespread, and HDR images are relatively rare. So there arguably isn't that much need for those new features at the moment. And of course, there have been lots of other file formats pitched as being "JPEG successors" over the years that haven't really taken off. JPEG 2000, JPEG XR, WebP, HEIF and so on. And even if a format gets implemented properly in all major web browsers, most will likely stick to the older formats for some time, as there are lots of older devices that won't be compatible.
I agree that there is a lack of incentives to change. PNGs take the same order of magnitude of space as JPEGs (where Bitmaps take like 10x), so I often just use PNGs and don't look back. I can't imagine any reason to switch to a lossy JPEG replacement.

These days, I pretty much would want my phone saving images as PNGs instead (not that they do, for some reason). But then you'd want it to be HDR/VR/360 compatible, if it's a new standard. That doesn't lead to JPEG XT, just an updated PNG lossless standard.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
I agree that there is a lack of incentives to change. PNGs take the same order of magnitude of space as JPEGs (where Bitmaps take like 10x), so I often just use PNGs and don't look back. I can't imagine any reason to switch to a lossy JPEG replacement.

These days, I pretty much would want my phone saving images as PNGs instead (not that they do, for some reason). But then you'd want it to be HDR/VR/360 compatible, if it's a new standard. That doesn't lead to JPEG XT, just an updated PNG lossless standard.
This depends greatly on the content being compressed and the image resolution. A 'high quality' (JPEG 60 quantization) 1080p JPG of a game screenshot is often around 350-400K, where a PNG of the same screenshot is more like 2MB. That's about five times the size. For a BMP, the size would be 8MB. So PNG for 'busy' images like photos and complex games is going to be one fourth the size of an uncompressed BMP, and JPG is one fourth the size of the PNG (give or take).

On the other hand, if you do a screenshot of a typical webpage where there are lots of runs of the same color, PNG can be smaller than JPG. Like a screenshot of me typing this message is a 185K PNG, or a 330K JPG (vs. 9437K for a BMP). Of course no one (sane) even uses BMPs these days.

For websites, PNG vs. JPG is still a big deal. Large files take longer to download and process, which means a webpage loads slower, which means Google may rank your page lower for being 'too slow.'
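Those ratios line up with the raw math; a quick sketch, where the PNG and JPG figures are just the approximate sizes quoted above and 32-bit RGBA is assumed for the uncompressed case:

```python
def raw_size_mb(width, height, bytes_per_pixel=4):
    """Uncompressed (BMP-style) image size in MB, assuming RGBA pixels."""
    return width * height * bytes_per_pixel / 1e6

raw = raw_size_mb(1920, 1080)  # ~8.3 MB uncompressed at 1080p
png = 2.0                      # approx. PNG of a busy game scene (from above)
jpg = 0.4                      # approx. high-quality JPG (from above)

print(f"BMP {raw:.1f} MB, PNG/BMP ratio {png/raw:.2f}, JPG/PNG ratio {jpg/png:.2f}")
```

The PNG lands near one fourth of the raw size, and the JPG near one fifth of the PNG, consistent with the "give or take" estimates in the post.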
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
I really do think there needs to be more mention of H264 when talking about H266. I'm pretty sure it's more in use than H265 today and that H265 will never be the dominant container.
Neither one is a container -- they're compression standards for video. But otherwise, yeah, H.264 is still dominant over H.265, because it was first, it's "good enough" and it's lighter weight when it comes to encoding and decoding. (Blu-ray used H.264 for most discs as well.) I could see H.265 starting to gain ground right as H.266 comes out -- though as others have mentioned, some other standards like AV1 are probably a better bet since they're royalty free.
 

dalauder

Splendid
This depends greatly on the content being compressed and the image resolution. A 'high quality' (JPEG 60 quantization) 1080p JPG of a game screenshot is often around 350-400K, where a PNG of the same screenshot is more like 2MB. That's about five times the size. For a BMP, the size would be 8MB. So PNG for 'busy' images like photos and complex games is going to be one fourth the size of an uncompressed BMP, and JPG is one fourth the size of the PNG (give or take).

On the other hand, if you do a screenshot of a typical webpage where there are lots of runs of the same color, PNG can be smaller than JPG. Like a screenshot of me typing this message is a 185K PNG, or a 330K JPG (vs. 9437K for a BMP). Of course no one (sane) even uses BMPs these days.

For websites, PNG vs. JPG is still a big deal. Large files take longer to download and process, which means a webpage loads slower, which means Google may rank your page lower for being 'too slow.'
Yes, PNGs are larger. But for personal usage and storage of media on my machine, it won't matter whether I use PNGs or JPEGs (I usually use high enough quality JPEGs that PNGs only take double the space). Online, JPEGs are often still the right choice.

Maybe I'm wrong, but I think a new standard for HDR would also want to avoid artifacting. Then again, I dislike artifacting more than most people.
 

dalauder

Splendid
Neither one is a container -- they're compression standards for video. But otherwise, yeah, H.264 is still dominant over H.265, because it was first, it's "good enough" and it's lighter weight when it comes to encoding and decoding. (Blu-ray used H.264 for most discs as well.) I could see H.265 starting to gain ground right as H.266 comes out -- though as others have mentioned, some other standards like AV1 are probably a better bet since they're royalty free.
Oops. Yep. Container would be mkv/mp4, right?
 
I can't imagine any reason to switch to a lossy JPEG replacement.
While I referred to them as "JPEG successors", a lot of these newer image formats do tend to include lossless compression as an option too, along with PNG's other major features like transparency. So they pretty much include the features of both JPEG and PNG combined. Of course, in terms of the compression ratio, you can't really improve much over PNG while remaining lossless, so file sizes would remain about the same for lossless encoding, and might only improve for lossy files.

One thing I don't really like about that is that there's more uncertainty about just what characteristics a given file has. With the common JPEG, PNG and many other formats, you can tell just from the file extension whether it's using lossy compression or not. With one format trying to do it all, you may have to investigate a bit more to determine what sort of compression a file is using.
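The extension-as-a-hint idea can be made concrete; this mapping is purely illustrative and not exhaustive (note that WebP and HEIF allow both lossy and lossless payloads, which is exactly the ambiguity described above):

```python
# True = always lossy, False = always lossless,
# None = the container allows both, so the extension alone can't tell you.
LOSSY_BY_EXTENSION = {
    ".jpg": True, ".jpeg": True,
    ".png": False, ".gif": False, ".bmp": False,
    ".webp": None, ".heif": None,
}

def is_lossy(filename):
    """Guess lossy/lossless from the extension; None means 'must inspect the file'."""
    ext = filename[filename.rfind("."):].lower()
    return LOSSY_BY_EXTENSION.get(ext)

print(is_lossy("photo.JPG"))    # True
print(is_lossy("shot.png"))     # False
print(is_lossy("image.webp"))   # None -- have to look inside the file
```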
 

alextheblue

Distinguished
Apr 3, 2001
3,078
106
20,970
2
I did a h265/4k decode test on my i5-3470 a while ago and could simultaneously decode five files in software before things broke down. I think almost anything newer with quad-cores should be fine with h266, especially if any sort of (I)GPU acceleration is available.
Five 4K HEVC videos actually decoding on screen, without dropping frames like crazy? Are you sure the software you were using WASN'T using the GPU and/or onboard Quick Sync, to at least accelerate the most demanding parts? A lot of playback software uses a hybrid approach where it leverages the GPU and/or hardware decode block (QS) for some/much of the demanding math, and the CPU handles the rest. I've seen actual zero-hardware-assist decoding of 4K H.265 and it can be somewhat resource heavy. Granted it varies quite a lot depending on the encoder settings / encode level, and the decoder in question... newer decoders are a lot more efficient than when it was brand new.
 

dalauder

Splendid
...
One thing I don't really like about that is that there's more uncertainty about just what characteristics a given file has. With the common JPEG, PNG and many other formats, you can tell just from the file extension whether it's using lossy compression or not. With one format trying to do it all, you may have to investigate a bit more to determine what sort of compression a file is using.
I would hate it if the new standard could be either lossy or lossless. That really should be absolute. Considering how H266 reduces file size, there really should be better, high-cycle-cost compression for larger image files. That might save A LOT of space if you were working with 50MP images in a few years. (Not that megapixel counts are actually increasing anymore. But maybe they would if file size could be kept in check?)

Do I have that right? That a small image wouldn't have as much opportunity to encode compression through patterns/repetition?
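That intuition is roughly right for lossless, dictionary-style compressors: a longer input gives the compressor more repetition to exploit, while fixed per-stream overhead dominates tiny inputs. A quick demonstration with Python's zlib, using synthetic repetitive bytes as a stand-in for image data:

```python
import zlib

def ratio(data):
    """Compressed size as a fraction of the original size (lower is better)."""
    return len(zlib.compress(data)) / len(data)

pattern = b"RGBARGBARGBA gradient row "  # a repeating 'image-like' pattern

small = pattern * 4     # ~100 bytes
large = pattern * 4000  # ~100 KB of the same repetition

print(f"small: {ratio(small):.2f}")  # header/checksum overhead dominates
print(f"large: {ratio(large):.2f}")  # repetition pays off: tiny fraction
```

Real image codecs are far more sophisticated than zlib, but the general trend holds: bigger inputs with more internal redundancy give the compressor more to work with.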
 

alchemy69

Distinguished
Jul 4, 2008
"Let's put this in perspective. A 90-minute 4K video consumes up to 10GB of space with the current H.265 codec. According to Fraunhofer HHI's figures, the same video with an identical level of quality would only require 5GB of space with H.266 "

Did we really need an example to tell us what 'cut in half' means?
 

nofanneeded

Prominent
Sep 29, 2019
"Let's put this in perspective. A 90-minute 4K video consumes up to 10GB of space with the current H.265 codec. According to Fraunhofer HHI's figures, the same video with an identical level of quality would only require 5GB of space with H.266 "

Did we really need an example to tell us what 'cut in half' means?
Oh well, it is the "Tom's Hardware" "For Dummies" version :p
 
