Intel Releases Open Source Encoder for Next-Gen AV1 Codec

Well, at least the RAM part is relatively cheap, although 48 gigabytes is a rather strange number.

The 112-thread requirement is the killer.

112 threads means 56 physical cores with hyper-threading, or 28 cores (56 threads) per CPU in a dual-socket system.

On PCPartPicker, sorting by highest core count, the CPU with the most threads is the AMD Threadripper 2990WX at 32 cores (64 threads); unfortunately, Threadripper doesn't appear to be dual-socket capable.

Moving down the list, we have the Intel Xeon E5-2699 v4 with 22 cores (44 threads); unfortunately, even two of those only add up to 88 threads, leading me to believe a 4-socket system is the only way to support this.

Unless Intel has a new processor with 28 or more hyper-threaded cores being released soon.

This would make sense since they are pioneering the codec to begin with.
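
To make that socket arithmetic concrete, here's a quick back-of-the-envelope sketch in Python (the core counts are for the CPUs discussed here, plus the hypothetical 28-core part, all assuming 2 threads per core):

```python
# How many sockets of each CPU would it take to reach 112 logical cores?
import math

TARGET_THREADS = 112

cpus = {
    # name: (physical cores, threads per core)
    "AMD Threadripper 2990WX": (32, 2),    # 64 threads, single-socket only
    "Intel Xeon E5-2699 v4": (22, 2),      # 44 threads, dual-socket capable
    "Hypothetical 28-core CPU": (28, 2),   # 56 threads, as speculated above
}

for name, (cores, smt) in cpus.items():
    threads = cores * smt
    sockets = math.ceil(TARGET_THREADS / threads)
    print(f"{name}: {threads} threads per socket -> {sockets} socket(s) needed")
```

For the E5-2699 v4 the ceiling comes out to 3 sockets, and since multi-socket boards typically come in 2- or 4-way configurations, that effectively means a quad-socket class system, which matches the conclusion above.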
 

extremepenguin

Distinguished
May 21, 2009
You would think 48 GB is a strange number for RAM, but in dual-socket systems it is fairly common to see 48 or 96 GB in a machine. Or at least it was 3-4 years ago, when ECC memory was more expensive than it is now. Given the thread count required, you are looking at a 2-socket system at a minimum, so this would fit with many common server builds from 3-4 years ago, when they probably started the spec.
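
For what it's worth, 48 GB also falls out naturally from common DIMM layouts on those dual-socket boards; a rough sketch (the DIMM counts and sizes below are just typical examples, not anything from the SVT-AV1 documentation):

```python
# A few DIMM layouts that land on 48 or 96 GB in a dual-socket box.
configs = [
    ("6 x 8 GB (3 DIMMs per socket)", 6, 8),
    ("12 x 4 GB (6 DIMMs per socket)", 12, 4),
    ("12 x 8 GB (6 DIMMs per socket)", 12, 8),
]

for label, dimm_count, dimm_gb in configs:
    print(f"{label}: {dimm_count * dimm_gb} GB total")
```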
 

DerekA_C

Prominent
Mar 1, 2017
Threadripper 3 could end up as a 64-core / 128-thread part with that 7nm chiplet stuff, lol, leaving 16 threads to spare.
 
I see everyone talking about the insane hardware requirements for real-time encoding, but here's a question: why would any consumer need real-time encoding to 4K 10-bit now, or at any time in the next few years?

I currently have a small collection of 4K HDR Blu-ray rips, but if I'm going to re-encode them, it'd be to reduce the file size for archive purposes, so there's no real-time requirement there.

The only time I ever use real-time encoding is when I'm streaming to a device that requires a lower resolution and/or an older codec (e.g., H.264). But why would any consumer need to "reduce" the quality to 4K 10-bit?
Some time in the future, when I want to stream my 8K source content to my 4K HDR tablet, I might care about real-time AV1 encoding. But right now, surely consumers would only ever use this for archive content?

Am I missing something?
 

ET3D

Distinguished
Mar 12, 2012
That comment about 112 threads is really unclear. To quote: "at least 48GB of RAM is required to run a 4k 10bit stream multi-threading on a 112 logical core system". There's no indication of what the 112 logical cores are needed for, only that running on such a system requires 48GB. The user guide document omits the 112 logical core part when discussing RAM, simply saying that 48GB is required.

It may be that there's a per-thread RAM overhead, so that 48 GB is required on a 112-core system but, for example, only 16 GB would be required on a 32-core system. It's impossible to say based on the phrasing on that page, and I could find no other mention of this that might shed more light on the subject.
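
If the overhead really does scale with thread count, a linear model hung on the single published data point would look something like this (the base/per-core split below is pure speculation, only meant to illustrate the hypothesis):

```python
# Hypothetical model: total RAM = fixed base + per-logical-core overhead.
# The only published figure is ~48 GB at 112 logical cores, so the split
# between the base cost and the per-core cost is guesswork.
BASE_GB = 8.0                            # assumed fixed cost (buffers, lookahead, etc.)
PER_CORE_GB = (48.0 - BASE_GB) / 112     # ~0.36 GB per logical core, if linear

for cores in (16, 32, 64, 112):
    estimate = BASE_GB + PER_CORE_GB * cores
    print(f"{cores:>3} logical cores -> ~{estimate:.0f} GB")
```

With those made-up numbers a 32-core system lands near the 16 GB guess, but without an actual breakdown from Intel this is just curve-fitting one point.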
 


Nah, you wouldn't. Even today QSV is capable of 4K H.265 10-bit in real time, or at least it's super fast; I'm not 100% sure on the real-time part.
What I'm saying is that Intel is just trying to push this technology right now so it becomes the dominant codec; in the future Intel will integrate it into QSV and every Pentium will be able to do it.
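
For anyone who wants to test that claim, the usual way to drive QSV from software is ffmpeg's hevc_qsv encoder; a minimal sketch, assuming an ffmpeg build with QSV support, a supported iGPU and driver, and a hypothetical input.mkv (exact options vary between builds and versions):

```python
# Rough sketch: re-encode a 4K source to 10-bit HEVC with Intel Quick Sync
# via ffmpeg's hevc_qsv encoder. Treat it as a starting point, not a recipe.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "input.mkv",        # hypothetical 4K source file
    "-c:v", "hevc_qsv",       # Quick Sync HEVC encoder
    "-profile:v", "main10",   # 10-bit profile
    "-pix_fmt", "p010le",     # 10-bit pixel format expected by QSV
    "-b:v", "20M",            # example bitrate target
    "output.mkv",
]
subprocess.run(cmd, check=True)
```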
 

Yep, this is exactly my point. There's discussion about how the compute resources required for real-time encoding (whether hardware or software) are out of consumers' hands, but I'm guessing that by the time consumers have any vague requirement for real-time 4K 10-bit encoding, there will be consumer options available at reasonable prices.
 

stdragon

Admirable


Any consumer need to encode won't be met in software; rather, it will be handled by a dedicated ASIC embedded in devices like smartphones and GoPro units.