[quotemsg=21742724,0,1287211]I see everyone talking about the insane hardware requirements for real-time encoding, but here's a question: why would any consumers need real-time encoding to 4K 10-bit now or any time in the next few years?
I currently have a small collection of 4K HDR Blu-ray rips, but if I'm going to re-encode them, it'd be to reduce the file size for archive purposes, so there's no real-time requirement there.
The only time I ever use real-time encoding is if I'm streaming to a device that requires a lower resolution and/or older codec (e.g., H.264). But why would any consumer need to "reduce" the quality to 4K 10-bit?
Some time in the future when I want to stream my 8K source content to my 4K HDR tablet, then I might care about real-time AV1 encoding. But right now, surely consumers would only ever use this for archive content?
Am I missing something?[/quotemsg]
Nah, you wouldn't. Even today QSV is capable of 4K H.265 10-bit in real time, or at least it's super fast; I'm not 100% sure about the real-time part (there's a rough way to test it yourself sketched below).
What I'm saying is that Intel is just pushing this technology right now so it becomes the dominant codec. In the future Intel will integrate it into QSV, and every Pentium will be able to do it.
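If anyone wants to check the real-time claim on their own box, here's a minimal sketch that drives ffmpeg's hevc_qsv encoder from Python and times the run. It assumes an ffmpeg build with QSV support and an Intel iGPU; "input.mkv", "output.mkv", the bitrate, and the profile settings are just placeholders, not tuned values.

[code]
# Rough sketch: 4K HEVC (H.265) 10-bit encode through Intel Quick Sync via ffmpeg.
# Assumes ffmpeg was built with QSV support and an Intel iGPU is present.
# "input.mkv" / "output.mkv" are placeholder file names.
import subprocess
import time

cmd = [
    "ffmpeg", "-y",
    "-i", "input.mkv",      # placeholder 4K source
    "-c:v", "hevc_qsv",     # Quick Sync HEVC encoder
    "-profile:v", "main10", # 10-bit profile
    "-pix_fmt", "p010le",   # 10-bit pixel format for QSV
    "-b:v", "20M",          # placeholder bitrate, not a tuned value
    "-c:a", "copy",         # leave audio untouched
    "output.mkv",
]

start = time.time()
subprocess.run(cmd, check=True)
print(f"Encode took {time.time() - start:.1f} s")  # compare against the clip's duration
[/code]

If the elapsed time comes out at or under the clip's runtime, that encode was real time for that source; anything longer tells you how far off it is.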