I see everyone talking about the insane hardware requirements for real-time encoding, but here's a question: why would any consumer need real-time 4K 10-bit AV1 encoding now, or at any time in the next few years?
I currently have a small collection of 4K HDR Blu-ray rips, but if I'm going to re-encode them, it'd be to reduce the file size for archive purposes, so there's no real-time requirement there.
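For context, my archival re-encodes are just something like this left running overnight (written from memory, so treat it as a sketch: the filenames, preset and CRF values are placeholders, and it doesn't pass through the HDR mastering metadata, which you'd want to handle properly for real rips):

    # slow, non-real-time 10-bit AV1 encode for archiving; audio and subtitles copied untouched
    ffmpeg -i rip.mkv -map 0 -c:v libsvtav1 -preset 4 -crf 28 -pix_fmt yuv420p10le -c:a copy -c:s copy archive.mkv

Whether that runs at 2 fps or 60 fps makes no difference, because nothing is waiting on the output.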
The only time I ever use real-time encoding is when I'm streaming to a device that needs a lower resolution and/or an older codec (e.g. H.264). But why would any consumer need to "reduce" quality down to 4K 10-bit in real time?
Some time in the future, when I want to stream my 8K source content to my 4K HDR tablet, I might care about real-time AV1 encoding. But right now, surely consumers would only ever use it for archiving content?
Am I missing something?