How does compression work for live-streaming?
Here are some Netflix calculations:
Resolution | FPS | Chroma | Bit depth (HDR/SDR) | Data rate |
---|---|---|---|---|
4K | 24 | 4:2:0 | 10 bit | 4.46 Gb/s |
1080p | 24 | 4:2:0 | 8 bit | 0.89 Gb/s |
Netflix requires 25 Mb/s and 5 Mb/s respectively, so each works out to 178.2x compression.
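Here is a rough sketch of that arithmetic in Python. It assumes the Gb/s figures are raw HDMI link rates, i.e. the pixel clock (which includes blanking intervals, 297 MHz for 4K24 and 74.25 MHz for 1080p24) times the bits per pixel for the chroma format and bit depth; that interpretation is my assumption, but it reproduces the table exactly:

```python
# Raw-rate sketch (assumption: data rate = HDMI pixel clock, which includes
# blanking, times bits per pixel for the chroma subsampling and bit depth).

def bits_per_pixel(bit_depth, chroma="4:2:0"):
    # Samples per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return bit_depth * samples

def raw_rate_bps(pixel_clock_hz, bit_depth, chroma="4:2:0"):
    return pixel_clock_hz * bits_per_pixel(bit_depth, chroma)

# Netflix rows: 2160p24 10-bit and 1080p24 8-bit, both 4:2:0.
# Assumed HDMI pixel clocks: 297 MHz (2160p24) and 74.25 MHz (1080p24).
uhd24 = raw_rate_bps(297e6, 10)     # ~4.46 Gb/s
hd24 = raw_rate_bps(74.25e6, 8)     # ~0.89 Gb/s

print(uhd24 / 25e6, hd24 / 5e6)     # 178.2x and 178.2x
```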
The numbers for Google Stadia are:
Resolution | FPS | Chroma | Bit depth | Data rate | Required bandwidth | Compression |
---|---|---|---|---|---|---|
4K | 60 | 4:2:0 | 10 bit | 8.91 Gb/s | 35 Mb/s | 254.6x |
1080p | 60 | 4:2:0 | 8 bit | 1.78 Gb/s | 10 Mb/s | 178.2x |
Using 4:2:2 instead, the data rates and compression ratios are:

- 4K: 11.88 Gb/s, 339.4x
- HD (1080p): 2.38 Gb/s, 237.6x
Assuming 4:2:0 for both, Netflix and Stadia come out to the same compression ratio (178.2x) for HD. For Stadia to use that same ratio at 4K, it would require 50 Mb/s.
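The same sketch, with the 60 Hz pixel clocks (again an assumption: 594 MHz for 2160p60, 148.5 MHz for 1080p60), reproduces the Stadia ratios, the 4:2:2 variants, and the 50 Mb/s figure:

```python
# Same raw-rate arithmetic for the 60 fps Stadia figures
# (assumed HDMI pixel clocks: 594 MHz for 2160p60, 148.5 MHz for 1080p60).

def raw_rate_bps(pixel_clock_hz, bit_depth, chroma="4:2:0"):
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return pixel_clock_hz * bit_depth * samples

uhd60 = raw_rate_bps(594e6, 10)                    # ~8.91 Gb/s
hd60 = raw_rate_bps(148.5e6, 8)                    # ~1.78 Gb/s
print(uhd60 / 35e6, hd60 / 10e6)                   # ~254.6x and 178.2x

# 4:2:2 instead of 4:2:0:
print(raw_rate_bps(594e6, 10, "4:2:2") / 35e6)     # ~339.4x
print(raw_rate_bps(148.5e6, 8, "4:2:2") / 10e6)    # ~237.6x

# Bandwidth Stadia would need at 4K to match the 178.2x ratio:
print(uhd60 / 178.2 / 1e6)                         # ~50 Mb/s
```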
Netflix presumably streams from pre-compressed video files, probably encoded with HEVC.
For a live sporting event, a couple of seconds of delay would be acceptable. That allows frames to be buffered and compressed with a hardware encoder.
But a streamed game should preferably not have more than about 100 ms of latency. Assuming 50 ms of network latency, that leaves roughly 50 ms for compression. At 60 fps, this would mean compressing 3 frames of 1080p down to about 0.5 Mb within 50 ms and pushing them through. Does this seem right? How might it be done?
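A quick check of that budget, assuming the HD Stadia figures above (1080p60 at 10 Mb/s) and the 50 ms network / 50 ms encode split:

```python
# Latency / bit budget check for 1080p60 game streaming
# (assumptions: 100 ms total, 50 ms network, 50 ms left for encoding,
# and the 10 Mb/s required bandwidth from the HD row above).

fps = 60
bitrate_bps = 10e6          # required bandwidth from the HD row above
encode_budget_s = 0.050     # 50 ms left after 50 ms of network latency

frames_in_budget = encode_budget_s * fps           # 3 frames
bits_in_budget = bitrate_bps * encode_budget_s     # 0.5 Mb
bits_per_frame = bitrate_bps / fps                 # ~167 kb per frame

print(frames_in_budget, bits_in_budget / 1e6, bits_per_frame / 1e3)
# -> 3.0 frames, 0.5 Mb, ~166.7 kb/frame
```

Under those assumptions the figures in the question are self-consistent: about 167 kb per 1080p frame, or 0.5 Mb per 50 ms.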