I agree it is not for gaming.
It's not MEANT for gaming. It's also way cheaper than any other AMD or Nvidia card that does support 4 displays, and all those integrated graphics don't support 4 displays.
Interesting.
Since GPUs feed outputs via the on-board frame buffer, 0 Mbps of PCIe bandwidth is needed for static display refresh. PCIe traffic only happens when something needs to be updated in the frame buffer, either by direct VRAM manipulation (MMIO) or by GPU acceleration functions.
If you open HWInfo and look at the "GPU Bus Load" for Nvidia GPUs, it will be at 0-1% no matter how many 4k120 monitors are attached while the GPU is doing nothing more than displaying mostly static screens.
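To put rough numbers on that (a back-of-the-envelope Python sketch; the 32-bit framebuffer format and ~250 MB/s of usable PCIe 1.0 x1 bandwidth are assumptions, not the specs of any particular card):

# Scanout of a static desktop is served from local VRAM, so the PCIe link
# sees essentially none of it; only framebuffer *updates* cross PCIe.
width, height, refresh_hz = 3840, 2160, 120
bytes_per_pixel = 4                               # assumed 8-bit RGBA framebuffer

frame_bytes = width * height * bytes_per_pixel    # ~33 MB per 4K frame
scanout_rate = frame_bytes * refresh_hz           # ~4 GB/s, all read from VRAM
pcie_1_x1 = 250e6                                 # assumed ~250 MB/s usable on PCIe 1.0 x1

print("VRAM scanout for one 4K120 display: %.1f GB/s" % (scanout_rate / 1e9))
print("PCIe traffic while the screen is static: ~0")
print("Uploading one full frame over PCIe 1.0 x1: %.2f s" % (frame_bytes / pcie_1_x1))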
Display interface bandwidth has little to do with PCIe bandwidth. The display buffer doesn't traverse PCIe on its way to the monitor. Whether you're displaying a still image or gaming, the bandwidth required to drive the monitor(s) would be the same, but the PCIe traffic should be next to nothing for the former (which is why the card typically downclocks to PCIe 1.0 speeds at idle) and much higher during the latter. Basically, a graphics card's ability to output X resolution at Y Hz is more or less independent of its PCIe interface.
Also, in your calculations for converting from pixels to bits you forgot about bits per channel/color (typically 8). And when going from bits to gigabits you divide by 1 billion, not 100 million.
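To spell out that correction (a minimal Python sketch, assuming 8 bits per color channel, 3 channels, and counting only active pixels with no blanking):

def display_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    # pixels per frame * frames per second * bits per pixel, then /1e9 (not /1e8) for Gbps
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(display_gbps(3840, 2160, 30))    # one UHD screen @ 30 Hz -> ~5.97 Gbps
print(display_gbps(15360, 2160, 30))   # 15360 x 2160 @ 30 Hz   -> ~23.89 Gbps

Signal-bandwidth calculators will report somewhat higher totals than this because they also count the blanking intervals around the active pixels.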
If you are displaying a slideshow, the only thing that needs to be sent to the GPU is the next JPG and a command to trigger the transition shader between the previous and next picture. For a 25MB image, that's 0.1s at PCIe 1.0x1 speed.
So what if we aren't displaying a single picture / slideshow and are instead playing a movie? How is the information stored in the MKV on my SSD going to pass through the graphics card's HDMI/DisplayPort without traveling through the PCIe link?
For video, the GPU does h264 in hardware. You pipe the 40-50Mbps video stream directly to the GPU and let it do its thing, which is only a couple of percent of PCIe 1.0x1 bandwidth per stream.
Video and (mostly) static images are trivial on remotely recent GPUs.
Ahh that makes perfect sense.
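As a sanity check on those figures (a rough sketch; the ~2 Gbps / ~250 MB/s usable rate for PCIe 1.0 x1 after 8b/10b encoding is an assumption about the link, and the 25 MB image and 50 Mbps stream are the workloads from the post above):

pcie_1_x1_mbps = 2000.0     # PCIe 1.0 x1: 2.5 GT/s, roughly 2 Gbps after 8b/10b encoding
pcie_1_x1_mb_s = 250.0      # the same link expressed in MB/s

print(25.0 / pcie_1_x1_mb_s)         # 25 MB JPG -> ~0.1 s to copy across the link
print(50.0 / pcie_1_x1_mbps * 100)   # 50 Mbps h264 stream -> ~2.5% of the link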
I was using
https://k.kramerav.com/support/bwcalculator.asp
To check my math, assuming their calculator is working correctly.
Setting it to
Custom > 8 bits per pixel > 15360 Horizontal Resolution > 2160 Vertical Resolution and 30 hertz
Results in a 29.86 Gbps Total Signal Bandwidth, which is fairly close to my Gigabits per second figure.
I'm not sure where you got 15360 horizontal pixels from. But yes, given that in your original you forgot to multiply by a factor of 8 and forgot to divide by 10, your calculations would have ended up relatively close (8/10 = 80%) to the proper value sort of by accident.
On the site you linked, if we set the resolution to UHD @ 30 Hz and then multiply by 3 (for three displays), it's 26.73 Gbps. Or we could do custom with 3840 x 3 = 11520 horizontal by 2160 vertical @ 30 Hz, which gives us 22.39 Gbps. The latter is lower presumably because it doesn't take into account the additional pixels required for display blanking or something (don't really know much about that).
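For what it's worth, here is the same comparison counting only active pixels (a sketch under the same 8-bits-per-channel assumption; the calculator's higher totals presumably come from the blanking it adds per display):

def active_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    # visible pixels only -- no horizontal/vertical blanking
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(3 * active_gbps(3840, 2160, 30))   # three separate UHD @ 30 Hz -> ~17.92 Gbps (calculator: 26.73)
print(active_gbps(11520, 2160, 30))      # one 11520 x 2160 @ 30 Hz   -> ~17.92 Gbps (calculator: 22.39)

The visible pixel count is identical either way; three separate displays just carry three sets of blanking intervals, which would explain why the calculator reports more for them.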
Back in the days of CRTs, I could run my Viewsonic p95f at 2048x1536 on an ATi Rage64 fine, though calibrating the image to look reasonably straight was a chore. Mostly used it at 1600x1200 since it was much easier to calibrate, more stable, and icons were large enough to be recognizable.
It's amazing just how little is needed for 2D displays. Hell, the old GeForce 8400 GS can do 1080p (it struggles, but it does it).
Ah, I remember those days. I actually still have a 19" CRT kicking around. I think that it's Viewsonic as well but I'm not certain. ATi cards, going all the way back to my first card, the ATi EGA Wonder (in 1987), always seemed to be able to force CRT monitors to work outside their rated specs. I think it's part of the reason that ATi owned the video card market for almost 20 years.
The only modems I have owned were internal (ISA) USRobotics. Just as a funny aside, do you remember when ATi sold modems?