News Asus Resurrects GeForce GT 710 GPU With Four HDMI Ports

It's not MEANT for gaming. It's also way cheaper than any other AMD or Nvidia card that does support 4 displays, and all those integrated graphics don't support 4 displays.
I agree it is not for gaming.

The article stated

"Instead, it's an affordable option for users that are looking for a upgrade above integrated graphics or want to use multiple monitors simultaneously "

According to the benchmarks I found, even the integrated graphics in modern CPUs are stronger than a GeForce GT 710, so the only real purpose is wanting more display outputs.
 
Since GPUs feed their outputs from the on-board frame buffer, 0 Mbps of PCIe bandwidth is needed for static display refresh. PCIe traffic only happens when something needs to be updated in the frame buffer, either by direct VRAM manipulation (MMIO) or by GPU acceleration functions.

If you open HWiNFO and look at the "GPU Bus Load" for Nvidia GPUs, it will sit at 0-1% no matter how many 4K 120 Hz monitors are attached, while the GPU is doing nothing more than displaying mostly static screens.
Interesting.

So for a static desktop showing a single picture with icons, the image is stored in the on-board frame buffer and 0 Mbps of PCIe bandwidth is needed.
I am following so far.

So what if we aren't displaying a single picture / slideshow and are instead playing a movie?

How is the information stored in the MKV on my SSD going to pass through the graphics card's HDMI/DisplayPort without traveling through the PCIe link?
 
Display interface bandwidth has little to do with PCIe bandwidth. The display buffer doesn't traverse PCIe on its way to the monitor. Whether you're displaying a still image or gaming, the bandwidth required to drive the monitor(s) would be the same, but the PCIe traffic should be next to nothing for the former (which is why the card typically downclocks to PCIe 1.0 speeds at idle) and much higher during the latter. Basically, a graphics card's ability to output X resolution at Y Hz is more or less independent of its PCIe interface.

Also, in your calculations for converting from pixels to bits you forgot about bits per channel/color (typically 8). And when going from bits to gigabits you divide by 1 billion, not 100 million.
I was using

https://k.kramerav.com/support/bwcalculator.asp

to check my math, assuming their calculator is working correctly.

Setting it to

Custom > 8 bits per pixel > 15360 Horizontal Resolution > 2160 Vertical Resolution and 30 hertz

results in a 29.86 Gbps Total Signal Bandwidth, which is fairly close to my gigabits-per-second figure.

But having said that, if 29.86 Gbps isn't crossing the PCIe link while playing four 4K 30 Hz 8-bit color videos, then this is a moot point.

I understand the HDMI version has nothing to do with PCIe, but I am still having trouble seeing how the information (really, tons of information) arrives at the HDMI port of the graphics card and then travels over the cable to the monitor without first traveling over the PCIe link.
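
To put rough numbers on the pixel math from the quote above, here's a minimal Python sketch, assuming 8 bits per channel, 3 color channels, and no blanking overhead:

```python
# Quick sanity check of the pixel math above (assumes 8 bits per channel,
# 3 color channels, and no blanking, so it lands a bit below the Kramer
# calculator's 29.86 Gbps figure, which models full display timings).

def raw_display_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    """Uncompressed pixel data rate in Gbps, ignoring blanking intervals."""
    bits_per_second = width * height * refresh_hz * bits_per_channel * channels
    return bits_per_second / 1e9  # one billion bits per gigabit

# Four 4K monitors treated as one 15360 x 2160 surface at 30 Hz
print(f"{raw_display_gbps(15360, 2160, 30):.2f} Gbps")  # ~23.89 Gbps
```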
 
So what if we aren't displaying a single picture / slideshow and are instead playing a movie?

How is the information stored in the MKV on my SSD going to pass through the graphics card's HDMI/DisplayPort without traveling through the PCIe link?
If you are displaying a slideshow, the only thing that needs to be sent to the GPU is the next JPG and a command to trigger the transition shader between the previous and next picture. For a 25 MB image, that's 0.1 s at PCIe 1.0 x1 speed.

For video, the GPU does H.264 in hardware. You pipe the 40-50 Mbps video stream directly to the GPU and let it do its thing; that's roughly 2-2.5% of PCIe 1.0 x1 bandwidth per stream.

Video and (mostly) static images are trivial on any remotely recent GPU.
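
To put rough numbers on those two cases, here's a minimal sketch assuming PCIe 1.0 x1 offers roughly 250 MB/s (about 2000 Mbps) of usable bandwidth:

```python
# Back-of-envelope PCIe traffic for the slideshow and video cases above,
# assuming PCIe 1.0 x1 provides roughly 250 MB/s (~2000 Mbps) usable.

PCIE_1_0_X1_MBPS = 2000  # approximate, in megabits per second

# Slideshow: one 25 MB JPEG sent per transition.
jpeg_megabits = 25 * 8
print(f"25 MB image: ~{jpeg_megabits / PCIE_1_0_X1_MBPS:.1f} s to transfer")  # ~0.1 s

# Video: a 40-50 Mbps H.264 stream goes straight to the hardware decoder.
for stream_mbps in (40, 50):
    share = stream_mbps / PCIE_1_0_X1_MBPS * 100
    print(f"{stream_mbps} Mbps stream: ~{share:.1f}% of the link")  # ~2.0-2.5%
```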
 
If you are displaying a slideshow, the only thing that needs to be sent to the GPU is the next JPG and a command to trigger the transition shader between the previous and next picture. For a 25 MB image, that's 0.1 s at PCIe 1.0 x1 speed.

For video, the GPU does H.264 in hardware. You pipe the 40-50 Mbps video stream directly to the GPU and let it do its thing; that's roughly 2-2.5% of PCIe 1.0 x1 bandwidth per stream.

Video and (mostly) static images are trivial on any remotely recent GPU.
Ahh that makes perfect sense.

I totally forgot that the video is still in H.264 or another codec while it's traveling over the PCIe link.

I have a much greater appreciation for codecs now lol.
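
A rough illustration of what the codec is saving there, using the numbers from the posts above (assuming 8-bit RGB for the decoded frames, i.e. the same 24 bits per pixel as the display math):

```python
# How much smaller the compressed stream crossing PCIe is than the decoded
# frames the GPU produces from it: one 4K30 stream at 8-bit RGB vs ~50 Mbps H.264.

decoded_gbps = 3840 * 2160 * 30 * 3 * 8 / 1e9  # ~5.97 Gbps of raw pixels
encoded_gbps = 50 / 1000                       # 50 Mbps compressed stream

print(f"Decoded frames: {decoded_gbps:.2f} Gbps")
print(f"Encoded stream: {encoded_gbps:.2f} Gbps")
print(f"Compression:    ~{decoded_gbps / encoded_gbps:.0f}:1")  # ~119:1
```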
 
I was using

https://k.kramerav.com/support/bwcalculator.asp

to check my math, assuming their calculator is working correctly.

Setting it to

Custom > 8 bits per pixel > 15360 Horizontal Resolution > 2160 Vertical Resolution and 30 hertz

results in a 29.86 Gbps Total Signal Bandwidth, which is fairly close to my gigabits-per-second figure.
I'm not sure where you got 15360 horizontal pixels from. But yes, given that in your original you forgot to multiply by a factor of 8 and also forgot to divide by an extra factor of 10, the two mistakes mostly cancel (8/10 = 80%), so your calculation ended up relatively close to the proper value sort of by accident.

On the site you linked, if we set the resolution to UHD @ 30 Hz and then multiply by 3 (for three displays), it's 26.73 Gbps. Or we could do custom with 3840 x 3 = 11520 horizontal by 2160 vertical @ 30 Hz, which gives us 22.39 Gbps. The latter is lower presumably because it doesn't take into account the additional pixels required for display blanking or something (I don't really know much about that).
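
For comparison, here's a quick sketch putting the raw (no-blanking) pixel rate next to those two calculator figures, assuming 8 bits per channel and 3 channels; the gap is presumably that timing overhead:

```python
# Raw pixel rate for three UHD 30 Hz displays versus the calculator figures
# quoted above; the difference is presumably blanking/timing overhead.

def raw_gbps(width, height, hz, bits_per_channel=8, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

three_uhd = 3 * raw_gbps(3840, 2160, 30)
print(f"Raw pixels, 3 x UHD @ 30 Hz:    {three_uhd:.2f} Gbps")  # ~17.92 Gbps
print("Calculator, 3 x UHD @ 30 Hz:    26.73 Gbps")
print("Calculator, 11520x2160 @ 30 Hz: 22.39 Gbps")
```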
 
I'm not sure where you got 15360 horizontal pixels from. But yes, given that in your original you forgot to multiply by a factor of 8 and also forgot to divide by an extra factor of 10, the two mistakes mostly cancel (8/10 = 80%), so your calculation ended up relatively close to the proper value sort of by accident.

On the site you linked, if we set the resolution to UHD @ 30 Hz and then multiply by 3 (for three displays), it's 26.73 Gbps. Or we could do custom with 3840 x 3 = 11520 horizontal by 2160 vertical @ 30 Hz, which gives us 22.39 Gbps. The latter is lower presumably because it doesn't take into account the additional pixels required for display blanking or something (I don't really know much about that).

15360 horizontal pixels is 3840 x 4

A worst-case scenario of four 4K monitors.
 
It's amazing just how little is needed for 2D displays. Hell, the old GeForce 8400 GS can do 1080p (it struggles, but it does it).

There was one line in the article that I think could have been written better because I know that Zhiye is no dummy:

"The GeForce GT 710 doesn't even require a PCIe 3.0 slot; it's perfectly happy on a PCIe 2.0 x1 interface, presenting the opportunity to use the graphics card on older motherboards that lack a PCIe 3.0 slot. "
- Perhaps "The GeForce GT 710 doesn't even suffer a performance penalty from using even an older PCIe 2.0 x1 slot, it's perfectly happy in that scenario."

The only type of slot that any PCIe card "needs" is PCIe 1.0.
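
For context, here are the commonly cited per-lane bandwidth figures by PCIe generation (approximate):

```python
# Approximate usable bandwidth of a single PCIe lane per generation,
# to put "perfectly happy on a PCIe 2.0 x1 interface" in perspective.

pcie_x1_bandwidth_mb_s = {
    "PCIe 1.0": 250,  # 2.5 GT/s with 8b/10b encoding
    "PCIe 2.0": 500,  # 5.0 GT/s with 8b/10b encoding
    "PCIe 3.0": 985,  # 8.0 GT/s with 128b/130b encoding
}

for gen, mb_s in pcie_x1_bandwidth_mb_s.items():
    print(f"{gen} x1: ~{mb_s} MB/s per direction")
```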
 
It's amazing just how little is needed for 2D displays. Hell, the old GeForce 8400 GS can do 1080p (it struggles, but it does it).
Back in the days of CRTs, I could run my ViewSonic P95f at 2048x1536 on an ATi Rage64 just fine, though calibrating the image to look reasonably straight was a chore. I mostly used it at 1600x1200 since it was much easier to calibrate, more stable, and the icons were large enough to be recognizable.
 
Back in the days of CRTs, I could run my ViewSonic P95f at 2048x1536 on an ATi Rage64 just fine, though calibrating the image to look reasonably straight was a chore. I mostly used it at 1600x1200 since it was much easier to calibrate, more stable, and the icons were large enough to be recognizable.
Ah, I remember those days. I actually still have a 19" CRT kicking around; I think it's a ViewSonic as well, but I'm not certain. ATi cards, going all the way back to my first card, the ATi EGA Wonder (in 1987), always seemed to be able to force CRT monitors to work outside their rated specs. I think that's part of the reason ATi owned the video card market for almost 20 years.

Just as a funny aside, do you remember when ATi sold modems?