News: Asus Resurrects GeForce GT 710 GPU With Four HDMI Ports

"Instead, it's an affordable option for users that are looking for a upgrade above integrated graphics or want to use multiple monitors simultaneously "

This is useful if you need four HDMI ports, but for gaming, current-generation integrated graphics from AMD and Intel are better.

https://www.videocardbenchmark.net/...orce-GT-710-vs-Intel-HD-P630/3893vs2910vs3682

This compares the GeForce GT 710 to the integrated graphics in the Ryzen 5 3400G and in an Intel i5-9500.

Radeon RX Vega 11: 2102
GeForce GT 710: 607
Intel HD P630: 1727

Obviously, if you had a Pentium 4 with Potato HD graphics, the GeForce GT 710 would be an upgrade, assuming you found a motherboard with PCI-E 2.0 that used the same socket as the Pentium 4, but you have much bigger problems at that point.
 

BaRoMeTrIc

Honorable
Jan 30, 2017
164
16
10,715
"Instead, it's an affordable option for users that are looking for a upgrade above integrated graphics or want to use multiple monitors simultaneously "

This is useful if you need four HDMI ports, but for gaming, current-generation integrated graphics from AMD and Intel are better.

https://www.videocardbenchmark.net/...orce-GT-710-vs-Intel-HD-P630/3893vs2910vs3682

This compares the GeForce GT 710 to the integrated graphics in the Ryzen 5 3400G and in an Intel i5-9500.

Radeon RX Vega 11: 2102
GeForce GT 710: 607
Intel HD P630: 1727

Obviously, if you had a Pentium 4 with Potato HD graphics, the GeForce GT 710 would be an upgrade, assuming you found a motherboard with PCI-E 2.0 that used the same socket as the Pentium 4, but you have much bigger problems at that point.
Why does it have to be PCIe 2.0? It's a PCIe 3.0 device, but it will run just fine on PCIe 2.0 because of the low memory bandwidth. Even if it were an 8GB GDDR5 256-bit device, it would run on PCIe 2.0; you would just reach the bandwidth threshold and bottleneck. It will also run fine on PCIe 4.0 because PCIe is backward compatible.
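
To put rough numbers on that bandwidth gap, here is a back-of-the-envelope sketch; the GDDR5 data rate is an assumed, illustrative figure for a hypothetical big card, not the GT 710's actual memory:

```python
# Illustrative comparison: a hypothetical 256-bit GDDR5 card's local memory
# bandwidth vs. the PCIe link speeds being discussed. Numbers are nominal.
gddr5_gbps_per_pin = 7                     # assumed effective GDDR5 data rate (Gbit/s per pin)
bus_width_bits = 256
vram_bandwidth_gbs = gddr5_gbps_per_pin * bus_width_bits / 8   # ~224 GB/s

pcie2_x8_gbs = 4.0                         # PCIe 2.0 x8, after 8b/10b encoding
pcie3_x8_gbs = 7.9                         # PCIe 3.0 x8, after 128b/130b encoding

print(vram_bandwidth_gbs, pcie2_x8_gbs, pcie3_x8_gbs)
# Local VRAM bandwidth dwarfs either link, so the PCIe generation mostly
# limits host <-> card transfers, not what the card does out of its own memory.
```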
 

mchldpy

Distinguished
Jan 16, 2010
145
9
18,695
@InvalidError,
Why would you imagine "the primary market would be ... across multiple large-format public displays"?
It says with 1 monitor you get 60 Hz, with 2 you get 30 Hz; do you get 15 Hz with 4?
So you think this thing will push multiple large-format public displays?
No fan, no intestinal fortitude.
 
Why does it have to be PCIe 2.0? It's a PCIe 3.0 device, but it will run just fine on PCIe 2.0 because of the low memory bandwidth. Even if it were an 8GB GDDR5 256-bit device, it would run on PCIe 2.0; you would just reach the bandwidth threshold and bottleneck. It will also run fine on PCIe 4.0 because PCIe is backward compatible.

It needs to be a minimum of PCIe 2.0 because Nvidia and the math say so. (The math is at the end.)
https://www.evga.com/products/specs/gpu.aspx?pn=f345c5ec-d00d-4818-9b43-79885e8a161f

I am not sure where you are getting the PCIe 3.0 information.

The point I was trying to make was that if the motherboard doesn't have PCIe and instead has AGP, you are out of luck, but at that point you have bigger problems ... dead-end architecture.

Extending your argument that the PCIe level doesn't matter, let's compute the numbers for four 4K video feeds at 30 Hz, all through a graphics card on a PCIe 1.0 link.
(You may have meant that the version doesn't matter as long as you go higher / get more bandwidth, but I felt like doing the math anyway ... FOR SCIENCE!)

I'm assuming that if they had to use chroma subsampling on PCIe 2.0, then there won't be enough bandwidth at PCIe 1.0, but let's let the math tell us!

3840 x 2160 = 8,294,400 pixels on a single 4k screen

8,294,400 x 4 screens = 33,177,600 pixels on 4 - 4k screens

The screen is refreshed 30 times a second so

33,177,600 x 30 hertz = 995,328,000 pixels per second

But this is only for one color.

995,328,000 x 3 primary colors = 2,985,984,000

Then you divide by 100 million because math and you get

2,985,984,000 / 100,000,000 = 29.85 Gigabits per second or 3.73 Gigabytes per second of bandwidth to power all 4 screens.

Now let's compare this to the PCIe chart:

https://en.wikipedia.org/wiki/PCI_Express#History_and_revisions

PCIe 1.0 x8 link: 2 gigabytes per second

Oh darn, it isn't quite there.
A PCIe 1.0 x16 link would do it, but unfortunately we can't just glue on 8 more links.

PCIe 2.0 x8 link: 4 gigabytes per second

We have a winner!
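
(If you want to see where those chart numbers come from, they fall out of transfer rate x lane count x encoding efficiency; a quick sketch using the nominal spec figures:)

```python
# Sketch: nominal one-direction PCIe link bandwidth from rate, lanes and encoding.
def pcie_gbs(transfer_rate_gt_s, lanes, encoding_efficiency):
    """Usable bandwidth in GB/s for one direction of a PCIe link."""
    return transfer_rate_gt_s * lanes * encoding_efficiency / 8

print(pcie_gbs(2.5, 8, 8 / 10))     # PCIe 1.0 x8: ~2.0 GB/s (8b/10b encoding)
print(pcie_gbs(5.0, 8, 8 / 10))     # PCIe 2.0 x8: ~4.0 GB/s (8b/10b encoding)
print(pcie_gbs(8.0, 8, 128 / 130))  # PCIe 3.0 x8: ~7.9 GB/s (128b/130b encoding)
```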

Mathematically, it is not possible to run four 4K monitors at 30 Hz on anything short of a PCIe 2.0 x8 link, although a single 1280x720 monitor will work just fine.

I hope this sufficiently answers "Why does it have to be PCIe 2.0?"

Edit: Apparently the chroma subsampling mentioned earlier was only to cram the information over HDMI 1.4 and had nothing to do with the PCIe bandwidth.
 
Last edited:

InvalidError

Titan
Moderator
@InvalidError,
Why would you imagine "the primary market would be ... across multiple large-format public displays"?
It says with 1 monitor you get 60 Hz, with 2 you get 30 Hz; do you get 15 Hz with 4?
So you think this thing will push multiple large-format public displays?
Yes, a GT710 is perfectly fine for driving 1080p digital signage across multiple TVs. You don't need super-powerful GPUs to push some video and static images with transitions.
 
  • Like
Reactions: aldan and TJ Hooker

logainofhades

Titan
Moderator
The point I was trying to make was that if the motherboard doesn't have PCIe and instead has AGP, you are out of luck, but at that point you have bigger problems ... dead-end architecture.

LGA 775 had PCI-E, even on P4-compatible boards. While I didn't have a P4, I did have one of those boards; I ran mine with a Xeon X3210.

https://www.gigabyte.com/Motherboard/GA-EP35-DS3L-rev-1x/support#support-cpu

An IGP model

https://www.gigabyte.com/Motherboard/GA-G31M-S2L-rev-11-20/support#support-cpu
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
@InvalidError,
It says with 1 monitor you get 60 Hz, with 2 you get 30 Hz; do you get 15 Hz with 4?
So you think this thing will push multiple large-format public displays?
No fan, no intestinal fortitude.

The card will do 30 Hz using 2 or more screens at 2160p. At 1080p it should be able to do 60 Hz on 4 screens. InvalidError is right: this is perfect for marketing. You don't need RTX Titans to drive the digital menu boards at McDonald's.
 

InvalidError

Titan
Moderator
You don't need RTX Titans to drive the digital menu boards at McDonald's.
One company I worked for over 10 years ago did digital signage software and was running dual displays on hardware a fraction as powerful as modern entry-level stuff. On many chipsets, hardware acceleration had to be disabled due to driver or hardware bugs when handling multiple video overlays. Modern entry-level IGPs would have been worth their weight in gold back then if they had more outputs.
 

TJ Hooker

Titan
Ambassador
I'm assuming that if they had to use chroma subsampling on PCIe 2.0, then there won't be enough bandwidth at PCIe 1.0, but let's let the math tell us!
[...]
Display interface bandwidth has little to do with PCIe bandwidth. The display buffer doesn't traverse PCIe on its way to the monitor. Whether you're displaying a still image or gaming, the bandwidth required to drive the monitor(s) would be the same, but the PCIe traffic should be next to nothing for the former (which is why it typically downclocks to PCIe 1.0 speeds at idle) and much higher during the latter. Basically, a graphics card's ability to output X resolution at Y Hz is more or less independent of its PCIe interface.

Also, in your calculations for converting from pixels to bits you forgot about bits per channel/color (typically 8). And when going from bits to gigabits you divide by 1 billion, not 100 million.
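
As a rough sketch of where the corrected numbers land (assuming 3 channels at 8 bits each and ignoring blanking and protocol overhead):

```python
# Corrected back-of-the-envelope math for 4 x 4K screens at 30 Hz
# (assumes 8 bits per channel, 3 channels, no blanking or protocol overhead).
pixels_per_screen = 3840 * 2160            # 8,294,400
screens = 4
refresh_hz = 30
bits_per_pixel = 3 * 8                     # RGB, 8 bits per channel

bits_per_second = pixels_per_screen * screens * refresh_hz * bits_per_pixel
print(bits_per_second / 1e9)               # ~23.9 Gbit/s
print(bits_per_second / 8 / 1e9)           # ~2.99 GB/s
```

But again, that's display interface bandwidth; it never crosses the PCIe bus just to refresh the screens.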
 

InvalidError

Titan
Moderator
Mathematically, it is not possible to run four 4K monitors at 30 Hz on anything short of a PCIe 2.0 x8 link, although a single 1280x720 monitor will work just fine.
Since GPUs feed outputs via the on-board frame buffer, 0 Mbps of PCIe bandwidth is needed for static display refresh. PCIe traffic only happens when something needs to be updated in the frame buffer, either by direct VRAM manipulation (MMIO) or GPU acceleration functions.

If you open HWInfo and look at the "GPU Bus Load" for Nvidia GPUs, it will be at 0-1% no matter how many 4k120 monitors are attached while the GPU is doing nothing more than displaying mostly static screens.
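
If you don't have HWInfo handy, a similar quick check on Nvidia hardware is a sketch like this (assumes nvidia-smi is installed and on PATH; the query fields are standard nvidia-smi properties):

```python
# Query GPU utilization and the current PCIe link state via nvidia-smi.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=utilization.gpu,pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
# On a mostly static desktop this typically reports ~0% GPU utilization and a
# downclocked PCIe link, no matter how many monitors are attached.
print(out.stdout.strip())
```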
 
  • Like
Reactions: TJ Hooker
I'd imagine the primary market for those would be digital signage - running advertisements and other information across multiple large-format public displays.
Or really anyone who wants multiple displays on an office system. The graphics performance may be worse than modern integrated graphics (and not much better than Skylake's HD 530 from when the GT 710 came out over 4 years ago), but that's not too important if one is just running office applications.
 

mcgge1360

Reputable
Oct 3, 2017
116
3
4,685
"Instead, it's an affordable option for users that are looking for a upgrade above integrated graphics or want to use multiple monitors simultaneously "

This is useful if you need four HDMI ports, but for gaming, current-generation integrated graphics from AMD and Intel are better.

https://www.videocardbenchmark.net/...orce-GT-710-vs-Intel-HD-P630/3893vs2910vs3682

This compares the GeForce GT 710 to the integrated graphics in the Ryzen 5 3400G and in an Intel i5-9500.

Radeon RX Vega 11: 2102
GeForce GT 710: 607
Intel HD P630: 1727

Obviously, if you had a Pentium 4 with Potato HD graphics, the GeForce GT 710 would be an upgrade, assuming you found a motherboard with PCI-E 2.0 that used the same socket as the Pentium 4, but you have much bigger problems at that point.
It's not MEANT for gaming. It's also way cheaper than any other AMD or Nvidia card that does support 4 displays, and all those integrated graphics don't support 4 displays.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
For a modern system, you could just use a GT1030 to supplement the IGP's outputs instead. I'm guessing the GT710's other main selling point is legacy BIOS support for the ancient steaming piles likely still in use in digital signage today.
It's nice having 4 HDMI ports on one card as well, which probably makes software configuration easier, especially if you want to use 4 displays in a video wall and then a 5th primary display separately. Dell is in love with putting DisplayPort on its OptiPlexes while most monitors use HDMI, requiring an adapter or a DP-to-HDMI cable.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
As far as my own experience mixing VGA, HDMI, DVI and DP monitors goes, the multi-display setup is interface-agnostic. The only thing having the same interface for everything makes easier is not having to manage a bunch of different cables.
I would not group VGA in with the 3 digital interfaces. If you had 3 digital signals and the 4th was VGA, you would easily be able to pick out the monitor using the analog signal. Otherwise, correct, as I said, having all 4 as HDMI means you don't need to use separate adapters or converting cables.
 

InvalidError

Titan
Moderator
Otherwise, correct, as I said, having all 4 as HDMI means you don't need to use separate adapters or converting cables.
Except cables have nothing to do with software setup. My comment was specifically about you saying that the same interface everywhere made SOFTWARE setup easier; my point was that most software is completely interface-agnostic, so interfaces and cables make little to no difference. It makes HARDWARE setup easier by reducing the number of different parts you need to worry about.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
Except cables have nothing to do with software setup. My comment was specifically about you saying that the same interface everywhere made SOFTWARE setup easier; my point was that most software is completely interface-agnostic, so interfaces and cables make little to no difference. It makes HARDWARE setup easier by reducing the number of different parts you need to worry about.

This may just be an outdated perspective, as I haven't really messed with more than 2 monitors in a while, but it used to be the case that if you wanted total control over your multi-monitor setup, you really needed to use the utilities that came with your video card and not the built-in Windows functionality. This meant that you couldn't have total control over a setup that contained video cards from different manufacturers. In the early days of Windows 10, even dual-monitor configurations were completely broken on AMD hardware, which is what finally pushed me to Nvidia, as the broken AMD drivers were just infuriating to deal with.
 
For a modern system, you could just use a GT1030 to supplement the IGP's outputs instead. I'm guessing the GT710's other main selling point is legacy BIOS support for the ancient steaming piles likely still in use in digital signage today.
But why spend $85-$100 on a GT 1030 when one just needs some additional display outputs? A number of GT 710 models are available for under $50, while GT 1030s start at nearly double the price. The GT 1030 is arguably priced a bit high to replace it for that task, at least for budget office systems.
 

InvalidError

Titan
Moderator
In the early days of Windows 10, even dual-monitor configurations were completely broken on AMD hardware, which is what finally pushed me to Nvidia, as the broken AMD drivers were just infuriating to deal with.
I've been using multiple displays since the feature was introduced, starting with a pair of ATI Rage 32s. The only problem I have ever had with it was with my Radeon HD5700, which wouldn't go to 3D clocks while the video decoder was active, which meant I either couldn't play videos while gaming or had to turn off hardware video decoding... and AMD never fixing that issue is why I decided to get a GTX1050 when AMD discontinued driver support and 1GB wasn't enough for WoW anymore.

The only time I've had to use the in-driver multiple-display setup was when I tried triple-display gaming, where you need games to see only one large logical display, and even that still does not care about the interfaces between the monitor and GPU, aside from the logical display being limited to the worst display's specs or the GPU's output support limits, which usually come before the interfaces' limits. The only case I can think of where this is necessary in a commercial setup would be playing video across a display wall, where you need output to every panel to be in sync.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
The only time I've had to use the in-driver multiple-display setup was when I tried triple-display gaming, where you need games to see only one large logical display, and even that still does not care about the interfaces between the monitor and GPU, aside from the logical display being limited to the worst display's specs or the GPU's output support limits, which usually come before the interfaces' limits. The only case I can think of where this is necessary in a commercial setup would be playing video across a display wall, where you need output to every panel to be in sync.

Or pretty common gaming setups like this one:

https://www.youtube.com/watch?v=Toft6fMvByA