Question Which features help support 18 MONITORS, non-gaming, and NOT "video wall"?

xXPat

Distinguished
Jul 28, 2013
8
0
18,510
0
I'm trying to gain a better understanding of which features of high-end graphics cards are actually useful/relevant to my application. This is challenging because all the advertising and product literature is mostly focused on the gaming market which is not relevant to me. I seek advice on which features are actually relevant and worth paying for.

I never run games or any other programs designed to use SLI or Crossfire. My (uncertain) understanding is that there is no benefit whatsoever to configuring my hardware to use SLI or Crossfire because it doesn't do anything except help apps like games that are designed to use it. Confirmation on this point would be helpful.

My application is trading (financial markets), and it's all about screen real estate. My current configuration is (16) 4K monitors supported by (2) NVS-810 graphics cards. The reason I've favored that card is that it's the only card I'm aware of with 8 mDP outputs, allowing 16 total monitors on 2 cards. The screens mostly have live financial charts on them. It's graphical in nature, but the update rates are nowhere close to gaming. But I do look at the screens all day long, so steady, non-flickering images that reduce eye fatigue are very important. This is NOT a "video wall advertising" application where it just has to look good from a distance; I'm reading text and looking at detailed financial charts on these screens all day long.

Believe it or not, I would like to add 2 more 4K monitors (18 total). I already know from having tried it that (3) NVS-810s just don't work well together, even on my 44-lane i9-7940X CPU and ASRock X299 Taichi mobo w/ 128GB memory. My (sketchy) understanding is that more than 2 video cards is just plain not a good idea (for non-SLI/Crossfire configurations) because of PCIe lane contention issues, but I don't understand this very well.

The NVS-810s mostly work OK, but they only support 30 Hz refresh rates. It's a 5-year-old design... The 30 Hz displays look fine to me, and since I'm not gaming, I don't need 60 Hz for the sake of appearances. I've read that higher refresh rates reduce fatigue (and I'm looking at these things 12+ hours a day, so that matters). But I don't know whether this business about higher refresh rates being better for fatigue is urban legend or real science. Clarification on that point would be particularly helpful.

I do watch YouTube videos a fair amount, and they often look jumpy. I have no idea whether that's because the NVS-810s are working too hard supporting all those 4K monitors at once, or if it's just my network connection making the videos jumpy.

It seems there are several new cards on the market offering 60 Hz refresh on 6 (not 8) mDP outputs. At first glance it's tempting to think (3) of these new cards would give me (18) 4K/60 Hz outputs, and that sounds like an appealing upgrade. But I know for sure that (3) NVS-810s didn't work well together: when I tried that, the system seemed to freeze up and the mouse pointer tracking was suddenly very jumpy and inconsistent. With that experience, why should I assume (3) of these newer cards would work together? And just trying it is a pretty expensive way to find out! This is the point I'm in greatest need of advice on: should I be able to use (3) or even (4) 6-monitor cards on a single i9-7940X / X299 Taichi system, or is that always a bad idea? If the latter, it seems the NVS-810s are really my only choice to get 16 4K monitors, unless I've missed something (?).

Or maybe I just don't get it and to build a system to support this many monitors I should be using a different mobo and multiple CPUs or something. If that's the case, I welcome enlightenment.

It's also hard for me to make sense of the differences in features on the new cards RELATIVE TO MY NEEDS. The price difference between a $439 AMD FirePro W600 2GB card and a $2,990 FirePro W9100 32GB card is almost 7:1! And buying 3 of the latter costs almost $9k so it's not something one does without knowing whether or not he even needs the features it offers!

There are choices in between, but it's hard to tell from the specs which are relevant to my needs. More memory sounds better for supporting lots of monitors at once, and might make the jumpiness go away on YouTube vids. But I don't do gaming and don't care about gaming-scale frame rates or SLI/Crossfire features, so maybe those fancy cards would just be a waste of money?

All advice welcome!
 

InvalidError

Titan
Moderator
Driving an ungodly number of displays is the only genuinely niche requirement here, and it severely limits your options. Were it not for driver quirks, you could likely get away with using PCIe breakout risers (x16 to 4x x4) and a bunch of GT 1030s with 3x DP each, assuming such variants exist.

As far as GPUs are concerned, the main difference between running a ton of 4K displays and running a video wall on the same displays is the need to sync all outputs together across all GPUs. Beyond that, there is fundamentally no difference, so digital-signage and video-wall GPUs (GPUs designed primarily for high output count rather than high performance) are what you are looking for.

On the refresh rate vs. eye strain side of things, that is mostly a holdover from CRT days. LCDs hold a semi-permanent image, usually with either a constant-current backlight or high-frequency PWM dimming, so eye fatigue from flicker is usually not an issue even at sub-30 Hz refresh rates, except on gaming monitors where backlight strobing to reduce blur can cause issues. What 60 Hz still helps with is more fluid on-screen movement: smoother motion is less jarring, which causes less fatigue all around.
 

Tigerhawk30

Honorable
Dec 16, 2015
163
10
10,765
25
Not sure about the deepest of dives here, but I'll take a stab at this. Hopefully someone with far more intricate and detailed knowledge than I have can add some things. And hopefully I'm not playing Captain Obvious here; if I am, I apologize, as I'm not precisely sure what the knowledge level is.

I do believe that the easiest way to do what you're describing is to daisy-chain any and all monitors that have DisplayPort 1.2 or above. For this, your monitors should have native DisplayPort In and Out terminals or, barring that, there are external MST hubs to achieve the same effect. When I first did this, I went the latter route: my hub had a DP plug going into the GPU but three DVI outputs, so I could keep using my legacy monitors. At the time, DP monitors were prohibitively expensive for me, so this fit the bill. Screen real estate for college courses was something I absolutely loved, so I get the real-estate love.
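For what it's worth, the bandwidth arithmetic shows why 4K daisy-chaining is so tight on DP 1.2. A rough sketch (the HBR2 lane rates and 8b/10b encoding are per the DP 1.2 spec; the ~20% blanking overhead is my approximation, so treat the exact figures as ballpark):

```python
# Rough DisplayPort 1.2 (HBR2) bandwidth budget for MST daisy-chaining.
link_rate_gbps = 5.4 * 4              # 4 lanes x 5.4 Gbit/s (HBR2) = 21.6 Gbit/s raw
usable_gbps = link_rate_gbps * 8 / 10 # 8b/10b encoding leaves 17.28 Gbit/s usable

def stream_gbps(width, height, hz, bpp=24, overhead=1.20):
    """Approximate bandwidth of one display stream; ~20% blanking overhead assumed."""
    return width * height * hz * bpp * overhead / 1e9

print(round(usable_gbps, 2))                          # 17.28
print(round(stream_gbps(3840, 2160, 60), 1))          # one 4K60 stream: ~14.3 Gbit/s
print(round(stream_gbps(3840, 2160, 30), 1))          # one 4K30 stream: ~7.2 Gbit/s
print(int(usable_gbps // stream_gbps(3840, 2160, 60)))  # 4K60 panels per chain: 1
print(int(usable_gbps // stream_gbps(3840, 2160, 30)))  # 4K30 panels per chain: 2
```

In other words, a DP 1.2 chain carries only one 4K60 (or two 4K30) streams, which matches the experience that daisy-chaining only really pays off with lower-resolution monitors.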

I found an older article where Sapphire had actually built an 18-screen setup with three FirePro W9000 GPUs installed, but looking at that same video card on Amazon, it's $1k a pop... evidently it's possible, though:

https://www.tweaktown.com/news/30841/sapphire-have-a-crazy-insane-18-monitor-setup-running-from-three-gpu-s-at-computex/index.html

If you're looking to stream 4K content from YouTube, more VRAM and network bandwidth would greatly help. The higher the pixel count, the more resources those pixels demand.

I've found mixed reviews on the eye-strain question as it relates to refresh rate... I'm willing to say that the Hz rating doesn't increase or reduce eye strain. If you're looking to reduce strain at the monitor itself, see whether your present monitors have settings where the blue light is reduced or eliminated. Otherwise, I personally use a pair of those grandma overlay glasses to combat the strain (mine are from HornetTech) and they do a pretty good job for my old eyes. As an FYI, I have a three-monitor setup on my home desk, and when I'm working at home I use two monitors that are not the same: one is a 29" 75 Hz LG ultrawide, the other a 34" 144 Hz AOC ultrawide. I don't notice any more or less eye strain on one vs. the other, personally. To me, that's more eye candy than strain/no strain. The main thing the Hz rating would affect is your frame rate while watching YouTube videos, since I believe most of those are created at around 60 fps on average.

Hope this helps at least somewhat!
 
An interesting problem.
I think the quality issue is not the PCIe lanes, but the capability of the card itself to output 4K resolution at >30 Hz.
Looking up the NVS 810 in the TechPowerUp graphics database, I see that the card has approximately the capability of a GT 1030.
In theory it can handle 16.3 GPixel/s.
https://www.techpowerup.com/gpu-specs/nvs-810.c2772
By comparison, the much more expensive W9100 can do 59 GPixel/s.
https://www.techpowerup.com/gpu-specs/firepro-w9100.c2562
Since the W9100 is a dual-slot card, you may have trouble installing three of them on your motherboard. PCIe extender cables are a possible cobbled-together solution, the way miners did it.
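For context on those fill-rate figures, the raw scan-out demand of six 4K displays is simple arithmetic. Note this counts only pixels pushed per second; the display engine's per-output pixel-clock limit, which is typically what caps a port at 4K30 vs. 4K60, is a separate constraint:

```python
# Back-of-envelope scan-out demand for one card driving six 4K displays.
def gpixels_per_sec(monitors, width=3840, height=2160, hz=60):
    """Total pixels per second, in gigapixels, sent to the displays."""
    return monitors * width * height * hz / 1e9

print(round(gpixels_per_sec(6, hz=30), 2))  # six 4K30 panels: ~1.49 GPixel/s
print(round(gpixels_per_sec(6, hz=60), 2))  # six 4K60 panels: ~2.99 GPixel/s
```

Both totals sit well under the quoted fill rates, which suggests the 30 Hz ceiling comes from per-output display-engine limits rather than raw pixel throughput.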

Since you have three NVS 810 cards, did you try splitting up the load so each card drives only six outputs?
Did that make a difference?

If so, you could consider using a workstation motherboard and processor like this one, with seven single-width slots, and using only three connectors per card.
https://www.newegg.com/gigabyte-mw51-hp0-intel-xeon-w-series-processor-family-processor-tdp-up-to-140w/p/N82E16813145042?&quicklink=true

What is the make/model of your monitors?
If you are using 4K resolution, it must be hard to read text on the more distant of them.
 

xXPat

Tigerhawk30 said:
I do believe that the easiest way to do what you're describing is to daisy chain any and all monitors that have DisplayPort v1.2 or above.
I tried using DP daisy-chaining and quickly concluded it is only useful for connecting low-res monitors. I'm pretty sure I tried this in the early days of DP1.2, and it just didn't work with 4K or even 2K resolution.

I suppose it's plausible that with DP2.0 they might have increased the max resolution for daisychaining, but some of my monitors are only DP1.2 compatible.

Everybody suggests this approach first, but I've yet to encounter anyone who has actually built a working system in real life that daisy-chains 2K or higher resolution DP monitors. Even if it's theoretically possible on DP2.0, it wouldn't help with my DP1.2 monitors.

As soon as I tried to run the system with 3 NVS-810s, performance was very strange, with the mouse pointer locking up for several full seconds at a time. It felt like some sort of bus-contention or hardware-compatibility problem. I suppose it's plausible that some configuration or BIOS changes might have made it work, but I gave up pretty quickly. And yes, I did load-balance the monitors to 6 per card.
 

xXPat

geofelt said:
An interesting problem.
I think the quality issue is not the PCIe lanes, but the capability of the card itself to output 4K resolution at >30 Hz.
Looking up the NVS 810 in the TechPowerUp graphics database, I see that the card has approximately the capability of a GT 1030.
In theory it can handle 16.3 GPixel/s.
https://www.techpowerup.com/gpu-specs/nvs-810.c2772
By comparison, the much more expensive W9100 can do 59 GPixel/s.
https://www.techpowerup.com/gpu-specs/firepro-w9100.c2562
Since the W9100 is a dual-slot card, you may have trouble installing three of them on your motherboard. PCIe extender cables are a possible cobbled-together solution, the way miners did it.
Yes, it's an old card and I'd love to upgrade to something "modern". But the only way to support even my current 16-monitor configuration would be with three new 6-mDP cards. My mobo can accommodate 3 double-high cards. I just might need to add more cooling fans.

But if 3 cards don't all work together because of PCIe lane contention issues (which I don't pretend to understand in complete detail), the whole plan could backfire and 3 of those cards is an expensive experiment.

geofelt said:
If so, you could consider using a workstation motherboard and processor like this one, with seven single-width slots, and using only three connectors per card.
https://www.newegg.com/gigabyte-mw51-hp0-intel-xeon-w-series-processor-family-processor-tdp-up-to-140w/p/N82E16813145042?&quicklink=true
Please help me understand the rationale for this suggestion. Are you recommending this only because the 7-slot board will definitely accommodate 3 double-high cards, or is it because you think the Xeon-based mobo is better able to support 3 cards without running into PCIe lane contention problems? If it's just the former, I'm pretty sure 3 double-height cards can fit on my X299 Taichi mobo, although I'd have to take it apart to verify this.

geofelt said:
What is the make/model of your monitors?
If you are using 4K resolution, it must be hard to read text on the more distant of them.
7 of them are Samsung HDTVs (4x 55", 3x 65"). The rest are 27" to 32" computer monitors by Dell and ASUS. A few of them are actually only 2K res, but most are full 4K. To your final question: on the 55" and 65" monitors, the text is just right at the native 100% scaling. You have to increase the text scaling in Windows settings to 150% for the 30" monitors and 175% for the 27" monitors to get text sizes consistent with 100% on the 65" Samsungs.
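As an aside, those scaling factors line up roughly with pixel density. Assuming all panels are 3840x2160 (the diagonal sizes are taken from the post above), a quick PPI calculation:

```python
import math

# Pixels per inch for a 3840x2160 panel of a given diagonal size.
def ppi(diag_inches, w=3840, h=2160):
    return math.hypot(w, h) / diag_inches

for size in (65, 55, 32, 27):
    print(f'{size}": {ppi(size):.1f} PPI')
# 65": 67.8 PPI, 55": 80.1 PPI, 32": 137.7 PPI, 27": 163.2 PPI
```

The 27" panel is roughly 2.4x as dense as the 65" one, which is broadly why the smaller screens need 150-175% Windows scaling to match; differing viewing distances account for the rest.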
 
I am speculating that your current cards are having problems running six 4K monitors each.
In part because of the limited 2 GB of VRAM, but probably because the internal graphics chip is an old generation and not strong enough.
VRAM holds the framebuffers, textures, and other data needed to display an image.
If the GPU needs something that is not in VRAM, it must fetch it across the PCIe bus.
Hopefully from real system RAM and not from an HDD; an SSD would make a difference here.
Yes, such activity will stress the PCIe setup.
My suggestion is to test whether you get adequate results with three displays sharing the same 2 GB of VRAM and the same graphics engine.
If that works, a motherboard with six single-width PCIe slots could take six of those 2 GB cards, with each card supporting half as many monitors. I presume that adding three more 810 cards would not be all that expensive.
If three monitors per card does not work the way you want, I think you are going to need stronger cards.
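To put the 2 GB figure in perspective, here is a rough framebuffer estimate for one card driving six 4K desktops. The 32-bit color depth and compositor double-buffering are assumptions; real usage is higher once applications, the compositor, and video decode allocate their own surfaces:

```python
BYTES_PER_PIXEL = 4   # 32-bit RGBA desktop (assumed)
BUFFERS = 2           # front + back buffer per monitor (assumed)

def framebuffer_mib(monitors, width=3840, height=2160):
    """Approximate VRAM, in MiB, for bare desktop framebuffers."""
    return monitors * width * height * BYTES_PER_PIXEL * BUFFERS / 2**20

print(round(framebuffer_mib(6)))  # six 4K desktops: roughly 380 MiB
```

Bare scan-out fits comfortably in 2 GB, so if VRAM is a pressure point, it would come from browser and compositor surfaces plus decoded video frames rather than from the desktops themselves.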

A few other thoughts:
If you need to increase the text size on the smaller monitors, why not replace them with lower-resolution monitors in the first place?

I see no requirement that all cards be identical.
Could you not install just one stronger card for the most demanding displays?
That would let you lessen the load.

Could you replace two 27" monitors with a single 34" widescreen monitor?

I found a product which attaches two 4K monitors at 60 Hz via USB 3.0:
You can add more USB 3.0 support if needed.
 

Andyme177

Great
Apr 26, 2020
141
10
95
19
To save money, it might be an option to run multiple computers, each with the right number of connections for your needs. Computers designed for home-office users are fairly cheap and would be able to handle different display options. I wouldn't think you need more than 16 GB of RAM since you aren't doing any intensive processing.
 

xXPat

geofelt said:
A ssd would make a difference here.
No rotating HDDs on this system. All storage is M.2 NVMe.


geofelt said:
A few other thoughts:
If you need to increase text size on the smaller monitors, why not replace them with monitors that run on a lower resolution in the first place?
Because the charts render in much better detail at higher resolution. The reason the text has to be upsized is simply that the monitors are farther from my eyes than in a usual desktop setup.

Andyme177 said:
I wouldn't think you need more than 16 gb ram since you aren't doing any intensive processing.
The point of mentioning the high-end CPU and memory was simply to make it clear that a lack of CPU cycles or RAM is not the reason the system won't perform when a 3rd NVS-810 is added to the configuration. You have no way of knowing what processing I do or why I need 128GB. It's not relevant to this conversation.

Might this work for you?

Multi-monitor.com
What would the rationale be for trading down from a 14-core i9 CPU to a 4-core i7 CPU? To improve performance? Why would I buy a low-end system to replace a high-end one? That makes no sense.
 

Andyme177

What would the rationale be for trading down from a 14-core i9 CPU to a 4-core i7 CPU? To improve performance? Why would I buy a low-end system to replace a high-end one? That makes no sense.
First off, you said you aren't gaming and don't need a gaming system; most games out today work great with 16 GB. The tech that works on an i7 might be adaptable to your system; my only intention in sharing that information was to possibly save you money. I never presumed to know what processing you are doing, only what you stated you don't need. I'm now out of this conversation and wish you all the best in getting what you want.
 
