Question: If I add another RTX 2080 Ti in NVLink, will that also double my CPU usage?

SeriousGaming101

Reputable
Mar 17, 2016
For example,

Currently, with my 4k 144hz monitor, my PC usage is as follows under full gaming load:

9900k CPU usage: 30%
RTX 2080 Ti GPU usage: 100%


If I add another RTX 2080 Ti in NV-LINK, will that also double my CPU usage under full gaming load?

9900k CPU usage: 60%
RTX 2080 Ti GPU usage: 100%
RTX 2080 Ti GPU usage: 100%


Is this how scaling works?
 

Gam3r01

Titan
Moderator
No, it wouldn't double CPU load.
The only theoretical increase comes from the CPU sending information to the GPUs for rendering/display; most of your in-game CPU load is the CPU running the game itself.
A second card would only add marginal CPU load, if any.
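A toy model can make this concrete. The split below (25% for game logic, 5% per GPU for feeding frames to each card) is my own made-up illustration, not measured data:

```python
# Hypothetical split: most CPU load is the game simulation itself,
# plus a small per-GPU cost for submitting work to each card.

def cpu_load(game_logic_pct, per_gpu_overhead_pct, num_gpus):
    """Rough total CPU load: fixed game logic plus per-GPU submission work."""
    return game_logic_pct + per_gpu_overhead_pct * num_gpus

print(cpu_load(25, 5, 1))  # 30 -- one card
print(cpu_load(25, 5, 2))  # 35 -- two cards: nowhere near double
```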
 

hftvhftv

Honorable
Herald
May 26, 2014
No game uses 16 cores/32 threads, and the extra threads bring marginal gains in FPS. Four cores at a high clock rate are already enough; 6-8 cores is plenty of extra.
Huh? I'm assuming this person wants to upgrade, and I'm saying an upgrade is better spent on upgrading their CPU rather than getting another graphics card. Either that or save money for the 2180 Ti or 3080 Ti or whatever
 

Karadjgne

Titan
Herald
For years now, 4K has been pretty much limited to 60 Hz: bandwidth limits over HDMI or DP, the ability of GPUs to drive such a high resolution at higher fps, whatever the case. Most of that is getting fixed, with top-end PCs finally getting above refresh limitations in a broader range of games.

Now you want to throw in a high-refresh, GPU-hammering monitor? The CPU sets the fps limit. It pre-renders frames according to the game code, then ships each frame to the GPU, which finishes rendering it according to resolution and detail levels.

CPU usage isn't what it sounds like. A CPU always runs at 100% of its ability, so if it is capable of pre-rendering 100 frames a second, that's what it sends to the GPU. If it only takes 30% of its capacity to do so, that's the usage you see. It's like the carnival game whack-a-mole: you might only be able to smack that thing 10 times, but you don't hit it with 100% of your strength, or you'd break something. You hit it as fast as you can, not as hard as you can.

Combined, all a 144 Hz monitor does is break the 60 Hz limit. Outputs don't change: the CPU still puts out whatever it does, and the GPU puts out what it can. But instead of the refresh limit stopping the GPU from actually putting the higher fps on screen, a 144 Hz monitor allows it. Higher CPU fps won't change that; higher usage just means you are using more of the CPU to get the same results.

If you want more on screen, it'll be on the GPU, not the CPU. Not until the CPU's fps output drops below what the GPU can handle, and that's going to take a long time at 4K.
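The pipeline described above can be sketched as a simple bottleneck model; the 200/60 fps figures below are invented purely for illustration:

```python
# On-screen fps is set by the slower stage: CPU pre-render vs GPU render.

def on_screen_fps(cpu_max_fps, gpu_max_fps):
    """The slower of the two stages sets the frame rate."""
    return min(cpu_max_fps, gpu_max_fps)

def cpu_usage_pct(displayed_fps, cpu_max_fps):
    """Share of the CPU's pre-render capacity actually consumed."""
    return 100 * displayed_fps / cpu_max_fps

fps = on_screen_fps(cpu_max_fps=200, gpu_max_fps=60)  # GPU-bound, as at 4K
print(fps, cpu_usage_pct(fps, 200))  # 60 30.0
```

On this model, more GPU power raises on-screen fps (and with it CPU usage) until the CPU's own limit is reached; a faster CPU alone changes nothing while the GPU is the bottleneck.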
 

Third-Eye

Distinguished
Jun 26, 2011
For example,

Currently, with my 4k 144hz monitor, my PC usage is as follows under full gaming load:

9900k CPU usage: 30%
RTX 2080 Ti GPU usage: 100%


If I add another RTX 2080 Ti in NV-LINK, will that also double my CPU usage under full gaming load?

9900k CPU usage: 60%
RTX 2080 Ti GPU usage: 100%
RTX 2080 Ti GPU usage: 100%


Is this how scaling works?
What games are you running?
 
I'd certainly investigate SLI compatibility for every game you play or want to play, quite thoroughly, before forking out another $1,200...

Adding another GPU still might not allow 144 Hz at 4K in many of the newest games; 144 Hz is a challenge at 1080p/max in some games even with a 2080 Ti. Adding a second 2080 Ti while quadrupling the pixel count at 4K will hardly be a magic boost (relative to 1080p frame rates, I mean).
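The "quadrupling the pixel count" part is straightforward arithmetic:

```python
pixels_1080p = 1920 * 1080  # 2,073,600 pixels per frame
pixels_4k = 3840 * 2160     # 8,294,400 pixels per frame
print(pixels_4k / pixels_1080p)  # 4.0 -- four times the work per frame
```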
 

Karadjgne

Titan
Herald
With many games, do the research. There are multiple vids and reviews, tweaks and settings optimizations, etc., showing that Max isn't all it's cracked up to be. There are many occasions where maxing out a setting like clouds is next to useless visually (you really can't see the difference) but is a considerable drain on GPU resources compared to medium. It's not uncommon to see a 20-30 fps or better increase by lowering certain settings that you'll never notice visually.

You are shooting a white-tailed deer at 600 yards through a scope. Do you really need to see every single hair on its tail, or is a strong representation of hairs good enough? Oh wait, it's a scope: you don't even see its tail, you aren't concentrating on that, you're aiming at its head. But the GPU at max will still be rendering every strand, moving them when it twitches, putting your GPU at 100% usage for no reason. Which lowers fps, a lot.

It's 4K resolution. You really do not need max on much of anything.
 

jostegogar

Upstanding
Mar 26, 2019
Huh? I'm assuming this person wants to upgrade, and I'm saying an upgrade is better spent on upgrading their CPU rather than getting another graphics card. Either that or save money for the 2180 Ti or 3080 Ti or whatever
I'm sure the 9900K is better than any 3900X/3950X/3970X in terms of gaming, so getting a 3950X is not an upgrade; it's a downgrade, to be more precise.
 

hftvhftv

Honorable
Herald
May 26, 2014
I'm sure the 9900K is better than any 3900X/3950X/3970X in terms of gaming, so getting a 3950X is not an upgrade; it's a downgrade, to be more precise.
Ah yes, a cultured opinion. Also, for that comment to make sense you would have had to say "more accurate", not "more precise". The 9900K is better in some games, but the superior architecture and PCIe 4.0, along with faster memory support, would make me take a 3950X/3900X over a 9900K any day.
 

Karadjgne

Titan
Herald
It's a matter of usage. A few years back, Intel was the definite king of gaming and AMD's best FX effort didn't really come close. Until BF4. That game reversed the outlook for both Intel and AMD, as it leaned far more on thread count than on IPC and clock speeds. The HEDT CPUs stomped everything, with the i7-4790K strong on their heels, followed very closely (literally a few fps) by the FX-8350, which stomped the popular i5-4690K across the board.

That trend blew up, and is now how games are done: more threads. Many Ryzens have no issue keeping up with their Intel counterparts, and in some cases they beat them entirely.

But that's games. Adobe CC is also highly popular, and it's there that Ryzens again suffer. For now. Only because Adobe currently scales well only up to 8 threads, making the 9900K/9700K king. For now. The next gen of Adobe will see better scaling, probably up to 16 or more threads, and Ryzens will be right back up in the mix instead of at the bottom of the list.

The software used, be it games or other programs, determines exactly how a CPU performs and how useful it is. Just saying a 3900X or 3950X is a downgrade is ignorant: that only applies to certain software out now, not all software, and certainly not tomorrow's efforts.
 

wehler53

Honorable
Dec 30, 2013
I'm sure the 9900K is better than any 3900X/3950X/3970X in terms of gaming, so getting a 3950X is not an upgrade; it's a downgrade, to be more precise.
I agree with this. Unless you're going to be doing activities outside of gaming that require the extra power, I can't see any point in upgrading to any of the Zen 2 Ryzen chips; they all fall roughly 5% behind Intel in gaming. If you do CPU-intensive activities outside of gaming, then yeah, it's a no-brainer: the Ryzen is the better chip.


Just a heads up, SLI scaling is terrible; very few games actually take advantage of it in any meaningful way, while some actually do worse with it on. The consensus about SLI: skip it, it's not worth the money.
I disagree with this. For SLI (NVLink) supported games I've seen 60% improvements in frame rates. SLI prior to the RTX release was dead, no debating that, because of the massive limit on the bandwidth of the bridge. With NVLink coming to standard consumer cards (still referred to as SLI in games, though), the bandwidth is roughly 15x higher (don't quote me on that; I just remember something like 2 GB/s vs 35 GB/s), hence SLI is now somewhat practical and will no doubt be used by a lot more game makers. Not to mention 4K usually sees the best scaling, though it's game-to-game dependent: for example, Shadow of the Tomb Raider goes from 61 fps to 102 fps at 4K.
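For what it's worth, the Shadow of the Tomb Raider numbers quoted above work out to slightly better than the 60% figure:

```python
def sli_gain_pct(single_gpu_fps, dual_gpu_fps):
    """Percentage fps improvement from adding the second card."""
    return 100 * (dual_gpu_fps / single_gpu_fps - 1)

print(round(sli_gain_pct(61, 102)))  # 67 -- i.e. ~67% scaling at 4K
```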

With many games, do the research. There are multiple vids and reviews, tweaks and settings optimizations, etc., showing that Max isn't all it's cracked up to be. There are many occasions where maxing out a setting like clouds is next to useless visually (you really can't see the difference) but is a considerable drain on GPU resources compared to medium. It's not uncommon to see a 20-30 fps or better increase by lowering certain settings that you'll never notice visually.
Once again, that is the situation now. There have been plenty of tests of games optimized for SLI getting 60% increases, and the number of reviews that use games that aren't formally optimized for SLI is ridiculous. And with NVLink, most are predicting better and better scaling as game makers start to really utilize it in their games.



Essentially, with Nvidia giving us NVLink tech there is a greater possibility for SLI to function at greater scaling rates. You'll never see perfect scaling, so you'll never see full value for money, but if you're running two 2080 Ti cards I doubt that's an issue. This is a massively debated topic, so I'd suggest forming your own opinion on the worth of SLI: find an updated list of supported and optimized games, then look for benchmarks in those games.

Also make sure your setup will work for a second card. Do you have enough spacing to leave a gap between the cards so they can cool properly? Unless you have a custom loop, one card will always run about 10 degrees higher due to limited airflow, so make sure your current card isn't running hot as it is. 1000 W is about the minimum PSU for that setup, and overclocking on it could cause issues; I know 1080 Ti SLI systems with lesser CPUs were comfortably pulling 900 W, so be careful with that, but you should be OK with minor OCs.
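As a rough sanity check on the 1000 W figure, here's a back-of-envelope power budget; the wattages are my own assumptions (ballpark stock board power), not measurements from this thread:

```python
# Assumed component draws in watts; real draws vary with load and OC.
components_w = {
    "RTX 2080 Ti #1": 260,
    "RTX 2080 Ti #2": 260,
    "i9-9900K": 150,  # can spike well past its 95 W TDP under all-core load
    "motherboard/RAM/drives/fans": 100,
}
total_w = sum(components_w.values())
psu_w = 1000
print(total_w, f"-> {100 * total_w / psu_w:.0f}% of a {psu_w} W unit")
```

That leaves some headroom at stock, which is why heavy overclocking (as in the ~900 W 1080 Ti SLI example) starts to get tight.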

Here's a website to get you started. Remember to ONLY look at places testing on a 4K setup; that's crucial:
https://www.legitreviews.com/nvidia-geforce-rtx-2080-ti-sli-review-with-nvlink_208222

Oh, and to answer the original question, which has disappeared from this thread's direction: no, it won't double your CPU usage; that's not how SLI works. You might see a slight CPU usage increase, but it's 100% not something you even need to think about, hence why this thread has turned into a should-you-or-shouldn't-you spend the money on another 2080 Ti.
 

Karadjgne

Titan
Herald
But that's the issue, sort of. And you said it quite well: "for SLI (NVLink) supported games". DX12 is mGPU; it has zero support for SLI, CF, or NVLink. Those rely totally on DX11. Windows 10 is DX12 native but includes DX10 and DX11 runtimes, so it's compatible. For now.

So any games produced at this time are built for DX12 but include DX11 ability. That's not going to last much longer; DX11 is already quite old. There's only one full DX12 game with mGPU that I've heard about, and it's dismal. The mGPU isn't up to snuff; it's a lot of work for no real benefit when vendors want their games out on schedule and developers have a choice: fix bugs and get the game working, or work overtime with no pay and still have the bugs. mGPU is going to be a long time coming.

So the only 60% improvements you'll get with NVLink are in NVLink-supported games, which are getting fewer as single cards like the RTX 2080 Ti handle 4K without too many issues.

If you have money to burn, go for it, but with only one game in four or five showing any real benefit, you'd better be playing those actual games or you've just wasted $1,200+.
 
