Thinking of dual cards

Traxxis

Feb 9, 2015
Okay, so I'm using an Acer 28-inch G-Sync 4K gaming monitor. I have a single MSI GTX 970 Gaming 4G card and was thinking of adding a second one. Would running two 970s actually benefit me? The graphics are pretty stunning in BF4, but if I can benefit from a second card, then I'm on it. Thank you all for any help you can give.
 
I'm thinking of doing the same, to be honest. DirectX 12 is quite a while away, and games actually developed for DirectX 12 are even further out, but if the rumours about stacked VRAM are true, it would make a nice setup for 4K gaming with a 7GB buffer over a 224-bit bus. But I plan to keep my 970 for a long time.
 


Most games, even at 4K, are not yet limited by VRAM, but rather by raw GPU power. While the 970 will suffer in the future, as games will likely use more than 4GB of VRAM at 4K soon, adding a second one will most definitely help raise framerates, since there will be two GPUs working instead of one. This is especially true now that DX12 is coming out, which should give developers much finer control over how the two GPUs cooperate.
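To give a concrete (and very rough) idea of what that finer control looks like, here is a minimal sketch of enumerating every GPU and creating a separate D3D12 device on each, the way the explicit multi-adapter previews have shown it. It assumes the Windows 10 SDK headers (d3d12.h, dxgi1_4.h); this is my own illustration, not code from any shipping game:

```cpp
// Minimal sketch: enumerate every GPU and create a D3D12 device on each.
// Assumes Windows 10 SDK headers; link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    // Under DX12 the application sees each physical GPU as its own adapter
    // and decides explicitly which work and which memory goes where.
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"GPU %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                    (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
            devices.push_back(device);  // two GTX 970s would show up as two entries
        }
    }
    // From here the app could split frames or resources across `devices` itself.
    return 0;
}
```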
 
So G-Sync doesn't help? AFAIK the whole point of G-Sync was to make lower frame rates feel much smoother. Also, until more info comes out, don't pin your hopes on combining VRAM between the cards in a multi-GPU configuration. The way I see it, features like that need to be pushed heavily by the GPU makers, not the game makers. Most game developers aren't interested in multi-GPU setups in any form, aside from the triple-A studios. And if the feature is specifically tied to DX12, that makes it even harder to push its usage in games, because not every developer will use DX12; it depends on their needs. From what I've heard so far, DX12 is only offered to developers interested in getting much closer to the GPU architecture. For most devs, DX11 is probably good enough for their purposes.
 


G-Sync should help, but only up to a point. I don't have experience with adaptive sync technologies, but I do know they have a lower limit below which they won't help. From what I hear, it definitely makes 40-50fps look smoother, but once you get down to around 30 (which will probably happen at 4K), it doesn't help much.

Take into account that DX12 is not the only API with memory stacking in multi-GPU systems. Mantle has also officially announced it, and it is likely the new Vulkan API will support it as well. So it is quite possible we will see it from many developers in many scenarios. I do agree that the chance many developers will use these abilities right off the bat is low, but that should change in a matter of years as the advantage of lower-level APIs becomes obvious.
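As a purely conceptual sketch of what "stacking" instead of mirroring means: with an explicit API the application can place a different resource on each device, so the second card adds usable capacity rather than holding a copy. The helper below is hypothetical, assumes a `devices` list like the one in the enumeration sketch above, and is not how any current driver or game actually manages SLI memory:

```cpp
// Conceptual sketch only: place *different* buffers on each device so VRAM
// adds up instead of being mirrored.
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> MakeGpuBuffer(ID3D12Device* device, UINT64 bytes) {
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_DEFAULT;           // lives in that GPU's own VRAM

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = bytes;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.Format           = DXGI_FORMAT_UNKNOWN;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ComPtr<ID3D12Resource> buffer;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&buffer));
    return buffer;
}

// Hypothetical usage: half the scene's data on GPU 0, the other half on GPU 1.
// With SLI-style mirroring, both GPUs would instead each hold the full 3 GB.
// (A real app would keep the returned buffers alive; dropped here for brevity.)
void SplitSceneData(const std::vector<ComPtr<ID3D12Device>>& devices) {
    const UINT64 halfScene = 1536ull * 1024 * 1024;  // ~1.5 GB per GPU (made-up number)
    for (size_t i = 0; i < devices.size(); ++i)
        MakeGpuBuffer(devices[i].Get(), halfScene);  // each device gets distinct data
}
```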
 
And this is where I'm torn. If it will help even just a bit, then I'm okay with getting it. If it's just something that makes the system look better, I guess I can wait. I personally would like to do it. I also thought of exchanging it for a 980.
 


I guess the answer is very subjective, then. What do you consider "good performance"? If you're ok with medium settings at around 60fps (which is what I remember the 970 doing at 4K), then there's no point in adding a second card. In general, though, most people at this level of hardware expect to be able to run high or ultra settings, and that will bring your framerate down on a single 970. The G-Sync should help, but in the end it is up to you to decide if the image you currently get is not good enough to warrant spending more money on another GPU.

Do consider, though, that games will continue to demand more and more GPU power. Thus, even though you might be satisfied with performance now, you may eventually find your system performing worse than you would like. Finding 970s at that time might be harder, since new GPUs will keep coming out to replace them, so maybe getting another 970 won't make a huge difference today, but it could be the difference between a good and a bad experience in a few years.
 
Solution


During CES, most of the Adaptive-Sync monitors had a lower range of 40Hz. I think only one of them can go down to 30Hz, a BenQ model with a TN panel. FreeSync was supposedly able to go as low as 9Hz according to AMD, but I think the problem might be more with the panel itself physically. I don't know much detail on G-Sync, but the physical limitation of the panel itself probably limits the G-Sync solution to 30Hz as well.
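To put rough numbers on that lower bound: variable refresh only works while the frame time fits inside the panel's supported window, so a 40Hz floor means any frame slower than 25ms falls outside it. The little sketch below just does that arithmetic; the 30-144Hz and 40-60Hz windows in it are assumptions for illustration, not specs from any particular monitor:

```cpp
// Back-of-the-envelope frame-time arithmetic for adaptive sync ranges.
// The 30-144 Hz and 40-60 Hz windows below are illustrative assumptions only.
#include <cstdio>

struct RefreshWindow { double minHz, maxHz; };

bool InsideWindow(double fps, RefreshWindow w) {
    // The panel can hold a frame for at most 1000/minHz milliseconds, so a
    // frame that takes longer than that falls outside the variable-refresh range.
    return fps >= w.minHz && fps <= w.maxHz;
}

int main() {
    RefreshWindow gsync    = {30.0, 144.0};  // assumed G-Sync panel range
    RefreshWindow adaptive = {40.0, 60.0};   // assumed Adaptive-Sync 4K panel range

    const double samples[] = {25.0, 35.0, 45.0, 60.0};
    for (double fps : samples) {
        std::printf("%5.1f fps = %5.1f ms/frame | in G-Sync window: %s | in Adaptive-Sync window: %s\n",
                    fps, 1000.0 / fps,
                    InsideWindow(fps, gsync)    ? "yes" : "no",
                    InsideWindow(fps, adaptive) ? "yes" : "no");
    }
    // e.g. 35 fps (28.6 ms) still fits a 30 Hz floor but misses a 40 Hz floor,
    // which is why the panel's minimum matters so much at 4K framerates.
    return 0;
}
```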

About DX12, I think I will just wait until MS reveals more info about it. We have heard a lot of stuff so far, but no actual evidence has been shown to the public that it really works.
 


That's why I'm glad I went for 660 SLI back in 2013, despite only using a 20-inch monitor at 1600x900 (still am, even now).
 


Same here. Rocking dual 680's since 2012 😛

During CES, most of the Adaptive-Sync monitors had a lower range of 40Hz. I think only one of them can go down to 30Hz, a BenQ model with a TN panel. FreeSync was supposedly able to go as low as 9Hz according to AMD, but I think the problem might be more with the panel itself physically. I don't know much detail on G-Sync, but the physical limitation of the panel itself probably limits the G-Sync solution to 30Hz as well.

About DX12, I think I will just wait until MS reveals more info about it. We have heard a lot of stuff so far, but no actual evidence has been shown to the public that it really works.

Sorry if I caused any confusion. I forgot to say the effectiveness is reduced around 30FPS, not exactly at that number. After all, no matter how well synchronized the frames are, 30FPS is still pretty darn slow, let alone 9FPS 😛

The whole VRAM stacking idea is very much still in development, yes. But given that it seems to be a common feature of the low-level APIs and it provides great benefits with almost no downsides, I fail to see why developers already building with SLI/Crossfire in mind wouldn't use this new technology.