GPU For 1080p Gaming


Ayro

Apr 25, 2016
I will build a PC in June (I think after the release of the new cards), but I can't decide which GPU to choose. I searched for 3 hours and didn't reach a definite answer.

I need a solid 60 FPS (Vsync) on Ultra (no need for high AA or ambient occlusion) in today's games (like The Witcher 3, GTA 5, etc.).

And it should keep that up for 3 years on high settings. (I know we can't say anything for sure about the future.)

And I won't upgrade my resolution from 1080p.

My budget is $500-600.

Thanks

Oh, I forgot: I will capture my gameplay and stream.
 
Solution
I agree with the 970 choice, but if your budget is $500-600 I would say grab yourself a 980 Ti, which will give you better future-proofing, as long as you don't mind that it's roughly $50 over your price limit at around $650. If you're willing to drop that on a GPU, go for it.


Normally, CrossFire would be a bad idea, but in a few years multi-GPU setups will probably be the common approach.
 
Not on ultra. I'm running GTA at medium-high (more on the high side) and Fallout the same, with between 2.5 and 3 GB of VRAM usage. For medium, yeah, 2 GB would be enough; not for ultra. I think I've maxed out around 3.8 GB on GTA (albeit getting 20 FPS, never mind frame drops).
 


That's what I'm getting at: if you're using more than 2 GB of VRAM, are you even getting playable performance?
 
Considering I run it maxing out around 3.2 GB at 50-55 FPS, yeah, absolutely I am. The extra VRAM comes from SSAO/MSAA, which is a massive tax on your GPU, hence the massive frame drop.
Fallout 4 constantly hits 2.5 GB of VRAM in taxing situations while only reaching around 70-80% GPU usage, because my 6100, even running at 4.2 GHz, is still a huge bottleneck.
 


I'm talking about a constant 60 FPS. Disabling MSAA would drop that number below 2 GB, and would probably bump performance up to 60 FPS.
There are some games that use more than 2 GB, but they are usually not the best optimized. The Witcher 3, for example, looks better than Shadow of Mordor, Fallout 4, and Rainbow Six Siege (SoM and R6S require 3 GB to run higher textures). Yet, on ultra settings, The Witcher 3 only uses 1.5 GB of VRAM. It takes a 970/390 to run this game at a nearly flawless 60 FPS on ultra.
Why do these other games require 3 GB to run on high settings, even though they look worse and have less detail loaded?
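(Side note: that "constant 60 FPS" target is easier to reason about as a per-frame time budget. A minimal sketch, plain arithmetic, no specific GPU assumed:)

```python
# Turn an FPS target into the time budget each frame must fit inside.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

# Vsync at 60 Hz leaves ~16.7 ms to render each frame.
print(round(frame_budget_ms(60), 1))  # 16.7
# A dip to 50 FPS means frames are taking 20 ms, ~3.3 ms over budget.
print(round(frame_budget_ms(50), 1))  # 20.0
```

Any setting that pushes a frame past that ~16.7 ms budget (MSAA being a classic culprit) is what breaks the "constant" part.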
 
Hey OP, just get a GTX 970 and you will be fine. If you can wait for Pascal, then that would be even better. But at present, GTX 970 is just perfect for 1080p ultra @ 60 FPS.

@SlightlyEdible, the GTX 960 can also do 1080p ultra but will not give you 60 FPS; instead you will get something like 40-45 FPS.

@genthug, why are you so overly supportive of AMD cards? They are power hungry, run hot, don't have features like PhysX, GameWorks, etc., and also have poor driver support. Nvidia has better relations with game developers, so most games favor Nvidia. I won't recommend buying an AMD card. Maybe the Polaris cards will be good, but certainly not the R9 3xx series. Maxwell 2.0 is just a much, much better architecture than AMD's.
 
Because "hot" just means a difference in how hard you run your fans, and even then it won't be incredibly taxing. Driver support has gotten significantly better in the last 2 years, as AMD realized that, other than TDP, drivers were the single biggest thing Nvidia had over them. Tbh, I don't notice much of a driver difference between the two, since I don't fiddle with the driver settings; don't want to mess that stuff up. Aside from that, the PhysX in the games I play still looks just fine, the same as it did on my old GTX card. It's like: do you want the Porsche with a V6 or a V8? The V6 has 4 GB more of leather seating, and the DX12 GPS system is way better, but right now, the V8 will get you where you need to go faster.
Personally, I see AMD as the better future-proof choice unless you get SLI Titan Zs. I see that 970 bottlenecking his machine with its 3.5 GB of VRAM. You won't have that issue with the 390(X).
Don't get me wrong, I see all your points and I agree that Nvidia makes the more... straight power card, and that businesses are more likely to side with them than AMD. But the differences between a 390 and a 970, or a 390X and a 980, are negligible. And for that negligible 5-10 FPS difference, you get 4 GB more of VRAM.
And on that point: I agree with wanting a flat 60, and my games fluctuate between 50 and 75 depending on circumstances. Also, I only use FXAA or turn AA off in my games, and I'm still over 2 GB with the rest of my settings at medium to high. But the thing is... don't count on everything being optimized. I totally agree with your point that game optimization is everything, but not every game is going to be like The Witcher. He can't depend on that, IMO. That's putting too many eggs in one basket, where the eggs are a couple hundred dollars.
Again, that's all up to speculation. I personally like the higher VRAM, and I believe the slight frame difference and the lack of some small features, plus the TDP, are negligible. Talk to someone else and that might change. But with either company I've never had an issue with a purchase. The biggest thing bottlenecking my old 550 Ti was its lack of VRAM (probably speed as well); I couldn't run a good many titles because they used over a gig and I wasn't getting playable frames in them.
TL;DR: this question is highly opinion-based, and something one person puts stock in, another person might not care about, viewing it as unimportant in the overall scheme of things.
 


Wow. Ignorance at its finest.
By that logic, Nvidia doesn't support CrossFire (superior to SLI), Eyefinity (NV Surround blows), or TrueAudio, nor do Nvidia cards gain from APIs that utilize asynchronous compute (like DX12).
You've stereotyped. Let's compare the 970 and the 390.
The 390 uses about 40 more watts at load and runs 2 degrees hotter. AMD's drivers have VASTLY improved recently. Nvidia doesn't have better relations; they pay devs to put their technology into games. The 300 series are decent overclockers, to the point where an overclocked 390 will usually beat an overclocked 970.
Buying Nvidia at this point would be a very poor choice. They trade blows with AMD in DX11 and are getting destroyed in DX12.
 


Only one game uses async compute so far. That is AotS, which is $**t.
Nvidia's 364.91 drivers are adding support for something called an "async transfer queue", which looks comparable to async compute.
CrossFire superior to SLI? How?
Eyefinity better than NV Surround? Again, how?
TrueAudio? Does it even matter?
You are listing some of the least important features, which are usually ignored when buying a graphics card. What you are forgetting is that Nvidia has PhysX, HairWorks, GameWorks, HBAO+, the Fallout 4 weapon debris effect, TXAA, MFAA, NVDOF, VXGI, G-Sync, PCSS (which has more support than CHS), etc.

The R9 390 only 2 degrees hotter? Where did you hear that? I think the R9 390 can keep you cozy in winter and burn your PC in summer.

Only a 40-watt power difference? The GTX 970 uses some 145 W while the R9 390 uses 275 W. It is clearly written on MSI's website.
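For perspective, a power-draw gap like that can be put in money terms. A rough sketch using the 145 W / 275 W figures above; the hours per day and electricity price are made-up assumptions:

```python
def yearly_cost_diff(watts_a: float, watts_b: float,
                     hours_per_day: float = 4,
                     price_per_kwh: float = 0.12) -> float:
    """Estimated yearly electricity-cost difference between two cards at load."""
    delta_kw = abs(watts_a - watts_b) / 1000.0  # gap in kilowatts
    return delta_kw * hours_per_day * 365 * price_per_kwh

# 275 W vs 145 W, 4 h/day of gaming at an assumed $0.12/kWh -> about $23/year.
print(round(yearly_cost_diff(145, 275), 2))  # 22.78
```

So even taking the worst-case spec-sheet numbers at face value, the running-cost difference is modest; whether that matters is up to the buyer.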

AMD cards overclock like $**t; given the amount of heat the R9 390 produces, it will surely be fried after overclocking.

"Buying Nvidia at this point of time is a poor choice." Do you work for AMD or what?
 
The GTX 970 is a great card for right now, but not future-proof, I guess (60 FPS on very high settings). I don't want to buy another card after 1-2 years.
 
So... just so you're aware... most cards will run around 60C if you have even a half-decent airflow case and use your fans. That goes for both Nvidia and AMD. A 390's max temp is 94C, at which point, if the user hasn't defined a fan curve, the fans will kick up to keep the GPU at 94C. The same can be said for Nvidia's cards; they operate up until the mid 90s. So as long as you're not stripping your card down to bare bones, taking the heatsink off, turning all of the fans off... you'll be fine to OC.
I encourage you to look through this review that Tom's did on it: http://www.tomshardware.com/reviews/sapphire-nitro-r9-390-8g-d5,4245.html. In it, AMD's card proved better in nearly all regards compared to its price competitor, except for TDP, and if you look at the graph... well, 2-3 degrees is the running difference. I think you need to do a bit more research before you start spouting stuff like that yourself, and instead of asking "Oh, where did you hear that!?" and then making some claim, you should back that claim up. Any card will overheat if it doesn't get proper cooling. The 40-watt TDP difference is a load of bull; it's closer to 80-100 W, maxing far higher in certain applications, as is detailed in said review.
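To make "defining a fan curve" concrete: tuning tools generally interpolate fan speed linearly between user-set temperature/speed points. A minimal sketch; the curve points here are made up, not any card's defaults:

```python
# Hypothetical fan curve: (GPU temp in C, fan speed in %) points,
# linearly interpolated between points and clamped at the ends.
CURVE = [(40, 30), (60, 50), (80, 80), (94, 100)]

def fan_speed(temp: float) -> float:
    if temp <= CURVE[0][0]:
        return CURVE[0][1]          # below the curve: minimum speed
    if temp >= CURVE[-1][0]:
        return CURVE[-1][1]         # at/above max temp: full speed
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp <= t1:
            # linear interpolation between the two bracketing points
            return s0 + (s1 - s0) * (temp - t0) / (t1 - t0)

print(fan_speed(70))  # halfway between 60C and 80C -> 65.0
```

The point of a user-defined curve is to ramp the fans earlier than the firmware default, so the card never has to camp at its 94C throttle point.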

And again, we're back to: if you're careful with it and have proper cooling, you'll be okay to overclock. The operating temps of the two cards aren't all that different.

Tbh, I could ask you the same thing: do you work for Nvidia? Each card has its pros and cons. Nvidia has better drivers, but I don't really see that making much of a difference as long as you don't crash due to lack of driver support; I've never had that happen to me, as AMD's drivers have vastly improved. Nvidia has better TDP. AMD has cards running on a newer architecture that Nvidia right now can't touch, given the form factor of the AMD cards. You can certainly fit a 980 Ti in a mini-ITX case, but it's going to massively mess with your cable management and how much you can fit in the rest of the case. Introducing: Fiji. Those cards also have the downside of performing relatively poorly at 1080p, but they come with HBM, so again there is a tradeoff. The rest of the features are small niche features you likely wouldn't notice in gameplay. Also, G-Sync is matched by FreeSync, so that's not a company-specific advantage.

As for the DX12 points, I wouldn't say Nvidia is getting destroyed right now; there aren't enough DX12 games out yet to compare the cards. Once it becomes standard, we'll see who wins. But at the moment, AMD seems to have the upper hand in utilizing it better, not necessarily destroying.
 

That's the issue I see with the 970 recommendation. You're A) not getting what the company specifies in VRAM, and B) not getting enough VRAM to last 3-4 years. Like Edible said, it's all about how the company optimizes the game, but you can't count on every company optimizing it 110% to get VRAM usage down. Games should be well optimized, but expecting every single one to go the extra mile is, IMO, a poor bet.
Edit: The only other thing I'd keep in mind is that technology is advancing at an exponential rate, so we can't be sure where we'll be in 2 years. We may stagnate, or it might suddenly be that no matter what card you choose, you're running it at medium-high. That depends entirely on the game devs and what hardware is out for them to build their games around.
 
I read the thread, so what? The point is, without you being rude and just saying "read the thread"... the 980 Tis are already dropping in price, and I'm sure that MSI card he listed isn't the ONLY one he can get. Search around and find a good deal; they are coming down in price. Again, don't be a rude douchebag.
 


If by future-proofing you mean 60 FPS with everything maxed out, then I would suggest you get a 980 or a 980 Ti.
And I am pretty sure that even after 2 years, the GTX 970 will run all games maxed out at 45+ FPS (at 1080p), which ain't bad either.
But your definition of future-proofing is quite strange.

 

So the point is that he does not live in the US. Unless I'm mistaken, he lives in Turkey, where the 980 Ti he listed (if you'd read the thread and clicked the links, you'd know this) is priced at $850. I don't think I'm being rude by asking you to follow the thread's commentary; you're the one being rude by calling me a d-bag for asking you to do a simple task. I'm sure if he really looked around he could find one cheaper than $850, but I am 99% sure it isn't going to be under his price range. Card prices typically fluctuate by about 25-50 USD depending on who you buy from, which would put the price down to MAYBE $750: $100 over his price range. Don't quote a US price when he does not live here.
Edit: This thread's an absolute mess, so I'm done here. @Ayro, if you choose to get a 970, 980, or 980 Ti, they will all do you well. The only thing I caution against is VRAM, particularly on the 970. If you get a 390 or 390X, they will do you just as well. It comes down to what you can find for the best price; they will all last you 3-4 years. They won't run everything on high for that whole time, particularly the last year or so, but they will definitely last.
 
Instead of saying over and over to multiple posters to read the thread, maybe just say he is in Turkey instead of posting something with no helpful info. I said I read it, and that regardless of his country's prices he should be able to find a cheaper one. He also said his budget was $500-600, which would lead everyone to think USD. The basis of everything said is that if he wants to max games out at 1080p for the next few years, a 980 Ti will be his best bet. The Division alone already taxes even the highest-end GPUs at 1080p, so while a 970 or 390 would last him probably 1-2 years TOPS, it won't be at max settings that whole time. Hell, GTA V at all-max settings causes even a 980 Ti to drop below 60 FPS at 1080p.
 


Video showing temps and overclocks on both cards: https://www.youtube.com/watch?v=k9cKZiJw6Pk
CrossFire communicates over PCIe rather than an external bridge.
G-Sync costs extra money. FreeSync does the same thing, but is open source.
TXAA and MFAA blow. You act like AMD cards can't do HBAO+.

Here's the funny thing about the "async transfer queue": Nvidia's GPUs actually perform worse with it enabled. Nvidia's cards can't do asynchronous compute, since they lack the hardware for it. No driver can make up for that.
AotS shows a performance advantage for Nvidia in DX11, but AMD takes the upper hand in DX12.

You're just stereotyping an entire GPU series based on extreme prejudice. The only reason Nvidia is ahead is its marketing. Remember the 400 series? AMD had Nvidia screwed, but Nvidia's marketing team saved the day.
 


Let's put this discussion to an end. Both manufacturers are good, but Nvidia has a better rapport with game developers and hence more support in games.

As far as async compute is concerned, I agree that Nvidia lacks the required hardware, but they are known to do wonders through their drivers, so it's better to wait until more async compute games come out and Nvidia releases the required drivers.

AMD is better at pricing its cards, which is a critical factor in why people sometimes choose AMD despite Nvidia having better technology.
 


Urgh, there's no educating you. Drivers aren't hardware. Async isn't something that can just be emulated. Nvidia already has benchmarks using async; their cards perform worse due to the nature of async.
Pay attention to what the companies are doing right now. There's a reason Nvidia is desperately trying to get out of the GPU industry.