390 vs 970 1440p

camoviking

May 18, 2014
I'm torn between staying with Nvidia and getting a 970 or switching to the 390. I'm upgrading from a 760, I'm going to get a new monitor (Asus 1440p / 144Hz), and I was wondering which card would be better for 1440p gaming? I'd rather stick with Nvidia, but would the 4GB of VRAM be enough? Thanks for your help.
 
Both cards trade blows at that resolution; however, I'd just take the AMD card with more VRAM anyway if they're the same price. The only reason I would go Nvidia over the 390 would be if I were going to go dual-card down the track: Nvidia's solution is better for dual cards, and the 970 uses less power, so PSU requirements are lower.
 
The R9 390 is faster than the GTX 970 at 1440p, and the 8GB of VRAM will help in the future when games use more than 4GB.

You can see the comparison here: https://www.youtube.com/watch?v=k9cKZiJw6Pk
The 390 is the better buy.

Plus, CrossFire scaling is really good on the R9 390 (somewhat better than the GTX 970's SLI scaling).

EDIT: another comparison at 1440p. The 390 is clearly better: http://www.eurogamer.net/articles/digitalfoundry-2015-amd-radeon-r9-390-8gb-review

So, in short:
1080p: the 390 is a little faster than the GTX 970
1440p: 390 >> GTX 970
4K: 390 = GTX 980
 
According to benchmarks, the 390 is faster; according to specs, the 390 is much better; but the power consumption and heat... *sobs*
You know, the only reason the 970 can perform so well is that it overclocks so well. You can easily get 1500 MHz if you have the slightest knowledge of what you're doing. Again, if you push the limits of the 390 and get it to 1150 MHz or above, it will even outperform the 980.
So, in short, the 390.
Best of luck.
 
The R9 390 and GTX 970 are almost the same in terms of performance.
The only thing the R9 390 has to offer over the GTX 970 is the VRAM.
Having more VRAM can be helpful, but I don't see this as a huge drawback for the GTX 970.
The Witcher 3 at 1440p uses only around 2.2GB, but both the GTX 970 and R9 390 are already screaming because they are not powerful enough to run The Witcher 3 properly (less than 30 fps) if you play it all ultra, with all effects on (HairWorks) and with mods like E3FX.
Shadow of Mordor at the highest settings is different: this game eats more than 3.5GB of VRAM at 1440p, and the GTX 970 has more problems with it than an R9 390. However, that doesn't happen all the time (it's very rare, in fact), and lowering the settings a bit will put the GTX 970 on par with the R9 390 again.
The one thing you must pay more attention to is whether you have plans to upgrade your monitor, especially if you want FreeSync or G-Sync. Just pick the GPU which matches that technology.
If you don't have such plans, feel free to pick either the R9 390 or GTX 970; both are good price/performance cards <--- the R9 390 is the better choice here.

BTW, if you can still afford it, it would be better to get a GTX 980 Ti or Fury X for 1440p. For most games at 1440p a GTX 970 or R9 390 should be sufficient, but not in games like The Witcher 3, and more such games will come.

In my opinion, skip the GTX 980 or R9 390X unless you see a really good deal (the Nvidia price cut for the GTX 980 could make it more interesting); these cards are good but not good enough in terms of price/performance at the old pricing. Better to stay with the GTX 970 or R9 390, or go straight up to a GTX 980 Ti or Fury X.
 

Yup, the R9 390 is better overall; it reaches 980 levels of performance when overclocked. For 1080p and 1440p the R9 390 is great.
For 4K it's better to go with two-way R9 390 CrossFire, or just a 980 Ti or R9 Fury.

A 390 is enough for 1080p or 1440p at 60 fps.
At 4K, two R9 390s are more than enough.

 



The 390 is actually a really poor overclocker; you can't squeeze much more at all from some cards. The 970, depending on the model, can be an insane overclocker. Despite this, I'd still take the 8GB card over Nvidia's 3.5GB card.
 


Well, the R9 390 is already an R9 290 pushed near its limits, and reviews across the board suggest there isn't much more to be gained. The 970, however, is not an overclocked version of an older chip and has more OC headroom in comparison.
 
The performance difference is basically none at all. Even a factory-overclocked 390 is only going to be a couple of percent faster than a reference/stock GTX 970. With the GTX 970's overclocking ability, a nice custom model like the EVGA FTW+ wouldn't have any competition from even the best 390 on the market.

But setting aside performance, the number of pluses on the side of the GTX 970 is pretty decisive. Driver support, power consumption, heat, noise, game compatibility, and graphics features all fall firmly on the side of the GTX 970.

Quote:
"PowerColor's R9 390 PCS+ is currently available online for $340, which is $10 more than AMD's reference design pricing, a reasonable increase if you only look at the R9 390. You can find the GTX 970 online for $310, which is definitely a better option in terms of performance, power draw, and noise—pretty much everything. Don't get me wrong, this isn't PowerColor's fault as their card is good for a R9 390; it's AMD's old technology and their pricing that affects the product negatively. What the R9 390 has going for it is its large 8 GB memory capacity, though it doesn't make a difference because this is a 1080p-performance-class card. Unfortunately, that extra memory is also the reason why the price is so high compared to the R9 290 4 GB that can be found for as little as $250, a price point that delivers excellent value for the money, and you're not missing out on any features."

http://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/30.html
[TechPowerUp chart: relative performance at 2560x1440]



With DirectX 12 coming out in just over a week, you'll also get the peace of mind of knowing that your card supports the full feature level 12_1 set, and at the hardware level at that. With the 390, you're basically gambling that conservative rasterization is not going to be a big factor in the newest AAA games. With the GTX 970, there's no gamble; you get the support. The GTX 970 and the Maxwell architecture are truly next-gen products, while the architecture of the 390/290 is now almost two years old.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69682-amd-r9-fury-x-review-fiji-arrives-5.html
[chart from the Hardware Canucks R9 Fury X review]
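
If you'd rather verify this on your own machine than take a chart's word for it, here's a rough, untested sketch of my own (not from the review) that asks the D3D12 runtime on Windows 10 which of these optional capabilities the card actually reports; the exact tier values depend on your GPU and driver.

Code:
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main()
{
    // Create a device on the default adapter at the baseline feature level.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device)))
        return 1;

    // D3D12_FEATURE_DATA_D3D12_OPTIONS holds the capabilities being argued
    // about here: conservative rasterization, ROVs, resource binding tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    printf("Conservative rasterization tier: %d\n", opts.ConservativeRasterizationTier);
    printf("Rasterizer-ordered views:        %d\n", opts.ROVsSupported);
    printf("Resource binding tier:           %d\n", opts.ResourceBindingTier);

    device->Release();
    return 0;
}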
 
Solution
There's actually no DX12 card that's fully DX12 compliant; nVidia's Maxwell cards still lack some features too. nVidia is simply so brilliant at PR that they convinced everyone that the 12_1 feature level (which will probably be neglected anyway) is a superior DX12 version, as in "DX12.1", but it isn't. While AMD is missing FL 12_1, nVidia is missing FL 11_2 (not to be confused with DX11.2).

Sadly, MS has made 11_2 a 'ghost' level, as in, they removed it from the required feature level list to accommodate nVidia. The same thing happened with the tier levels... the tiers were invented to accommodate nVidia. AMD GPUs actually have the advantage in DX12. But you'll see this soon enough...
 

How so?
 


Because of unlimited resource binding.
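
To make that distinction concrete: feature levels and resource binding tiers are separate queries in D3D12, so a card can report FL 12_1 with a lower binding tier, or FL 12_0 with the top binding tier. A rough, untested sketch of my own showing both queries, under the same Windows 10 assumptions as the snippet above:

Code:
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main()
{
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device)))
        return 1;

    // Ask which of these feature levels the driver actually exposes.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = (UINT)(sizeof(requested) / sizeof(requested[0]));
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels));

    // The resource binding tier is reported separately from the feature level.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    printf("Highest feature level: 0x%x\n", levels.MaxSupportedFeatureLevel);
    printf("Resource binding tier: %d\n", opts.ResourceBindingTier);

    device->Release();
    return 0;
}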
 


Quotes:

"What the R9 390 has going for it is its large 8 GB memory capacity, though it doesn't make a difference because this is a 1080p-performance-class card."
http://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/35.html

"At 1440p there aren't many games that exceed 4GB, Dying Light is one example that does, but the rest are happy right now with 4GB of VRAM at this level of performance. Therefore, as a single video card, 8GB is a bit overkill for 1440p gaming"
http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/11#.Va-jdflVhBc

"We'll state it in all our 390 reviews, we very much doubt the benefits from the extra 4 GB, the number of scenarios where you will pass 4 GB is extremely limited."
http://www.guru3d.com/articles_pages/powercolor_radeon_r9_390_pcs_8gb_review,27.html
 
8GB indeed seems to be overkill as of now, but 3.5GB really isn't that much anymore. I would personally play it safe. Why? Well, I'll tell you my experience...
I have an HD 6850 right now. At the time I bought the card, I went with the advice that the 2GB version is unnecessary, and that the 1GB is more than enough for the time I will be using the card. I was told that the GPU itself will be limited way before it could use all its ram.

Well, guess what. Nowadays the GPU is still more than capable of handling a lot of games at 1080p. But I first noticed the RAM limitation with Dragon Age: Inquisition, although I'm pretty sure it had happened earlier and I simply didn't figure it out. I can crank the settings up quite a bit in DA:I, but there comes a point where the RAM load sits at a constant 99-100% and the GPU load drops sharply every second, whereas with lower settings it stays in a relatively constant 95-99% range. This shows up in the game every time I turn the camera. If I keep the camera pointed in one direction, I get a nice, smooth, playable framerate. As soon as I turn the camera, the choppiness begins, lasts for a few seconds, and then fades as the RAM is cleared and refilled to match what's shown on screen. If I had chosen the 2GB card, I would be able to play the game smoothly.

Since then, I've separated myself from the crowd that tells people the 'lower' version of the same (or similar) card is enough. If you have a low-end GPU, sure, the additional RAM is a waste. But for anything mid-range or higher, if you have the option to go for more RAM, go for it. Unless it's something crazy like 16GB or something, lol.

But I should probably also mention that I buy my GPUs for long-term use. I've had this GPU for 4 years already, and it still serves me well. I generally go for mid-range GPUs and only upgrade when the performance of the new GPU is at least twice as good and the GPU is in the $200 price range. I've found this to be the most cost-effective way of upgrading while maintaining my performance needs, and more RAM on the card suits that way of upgrading. If you upgrade your GPU every year or two, however, less RAM for a lower price is the better choice.
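
For what it's worth, if you ever want to see how close a program of your own is running to the VRAM wall, Windows 10's DXGI exposes a budget/usage query. Here's a rough, untested sketch of my own; note it only reports the OS-granted budget and the calling process's own usage, so a monitoring tool like GPU-Z or Afterburner is still the easier way to watch a game.

Code:
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory4), (void**)&factory)))
        return 1;

    // Grab the first (primary) adapter; error handling kept minimal for brevity.
    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);

    IDXGIAdapter3* adapter3 = nullptr;
    adapter->QueryInterface(__uuidof(IDXGIAdapter3), (void**)&adapter3);

    // Local segment = dedicated VRAM; Budget is what the OS grants this
    // process, CurrentUsage is what this process currently has resident.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    printf("VRAM budget:        %llu MB\n", info.Budget / (1024 * 1024));
    printf("This process using: %llu MB\n", info.CurrentUsage / (1024 * 1024));

    adapter3->Release();
    adapter->Release();
    factory->Release();
    return 0;
}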
 


Reminds me of when Jersey gamer kept telling us how ATi were going to put Nvidia out of business because the HD2900XT had DX10.1 support and that was somehow going to make ALL the difference. :lol:
 
The most important feature of DX10.1 was that it added AA with much less performance impact. nVidia did not support it, so it did not gain ground. It was first used in Assassin's Creed and was later removed by Ubisoft in one of the game's later patches. They stated the reason was that DX10.1 caused graphical glitches. Yeah, it did cause glitches, but only on nVidia cards, not ATi cards. No nVidia cards supported DX10.1, and that was apparently the main cause of their glitches; no one could ever show that ATi cards were affected by the glitch.

The low-performance-impact DX10.1 AA was removed for BOTH brands afterwards... It's a TWIMTBP game, so what else could be expected? There was NO reason to do this for ATi cards; a simple detection of which card is in the system would allow switching between DX10 and DX10.1. But of course, nVidia's power play once again negatively impacted the gaming industry. It wasn't long before DX11 came out either, so DX10.1 was left in the dust. The potential of DX10.1 cards was never utilized, and that was because of nVidia... And they said the same thing they say today: "We don't purposefully negatively impact the competition, the competition is free to work with the developer to optimize the game for their cards", blah blah blah. And people swallow it whole.

The same thing is happening now with nVidia's GameWorks BS. It's too bad that AMD sucks at PR compared to nVidia, so everyone jumps on the nVidia bandwagon, since nVidia knows how to lie to your face and get away with it. The GTX 970 is the prime example of this. If nVidia were selling cocaine and AMD were selling spinach, 70% of gamers would be cocaine addicts. And as long as developers keep catering to nVidia's needs rather than allowing the best technology to be used, and people keep buying their cards, nVidia will be able to use their shady business practices to pretend to be better while in fact they're not.

http://www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1