AMD Radeon R9 390X, R9 380 And R7 370 Tested

Bad, bad AMD. Though Nvidia did shameless rebadges before, what AMD is doing with Pitcairn in particular is ridiculous, even if it is at a reduced price. I fear for the future of AMD at this rate.
 
Well, the 4GB 380s have hit some UK retailers with prices ranging from £180 up to £200.
At the same stores you can grab an Asus DC2 R9 290 for £205.
It would be a foolish person that chooses one of the rebrands over that.
 
In other news... AMD launches their midrange to low-end cards as the 300 series... and people complain because said midrange cards don't beat Nvidia's high-end cards.

Still waiting on the review of the R9 Fury cards personally. Depending on the pricing, I may go for the Nano as an upgrade from my rapidly aging 7850.
 


I don't quite understand how the 390X earns this: "The Radeon R9 390X’s 8GB are a great fit for 4K though, no question."

Only two games out of six pass 30fps, and one of those two (if these are averages, not minimums) probably drops below 30fps as well. You can say lots of memory is great for 4K, but the GPU is NOT, or they would all be above 30fps. You could also say the 4GB models suck worse than the 8GB model (never mind AMD's 2GB choice), but you can't say either is GOOD for 4K if 66% of the games you tested can't play there, and I'd argue one of the others can't play there either, since it wasn't much over 30fps.

I'm confused about what winning actually means if nobody can play there anyway 😉

Another point: Steam's hardware survey shows <1% of monitors are 4K, in either single- or multi-monitor setups. Only ~18% of people even have 4GB+, and most of those cards are slower than the top GPUs here. Most of the monitors, even on the multi-monitor side, are pushing well under 4K's pixel count (3840x1080, etc.). As Unreal 4/Unity 5 games hit, the numbers in this review will drop even more, since those engines will tax you even harder. Turning down the details is not my idea of running there. The point of 4K isn't to reduce it until it looks like 1440p. You could argue 1440p is pushing it here even with the top cards in this review. Heck, 4 of the games on the 390X are at 60 or less, so again, if that's an average, I'd barely want to play at the minimums (GTA 5 and Far Cry 4 are under 50 here at 1080p!). You'd be better off saying you can occasionally run 4K with the details turned down :)

Maybe DX12/Vulkan games give them some more power (and only with mass Win10 adoption to push devs to even bother with DX12 in the next few years), but until we see that happen, 4K is pretty dead until faster cards hit (not many people have multiple cards either). Even the 980 Ti has issues here, and we may even see DX12/Vulkan used simply to push more details at us. NV/AMD certainly won't want us all to have reasons not to buy new cards, so they'll push for details being upped in GameWorks and the like.

Not saying it wasn't a good review given the time you had, just that I don't think we're really at 4K yet without more than one card (or a multi-chip card) from either side. Not sure why reviewers keep trying to present these cards (from either side) as 4K-able when the fps say you must be joking (and so does the Steam survey). :) It might be worth pushing if 50% of us ran that resolution or something, but... My comment isn't about the cards (they speak for themselves), just about 4K across the board.

That said, FURY may actually get there (4K-able? It would have to beat the 980 Ti hands down, I'd think; it looks more like it matches or barely wins some tests, and TechReport's fps figures I still don't like), but we don't know what it does yet (no reviews), how hot it runs, or how many watts it needs to get there. BTW, I care about heat and watts (I live in AZ, and it's HOT: 114 today!) with regard to TCO over 3-6 years, and I consider myself an enthusiast for many reasons 😉 High-watt cards kill you if you game a lot, or have kids gaming in there too, in many parts of the USA or the world. You can easily end up paying a few hundred extra for your card over its life depending on how many people are gaming on it, how bad the watts are, and what power costs in your area. I don't know many people who can completely ignore their electric bills 😉
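To put rough numbers on that, here is a back-of-the-envelope Python sketch. The figures (extra wattage, gaming hours per day, price per kWh, years kept) are made-up examples, not measurements from this review, so plug in your own card's draw and your local rate:

```python
# Rough illustration of the "watts cost money over a card's life" point above.
# All numbers below are hypothetical examples.

def extra_electricity_cost(watt_delta, hours_per_day, rate_per_kwh, years):
    """Extra cost of a card that draws `watt_delta` more watts while gaming."""
    kwh_per_year = watt_delta / 1000.0 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh * years

# Example: a card that pulls 150 W more, 4 hours of gaming a day,
# $0.13 per kWh, kept for 5 years.
print(f"${extra_electricity_cost(150, 4, 0.13, 5):.0f} extra over 5 years")
# -> roughly $142, and that ignores the added A/C load in a hot climate
# or several people gaming on the same machine.
```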
 
No, but with the right emissivity presets for measuring. This time it was a Monte Carlo OpenCL loop of mine for a photovoltaic analysis (sunlight radiation, shadows, temperatures, across the whole year). A real-world job, not a power virus. Gaming is a little bit lower, but not significantly so. :)

 
You would think the 390X with its massive 8GB would beat the GTX 980 with its mere 4GB in all the 4K benchmarks, yet the GTX 980 wins 4 out of 6 (at least in this review). Why load it up with so much memory and add to the cost (unless there are a lot of chips lying around in inventory)?

Seems like you can just OC your 2xx series cards (or 7xxx in the case of the 370?!) and get the same performance.

The HBM on the Fury and Fury X seems to be what will be the differentiator. I really hope they deliver. That whole upcoming Fiji-based line seems really compelling. This 300-series does not and seems like a sort of token release.
 


Oh, what short memories people have. Already forgotten about the GTX 770 release and the complaints of it being nothing but a rebadged 680, huh? I was one of those complainers actually, which is why I skipped the 770 and waited for the 970 (which, in SLI, I have yet to be unhappy with, even with the whole full-speed-only-at-3.5GB-VRAM hoopla).

 


You are late to the party on this. We already discussed that. It is going to be fixed. It should be R7 265.
 


As one who spent $700 on two 970s, I'm always keeping one eye ahead for when the 4GB (well, 3.5GB for me) VRAM barrier will come into play. I was specifically interested in the GTA5 numbers, since that's the game people have complained about being a VRAM hog. I've seen GURU's bench of Thief, and it seems to like the 390X's 8GB at 1440p more than the 980's 4GB (albeit not by much, only 3fps).

That reminds me. I like to run either GPU-Z or Afterburner to log my max VRAM usage in games, especially when playing around with different quality/AA settings. Have you guys ever thought about adding the maximum recorded VRAM as a notation somewhere on each benchmark chart? I'd like to know what you were hitting on VRAM with GTA5.
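For anyone who wants to pull that number out of their own logs in the meantime, here is a minimal Python sketch that scans a logging tool's CSV for the peak value. The file name and the "Memory Used (MB)" column header are assumptions; check what your GPU-Z or Afterburner version actually writes before using it:

```python
import csv

def max_vram_mb(log_path, column="Memory Used (MB)"):
    """Return the highest value seen in the given column of a CSV log."""
    peak = 0.0
    with open(log_path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            value = (row.get(column) or "").strip()
            try:
                peak = max(peak, float(value))
            except ValueError:
                continue  # skip repeated header lines or malformed rows
    return peak

# Hypothetical log file name; point this at whatever your tool wrote.
print(f"Peak VRAM during the session: {max_vram_mb('gpu_log.csv'):.0f} MB")
```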

 
Thanks for the review guys! Do you think you can look into this? https://www.reddit.com/r/hardware/comments/3aboh8/hairworks_benchmark_shows_grenada_xt_has_vastly/
 
Sneaky, sneaky. I don't think AMD should be allowed to rebadge GPUs; aren't there laws against this kind of thing? It would make more sense for them just to rebrand the R9 290X as something like 290X Pro, or 290X+. This is ultimately just confusing for the consumer. Both manufacturers are guilty of this, but at least Nvidia's latest top three cards aren't just total rebadges.
 
I am running HD 7970 Crossfire and all my games run maxed out at 1080p. So in reality the AMD 390X is an awesome card, and nowadays you really don't need anything more powerful. I am more interested in having smooth gameplay locked at 60FPS than fussing over whether a card runs the same game at 150FPS or 180FPS.
 
With the R9 390 coming in at as much as $100 LESS than the R9 390X, with the same 8GB of memory, same bus and same architecture, I really hope Tom's will review it. I really think it will just take the cake in price/performance.
 
Woah! At 4K that MSI 390X only falls behind the GTX 980 Ti by an average of 10fps. The big difference is at 1080p. Looks like going for CF/SLI is much better than buying a 980 Ti, maybe XD
 
"But when you compare the efficiency " + "of Nvidia’s competing models, AMD can’t come close."

True for 300 series.

"But when you compare the" + "overall performance of Nvidia’s competing models, AMD can’t come close."

Coolaid drinking Detected...

How about some impartial reviews please. This kind of bias is easily detected through out your review, Hardware Canuks did a much better job being impartial.
 
@Calculatron:
These still pictures were made with a hi-res infrared video camera from Optris. It's an expensive, high-quality German brand, and their products are used, for example, by Globalfoundries and the McLaren F1 team (telemetry boxes). I'm using special paint and tape to measure with the right emissivity. 😉

Some reviewers are using FLIR products, but with the default emissivity of 0.95. This is really stupid and gives you totally wrong values. IR measuring is more difficult than just grabbing a cam and using it with the default settings. :)
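As a rough illustration of how far off a reading taken with the wrong emissivity setting can be, here is a toy Python sketch using a simplified Stefan-Boltzmann relation. It ignores reflected ambient radiation and the camera's spectral band, and the surface values are made up, so treat it as showing the direction and rough size of the error rather than a calibration procedure:

```python
# Toy model: a camera set to emissivity 0.95 under-reads a surface whose
# real emissivity is lower. Reflected ambient radiation is ignored here,
# which real thermography has to account for.

def true_temp_c(apparent_c, emissivity_set=0.95, emissivity_true=0.95):
    """Correct an apparent reading taken with the wrong emissivity setting."""
    apparent_k = apparent_c + 273.15
    true_k = apparent_k * (emissivity_set / emissivity_true) ** 0.25
    return true_k - 273.15

# Hypothetical surface with a true emissivity of 0.60, read with the 0.95 default:
print(f"{true_temp_c(45.0, 0.95, 0.60):.0f} °C actual vs. the 45 °C displayed")
# -> about 84 °C actual; special paint or tape with a known emissivity
# avoids having to guess at this correction.
```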

Amen. I've run into this in failure analysis and it really gets irritating. Customers and vendors buy instruments and point and shoot--never mind reading the manual or performing required calibration. It's nice to see appropriate measurement techniques.
 
@Karl Marx:
It is your right to read other, more AMD-biased sites if that makes you feel better. I'm really not biased, because I buy all my hardware and equipment myself, pay for all my Computex trips myself, and work in product development / quality control, not only as a reviewer. I look at all this gaming stuff completely without emotion because I personally don't need it. I really like 4K, but I've been using workstation hardware for it for years. 😀

In short: I have no reason to bash or fawn over brand A, I or N :)

@Ergosum:
I wrote a shorter piece for our readers to explain the basics behind IR measuring, but it was never translated into English. Too bad. Try Google Translate if you are interested 😀
http://www.tomshardware.de/infrarot-kamera-messungen-grundlagen-emission,testberichte-241701.html
 
So the 390X is not really the competition for the 980 Ti. The 980 Ti is just a slightly scaled-back large-die Titan. You need to wait for Fury results to compare against the 980 Ti. It was a brilliant move on Nvidia's part to release the Ti version just before the AMD launch, knowing the results of the initial cards (380X, 390X, etc.) would be compared to it, and people are too stupid to realize it's not an apples-to-apples comparison...
 
The reference GTX 980 Ti is really limited by its bad cooling solution. The boost clock is not stable enough under heavy load conditions. The same goes for the Titan X. I've tried a Titan X with a full-cover water block and got an awesome performance jump (at 1.4 GHz boost clock and higher).

I think it also makes no sense to compare a hybrid-cooled Fury X with an air-cooled vanilla 980 Ti and its really old and undersized reference cooler. Take a look at the HIS 290X in this test! 50 watts less and similar performance to the 390X. I have a Gigabyte GTX 980 Ti Gaming G1 in my hands here, running stable over 1.4 GHz boost at factory settings. Up to 50% better performance with only 246 watts average. The 980 Ti at stock is boring. 😉
 
It will be compared as long as the price is above or around the Fury X price...
 
Can we please stop calling it the R7 270? For the 300 series the 270 has been dropped to R7 status, but it was an R9 in the 200 series. I see this mistake in multiple articles. Aside from that, reading this article I feel about the same as the person who wrote it: waste of time.
 



AMD has already publicly stated on multiple occasions that DX12 will be supported all the way back to the first GCN cards, including the 7000 series. Can we stop kicking a dead horse already? And as far as putting out intentionally screwy drivers... you should go talk to the pissed-off 780 Ti owners who are getting constant game-crashing errors and nothing but hassles with Nvidia's latest drivers...
There is no reason the 290X wouldn't support DX12, since it's an identical GPU. Unless AMD purposely bodged their drivers not to support it, which wouldn't surprise me. All in all, I say it's a good time to buy a 290X on clearance.
 