AMD Radeon R9 380X Nitro Launch Review



Point taken on "possibility." I guess it is possible for even a resolution of 0x0 to bog down the best card using a prime-number formula, or even calculating decimals of pi forever. Perhaps the better way to frame the issue is based on what happens in the marketplace. Developers do offer some graphical features that are hard to notice, but they won't go so far as to offer unlimited AA - or very high multiples beyond 32x.

Just like you can expect the 680, paired with a sufficient CPU, to crush 640x480 for the life of that card, you should be able to have an analogous expectation for 1080p with a more powerful card - though that card may not yet exist. Resolutions do create boundaries. When the 680 was king, 1080p was still standardizing. People were running 720p and lower resolutions, and 1080p games were programmed with current hardware in mind. At the same time, though, Nvidia was aware of Crysis 3's impending release when the 680 was launching, and the 680 was never up to snuff for that game. Not even the 690 could keep a minimum of 60 fps with everything cranked.

Going back as far as Crysis Warhead, which was a 2008 game, the 680 was averaging less than 50 fps, and the 680 came out in 2012 (link to benchmark graph). Running 1080p was always a stretch for the 680 when it came to very intensive graphics, since the resolution was still maturing. It's mainly a historical point.

But there were established resolutions that the 680 would dominate - no question. For instance, 640x360 would obviously be fine for even modern titles on the 680 with a good CPU. I'm not denying that the 680 could run 1080p, since there's plenty of documentation saying that happened for a lot of people. It's just that the card never dominated that resolution in the same way that it can dominate lower resolutions, by which I mean it won't present a graphics bottleneck in actual games.

When I mentioned the current flagship cards (Titan X, 980 ti, Fury X, and Fury Nano), I was trying to point out that I think those cards are just now beginning to approach the point where people can scoff at 1080p as incapable of creating a graphics bottleneck in actual games, though I don't think we're quite there yet. It's still roughly 2.1 million pixels to push. That's a lot even for the Titan X, which only maintains an 81 fps min on Crysis 3 with all settings maxed, except no AA (click to see page with benchmarks).
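For anyone who wants to sanity-check the "roughly 2.1 million pixels" figure, here's a tiny back-of-the-envelope sketch in plain Python. The 81 fps minimum is just the number quoted above from the linked benchmark; nothing card-specific is assumed.

```python
# Quick sanity check of the 1080p pixel count and the implied pixel throughput.
width, height = 1920, 1080
pixels_per_frame = width * height            # 2,073,600 ~= 2.1 million
min_fps = 81                                 # Titan X minimum in Crysis 3, maxed settings, no AA (quoted above)
pixels_per_second = pixels_per_frame * min_fps

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_second:,} pixels per second at a {min_fps} fps minimum")
```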

This is all assuming gaming graphics, though. If we start talking about doing photo-realistic renders, then I don't think any processor of any kind currently exists that can do that in real time at modern resolutions. Not only is it currently too much data to crunch in real time, but there hasn't been enough R&D time or money put into making the formulas and code efficient enough to properly utilize the processors. Graphics processing is still relatively new, and I think it will be a long time before computers can generate images that are indistinguishable from those that cameras capture - all in real time, that is. Way too many gaps still need to be filled, and not even decades of time and billions of dollars have overcome the challenges. So I won't begin to pretend that I have the answer to that set of problems.
 


Thanks for a bit more detail. That should simply have all been in the conclusion (a bit better worded, perhaps). And I guess I'll have to start reading tomshardware.de to get more info in the future. Could you link some of those single-card reviews?

I've more or less decided to spend ~245€ on an R9 380 4GB card, because 2GB cards are only ~35€ cheaper, and I'd rather future-proof myself at least a bit since I doubt I'll change hardware in the next 5 years. GTX 960 cards are within 10€ of R9 380 cards, so obviously not a huge difference in price, and I'd rather get that much more performance. Unfortunately the next step up is the R9 380X, which is completely overpriced locally at over 315€, and I'm in no mood to wait the months it will take for the price to drop to an acceptable level for 10% more performance. And the step after that is 335-360€ for a GTX 970 or R9 390, which is too much money for a casual gamer... and overkill for my needs anyway. Then there are still R9 2xx series cards floating around, but prices aren't much different from the equivalent R9 3xx cards.
So to sum up the market at the moment, at least where I live: you've got the cheaper models (GTX 950/R7 370) at around 155€-165€ that are too slow for my needs, then a 210€-245€ price range containing ~10 card types in multiple variations, in which the R9 380 4GB seems to be the best pick (for me at least), and then the cards that start at 335€, which are simply too expensive (GTX 970/R9 390 and onwards, including older models like the R9 290).
It's funny that price-wise the R9 380X is 10% from the GTX 970, and performance-wise it's 10% from the R9 380 😛
The second interesting point is that 210€-245€ range, which is home to: GTX 960 2GB, GTX 960 4GB, R9 380 2GB, R9 380 4GB, R9 280 2GB, R9 285 2GB, R9 280X 3GB - then add all the OC and non-OC variants, and all the custom models from 10 different manufacturers, and you've got a VERY crowded 35€ range. And while 35€ is noticeable, again - if you're paying 210€, you'll probably manage another 35€ for a card that has double the RAM and is ~10-15% faster. Price-performance wise it's pretty much a moot point, and you'd get extra performance that can be welcome down the road. I'd probably pick the 280X, as it's even a bit cheaper, but like I've said, 5 years from now I'll be glad my card supports DX12, and unfortunately the 280X got locked out of it. To each his own: some will welcome that 40-50W power difference, so maybe they'll pick a GTX 960; some don't need and never will need more than 2GB RAM, so they'll pick a slightly cheaper card. But I want a relatively fast card for no more than 250€ (even that's a stretch), and I want it to last - so I can't ignore the R9 380 with 4GB RAM (the future will probably bring a new and larger display, and new games with heavier textures) and DX12 support (hey, it's around the corner, and I want this to last 5+ years!).
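To show why the extra 35€ is close to a wash in price-per-performance terms, here's a rough sketch. The prices are the local ones quoted above; the ~12% performance delta is just an assumed midpoint of the "10-15% faster" estimate, not a measured number.

```python
# Rough price-per-performance comparison of the two R9 380 variants above.
# Prices are the local ones quoted in the post; the ~12% performance gap is
# an assumed midpoint of the "10-15% faster" estimate, not a benchmark result.
price_2gb_eur, price_4gb_eur = 210.0, 245.0
perf_2gb, perf_4gb = 1.00, 1.12              # assumed relative performance

print(f"2GB card: {price_2gb_eur / perf_2gb:.0f} EUR per performance unit")
print(f"4GB card: {price_4gb_eur / perf_4gb:.0f} EUR per performance unit")
# ~210 vs ~219 EUR per performance unit -- within a few percent, so the extra
# 35 EUR mostly buys the doubled VRAM rather than costing you value per fps.
```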

I'm still researching, so I'd appreciate those links, but the more I read, the more it seems that the near future won't bring any surprises to the table. The R9 380X certainly wasn't one.
 

I don't know what NVidia may or may not have available, silicon-wise. If I did, I'd work there. 😉 My point is that there is still a significant performance gap between the 280X and 970 that can be exploited. If such a card were made that split the performance difference and sold for around $250, it would be an incredible seller.

I think part of why the 970 has sold as well as it has is because a lot of people don't want to settle for 960/280 levels of performance. They don't necessarily want to pay ~$330 for the 970, but the price isn't so high as to be prohibitive, and the 970 offers excellent performance for its price. The 280X was a great card for $250 over the last year, but supply is drying up and there hasn't been anything to replace it yet.
 


That card will never come out because it would just take away sales from the GTX 970.
 

That's completely up to the market. How many people are NOT buying 970s right now because it remains just out of their price range? Say a 960 Ti is released that beautifully splits the difference between a 960 and a 970. Sure, that would eat a little into 970 sales, but it would also mean that NVidia is selling more total cards. Whether that equates to greater actual profits depends on a lot of other factors. However, the 290, while old, is sitting in this position with better-than-1080p performance for only $250, and NVidia does not currently have a competing product.
 
Seems like the gap between the 960 and 970 is large enough to exploit, but the 380X does little to fill the middle of it. I guess AMD thought the 390 was in the same league as the 970, but too close to the top, so it split the difference between the 380 and the 390, which means this card crowds its own brothers. Apple said, "We don't care if the iPad Pro cannibalizes the iPad; people are still buying Apple." Guess AMD likes the notion.
 
Isn't it too risky to buy an AMD product at this time?? Maybe in half a year you won't get any support for it when AMD goes under!!
I wouldn't recommend it to my worst enemy!! :no:
 
It would have been interesting to see how this compared to a 290 and a 280X. You can still buy both of those cards from most retailers. The reason to show the 290 is that it's the same price at the moment - some are even way cheaper. Newegg had a reference 290 going for $180 after a rebate the other day, and most 290s - reference and non-reference - are going for under $230 right now. The reason to show the 280X is that I would like to see just how much they have improved the 380X over the 280X/7970 GHz. Yes, I can just go search for a review of one, but it would be interesting to see them compared with the same driver version. Also, the power consumption page would have been more informative if it compared the 380X's power usage to the other GPUs and the 280X. I know this is technically a 380 replacement, as mentioned in the article, but to most of us this is a 280X replacement, and most people shopping for graphics cards with $200-250 to spend will be taking a long hard look at this, the 960 and the 290.
 
Please wait for the Star Wars: Battlefront benchmark. I benchmarked 45 graphics solutions for it, Fallout 4 with 35 solutions, and Anno 2205 with 35. You will also find the 290, 280X, 7970 and a lot of other older cards 😉 I hope this will be translated soon. The last three weeks were really horrible 😀
 

I can guarantee that this card will not come. 3GB is too little for marketing and 6GB is too expensive. The yield rate of the GM204 is too good to cripple this chip again, as long as the 970 (and 960) sell so well.


For people like you we also made the frame time analysis, in three different versions. Try playing e.g. a first-person shooter or something similar at 40 fps or less when the frame times are not smooth enough. For some people 25 fps is "playable", but that is nothing for ambitious players. 😉

I've benchmarked Anno 2205 and Fallout 4 with 35 different graphics solutions (VGA cards, APUs, iGPs) - also for people like you, with lower settings so that even an R7 240 can run them. Just make a distinction between entry-level cards and mid-range cards like the R9 380X. AMD has positioned this card as a new 1440p option, not as an SVGA card. :)
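A tiny illustration of why a frame-time analysis tells you more than a bare fps average; the frame times below are made up purely for illustration, not measured data.

```python
# Two runs with identical average fps can feel very different if one stutters.
even   = [25.0] * 40                     # ms per frame, perfectly even pacing
uneven = [15.0, 15.0, 15.0, 55.0] * 10   # same 40 frames, with periodic 55 ms spikes

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame_fps(frame_times_ms):
    return 1000.0 / max(frame_times_ms)

for name, run in (("even pacing", even), ("spiky pacing", uneven)):
    print(f"{name}: {avg_fps(run):.0f} avg fps, {worst_frame_fps(run):.1f} fps at the worst frame")
# Both runs average 40 fps, but the spiky one drops to ~18 fps on its worst
# frames -- exactly what a frame-time analysis exposes and a plain average hides.
```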

For people like me, IF the card makes more than 30 fps (single-player, not multiplayer), it's enough. It looks like nobody cares for those who buy cheap cards, because they are from Eastern Europe and have no money. AND, not everybody has a 1440p monitor. Some have 27" monitors with FHD resolution. Keep that in mind too. And I know what 60+ fps CS gameplay is like, and what 30+ is like. GT-class cards are for poor gamers. Everybody says even Intel iGPs are better than, let's say, my (not yet arrived) GT 630. Maybe they are... in $300 CPUs. They say GTs are for HTPCs (yes they are, but not only). They ARE WRONG. You can game on an HTPC card. Not on Ultra, but you can still play. Many don't realise there are people who appreciate a story, not graphics. I wish you understood that high-end components aren't the most sold. The most sold are the budget ones (and the GTX 750 Ti isn't really a budget one).
 
In my reviews I test everything, from entry level to high-end. But THIS card is mid-range, not entry level. If you are interested in low-budget gaming, please read my other reviews and how-tos with smaller cards. You will be more satisfied, because you are simply in another target group 😉
 
Am I the only one who is puzzled by the "power consumption" chapter?

Exactly how much more/less does it consume (on average) vs the 380/960 at idle and while gaming, please? 🙁
 
So there's pretty much barely any difference between the X and non-X version. All I've seen is that it's only about 3-5 fps more on average. The difference is too small. If I remember correctly, the gap between the R9 280X and 280 was bigger than this 380X vs 380 one.
 


Yeah, I definitely think that the 980 Ti, Titan X, Fury X, and Fury Nano are the first cards that adequately exceed 1080p. No cards before those really allow the user to forget about graphics bottlenecks at a higher standard resolution. But even with those, 1440p is about the most you can do when your standards are that high. I consider my 780 Ti almost perfect for 1080p, though it does bottleneck here and there in 1080p games. Using 4K without graphics bottlenecks is a lot further out than people realize.
The 295x2 wants you to come outside to talk.
 


Ha, yes. There are dual-GPU cards that will work as well. I only mentioned single-GPU cards.
 
Try to play Anno 2205, Fallout 4 or Star Wars: Battlefront with a multi-GPU setup. The support from Nvidia and AMD is lousy. No working profiles...
I used a modified profile from BF4 for Battlefront, but it is not optimal. Forget the R9 295X2, it's something for the showroom 😉

 

A lot of people who follow PC GPUs might not realize it, but if they have been paying attention to the mini-PC and laptop discrete GPU market at all, they have probably already seen what a GTX 960 Ti would look like. In the mini-PC market, there are now a couple of little mini-PC cubes, at least one of them a Steam box, that boast a "desktop GTX 960". Now, while the name "desktop GTX 960" makes you naturally think of the 1,024 CUDA core, 2GB 128-bit GDDR5 desktop GPU that currently sits in the sweet-spot price-to-performance range, you would actually be thinking wrong. What is in fact inside those little beastly cubes is not a desktop GPU but rather the laptop-class GTX 970m, based on a GM204 with 1,280 CUDA cores and 3GB of 192-bit GDDR5.

The reason Nvidia and Zotac are calling the "desktop" version of the 970m a GTX 960 is that its performance is very similar to a desktop GTX 960, despite the 970m being superior on paper. The reason for the similar performance is that the laptop GPU has a much lower clock speed - 924 MHz vs 1,127 MHz base and 1,038 MHz vs 1,178 MHz boost - to accommodate the lower TDP demands of laptops and mini-PCs.

Those clocks should also give you a hint at why Nvidia is probably not going to release a 1,280 CUDA core, 3GB 192-bit GDDR5 desktop GPU and call it a "GTX 960 Ti" as long as people are still buying 960s and 970s. The fact that a 200 MHz OC is all it would take for a 960 to match a 960 Ti, despite having 1GB less VRAM and a smaller memory bus, would put off a lot of potential customers, especially if the price difference is more than $40 or $50. The only way I see the 960 Ti happening is if Nvidia has so many GM204s lying around right now with just enough of the right defects to make a significant number of 1,280 CUDA core GPUs. It would probably be very similar to the one-off GTX 560 Ti "448 core" edition that only came out around Christmas 2011 and then disappeared off the face of the PC market. With the market share that the desktop GTX 970 has right now, making it one of the most popular single-GPU models of all time, Nvidia probably does have quite a bit of GM204 silicon with too many defects to become GTX 970s that could be used in a one-off like that. The problem for potential customers of a GPU like that is that they have to wait for Nvidia to milk the 960 and 970 to the last drop first.
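To put rough numbers on the "similar performance despite better specs" point, here's a minimal cores-times-clock sketch. The core counts and clocks are the ones quoted above; it ignores memory bandwidth and architectural details, so treat it as a ballpark only.

```python
# Relative shader throughput (CUDA cores x clock) for the desktop GTX 960
# versus the laptop GTX 970m, using the figures quoted in the post above.
cards = {
    "GTX 960 (desktop)": {"cores": 1024, "base_mhz": 1127, "boost_mhz": 1178},
    "GTX 970m (laptop)": {"cores": 1280, "base_mhz": 924,  "boost_mhz": 1038},
}
for name, c in cards.items():
    base  = c["cores"] * c["base_mhz"]  / 1e6   # arbitrary "cores x GHz" units
    boost = c["cores"] * c["boost_mhz"] / 1e6
    print(f"{name}: {base:.2f} base / {boost:.2f} boost")
# Base throughput comes out nearly identical (~1.15 vs ~1.18) and boost only
# ~10% apart, which is why the lower laptop clocks largely cancel out the
# extra CUDA cores -- and why a modest overclock on a desktop 960 would close
# most of what remains.
```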
 


Even then it may not work out: even selling fewer 970s than (970 + 960 Ti) combined, they could still make more money, because a 970 costs more than a 960 Ti would. Of course, the math would have to be done. But realistically, the 900 series was launched more than a year ago. It's quite late, don't you think, for them to release a 960 Ti? I really do not see it happening. It's just my opinion.
 
It's not the price of the card, it's the profit margin they make on it. Yes, the 970 is more expensive, but it's also more expensive to make. And right now, I doubt NVidia wants to disclose how much they actually make on each card.
 