AMD Radeon R9 300 Series MegaThread: FAQ and Resources


I think that means products originally slated for 20nm have been moved to 14nm FinFET, i.e. stuff that isn't out yet. 20nm never worked out for bigger chips.

 


Isn't that what they talked about during their latest financial earnings? It means they scrapped their 20nm product plans and are 'redesigning' them using FinFET. Fury X is a 28nm product, the same as Titan X. I don't know why they specifically mentioned this in their earnings. And they are not the only one using FinFET: Nvidia's Pascal has been known to use FinFET since the very beginning, and rumors already say Nvidia has taped out the GP100 and GP104 designs on TSMC's FinFET process, with a launch target as early as Q1 2016. So I don't think this will give AMD an advantage against Nvidia, when Nvidia has been talking about using FinFET for Pascal from the very beginning.
 


Yes, I think that's right - and even better!



I'm not sure I was able to follow your typing because it came out a little crazy, so let me know if it seems like you meant to say something I missed. Just proofread it next time to help me (and everyone else) understand. 😛

But two things: (1) it was in financial disclosures, and (2) it's a move to FinFET. I think it's in the company's 8-K because they needed to explain why they didn't net good earnings in Q2; part of that was spending $33 million in Q2 on a new process. And moving to FinFET doesn't need to completely overtake Nvidia; they're probably not even trying to do that. AMD competes on two fronts, GPU and CPU, against two notable companies in the market: Nvidia and Intel. Going to a smaller FinFET process keeps AMD in the running so it can position itself. If Nvidia goes to FinFET too (or rather, when they do), that's good, because we'll have competitors with roughly equal manufacturing tools at their disposal.
 
Well, that aside, historically AMD was ahead of Nvidia when it came to releasing products on a new node. But that was before Rory Read took the helm of the company. So honestly, I'm not sure whether AMD or Nvidia will be the first to come out with a 16nm GPU. But Nvidia has reason to release its big gun in the 2016 time frame instead of using the strategy it did with Kepler and Maxwell. More than AMD, Intel is the bigger threat to Nvidia right now.
 


It means AMD is canceling some 20nm products because they didn't behave as expected, and is now losing money porting the designs to the 14/16nm FinFET process. It seems these are some previously announced APUs and SoCs, but there are also rumors that Zen was initially designed for 20nm and has now been ported to 14nm.
 
You know what's funny? People pick the 390X/390 over Nvidia cards because it has 8GB of VRAM (even for 1080p). But then suddenly, when it comes to 4K, the 980 Ti holds no advantage over the Fury X because 4GB is enough for 4K 😀

And there are still people running around shouting 4GB HBM = 20GB GDDR5
 


Yes, more information is always welcome, especially when it confirms yet again that the Fury's 4GB is an issue today and, especially, will be an issue tomorrow.

We'll have to see how this memory capacity story plays out over time. The 4GB Radeon Fury cards appear to be close enough to the edge—with a measurable problem in Far Cry 4 at 4K—to cause some worry about slightly more difficult cases we haven't tested, like 5K monitors, for example, or triple-4K setups. Multi-GPU schemes also impose some memory capacity overhead that could cause problems in places where single-GPU Radeons might not struggle. The biggest concern, though, is future games that simply require more memory due to the use of higher-quality textures and other assets. AMD has a bit of a challenge to manage, and it will likely need to tune its driver software carefully during the Fury's lifetime in order to prevent occasional issues. Here's hoping that work is effective.
 


Can you please drop it, Juan? I really don't know what you're trying to do here, but the conclusion is very clear: "not a problem now".

Don't try to spin the conclusion to your anti-AMD agenda.

Also, in other news: the Steam Survey further confirms my intuition that the money is not in the high end. Hell, not even in the mid-range. That is bonkers.

http://techreport.com/news/28807/july-2015-steam-survey-modest-hardware-rules-the-world

Still, Steam isn't 100% of the market, but it does cover a big enough chunk to be representative, I'd say.

Cheers!
 


Well, at a glance at the GPU graph, I'd say they're distinguishing what the "primary video device" for games is. Since all Intel CPUs after Sandy Bridge come packed with an iGPU, I'd assume that if they were mixing them in, it would be pretty noticeable.

I do remember there was a time when they had that problem, but it's been such a long time that I'd expect them to have fixed it. I would have to dig a bit more to give a definitive answer to that, I guess.

Cheers!
 


I recommend the 390X for CrossFire.
 


The review is very clear about the existence of a problem; I bolded the relevant parts, like "measurable problem". And as many other reviews have noted, the 4GB limit will be a bigger problem in the near future. The AnandTech review concluded that the Fury X will be outdated before two years are up due to the 4GB limit.
 


But in two years you'd have a new GPU, making whatever prediction a bit irrelevant, no? At the least, with HBM2 and its extra VRAM out by then, who's to say there won't be leaps and bounds in that time? Who's to say a 980 Ti wouldn't be "irrelevant" by then? Heck, I expect there to be a fairly significant difference between my Ti and the new Pascal chip when that drops. Even then, there's no real way to say it's "outdated"; it's a relative term, especially when GPUs are "outdated" every 6 months. I strongly doubt a Fury X will be completely unusable two years from now, considering people are still able to play GTA 5 with an old 9600 GT. Technology changes, but at the same time you can argue that tech can't change so quickly that it makes previous technologies completely irrelevant. That'd be bad business for AMD or Nvidia; a lot of their revenue comes from older tech, I'd wager.
 


Exactly.

In two years' time, when games are using DX12 and some fancy new techniques for eye candy, I'm pretty sure the Fury won't cut it, but not because of VRAM: because of GPU power. Same goes for the GM200 siblings: they'll have massive amounts of VRAM, but won't be able to push pixels fast enough anyway. If you're on the bleeding edge, that means you *will* upgrade once a new flagship comes along. If you're a mainstream user (like 99% of us), you will either get the second best of the generation or just lower details to maximize FPS.

That is also something very interesting from the Steam Survey: the number of older cards is impressive. If you aggregate the results by DX family and group the video card families, you get a nice picture telling you that people are using *very* old hardware anyway. So new games are going to be played at lower resolutions and detail levels regardless. Which means, even more so, that 4GB is still plenty for years to come.

I will prepare an extract on the Steam Survey data, because I think there is very interesting information some sites didn't show.
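Something along these lines would do the aggregation (a minimal sketch of the idea, assuming you've pasted the survey's video card rows into a hypothetical steam_survey_gpus.csv with "gpu" and "share" columns; the DX buckets below are my own rough guesses, not Valve's categorization):

```python
import csv
from collections import defaultdict

# Rough keyword -> DX-generation buckets, checked in order.
# Illustrative guesswork only; nowhere near exhaustive.
BUCKETS = [
    ("gtx 2", "DX10"), ("geforce 9", "DX10"),
    ("gtx 4", "DX11"), ("gtx 5", "DX11"), ("gtx 6", "DX11"), ("gtx 7", "DX11"),
    ("gtx 9", "DX12"),
    ("radeon hd 4", "DX10"), ("radeon hd 5", "DX11"),
    ("radeon hd 6", "DX11"), ("radeon hd 7", "DX11"),
    ("radeon r7", "DX12"), ("radeon r9", "DX12"),
    ("intel hd", "DX11"),
]

def dx_family(gpu_name):
    """Return the DX bucket for a survey GPU string, or a fallback."""
    name = gpu_name.lower()
    for keyword, family in BUCKETS:
        if keyword in name:
            return family
    return "other/unknown"

# Sum the percentage share of each DX bucket.
totals = defaultdict(float)
with open("steam_survey_gpus.csv", newline="") as f:
    for row in csv.DictReader(f):  # columns: gpu, share
        totals[dx_family(row["gpu"])] += float(row["share"])

for family, share in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{family:>14}: {share:5.2f}%")
```

Point it at the published numbers and the "people run very old hardware" picture should be obvious at a glance.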

Cheers!
 
Yeah, you really get to see an interesting picture with the Steam survey; granted, it doesn't speak for every user. You can really see that most users have 1-2 core chips; not everyone is on the bleeding edge of technology, and only 5% of users have more than 4 cores, which is really a niche market. I mean, granted, economically it's not exactly viable to upgrade every 6 months, especially when costs can reach as much as a used car. It really paints a picture of people's perception of value. Most users run Intel CPUs over AMD APUs. Really, it's an amazing tool for indie devs: get a general idea of everyone's hardware and build a game around that. Heck, almost 20% of all users are using the integrated graphics of an Intel CPU, versus 27% on AMD and 52% on Nvidia. That's a pretty big chunk of people just using integrated.
 
I have lots of questions about the Steam survey. E.g., it asks me on both my laptop and PC, even though Steam on the laptop is there purely for communicating with Steam friends and browsing the store, so does it count me as 2 users? If it does, that's one way it's skewed big time. Is an FX-4*** a 2-core? And as above, does it count an unused iGPU as a GPU?
 


I said "before two years", not two years. In any case here is what Anandtech concluded:

I am concerned that R9 Fury X owners will run into VRAM capacity issues before the card is due for a replacement even under an accelerated 2 year replacement schedule.
 


It would be different "machines", yes. The Steam survey only looks at computers with Steam on them. As for the user difference, I have no idea.

And yes, the unused iGPU counts towards having a GPU, but they are aggregating them differently for the output graph on the main page. You can deduce that by looking at the data inside the graph.

Cheers!
 


With the grunt of two 390Xs, the setup probably has the means to use more than 4GB of VRAM. But still, 8GB is very excessive if you're playing at 1080p only.
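For some perspective on why resolution alone doesn't dictate VRAM needs, here's a quick back-of-the-envelope sketch (my own rough numbers, assuming ~4 bytes per pixel per render target; real engines vary a lot):

```python
# Size of one full-screen render target at common resolutions.
def target_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
    one = target_mib(w, h)
    # e.g. a deferred renderer with ~5 full-screen targets (G-buffer + depth)
    print(f"{name}: {one:.1f} MiB per target, ~{5 * one:.1f} MiB for 5 targets")
```

Even at 4K that works out to a couple hundred MiB for the resolution-dependent buffers; the rest of a 4GB or 8GB card gets eaten by textures and other assets, which is why texture quality matters far more than resolution for VRAM.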
 


If you look at the statistics, Intel only makes up 20% of users, and those are iGPUs. If it included every single person who has an iGPU or an APU's iGPU, it would be insanely high. I think they just check whether you have a dedicated GPU and count that dedicated GPU as the preferred card you use, instead of just lumping it in as integrated graphics. Otherwise it'd be silly, as the stats would indicate something much closer to 100%.

Also, the survey only includes people who willingly participate in it. The last time they asked me was a year ago. If you choose to participate, it includes your info. I doubt they purposely intrude on your private information like that.
 


Did you wait? lol Seems like the 290X is still better than the 380X...