AMD Radeon R9 300 Series MegaThread: FAQ and Resources

Page 21
The 8GB of VRAM on the new 300 series is sort of useless, as the GPU itself couldn't possibly hold a steadily high FPS while using that much VRAM. For example, even when the 970 is pushed to the max and is using over 3GB of VRAM, the FPS drops to 20-30. I think we are about 4 or 5 years away from seeing a GPU powerful enough to use 8GB of VRAM and maintain 60fps. The pairing is mismatched: even if a game used over 6GB of VRAM, it would be so GPU-intensive that you'd be getting 10 to 20 fps. Pointless, really.
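For perspective on where that VRAM actually goes, here's a quick back-of-the-envelope sketch. It's only the standard 32-bit 4K framebuffer arithmetic, not numbers from any real game; the point is that the display buffers themselves are tiny, so filling 8GB means enormous texture/asset loads, which is exactly the GPU-intensive scenario described above:

```python
# Back-of-the-envelope: a single 32-bit 4K framebuffer is tiny compared
# to 8 GB of VRAM -- nearly all of that capacity goes to textures/assets.
width, height, bytes_per_pixel = 3840, 2160, 4

framebuffer_mb = width * height * bytes_per_pixel / 1024**2
print(f"One 4K framebuffer: {framebuffer_mb:.0f} MB")       # ~32 MB
print(f"Triple-buffered:    {3 * framebuffer_mb:.0f} MB")   # ~95 MB of 8192 MB
```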
 


I think that those 8GB of VRAM are just an excuse to call it an "upgrade". Otherwise, nobody is going to buy a rebrand with almost the same specs.
 
AMD's HBM Fiji GPU Is Made In Korea, But Not By Samsung
http://wccftech.com/amd-fiji-gpu-powering-fury-manufactured-samsung/

AMD's Radeon Fury X architecture revealed (preview)
http://techreport.com/review/28499/amd-radeon-fury-x-architecture-revealed

AMD withdraw KitGuru Fury X sample over ‘negative content’
http://www.kitguru.net/site-news/announcements/zardon/amd-withdraw-kitguru-fury-x-sample-over-negative-content/
DRAMA!

AMD Radeon R9 390X in CrossFire at 4K
http://www.tweaktown.com/tweakipedia/91/amd-radeon-r9-390x-crossfire-4k/index.html
 
Well, like I said a few pages ago, I'd say the 390X is just a neat replacement if you're thinking of going 2x 980s (non-Ti); now you have the option of going 2x 390Xs instead. They should hold up longer at higher resolutions given the higher VRAM count. I can't say how much longer, but they should edge out the lifespan of a couple of 980s. That is, if you don't care about power consumption.

The 970 is and will still be the 1080p king though.

Low tier, I don't have a clear picture, to be honest. I'm not sure where the 380 stands now. I guess it beats the 960, right?

In any case, the official Fury reviews can't come soon enough, haha.

Cheers!
 


Totally agree! The 300 series wasn't meant to be anything major; only the Nano and Fury were.
 
All I want is to be able to play Witcher 3 on ultra at 4K without ever seeing FPS below 60. Is that so much to ask for!? hehe. Forcing DisplayPort on the new-gen GPUs is a horrid idea. What a stupid damned idea. Like someone sat at a table and said "let's make less money instead of more" and everyone around that person said "BRILLIANT, YOU GET A RAISE FOR SUCH A GREAT IDEA!"

ugh
 

Unfortunately for us common folks, industry standards move at a much slower pace than tech.
 
Well, we have our first YouTube reviews in, and here are some of the benchmarks we've seen so far. It looks like the 390 is a rebranded 290 and the 390X is a rebranded 290X.

[benchmark screenshots]


 
:)

Except in this case with DisplayPort, the GPU industry is more than two steps ahead of the current technology standard, so it makes no sense. DisplayPort isn't included on a lot of 4K TVs; it hasn't been adopted on the best-rated 4K TVs yet. Standard HDMI please, stop with the fancy stuff that even fewer people own. 😛



 
HDMI 2.0 isn't FreeSync compatible. For all gaming purposes, DP is the way to go IMO. The only problem is DP can't carry its signal over 4 meters or so.

It's like when DVI and HDMI were both around. DVI was and still is the better standard, but everyone went nuts over HDMI and I have no idea why. Maybe because of TVs?

In any case, wait until adapters come out. DP has always been compatible with HDMI in a simple way, so I'm expecting that adapters will come out soon.

Cheers!
 


I dunno. As much as I'd love to say the Fury X will do 60 fps at 4K ultra in Witcher 3, I doubt it. I'd say a more reasonable 45-60 fps with occasional dips is what I'm betting on. They said the Fury X will do smooth framerates at 4K and even 5K, but all those benchmarks were probably done in AMD-partnered games; until I see it with my own eyes, I'll be doubtful. I'd say a Fury X2 is your best shot at a stable 60fps, or maybe overclocking an X, or getting two Xs if they can run in CrossFire.
 
Yep, I feel the same. It is a bad deal all around no matter which GPU you end up with.

Route 1: CrossFire R9 390X = horrid deal that costs $900 for two cards. Virtually the same performance as a 290X CrossFire set that you can pull off for $400 total used. I assume there is a slight fps bump over the 290X setup, so I'd expect 50fps, since I get 45fps on my 290X CrossFire setup in Witcher 3 at 4K on ultra.

Route 2: $650 GTX 980 Ti. Near-identical performance to the CrossFire R9 290X setup, with 38-45fps averages in Witcher 3 at 4K on ultra.

Route 3: Fury X. Probably the same as the GTX 980 Ti, except you can expect AMD not to fix anything for 6-8 months with a patch that is needed on day one of the GPU's release. Still won't be optimized for Witcher 3. Can't be used on a 4K TV without DisplayPort, which only a select few actually have.

Route 4: Fury X2? Should achieve 60fps easily in Witcher 3 at well over $1,000; it may push the $1,300ish price mark IMO. Still no Witcher 3 optimization or patches for months after the card's release. Can't be used on a 4K TV without DisplayPort, which only a select few have.

Route 5: SLI GTX 980 - Same as the Fury X2, maybe at the same price point.

Route 6: Single 290X/GTX 970 - best overall deal: run at 2K instead of 4K and get 50+ fps on ultra.
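For what it's worth, the routes above boil down to a rough cost-per-frame comparison. A quick sketch using only the prices and Witcher 3 fps figures quoted in this post (all approximate; the single-290X price is assumed to be half of the $400 used pair):

```python
# Rough cost-per-frame comparison using the prices and Witcher 3 fps
# figures quoted above (all approximate, 4K ultra unless noted).
routes = {
    "2x 390X (4K)":        (900,  50),   # $900 for the pair, ~50 fps assumed
    "GTX 980 Ti (4K)":     (650,  42),   # $650, middle of 38-45 fps range
    "Fury X2 (4K)":        (1300, 60),   # speculative $1300ish, 60 fps hoped for
    "Used 290X CF (4K)":   (400,  45),   # $400 used for two, 45 fps measured
    "Single 290X (1440p)": (200,  50),   # assumed ~$200 used, 50+ fps at 2K
}

# Sort cheapest-per-frame first.
for name, (price, fps) in sorted(routes.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name:22s} ${price:>5} / {fps} fps = ${price / fps:.1f} per fps")
```

Unsurprisingly, the single older card at 1440p comes out far ahead on dollars per frame, which is the Route 6 argument in a nutshell.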

Route 6 looks like the best option and is what I am doing right now. I sold off my second 290X and dropped back from 4K to 2K resolution, and it still looks incredible.

Funny how the new GPU market and models make some of us take steps backward and use the older generation. Prices aren't getting better; price-to-performance is being flung out the window and replaced with just performance at a high price. Stupid marketing by Nvidia to release the Ti so soon after the 980, plus the insane price of the Titan. Even more stupid marketing by AMD to release the 390X as a rebrand and not bother to tell anyone, "Hey guys, we know the 290X has issues with some games like Witcher 3; we are going to fix it, don't worry."

Instead, nada. Yet we will still shell out the cash to them. >.>



 
Question: why are the thermal-image temperatures taken in the Tom's review so drastically different from Guru3D's?

For reference: http://www.guru3d.com/articles_pages/msi_radeon_r9_390x_gaming_8g_oc_review,10.html
 

Check the comments section/thread. FormatC explained how he tested and posted a link to an article from the German site.
 


Think I'm having a blind moment; can't seem to find it.
 
If I can get The Witcher 3 to run on ultra at 1440p/144Hz with 60fps, I would be happy. For me, 4K is out of my budget. My GTX 670 is making me regret playing Witcher 3 before buying a new card, because sometimes the cutscenes just glitch and look awful.
 
These power numbers are a problem.

More power than a 7990 is a big problem for any potential Crossfire setup. Almost 100 watts more than a 290x. It seems like they did all they could to push that Hawaii core to its limit to reach their target performance goal.

The score for 'Average' means:
"Average: Metro: Last Light at 1920x1080, representing a typical gaming power draw. Average of all readings (12 per second) while the benchmark was rendering (no title/loading screen). In order to heat up the card, we run the benchmark once without measuring power consumption."

http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/28.html
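TPU's "Average" number is literally just the mean of every sample captured while the benchmark renders (12 readings per second). A minimal sketch of that averaging, with invented wattage readings rather than actual 390X measurements:

```python
# Mimic TPU's method: average every power reading taken while the
# benchmark was rendering. These wattage samples are made up for the
# example, not actual 390X measurements.
readings = [275.0, 301.5, 289.0, 310.2, 295.8, 283.3]  # watts (12/sec in reality)

average_draw = sum(readings) / len(readings)
print(f"Average gaming power draw: {average_draw:.1f} W")

# Performance per watt follows directly: average fps / average watts.
avg_fps = 55.0  # hypothetical 2560x1440 average
print(f"Perf/W: {avg_fps / average_draw:.3f} fps per watt")
```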
[average power draw and performance-per-watt charts]
 




I'm in the same boat, kind of. I thought I had it all figured out and was going to wait until we got benchmarks for Fury, which really looks good, but I'm still leery about 4GB of VRAM, even if it is HBM. I thought about all of those options as well; the 295X2 is one you left out, but it runs into similar problems with AMD support for certain games and CrossFire profiles.

I was going to do the EVGA GTX 970 and then step up to the 980 Ti, but those are basically just reference cards, so not much OC room. So now I am thinking either the Gigabyte Windforce GTX 980 Ti at $689 or the Hybrid at $749, but I really don't want to spend $749.

The Fury still looks good and I might just say F it and buy that card, even with 4GB of VRAM. I really hope we get some third-party benchmarks before launch, or day of, with a true overclock so we can see how far it can be pushed. I don't really want to drop $649 for it and then find out it can only overclock 10%; they aren't allowing any changes to the water cooling.
 
Just bite your lip until the benchies come out, Reaper. Hold on to your hard-earned money until you know if Fury is a worthy buy or not.

We all hope it will be, but until the benchies, you can't count on that blindly.

Also, the power consumption numbers for the 390X looked so weird... and now I know why: that's an OCed version of the 390X. The power numbers for the reference 390X are not in that chart. It's weird for TPU not to list such an important bit of information, when in other charts the reference card is there. I'm not saying it's going to be less than the 290X, but 4GB of extra RAM and a few extra MHz don't come for free, even if the 28nm process is more mature.

Cheers!