FX 8350 with R9 390 (will it bottleneck?)

lionheart051

Reputable
Sep 28, 2014
14
0
4,510
So will this CPU bottleneck an R9 390 GPU? I think I read somewhere a while back that it's OK with an R9 290X, but since the R9 390 is a newer card, I was wondering if this CPU will cause a bottleneck.
 

BombTech4

Reputable
Jun 28, 2015
1
0
4,510


The average gamer will ALWAYS have a bottleneck somewhere in their system. If you are like the 99% of PC gamers who buy parts based on a set budget, you can't get around it. Everyone wants to whine and complain about bottlenecking something, when the average user won't even notice the subtle difference.

Like the other guy said, a true bottleneck is when you pair a drastically lower-end something with a drastically higher-end something. The way you explained it, the subtle differences in electrons passing through the motherboard can be considered a bottleneck (which is technically true, but obviously not noticeable). Short of buying every high-end component brand new every 6 months, you just can't get around it (and even then I don't think you can, if you go by the technical definition).

Here's an idea: stop telling people they are bottlenecking unless they are truly going to surpass the limitations of a component. And if you don't really know what the true limitations are, then put your fanboyism aside and stop justifying the ridiculous cost of your Intel components that are still... gasp... bottlenecked by definition.

 

ddog

Reputable
Oct 11, 2014
44
0
4,540
I have been using an AMD 8350 paired with a Gigabyte R9 390 for about 3-4 months now, and I haven't had any issues running some of the more demanding games. Almost all the games I have run sit in the 60 fps range maxed out. I use a 1080p 60Hz monitor and have to turn on vsync in most games to stop screen tearing.

Just to give some examples at max settings: Far Cry 4 runs at 60 fps and over, the Metro series at around 60 fps, MK X at 60 fps, and MGS5 at 60 fps, while Crysis 3 manages 30-40 fps and is completely smooth and playable.

The only problems I have encountered so far were Dying Light and Evolve. Dying Light runs like crap on max settings, though from my research this game seems to not run properly on AMD GPUs. Evolve, on the other hand, runs at 60 fps on max settings, but there are some instances when I noticed a slight "stick", with a dip in the fps for like 1 ms, before it continues on smoothly at 60 fps. Sure enough, MSI Afterburner shows that there is a bottleneck in THIS particular game, with the 8350 at 100% utilization and the GPU at 35 to 40% utilization; however, it is completely playable to me. But comparing Evolve to the likes of Far Cry 4, I will assume this was just bad optimization.

I have run a lot more games on it that I didn't mention, and they all ran flawlessly. Based on first-hand experience I can recommend this setup.
 

lionheart051

Reputable
Sep 28, 2014
14
0
4,510

I asked for an answer, not your biased fanboyism. The R9 390 has already destroyed the GTX 970 and the GTX 980 (in some benchmarks), so don't come at me with foolishness. The guy in that thread is complaining about his mobo, not his CPU; his mobo is the problem. Stop letting your fanboyism cause you to give bad advice. I might not be a pro, but I know a little about computer hardware (enough to detect foolishness). Your kind of advice is dangerous to people who are new to this sort of stuff, and it's the reason why some people don't want to ask questions on Tom's Hardware.

I usually don't respond to these kinds of posts, but I'm tired of guys like you spreading misinformation and foolishness because of your blind fanboyism. Stop it!
 

jkteddy77

Honorable
Jun 13, 2013
1,131
0
11,360
As someone who spent HUNDREDS of hours and started nearly 50 threads on here around the 290's release nearly 2 years ago, let me shed some light here.
In certain titles, the 8350 bottlenecks the 290 a ton, dropping it down to nearly 70% GPU usage in GPU-bound games (the biggest problem was the BF games). That's REALLY bad: you are only getting 70% of the performance you paid for!!! May as well have just bought a slower GPU.
In nearly all games you'll see a 5 fps loss vs an i7, and maybe lower minimums, but in other games you can see massive drops.
In certain bottlenecked games, I saw 10 fps less on average and minimums 25 fps lower than I get now with my Intel.
Not to mention that turning down the settings didn't give me much more FPS than Ultra did; that is a VERY significant sign of a CPU bottleneck when the game is GPU-bound (today, I'd say 95% or more of titles are).

Here is some actual data I RECORDED, not just word of mouth. Absolutely unbiased footage, unlike the wildly varying results I see linked every day and unlike the images above. This isn't something a canned benchmark can settle, since every game requires different amounts of CPU resources (not to mention most benchmarks aren't that CPU intensive in the first place).
Here is some Real World bottlenecking.
Pay attention to GPU and CPU graphs as well as usage levels.

https://www.youtube.com/watch?v=DcTPLMuQ610
https://www.youtube.com/watch?v=Aq9dSLOElX4
https://www.youtube.com/watch?v=IaVKMNL-aS4

In GPU-bound games, if your GPU is ever lower than 99% usage with Vsync off, you ARE bottlenecking, and I saw that routinely with the 8350. I show this reading often throughout the videos, and some parts even have it live overlaid on the screen.

If you want me to explain what some of the graphs mean I can, but basically, as seen in the gameplay graph in the bottom-left corner, the yellow (CPU) line and the green (GPU) line should be right on top of each other. If one is higher than the other, that part is bottlenecking: it is holding the other back, and that is a true bottleneck. In the third video I even locked the framerate so the graph showed this correlation more clearly and smoothly. Even though both parts were below full usage in that video because of vsync, their usages should still match up with each other, which the graph clearly showed they did not.

Also note that the more pixels you are pushing, the less bottlenecking you will experience. Pushing more pixels or higher-resolution textures gives the GPU more to do before it needs its next instructions from the CPU, effectively alleviating the bottleneck.
If you're at 1600p or 2160p you don't have as large a problem: the GPU is being stressed very hard, and the slower CPU becomes less of an issue because the GPU is more preoccupied. But at 1080p? There is not enough for the GPU to render before it craves new instructions of what to render next from the CPU. If the CPU isn't keeping up, the GPU starts to idle, hence the drop in GPU usage, and it's not rendering its maximum potential number of frames, hence the FPS drop. That is the science of bottlenecking.
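
To put that rule of thumb in concrete form, here is a minimal sketch (illustrative only, not from the videos; the log format, the sample values, and the 90%/95% thresholds are assumptions) that flags likely CPU-bound moments from a hardware-monitor style log of CPU and GPU utilisation, following the idea above: GPU usage well below full while the CPU is pegged, with Vsync off.

```python
# Rough sketch: GPU starved while the CPU is near saturation points to a CPU
# bottleneck. Sample values and thresholds are made up for illustration.

samples = [
    # (cpu_util_percent, gpu_util_percent), one reading per second (hypothetical)
    (97, 68), (95, 72), (60, 99), (55, 99), (98, 70), (62, 98),
]

def is_cpu_bound(cpu_util, gpu_util, gpu_floor=95, cpu_ceiling=90):
    """GPU below gpu_floor while the CPU is at or above cpu_ceiling."""
    return gpu_util < gpu_floor and cpu_util >= cpu_ceiling

cpu_bound = [s for s in samples if is_cpu_bound(*s)]
print(f"{len(cpu_bound) / len(samples):.0%} of samples look CPU-bound: {cpu_bound}")
```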

Since I got my 4790K, I don't ever see lower than 97-99% GPU usage; it fixed every low-FPS issue I was having in all of my games (I've tested nearly 300 games) OVERNIGHT.

So if I had trouble 2 years ago, I am assuming today's games still aren't faring any better, if not worse, with this combination of hardware.

If you can live with bottlenecked performance to save money, grab an 8320/8350, but I highly recommend an i5 over one any day.
Maybe Zen will be better???
 
No, it won't bottleneck.

Intel i5 and i7 CPUs, and also the FX 6300 and 8300 series, won't bottleneck a single high-end GPU in modern, multi-threaded games.

They also won't bottleneck most 2nd tier Crossfire/SLI setups either, such as R9 280 / HD 7950.

An FX 6300 with dual R9 390s or GTX 980 Tis will start to show a bottleneck in some games.


Example: I had an old Phenom II X6 1090T @ 4.0GHz that would fully load both of my HD 7950 GHz Edition cards (R9 280) in CrossFire in Crysis 3 on max details except for 2x SMAA (Tom's recommended AA mode for the best performance/appearance balance). The CPU would be around 90-95% usage on all 6 cores, and both GPUs would be at 95-99% usage. No bottleneck, well balanced, average FPS in the 50s.

A midrange OC of 4.4-4.6GHz on an FX 6300 or 8350 will be plenty to handle pretty much any single GPU.
 
Solution

DubbleClick

Admirable
It does bottleneck an R9 290, therefore it bottlenecks an R9 390. Yes, you'll still get playable performance, but an (i3/)i5/i7 will give better results. This is shown by any and every review a Google search turns up. There is currently no AMD CPU that doesn't noticeably hold back high-end GPUs.

Anyone claiming there won't be a bottleneck either doesn't understand the word, has no knowledge in this area (didn't read benchmarks and/or has no idea what causes it), or is simply straight-out denying facts (fanboy).
 


5-10 FPS compared to, say, an i7-4790K =/= a "bottleneck." A "bottleneck" would mean that there is a MASSIVE performance hit compared to a comparable component (say, an i5). For instance, a first-gen i5 coupled with a GTX 980.

The 8350 can hold its own at 50+ FPS in almost every game out there on a decent graphics card.
 

IamTimTech

Admirable
Oct 13, 2014
1,685
0
6,160


You're misinformed and you're biased. Don't even know where to begin.
 

DubbleClick

Admirable
Exactly, even i5s and i7s do bottleneck high-end GPUs in a few situations, although the difference between an i7-4790K and an i7-5960X @ 4.4GHz is usually very small, below 5%. The difference between an FX 8350 and an i7-4790K, on the other hand, is around 20-30% on average, and higher in frame time variance and stutters.

So even if you'd like to call minor differences "measuring inaccuracy", an FX 8350 would still absolutely bottleneck an R9 390 in the majority of titles.

 

partiesplayin

Distinguished
Nov 20, 2013
96
0
18,630
The FX 8350 will definitely bottleneck two high-end cards such as two 290Xs, two 780 Tis, or two 980 Tis. My FX was bottlenecking my 980 Tis and holding them back because the CPU's ability to handle physics was just too low.
 

gab_th

Distinguished
Jul 3, 2010
247
0
18,710


The 980 is definitely a beast and I would hesitate to compare it with the R9 390, but to be honest, the 970 is comparable to it, and I think the 390 wins out. I don't know if you have seen it already, but take a look at this article: http://www.tomshardware.com/reviews/sapphire-nitro-r9-390-8g-d5,4245.html

They're more or less on par, with the 390 being slightly ahead in most games (GTA V being the most noticeable), so I would be more inclined toward getting the 390, and if I could find it for cheaper it would be an absolute no-brainer.

I think it's a situation similar to the 960 vs the R9 280. The 280 is SLIGHTLY (DISCLAIMER: SLIGHTLY) more powerful in most games, and despite that, the 960 is SLIGHTLY more expensive.

 

Bem-xxx

Reputable
Sep 20, 2015
163
0
4,710


FX won't bottleneck anything.

[Benchmark chart: Core i7-4770K vs AMD FX 8350 with GTX 980 vs GTX 780 SLI at 4K]


 

Eyeball07

Reputable
Oct 11, 2014
51
0
4,630


Sorry to revive this thread, but you're saying that an FX-6300 won't bottleneck an R9 390. I currently own an FX-6300 (clocked at 4.9 GHz) and an R9 390, and I'm getting bottlenecked out the ass. In BF4, my CPU utilization goes up to 95% and I'll get 48fps (not always, but it's ridiculous that it would go below 60fps at 1080p).
I need to upgrade my CPU, so should I go with an FX-8350/8320? I don't want to be stuck in this same situation.
 

gab_th

Distinguished
Jul 3, 2010
247
0
18,710


If you're unwilling to, or can't afford to, change mobo + CPU to an i5, then the FX-8350 is the way to go for you. The i5 is currently the best CPU for gaming, but the FX-8350 is still a very powerful one, and luckily it uses the same socket as the FX-6300.

 

gab_th

Distinguished
Jul 3, 2010
247
0
18,710


The difference in performance is rather slight, so going for the cheaper option is reasonable. What country is that, BTW? I've never seen Nvidia products priced cheaper than AMD on Newegg, Amazon, etc.

 

superstition

Distinguished
Feb 27, 2009
150
0
18,710
Generally speaking, all CPUs bottleneck high-performance GPUs somewhat. If you want minimal bottleneck, get the fastest Skylake i7.

Or, if you have the right equipment, you can overclock your FX chip to 4.6 GHz or so (with APM off). It won't be as fast as the i7, but it's cheaper and should be fine for most games. You can get an 8-core FX for $100 at Microcenter and get $40 off a board. All 8-core FX chips should overclock to at least 4.5 GHz with decent voltages, but you will need a board with at least a doubled 4-phase digital VRM and a fan blowing onto the VRM heatsink.

Interestingly enough, in some games at 4K with two-GPU setups, AMD FX chips can outperform a $1000 Haswell or be right with it. I suppose it's because such scenarios are more GPU-bound. Lower resolutions tend to magnify the differences in CPU performance.
 

Negativelead

Reputable
Dec 5, 2014
167
0
4,690
I have a 390X with an 8350. I don't really see any lag or screen tearing when I run my games. But... when I bench them, the numbers are not all that impressive. I benched ACU at around 45 fps today.

So I really think bombtech4 made an intelligent post. The history of PCs (consumer and otherwise) has consisted of moving the bottleneck from one place in the system to another. Technically speaking, even the PCI bus can bottleneck your GPU. But will you have sufficient gameplay? Yes! Will you notice any real-world detriment? Not really.

So you'll be OK. But a lot of it is going to be psychological: if you constantly think about the bottleneck, you may be more apt to notice it. But if you want to enjoy your games and just have fun, you have a very capable system ;)
 
Hello,

"Lag" is frame times, and screen tear is because a display has a static refresh rate, GPU's don't. Both of which don't depend on what GPU brand you're using, one GPU isn't going to have less consistent frame times because it's NVIDIA, that's how well they optimise games, nor is it going to reduce screen tearing. Bottleneck however, FX CPU's are definetely bottlenecking even mid end GPU's, especially in MMO's, I own FX...


All the best!