GTX 970 with an FX-6300... Bottleneck? Help!


DayRider

Hey guys !

I just bought a GTX 970 to replace the GTX 660 I have at the moment... I was SO hyped till I realized it might be a bottleneck. Is there one? And by how much?? I plan on upgrading the CPU to an i7 in a few months, but will I notice a reasonable increase in FPS with a 970?

SPECS: FX 6300

8GB RAM

GTX 970

Win 10


Cheers


-Matt
 
Solution
To nip a few things in the bud, joshyboy & phononboy have covered most of it.

1. A stock 6300 WILL limit the upper fps in newer titles paired with a 970.
2. Overclocked to 4.2GHz you will net a solid 50fps+ in every title apart from a select few.
3. That GTA V bench is hopelessly outdated; updates allowed a lot better multi-core use, and you will rarely see a 6300 drop below 50fps at any time now.
4. Your 1366x768 monitor is a bigger 'bottleneck' - I use that term loosely. At 60fps you won't see above 60% usage. The benefit is it'll run incredibly cool & quiet & offers you headroom for the future, though a 960 or 380 would have given you pretty much the same performance at that resolution.

Put it in your system, enjoy it & stop...


I'm gonna say it is always silly to use YouTube videos as a source of accurate data.
 
Yeah... I plan on buying a 1080p monitor soon. I really hope I can get ultra settings in The Division on my new GTX 970... even with the bottleneck... Is there a major visual difference for someone like me coming from a 768p screen to a 1080p screen?
 


The FX-6300 has a max turbo value of 4.1GHz.
I'm not familiar with the exact overclocking options, though most CPUs have power-saving techniques so they may change the CPU multiplier as the load increases. Hence the "max" value may be 4.1GHz, while under gaming load it may be closer to 3.9GHz.

If we assume the overclock is 4.2GHz under normal gaming load (either a locked multiplier, or starting higher than 4.1GHz and dropping as usual), then that's 4.2/3.9, or under 8% theoretical maximum processing improvement.

With a heavily CPU-bottlenecked game that might only mean about a 5% FPS improvement, such as 58FPS going to 61FPS.

I mainly wanted to point out that the real-world benefits aren't necessarily that large (a couple of game tweaks that don't change graphics much can often do as good or better).
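For what it's worth, here's that back-of-envelope math as a tiny script; the 3.9GHz effective gaming clock and the ~65% clock-to-FPS scaling factor are just illustrative assumptions, not measurements:

```python
# Rough sketch of the overclock math above (all inputs are assumptions, not measurements).
stock_gaming_clock = 3.9   # GHz, assumed effective clock under gaming load
overclock = 4.2            # GHz, assumed all-core overclock

clock_gain = overclock / stock_gaming_clock - 1           # ~7.7% theoretical ceiling
print(f"Theoretical max CPU gain: {clock_gain:.1%}")

# Games rarely scale 1:1 with clock speed even when CPU-bound; assume ~65% scaling.
scaling = 0.65
fps_before = 58
fps_after = fps_before * (1 + clock_gain * scaling)
print(f"~{fps_before} FPS -> ~{fps_after:.0f} FPS")       # roughly 58 -> 61
```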

Other:
*If planning to upgrade the PC as I said before, think carefully about the MONITOR. So many people have crappy monitors and expensive gaming rigs. Since GSYNC is pretty expensive, the monitor to choose depends on:
a) overall budget for monitor + PC
b) when you could replace the monitor (two years?) for a good GSYNC model (NVidia GPU)

Monitor examples (warranties vary a LOT: some Dell are great - 3-year, zero dead pixel, pre-paid waybill to return - while some Acer are crappy - 1-year, a few dead pixels tolerated, etc. Keep that in mind when cheaper Acer pricing looks similar):
$140 (IPS, 5ms, 1920x1080, 23", 60Hz)-> http://pcpartpicker.com/part/asus-monitor-vs239hp

$260 (IPS, 4ms, 2560x1440, 25", 60Hz)-> http://pcpartpicker.com/part/acer-monitor-umkg7aa002
(I bought my sister the Dell version for a bit more money)

$260 (TN, 1ms, 1920x1080, 24", 144Hz)-> http://pcpartpicker.com/part/asus-monitor-vg248qe

$800 (GSYNC, IPS, 4ms, 2560x1440, 27", 144Hz)-> http://pcpartpicker.com/part/asus-monitor-pg279q

$1100 (GSYNC, IPS, 4ms, 3440x1440, 35", 100Hz)-> http://pcpartpicker.com/part/acer-monitor-z35bmiphz

So...
A lot of pros and cons. I have a GTX 680, which is approximately 70% as fast as a GTX 970 (for titles with minimal CPU bottleneck).

However, I have a 2560x1440 IPS panel. I love it not just for desktop usage, but for games with a lot of small text/HUD elements such as CIV5. For games like Skyrim the resolution isn't nearly as important and may not look much better beyond 1920x1080. Having said that, I also prefer 27" or larger monitors, which aren't ideal with only 1920x1080 physical pixels at close distances (such as two feet away) because you can see the individual pixels. So we can't just compare the resolution here.

144Hz panels add a lot to the cost to the point you may have to choose between that or a higher resolution at 60Hz. If you play fast games like shooters then 144Hz might make a lot more sense.

4K models are extremely problematic. Currently the refresh rate is limited so not ideal for gaming. Processing power goes up a lot for usually a minimal visual difference. Many desktop applications have scaling issues that worsen as resolution increases. I do like 32", 4K, IPS panels for productivity though such as multiple programs on-screen rather than multiple monitors. Even then, if you don't mind ultrawide the 3440x1440 is a nice option (price aside).

Adaptive-sync monitors (GSYNC/FreeSync), as said, are awesome but add a lot to the cost. I'd like to see that $800 panel drop to $500 or less in another year. I think it may do so.

The best gaming monitor is arguably the $1100 model I listed. Even then there are pros/cons as the refresh rate is a maximum 100Hz, and ultrawide might not be desirable for some games (though you can just have black bars of course on left/right).

My point again with listing these is to suggest you PLAN your upgrade path over a few years. So maybe something like the $140, 1080p, IPS now, keep that for a few years, then go with a GSYNC (or AMD + FreeSync) arrangement.

Actually, the cost of a 1080p monitor plus replacing your current rig probably gets close to $800USD. If I was going to upgrade your rig I'd be going with:
- monitor, plus
- i5-6600K + mobo + 16GB DDR4 + Windows 10 64-bit + CPU cooler?

That is adding up close to $800. It's pretty hard to compare both scenarios though:

a) GSYNC 1440p only change
- lower FPS overall for CPU bottlenecked games
- games are SMOOTHER (can feel similar or better even at lower frame rates)
- higher res, larger screen, and better picture (vs basic 1080p monitor)

b) 1080p 60Hz + Skylake replacement
- FPS higher in most games (doesn't necessarily mean smoother or better experience in a particular game though)

Summary:
Most people would replace the core PC. I would personally replace the monitor if the above was my only option due to budget.

Based on everything so far, though, I'd suggest doing NOTHING (aside from optimizing/tweaking games) and waiting until at least 2017 to see how AM4 affects pricing. I'd take Intel unless the AMD benefit is significant. Saving $50 on a CPU may seem like a lot, but it's 5% of a $1000 build, so keep that perspective.

Skylake quad-core CPUs have about HALF their die space used by the GPU portion. Thus, AMD may be able to create an 8-core of about the same size (without a GPU... that would be an APU anyway) at the same or possibly a cheaper price. We'll see.

I think Zen CPUs and Zen/Polaris APUs are going to do very well. Probably the same (or a nearly identical) APU will end up in the PS5/XB consoles near 2020 as well.

Okay, must stop talking...
 


My advice: Stop using the word bottleneck. A "BOTTLENECK" has nothing to do with the visual difference between two monitor resolutions. You are killing my patience.
 


I just took this mainly to be an inquiry as to whether the higher resolution by itself made a noticeable difference.

The wording could have been a bit more clear, though I think "you are killing my patience" isn't the best response either.

And for the record, a CPU bottleneck absolutely can affect the visual difference between resolutions. If some CPU bottleneck still exists at 1080p you may have to drop some GPU settings that you wouldn't need to with an i7-4790K or whatever to keep the same frame rate.

With a heavy CPU bottleneck at 768p, the GPU won't be fully utilized. Thus, if he increases the resolution and uses MORE of his GPU he'll get a bigger jump in visual fidelity compared to someone with a GTX 970 but a better CPU (because an i7-4790K setup might have had no GPU bottleneck at 768p and thus could already have run higher visual settings at the same frame rate). Again, that assumes he can meet his frame rate goal, but I'm just trying to give an idea of how it works.

Or (theoretical game):
768p:
FX+GTX970-> 80FPS, medium
i7+GTX970-> 80FPS, high

1080p:
FX+GTX970-> 50FPS, high
i7+GTX970-> 50FPS, high

(eliminating the CPU bottleneck at 1080p would mean the same experience, whereas having a CPU bottleneck means lower visuals at same FPS, or the same visuals at a lower FPS due to the GPU left waiting for the CPU)

So yeah, CPU bottlenecks can affect the difference in visual fidelity in a few different ways.

768p vs 1080p:
*The largest difference aside from monitor size is what I just said above, plus the sharpness of text and the smoothing (anti-aliasing) of jagged edges. In some cases the HUD elements don't scale uniformly, so at 768p they may be too large (or at 1440p/4K too small).

Ultra with "The Division"?
It really depends on what FRAME RATE you deem acceptable. Please refer to my long guide above about the tweaking GOAL (i.e. Adaptive VSync).

It's likely 1920x1080 is ideal for the HUD/Text sharpness for your setup, then after that don't worry too much about meeting specifically "ULTRA" but balance the entire thing for your tweaking goal. That may mean dropping AA, Shadows or other settings.

I've seen lots of games where ULTRA might give 40FPS, and tweaking to give 60FPS gave such a close visual experience that it was almost impossible to notice side-by-side.

Other:
Many games have both GPU and CPU benchmarks. If you see an FX-4300 getting about the same as an i7-4790K then that means there's minimal CPU bottleneck (at those settings).

I've seen several benchmarks for various games where an FX-6300 got 80% of an i7's performance at 768p or 900p, but raising the resolution to 1080p or higher left minimal CPU bottleneck because the GPU is proportionately taxed more heavily (as I said above). Again, the higher resolution lowers the frame rate, so going from 45FPS with a CPU bottleneck to say 30FPS with a GPU bottleneck just to get slightly sharper text may not be ideal.
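A crude way to picture that shift is the usual frame-time view, where each frame waits for whichever of the CPU or GPU is slower. The millisecond figures below are invented purely to illustrate the idea, not benchmark data:

```python
# Toy bottleneck model: FPS is capped by the slower of CPU and GPU per-frame time.
# All timings here are made up for illustration.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 22.0        # assumed FX-6300 time to prepare one frame (~45 FPS ceiling)
gpu_768p_ms = 12.0   # assumed GTX 970 render time per frame at 1366x768
gpu_1080p_ms = 24.0  # assumed render time at 1920x1080 (more pixels to shade)

print(f"768p:  {fps(cpu_ms, gpu_768p_ms):.0f} FPS (CPU-bound, GPU partly idle)")
print(f"1080p: {fps(cpu_ms, gpu_1080p_ms):.0f} FPS (GPU-bound, CPU limit mostly hidden)")
```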

*Don't make the common mistake of asking "can I get Ultra at 1080p", simply applying Ultra settings, and then noting you get at least 30FPS average so that's "ideal". It's not ideal, as I've attempted to state ad nauseam. There's an optimal setup and it ALWAYS starts with your tweaking GOAL (which you may want to save to refer to).
 
Photon boy - your previous reply to mine looks sensible mathematically but isn't true to what actually happens on a stock FX chip under heavy load.

Turbo to 4.1GHz will only happen on 2 cores while dropping the remaining cores to between 3000-3300MHz.

While this was beneficial until say 2014, when most games were mainly dual-threaded, the same is not true now.

GTA V / The Division / AC Unity / FC4 / Primal / Project Cars - in fact in most new titles you'll see a fairly uniform 70-80% load across 4 cores - this is especially prevalent in The Division & GTA V.

Turbo Core causes erratic voltages & clocks when the load switches & can introduce microstutter & significant frame drops - these will only last a few milliseconds but can absolutely spoil fluidity & gameplay.

Overclocking an fx is more about reducing this erratic behaviour & providing a more stable & uniform experience.

If I build an FX rig for someone (bear in mind the only 2 chips I use are the 6300/8320, both with a 3.5GHz base clock) & they're unwilling to pay for aftermarket cooling &/or a 'good' quality rather than 'acceptable' quality board, then I will still set those systems up with Turbo Core disabled & the multiplier set to 19 (3.8GHz).
At those settings it pulls less voltage than a stock setup & will run slightly cooler & quieter, but performance 99% of the time will be equal or better than stock settings & will certainly give a more uniform gaming experience.
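For anyone wondering where the 3.8GHz comes from: on these chips the core clock is just the reference clock times the multiplier, so assuming the stock 200MHz reference clock it works out like this:

```python
# AM3+ FX core clock = reference clock x multiplier (assuming the stock 200MHz reference clock).
ref_clock_mhz = 200
for multi in (17.5, 19, 20.5):   # 3.5GHz base, the 19x preset above, 4.1GHz turbo
    print(f"{multi}x -> {ref_clock_mhz * multi / 1000:.1f} GHz")
```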

The Division, by the way, will be more limited by the 970 at 1080p - it's a GPU hog big time at ultra settings. At 1080p it's high settings for a solid 50fps+; ultra will drop it to the low 40s pretty regularly.

 


I hated quoting this message, because it was so long (that had nothing to do with your points, but the way the system works.) Text will not be controlled by the CPU. They can use two different text formats, one that scales with AA, and one that is pixel based. There isn't a "WHA, uh...here's your sentence, good luck, format for text."

Everything that runs natively at its given resolution will always look better. Even with lower settings. That's why 4K doesn't need AA. The OP was acting like a CoD teen with the word "Bottleneck" and I felt the need to call him out. Everyone else can play around with the word like a putty that they don't want to touch, but for me, I'd rather stand over here and say "Stop".

I play The Division. I run it on a 1080p monitor. I play it on HIGH and get 45 fps. Does the OP consider that a win?
 


I'm not necessarily disagreeing with any point.

1) Can you provide a good LINK about the way Turbo works? (Because the way you describe it really surprised me.) I've seen lots of anecdotal evidence; however, I can't find a good, reputable link explaining it.

2) Uniform loads can be misleading. Code can finish its task and start on another core. Thus you need good BENCHMARKS to extract useful data.

3) Microstutter due to varying frequency/voltage may happen, but it's hard to get good info on exactly what games it happens in, or if Windows 8/10 affects this due to better CPU management.

A few games had issues with HYPERTHREADING on my i7-3770K, then running Windows 10 cleared this up. I never could get reliable information on why. In Spider-Man Web of Shadows it was a complete stutter-fest in W7 (or fine with HT disabled in BIOS). Some stutter in Witcher #1 too, and a couple Ubisoft games. All much smoother in W8 or W10.

4) The Division - I can't test FPS, though I will note that benchmark results for online games are often not accurate. In BF4, for example, they always seemed to run single-player benchmarks and would show an FX-4300 performing the same as an FX-8350. But then they'd show 80%+ usage at times on the FX-8350, which explained the FPS and stutter issues that people with weaker CPUs had.

And many games have periodic stutter due to the CPU being overloaded occasionally even when the AVERAGE can be similar. This ties into your above statement with FX specifically but it can happen with any CPU. That's why we have the "% below 1FPS" or similar wording to describe freezing that affects enough frames to be noticeable.

I've got to stop typing...
 


FTW
 
"I hated quoting this message, because it was so long (that had nothing to do with your points, but the way the system works.) Text will not be controlled by the CPU. They can use two different text formats, one that scales with AA, and one that is pixel based. There isn't a "WHA, uh...here's your sentence, good luck, format for text."

Everything that runs natively at its given resolution will always look better. Even with lower settings. That's why 4K doesn't need AA. The OP was acting like a CoD teen with the word "Bottleneck" and I felt the need to call him out. Everyone else can play around with the word like a putty that they don't want to touch, but for me, I'd rather stand over here and say "Stop".

I play The Division. I run it on a 1080p monitor. I play it on HIGH and get 45 fps. Does the OP consider that a win?"

1) Text - I don't know what your point is. I simply said it can look better if you have more pixels. Like Civ5. No need to confuse the issue.

2) Native resolution - you lost me there. First of all, most games don't have a "native" resolution aside from a few older titles with HUD scaling issues (would get smaller as resolution increases). Then you said (I guess) that some games look BETTER if you drop some of the graphic settings provided the game resolution matches the native desktop resolution? Huh?

Some applications do of course which is why we have to play around with DPI settings and Cleartype for higher resolution monitors especially. But this is about GAMING so again not sure what you're getting at.

3) 4K -> It's not true that 4K can't benefit from anti-aliasing. That's been proven many times. It may not be obvious in some games, but it absolutely has looked better in others. Nor would that explain what you meant by: "That's why 4K doesn't need AA".

By that logic, then a 1366x768 game doesn't need AA provided the game was "optimized" for that resolution and played on a native 1366x768 screen. You seem a bit confused by something but I'm not sure what that is.

4) "acting like a teen" -> ?

5) The Division at 45FPS -> no idea, though I did mention he should set his goal such as VSYNC ON or OFF and plan accordingly.

On a side note, a lot of people throw around words like they run the game at "ultra" or "max" settings, which could mean 30FPS at 768p for all we know. Running 60FPS@1080p vs 30FPS@768p is a huge difference in terms of processing power.
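To put a rough number on that gap in raw pixel throughput (ignoring that per-pixel cost also changes with settings):

```python
# Rough pixel-throughput comparison between the two claims; ignores per-pixel cost of settings.
def pixels_per_second(width, height, fps):
    return width * height * fps

low = pixels_per_second(1366, 768, 30)     # 30FPS @ 768p
high = pixels_per_second(1920, 1080, 60)   # 60FPS @ 1080p
print(f"{high / low:.1f}x the pixel throughput")   # roughly 4x
```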

Other:
Viewing distance is also important, which is why 768p in some titles looks fine on a console but way worse sitting close. That incidentally exposes a rather glaring problem with 4K HDTVs: if you sit close enough that 4K content looks noticeably better, then other content can look horrible.
 


A link mate? No, not really - I don't use anecdotal evidence from the net; all the testing is my own, which is why I know it's correct.

Done a quick one for you: CPU PassMark + AIDA64 stress test, one at completely stock settings, one at 3800MHz with turbo disabled.
You'll see on the stock run that the readings in red are the turbo, & you'll see how inefficient it actually is in both the clock speeds & the multi-thread PassMark rating - also notice that at 3800MHz across all cores & with turbo disabled the CPU pulls less load voltage than at stock settings.
3.8GHz with turbo disabled is a win-win on both the 6300 & 8320 CPUs compared to stock settings - no extra voltage (less in fact) & absolutely comparable temps - and this can be done on literally any board capable of running the 6300, even with the stock cooler.

stock
test_stock.jpg


3.8ghz

test_3800.jpg
 


OK, let's hope that the bottleneck won't be more than 10-20 fps lower than an i5... I'm sure if I buy a cooler for $30 and OC to 4.6GHz there will be no bottleneck though?

 
@DayRider I'm confused by your statement of buying a 300 NZD 1080p display, when you stated earlier that you were out of money.

"The thing is , im OUT of money . i cant upgrade as of now. Any of you know how much bottleneck their will be ?"

Perhaps I misread some additional responses, but knowing that you had 300 NZD (New Zealand Dollars) would have changed the advice you received. For instance, if you're that concerned about bottlenecks then you could have bought an i5-6400 (300 NZD) and a cheap motherboard from PB Tech (http://www.pbtech.co.nz/index.php?z=c&p=cpus). The additional 50 NZD could possibly be recouped by selling the used FX-6300. I also believe that Pascal is worth waiting for.

@madmatt30 Good job on taking the time to benchmark.
 
I had this same issue when I got my first 970, so here is what it is:
Yes, it will get bottlenecked, but the FX-6300 is highly overclockable with a good cooler (got mine to 5.4GHz).
Upgrade later like I did, but until then overclocking the CPU to even its turbo speed of 4.1GHz will help.
 


I think ultimately you've been confused by all this 'bottleneck' talk, which is why it's a term I refuse to use unless absolutely necessary (which is not often).
On a 60Hz screen a 6300 with a slight overclock (4GHz) will power a 970 absolutely fine; 50-60fps is achievable in 99% of current titles with a bit of tinkering.
It will never match the upper fps limits of a good quad-core Intel chip, but unless you're pushing a higher-Hz screen it really doesn't matter IMO.
All your games will be playable & they'll all look nice - bear in mind that although on release the 970 could stomp through anything at max settings, on newer titles you're looking at somewhere between high & ultra to maintain a solid fps rate with no fluctuation.
I've actually taken to running my screen at 1080p 50Hz now so all games default to 50fps with adaptive vsync on - I absolutely challenge anyone to tell the difference between 50 & 60 fps if it's a solid rate with no drops.
This generally maxes my GPU usage at around 75% & the card stays below 65C at all times - this is & always has been the way I like to run my components. I personally see no point pushing a constant 100% usage just to see 80-90fps on a 60Hz screen.

& yes, an aftermarket cooler on an FX CPU is always recommended. What you buy & how far you can overclock depends on what motherboard model you're running?

& like rcald2000, I'm also confused as to you having no more budget and then dropping $300 on a new monitor - I'm not saying it wasn't a decent thing to do, just confusing.
What monitor is it btw?