[SOLVED] Does the AMD FX-6300 bottleneck my 1060 6GB? Massive FPS drops

LewisMann

Reputable
Mar 2, 2019
66
5
4,545
Does the AMD FX-6300 bottleneck my MSI 1060 6GB? I get massive framerate drops in Fortnite. When I first load up Fortnite I get a constant 120-170 fps, but after a while it drops to 40-80 fps and the game is super laggy!

What can I do if my 6300 is bottlenecking my 1060? Should I look at buying a new cpu, ram and motherboard?



I have water cooling, I have the Corsair H105

GPU:MSI GTX GeForce 1060 6GB

CPU: AMD FX-6300

RAM: Kingston HyperX Fury DDR3 (not sure exactly what frequency, but it shows as 933MHz)

Power Supply: Corsair CS650M
 
Solution
OC that FX to 4.6GHz or better, the cooler can handle it. The FX series suffers from a serious flaw: it's got roughly 67% the IPC (instructions per clock) of a 3rd-gen i5. So if you are still at the native boost clock of 3.8GHz, that's only on 2 cores. Cores 3-4 are at 3.6GHz and 5-6 at 3.4GHz. With its IPC, you are roughly equivalent to an Intel running at 2.6GHz. You need to increase clock speed to get more instructions per time period, which equates to more frames per second.

You are getting hard drops after load-up because loading only uses 2 cores; after that your clock speed drops, and so do your frames.

933MHz is the single data rate, the actual speed of your RAM. Your RAM however runs 2 data streams; it's DDR3, dual data rate - 3rd...

Karadjgne

Titan
Ambassador
OC that FX to 4.6GHz or better, the cooler can handle it. The FX series suffers from a serious flaw: it's got roughly 67% the IPC (instructions per clock) of a 3rd-gen i5. So if you are still at the native boost clock of 3.8GHz, that's only on 2 cores. Cores 3-4 are at 3.6GHz and 5-6 at 3.4GHz. With its IPC, you are roughly equivalent to an Intel running at 2.6GHz. You need to increase clock speed to get more instructions per time period, which equates to more frames per second.

You are getting hard drops after load-up because loading only uses 2 cores; after that your clock speed drops, and so do your frames.

933MHz is the single data rate, the actual speed of your RAM. Your RAM however runs 2 data streams; it's DDR3, dual data rate - 3rd iteration, commonly referred to by its DDR rate, which is 1866MHz.

Your cpu isn't bottlenecking your gpu. The cpu is working exactly as intended 6½ years ago when it was released. Your issue is that you've paired it with gpu technology that surpasses the best that was available then. It's not slowing down your gpu; it just isn't able to live up to the gpu's potential in this game.
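If it helps, here's the math above as a quick Python sketch. The 67% IPC figure and the clock speeds are estimates from this thread, not measured values:

```python
# Back-of-the-envelope numbers from this post. Both figures here are
# the thread's estimates, not benchmarks.

def effective_clock_ghz(clock_ghz, ipc_ratio):
    """Approximate 'Intel-equivalent' clock: actual clock scaled by relative IPC."""
    return clock_ghz * ipc_ratio

def ddr_effective_mhz(base_mhz):
    """DDR transfers data twice per clock, so the rated speed is double the base."""
    return base_mhz * 2

print(round(effective_clock_ghz(3.8, 0.67), 2))  # 2.55 -> the ~2.6GHz mentioned above
print(ddr_effective_mhz(933))                    # 1866 -> your RAM's real DDR3 rating
```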
 
Last edited:
  • Like
Reactions: LewisMann
Solution

LewisMann

Now I understand fully! Thanks man. Knowing all this, do you think it would be a better option to upgrade my system with a new CPU, motherboard and RAM?
Will the 6300 do anything to slow down my gpu? I have an R9 270x, my old gpu, that I could be using instead, because I thought the 6300 may be reducing the 1060's potential and slowing the actual graphics card down - so that even with a new cpu it would still be slowed down because of the effects from the 6300, if you get me? Is this a thing or no?

Thanks
 

Karadjgne

Everything works at 100%, all the time. It doesn't matter if the load is only using 20% or 90% of the cpu, the cpu is still running at 100%. Kinda like a car engine at 3000rpm: it doesn't matter if it's in 1st gear doing 30mph or 5th gear doing 90mph, it's still at 3000rpm. So your cpu, no matter what you do, will still pre-render frames as fast as it should and hand those pre-renders to the gpu. Depending on detail settings and resolution, the gpu will render those pre-renders and throw them on screen. The gpu will also run as fast as it should; you can't slow it down like that. Some frames are highly detailed, like explosions or grass, so they take longer to paint, but the gpu is still running at its 3000rpm. The longer it takes to paint the picture, the fewer frames it can paint per second.

With your 6300, let's say it can only pre-render 60 pictures every second. That's what it sends to the gpu. The gpu laughs at the cpu because it could paint 100 of those pictures every second.

A modern cpu, with the ability to throw 200 frames of that same picture at the gpu, now laughs at the gpu who is trying hard to get that 100 pictures painted.

Usage is different. With usage, the cpu might be at maximum pre-rendered frames of 60 per second, but the game engine only uses 50% of the cpu to get that. Like only uses 3 cores. Games don't always use every available resource, just the ones the code says to use. Same with gpu.

The way to tell if the cpu is maxed is to change detail levels. Say you get 60 fps at medium settings. Increase detail to ultra/max. If you still get @ 60 fps, then that's all the cpu can do, the gpu can still do more. If fps drops drastically then the cpu is giving plenty, but the gpu is in trouble.

If you believe the cpu isn't enough for Fortnite, you should be able to throw that game on epic settings and play at the same fps as at low, ± a few fps. If fps tanks hard, then you have issues with the gpu, or its drivers, or you have 4K DSR set, etc.
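The pipeline and the detail-level test above can be sketched in a few lines of Python. The fps numbers are made up purely for illustration:

```python
# Toy model: on-screen fps is capped by whichever side is slower.

def onscreen_fps(cpu_prerender_fps, gpu_render_fps):
    """The gpu can only paint frames the cpu has already pre-rendered."""
    return min(cpu_prerender_fps, gpu_render_fps)

# The detail-level test: raising settings loads the gpu, not the cpu.
fps_medium = onscreen_fps(cpu_prerender_fps=60, gpu_render_fps=100)
fps_ultra  = onscreen_fps(cpu_prerender_fps=60, gpu_render_fps=65)
print(fps_medium, fps_ultra)  # 60 60 -> fps barely moved, so the cpu is the limit
```

If cranking the detail had instead pushed the gpu side below 60, fps would have tanked and the gpu would be the one in trouble.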
 
  • Like
Reactions: LewisMann

Karadjgne

You can't just jump into it, sorry, I should have been a little clearer. There are plenty of YouTube videos, forum how-tos etc. for those cpus. It doesn't really matter which make your bios is, the OC theory is all the same; different vendors and bios revisions just use slightly different terms and locations. The fx6300 is well known as an OC super cpu, one of the best to OC, and many will hit 5.0GHz depending on your mobo - something even Intel couldn't do back then. But I've not heard of a 6300 so unlucky in the cpu lottery that it couldn't hit at least 4.3GHz. Meaning there are steps and settings you didn't take/use somewhere.
 
  • Like
Reactions: LewisMann
The FX, even if running at 4.3 GHz, should just about match an i3-6100 (clocked 600 MHz lower at 3.7 GHz) in 1080P average gaming frame rates...

Nonetheless, many folks were/are happy with them, but they should be upgraded ASAP if planning on tangling with BF1, BF5, etc...

An i5-8400 or R5-2600 should be about the minimum these days... (my own 7700K is still hanging in there, but has certainly fallen from its lofty perch of 2 years ago!) :)
 
  • Like
Reactions: LewisMann

LewisMann

Yeah, I'm planning to stream Fortnite while I play. I want roughly a consistent 150 fps - what should I do? Look at upgrading my CPU? Because I have just played a squad game and it was unplayable: it was stuttering constantly, I think maybe screen tearing, my fps wasn't even reaching above 50, and the game was incredibly laggy and literally unplayable.
 

LewisMann

You can't just jump into it, sorry, I should have been a little clearer. There are plenty of YouTube videos, forum how-tos etc. for those cpus. It doesn't really matter which make your bios is, the OC theory is all the same; different vendors and bios revisions just use slightly different terms and locations. The fx6300 is well known as an OC super cpu, one of the best to OC, and many will hit 5.0GHz depending on your mobo - something even Intel couldn't do back then. But I've not heard of a 6300 so unlucky in the cpu lottery that it couldn't hit at least 4.3GHz. Meaning there are steps and settings you didn't take/use somewhere.
Does AMD Overdrive OC my CPU or should I go into the BIOS and OC there?
 
Yeah, I'm planning to stream Fortnite while I play. I want roughly a consistent 150 fps - what should I do? Look at upgrading my CPU? Because I have just played a squad game and it was unplayable: it was stuttering constantly, I think maybe screen tearing, my fps wasn't even reaching above 50, and the game was incredibly laggy and literally unplayable.
To get close to 150fps in all games you'll need a modern i7 minimum paired with a strong gpu. If you lower expectations a little you can save a lot with Ryzen.
 

Karadjgne

2600/X or better, i5 8600k or better, gtx1080 or better is roughly what you'll need to get @ 140ish fps gaming and @ 60ish fps streaming 1080p.

That's at least 6 full cores, high IPC, high clock speeds, and a gpu capable of not only NVENC at decent fps but also simultaneously getting you high fps. And that's an average. In some games like old CSGO, the fps will be off the charts; in new AAA titles like BF5 you might be lucky to see 100fps with a 40fps stream. It's highly game dependent. If using something that's not gpu dependent, like OBS's software encoder which leans more on the cpu, that can change again.
 
  • Like
Reactions: LewisMann

Karadjgne

With cpus it's a matter of perspective. If you figure that either cpu will only go to about 4.8GHz, and a 2600 is base 3.4GHz while a 2600X is 3.6GHz, then once you tack on the OC, for all intents and purposes the 2600 is going to show much larger gains and the 2600X less. So it's not so much that one is better than the other, more that one has more headroom than the other. Both OC to roughly the same limits, and people are going to OC anyway, so why pay more for a higher base clock?
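The headroom argument in numbers. The ~4.8GHz ceiling here is this thread's assumption; real chips vary with the silicon lottery and cooling:

```python
# Rough OC headroom from base clocks, assuming both chips top out near 4.8GHz.
ceiling = 4.8
headroom_2600  = round(ceiling - 3.4, 1)  # 2600 base clock is 3.4GHz
headroom_2600x = round(ceiling - 3.6, 1)  # 2600X base clock is 3.6GHz
print(headroom_2600, headroom_2600x)  # 1.4 1.2 -> the cheaper 2600 gains more from an OC
```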
 
  • Like
Reactions: LewisMann

Karadjgne

Right. That game is optimized for 8-thread usage. You have 6. So every thread is being used, and there's a little backlog because the cpu can't process the code any faster (low IPC), and strings are taken on a priority basis.

If a code string uses 60% of a core's bandwidth, and the next string in line will use 70%, it'll bump that second string to another core that's coming available and take the 3rd string that'll only use 30% and shove that in. That way all strings go through in a timely manner, but there is some shuffling involved, which does result in the loss of a few fps.

Even Intel quad cores such as the i5-7500 are running into this as games really start to use more than 4 threads. With cpu limitations on temps vs speeds - the magic 5.0GHz under 70°C - game devs are being required to make games using more threads just to get the info across. Simple games like Minecraft are easy on 1-2 threads, but the sheer amount of data needed for the realism in BF5 etc. demands higher core/thread usage.

Because of the low IPC on that FX, games now are really swamping usage. It's like trying to cram 10lbs of stuff into a 5lb bag. Not too bad if it's 2lb blocks you can throw in and dump out real quick, but at 10lb blocks, something's gotta give.
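A very loose sketch of that shuffling in Python. The percentages and the greedy placement policy are purely illustrative - real OS schedulers are far more sophisticated:

```python
# Strings of work get placed on whichever core has room, so nothing
# waits behind a single busy core. First-fit is used here just to
# illustrate the idea; it is not how a real scheduler works.

def place_strings(string_loads, cores=6):
    """Greedy first-fit: put each string on the first core with spare bandwidth."""
    usage = [0.0] * cores
    placed = []
    for load in string_loads:
        for i in range(cores):
            if usage[i] + load <= 1.0:     # this core has room for the string
                usage[i] += load
                placed.append(i)
                break
        else:
            placed.append(None)            # backlog: no core can take it yet
    return placed, usage

cores_used, usage = place_strings([0.6, 0.7, 0.3])
print(cores_used)  # [0, 1, 0] -> the 70% string got bumped to core 1,
                   # and the 30% string slotted in behind the 60% one
```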
 
  • Like
Reactions: LewisMann

LewisMann

Right. That game is optimized for 8-thread usage. You have 6. So every thread is being used, and there's a little backlog because the cpu can't process the code any faster (low IPC), and strings are taken on a priority basis.

If a code string uses 60% of a core's bandwidth, and the next string in line will use 70%, it'll bump that second string to another core that's coming available and take the 3rd string that'll only use 30% and shove that in. That way all strings go through in a timely manner, but there is some shuffling involved, which does result in the loss of a few fps.

Even Intel quad cores such as the i5-7500 are running into this as games really start to use more than 4 threads. With cpu limitations on temps vs speeds - the magic 5.0GHz under 70°C - game devs are being required to make games using more threads just to get the info across. Simple games like Minecraft are easy on 1-2 threads, but the sheer amount of data needed for the realism in BF5 etc. demands higher core/thread usage.

Because of the low IPC on that FX, games now are really swamping usage. It's like trying to cram 10lbs of stuff into a 5lb bag. Not too bad if it's 2lb blocks you can throw in and dump out real quick, but at 10lb blocks, something's gotta give.
Thank you man
 

LewisMann

Hi, I was wondering if my 60Hz monitor could be the reason my frames are dropping significantly in games such as Fortnite. I start my PC and play a game of Fortnite at about 150fps, and then it dips to 30-90fps. I was wondering if my 60Hz monitor could be causing this, or if it is my processor not being able to keep up with my 1060. Here are my specs:

Monitor: ASUS VS229 - 60hz
GPU: MSI GTX 1060 6GB
CPU: AMD FX-6300
RAM: Kingston HyperX DDR3 (shows as 933MHz)
MOBO: MSI 970 GAMING
PSU: CORSAIR CS650M
If you need to know any other specs, let me know
Thanks
 

King_V

Illustrious
Ambassador
Hi, I was wondering if my 60Hz monitor could be the reason my frames are dropping significantly in games such as Fortnite. I start my PC and play a game of Fortnite at about 150fps, and then it dips to 30-90fps. I was wondering if my 60Hz monitor could be causing this, or if it is my processor not being able to keep up with my 1060. Here are my specs:

If your monitor is 60Hz, then it can never display more than 60fps.

If you have VSync on, you will never get more than 60fps.

If you have VSync off, you will get as many FPS as your system can pump out, but there will be screen-tearing.

Honestly, I'd recommend turning on VSync.
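A minimal sketch of those three cases, with made-up numbers:

```python
# What the game renders vs what a 60Hz panel can actually show.

def frame_rates(rendered_fps, refresh_hz, vsync):
    """Returns (fps the game runs at, fps the panel shows, whether it tears)."""
    game_fps = min(rendered_fps, refresh_hz) if vsync else rendered_fps
    shown = min(game_fps, refresh_hz)   # a panel never exceeds its refresh rate
    tearing = (not vsync) and rendered_fps > refresh_hz
    return game_fps, shown, tearing

print(frame_rates(150, 60, vsync=True))   # (60, 60, False) -> capped, no tearing
print(frame_rates(150, 60, vsync=False))  # (150, 60, True) -> faster, but it tears
```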
 
  • Like
Reactions: LewisMann

LewisMann

If your monitor is 60Hz, then it can never display more than 60fps.

If you have VSync on, you will never get more than 60fps.

If you have VSync off, you will get as many FPS as your system can pump out, but there will be screen-tearing.

Honestly, I'd recommend turning on VSync.
I have tried this; it adds input lag, but would you say I should keep it on until I can upgrade my system?