[SOLVED] Higher RAM frequency than the CPU supports

Mar 11, 2023
My CPU supports RAM speeds up to 2666 MHz. Is there a point in upgrading to 3200 MHz? Are there any benefits to going above 2666 (mostly regarding FPS, but other stuff too)?

Full PC spec:

RAM: G.Skill Ripjaws 2133 MHz CL15 (2x16 GB)

CPU: i7-8700K cpu info

GPU: MSI 1070 Ti data sheet, next week upgrading to an RX 6750 XT data sheet (I am aware of the relatively light bottleneck between CPU and GPU)

Motherboard: MSI Z370-A PRO data sheet

PSU: DeepCool DA700 (80 Plus Bronze, 700 W) info
 
I doubt you would see a great difference. Shop for a kit of 3200 CL16.
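For reference, here is a rough way to compare those kits on paper (a minimal Python sketch; the DDR4-2666 CL16 line is an assumed "typical" kit added only for comparison, not one mentioned in this thread). Lower first-word latency and higher bandwidth look better on paper, but as noted above, the real-world gaming gain on this platform is usually small.

```python
# Back-of-the-envelope DDR4 comparison of the kits being discussed.
# First-word latency (ns) ~= 2000 * CAS latency / transfer rate (MT/s);
# peak bandwidth assumes a dual-channel setup (2 channels x 8 bytes per transfer).
kits = {
    "DDR4-2133 CL15 (current kit)": (2133, 15),
    "DDR4-2666 CL16 (assumed typical kit)": (2666, 16),
    "DDR4-3200 CL16 (suggested kit)": (3200, 16),
}

for name, (mts, cl) in kits.items():
    latency_ns = 2000 * cl / mts
    bandwidth_gbs = mts * 8 * 2 / 1000
    print(f"{name}: ~{latency_ns:.1f} ns first-word latency, "
          f"~{bandwidth_gbs:.1f} GB/s peak bandwidth")
```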
 

DSzymborski

Titan
Moderator

There's no real benefit on an Intel platform of this era.

And even if there was, it would be hard to suggest upgrading for a very marginal gain on an old platform, especially when you're using a very low-quality, group-regulated power supply. Upgrading that is far more crucial.
 
Mar 11, 2023
Hey, thank you for the reply!
I might be mistaken, but since my CPU is bottlenecking and I am playing at 1080p, shouldn't there be at least some performance boost going from 2133 MHz to 2666-3200 MHz? Do correct me if those two points are irrelevant.

Yeah, I know about the PSU. I'm kinda on a tight budget since I'm buying a new GPU, but is it really that bad? Thankfully I've never had any issues with it, so I don't really feel the need to upgrade it as soon as I can.
 

DSzymborski

Titan
Moderator

If the 8700K is actually bottlenecking -- and it really should never be a problem with a 1070 Ti -- a few extra frames won't save you. Given the platform, I doubt you'd even see a 5% uplift in gaming. If this had been a Ryzen, which really does care about RAM speed, I'd feel differently.

And yes, the PSU is really that bad. And you don't actually know there aren't issues; like heart disease, most of the damage a junk PSU will do is asymptomatic until something very bad happens. Cheaping out on a power supply is frequently a very expensive mistake, something we see happen a lot around here.
 
Mar 11, 2023
No, the CPU is far from bottlenecking the 1070 Ti. I mentioned in my PC spec that I am upgrading to an RX 6750 XT, so I was referring to that (which should be a 10-15% bottleneck depending on the game). But thank you for the advice.

Any recommendations for a PSU that doesn't break the bank but is still good?
 

DSzymborski

Titan
Moderator

You're not using a bottleneck calculator, are you? Those are absolutely worthless. A CPU can't really bottleneck a GPU -- since the CPU gets the data first -- so it's not quite the same thing as the other way around anyway.
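As an aside, one rough way to think about it (a toy model only, not how a real engine actually pipelines frames): whichever stage is slower in a given scene sets the pace, and that changes game by game and scene by scene, which is why a single percentage number is meaningless. The figures below are made up purely for illustration.

```python
# Toy model: each frame needs some CPU prep time and some GPU render time,
# and the slower of the two dictates the frame rate. Real engines overlap the
# two stages, so treat this only as a mental model.
def effective_fps(cpu_fps_capability: float, gpu_fps_capability: float) -> float:
    cpu_ms = 1000 / cpu_fps_capability  # time the CPU needs per frame
    gpu_ms = 1000 / gpu_fps_capability  # time the GPU needs per frame
    return 1000 / max(cpu_ms, gpu_ms)   # the slower stage sets the pace

print(effective_fps(150, 120))  # GPU is the limit here -> ~120 fps
print(effective_fps(150, 200))  # CPU is the limit here -> ~150 fps
```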

Price depends on where you are. On an x70 class card, I absolutely wouldn't go below Tier B and I'd usually counsel sticking with Tier A.


Something like this would be an excellent choice, and you get a full ten-year warranty on these.

PCPartPicker Part List

Power Supply: Corsair RM650x (2021) 650 W 80+ Gold Certified Fully Modular ATX Power Supply ($104.99 @ Amazon)
Total: $104.99
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2023-03-17 19:03 EDT-0400
 
My CPU supports RAM speeds up to 2666 MHz. Is there a point in upgrading to 3200 MHz? Are there any benefits to going above 2666 (mostly regarding FPS, but other stuff too)?
Your motherboard supports RAM overclocking.
You absolutely can use faster RAM than 2666 MHz.
The motherboard specs say 4000+ MHz in OC mode.

On that generation of Intel system, RAM speed doesn't give much of a performance increase, though.
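If you do run the RAM above stock, a quick way to confirm what the modules are actually running at is sketched below (this assumes Windows with the old wmic tool still present; CPU-Z or the BIOS will show the same numbers).

```python
# Query the memory modules' rated speed vs. the speed they are configured to run at.
# Assumes Windows; "wmic" is deprecated but still shipped on most Windows 10/11 installs.
import subprocess

out = subprocess.run(
    ["wmic", "memorychip", "get", "Speed,ConfiguredClockSpeed"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
# On most boards, "Speed" reports the module's rated transfer rate (e.g. 3200)
# and "ConfiguredClockSpeed" reports what it is currently running at (e.g. 2133
# if the XMP profile was never enabled in the BIOS).
```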
 
Mar 11, 2023
Hey, I was debating with somebody else regarding this matter: CPU/GPU bottleneck, is my decision worth it, etc. I don't know why I asked him this question, because he previously said that if you have an i7-8700K with a 1060 Ti and upgrade to a 3060 Ti you will see ZERO performance boost, which I still find hard to believe. So I will just copy-paste the question I asked him, and maybe you can tell me your opinion.

So I decided to upgrade the GPU as stated above. Many people are saying that they paired an i7-8700K with a 6700 XT and that they are getting high FPS even at 1080p (of course 1440p would be far better).

Secondly, benchmarks of CPUs which are considered very good for the 6700 XT (for example the 5600X) do not show any great amount of FPS "more" than if the 6700 XT is used with an i7-8700K (better, yes, but not worth spending money on a new CPU).

Lastly, we all know that bottleneck calculators are not crazy accurate, but when you go to the PC Builds bottleneck calculator and select i7-8700K, 6700 XT, 1080p, it says that the bottleneck is 7.3%. So even though it is not accurate, I highly doubt that in the real world the actual bottleneck would be something like 30%, making the upgrade worthless.

And by the way, I just started digging into the AMD world, so I still have many things to learn; maybe the 6700 XT would perform better with an i7-8700K than a 3060 Ti would (even though the performance of those two GPUs is similar), because it is built differently or anything like that.

If you can spare some time to check any tests or anything regarding the pairing of those two components and tell me your opinion in terms of "is the upgrade worth it", I would really, really appreciate it!

PC spec:
RAM: 3200 MHz CL16 (2x16 GB)

CPU: i7-8700K cpu info, water cooled (single fan), perfect temperatures; what could my OC potential be? Currently runs stable at almost 4.7 GHz with zero OC

GPU: MSI 1070 Ti data sheet, "next week upgrading to an RX 6750 XT data sheet"; the price difference between the 6700 XT and 6750 XT is only $20, so I might go with that

Motherboard: MSI Z370-A PRO data sheet

PSU: DeepCool DA700 (80 Plus Bronze, 700 W) info (I know I should replace it 😑, tight budget currently)

Monitor: ASUS VG248QE
Everything here was bought a bit less than four and a half years ago.

Thank you in advance, buddy. Edit: and to anybody else who replies.
 
Mar 11, 2023
I know. I just wondered if there is any benefit in maxing out my RAM speed when my CPU supports only up to 2666. Thanks for the reply.
 

DSzymborski

Titan
Moderator

The worry is about nothing. The 6700 XT is not a problem, and again, all those calculators that calculate bottlenecks or claim to calculate X CPU with Y GPU are nonsense because they're not based on actual data. They're very simplified formulas and there are a million of them because they're basically the website equivalent of shovelware.

Let me put this in a very different way, from the opposite direction: the 2080 Ti is generally superior to the 6700 XT and trades punches with the 6800 and 6800 XT, generally falling in between. At the time the 2080 Ti was released, the 8700K was the best gaming CPU available, and from that point (fall 2018) until late 2020, there was no gaming CPU that was significantly better than the 8700K. Then Zen 3 came out in late 2020 and represented the first significant bump up from an 8700K; Intel could no longer rest on its gaming laurels and release incremental improvements, so their next generation also represented the first real bump up from an 8700K/9700K/9900K.

If the 8700K had been holding back the performance of the 2080 Ti, then there would have been no CPU that could truly use the 2080 Ti for the entire two years after the 2080 Ti was released, because there was no gaming CPU significantly better than an 8700K. But we know that's not true; the 2080 Ti benchmarked with significant improvements over the 1080 Ti and the other 20-series GPUs.

So, if the 8700K holds back the potential of a mere 6700 XT, one would have to believe that the best CPUs from 2018-2020, which ran 2080 Tis just fine, are suddenly just now holding back GPUs weaker than the 2080 Ti. Or, alternatively, one would have to believe that everyone benchmarking 2080 Tis for three years didn't notice there were no CPUs that could use the 2080 Ti in a beneficial manner. Both beliefs strain credulity far beyond its elasticity.

Extraordinary claims require extraordinary evidence. Saying automated benchmark hatchet-job sites are "not crazy accurate" is a bit like saying chopping off your own foot with a hatchet from your garage is "not particularly relaxing."

Honestly, you're chasing shadows here. Yes, the 8700K has aged, but the 6700 XT is not a particularly high-end GPU by 2023 standards. There's zero reason to worry about a phantom bottleneck when you're talking about today's midrange GPUs like the 6700 XT, 3060, and 3060 Ti, which are simply the equivalent of high-end 2018/2019 GPUs, which CPUs like that ran just fine. (And again, the term bottleneck is not applicable here.)

I would hate to see you spend money chasing these phantoms, especially when you have legitimately awful, archaic safety equipment in the mix.
 
Solution
Mar 11, 2023
GOOD POINT! I never thought of looking at things this way, man; I appreciate it a lot. I kind of doubted that my CPU could be an issue, but I never had a developed argument like yours.
I might sound stupid, but English is not my first language, so I kind of don't get the last sentence: "I would hate to see you spend money chasing these phantoms, especially when you have legitimately awful, archaic safety equipment in the mix."

Edit: Saying automated benchmark hatchet-job sites are "not crazy accurate" is a bit like saying chopping off your own foot with a hatchet from your garage is "not particularly relaxing." Bruh 💀, good one 😂😂😂
 

logainofhades

Titan
Moderator
Another reason bottleneck calculators are garbage is that they cannot cover every game and resolution. For instance, I am a WoW player, and even at 1440p ultra I am CPU-bound with an R7 5800X and an RX 6800. I peak at around 85% GPU utilization. I have dual 170 Hz monitors and cannot even fully utilize them. I top out at around 120-140 FPS, depending on the zone. Now, if I played something more taxing on graphics, say Cyberpunk, the GPU would be what holds me back. Even newer titles, like Overwatch 2, can be highly CPU-dependent in a similar manner to WoW.
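A rough rule of thumb for reading a monitoring overlay is sketched below (just a heuristic; the thresholds are approximate, and frame caps or badly threaded games will break it).

```python
# Heuristic: sustained GPU usage well under ~95% while FPS sits below the monitor's
# refresh rate usually means the CPU (or the game engine) is the limiting factor.
def likely_limit(gpu_util_percent: float, fps: float, refresh_hz: float) -> str:
    if fps >= refresh_hz - 2:      # already at the cap, nothing meaningful is "limiting"
        return "capped by refresh rate / frame limiter"
    if gpu_util_percent >= 95:
        return "GPU-limited"
    return "CPU/engine-limited (GPU has headroom)"

print(likely_limit(85, 130, 170))  # the WoW numbers above -> CPU/engine-limited
```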
 
Mar 11, 2023
I know. But a valid point nevertheless.
The way I mentioned the bottleneck calculator was that, even though they are not accurate, the real-world bottleneck should not be 20-25% more than calculated. But to be honest, now I feel like never using them in any way for any reason.

WoW? 120 frames with an R7 5800X and an RX 6800 😯, really, when did this game become this demanding?
I am more into competitive shooters (e.g. Warzone), so I guess I am good with my PC after the GPU upgrade at 1080p. I really don't mind lowering my settings to max out frames, since in those kinds of games you are not really focused on the beauty of the game and its visual appeal, which does not help you play better unless the difference is between the lowest settings and the highest. It was always more important that the game runs smoothly, so it is easier to make plays and, so to say, pop off (IMO).

I guess this thread is done. Thanks, everyone, for helping me out.
Edit: Unless I find another stupid thing to ask 😅.
 

DSzymborski

Titan
Moderator

Feel free to ask any question; that's what we're here for! Everyone here loves helping others get the computers they need and want.
 
Mar 11, 2023
Requirements go up every expansion. We are on expansion number 9 now, with Dragonflight. It's always been a CPU-dependent game. If I had a 5800X3D instead of a 5800X, I would probably make better use of my GPU.
Damn, I knew it was a CPU-dependent game, but not to that extent; good to know. On the other hand, it has been years since I checked anything regarding WoW. The only MOBA I really played was Smite, for about 7-8 years on and off.