Bottleneck query: i3-8100 and RTX 4070?

Hi, I'm currently gaming on a 22-inch Dell monitor at 1680 x 1050 (i3-8100, GTX 1060 6GB, 32GB RAM, Corsair CX550M PSU).
I am thinking of taking things up a notch by upgrading to a 27-inch 1440p monitor and, consequently, an RTX 4070. How much of a bottleneck will I be looking at?

The games I like to play are graphically intensive (The Witcher 3, Red Dead Redemption 2...), but not very fast-paced FPSes.
 
There is no number that can be applied to "how much".

At the same resolution, you would get the same framerate, and be able to turn the graphics level up.

The CPU provides the framerate; the GPU provides the eye candy.
 
If you're asking how much performance loss you'll be getting, you'll have to do your own data gathering by answering these questions:
  • What performance do you get if you drop all graphical quality settings, including resolution, to the lowest?
  • What performance does the RTX 4070 get at 1440p?
  • Which is the lower of the two?
That's the performance you can expect to get.
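
If it helps, here's that estimate written out as a quick back-of-the-envelope sketch in Python. The two FPS values are placeholders, not measurements; you'd replace them with your own result from the first bullet and a published RTX 4070 1440p benchmark for the same game:

```python
# Back-of-the-envelope bottleneck estimate: real-world FPS is capped by
# whichever side (CPU or GPU) delivers fewer frames per second.

cpu_limited_fps = 70   # placeholder: i3-8100 FPS with all settings/resolution at minimum
gpu_limited_fps = 90   # placeholder: RTX 4070 average FPS at 1440p in the same game

expected_fps = min(cpu_limited_fps, gpu_limited_fps)
print(f"Expected FPS after the upgrade: ~{expected_fps}")

if cpu_limited_fps < gpu_limited_fps:
    print("CPU-bound: the i3-8100 would hold the RTX 4070 back.")
else:
    print("GPU-bound: the CPU can keep the new card fed.")
```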
 
Can you please elaborate? Changing the CPU means changing the motherboard, and this upgrade is on a budget. The resolution will go up from 1680 x 1050 to 2560 x 1440.
 
Thanks, I can certainly find out item no. 1 right now.
 
What performance change are you hoping to get from this new GPU?
It will NOT be more FPS.
E.g., Red Dead Redemption 2 is perfectly playable with most bells and whistles enabled at 1680 x 1050, so more FPS is not what I had in mind. I just wanted to know: would this upgrade cause some bottleneck Armageddon at 2560 x 1440 while checking out some RTX goodness in games like Cyberpunk 2077 or Hogwarts Legacy?
 
While checking out some RTX goodness in games like Cyberpunk 2077 or Hogwarts Legacy?
I used a 3080 with an i7-6700K (more powerful than the i3-8100) for several months. 4 cores/4 threads is simply insufficient for games like Cyberpunk; the game will largely be unplayable. You need a CPU upgrade for those kinds of games. Even in Red Dead Redemption 2, a smooth, constant 60 FPS on high settings at 1440p is going to be a struggle on the 8100.
 
OK. Red Dead Redemption 2 ran fine with most settings enabled at 1680 x 1050 (i3-8100, GTX 1060 6GB, 32GB RAM, Corsair CX550 PSU).
 
The 8100 is below the minimum specs for the GPU, so performance will be poor. Getting an 8700K is a good idea; that way you get the six cores you need for that GPU, and at least the setup can run Windows 11.

A 10600K would get better performance once overclocked and with decent RAM tuning, but an 8700K should be a quick and easy upgrade if one can be found cheap.
 
There's no such thing as a "minimum required CPU" for a GPU.

Unless you can point to some NVIDIA official documentation that says otherwise.
You need a 6-core CPU minimum.

Still, if you're running less than a 6-core CPU, we'd definitely recommend upgrading your CPU and perhaps even the rest of your PC before taking the RTX 3080 plunge at 1440p.
You can get an 8700K very cheap on eBay if you are lucky. The 8100 will halve the performance of the GPU and will be below minimum specs for most games.

The Ryzen 7 3700X will bottleneck the RTX 4070 by 15.7% at 1080p using ultra quality settings.

The Intel Core i7-8700K bottlenecks the RTX 4070 at 1080p ultra settings by 16% on average.
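
For what it's worth, those calculator sites don't publish their method, but a figure like "16% bottleneck" is usually read as the share of the GPU's potential frame rate lost to the CPU cap. A minimal sketch of that reading, with made-up numbers:

```python
# How a "bottleneck %" like the figures above is commonly derived.
# Assumption: the calculator sites don't publish their formula; this is
# the usual reading (share of GPU potential lost to the CPU cap).

def bottleneck_pct(cpu_limited_fps: float, gpu_limited_fps: float) -> float:
    """Percent of the GPU's potential frame rate left unused by the CPU."""
    if cpu_limited_fps >= gpu_limited_fps:
        return 0.0  # GPU-bound: no CPU bottleneck
    return (1.0 - cpu_limited_fps / gpu_limited_fps) * 100.0

# Hypothetical: a CPU feeding 84 FPS to a GPU capable of 100 FPS
print(f"{bottleneck_pct(84, 100):.1f}% CPU bottleneck")  # -> 16.0%
```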
 
And that depends on what your performance requirements are. If the bare minimum you care about is 1080p 60 FPS on average, then an i7-4770K does that just fine according to that article.

There's no such thing as a minimum CPU requirement to actually run a GPU. Even if you want to go "but the GPU will be horribly bottlenecked if you pair a crappy CPU with it!", it won't be if I plug my 4K TV into the computer and run the game at a 4x DSR resolution.
 
The minimum is a 6-core CPU with 12 threads. The 4770K will be a massive bottleneck, won't be able to run Windows 11, and thus won't run the most modern games. DDR3 is below minimum spec for most games.

I replaced a 4930K, which has 6 cores and 12 threads and quad-channel DDR3 at 2400. That setup bottlenecked an RTX 2080. Back then it was quite playable, but today, with a 4070?

Most games can be run on a very bad setup, but the FPS will be all over the place. This is why I replaced the 4930K with a 3800X.

If he's lucky, he can get an 8700K cheap and then upgrade everything later. That GPU is meant to be paired with the likes of a 12700K/13700K.

Then there is the old 550-watt PSU.
 
The minimum is a 6-core CPU with 12 threads.
Again, I need an official source from the manufacturer themselves. Tom's Hardware does not count; that is just their assessment of what they think is needed to run games at a certain performance level.

Because I can most certainly go into my BIOS right now, disable 2 cores on my 5600X, and everything will still run. And I would argue that outside of some recent games in my library, they'll more than likely run at roughly the same performance. Also, core count says little about a CPU's actual performance: an i3 from today will handily outperform an i7 from more than 8 years ago.

The 4770K will be a massive bottleneck
This depends on what settings you're using. At 1080p, sure. At 8K, definitely not. If you want to ask how I would know that: I don't, for certain. But I can look at the 4K benchmarks, see that most of them float around 140 FPS, and note that 8K renders 4 times the pixels of 4K, which would require roughly 3-4 times more GPU work. That would drop the frame rate to below 60 FPS, which is well within the range of what the i7-4770K is capable of pushing.
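
The arithmetic behind that estimate, spelled out (assuming GPU work scales roughly with pixel count, which in practice is slightly sub-linear, hence the 3-4x range):

```python
# Resolution-scaling estimate from the post above.
# Assumption: frame time grows roughly with pixel count (sub-linear in
# practice, which is why 4x the pixels maps to ~3-4x the work).

px_4k = 3840 * 2160
px_8k = 7680 * 4320
print(f"8K renders {px_8k / px_4k:.0f}x the pixels of 4K")   # 4x

fps_4k = 140                  # ballpark 4K benchmark average cited above
fps_8k_est = fps_4k / 3.5     # middle of the 3-4x extra-work estimate
print(f"Estimated 8K frame rate: ~{fps_8k_est:.0f} FPS")     # ~40, below 60
```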

and won't be able to run Windows 11, and thus won't run the most modern games.
Uh, there's no game I'm aware of at the moment that only runs on Windows 11. You're free to point one out, though.

DDR3 is below minimum spec for most games.
RAM type has little to do with overall performance. And "specifications" for games don't really mean anything with regards to performance; they're just what the developer is willing to support if you have a problem. I've done tests where a game complains up front that my hardware won't work, but it still plays and runs just fine.
 
Windows 10 will reach end of support on October 14, 2025. And based on benchmarks of 6 games, the Intel Core i7-4770K bottlenecks the RTX 4070 at 1080p ultra settings by 36.9% on average.

Take a game like Control:
  • Intel Core i3-8100 with the RTX 4070 at 1920 x 1080 in Control: 61.3% processor bottleneck.
  • Intel Core i7-8700K with the RTX 4070 at 1920 x 1080 in Control: 56.6% processor bottleneck.
  • Intel Core i7-12700K with the RTX 4070 at 1920 x 1080 in Cyberpunk 2077: 6.2% processor bottleneck.
  • Intel Core i7-12700K with the RTX 4070 at 1920 x 1080 in Control: 31.3% processor bottleneck.
 
And "bottleneck" here simply means that an uber GPU like the proposed RTX 4070 will not be allowed to reach its full potential.
Only because the CPU cannot feed it fast enough to take full benefit of what it can do.

It does NOT mean the system will fail to run, or run slower than it did before the new GPU was installed.
 