
How much do old CPUs bottleneck new games?

hixbot

Distinguished
Oct 29, 2007
It's a common point of discussion: someone wonders about upgrading the video card in their old system, and whether their CPU is up to the task.

The problem is, most CPU benchies pair new processors with new games.

All the old CPU benchies use old games.

We need a benchie with old CPUs, one common high-end GPU, and NEW games.


See, a CPU bottleneck is the same at all resolutions; think of it like an FPS ceiling. With most old games, the CPU ceiling is way above the GPU's, even with older CPUs. You may only begin to see the CPU ceiling at LOW resolutions, where the GPU ceiling is higher.
With new CPUs and new games (like Crysis), the CPU ceiling is also well above what you can see, as the bottleneck is in the GPU.
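The ceiling idea above can be sketched as a toy model (my own simplification, not from any benchmark): the frame rate you actually see is roughly the lower of the two ceilings, and only the GPU ceiling moves when you change resolution. All the numbers below are hypothetical, chosen just to show the crossover.

```python
def observed_fps(cpu_ceiling, gpu_ceiling):
    """Toy model: whichever ceiling is lower is the FPS you see."""
    return min(cpu_ceiling, gpu_ceiling)

# Hypothetical: an old CPU caps the game at ~45 fps regardless of
# resolution, while the GPU ceiling falls as resolution rises.
cpu_ceiling = 45
for res, gpu_ceiling in [("800x600", 120), ("1280x1024", 70), ("1600x1200", 38)]:
    bound = "CPU-bound" if cpu_ceiling < gpu_ceiling else "GPU-bound"
    print(res, observed_fps(cpu_ceiling, gpu_ceiling), bound)
```

Run it and the low resolutions sit flat at the CPU ceiling (CPU-bound), while only the highest resolution drops below it (GPU-bound), which is exactly why the CPU wall only shows up in low-res benchmarks.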

What we need to see are some OLD CPUs in shooters, say some P4 single cores (like a 2.4GHz, plus an overclocked P4 at over 3.6GHz), and an old Athlon XP 2400+ and 3000+ or something. Throw in, like, a Pentium D and an X2 4000+. Then for comparison throw in a quad-core Core 2.
All that with the SAME high-end video card... perhaps a card that has AGP and PCI-e versions (if need be).

Even if we just had ONE of the older PCs... with a kicking video card, benched over a wide scope of bleeding-edge games, to see which games are REALLY CPU-heavy... and which aren't.

So many people think these old P4 AGP systems aren't up to the task. But I haven't seen many numbers on old CPUs with new games. Makes me wonder where everybody is getting their CPU bottleneck info (besides maybe personal experience).

It's hard to find a current benchie of this sort. Seems like most reviewer sites want to forget about the old and focus on new stuff (which keeps the hardware marketers happy). But reader interest could spawn an article of this type.

Phew, that is all.
 
I would tend to agree with you. What most people don't realize is that the late-model HeatBurst Pentiums actually performed well enough to avoid bottlenecking the video card a huge amount. The only things I have upgraded in my system are 3 gigs of DDR2 800 and a new Nvidia card, and it runs better than my dual-core AMD system. My 3.0GHz P4 with HT has gone unchanged since 2005 and saved me money, which was better spent on video cards.
 
I agree, I'd like to see more of these. But didn't they already do a review sticking a new card in an older CPU system and seeing where the bottleneck was?
 
Basically, most games these days are starting to support dual core. If you have anything over a 3.2GHz Pentium D, you'll be fine until the release of Nehalem, IMO. My 4.3GHz Pentium D is still a great chip. Beats pretty much all current AMD chips.
 
I'm still on AGP (ATI 1950GT + ASRock K8Upgrade-NF3 + AM2 module + Athlon 64 X2 4200+ EE + 2 gigs DDR667 in single channel)

All I can say is that I can play Silent Hunter 4 without a glitch, graphics at medium; I never tried higher settings. I did have one problem with it: Silent Hunter 4 slowed down dramatically when I had just 1 gig of RAM.

The specs for Silent Hunter 4 recommend 2 gigs, and I checked in Task Manager that the game eats 1.5 gigs for breakfast. The hard drive couldn't keep up paging virtual memory, as usual, so I bought another 1-gig DDR2 module, and it was the best thing that could happen to my PC, even though I can't do dual channel since they are different cheapo modules.

I could try leaving Task Manager open while playing to see CPU usage; then I'll post it.

 


As far as I'm aware, Silent Hunter 4 is only programmed for single-core usage, so you'll still have one core free. I could be wrong though.
 
Got to check that when I'm home. I have Supreme Commander too, and I think my neighbor downloaded the Crysis demo. Since I don't use my PC for gaming I don't have a lot of games... but it's a nice way of measuring performance.
 


I haven't seen a review of this type, at least not a modern one. I'd like to see how a P4 or Athlon XP (single core) handles Crysis, UT3, and COD4 with maybe an 8800GT, or if it has to be AGP, maybe an X1950XT or the new 3850 AGP.
I'd like to see where the CPU bottleneck actually drops below the GPU bottleneck and actually affects gameplay (less than 30fps). That's what I'd like to see.

So many people diss throwing a state-of-the-art GPU into an old system (not an X2 4200+ EE; I'm talking older than that), but I haven't seen any numbers to back up their claims.

In fact, most benchmarks show the CPU ceiling to be WAY above the GPU's. Certainly some OLD CPUs must bottleneck these new games to below 30fps. But how old are those CPUs, and in which games?

Really though, this is important info.
 
As you can see in my sig, I have an AMD 64 4000+ S939 with 2 gigs of DDR and a 7950GT 512MB, and it handles most games fine. All the Source games I play with everything cranked up, including AA and AF, at 1680x1050. The COD4 demo runs on high with AA, but there is some slowdown. Company of Heroes I play on high with no AA, and the Crysis demo on medium at 1440x900. Crysis is fine until I hit the gun config key (C or V?): the whole game seems to freeze for 5-6 seconds before I get the menu, then it's fine, but it takes another 5-6 seconds to get back to the gameplay. Once in-game, it runs acceptably at these settings. I am assuming this delay is not normal and can be attributed to my single-core CPU.

The only real benchmark I can give is in HL2: Lost Coast, where I get about 72fps with all the goodies cranked and 4xAA and 8xAF. For old tech, my S939 does a nice job, but it's starting to show its age with COD4 (I have to run a mix of medium and high to run it smooth) and Crysis.
 
MayDay, you need more RAM then. Those 5-6 second pauses are the same ones I had before. Well, you can always check Task Manager and look.

[screenshot: silenthunter4mq9.jpg]

[screenshot: silenthunter4screenlm8.jpg]



I love this game...!
Despite topping out the CPU, the game runs without a glitch, with almost no slowdowns. It eats a lot of RAM, as you can see, and I'm pretty sure it would be slightly faster if I had dual channel and a better processor.



 
More RAM won't help him at all (he has 2 gigs, which is perfect for XP).

I just upgraded from an Opteron 175 to a Q6600 (already running it at 3GHz) and I can tell you *some* games are more CPU-bound than others. I believe COD4 and MOH: Airborne, as well as some of the Steam games, are more CPU-intensive than GPU-intensive.

MOH used to load my Opty to 100% usage all the time, and I noticed some slowdowns (I play my games with as many options turned up as I possibly can; I like the eye candy). With the new quad, I'm playing MOH right now and CPU usage is around 55-60%. It uses ALL 4 cores, so that's quite a bit of power it's using; don't forget that's 4 cores at 3GHz, and it's faster clock-for-clock than my Opty. Not all games are as CPU-bound as that, so it really varies depending on what you're doing. I haven't tried COH yet, or any other RTS games, but I probably will soon.
 
I agree with this post. This topic needs to be covered critically; doing so would truly tell us whether it is necessary to upgrade our CPUs or GPUs accordingly.




 
I just got an ATI X1650 Pro AGP card for my AMD 2400+ system with 2x 512MB of RAM. I haven't played games for years, but decided to give WiC a try, so I got the 1650. I was getting very low fps on very low settings. I noticed in the ATI Control Center that my current AGP speed was 0, so I downloaded the latest AGP drivers for my cheapo PC Chips motherboard. That fixed that issue. Now it's at 8x AGP, and my average fps in the WiC benchmark went from 6 to a whopping 12! So I put it in windowed mode, watched Task Manager, and the CPU stayed at 100% almost the entire time. I was able to squeeze a little more out of my CPU by increasing the FSB; now my CPU shows up as a 2600+. I think it was about a 200MHz gain. So my average fps went from 12 to about 16.

With all that said: if you have an old, slow hunk of junk like mine, the CPU will almost always bottleneck.
 
It's funny that guys on this forum claim the topic hasn't been covered when in fact it has, by THG itself! Here is the article: http://www.tomshardware.com/2007/01/10/agp-platform-analysis/index.html

The conclusion? An Athlon XP 2500+ will definitely bottleneck an AGP 1950 Pro. If you have a CPU of this vintage, forget about running an AGP 3850 (which itself is about 2x the performance of a 1950 Pro); it's just not worth it. It's time to save up for a new PCI-E system. 😉
 
Well, my other mobo is AGP and supports Phenom (ASRock AM2NF3), or at least it supports the X2 6400+ itself. But yeah, if you are upgrading both CPU and GPU, PCIe is the way to go.
 


That's nice, although I believe the OP wasn't talking about A64 X2s or Phenoms, which we all know won't really bottleneck a modern GPU. He was talking about REALLY old CPUs, such as early P4s and Athlon XPs. It would be a total waste to match a high-end card with such slow CPUs.
 
That's what it seems like after reading the article you posted...
Also, the Task Manager screen I posted shows my CPU topping out, and it's a "modern" one, let's say. It's evident an older CPU can't keep up with modern games...
 
Nice article. Indeed it does show CPU bottlenecks, but not enough to render any of the games unplayable.

In fact, its own conclusion was:
"we wanted to see if an Athlon XP 2500+ would bottleneck today's newest AGP cards. It would seem that the answer is a resounding sometimes"

That was 1 year ago. An update on that article, with the same test system but newer games, would be sweet. In that article the CPU was not slow enough to render the games unplayable; some games had plenty of CPU power to spare, while in others it was just enough to play.

It would be cool if they used that same processor, with the same video cards (or some more modern ones too), and modern games.
I'd like to see where the CPU renders the games unplayable. Seeing a CPU bottleneck at 50fps isn't reason to freak out. A lot of shooters haven't dramatically changed the nature of gameplay, physics, and AI, which are the CPU's domain. They've certainly made the graphics prettier, but I'm not sure the CPU is going to limit that.
Obviously some new games are going to be too much for an old CPU, but which ones?

Either way, more coverage of this topic is needed.
Though with the spotlight on older platforms, it's not the type of article the big marketers want to see. It's always about the new, unfortunately.
 


Also interesting is the follow-up article, Part II:

http://www.tomshardware.com/2007/02/01/agp-platform-analysis/page10.html

This particular page of the article highlights Oblivion, which even by today's standards is a relatively stressful program. For games, the 3850 may be a viable option for some AGP motherboard and processor/RAM combinations.
 


Definitely. Even high-end P4s and heavily overclocked Athlon XPs can probably still take advantage of an AGP 3850 to a certain extent, but in all honesty I think an A64 would be a minimum requirement for such a powerful card.

Don't forget that AGP versions of these cards generally carry a premium over the equivalent PCI-E cards. You can build a cheap X2 platform (minus GPU) for around $150 or even less. There comes a point where extending the life of an old AGP platform is just not worth it.
 
In the first article, with the old CPU, the CPU-heavy games (Oblivion and X3) show a CPU bottleneck around 30 FPS at every resolution.

I speculate: basically, in those games with that CPU, a 3850 should give me 30fps at 800x600 on low (since it's CPU-bound), and the same 30fps at much higher settings (until it becomes GPU-bound). An old GPU might give you 30fps at 800x600 on low, but couldn't approach that at decent settings.

So even on that old CPU, the benefits of upgrading are substantial.
Of course, once the CPU bounds you BELOW 30fps, that's another story. We need a benchie that demonstrates that on recent games.

Indeed, as long as your CPU ceiling isn't below 30fps, you stand to benefit from even the highest-end GPUs, in the form of raising graphics quality and resolution with no drop in FPS.
The upgrade won't improve your FPS, but it can let you turn the settings up higher for NO PENALTY. But your CPU FPS ceiling must be at playable levels.

What we need is the exact FPS ceiling of different games on given CPUs. Ceilings below 30fps are the ones that indicate a CPU is too slow.
A CPU bottleneck at 30+ fps is still OK, as it indicates a GPU upgrade can still improve things (the range of playable settings, not the FPS).
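The rule of thumb in the posts above can be written down as a tiny decision helper. The 30fps threshold is the thread's; the function name and verdict strings are my own hypothetical sketch, not from any review.

```python
PLAYABLE_FPS = 30  # the thread's playability threshold

def gpu_upgrade_verdict(cpu_ceiling_fps):
    """Apply the thread's rule of thumb to a measured CPU FPS ceiling."""
    if cpu_ceiling_fps < PLAYABLE_FPS:
        # CPU caps the game below playable rates: no GPU can fix that.
        return "CPU too slow: upgrade the CPU first"
    # CPU-bound but at a playable rate: a faster GPU won't raise FPS,
    # but it lets you raise resolution and settings at no penalty.
    return "GPU upgrade worthwhile: same FPS, higher settings"
```

For example, the Athlon XP 2500+ in the THG article, CPU-bound around 30fps in Oblivion, would land right on the boundary, which matches the article's "resounding sometimes".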
 
My E6600 boosted my framerates by up to 2x in some games over my single-core 3700+. Strangely, BF2142 was even more than 2x: I couldn't get decent framerates on low with my 3700+, yet I get the same framerates on max with 4xAA and a higher res now.
 
You guys have a great thread going here, and I agree, you can get by on a single core, but:

If you can grab a dual core cheap for your platform, do it if you multi-task, encode and burn movies, etc. It just makes computing so smooth.

Single core for general usage now sucks. I am sick of virus scanning and not being able to use my computer, or worrying about ruining a burn.

Or trying to use DVD Shrink and surf the web.

Guys, single core is fine if that is all you have. Trust me, if you can find a cheap dual core (especially S939), just pick it up.

I debated skipping dual core with my S939 single-core 2800+ and going quad. But now, with my Opty 165 dual core OC'd to 2.25GHz for like $60, I think I'll skip quad and go octal.

I can play Crysis maxed at 1440x900 with an 8800GTS 640 in it, so I have no need for a quad.

Leaving single core? Worth the $60; a single core is just too much hassle for multi-tasking.