Best Gaming CPUs For The Money: January 2012 (Archive)

Still, as technology buffs we rant and rave about the latest and greatest, and as soon as it drops, anything before it is antiquated. So every month we talk about the newest CPUs and why they are considered the best. How does it make sense to gauge the performance of these chips with ancient, obsolete titles that don't use the technology to its full potential or even attempt to reflect the rapidly changing state of CPU utilization in the software industry? It's like writing a review of the brand-new fastest car in the world but only testing it on a 100 MPH quarter-mile dirt track: you won't get any real-world demonstration of what the car was meant to do, because you aren't testing it in an arena where it can show its strengths.

Core utilization in mainstream software (most specifically graphically intensive games and production suites) is definitely moving away from single-threaded efficiency in favor of using more threads. Seriously, all of the biggest next-gen engines from the top studios now scale across multiple cores (CryEngine 3, Frostbite, Luminous, Fox Engine, UE4), just to name a few off the top of my head. How can people continue to ignore this trend as it stares them dead in the face? Yes, the i3 is faster clock for clock, but the dual core is dead, and that is painfully obvious when comparing the two in thread-friendly modern software.
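None of the engines named above is quoted in this thread, so purely as a rough, hypothetical illustration of what "scaling across multiple cores" means in practice, here is a minimal C sketch that chops one frame's worth of work into jobs and hands them to a small pool of worker threads. The thread count, job count, and the stand-in per-job work are all made up for the example.

/* Toy illustration (not taken from any engine mentioned above): one
   frame's worth of work split into jobs and run on a pool of workers. */
#include <pthread.h>
#include <stdio.h>

#define WORKERS 4
#define JOBS    1024

static double results[JOBS];

struct range { int begin, end; };

static void *worker(void *arg)
{
    struct range *r = arg;
    for (int i = r->begin; i < r->end; i++)
        results[i] = i * 0.5;            /* stand-in for AI/physics/particle work */
    return NULL;
}

int main(void)
{
    pthread_t tid[WORKERS];
    struct range r[WORKERS];

    for (int w = 0; w < WORKERS; w++) {  /* split the jobs evenly across workers */
        r[w].begin = w * JOBS / WORKERS;
        r[w].end   = (w + 1) * JOBS / WORKERS;
        pthread_create(&tid[w], NULL, worker, &r[w]);
    }
    for (int w = 0; w < WORKERS; w++)
        pthread_join(tid[w], NULL);

    printf("done: %f\n", results[JOBS - 1]);
    return 0;
}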
 

Rapidly?

Multi-core and multi-threaded CPUs have been on the desktop for nearly a decade, yet it is still only a relatively small minority of software and games that makes meaningful use of more than a single core. Mainstream PC software and games are adopting multi-threaded programming at a snail's pace.
 


OK, let's look at how many games used more than one core five years ago... Now basically every major release in the past year, and all next-gen titles, will easily scale well past two cores, with four or more becoming the norm. This isn't blind speculation; it's real, it has been happening, and it's still happening right in front of us. How can you ignore it? Still suggesting a dual core simply because it performs better in an ancient test suite, while blindly ignoring that several (most) modern titles easily reveal the i3's core deficiency, just seems downright stupid to me. It happened with single cores, now it's happening with dual cores, and some day it will happen to quads, and so on. It's technological evolution; ignoring it is senseless.
 

For each "next gen major release", how many "lower-tech" yet still fairly popular new games come out?

The PS3 and XB360 had multi-core CPUs, and they did not cause much of a landslide in multi-threaded PC games after so many years on the market.

And if you are betting on consoles driving threaded games on PCs now, do keep in mind that individual console cores are less than half as fast as most current desktop CPU cores, so games back-ported from consoles are not going to need to lean on extra cores anywhere near as much as they do on consoles. It is also quite likely that threading will get dialed back in PC ports to make the code more efficient, easier to debug, and less likely to trigger multi-threading quirks that are much harder to avoid on PCs, where hardware and software configurations vary wildly from one machine to the next.

I'm not "ignoring" the possibility of more threading in PC games. I'm taking it with a truckload of salt: if it was truly meant to happen, it should have happened already. The fact that so few games manage to make meaningful use of extra cores on PCs tells me it is much easier said than done... many may try but few will succeed.
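A quick way to see why "easier said than done" has teeth is Amdahl's law: the serial part of a frame caps what extra cores can buy. The parallel fraction in this little C sketch is a made-up assumption for illustration, not a figure measured from any game discussed in this thread.

/* Back-of-envelope sketch of Amdahl's law.  The parallel fraction p is
   an assumed value, not measured from any real game.                  */
#include <stdio.h>

static double amdahl(double p, int n)    /* speedup on n cores vs. 1 core */
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    const double p = 0.60;               /* assume 60% of a frame parallelises */
    for (int cores = 1; cores <= 8; cores *= 2)
        printf("%d cores: %.2fx\n", cores, amdahl(p, cores));
    /* With p = 0.60, going from 2 to 4 cores buys about 27% and 4 to 8
       only about 16%, which is why a fast dual-core can still hang with
       slower chips that have more cores unless p is very high.        */
    return 0;
}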
 



Just bumped to an FX-6350; it beats an i3-3225. Any board rated for 125 W FX chips can safely handle it.
 

The Core i3-3225 was never a good buy for gaming. You should be comparing it to the Core i3-4130. $112 at the moment, with the FX-6300 at $110 (and the 6350 currently at $135).
 


Considering the FX-6300 is pacing the i5-3570K/4670K in most recent gaming benches, I would say that in multi-threaded titles we should be comparing it against those.

But if you want to compare it with a Core i3, go right ahead. Generally, if I were building a gaming machine today I would use an FX-6300 over any i3... and probably an FX-8320 over any i5. But if I were looking to spend $250 or more on a CPU, I would certainly buy one of the i7s out there.

At this point I think suggesting any CPU with fewer than four cores is quite shortsighted, given all the multi-threaded titles we've seen hit the market in the past couple of months. It looks like multi-threaded games are finally here to stay.
 


That is a great deal, but it isn't the 840 Pro; it's the first version of the 840, the TLC model that came before the EVO. Still a great deal, though.
 


Of course, the i3 can be upgraded to an i5, of which there are several models, or to an i7, which also comes in several models, so there are lots of upgrade possibilities. The FX-6300 can be upgraded to the FX-6350 and then to the FX eight-core models, about two or three options, so not quite as many. Of course, if you were going to buy Intel right now, go with the newer Haswell; Broadwell, coming out next year, will use the same LGA 1150 socket. We don't know if AMD will be supporting the AM3+ socket next year.
 



Yes we do.
[attached image]
 

AMD might be too busy taping out updated XB1, PS4, and other chips that aren't on the usual mainstream roadmaps - they do have some new 20-28 nm options to sort out in the near future, and I bet both Sony and Microsoft are interested in lower-power, cheaper chips for their consoles to reduce costs.

 
From what I have seen, we are just getting a Vishera/Piledriver refresh next year. We probably won't see new high-end AMD desktop chips on Steamroller; we probably won't see any until Excavator. If GlobalFoundries can fix its 28 nm issues, that might change, but it's highly unlikely either way.
 

But it's not... at least not in most cases, excluding GPU-bottlenecked situations that really say nothing about CPU performance.

I mean, the Core i3-3220 was pretty close to the FX-8350 in BF4 multiplayer when Tom's tested it... and the good old i5-2500K was well ahead of that (and GPU-bottlenecked, with little improvement from overclocking).
 
And those were all done during the beta. Other benchmarks have differed from the THG one. There really needs to be a rerun of the tests now that the game has been released. The only released benches I have seen are from that Russian site that bf4blog posted a while back.
 


The BF4 beta was very poorly optimized for Piledriver. Piledriver was pulling a little less than 60% on all of its cores and wouldn't pull more, no matter which Piledriver CPU you were talking about... there were serious problems getting BF4 to run right on an AMD CPU in the beta.

Anyone who did the math on the benching results would see Piledriver was getting terrible performance... in short, it was punching well below its weight. Seriously, it was getting half the IPC of a Sandy Bridge CPU. HALF! Piledriver is slower than Sandy Bridge, but it's not half the speed. Even in Skyrim it does better than that.
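For what it's worth, "doing the math" on a CPU-bound benchmark roughly looks like this: divide frame rate by (cores x utilization x clock) to get a crude per-core, per-clock throughput proxy, then compare the two chips. Every number plugged into this C sketch is a placeholder chosen to show the method, not the actual beta results.

/* Crude per-core, per-clock throughput proxy from a CPU-bound benchmark.
   All inputs below are made-up placeholders, NOT the real beta numbers. */
#include <stdio.h>

static double per_clock(double fps, int cores, double util, double ghz)
{
    return fps / (cores * util * ghz);
}

int main(void)
{
    double fx = per_clock(55.0, 8, 0.60, 4.0);   /* hypothetical FX-8350-ish: 8 cores at ~60% load */
    double sb = per_clock(70.0, 4, 0.95, 3.3);   /* hypothetical i5-2500K-ish: 4 cores near full load */

    printf("FX per-clock score:    %.3f\n", fx);
    printf("Sandy per-clock score: %.3f\n", sb);
    printf("ratio: %.2f\n", fx / sb);
    /* With these made-up inputs the ratio lands near 0.5, i.e. the "half
       the IPC" situation described above; since Piledriver's real per-clock
       deficit versus Sandy Bridge is nowhere near that large, a ratio like
       this points at the software (the beta), not the CPU.              */
    return 0;
}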
 


49.5 fps vs. 41.7 fps = 19% faster
21 fps vs. 17 fps = 24% faster

Considering that, according to the reviewer, the i3 wasn't even playable at those numbers, I'd say it's a big difference.

At Very High quality with 2x SMAA enabled, our platform starts suffering from a graphics bottleneck. Yet, scaling amongst these processors is still evident. While the FX-6350 takes a huge performance hit, it's still noticeably smoother than the quad-core models. Basically, an AMD FX-6300 or Intel Core i5 is my minimum CPU recommendation for enjoying Crysis 3. This hard-hitting title isn’t one for low budgets, and it prefers an FX-8350 or Core i7 if you have one.
http://www.tomshardware.com/reviews/piledriver-k10-cpu-overclocking,3584-10.html
 
The article was for the most part accurate, but it had a seriously misleading comment that Kaveri "can't make use of GDDR5 memory." While it is not designed to support system memory other than DDR3, it can make use of the GDDR5 memory on your graphics card if you have HSA-enabled applications. Heterogeneous System Architecture allows an application to use both system memory and unused graphics memory. GDDR5 memory is much faster than system memory and can significantly boost the throughput of applications that use the hUMA architecture (hUMA stands for heterogeneous Uniform Memory Access). One of the main points of AMD's APU '13 developer conference was to build an industry-wide alliance of software developers committed to writing HSA-enabled applications. Adobe has fully embraced HSA, as has Oracle in its JDK (Java Development Kit). ARM and many gaming ISVs such as DICE and EA are also working on this, and HP is a contributing member of the HSA Foundation. Expect to see HSA-enabled apps that only Kaveri can exploit on the desktop in the next 6 to 12 months.
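HSA has its own runtime and compiler stack, none of which is quoted in this thread, so treat the following as a loose stand-in: OpenCL 2.0 shared virtual memory, which Kaveri-class APUs expose, shows the same basic idea of one allocation visible to both CPU and GPU with no explicit copy. Device support, flags, and behaviour in this C sketch are assumptions for illustration, not taken from the article.

/* Illustrative only: not HSA's own API, but OpenCL 2.0 shared virtual
   memory, used here as a stand-in for the shared CPU/GPU memory idea. */
#define CL_TARGET_OPENCL_VERSION 200
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
        fprintf(stderr, "no OpenCL GPU found\n");
        return 1;
    }

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    if (err != CL_SUCCESS) { fprintf(stderr, "no OpenCL context\n"); return 1; }

    /* One allocation the CPU and GPU can both address; fine-grained SVM
       (what Kaveri-class hardware advertises) lets the host touch it
       directly, while coarse-grained would need clEnqueueSVMMap first. */
    float *shared = clSVMAlloc(ctx, CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                               1024 * sizeof(float), 0);
    if (!shared) {
        fprintf(stderr, "SVM not available on this device\n");
        clReleaseContext(ctx);
        return 1;
    }

    for (int i = 0; i < 1024; i++)       /* CPU fills the buffer...           */
        shared[i] = (float)i;
    /* ...and a kernel handed the same pointer via clSetKernelArgSVMPointer()
       could read it in place - no staging copy across the bus, which is the
       throughput win the post is describing.                                */

    clSVMFree(ctx, shared);
    clReleaseContext(ctx);
    return 0;
}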
 
When I say ONLY Kaveri can exploit HSA desktop applications, that is because Intel has not yet signed on to this technology and would have to redesign its CPUs to implement it in their architecture. I do believe Intel will come running to the HSA Foundation in another year or so, as these applications start proliferating and AMD processors start outperforming their corresponding Intel CPUs as a result.
 



Those are all APUs; I don't recommend those for gaming rigs.
 

In case you may have missed AMD's developer conferences earlier this month, AMD is betting the farm on APUs and HSA. With that sort of commitment, it is likely safe to presume AMD will be going APU across the board with its next batch of new designs.
 