Best Gaming CPUs For The Money: January 2012 (Archive)

Status
Not open for further replies.


The answer is quite simple, and frankly it should have gotten Intel busted for antitrust, because it's just as nasty as Microsoft's old "Windows isn't done until Lotus won't run" dirty trick. Look up "Intel cripple compiler" and you'll see that if a program is compiled with the Intel compiler, the Intel chip gets a ~30% "speed boost" thanks to the code tying a boat anchor to the AMD chip. The smoking gun is the VIA CPU, the only CPU whose CPUID vendor string you can softmod. Change it from "CentaurHauls" to "GenuineIntel" and the chip magically becomes about 30% faster, because the boat anchor has been removed.

Oh, and for those who say "poo poo, Intel is merely optimizing for their own designs because they know them": sorry, but we have proof that's false, namely the Pentium III. When the cripple code was first added, the P3 was beating the P4 by nearly 30%; after ICC crippled the code, suddenly the P4 was ahead by over 20%. For a second proof, just look at the code the cripple compiler puts out: even though EVERY x86 CPU made since 2003 has had SSE2, to this very day ICC will emit the 487SX-era codepath, code that hasn't been even close to optimal since 2000, if and ONLY if the chip is a P3 or a non-Intel CPU.

So there is your answer: if they used programs compiled with ICC, then even a Celeron will win, because the game is rigged. If you use programs that aren't crippled, you see a more sensible and logical result, where Intel wins on single-threaded workloads while AMD wins on heavily threaded ones.
 

i've researched intel's shenanigans enough to know that intel was forced to add a compiler switch to icc. it's up to the developers to use it. if they don't... well, it's still intel's fault for trying to cripple amd cpus' performance, but there's nothing an end user can do except raise awareness.
however, my post was about intel phasing out the core i5-3350p and the gaming cpu recommendations needing a replacement. it has absolutely nothing to do with compilers. afaik, icc isn't even the most widely used compiler.

no one said anything like that, and you're stuck in the old days. read agner fog's research.

yes, IF they used programs compiled with icc. but a celeron still won't win. don't bother trying to sensationalize this.
lastly, your tirade (however righteous, while stuck in the last decade) has nothing to do with the core i5-3350p needing a replacement. it's an eol'd cpu. that's just it.
 
Article quote: "Our tests demonstrate fairly little difference between a $240 LGA 1155 Core i5-2500K and a $1000 LGA 2011 Core i7-4960X, even when three-way graphics card configurations are involved." I've been sitting on my 2500K for three years or so now because there is no reason to upgrade. I live in a town full of Intel engineers, and when I tell them this, they don't seem to get it. Maybe when the next generation of Intel CPUs comes out in fall 2014 this will change.
 

What do you expect to change? Broadwell is not going to be much more than a Haswell die shrink and apart from AVX3, I'm not expecting much extra performance from Skylake in 2015 either.

If you meant the "Intel not getting it" part (that people are skipping upgrades because there is nothing worth upgrading to), I think they showed they know very well why people are not upgrading when they started renting out their fabs to Altera and other companies. Intel has too much fab capacity for what the PC market still wants: there is not enough mainstream software people care about to make them buy $200+ CPUs, which is why Intel switched a large chunk of the $40-100 range to Atom-based chips. Branching out into the fabs-for-hire business to keep those fabs running makes sense even if Intel does it relatively cheap, because properly stopping and restarting a fab costs a fortune in preventive maintenance during shutdown and calibration during restart. Contracting fab capacity out reduces those costly restart cycles.
 
If someone is going to be a gamer and NOT overclock, they should get one of the Intel Xeon processors. I don't get why there's not much mention of them, probably because they are labeled as "server" processors. Sure, they don't have integrated graphics, but so what? Almost everyone will be purchasing a discrete graphics card. Their speed is the same as many of the chips that do get mentioned, AND they have Hyper-Threading. Oh, did I mention that they are cheaper than the frequently mentioned i5s and i7s?
 

The single-socket Xeons are rebadged Core i7s, and some models (those with model numbers ending in 5) do have the IGP enabled - the Xeon E3-1225 v3 costs ~$225 and has P4600 graphics.
 


for 20 bucks, i'll go with the xeon and hyperthreading...
 

Then you'll need to spend $65 extra to get the 1230, because the 1220 doesn't have Hyper-Threading. And notice I said "gaming-first machine"? Hyper-Threading may help if you're doing other compute-intensive workloads, or running a lot of tasks in the background while gaming. But for a strict gaming system, it's irrelevant.
 
when, if ever, will we get benchmarks on stuff that DOESN'T use 4+ cores? the mists and myths about various CAD software are always around, but no one really tackles it. anywhere. as if the market is non-existent. i know gaming brings more money to the manufacturers, but not all of us play games. and since the site is about hardware, not just gaming, i think my question stands :) other than that - i'll keep reading you 😉
 



Well, this is a gaming CPU review to start with, but I do agree with you that a best workstation CPUs for the money article could be highly beneficial, since they already do workstation GPU reviews here, albeit on a yearly basis. I thought that any modern CAD application uses all the threads/cores it can, though.
 


yeee, i know. i did not wanna turn it into anything else :) but there's no CAD topic 😛
Autodesk is one of the revered companies, no? well, i use AutoCAD and Civil 3D extensively and, besides rendering, they don't use the cores. and yes, i use 64-bit :)
let's let the people get back to the topic, i just hoped to nudge someone.
thanks for the reply and see ya about!
 

The problems with threading CAD code are the same ones that make threading anything else so difficult: not all algorithms can be threaded in a meaningful and efficient manner. User-interactive code (like CAD) is notoriously difficult to thread meaningfully, since everything tends to end up waiting on the interactive thread anyway.
 
You guys keep recommending the i5-3350P as the best price:performance pick. Sadly, it doesn't seem to be available anywhere in Australia, which leaves me torn between going up to the 4670 (I don't plan on overclocking) or down to the i3. Is there any other, more widely available middle ground? (Or anywhere in Oz that sells the 3350P?)
 
I've tested some of my more demanding games on i3s, a Core 2 Quad Q9550 and its equals, an i5-2500K, a Phenom II X6 1090T, and Phenom II X4 965 and 955, and all of these CPUs seem to work fine with the games I tested them with. My i5 did seem to shine a little more, though, in the 3DMark boost it gave the GPU I used compared to some of the other CPUs I mentioned.
 


Great - thanks for clearing that up.
 
The fact that none of the A-series APUs are on here surprises me, considering that the A6 series is direct competition for the i3s, and one of those is on the list.
 

The fact that the A6-5400K is unlocked should alone have landed it at least in the entry-level tier, considering it will play Battlefield 4 on medium settings at its stock clock of 3.6 GHz - and top that with its low price of $60.
 


All I'm saying is that, as cheap as it is and with its decent gaming abilities, it should at least have made the entry-level category. I'm not trying to start any arguments, just stating my opinion.
 
the issue with apus isn't gaming performance. it's gaming performance for the money, and the user's goal. if you want to play games with a powerful discrete card right away, then you're paying extra for the igpu in an apu, an igpu you'd never use. (exceptions include using the igpu in case the discrete card dies, or for general system debugging - nothing to do with gaming.)

apus only make sense if your money buys use of the whole apu instead of the cpu part alone. that's why apus shouldn't make the best-gaming-cpus-for-the-money recommendations - they don't undercut the cpus that deliver better gaming performance with powerful discrete gfx cards. cheap doesn't always mean good value for money. if amd charged nothing for the igpu in the a10-6800k or 7850k, then they'd easily get a recommendation in the $70 to under-$90 range.

if you still want to see where the apus stand as gaming cpus, the hierarchy chart is the way to go.
 