Best Gaming CPUs For The Money: January 2012 (Archive)

1) The Gigabyte 78x-chipset motherboards are excellent budget boards: their CPU support lists literally run to more than 100 AM3/AM3+ models, and they're capable of 'sky-high' OCs, pushing the system clock as much as 150% over stock;

2) It's high-larious the gyrations some folks go through to make a point (or allegation) so far outside any 'real-world' scenario that I simply have to laugh ... like benchmarking a $1,000 Titan @ 16x10 in a poorly-coded console-to-x86-64 port to demonstrate CPU scaling.

Intel makes fine CPUs. No one is disrespecting 'Chipzilla' products by choosing another path. No arguments, here, and no disrespect intended. As a matter of fact, I offer my gratitude.

I'm off to snag one of those Gigabyte GA-78LMT-USB3 AM3+ motherboards for $50 and have some fun.

 

Just remember to duck when those 4+1 power phases explode.
 
I never said you could overclock it through the roof with that board. You can safely match FX 6350 speeds with it, though: it's a 125W-capable board, and the FX 6350 is a 125W chip. The FX 6350 beats the i3. For a low-budget build it's an OK motherboard. Your love affair with Intel is pretty pathetic. I am no AMD fanboy; all my systems, save one, are i5s (i5 750, 2400, 2430M, 3570K). I am a value shopper. I was able to get good deals on my Intel hardware, and AMD had nothing competitive at the time. Now they actually do on price/performance.
 

You implied it, and Wisecracker explicitly said you could. And it's just wrong. That board is so flimsy it could fail with an FX-6300 at stock clocks. Also, your blanket statement that the FX-6350 beats Core i3s is clear evidence of fanboyism. The Core i3s do better in old games, and even in many new games like CoD: Ghosts they match the FX-6300 and 6350.
 
Yea, I am such an AMD fanboy. Out of 4 desktops and a laptop, I have 1 AMD system, and it isn't even in use right now because I haven't had time to get it up and running since I upgraded it from a PhII to an FX 8320. The only reason I bought the FX 8320 was the $100 Microcenter deal for Black Friday.
 
Yea, for $100 I couldn't turn that deal down. My i5 750 rig will be gone soon, so the PhII X2 rig needed upgrading. I admittedly picked up the cheap Gigabyte board to go with it because that was all I could really afford, since I needed a case too. The old eMachines tower I had been using has poor airflow, and I didn't want to stick an FX 8320 and an HD 5850 inside it.
 
Y'know, I wonder if AMD will release a series of CPUs with a modifiable CPUID string, akin to VIA's. That would lead to *ahem* underground ways of bypassing the CrippleAMD() bollocks in Intel's compilers.

At least, it would work for current and past software 'n' games. Intel would most certainly implement a new method of CPU identification in subsequent compilers.
 

Not if the check is literally: if (cpuid() == "GenuineIntel") optimized_code(); else generic_code();
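Roughly, in C, that vendor check looks like this (a sketch using GCC/Clang's __get_cpuid helper from <cpuid.h>; optimized_code/generic_code are just stand-in names):

#include <cpuid.h>   /* GCC/Clang wrapper around the CPUID instruction */
#include <stdio.h>
#include <string.h>

/* CPUID leaf 0 returns the 12-byte vendor string in the
   register order EBX, EDX, ECX ("Genu" "ineI" "ntel"). */
static int is_genuine_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;

    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';

    return strcmp(vendor, "GenuineIntel") == 0;
}

int main(void)
{
    /* Stand-ins for the two code paths in the pseudocode above. */
    puts(is_genuine_intel() ? "optimized_code()" : "generic_code()");
    return 0;
}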

What Intel most likely does with ICC and optional feature optimization is check the CPU capability flags and optimize only for certain combinations of capabilities. If the most heavily optimized code path requires AVX2, that effectively restricts it to Haswell CPUs for the time being.
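A capability-flag check of that sort would look something like this (again a sketch, assuming GCC/Clang's __get_cpuid_count; AVX2 is reported in CPUID leaf 7, subleaf 0, EBX bit 5, and a production check would also confirm OS support for the wide registers via XGETBV):

#include <cpuid.h>
#include <stdio.h>

/* Feature check: no vendor string involved -- any CPU that
   sets the AVX2 bit (leaf 7, subleaf 0, EBX bit 5) passes. */
static int has_avx2(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx))
        return 0;   /* CPU doesn't implement leaf 7 at all */

    return (ebx >> 5) & 1;
}

int main(void)
{
    printf("AVX2 code path available: %s\n", has_avx2() ? "yes" : "no");
    return 0;
}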
 
It's been proven that masking the CPUID on a non-Intel processor to GenuineIntel will yield better results. Agner Fog's blog is the best source of information about this.

And no, with ICC's optimizations and whatnot, it checks both the CPUID vendor string and the supported code paths. With an older version of ICC (version 7 or something, around 2006), if you had an AMD CPU that supported SSE2, ICC-compiled code would use 386 instructions; if you had an Intel CPU, it would use SSE2. If you loaded up a virtual machine and masked the AMD CPUID to GenuineIntel, the same code would use SSE2 on the AMD CPU.

A similar thing is happening now with AVX and SSE2: just replace SSE2 with AVX, and 386 with SSE2.

Note: AVX vs SSE2 yields a 50% performance increase in favour of AVX.
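Put together, the dispatch pattern being described (not ICC's actual internals, just the shape of it: the vendor gate sits in front of the feature gate, so an AMD chip that supports SSE2 still lands on the 386 path) would look something like:

#include <cpuid.h>
#include <stdio.h>

/* Vendor string from leaf 0, compared as raw register values
   ("Genu" in EBX, "ineI" in EDX, "ntel" in ECX). */
static int is_genuine_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;
    return ebx == 0x756e6547 && edx == 0x49656e69 && ecx == 0x6c65746e;
}

/* SSE2 support is reported in CPUID leaf 1, EDX bit 26. */
static int has_sse2(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 0;
    return (edx >> 26) & 1;
}

int main(void)
{
    /* Vendor first, feature second: an SSE2-capable AMD CPU
       fails the first test and never reaches the fast path. */
    if (is_genuine_intel() && has_sse2())
        puts("dispatch: SSE2 path");
    else
        puts("dispatch: generic 386 path");
    return 0;
}

Masking the vendor string to GenuineIntel (e.g. through a hypervisor, as described above) flips the first test, and the same binary takes the SSE2 path.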
 
I just heard the first 16GB DDR3 unbuffered DIMMs and SO-DIMMs are out, and they work on all AMD platforms. On most Intel boards, though, there seems to be a BIOS limitation (not a hardware limit) that restricts the max capacity per module to 8GB.
The modules are available from www.intelligentmemory.com
 


What do I do to stop getting notifications whenever someone replies to this thread?
 
A bit over 2 years on, and my CPU (i7 2600) is still on the top rung of the CPU hierarchy chart. That is almost a 3-year-old chip; that it is still at the top of the class really shows the stagnation of the CPU industry over the last few years. The only reason to upgrade these days seems to be the chipset rather than the processor.

Anywho, I am very excited to see what this next year brings. Now that the next-gen consoles are out, carrying somewhat capable 8-core (4-module) processors, I hope to finally see the PC industry push more mainstream 6-8 core options... or at least see games take advantage of all 4 current cores more effectively. Though I suppose the main push here is going to be for more RAM and vRAM in systems more than anything else. Maybe finally a good argument for a quad-channel memory setup?
 


Every month I expect the $140 8320 to be up there, lol. It's a bloody miracle that the 6300 was even considered; seems like everyone on these forums is wearing blue-tinted glasses nowadays 😉
 

The console CPUs have 2 modules, not 4.

As for memory... needing more seems likely. Needing more bandwidth (whether by going quad channel or just higher clocks) seems less likely. The main driver of higher bandwidth is graphics, but on PCs this is still handled separately on the GPU (unless you're using integrated graphics). And GPUs already have very high memory bandwidth (288 GB/s on an R9 280X vs. 176 GB/s on a PS4, and that's shared by the CPU cores and the GPU).
 

I would not set my expectations that high: not enough mainstream applications and games make remotely significant use of more than two cores to justify putting even more cores in mainstream computers. Until that changes, extra cores in mainstream systems will mostly go to waste. The bulk of people who want 6+ cores today and in the foreseeable future are enthusiasts, prosumers and professionals in compute-intensive domains... not exactly what I would call mainstream.

I doubt Intel will make mainstream chips quad-channel either: there is already only a 5-10% difference going from DDR3-1066 to DDR3-2133, so most mainstream applications would see nearly zero benefit from jumping from dual-channel DDR3-2400 to quad-channel. It would also undermine sales of LGA2011 parts, or whatever their successor ends up being. For mainstream chips with beefier IGPs, like Broadwell-K where Iris Pro appears to be standard, Intel seems to be going with eDRAM.
 


As far as gaming goes, gamers have proven they'll buy anything and everything. Some might get very loud and vocal on the internets about their gripes, but they still haul tail to whomever has the goods; maybe grumbling, but still buying. There's no incentive for the execs to let game developers really crank the dial higher and spend more money and time on increased complexity. They've already figured out about how many units they are going to sell, so any of this nonsense about using more cores efficiently, not just keeping them "busy", would just eat into profits. That's my opinion anyway. Check back in 2020 =)
 
Look at all the CPUs I've had in my life: Pentium D 805 @ 4.2GHz, Pentium D 930 @ 4.1GHz, Pentium 2160 @ 3.1GHz, E7300 @ 4.8GHz, E7500 @ 4.5GHz, E8200 @ 4.0GHz, and AMD Phenom 920, 940, 955, 965 and 1055T, all overclocked on air. The latest CPUs I own are an i3 3225, an i5 3470 @ 4.0GHz and an i7 3770K @ 4.6GHz.
The best CPU, in my opinion: the E7300 at 4.8GHz for $130 US (the old days).
But today an FX isn't better than an i3, and an i5 has about the same gaming performance as an i3. If the i3 had Turbo or were unlocked, no one would buy an i5. The i7 is insane, though: all programs run fast, and in games you barely see an FPS drop because of the CPU. When I see people saying the FX 8350 is faster than a 2500K, a 3570K or an i7 920, I just think "he owns an FX", that's it.
 