Best Gaming CPUs For The Money: January 2012 (Archive)

Status
Not open for further replies.


But I don't get it. If the game runs great and the graphics are great, what is the problem? Oh, I know: people have forgotten about the games and become hardware buffs who have to baby everything.
 
Well, while you boys baby your games, I am gonna enjoy mine. There is no argument: if the game runs smooth, with no lag and great graphics, there are no complaints from me. Who cares about 100+ FPS? It's not like I would notice anyway. The camera turns smoothly in my games, walking is smooth with no lag, and I get graphics that a console can't touch.
 


lol, the games you mentioned are exactly what I was talking about. Bad design does not count; that is the developers' fault. I know games with better graphics that run better, lol.
 
 
 

It depends on your definition of 'awesome.'

For casual gamers, "awesome" might be 30-60fps at 1080p.
For some people, "awesome" is 60fps at 1080-1200p.
For others, "awesome" is 60fps at 5760x1080.
And for yet others, it may be 144fps at 1080p.
Etc.

There is no universal definition of "awesome" so each gamer has to decide what that means for themselves.
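One concrete way to compare those tiers is by frame time: each frame rate implies a per-frame time budget. A minimal sketch (the helper name is mine, not from any post above):

```python
def frame_time_ms(fps: float) -> float:
    """Time budget per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

# The frame rates mentioned above and their per-frame budgets:
for fps in (30, 60, 144):
    print(f"{fps}fps -> {frame_time_ms(fps):.1f} ms/frame")
# 30fps -> 33.3 ms, 60fps -> 16.7 ms, 144fps -> 6.9 ms
```

The jump from 60fps to 144fps shrinks the frame budget by about 10ms, which is why some people notice it and others don't.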

Same goes for the "ideal" screen resolution or PPI: some people want 4k on 10" screens, others are happy with 1080p on a 24", and for me, 1600p on a 24" would feel just right.

Everyone has different Goldilocks zones.
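Those Goldilocks zones can be put in numbers with a quick pixel-density calculation; a sketch, assuming "4k" means 3840x2160 and "1600p" means 2560x1600:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The three setups mentioned above:
print(round(ppi(3840, 2160, 10)))   # 4K on a 10" screen -> ~441 PPI
print(round(ppi(1920, 1080, 24)))   # 1080p on a 24"     -> ~92 PPI
print(round(ppi(2560, 1600, 24)))   # 1600p on a 24"     -> ~126 PPI
```

A nearly 5x spread in pixel density, and people sincerely defend every point on that range.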
 
I couldn't care less what the FPS is, as long as:

* My game plays smoothly with decent settings (I don't need all of the eye candy).
* It doesn't cost me too much to implement.
* It works with my existing hardware.
* It plays at a decent resolution.
 



Right on, man, that is my point. People exaggerate hardware demands. Like I said, if your game is smooth as butter and the graphics are looking great, way better than console, what does it matter? If you see the game running fast and smooth, what more is 144fps going to do? I'll enjoy my games while other people spend a lot of money and time jerking off to more hardware. And the guy above said you're only a casual gamer if you get 30-60fps, lol. What do they expect? 60fps puts the icing on the cake, doesn't it? lmao

 


So why not just play on a console?
 


Did you not read what I said? Plus, a console could not do what my PC can. I get way better graphics.
 
Lo and behold: the collective hatred of the glorious PC gaming master race. If it ain't running max settings plus max AA/AF and shooting rainbows out of its ass, you ain't part of the master race; you are merely a PC/console hybrid dwarf.
 
^BINGO. I've noticed that, and I even tended to go a little overboard myself with upgrades a few years ago. While I'm sometimes still curious about how a game might look on UltraMaxOhWOW settings, I've realized that even "Medium" looks pretty good in most games, and "High" looks great.
 


Nay. It is too cruel to hate people who make questionable financial decisions. Always wanting the best and buying the most expensive bling all the time is torment enough on its own, especially for those who have made a habit of it. Which is why this article is called "best gaming cpu for the money": to help us make the right choices and not splurge on any more unnecessary expenses for trivial gains. More power to you, mate. Peace.
 
I am a cheapskate myself, most of my rigs were built using clearance deals and used parts. Microcenter is a big help as well. I would much rather get the most out of my money than have the best of the best at any cost.
 


They are a minority (but a very loud and vocal one). Lots of them will jump on and attack you the moment you say "I enjoy playing (insert game) at a high 30-40fps with a (insert entry/mid-range) card," as if it were a crime and your enjoyment must suffer, as if the only way to play a game is max-all. I have a pretty capable rig, but I don't bash other people with it (both metaphorically and literally), nor do I tell them that they aren't enjoying the game unless extra sparks fly whenever they shoot a unicorn in the face. Heck, whenever they aren't rubbing their ginormous "graphical settings ego," they are arguing about even the most minute things. If you browse the previous pages of the comments, you'll even see people bashing each other over power consumption... with freaking power bills, of all things. My goodness. Peace out, bro.
 
I'm not sure it's worth an argument, but I do keep an eye on power consumption. After all, things change, like power costs climbing. One thing that brings me back to earth is the effect of faster task completion on total energy use, although that doesn't apply to games. I doubt the actual power-saving ability of the "S" and "T" processors in non-gaming workloads, and I suspect their gaming performance is visibly lower than that of their standard and "K" equivalents. Considering their performance does not justify their prices, I really don't consider them anymore, regardless of their placement in the charts.
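The faster-task-completion point comes down to energy being power times time: a hungrier chip that finishes sooner can still use less total energy. A toy comparison with made-up wattages and run times:

```python
def task_energy_j(power_w: float, seconds: float) -> float:
    """Energy for one task: average power times run time (joules)."""
    return power_w * seconds

# Hypothetical numbers, purely for illustration:
fast = task_energy_j(95, 100)   # 95 W chip finishes in 100 s -> 9500 J
slow = task_energy_j(65, 160)   # 65 W chip needs 160 s       -> 10400 J
print(fast < slow)  # True: the higher-wattage chip used less total energy
```

This "race to idle" effect is exactly why a lower TDP alone doesn't prove a chip saves energy on fixed workloads.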
 
I don't have any problem using however much power is needed to accomplish a goal, even if that goal is just playing games on great settings. Wasting power, that is, paying for power that accomplishes nothing useful (e.g. heating one's workspace in summer), is another matter. I'd rather use 500W than waste 100W.
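To put a rough number on wasted draw: a back-of-the-envelope yearly cost, assuming the waste runs around the clock and a hypothetical $0.12/kWh rate (your bill will vary):

```python
def annual_cost(waste_w: float, rate_per_kwh: float, hours_per_day: float = 24) -> float:
    """Yearly cost of a constant wasted draw, in the currency of the rate."""
    kwh_per_year = waste_w / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# 100 W wasted 24/7 at an assumed $0.12/kWh:
print(round(annual_cost(100, 0.12), 2))  # -> 105.12
```

Even a gaming rig only wastes that 100W while it's actually on, so the real figure is a fraction of this, which is why the bill-waving arguments rarely settle anything.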
 
Thumbs up on the AMD FX-4130... being $30 cheaper where I live, I opted for it over the FX-6300, despite the higher power draw and smaller L3 cache.

Good news for tuning enthusiasts, though:

I've been able to overclock it to a stable 4.62GHz at stock Vcore, putting it neck and neck with the Intel i3-3220 in the benchmarks and physics scores. By raising the voltage, I wouldn't be surprised if I could push it to 5GHz... but it's plenty speedy for now.

It handles everything I've thrown at it so far, and for a low-end CPU, that ain't too shabby.
 


The bad thing is that at 4.6GHz it is merely neck and neck with an i3-3220, based on your words. An OC'd FX-6300 nips at the heels of, if not equals, a low-end i5 in heavily multi-threaded games like Crysis 3, which isn't shabby at all.
 
I'll say it now, as I have for quite some time: AMD, drop this FX crap and revamp your Phenoms. I quit buying AMD after the Phenom 965 BE and the 1090T six-core; they are the last good CPUs AMD made. Until AMD quits this BS with these 220-watt, water-cooled, $850 CPUs that can barely compete with a $300, 77-watt Intel CPU, I'm Intel from now on.
 
The problem is that the K10 architecture hit a dead end. They did a die shrink of K10 for the Stars-cored APUs and got just a 6% boost in IPC over Deneb (which is slower than Thuban). In short, it's a spent architecture. I suspect Intel is reaching that limit with their current Core i architecture. It's been viable since the Pentium M and Core 2 Duo were introduced; that's a long run, longer than K10 lasted. Yet if you look at the increases in performance, there is almost none to be had, even with die shrinks. It's probably at the end of its rope too.
 