Picking A Sub-$200 Gaming CPU: FX, An APU, Or A Pentium?


I thought you would be so cool. I guessed it already, given the page of disagrees to all the links I provided as proof.
 


(see 4 posts above)
 

(image: beat-a-dead-horse.jpg)


elbert - let it go already.
Looking like an arse much?
 


(see 7 posts above)

Not quite getting the whole 'being trolled' thing, huh? 😀
 

Come on, Cleeve, you know I was just trying to get my point across. What he did, though, was just rude. I do see how he's got twice your posts in under 2 years.
 


Actually I had way more posts but they reset me a while back. Buggers.

Anyway, just a little meme humor. All in good fun. Nobody's swearing, threatening violence, or making comparisons to Hitler, so it's all good.
 

Me too. 🙁 That's why we both have the "joined in 1970". Thanks, Cleeve, you're the coolest.
 
Bulldozer is beaten in pretty much every case by the Pentium G630. A $60 processor beats something more than four times its price. Intel is that far ahead of AMD.
 
[citation][nom]tourist[/nom]Intel is the one behind the 8-ball. Does anyone really believe AMD will not discount Llano to retailers like Gateway and Dell for mass propagation? I see a big chunk of Intel's business going bye-bye to AMD. A $60 G630 will still need a $60 video card to compete with Llano.[/citation]

The $60 Pentium with a $60 graphics card will beat the best Llano has, the A8-3870K, and the A8 is more expensive than the Pentium + discrete card combo. Radeon 6670s and even 6750s can be had at that price, and they will easily beat Llano's IGP. The closest Radeon to the A8's IGP is probably the 6570, maybe the 5570.
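To put rough numbers on that (approximate early-2012 US street prices; treat the figures as illustrative, not quotes):

[code]
# Approximate early-2012 US street prices; illustrative assumptions.
pentium_g630 = 60    # budget Sandy Bridge dual-core
radeon_6670 = 60     # entry discrete card that outruns Llano's IGP
a8_3870k = 135       # roughly the A8-3870K's launch price

combo = pentium_g630 + radeon_6670
print(f"Pentium + 6670: ${combo} vs. A8-3870K: ${a8_3870k}")
# -> Pentium + 6670: $120 vs. A8-3870K: $135
# The discrete combo comes in cheaper and games much faster.
[/code]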

I'll applaud AMD for having the best IGP, but it still isn't enough for modern gaming, and neither is Llano's CPU portion.

As for AMD discounting Llano? Unless yields have improved substantially, there aren't enough Llano APUs to go around anyway, so it wouldn't matter. I've read speculation about improved Llano yields, but nothing factual.
 
Someone here said it best earlier, though, with regard to Llano:

AMD's integrated GPU is better to be sure, BUT...

Intel's integrated GPU is now good enough for anything non-gaming. (We're talking normal user stuff here)

AMD's integrated GPU still isn't good enough for gaming.

So the difference between the two is largely useless. There's just no real use for the level of performance that falls between the 'accomplishes all normal tasks' category and the 'gaming' category.

It's like saying video cards are on a 1-10 scale. Everyday tasks require at least a 3; gaming requires at least a 7. Intel's integrated may be a 3 or 4, while Llano's is a 5-6, but since both meet the requirement of being >= 3 and both fall below 7, who cares? For practical purposes, a 3, 4, 5, or 6 all have the same capabilities and limitations in the roles they can fill.
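A quick sketch of that bucketing logic (the scores and thresholds are the made-up 1-10 scale from above, not benchmark data):

[code]
# Illustrative only: "scores" come from the hypothetical 1-10 scale above.
EVERYDAY_MIN = 3   # floor for normal, non-gaming desktop duty
GAMING_MIN = 7     # floor for a passable modern gaming experience

igps = {"Intel HD (Sandy Bridge)": 4, "Llano A8 IGP": 6}  # assumed scores

for name, score in igps.items():
    everyday = score >= EVERYDAY_MIN
    gaming = score >= GAMING_MIN
    print(f"{name}: everyday={everyday}, gaming={gaming}")
# Both land in the same bucket (everyday=True, gaming=False), which is
# why the gap between them buys nothing in practice.
[/code]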

That is what kills Llano's potential. Unless they can boost its performance to at least that magical '7' (which they might be able to do with the HD 7000 series; we'll just have to wait and see), they're still effectively no better than Intel in the graphics department, because there is just nothing that can use their current level of advantage.

Llano needs to roughly double its current video performance (or at the very least add another 50%) before it can really be considered an entry-level gaming alternative. If the next gen does this, that'd be awesome, but for now it sadly doesn't.
 
[citation][nom]tourist[/nom]While it is true Llano's GPU is not yet up to your or gamers' expectations, for 70% of the population Llano is a major move, leapfrogging Intel's best IGP. One area you are not considering is that many people do not or will not upgrade their video, or will pay a retailer big bucks for the privilege, negating any cost advantage the Intel combo will offer. I really see AMD shifting the playing field from who is fastest to who can build the best-priced box for the masses. The yields are getting better; I have heard of very few instances of shortages still taking place. I guess I am just simplistic in my thinking: Johnny is bugging mom for a new computer to play games, and parents, being cheap, want the cheapest computer that will do the job. If AMD can get the public to associate Llano with gaming, even the lower A4 and A6 will be a viable option.[/citation]

Llano not being up to my nor any gamer's expectations makes it not viable as a gaming solution, which is the whole point of this article. It doesn't matter if most people don't upgrade their graphics, since even then the Intel solution is still better and cheaper. Intel's HD 3000 and upcoming HD 4000 are good enough for non-gaming use, and they have uses that expensive video cards aren't as good at, such as encoding/transcoding.

Llano is neither good enough for gaming nor as good as Intel's solutions for non-gaming work. Radeon 7000 might change this, but AMD's Quick Sync-style tech isn't working right yet, so we don't know whether AMD can compete there.

You can't play recent games far beyond the minimum resolutions/settings on Llano, and even then it doesn't give great frame rates. If Johnny wants a gaming computer, then the Intel Pentium + Radeon 6670 would be about the same price as the A8 and would give much better gaming performance. If a 6750 can be squeezed in, it's even better.

Not only is Intel the fastest, but it is also the cheaper solution for its performance. Sure, you could get a crappy A4 or A6 instead of the A8, but then you really can't play games as much anymore. Intel doesn't make current processors slow enough to compete with the A4s and A6s besides its garbage Sandy Bridge Celerons, and although those are a little cheaper than the Pentiums, they are much worse.

The A4's graphics are probably comparable to Intel's HD 3000 anyway, maybe even worse. The A6 is probably a little faster than the HD 4000, but none of the integrated solutions are worth it for Johnny's gaming computer.

For the mobile market, AMD is much better than comparably priced Intel laptops, since most cheap Intel machines have HD 2000/3000 graphics, but on the desktop there is no such problem even at the low end.
 
... I tried this with my FX-8120... I OC'd just the Turbo Core to 4.8 GHz... the power consumption was lower than with a full CPU OC, but in some games the frame rates were a bit higher... because for Turbo Core to work, it has to throttle or power down the other cores to stay within the TDP limit... and older games use no more than two cores anyway... this was done with an ASUS MoBo... but there was a Gigabyte MoBo with software to dedicate which cores are used for Turbo Core... it seems that I like to struggle... buying an AMD FX instead of an Intel one... but... I always root for the underdog...
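For anyone wondering why Turbo Core has to park cores first, here is a toy power-budget model; the wattages are illustrative guesses, not AMD's actual figures:

[code]
# Toy model of a Turbo Core-style power budget (numbers are assumptions).
TDP_W = 125.0     # package budget for a 125 W FX part
PARKED_W = 2.0    # assumed draw of a parked/idle core

def watts_per_active_core(active, total=8):
    """Budget left per active core after parked cores take their cut."""
    return (TDP_W - (total - active) * PARKED_W) / active

for active in (8, 4, 2):
    print(f"{active} active cores -> {watts_per_active_core(active):.1f} W each")
# Fewer active cores leave more watts apiece, which is what lets the
# remaining cores clock higher -- handy when an old game only uses two.
[/code]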
 
My previous post was ridiculed, since it was not germane to this particular implementation of the topic, "best $100-200 CPU for gaming", which assumes that you will also put a decent graphics card in the system.

However, I would like to ask a different question on the "best $100-200 CPU for gaming": can one get a budget gaming system without a graphics card, such as a $500 laptop?

In that case, the A8-3500 has roughly twice the fps of an i3 with Intel HD 3000 (based on several reviews on this site); whether this is adequate is up to you. Perhaps the A8-3870 and future AMD offerings will do even better, leading to a truly budget full-gaming laptop. I would guess that it will interest many people, but perhaps not the system builders frequenting this site.

(My $499-on-sale Gateway NV55S05u A8-3500 6GB/640GB HDD laptop plays Skyrim at 20 fps at its native res of 1366x768 with medium effects: 8 samples AA/aniso, High for texture, shadow, decal, and distance, Medium for radial, and FXAA off.)
 
Not much of a fair shake here for a true budget reader.
You actually used a $550 video card so you wouldn't have to worry about bottlenecking the test. Sorry, that doesn't float, since none of us get the free equipment you do; we're stuck at under $100 for our rigs (my last card was $55 after rebate) and work around these bottlenecks in other ways.
Did we CrossFire the APUs' on-chip video with the (gulp!) $550 video card? Just plugging it in doesn't do this; you actually have to set it up, and they would have been the only CrossFired systems in the group. Seems a rather important step that wasn't done.
The Intel motherboards cost $50 more than both of the AMD boards used. Curious: do more expensive products improve speed? Years of reading these articles would point to that conclusion.
4 GB of RAM used??? Sorry, but us budget folks would rather spend $40 for 8 GB or $80 for 16 GB and run with that; no one reading this is building a machine with 4 GB. Did you know that those APUs use more RAM? You should have before the test, since it's the cheapest upgrade available.
Has anyone ever played a non-online game??? Let's test standalone games as well, please.

Very biased article toward the rich and Intel. You've done everything possible for the Intel chips and crippled the AMDs before you even began.

 
(see the post above)


You cannot CrossFire the AMD APUs with anything besides a couple of low-end GPUs.

More expensive motherboards don't improve per-clock performance in any way either. They'll add more features, but as long as the motherboard used has the features required for the test being done, it makes no difference whatsoever. The only real performance impact they can have is that top-tier motherboards will often support higher voltages for extreme CPU overclocking. Neither difference matters here.

For the RAM, I do agree it would have been nice to test with 8 GB, because it's so cheap that buying 4 GB is pointless these days, but it doesn't matter either way. The APUs aren't at a disadvantage at all in this regard, as their integrated GPU is disabled, so it isn't using any of the system RAM.

I think the use of a $550 GPU has been responded to plenty of times already. Cutting the GPU budget would serve no useful purpose except to eliminate data points. The whole idea here was to give the CPUs as much room as possible to max out, to compare their effective ceilings; a cheap GPU would make it impossible to see where those ceilings are.

The idea is to run all of the CPUs at 100% and see how they fall. With a cheap GPU you'd have the i5s sitting at 50% usage, the i3s at 70%, and the FXs at 90%, for example (just random numbers), but those differences couldn't be graphed, because the graph would be clipped at the point of the GPU bottleneck.

Yeah, they could run all the tests on a $100 GPU and graph the CPU usage percentage for each instead of the FPS, but that would be much more susceptible to background interference than the current method, and much harder for the majority of readers to understand.
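Here's a minimal sketch of that ceiling argument (every fps figure below is invented for illustration):

[code]
# Delivered fps is capped by whichever stage hits its ceiling first.
def delivered_fps(cpu_ceiling, gpu_ceiling):
    return min(cpu_ceiling, gpu_ceiling)

cpu_ceilings = {"i5": 120, "i3": 90, "FX": 70}  # hypothetical CPU-bound fps

for gpu_name, gpu_fps in (("$550 card", 200), ("$100 card", 60)):
    results = {cpu: delivered_fps(fps, gpu_fps)
               for cpu, fps in cpu_ceilings.items()}
    print(gpu_name, results)
# With the fast card the chart shows 120/90/70 -- each CPU's real ceiling.
# With the cheap card everything clamps to 60 and the differences vanish.
[/code]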
 
[citation][nom]fluffybear[/nom]You actually used a $550 video card so you wouldn't have to worry about bottlenecking the test - sorry, that doesn't float since none of us get the free equipment you do so we're stuck at under $100 for our rigs (my last card was $55 after rebate) - we work around these bottlenecks in other ways[/citation]

The way you would work around the bottleneck is by lowering visual settings, which in turn shifts the load back onto the processor. The test methodology floats just fine. Nevertheless, we're following up with a budget CPU test across a range of graphics cards, from the 5570 to the 6950, to address this concern.

[citation][nom]fluffybear[/nom]Did we CrossFire the APUs' on-chip video with the (gulp!) $550 video card? Just plugging it in doesn't do this; you actually have to set it up, and they would have been the only CrossFired systems in the group. Seems a rather important step that wasn't done.[/citation]


You can't CrossFire a 7970 with an APU. I can only assume you don't know what you're talking about on this one.

[citation][nom]fluffybear[/nom]The Intel motherboards cost $50 more than both of the AMD boards used. Curious: do more expensive products improve speed?[/citation]

Our motherboard reviews have shown that different motherboards don't make a difference in speed if they use the same chipset. The chipset and CPU determine that; the motherboard usually determines overclockability, reliability, and features. Overclockability was fine on the motherboards used, so your argument is invalid.


[citation][nom]fluffybear[/nom]4 GB of RAM used??? Sorry, but us budget folks would rather spend $40 for 8 GB or $80 for 16 GB and run with that; no one reading this is building a machine with 4 GB. Did you know that those APUs use more RAM? You should have before the test, since it's the cheapest upgrade available.[/citation]

There are a lot of things you do not understand about computers.

First, APUs only need more RAM if you're using the onboard graphics, but we used a discrete card. Second, the vast majority of folks are running 4 GB. Third, if we upped it to 8 GB, the Intel CPUs would have the same advantage. The net result is that your concern is invalid.



[citation][nom]fluffybear[/nom]Very biased article toward the rich and Intel. You've done everything possible for the Intel chips and crippled the AMDs before you even began.[/citation]

That conclusion is based on misinformed assumptions that I have shown to be false. Your conclusion is therefore invalid.
 
It would be nice to see these benches run again with an actual FX-4100, instead of a simulated one.

It would also be interesting to build a $500 SBM machine using an FX-4100 and a Radeon 6870, to compare to last year's $500 machines that used a 6870 and a Phenom or an i3-2100. To me that would give a better indication of where the best bang for the buck is in that segment.

I'm not a fan of the methodology you have used here.
 
[citation][nom]Outlander_04[/nom]It would be nice to see these benches run again with an actual FX-4100, instead of a simulated one.[/citation]

As far as simulating the FX-4100 with a 6100 or 8120 goes, it's a valid methodology, proven by testing and confirmed by AMD. In fact, an actual FX-4100 would be a little slower, because we ran ours at its top Turbo frequency; actual processors will drop 100 or 200 MHz depending on the load.
Having said that, you could force any FX-4100 to stay at 3.8 GHz.

[citation][nom]Outlander_04[/nom]It would also be interesting to build a $500 SBM machine using an FX-4100 and a Radeon 6870, to compare to last year's $500 machines that used a 6870 and a Phenom or an i3-2100. To me that would give a better indication of where the best bang for the buck is in that segment.[/citation]

Not for the SBM, but I'm currently writing up an article pitting the i3-2100 against the FX-4100 with graphics cards from the 6950 down to the 5570, to see if/where the graphics becomes a bottleneck.

[citation][nom]Outlander_04[/nom]I'm not a fan of the methodology you have used here.[/citation]

As we said, simulating the FX-4100 is proven legit (except for giving it a slight clock benefit). As for the graphics cards used, hopefully the follow-up will cover that to your satisfaction. :)
 