Gaming Shoot-Out: 18 CPUs And APUs Under $200, Benchmarked


mikenygmail

Distinguished
Aug 29, 2009
362
0
18,780
[citation][nom]Outlander_04[/nom]Bottom line: Intel i5-3570K + Radeon 7850 1 GB = $380. AMD FX-6300 + 7870 GHz Edition = $380. One of those two games WAY better. In case casual readers can't work out the answer... it's not the one with the "must have" Intel processor.[/citation]

Very good point. I'd like to see how the AMD APUs would do if the test graphics card were something the APUs could CrossFire with. An APU plus a CrossFired graphics card would crush any Intel CPU plus the same graphics card, for MUCH less money.
 

cleeve

Illustrious



You forgot Intel i3-3220 + Radeon 7870 LE (XT, Tahiti chip) = $370

Better than both options above for gaming.

Casual readers, the Intel chip wins this round... ;)
 

ice445

Distinguished
Dec 20, 2011
10
0
18,510
[citation][nom]cleeve[/nom]Totally disagree with you there. Skyrim and StarCraft II remain some of the most significant games of our day.[/citation]

I think what he was getting at is that both of these games use really antiquated engines (Skyrim especially). Both can utilize two threads at most, and Skyrim's engine is so bad that the framerate is actually tied to the physics engine.

They're good games for showing which CPU has more single-threaded performance, but not really useful otherwise. StarCraft II's engine is abysmal, tbh. I mean, even the higher-end Intels can barely pull 55 fps with a minimum of 30? Is that a joke? Blizzard needs to code better.
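Since the thread keeps mixing FPS and latency, here's a quick sketch (using the 55/30 fps figures quoted above as hypothetical inputs) that converts frame rates into per-frame times, which is what the article's latency charts are actually about:

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame render time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# The figures quoted above: a 55 fps average with a 30 fps minimum.
avg_ms = frame_time_ms(55)   # ~18.2 ms per frame on average
min_ms = frame_time_ms(30)   # ~33.3 ms during the worst dips

print(f"average: {avg_ms:.1f} ms, worst dip: {min_ms:.1f} ms")
```

The worst frames take nearly twice as long as an average one, which is exactly the kind of swing the consecutive-frame tables are trying to surface.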
 
G

Guest

Guest

D'oh, I hit refresh on this tab in Chrome... OK, I lied, I came back...

Cleeve, with all due respect, I think you're not understanding me. There's a very big, fast shortcut being taken with the data: going straight to a percentile graph throws out too much information and will lead to erroneous conclusions. Case in point: just how does an Athlon II X3 450 show less latency in DiRT than every other CPU besides the FX-8350 and the i5s?

An 1100T is clocked higher, has a slightly refined version of the same architecture, and has more L3 cache and a better memory controller, so it gets more done clock for clock; that result makes no sense. And, sorry to say, the results of this article don't reflect several other articles based on the same latency metric(s).

Sorry again, but I don't Skype; I get into too much trouble... :lol: But I do appreciate the effort, and I apologize for getting a bit snarky the few times I did. Let's just let it lie for now and see if a better understanding doesn't evolve naturally.
 

Niva

Distinguished
Jul 20, 2006
383
1
18,785
Still running a Phenom II X4 950 chip in my main rig and not seeing a good reason to upgrade. That being said, I'm not much of a gamer; Torchlight 2 is the most graphics-intensive game I run... and it runs just fine on my rig.

I really enjoyed the consecutive-frame benchmarks in this test. Kudos to THG for bringing those up!
 

cleeve

Illustrious


I never said our approach is perfect, but I think it's a big step in the right direction from what's been done. There are some other aspects of the data I'm working on presenting (such as how often latency happens over time), but we'll get there.




I don't believe any other article has taken this approach. That doesn't invalidate this approach, though. This is a new frontier as far as game performance is concerned.

Do I know what causes the discrepancies? I don't claim to. But I think it's a better approach to derive theories from the actual data than to assume the data is flawed because it doesn't meet your expectations.




Fair enough. Hopefully you'll pay attention to our upcoming stuff and we can pick this up at a later date if we can get over the communication barrier. :)
 

cleeve

Illustrious


Ah! That's the problem. We can't control how well important games are coded.

Good or bad, popular games that are coded poorly are a lot more important than perfectly coded games that nobody plays. :)
 
Would it be possible for Tom's to devise a form of "background task" benchmark? Currently, all benchmarks are done with everything turned off, focusing purely on a single component, yet nobody actually runs their system that way. I know I sure as heck don't run my box stripped down. This leads to results like a dual-core i3 being ranked higher than an FX-8350, when the 8350 has significantly more total CPU power. And while the game might not utilize it, the game isn't the only thing running on people's home systems. It underemphasizes the value of processing scalability just as the opposite trend is taking hold across the industry.

I know it's hard enough to get a single set of controlled benchmarks, much less one involving a second workload. Maybe run a video decoder in the background to simulate additional system activity while you do a FRAPS run on a game's demo mode. It would generate more interesting numbers, methinks.
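The "run something in the background while benchmarking" idea can be sketched in a few lines of Python; the workload and the busy-loop stand-in for a decoder here are hypothetical placeholders, not anything Tom's actually runs:

```python
import threading
import time

def background_load(stop: threading.Event) -> None:
    """Busy-loop simulating a background task (e.g. a video decoder)."""
    while not stop.is_set():
        sum(i * i for i in range(10_000))

def timed_workload() -> float:
    """Stand-in for a benchmark run; returns elapsed seconds."""
    start = time.perf_counter()
    total = 0
    for i in range(2_000_000):
        total += i
    return time.perf_counter() - start

# Baseline run with nothing else going on (in this process, at least).
baseline = timed_workload()

# Same run with a synthetic background task competing for CPU time.
stop = threading.Event()
t = threading.Thread(target=background_load, args=(stop,))
t.start()
loaded = timed_workload()
stop.set()
t.join()

print(f"baseline: {baseline:.3f}s, under load: {loaded:.3f}s")
```

One caveat: in CPython the two threads contend for the GIL, so this exaggerates the interference a real background *process* would cause; separate processes would model it more faithfully.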

In either case, good job breaking things down and emphasizing the effects of latency/stuttering on experience vs. raw fps.
 

mikenygmail

Distinguished
Aug 29, 2009
362
0
18,780
Let's see how AMD APU's would do if the test graphics card was something the APU's could crossfire with. APU's + crossfire'd graphics card would crush all Intel CPU's + same graphics card, for MUCH less money.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


Most desktops are 99% idle 90% of the time. The normal overhead of running Windows plus other services is usually 1% of total CPU power or less, and an antivirus scanner is normally smart enough to not start a scan during heavy CPU/GPU/disk load.
IOW, what background tasks take more than 2-3% CPU?
Doing what you describe would definitely generate interesting numbers, but IMO they aren't real-world.
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


Can't emphasize this enough.

And, AFAIK, console-port games are usually better multithreaded than native PC games, no? They do have shitty graphics, but the CPU side is better.
 



*Cracks knuckles*


The "99% idle 90% of the time" figure is a red herring. The system is idle because the user is not sitting in front of it, so it has little or nothing to do. A computer will not be playing a video game while the user is not present; the system could be powered down to nothing and the user would not be affected. It could be replaced by a lump of coal and the user would not be affected during those idle times; if anything, the coal would be beneficial, since it doesn't consume power and could be burned later to generate electricity.

Now, if we're talking about when a user is actually sitting there doing something, that's another matter. Obviously the coal wouldn't do the job, so we must move on to something more modern. For most non-gaming activities, a low-power four-core CPU is more than sufficient. I emphasize four cores because, while such a user won't be doing demanding work, they will be doing multiple activities, and you want to ensure that no single application can consume all available processing resources. Think web browsing, listening to music, watching YouTube videos, working with OpenOffice, digging through files, and other common tasks. Now that we've determined a cheap phone CPU could do "most" desktop work, let's focus on the situations where users need something "more".

Occasionally the user needs to do something a bit more than browsing the web, watching cat videos, and/or doing OpenOffice work. Those needs tend to fall into two broad categories: gaming and multimedia. Multimedia means encoding/transcoding material and is a very CPU-intensive task. Gaming means playing something more demanding than Flash, and it can be demanding or not depending on the game's design. This is where we make our money and determine which CPU is right for which user, based on cost vs. required performance.

Now that we've laid that out, here is what I do when I'm not browsing websites or doing other common tasks. I play a multitude of video games; it helps me relieve stress from work. I tend to keep Waterfox open in the background with 10-20 tabs of common sites already loaded and waiting for reference. I know I'm not the only one who does this, so before we've even loaded the game, we have a need for some parallel processing. I also have Ventrilo (MMO fans know what this is) up and connected to my guild's Vent server; people hop on and off and we chat about various things. If the game I'm playing is the MMO, this becomes our primary communications method for command and control. During group runs I'll sometimes load third-party mapping applications to assist, or do other tasks I need done; that's more processing power required. And often, while doing all this, I'm downloading a few videos or files or doing some other random work in the background.

Now, all that "extra" isn't nearly as intensive as the game itself (most of the time), but it does add to the system workload. Most importantly, it highlights that CPUs that "just barely" qualify for gaming, namely the i3s and the low-core-count AMDs, are actually insufficient for the task at hand. They play a single-player timed-loop demo like a champ, yet load up Vent, Waterfox, and a few other things, and suddenly there are no longer enough CPU resources available, and stuttering happens. Just stop and think: the i5 has roughly double the execution resources of the i3, yet the scenario above shows the i3 being maxed out by the game alone while the i5 still has headroom.
 

QEFX

Distinguished
Jun 12, 2007
258
0
18,790
I like this; however... while I understand the reasoning behind using a top-of-the-line graphics card, could you use a realistic setup at some point?

Maybe something somebody buying a sub-$200 CPU might actually pair with it. For me, all these test numbers are nice but completely irrelevant, as only a fool would buy this exact setup. I'd love to see some real-world numbers... just a thought. Otherwise, a great job.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]cleeve[/nom]Vsync isn't married to 30 FPS, it's married to half the monitors refresh rate (in many instances this is 60 Hz).[/citation]
I thought it was factors/multiples of the refresh rate (15, 30, 60, 120 or 42.5, 85)...
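For what it's worth, traditional (non-adaptive) vsync snaps the frame rate to integer divisors of the refresh rate, which matches most of the numbers quoted above; a tiny sketch:

```python
def vsync_rates(refresh_hz: float, max_divisor: int = 4) -> list[float]:
    """Frame rates available under traditional vsync: refresh / 1, 2, 3, ..."""
    return [refresh_hz / n for n in range(1, max_divisor + 1)]

print(vsync_rates(60))  # [60.0, 30.0, 20.0, 15.0] on a 60 Hz panel
print(vsync_rates(85))  # 85.0, 42.5, ... as mentioned for an 85 Hz display
```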
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
@Don:
On the subjectivity of latency/frame-rate perception: I've found that when the fps swings slowly, it's not too noticeable, but if there's a sudden drop from 85 fps to 60 fps I usually do notice, and sometimes I notice when it goes up suddenly too. Below 20 fps is usually the slideshow point, though some games manage to pull that off as well...

I've also noticed stuttering when games access the HDD to load assets in the middle of a level. That's actually the most noticeable, IMO. Otherwise, sub-30 frame rates just feel like a slowdown, especially when you change a few settings and see the same scene at 60 fps.

That said, 40-60 fps seems pretty similar to me. Gamers are thought to have a 50 ms reaction time, I think? I remember data from Nvidia when they intro'd GRID last year... they said the average person has a 200 ms response time, but gamers/athletes etc. have a much lower one. So maybe 20 ms could be made the standard? Though that's 50 fps... seems a bit high for a minimum...

[citation][nom]Outlander_04[/nom]Its not that the demands of those background services and programs are large its that they take one entire thread for the time they operate , and that thread cannot then be contributing to running the game[/citation]
Isn't that what scheduling takes care of? Not locking a single thread to a process for too long...

[citation][nom]QEFX[/nom]I like this, however ... while I understand the reasoning behind using a top of the line graphics card, could you use a realistic set-up at some point?Maybe something that somebody who is buying a sub 200 may actually buy as well. For me all these test numbers are nice, but completely irrelevant as only a fool would buy this set-up. I'd love to see some real world numbers ... just a thought. Otherwise a great job.[/citation]
This was supposed to be a CPU performance comparison and a latency analysis... bottlenecking performance with an underpowered GPU wouldn't make sense.

The SBMs (System Builder Marathons) are for the real-world stuff you're looking for...
 

Th-z

Distinguished
May 13, 2008
74
0
18,630
Just a quick question: are the numbers in the Consecutive Frame Latency tables separated out from the "normal" FPS latency? For example, in Far Cry 3 the average latency for the A10-5800K is shown as 1.3 ms. We know from your FPS table that the A10's average is 47.7 FPS, which equates to roughly 20.96 ms per frame under perfect conditions. So does the 1.3 ms come from subtracting 20.96 ms? That would mean the original number is 22.26 ms (1.3 + 20.96), and your Consecutive Frame Latency table only shows the additional latency beyond the normal FPS latency.
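The arithmetic in that question can be reproduced in a couple of lines; whether the table actually subtracts the baseline is the open question for the article's authors:

```python
avg_fps = 47.7                   # A10-5800K average from the FPS table
avg_frame_ms = 1000.0 / avg_fps  # ~20.96 ms per frame at that rate

reported_extra_ms = 1.3          # Consecutive Frame Latency table figure
raw_gap_ms = avg_frame_ms + reported_extra_ms

print(f"{avg_frame_ms:.2f} ms + {reported_extra_ms} ms = {raw_gap_ms:.2f} ms")
```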

I read Tech Report's article about the Radeon/Catalyst latency problem about a week ago, as well as their older article about frame-time latency, which is very good; I can follow their findings. So I didn't pay much attention to each number until Far Cry 3, when I realized you guys may have omitted some info or used different methods than Tech Report's.
 

cypeq

Distinguished
Nov 26, 2009
371
2
18,795
To all those praising more AMD cores vs. the i3: I happen to run not only a web browser, TS, and such, but at times two games (one minimized), and when that happens I see about 10-20% FPS drops; when I run just simple background apps, I see no performance impact. So please don't tell me that TeamSpeak, Winamp, or a web browser is CPU-intensive or meaningful at any time.
I use an i3-2100.
Please don't forget that the i3 is hyper-threaded; that's why it manages to do so well against the AMD lineup: it can handle four threads at the same time.

I'm disappointed in the path AMD took with its CPUs. Yeah, you can overclock them to match a stock i5, but the power bill after a year of use will just leave you asking why you didn't buy an i5 in the first place.
 

ceh4702

Distinguished
Jan 1, 2011
305
0
18,790
Most i5 quads do not cost under $200; some cost just over $200 and up. So by picking that arbitrary $200 cutoff, this article keeps the main competition out of the benchmark.
 

cleeve

Illustrious
[citation][nom]Th-z[/nom]Just a quick question: the numbers in Consecutive Frame Latency tables, are they cut off from the "normal" FPS latency? [/citation]

That is explained thoroughly on page 2 of the article far better than I could duplicate it here.
 

idecris

Honorable
Jan 23, 2013
16
0
10,510
[citation][nom]cypeq[/nom]To all those praising more AMD cores vs. the i3: I happen to run not only a web browser, TS, and such, but at times two games (one minimized), and when that happens I see about 10-20% FPS drops; when I run just simple background apps I see no performance impact. So please don't tell me that TeamSpeak, Winamp, or a web browser is CPU-intensive or meaningful at any time. I use an i3-2100. Please don't forget that the i3 is hyper-threaded; that's why it manages to do so well against the AMD lineup: it can handle four threads at the same time. I'm disappointed in the path AMD took with its CPUs. Yeah, you can overclock them to match a stock i5, but the power bill after a year of use will just leave you asking why you didn't buy an i5 in the first place.[/citation]

Exactly. The AMD chips simply don't make any sense.

At the i3 price point, go for an AMD chip and you get less.
Same at the i5 price point: go for an AMD chip and you get far less...

The added cost of overclocking is something that doesn't really make sense at all: the better cooler, the better PSU, the better case, more fans, a larger electricity bill...

I mean, just get a stock i5 running like a beast on the stock HSF cooler with a simple 500 W PSU...

Is common sense really not so common nowadays?
 

idecris

Honorable
Jan 23, 2013
16
0
10,510
[citation][nom]cleeve[/nom]You forgot Intel i3-3220 + Radeon 7870 LE (XT, Tahiti chip) = $370. Better than both options above for gaming. Casual readers, the Intel chip wins this round...[/citation]

was just about to say the same thing...

why is it so hard to accept?
 

azraa

Honorable
Jul 3, 2012
323
0
10,790
This article conflicts with pretty much every other article and benchmark here, and invalidates the last gaming CPU comparison chart and recommended purchases.

Pay-off much?
 