Gaming Shoot-Out: 18 CPUs And APUs Under $200, Benchmarked


cleeve

Illustrious
[citation][nom]azraa[/nom]This article conflicts with pretty much every other article and benchmark here, and invalidates the last gaming CPU comparison chart and recommended purchases. Pay-off much?[/citation]

No, actually. Your statement doesn't mesh with reality on a number of levels.

Think much? :)
 

Wendigo

Distinguished
Nov 26, 2002
133
35
18,620
[citation][nom]ojas[/nom]I thought it was factors/multiples of the refresh rate (15, 30, 60, 120 or 42.5, 85)...[/citation]

Right. More to the point, the exact formula for vsync is FPS = (refresh rate)/n, where n is an integer. Thus, with a 60 Hz LCD, you can only get framerates of precisely 60, 30, 20, 15, 12, 10... of which 30 FPS is the lowest that gives you smooth animation without tearing. So for anyone enabling vsync, the important number is 30 FPS, and the next one up is 60 FPS. Between those two, it doesn't matter whether your card is pushing 40 or 50 FPS, since both will result in a constant 30 FPS...

I was thinking this was the origin of the >30fps unofficial rule...
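If it helps to see it concretely, here's a minimal Python sketch of that formula, assuming classic double-buffered vsync on a fixed-refresh display (the function names are just illustrative):

[code]
# Minimal sketch of the vsync rule above: FPS = refresh / n, n an integer.
# Assumes classic double-buffered vsync on a fixed-refresh display.

def vsync_rates(refresh_hz=60, max_n=6):
    """Steady frame rates a vsynced display can actually show."""
    return [refresh_hz / n for n in range(1, max_n + 1)]

def effective_fps(raw_fps, refresh_hz=60):
    """Snap a raw render rate down to the nearest achievable vsync rate."""
    n = 1
    while refresh_hz / n > raw_fps and n < refresh_hz:
        n += 1
    return refresh_hz / n

print(vsync_rates(60))        # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]
print(effective_fps(50, 60))  # 30.0 -- a card pushing 40-50 FPS still shows 30
print(effective_fps(40, 60))  # 30.0
[/code]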
 

cleeve

Illustrious
[citation][nom]Wendigo[/nom]I was thinking this was the origin of the >30fps unofficial rule...[/citation]

30 FPS was around for years before Vsync was even invented. 24 FPS for film and 30 FPS (60 fields) for television are very old standards in comparison.
 

Th-z

Distinguished
May 13, 2008
74
0
18,630
[citation][nom]cleeve[/nom]That is explained thoroughly on page 2 of the article far better than I could duplicate it here.[/citation]

Don, I read it before I posted the question, and I don't see where you explained it. You'll have to point me to the paragraph and sentence where you think you explained whether what I said was the case or not.

Another thing, about the percentiles: I think you've misinterpreted the meaning of percentile (if you use Tech Report's method). An nth percentile is not the worst latency we see n% of the time, but the latency at a specific rank of all data points. For example, if the latency at the 95th percentile is 30 ms, it doesn't mean 30 ms is the highest latency we see 95% of the time (which would imply randomly selecting 95% of all latencies); rather, at the 95% rank of all sorted latencies, the latency is 30 ms.
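To make the rank-based reading concrete, here's a small Python sketch of the nearest-rank method; the latency values are made up for illustration:

[code]
# Nearest-rank percentile: sort every frame latency, then read the value
# at the given rank. The data below is hypothetical, just for illustration.
import math

def percentile_by_rank(latencies_ms, pct):
    """Value at the pct-th rank of the sorted data (nearest-rank method)."""
    data = sorted(latencies_ms)
    rank = math.ceil(pct / 100 * len(data))  # 1-based nearest rank
    return data[rank - 1]

frames = [12, 14, 15, 15, 16, 17, 18, 22, 25, 30]  # made-up latencies (ms)
print(percentile_by_rank(frames, 95))  # 30 -- the value at the 95% rank
print(percentile_by_rank(frames, 50))  # 16 -- the middle entry (median-ish)
[/code]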
 

instinctgone

Distinguished
Jun 2, 2006
13
0
18,510
It would be awesome if game developers would release a 20-minute timed demo of their game so that we can see how it performs on our particular rigs. That way I don't get pissed when I drop $60 on a game only to lose interest because it stutters constantly. Some of us can't afford to build a $2k rig every year.
 

shikamaru31789

Honorable
Nov 23, 2012
274
0
10,780
[citation][nom]Th-z[/nom]Don, I read it before I posted the question, and I don't see where you explained it. You'll have to point me to the paragraph and sentence where you think you explained whether what I said was the case or not. Another thing, about the percentiles: I think you've misinterpreted the meaning of percentile (if you use Tech Report's method). An nth percentile is not the worst latency we see n% of the time, but the latency at a specific rank of all data points. For example, if the latency at the 95th percentile is 30 ms, it doesn't mean 30 ms is the highest latency we see 95% of the time (which would imply randomly selecting 95% of all latencies); rather, at the 95% rank of all sorted latencies, the latency is 30 ms.[/citation]

Unless I'm mistaken, the 95th percentile in Tom's graph means that 5% of all frame transfers are at the posted ms rate or slower. So in the example on page 2, system B has an average or 50th percentile of 11.7 ms per frame transfer, 25% of all frame transfers are as slow as or slower than 24.5 ms, and 5% of all frame transfers are as slow as or slower than 29.9 ms.
 

Wendigo

Distinguished
Nov 26, 2002
133
35
18,620
[citation][nom]cleeve[/nom]30 FPS was around for years before Vsync was even invented. 24 FPS for film and 30 FPS (60 fields) for television are very old standards in comparison.[/citation]

My point is that, contrary to your post above, 30 FPS isn't a totally subjective cutline chosen for no obvious reason. It was selected as the line for acceptable video card performance on TH and other websites because, for most people, it's the lowest framerate that gives playable, smooth, artefact-free gameplay.

That said, I more than agree with that same post that 30 FPS isn't the end of everything. Latency certainly is an interesting factor to put in the equation, and it deserves to be dug into more deeply.
 

cleeve

Illustrious
[citation][nom]Wendigo[/nom]My point is that, contrary to your post above, 30 FPS isn't a totally subjective cutline chosen for no obvious reason. It was selected as the line for acceptable video card performance on TH and other websites because, for most people, it's the lowest framerate that gives playable, smooth, artefact-free gameplay.[/citation]

OK, you're missing the point a bit. I agree that's WHY it was selected, but my point is there is no OBJECTIVE measurement that defines 30 FPS as "playable, smooth, and artifact free".

30 FPS is a limit based on people's opinion, a de-facto standard born of someone's arbitrary decision, not scientific fact. There is no repeatable experiment you can run that produces 30 FPS as the ideal result. If you show videos with different frame rates to many people, they will each have an opinion as to the point at which the frame rate becomes 'smooth'. That's subjective.

An objective measurement can be verified by anyone, regardless. If someone measured how many inches were in 1 foot, the result would always be 12, no matter who checked. That's objective.
 
30/60 "fps" isn't entirely arbitrary. It goes back to early television and the fact that in the USA house hold electricity is at 60hz. It was a electrical necessity at that time that for home media broadcasts. Eventually cheap integrated circuits completely did away with the need for the display to run at the same rate as the line, but by then the standard has been set.

The difference between fluid and non-fluid motion is actually about 20~25 depending on the human. Of course this doesn't take into account sudden high contrast scene changes and fast motion, both require a significantly higher frame speed to be perceived as fluid. The faster / more difference in the frame changes the faster then update speed needs to be to maintain fluidity.
 

Wendigo

Distinguished
Nov 26, 2002
133
35
18,620
[citation][nom]cleeve[/nom]OK, you're missing the point a bit. I agree that's WHY it was selected, but my point is there is no OBJECTIVE measurement that defines 30 FPS as "playable, smooth, and artifact free". 30 FPS is a limit based on people's opinion, a de-facto standard born of someone's arbitrary decision, not scientific fact. There is no repeatable experiment you can run that produces 30 FPS as the ideal result. If you show videos with different frame rates to many people, they will each have an opinion as to the point at which the frame rate becomes 'smooth'. That's subjective. An objective measurement can be verified by anyone, regardless. If someone measured how many inches were in 1 foot, the result would always be 12, no matter who checked. That's objective.[/citation]

Sorry, but I think you're mixing up "objective" and "quantifiable".

By "artefact free", I was referring to the well-known tearing artefact. The only way to avoid it is to be in sync with your display, which gives the fixed framerates I listed above. No subjectivity there; it's just the way the electronics work.

Now, for smooth: take the in-sync framerates, let's say 15, 20, 30 and 60, and show a game or animation to ten random people at each of them. Ask them whether they perceive differences between them and whether they find them fluid. Most, if not all, people will easily perceive the differences between 15, 20 and 30 FPS. On the other hand, most people will not see any obvious difference between 30 and 60 FPS, and will not see apparent or disturbing stutter at these framerates. Even those who do see differences will usually agree that they saw them only because they were specifically looking for them. You can repeat this test as much as you want; I'm pretty sure you will get the same conclusion. Is it quantifiable? No, you cannot put a number or units on "smooth" or "stutterless". Is it objective? Sure, it's based on facts, not on the personal feelings or opinions of the experimenter.

Now, take a hammer and hit your finger. Do you feel pain? Sure. Can you put an exact number on that pain? No. Does that mean the pain when you hit your finger with a hammer isn't an objective fact? No, it's just not quantifiable...
 

cleeve

Illustrious


Nope. Although the meaning is related in this case.

Dictionary.com defines objective: not influenced by personal feelings, interpretations, or prejudice; based on facts; unbiased: an objective opinion.

Saying '30 FPS is the minimum for smooth animation' is not objective. Some people might feel 25 FPS is the minimum; others might feel 40 FPS is. There's no universal scientific proof that can be leveraged to 'prove' that 30 FPS is the minimum for smooth animation.

It's simply a de-facto standard that has been accepted, probably based on NTSC. :)
 

Wendigo

Distinguished
Nov 26, 2002
133
35
18,620
[citation][nom]cleeve[/nom]...Some people might feel 25 FPS is the minimum; others might feel 40 FPS is. There's no universal scientific proof that can be leveraged to 'prove' that 30 FPS is the minimum for smooth animation. It's simply a de-facto standard that has been accepted, probably based on NTSC.[/citation]

I'm not talking about comparing 30 FPS relative to any arbitrary FPS. In that case, it's true that you cannot point to a specific number. However, if you read what I posted above, I'm strictly comparing 30 FPS with the other synchronized framerates, which aren't 25 or 40, but 20 going down and 60 going up. All other values are discarded because they cause tearing and, as a cutline, you want a value that doesn't cause a well-known graphical artefact.

And yes, we arrive at this number because it's directly related to the refresh rate, which comes from NTSC, which itself depends on the 60 Hz AC standard established in the 19th century.
 

Wendigo

Distinguished
Nov 26, 2002
133
35
18,620
[citation][nom]cleeve[/nom]Nope. Although the meaning is related in this case. Dictionary.com defines objective: not influenced by personal feelings, interpretations, or prejudice; based on facts; unbiased: an objective opinion. Saying '30 FPS is the minimum for smooth animation' is not objective. Some people might feel 25 FPS is the minimum; others might feel 40 FPS is. There's no universal scientific proof that can be leveraged to 'prove' that 30 FPS is the minimum for smooth animation. It's simply a de-facto standard that has been accepted, probably based on NTSC.[/citation]

If you read what I posted above, I'm not comparing 30 FPS to just any other framerate, only to those in sync, which are 20 on the low side and 60 on the high side...
 

cleeve

Illustrious


I thought you had posted that "It was selected as the line for acceptable video card performance on TH and other websites because, for most people, it's the lowest framerate that gives playable, smooth, artefact-free gameplay."

*ASSUMING* Vsync is on and ASSUMING a 60 Hz refresh rate, maybe.

As a reviewer I can tell you that Vsync isn't really considered, though. We use 30 FPS because it's a generally accepted minimum standard of how many frames per second it takes to deliver a smooth experience. Below 30 FPS, it gets dicey, and it's not because of Vsync.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]BigMack70[/nom]Except that you're trying to make a blatantly nonsensical argument, with nothing to back it up, that people can tell the difference between 20 and 30 fps without looking for it but not 30 and 60. It BOGGLES MY MIND why someone always shows up with this stupid argument in any thread where framerate, smoothness, or monitor refresh rate is discussed. It just absolutely boggles it. It's one of the dumbest arguments ever - it's COMPLETELY impossible to substantiate, and most people's own experience invalidates it in less time than it takes to type the argument in the first place. It's only a teeny, tiny quality step above its twin - the "human eye can't see more than [x] fps" argument.[/citation]
True, especially in fast-motion sequences the difference is obvious. I think anyone saying that the difference between 30 FPS and 60 FPS isn't significant should play a racing game or two :D

It's also the reason why traditionally shot movies seem slower than some TV serials and 3D movies...
 

nokiddingboss

Honorable
Feb 5, 2013
671
0
11,160
with the emergence of multithreaded games able to utilize 4 or more cores, amd's products sure are getting the love they deserve. all they need to do now is boost their ipc performance (though i doubt they can catch up to intel for at least 1 more generation) and we will have ourselves a real fight... after a very, very long time. only time will tell if they can climb out of the ditch they dug themselves... and if they do, say hello to "true" quad core i3's for $120, sub-$150 "K" series i5's or, at the very least, an unlocked HT i3. please, intel? no? ok then.
 

cypeq

Distinguished
Nov 26, 2009
371
2
18,795


we probably won't ever see an unlocked i3, because then why would you buy an i5?
the i3 is already stepping on the i5's heels despite its lower clock and lack of turbo; imagine it running at ~4.7-5 GHz. Intel won't shoot itself in the foot by offering that much value in this price range.

30 FPS is a good value below which you start to see choppy animation. At 60 FPS and up, everything is really fluid and looks smooth. But I advise everyone to vsync at the highest refresh rate your hardware can handle: 75, 85 or 100 Hz? The higher, the better it is on your eyes and immersion. We also can't forget that vsync imparts a delay on rendering and input, which would be a disadvantage in fast-paced multiplayer gaming.
 
it comes down to the game, latency between frames, and a number of other considerations... but i've found everything from 20 FPS to 70 FPS to be smooth and stutter-free. Sometimes 20 works, sometimes it's terrible. Sometimes it's 40 that works, sometimes not. Heck, I was just playing a game where I was getting a pretty consistent 60 FPS, and frankly it was a jittery, stutter-filled mess. Nearly unplayable at times.

In the end, frame latencies are a big thing. It's why the HD 79XX cards mostly feel choppy to me (while the 78XX cards feel smooth as glass). When Tech Report reported the issue with the Tahiti-core Radeons, I was nodding my head, as their report mirrored my own experience. Those cards might pump out insane FPS... but the playing experience tends to be choppy. Or it did... the problem got a lot better recently.
 

Th-z

Distinguished
May 13, 2008
74
0
18,630
[citation][nom]shikamaru31789[/nom]system B has an average or 50th percentile of 11.7 ms[/citation]
11.7 would be the average (mean) of all data entries; the 50th percentile is the median, the middle data entry.

[citation][nom]shikamaru31789[/nom]25% of all frame transfers are as slow as or slower than 24.5 ms, and 5% of all frame transfers are as slow as or slower than 29.9 ms.[/citation]
It's close. A percentile sets a boundary in the data after the data are sorted from low to high. To be precise:

If at 75th percentile the data entry is 24.5 ms, it means 75% of entries are less than 24.5 ms
If at 95th percentile the data entry is 29.9 ms, it means 95% of entries are less than 29.9 ms
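A quick sanity check of that boundary reading in Python, with hypothetical latencies chosen so the numbers above fall out exactly:

[code]
# Hypothetical frame latencies (ms), chosen so 24.5 ms sits at the
# 75th-percentile boundary and 29.9 ms at the 95th.
frames = [10.1, 10.4, 10.9, 11.2, 11.5, 11.7, 12.0, 12.3, 12.8, 13.1,
          13.5, 14.2, 15.0, 16.4, 18.7, 24.5, 25.2, 26.8, 28.3, 29.9]

def fraction_below(latencies_ms, cutoff_ms):
    """Fraction of entries strictly less than the cutoff."""
    return sum(1 for x in latencies_ms if x < cutoff_ms) / len(latencies_ms)

print(fraction_below(frames, 24.5))  # 0.75 -- 75% of entries below 24.5 ms
print(fraction_below(frames, 29.9))  # 0.95 -- 95% of entries below 29.9 ms
[/code]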
 

cleeve

Illustrious


So you're asking if the 75th and 95th percentile latencies subtract the average latency? No, they do not.



I believe an nth percentile is the worst latency you'd see in the lowest n percent of results.

[edit; based on your reply above you seem to understand it. I'm not sure where the confusion was? Perhaps I didn't explain it in the best terms.]
For example, the 75th percentile should return the worst latency in the lowest 75% of latencies.

Wikipedia explains it like this, which may help: if a score is in the 86th percentile, it is higher than 85% of the other scores.
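For what it's worth, the two readings agree under the nearest-rank method; here's a quick hypothetical check in Python (made-up latencies, illustrative function name):

[code]
# The value at the 75th rank IS the worst latency in the lowest 75% of
# results, so both descriptions in this thread match. Data is made up.
import math

def percentile_by_rank(latencies_ms, pct):
    """Value at the pct-th rank of the sorted data (nearest-rank method)."""
    data = sorted(latencies_ms)
    return data[math.ceil(pct / 100 * len(data)) - 1]

frames = [12, 14, 15, 15, 16, 17, 18, 22, 25, 30]  # made-up latencies (ms)
lowest_75 = sorted(frames)[:math.ceil(0.75 * len(frames))]
print(percentile_by_rank(frames, 75))                     # 22
print(percentile_by_rank(frames, 75) == max(lowest_75))   # True
[/code]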
 

nokiddingboss

Honorable
Feb 5, 2013
671
0
11,160
[citation][nom]cypeq[/nom]we probably won't ever see an unlocked i3, because then why would you buy an i5? the i3 is already stepping on the i5's heels despite its lower clock and lack of turbo; imagine it running at ~4.7-5 GHz. Intel won't shoot itself in the foot by offering that much value in this price range. 30 FPS is a good value below which you start to see choppy animation. At 60 FPS and up, everything is really fluid and looks smooth. But I advise everyone to vsync at the highest refresh rate your hardware can handle: 75, 85 or 100 Hz? The higher, the better it is on your eyes and immersion. We also can't forget that vsync imparts a delay on rendering and input, which would be a disadvantage in fast-paced multiplayer gaming.[/citation]
well, you could make the i3 quad core but without hyperthreading and running at lower clock rates, while the i5 would have HT and higher clocks, and the i7 would have more cores. that's what i'm hoping for, at least.
 

de5_Roy

i just erased a longer version of the answer, trimming a bit of attempted sarcasm off it. :p
simply, intel will not do that. we can sure hope for it, but it won't happen any time soon. the main reason intel doesn't offer more is that it can get away with selling less, i.e. their lesser cpus can keep up with their amd counterparts. if amd offered more worthy competition, intel would've certainly responded. iirc, they did in the past.
right now intel's priority is mobile - drastically improving performance per watt to make socs that people will buy, because mobile is where the money is at. desktop and even mainstream laptop are less of a priority. they have an excuse though - if you want enthusiast-class hardware, buy the ridiculously overpriced lga2011 platform and get your desire for moar cores fulfilled. just gotta pay moar.
there was an old rumored cpu roadmap. it showed the core i3 replacing celerons (in terms of hardware) and the core i5 becoming the entry-level cpu, with a 6-core i7 making up the high-end mainstream while 6-8 core i7s (likely with ht) would fill the enthusiast (current lga2011) space.
 

nokiddingboss

Honorable
Feb 5, 2013
671
0
11,160
[citation][nom]de5_Roy[/nom]i just erased a longer version of the answer, trimming a bit of attempted sarcasm off it. simply, intel will not do that. we can sure hope for it, but it won't happen any time soon. the main reason intel doesn't offer more is that it can get away with selling less, i.e. their lesser cpus can keep up with their amd counterparts. if amd offered more worthy competition, intel would've certainly responded. iirc, they did in the past. right now intel's priority is mobile - drastically improving performance per watt to make socs that people will buy, because mobile is where the money is at. desktop and even mainstream laptop are less of a priority. they have an excuse though - if you want enthusiast-class hardware, buy the ridiculously overpriced lga2011 platform and get your desire for moar cores fulfilled. just gotta pay moar. there was an old rumored cpu roadmap. it showed the core i3 replacing celerons (in terms of hardware) and the core i5 becoming the entry-level cpu, with a 6-core i7 making up the high-end mainstream while 6-8 core i7s (likely with ht) would fill the enthusiast (current lga2011) space.[/citation]
about my earlier post: well, that was my theory "IF", in the unlikely event, amd catches intel in ipc performance next gen; then they'd theoretically "make the i3 quad core but without hyperthreading and running at lower clock rates, while the i5 would have HT and higher clocks, and the i7 would have more cores. that's what i'm hoping for, at least." though all of that was just hopeful wishing from me and NOT reality. thanks for the input though :)
 