Editor's Corner: Getting Benchmarks Right

Status
Not open for further replies.

wdmso

Distinguished
Mar 8, 2008
All I know is I got a drop-in upgrade, an X4 940, for my aging AM2 GA-M57SLI-S4 for 200 bucks. To go i7 plus motherboard and memory would have cost me 700 bucks. In today's economy 200 was a stretch; 700 is totally unrealistic.
 

gamefreak62

Distinguished
Nov 30, 2008
I don't know if anyone has thought of this yet, but the Core i7s have two threads per core, whereas the P2 has only one thread per core. Who's to say the Nvidia drivers aren't simply quad-thread optimized, so that the i7 is only being used efficiently on two or three cores, or in other words just using the threads from fewer cores? That would explain a lot of the performance difference, especially if AMD drivers are optimized to use all of the threads while Nvidia drivers only benefit from four.

That might be a little confusing, so simply put: it may be that the drivers aren't using all four i7 cores because they're only scheduling four threads.

But then, I'm no insider, and couldn't say for certain.
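The scheduling concern above can be sketched in code. This is purely an illustration of how four worker threads could end up sharing physical cores when SMT is enabled; the topology dictionary and the `one_cpu_per_core` helper are hypothetical, not anything from Nvidia's actual drivers.

```python
# Hypothetical illustration of the SMT concern: if a driver spawns four
# worker threads on a 4-core/8-thread CPU, the OS may place two of them
# on SMT siblings of the same physical core, leaving whole cores idle.
# Picking one logical CPU per physical core avoids that.

def one_cpu_per_core(topology):
    """topology maps physical core id -> list of logical CPU ids
    (SMT siblings). Returns one logical CPU per physical core."""
    return sorted(cpus[0] for cpus in topology.values())

# A 4-core, 8-thread part like the Core i7: logical CPUs 0/4 share
# core 0, 1/5 share core 1, and so on (numbering is an assumption).
i7_topology = {0: [0, 4], 1: [1, 5], 2: [2, 6], 3: [3, 7]}

pinned = one_cpu_per_core(i7_topology)
print(pinned)  # [0, 1, 2, 3] — four threads, four distinct cores
```

On Linux, a worker could then be pinned with `os.sched_setaffinity(0, {pinned[i]})` (a Linux-only API), which is roughly what testing "one thread per core" amounts to without a BIOS toggle.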
 

cruiseoveride

Distinguished
Sep 16, 2006
Nvidia's drivers are not allowing the i7 to reach its full gaming potential.
What a predicament!

Intel processors are fastest with AMD graphics.

It could be a chipset issue too, you know, not necessarily a driver issue. Nvidia has a pretty good reputation for writing solid graphics drivers (compared to AMD).

Nevertheless, if this is an architecture issue with GeForce and nothing can be done about it, what is the best price/performance setup?
 

fayskittles

Distinguished
Jan 29, 2009
I was wondering if it has anything to do with HT on the Core i7 and the Nvidia drivers. Maybe the drivers' work is being split over the 8 logical cores in a way that slows it all down.

If possible, try turning off some cores and then test it?
 

MagicPants

Distinguished
Jun 16, 2006
[citation][nom]fayskittles[/nom]I was wondering if it has anything to do with The HT on the Core I7/Nvidia drivers. the drivers are being to split up over the 8 cores in a way to slow it all down. If possible try turning off some cores and then try it?[/citation]

I think you can turn off hyper-threading in some BIOSes.
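For readers without a BIOS switch, the same effect can be approximated from a running OS. The following is a sketch, assuming a Linux system with sysfs CPU hotplug, root privileges, and the common enumeration where logical CPUs 4-7 are the SMT siblings of cores 0-3 on a 4-core/8-thread i7 (verify with `/sys/devices/system/cpu/cpu*/topology/thread_siblings_list` first):

```shell
# Offline the SMT sibling of each physical core, leaving one
# hardware thread per core (assumes siblings are CPUs 4-7; check
# your machine's topology before running).
for cpu in 4 5 6 7; do
    echo 0 > /sys/devices/system/cpu/cpu$cpu/online
done

# Confirm which logical CPUs are still online:
cat /sys/devices/system/cpu/online
```

Writing `1` back to the same files brings the threads online again, so the comparison can be run both ways without a reboot.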
 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]cah027[/nom]I think Nvidia Hates Intel so much that they wrote their drivers this way to give AMD a boost in sales. Plus it will boost their own sales with AMD P2 fans. If P2 users want the best setup they would go with 2x 280's over the 4780x2[/citation]
Why on earth would Nvidia want to boost AMD's sales? Nvidia and AMD compete on graphics sales. They wouldn't want AMD earning enough to outpace Nvidia's own R&D efforts.

[citation][nom]Pei-chen[/nom]Fact: i7 is the fastest desktop CPU available. Enough said. Jane is a free freelancer. She isn't being paid to write, she just likes writing on her MacBook / iPhone in a Starbucks. I wish Tom's had kept Sara, Tamara, Rob and Ben. Good job Chris. Finding the problem is halfway to fixing it.[/citation]
Ben and Rob are gone? That's why there's no Second Take anymore, then? But I saw Ben had a hand in the recent OC competition (editing video or something)... Where have they gone?
[citation][nom]gamefreak62[/nom]I don't know if anyone has thought of this yet, but the core i7's have 2 threads per core, whereas the P2 only has 1 thread per core. Who's to say that the Nvidia drivers are simply quad-thread optimized, and that the i7's are only being used efficiently on two or three cores, or in other words, just using the threads from less cores. That would explain a whole lot in the power difference, especially if AMD drivers are optimized to use all of the threads and Nvidia drivers only benefit from the four. That might be a little confusing, so simply put it may be that the drivers aren't using all four i7 cores because it is utilizing four threads. But then, I'm no insider, and couldn't say for certain.[/citation]
That sounds reasonable, and it should be simple for Chris to test! I'm sure either the Asus or the Intel board has a feature to turn HT off. It simply must have.


@ Chris : Brilliant piece there. Thanks mate.
 

JPForums

Distinguished
Oct 9, 2007
I can't overstate how helpful these articles are for both your readers and your reputation. Kudos, Chris.

On a side note, do you think this problem affects all of Nvidia's supported cards, all CUDA-capable cards, the GTX 2xx series, or the GTX 280 specifically?
 

lashton

Distinguished
Mar 5, 2006
We are all forgetting that the release of the Dragon platform, with its onboard 4800-series graphics, should (I suspect) rectify the Phenom II and AMD card problem. I would expect a much faster CPU-GPU combination, if only because of the bandwidth available between them.
 
Guest
The Asus Rampage II Extreme is SLI-enabled. I'm not sure, but if I remember right, a special chip from Nvidia was needed for this? If that's right, couldn't that chip be causing the problem? I know this is a single-card configuration, but maybe the chip is in the path anyway?

If no chip is needed, just skip this comment. :)
 

mapesdhs

Distinguished
Jan 22, 2007
An interesting and very useful analysis (thanks, Tom's!), but how do we know the results would be the same with a different game? Reviews of graphics cards as they come out show how some games run better on a particular AMD card than on Nvidia, or vice versa, especially when multiple GPUs are thrown into the mix. This is made even more complex by driver issues, which sometimes mean a game runs poorly on a specific card with more than two GPUs, or even just two versus one.

Thus, how confident are you guys that the results you've obtained would be the same if you'd tested with a very different game, such as Stalker: Clear Sky?

Ian.

 

warezme

Distinguished
Dec 18, 2006
Aside from the modelesque posing of that picture, hinting at a self-indulgent, egocentric personality and hours of Juuging, I would say Chris made a good article.
 

scooterlibby

Distinguished
Mar 18, 2008
Maybe a dumb question, but why not run some simple OLS regressions to make a more convincing case? I'm guessing it would take too much time and maybe nobody would care, but that would convince me more than these hypothesis 'tests'.
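The OLS suggestion is easy to make concrete. A minimal sketch with numpy: regress frame rate on CPU clock and active core count across benchmark runs. All numbers below are invented for illustration, not Tom's Hardware data.

```python
import numpy as np

# Predictors: clock (GHz) and number of active cores per run;
# response: average fps measured in that run (made-up values).
clock = np.array([2.6, 2.8, 3.0, 3.2, 2.6, 2.8, 3.0, 3.2])
cores = np.array([2,   2,   2,   2,   4,   4,   4,   4  ])
fps   = np.array([41., 44., 47., 50., 55., 58., 62., 65.])

# Design matrix with an intercept column; solve least squares.
X = np.column_stack([np.ones_like(clock), clock, cores])
coef, *_ = np.linalg.lstsq(X, fps, rcond=None)
intercept, per_ghz, per_core = coef
print(f"fps ~ {intercept:.1f} + {per_ghz:.1f}*GHz + {per_core:.1f}*cores")
```

With a balanced design like this, the clock and core-count columns are uncorrelated, so the two slopes can be read off independently; whether a real benchmark sweep supports that separation is exactly what the regression would test.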
 

mapesdhs

Distinguished
Jan 22, 2007
I'd still like to see it tested with a different game. As long as they only test with one type of graphics engine, it's hard to draw any general conclusions. It could all just be highly specific to Far Cry 2.

Ian.

 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]mapesdhs[/nom]I'd still like to see it tested with a different game. As long as they only test with one type of gfx engine, it's hard to draw any general conclusions. It could all just be highly specific to Far Cry 2. Ian.[/citation]

This whole 'article' was based on another one, where it was obvious that the same behaviour isn't limited to this particular title but appears in most games. So you can expect similar results with different games.
 

da bahstid

Distinguished
Oct 10, 2008
NVidia drivers being more poorly-optimized than AMD drivers?

AMD graphics drivers that are better optimized for Intel's new i7 than their own Dragon platform?

And more than two pages of comments go by before anybody posts any fanboy crap.

This is truly a crazy world we live in today.

Great article, Chris. Nice to see someone employing the scientific method around here. I'd be interested to see if this turns out to be something more related to the X58. It seems plausible, since Nvidia held off on supporting the i7 for so long. Whether it ends up being solvable within chipset drivers or sticks around until X68 releases will be another point of curiosity.
 

rlevitov

Distinguished
Apr 27, 2008
What I want to know is: who will buy a $2,500 or $5,000 PC every year just to be at the top? I'm using an X2 3800+ CPU from three years ago (still with DDR1) along with a 1GB 4670, and the newest games run fine on my 42" full-HD screen. The problem with new titles is the developers, who make nice-looking but poorly designed games (take GTA4, for example). And by the way, all you great gamers: most people can't see the difference above 30 fps anyway (it's an eye thing; ask a doctor). For all those $650 and $1,250 PCs, I'd like to know how low the FPS gets, not the average, because if the lowest is 30 then I don't need more!
 

neiroatopelcc

Distinguished
Oct 3, 2006
[citation][nom]rlevitov[/nom]what i wont 2 know is who will buy a 2500 or 5000$ pc every year just to be at the top??? i'm using X2-3800 cpu from 3 yrs ago (still with DDR1) along with 4670 1GB and newest games run fine on my 42" FULL HD screen.. the problem with new titles are the developers... who makes nice looking games but purly design (take GTA4 for example). and btw all u great gamers...most can't see the differance above 30 fps anyway... (it's an EYE thing ask a doctor)... i would like to know for all those 650$ and 1250$ pc's how low can the FPS get not avarage because if lowest is 30 the i dont need more![/citation]
There are a lot of things one could start to argue about in your post. I certainly see the point in keeping your old system as long as it works fine, but I don't believe it'll run games anywhere near as well as a modern system. My backup system (which is only there in case my primary fails) is based on an Athlon X2 at 2.3GHz, and even with my 4870 it doesn't perform particularly well on a 22" display (1680x1050), so I doubt a 4670 will do much better; I've got DDR2 memory and a faster GPU, after all.
As for some of the other posts: I don't know a single person who'd buy a system worth more than $2,500, but that doesn't mean they don't exist. My social circle relies on me, or someone like me, to tell them what computers they need. And people like me will always settle for a 790GX system if it's for very light or no gaming at all, or a setup with a 4830 or 4870 card and either an X2 2.8 or a similarly priced C2D (like the E5200), depending on preferences. Anyway, there are people who can afford, and want, a really potent computer system. It's like buying a new beemer every year. My social circle would rather be content with a 5-year-old Vectra and replace it once it's 10 years old, but beemers are sold to those who want one and can afford it. Same here. One of my best friend's acquaintances from uni spent 5k on a computer two years ago, so I know at least one exists. :)

As for the 30fps thing: it's simply not true. Yes, the eye can only perceive a certain number of pictures per second, but it can detect irregular frequencies. So if those 30fps aren't delivered with v-sync enabled, you can still feel the lag. Input lag can also occur at 30fps (if the CPU is too weak), and that can be felt as well. You'd be amazed how sensitive our eyes really are. So you'd ultimately always want a system that can provide a minimum of 30fps with v-sync enabled (or more without, naturally) in situations where the action is fast-paced. It's a bit less important in slow-moving stuff like strategy games or billiard games, but you'll still notice if the system can't provide 30fps with v-sync on.
Also remember that it's the minimum framerate that matters, not the average. An average of 30fps isn't going to cut it for most games.
As for your developers remark: yes, to an extent you're right. But in your particular example I believe the problem isn't the development itself but the fact that the game was ported from console to PC, and that process was done very poorly. That's nothing new, really; Rockstar never was good at that. The gameplay is quite good in many games, imo. Where it all falls down is that when developers try to make a game look good, somehow they fail to make the gameplay good. Case in point: Crysis and Far Cry 2. I've got both, and I dare say I was disappointed twice. Far Cry 2 is a great game until you get used to the place and start being annoyed by the less practical things: the scripted AI that is poor at best, the horrible ending where you can't even cancel the credits without Alt+F4, or the fact that once you've finished, you can't go back and do side missions. The game could've been good, but they spent too much time on things other than gameplay design. Even the bonus DVD that came with my collector's edition was rubbish.

On the other side of the scale, you have games like the Hoyle board games, World of Warcraft, and older stuff like Diablo 2 and Heroes of Might and Magic (and probably soon Blood Bowl). In those games graphics were never important; only gameplay is. I think developers have yet to learn how to implement both in the same product.

Anyhow, with EA having a rough time, I think there's still hope that quality will return to the PC world. In the end, I believe the internet is actually the culprit for why quality is so low. Before the internet was a standard feature at home, I don't recall games needing many patches to actually work. How many games didn't work out of the box back in the 90s? I don't recall any. The ease with which developers can fix mistakes these days is probably a major reason why they no longer bother thoroughly testing their software before release.

I could go on for ages, so I'll move on to the next thread.
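The minimum-versus-average framerate point above is easy to demonstrate from a frame-time log. The frame times here are invented for illustration:

```python
# Frame times in milliseconds for ten consecutive frames; one 90 ms
# spike and one 40 ms hitch hide inside otherwise smooth ~60 fps data.
frame_ms = [16, 17, 16, 18, 90, 16, 17, 40, 16, 17]

fps = [1000.0 / ms for ms in frame_ms]
avg_fps = sum(fps) / len(fps)
min_fps = min(fps)

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.1f} fps")
# The average is a comfortable ~52 fps, but the 90 ms spike is a dip
# to ~11 fps: exactly the stutter an average hides.
```

This is why benchmark charts that report only averages can make two systems look equivalent when one of them stutters noticeably.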
 
