Regarding the article...
In general, I thought it was great. But one thing that I find a bit... odd is the Performance Recap at the end. While I like the idea of an overall gaming performance graph, I'm not sure its results accurately reflect the individual tests they're based on. I mean, is 10 FPS in Oblivion really equal to 10 FPS in Crysis?
Wouldn't a relative graph at the end be more useful? IOW, if the 8800GT 512MB was the fastest card in every test, why not show a final graph reflecting how each card performed relative to it in each test?
For example, if we were simply comparing the 8800GT 512MB and the 9600GT 512MB, we would see...
Crysis: 8800GT = 44.5 FPS, 9600GT = 34.9 FPS
STALKER: 8800GT = 54.4 FPS, 9600GT = 46.5 FPS
Oblivion: 8800GT = 97.2 FPS, 9600GT = 91.7 FPS
UT3: 8800GT = 123.6 FPS, 9600GT = 109.2 FPS
AVERAGE: 8800GT = 79.925 FPS, 9600GT = 70.575 FPS (8800GT 512MB = 13.2% better FPS than 9600GT)
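Just to double-check my own arithmetic, here's a quick Python sketch (the numbers are simply the four results above, nothing official):

```python
# FPS results from the four tests above
fps_8800gt = {"Crysis": 44.5, "STALKER": 54.4, "Oblivion": 97.2, "UT3": 123.6}
fps_9600gt = {"Crysis": 34.9, "STALKER": 46.5, "Oblivion": 91.7, "UT3": 109.2}

# Plain average of the raw FPS values, then the ratio of those averages
avg_8800 = sum(fps_8800gt.values()) / len(fps_8800gt)   # 79.925
avg_9600 = sum(fps_9600gt.values()) / len(fps_9600gt)   # 70.575
print(f"Ratio of averages: {(avg_8800 / avg_9600 - 1) * 100:.1f}%")  # 13.2%
```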
Crysis (1280x1024, no AA/AF): 8800GT 512MB = 27.5% better FPS than 9600GT
STALKER (1280x1024): 8800GT 512MB = 17.0% better FPS than 9600GT
Oblivion (1280x1024): 8800GT 512MB = 6.0% better FPS than 9600GT
Unreal Tournament III (1280x1024, no AA/AF): 8800GT 512MB = 13.2% better FPS than 9600GT
AVERAGE: 8800GT 512MB = 15.9% better FPS than 9600GT
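And continuing from the snippet above, the per-game relative differences and their average:

```python
# Relative difference per game, then the average of those differences
rel = {g: (fps_8800gt[g] / fps_9600gt[g] - 1) * 100 for g in fps_8800gt}
for game, pct in rel.items():
    print(f"{game}: {pct:.1f}%")   # 27.5, 17.0, 6.0, 13.2
print(f"Average of ratios: {sum(rel.values()) / len(rel):.1f}%")  # 15.9%
```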
So, with just these four games, there's already a decent difference between simply averaging the raw FPS values and averaging the per-game relative differences (13.2% vs. 15.9%). The second value, though, is actually pretty close to the difference shown by the Tom's Hardware Average Performance graph, so maybe they're already taking this into consideration somehow.
Anyway, just thought it might be worth mentioning. But maybe I'm just being an idiot and I overlooked something.
(I am a history major, not a math major.)