Futuremark useless, 3DMark waste of time

Here is the point: you have no way of benchmarking for games that have not come out yet. 3DMark tests a card's ability to use the different standards; in the case of DX9, it tests its ability to utilize those features, and I think it is a valuable tool for doing so. You can't rate a card's performance capabilities for future games by using existing games, because existing games do not use those new features. If a chip maker optimizes a driver for a benchmark, they are cheating: they are deliberately using a benchmark to lie about their card's performance. Futuremark, in an effort to preserve the integrity of the results, is trying to keep chipmakers from doing this. WHQL drivers are generally not optimized for the benchmark, so they provide a better indication of performance. The number that 3DMark comes up with is just a number used for comparison purposes; the number in and of itself doesn't mean a whole lot.

So here is the question: how else do you compare performance, tartarhus? You seem to know better than a good chunk of the industry, right? I am assuming you have some grand idea on how to compare performance. Obviously we can't go by clock speed or memory size, and using existing games is not an indication of how a card will perform with future games. I don't think you have a reasonable answer to that question any more than you have a reasonable basis for your complaint.

---------------

Ray Charles is my co-pilot
 
That's the thing: we are all talking to a wall. Dave has tried to lay out the hard truth and the guy selfishly ignores it.

Methinks we shouldn't bother, and should instead go help someone looking for a new card.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 
I have a question about the inefficiencies in the benchmark; I'll quote some complaints from nVidia.

They show sample code, then say:
The portion of this algorithm labeled "Skin Object in Vertex Shader" is doing the exact same skinning calculation over and over for each object. In a scene with five lights, for example, each object gets re-skinned 11 times. This inefficiency is further amplified by the bloated algorithm that is used for stencil extrusion calculation. Rather than using the Doom method, 3DMark03 uses an approach that adds six times the number of vertices required for the extrusion. In our five light example, this is the equivalent of skinning each object 36 times! No game would ever do this.

This approach creates such a serious bottleneck in the vertex portion of the graphics pipeline that the remainder of the graphics engine (texturing, pixel programs, raster operations, etc.) never gets an opportunity to stretch its legs.
How true is this? And why do drivers fix it (with regard to the supposedly over-swamped graphics pipeline)?
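For what it's worth, the arithmetic in nVidia's quote can be reproduced with a back-of-envelope model. This is my own sketch, not 3DMark03's actual code: I'm assuming one ambient pass plus, per light, one lighting pass and one shadow-volume extrusion pass whose geometry carries six times the vertices. The function names are mine.

```python
# Back-of-envelope model of the vertex workload nVidia describes.
# Assumptions (mine, inferred from the quote, not from 3DMark03's source):
# one ambient pass, plus one lighting pass and one shadow-volume extrusion
# pass per light, with extrusion geometry carrying 6x the vertices.

def skinning_passes(num_lights):
    """Times each object is re-skinned: 1 ambient + 2 passes per light."""
    return 1 + 2 * num_lights

def equivalent_skinnings(num_lights, extrusion_vertex_factor=6):
    """Skinning work in whole-object units, weighting each extrusion
    pass by its inflated vertex count."""
    ambient = 1
    lighting = num_lights
    extrusion = num_lights * extrusion_vertex_factor
    return ambient + lighting + extrusion

print(skinning_passes(5))       # 11, matching "re-skinned 11 times"
print(equivalent_skinnings(5))  # 36, matching "skinning each object 36 times"
```

Under those assumptions the five-light numbers in the quote (11 skinnings, 36 equivalent skinnings) fall out directly, which at least makes the complaint internally consistent, whatever its merits.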

<A HREF="http://www.tweaktown.com/document.php?dType=guide&dId=120&dPage=1" target="_new">WinXP tweak guide</A>
<A HREF="http://www.tweaktown.com/document.php?dType=guide&dId=145&dPage=1" target="_new">WinXP tweak guide 2</A>
 
Well, nVidia might have done it again: they released new drivers that got the card back up to rank in 3DMark03. And although it is a benchmarking program, it can give us an idea of what games will be like in the future, given that game developers use these DX9 standards. Most developers won't use all of DX9, but they will take chunks that can be added to make things pretty. So technically, yes and no, it does and doesn't show the future of gaming. At least that's how I see it from my point of view...

"What kind of idiot are you?"
"I don't know, what kinds are there?"
 
I'm still waiting for them to release WHQL-certified drivers, no matter what. It's been 5 months and numerous questionable releases. Many people couldn't even get the 40.72 drivers to work and had to go back to the 3x.xx WHQL drivers.

<font color=red>GOD</font color=red> <font color=blue>BLESS</font color=blue> <font color=red>AMERICA</font color=red>
 
Is the FX actually out on the market, officially?
I have yet to find it in any store in Canada.

 
The drivers that boosted the score are questionable. There is talk of reduced Z-buffer and FP precision and lower image quality. It is rather suspicious how the score suddenly quadruples in the PS2.0 code. No drivers, including the Detonator XP, can do that with optimization code alone.

Additionally, some individual DX9 shader tests out of 3DMark03 revealed extremely horrid DX9 performance. I don't know if they are to be believed; they were on HardOCP and someone had posted them here.
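On the FP precision point: FP16 has a 10-bit mantissa versus FP32's 23, so dropping to half precision genuinely throws information away. A quick sketch using Python's `struct` module (its `'e'` format is IEEE 754 half precision) shows the resolution loss; this is just an illustration of the format, not a claim about what the drivers actually do.

```python
import struct

def to_fp16(x):
    """Round-trip a Python float through IEEE 754 half precision
    (struct's 'e' format) to see what FP16 actually stores."""
    return struct.unpack('e', struct.pack('e', x))[0]

def to_fp32(x):
    """Same round trip through single precision, for comparison."""
    return struct.unpack('f', struct.pack('f', x))[0]

# FP16 steps near 1.0 are about 0.001 apart, so small
# differences simply vanish.
print(to_fp16(1.0001))  # 1.0 -- the offset is below FP16 resolution
print(to_fp32(1.0001))  # still distinguishable from 1.0 in FP32
print(to_fp16(0.3))     # noticeably off from 0.3
```

Whether that loss is visible on screen depends on how the shader uses the values, which is exactly what the image-quality comparisons are arguing about.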

 
The 41.13 drivers seem better than 40.72. They come with the nForce driver package.

Submit your opinion <A HREF="http://forumz.tomshardware.com/community/modules.php?name=Forums&file=viewtopic&p=28537#28537" target="_new"> Should Tom Fire Omid? </A>
 
In regard to Defective:
Yes, you're right. Trying to benchmark the future performance of a video card presents an enormous number of challenges, especially when DX9 is factored in. And I understand what Futuremark is trying to do. It's just that their software lacks the sophistication employed by modern gaming companies like id Software. When id came out and said that ATI's latest Radeon really rocked, the world dropped everything and listened. That's because everyone knows that id Software really knows what the hell they are talking about, probably more than anyone else in the industry.

The way games are written and optimized has gotten to the point where general-purpose 3DMarks simply fail to capture what's going on. If you really want to know, you're going to have to run benchmarks for individual games. The most popular 3D games are written by the badasses of the programming world, the graphics experts, the gurus. When you benchmark their games, you can clearly see how well a card shines.

What really pissed me off was that after Futuremark made a mistake, instead of fixing it, they went around blaming the world for it. The slap in the face was the WHQL crap, which was just meant to keep the business suits interested in the quality of their benchmark. Their moves seem purely political these days.

-- Chaos is the better order.<P ID="edit"><FONT SIZE=-1><EM>Edited by tartarhus on 03/27/03 11:51 PM.</EM></FONT></P>
 
I don't go showing my benchmarks; their triple digits in 2001, and not running in 03 right now, are not things I like to brag about. So that's not my focus.
What does it show me? It shows me how my card stacks up against other cards, and gives me an idea of how much Nvidia's 'how it's suppozed ta be played' program screws over my card to benefit Nvidia cards, with those incestuous compacts (which are in games like 1942, UT2003, STALKER, and a few others, including the upcoming Tron 2.0).
What this bench shows me is what many people have always thought: Nvidia is stacking the deck for their inferior technology by having game developers optimize their games at the expense of other cards. Optimize for ALL.
I want an unbiased benchmark. ATI and Nvidia may both spend time optimizing for the benchmark, but it appears that only Nvidia so far has improved their score at the cost of all else (quality, and stability in other games).
I don't want a predictor of specific games; I want an unbiased, unoptimized test of the technology that game engines are based on. Perhaps with an unbiased test we will see more people giving the proper props to cards like the 8500 series, and then more gamers will realize that Nvidia's way isn't the only way, and that developers should optimize for ALL cards, instead of just the way NVIDIA (or, for that matter, ATI) wants you to play it. If we want competition in the graphics market, we can't have it inhibited or halted by behind-the-scenes optimizations that favour just one party.
Don't think of 3DMk03 simply as a predictor of ALL games; it gives you an idea of how SOME games/applications COULD perform, and of how much better they could do without preferential optimization.
Should a benchmark favour one product over another? No, of course not; although the results may favour one over the other.
As for complaining about DX9 and MS: I am no fan of MS, but theirs is the standard most games base their engines on. We may not like it, but until Linux and others become the GAME OS of choice, there is no choice.

P.S. People listen to ID / Carmack because THEY will base their games on HIS engine, so of course they listen, they have to.

P.P.S. What is this mistake of Futuremark's you speak of? I haven't seen ANY major ones, only little things that will be tweaked in later versions. You speak in generalities; how about some concrete arguments?

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=red>RED</font color=red> <font color=green>GREEN</font color=green> :tongue: GA to SK<P ID="edit"><FONT SIZE=-1><EM>Edited by TheGreatGrapeApe on 03/27/03 10:51 PM.</EM></FONT></P>
 
That's not Futuremark's problem. The multiple-passes problem is an Nvidia problem. ATI's cards don't have that problem; heck, even the Matrox cards don't do it that way (not that that helps their performance/scores much).
Nvidia's complaints were even addressed here on Tom's, and shown to be simply a poor Nvidia implementation of DX standards and methods.

If you want an article that refutes it, here is a link to the specific page on Beyond3D that addresses this issue:

<A HREF="http://www.beyond3d.com/articles/3dmark03/post/index.php?p=2" target="_new">http://www.beyond3d.com/articles/3dmark03/post/index.php?p=2</A>

Once again Nvidia is complaining that Futuremark doesn't favour their cards at the expense of all other cards.

The sad thing is that Nvidia's argument doesn't affect the FX but their older boards, and shows them to be poor designs when compared to others. They aren't really complaining about the FX, or even the GF4; they're trying to defend why the GF3 line does so poorly when up against cards like the 8500.
Once again Nvidia is trying to get people to optimize for their cards, to cover their shortcomings.
The FX is the best card out there now, but that's the QUADRO FX, and they don't seem to have problems with synthetic benchmarks when the QFX kicks the FireGL. So their complaints about 3DMk03 ring hollow to me.

 
I read an interview, I can't remember where, in which an Nvidia rep said they were expecting a release of the 50-series drivers in a week or so. Now, he was being coy, but that was the hint. So you may see another series of drivers, likely not certified first, but likely an FX-level driver, the start of the 'official' FX-level set.

 
I take comfort in knowing that there are so many blank slates out there to keep marketing guys like me employed. Apparently, I'm due for a raise. But should I be frustrated that it's so hard to erase someone else's work, or encouraged by how soundly one's work stays put? It's a dilemma, that's for sure.


-- Chaos is the better order.<P ID="edit"><FONT SIZE=-1><EM>Edited by tartarhus on 03/28/03 02:11 AM.</EM></FONT></P>
 
For all of the FX bashing I've done, now that things have settled, I have a teeny bit of admiration for Nvidia for managing to squeeze all that performance out of a 128-bit memory bus... even if they used DDR2 to accomplish it...
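To put that 128-bit bus in perspective, peak memory bandwidth is just bus width in bytes times effective transfer rate. A quick sketch; the clock figures are the commonly cited specs for these two boards, so treat them as assumptions rather than gospel:

```python
def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# GeForce FX 5800 Ultra: 128-bit bus, 500 MHz DDR2 (1000 MHz effective)
print(bandwidth_gb_s(128, 1000))  # 16.0 GB/s
# Radeon 9700 Pro: 256-bit bus, ~310 MHz DDR (620 MHz effective)
print(bandwidth_gb_s(256, 620))   # ~19.8 GB/s
```

So even with DDR2's higher clocks, the narrow bus leaves the FX short of the 9700 Pro's raw bandwidth, which is why the clock speeds had to be pushed so hard.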

(Crashman) "That pic..Are you sane?)...<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
 
I do agree about the questionable drivers; as a matter of fact the discussion is going on in Futuremark's forum, where they posted pics, and there IS a difference between the drivers. It appears they're still doing FP16, although I don't know if it will be a big deal; it looks like Doom ]|[ will be using FP16 anyway (from the chat). Either way I hope MS pimp-slaps nVidia down and doesn't let the bully push around the giant -_-
