Interesting HL2 link

Ion

Distinguished
Feb 18, 2003
379
0
18,780
<A HREF="http://www.anandtech.com/weblog/index.html" target="_new">http://www.anandtech.com/weblog/index.html</A>

Can't believe AnandTech actually posted those numbers, because it'll be a great tool for the Nvidiots to claim back their "performance crown". :tongue:
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
It looks like the 5950 <i>just might</i> save nVidia's arse. But at what image quality loss? I guess we'll find out soon enough. Personally, I'm very upset that HL2 has not been publicly released as Valve promised. Now, if they had said in advance, "Hey guys, we're gonna be a little late on getting the benchmark out. Sorry guys," I would have been cool with that. But Valve's dodgy tactics are really getting on my nerves lately.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Nah, the fact that they were tested using different codepaths means they weren't on a level platform to begin with. Put nVidia back on the DX9 codepath like they should be, and they'd be fuxored again. Mixed mode means vendor-specific optimizations... I wonder how the 9800XT would perform if it were able to use mixed mode...

And mixed mode is how the game is played... so they should have leveled the playing field for this one first... otherwise it's like giving someone a head start in a contest and letting them claim victory.

Plus, consider that we don't even know where these marks came from, or who ran them.

----------
<b>I'm not normally a religious man, but if you're up there, save me, Superman! </b> <i>Homer Simpson</i>
 

Ion

Distinguished
Feb 18, 2003
379
0
18,780
The R3xx series isn't capable of FP16 calculation...

And it is quite unprofessional for AT to post this kind of stuff on his site; it could easily mislead people.
 

cleeve

Illustrious
Interesting. This would definitely be good news for FX owners out there.

Still, I think we should reserve judgement until these results are confirmed with official reviews...

------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2400+
3dMark03: 3586
 

spud

Distinguished
Feb 17, 2001
3,406
0
20,780
Yeah, it stopped my purchase of a 9800 dead. Perhaps there is hope for my card. *sternly looks at his FX*

-Jeremy

:evil: <A HREF="http://forumz.tomshardware.com/modules.php?name=Forums&file=faq&notfound=1&code=1" target="_new">Busting Sh@t Up In My Buddies Face!!!</A> :evil:
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
You're right, they aren't... and for good reason. If you go down to that level, you're pushing the envelope for crappy image quality. And consider this: mixed mode for nVidia means they bounce between FX12 (fixed point), FP16, and FP32, picking whatever gives the best performance. ATI, meanwhile, renders everything at a fixed FP24... they support 24-bit at all times, which is a fantastic move if you think about it. You give yourself a floor you will not go below... not only that, but ATI can operate in full 32-bit precision as well.

So what does mixed mode mean? It means bouncing quality settings around in order to attain higher FPS. This is something nVidia promised they wouldn't do when they put out their pseudo-apologetic reply to the 3DMark scandal. And here they are... letting it happen again.

I took a look at ATI's site and found no mention of support for FP16. Good thing, too, since it's evident they can't go beneath FP24. It isn't possible for ATI to go beneath it either, since that floor is built into the architecture, not just their drivers.

Now here's the kicker: according to the DX9 spec, a card must use at least 24-bit FP precision per channel to be considered "DX9 compliant." So what the hell is the point of benchmarking a game in mixed mode and calling it a DX9 benchmark? I've had it with sites and individuals not doing their research before they slap a piece of hardware in, pop a demo in, and call it a valid performance indicator. This post from Anand is pure crap, because the nVidia card uses mixed mode... purely sacrificing quality for quantity, i.e. frames per second. CRAP! That's what that 'benchmark' session was. You can't say nVidia succeeded there when they aren't on a level playing field and they barely edge out the 9800XT.
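If you want to see for yourself why dropping precision matters, here's a quick sketch of my own (plain Python/NumPy, nothing from Valve or the drivers; the "lighting detail" value is made up for illustration). FP16 only carries a 10-bit mantissa, so a small per-channel contribution near 1.0 simply rounds away, while FP32 keeps it:

```python
import numpy as np

# My own illustration, not shader code: a subtle (hypothetical)
# per-channel lighting contribution added to a color value of 1.0.
base, detail = 1.0, 1e-4

fp32 = np.float32(base) + np.float32(detail)  # FP32 keeps the detail
fp16 = np.float16(base) + np.float16(detail)  # FP16 rounds it away

print(fp32 > 1.0)  # True  -- the contribution survives
print(fp16 > 1.0)  # False -- it vanished entirely

# The smallest step above 1.0 each format can represent:
print(np.finfo(np.float32).eps)  # ~1.19e-07
print(np.finfo(np.float16).eps)  # ~9.77e-04
```

Any shader term smaller than that FP16 step just disappears, which is exactly the kind of banding/quality loss people are complaining about.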

----------
<b>I'm not normally a religious man, but if you're up there, save me, Superman! </b> <i>Homer Simpson</i>
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Not really...check my above post...they're still sacrificing quality for quantity.

----------
<b>I'm not normally a religious man, but if you're up there, save me, Superman! </b> <i>Homer Simpson</i>
 

jmecor

Distinguished
Jul 7, 2003
2,332
0
19,780
But the huge fan on the FX 5950 is a thing to consider, eating up an additional PCI slot (it's even larger than the GFX 5800 Ultra's). The scores in e3_techdemo5 show a large margin favoring ATi, but elsewhere it's almost a tie.

<b><font color=purple>
The real troubles in your life are apt to be things that never crossed your worried mind.
</font color=purple></b>
---
<A HREF="http://www.mapua.edu.ph/" target="_new">
MIT
</A>
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
I'd still buy the 9800... check my post above... nVidia is still doing the same thing... sacrificing quality for quantity.

----------
<b>I'm not normally a religious man, but if you're up there, save me, Superman! </b> <i>Homer Simpson</i>
 

cleeve

Illustrious
From what I understand, the Radeon R3xx class GPUs can't calculate true 32-bit precision, but instead fudge it if they have to.

Not that this is an issue, really... because the DX9 spec is 24 bits...

It might be when and if DX10 comes out with a 32-bit spec, but that's a long way off yet.

------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2400+
3dMark03: 3586
 

spud

Distinguished
Feb 17, 2001
3,406
0
20,780
The R3xx can only do 24-bit, period; no more, no less. They followed the DX9.0 spec to a tee.

-Jeremy
Unofficial Intel PR Spokesman.

:evil: <A HREF="http://forumz.tomshardware.com/modules.php?name=Forums&file=faq&notfound=1&code=1" target="_new">Busting Sh@t Up In My Buddies Face!!!</A> :evil:
 

cleeve

Illustrious
They might be sacrificing quality for quantity, but it's STILL good news for FX owners if HL2 is at least playable on their cards.

If I were an FX owner, I'd prefer playable framerates and DirectX 8 shaders over 10 frames per second and impeccable image quality.

I'm not saying the FX cards are good in DirectX 9, by any stretch of the imagination, TKS;
but I AM saying that playable is better than non-playable.

------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2400+
3dMark03: 3586
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
I've got some troubling news about Half-Life 2, guys: the source code has been leaked. I should probably post a new thread about this. In fact, I think I'll go ahead and do that.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
That doesn't trouble me in the least. It's not going to affect me or the game. All it's gonna do is possibly let nVidia get a head start on cheating and make their fanbois feel good.


Stop asking people "what's up" or "how's it goin'" -- you know you don't give a sh*t. And no one cares how you're doing either.

"We are far less than we knew." - Bright Eyes