Nvidia lying about specs of FX-5900?

JHoxley

Distinguished
Oct 19, 2002
107
0
18,680
hi all,

just invested in a nice (and expensive) MicroStar Nvidia GeForce FX 5900 (128MB, non-Ultra version).

The problem is, I think Nvidia are playing stupid games with the tech specs for the chip (or the drivers). I was wondering if anyone round here has heard anything about this sort of thing?

Being a graphics programmer, I was quite looking forward to using some of the new features on this particular card. One set of features I was especially keen on is the HDR (High Dynamic Range) textures: 64-bit and 128-bit floating-point textures. Very cool.

However, a few sample programs I have say that my card doesn't support the D3D9 HDR texture formats. The programming tools in the DX9 SDK back this up.
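For reference, this is roughly the check those tools are doing - a minimal D3D9 sketch (assuming the default adapter and an X8R8G8B8 desktop mode; D3DFMT_A16B16G16R16F and D3DFMT_A32B32G32R32F are the 64-bit and 128-bit FP texture formats):

#include <cstdio>
#include <d3d9.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("couldn't create D3D9\n"); return 1; }

    // 64-bit and 128-bit floating-point texture formats
    const D3DFORMAT fmts[]  = { D3DFMT_A16B16G16R16F, D3DFMT_A32B32G32R32F };
    const char*     names[] = { "64-bit FP", "128-bit FP" };

    for (int i = 0; i < 2; ++i)
    {
        // ask whether the HAL driver exposes this format for plain 2D textures
        HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            D3DFMT_X8R8G8B8, 0,
                                            D3DRTYPE_TEXTURE, fmts[i]);
        std::printf("%s textures: %s\n", names[i],
                    SUCCEEDED(hr) ? "supported" : "NOT supported");
    }

    d3d->Release();
    return 0;
}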

I have the latest drivers (44.03), so I'm thinking that either Nvidia have lied on the box or the latest drivers have disabled support for these features.

I can't find anyone who has a logical explanation for this. Has anyone here heard anything like it?

Cheers,
Jack
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
a guy called "davepermen" might know, as he does all kinds of stuff like that


but i dunno. doesn't surprise me in the least if this is true

-------

<A HREF="http://www.quake3world.com/ubb/Forum1/HTML/001355.html" target="_new">*I hate thug gangstas*</A>
 

JHoxley

Distinguished
Oct 19, 2002
107
0
18,680
>> a guy called "davepermen" might know

I'll look out for him :)

I've contacted Nvidia developer relations to see if they can clarify the matter.

But still, if anyone else round here has any clues, I'm all ears :)

Jack
 

jiffy

Distinguished
Oct 2, 2001
1,951
0
19,780
I'm sorry to hear that. Do you mind if I ask you a question, though? I'm considering the same card, and I'll be using it for playing games. Would you recommend it?
 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
Install the just-released DX9.0b SDK and update your drivers to 44.67 (WHQL) or even 44.90 (non-WHQL)... maybe this will help.
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
Dude, do you read the posts??? He's a graphics programmer; he's not having problems with playing games -_-" You fanATIcs go home and play with your precious 9800 Pro. People here are having big-boy talks. (Coolsquirtle walks away)

Proud Owner of the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 
LOL, can I try with the 9600 Pro? Likely fewer burns, since it runs cooler. :wink:

JHoxley, yeah Dave will likely be able to give you some insight. He's sorta the resident programming guru.

- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
>> I would suggest the radeon 9800 pro right now, for I'm hearing way too many issues with the FX card with the upcoming games.

Right away, your blind fanboyism has played a trick on you. But maybe you're just a bot that automatically answers posts where one of the following words is used: nvidia, geforce, fx or detonator :)

Anyway, the only bug the FXes have in an upcoming game is the AA issue in HL2. Currently ATI has the same issue, and it's not really clear whether ATI can or will fix it...
 

daddywags214

Distinguished
Mar 30, 2003
939
0
18,980
Relax, man. Jeez, the word fanboy is way overused. If you look closely at the situation, they said that they could fix the problem with ATi, and for now they were leaving it up to nVidia to find a solution. AA is important to some people. And if it's reason enough for this guy to buy the 9800 Pro, you really have no business calling him a fanboy. Just breathe, man...

These days, no matter what company you like, be it nVidia, ATi, or whatever, no matter how logical your reasons, you're labeled an idiot or a fanboy, or both.
 

kinney

Distinguished
Sep 24, 2001
2,262
17
19,785
Like some of the others said, davepermen might answer this; i don't know if he uses the FX series though.
My guess would be a driver issue - it'd be much too serious an issue (and discovered long ago) if they'd lied about the features. Dave has confirmed, though, that a few of its features (128-bit precision colour) are, as he put it, math marketing.

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256MB Crucial PC2100 in Dual DDR, Geforce 3, Audigy, Z560s, MX500
Edited by kinney on 07/19/03 09:42 PM.
 
Dude, you forgot the TNTs! :wink:

Actually there is a BF1942 issue (http://www.hardocp.com/article.html?art=NDk2LDEy) but I'm not sure if it's been resolved yet.

Anywhoo, whatever. I'm waiting for PCI-EX cards. MMMmm toasty! :smile:


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
I eat PCI-EX cereals for breakfast LOL



Proud Owner of the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

JHoxley

Distinguished
Oct 19, 2002
107
0
18,680
>> as he put it, math marketing.

I'm fairly familiar with the play-with-numbers-to-look-better marketing, and whilst in some cases it can be downright annoying, it's one of those things I (we?) get used to.

To my knowledge it was the ATI cards that were "cheating" with the colour precision; to be D3D9 compliant you need 24-bit FP precision, so that's what ATI used (and gained speed for it), whereas Nvidia used "true" 32-bit precision.

I'm far more annoyed if there's absolutely no (visible) support for a feature they say is actually there :)

Jack
 

JHoxley

Distinguished
Oct 19, 2002
107
0
18,680
>> he's not having problems with playing games

yup, that's true :) however, I'd be lying if I said I didn't play games as well as make them!

>> I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF

hmm, I don't see much of a problem with the HSF; it's more the length of the board that's an issue :) mine's got 5mm of clearance from the two HDDs in my system - and I have a big server-grade case!!

Jack
 
For usually the most RECENT drivers, go to Station Drivers (http://www.station-drivers.com/page/nvidia drv.htm); they have the 44.71 WHQL and 44.90 BETA.

Also you can soon go back to Omega Drive's Little Corner (http://www.omegacorner.com/), as NVidia has decided not to mess with their fans anymore and is letting OmegaDrive adapt his NV drivers again. He currently just has tweaked 44.03s (I think), but expect something soon now that he can tweak again.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

JHoxley

Distinguished
Oct 19, 2002
107
0
18,680
thanks for the links

I'm trying to download the files, but not getting any connection - although that might be my top-of-the-line 56k modem playing up ;)

Jack
 
Ok, I tried too, and it appears the link goes to PNY (makers of the Quadros), so it may have high traffic. The 44.90s gave me a 'forbidden' message, so maybe that link has been discontinued.
Try DriverHeaven (http://www.driverheaven.net/downloads/index3.htm) instead; they have the 44.65, 44.61, and lower. They should work - I downloaded a WHOLE bunch of benchmarks today and tried the 44.65 just now with no problem (Eeww, almost got Detonators on my Cat! :tongue:)


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
HERE I AM HERE I AM!!! just had a nice weekend with my gf, so no time for you, dudes :D

uhm.. yes, the gfFX cards are not capable of supporting everything in dx9 (and they emulate e.g. the pixel shaders with their own thing to quite some degree..).

the most important thing in dx9, the floating-point support, which you eagerly waited for, is the biggest problem for nvidia, because they don't really have it.

i'm working more in opengl, but i think the dx caps will report the equivalent features:

gfFX does have floating-point texture rectangles
gfFX does not have floating-point 2d textures (the ones with width and height == power of 2 - simply the good old textures)
gfFX does not have floating-point 1d, 3d, or cubemap textures
all it has are texture rectangles.

they have other problems with floating-point integration into standard opengl, and solve them with a lot of nearly useless extensions. dunno how well they fit into dx9, as they can't extend it..
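roughly, you can see that split right in the gl extension string.. a minimal sketch (assuming a context is already current; the crude strstr match is good enough for these extension names):

#include <cstdio>
#include <cstring>
#include <GL/gl.h>

static bool hasExt(const char* name)
{
    // crude substring match against the extension string
    const char* all = (const char*)glGetString(GL_EXTENSIONS);
    return all != 0 && std::strstr(all, name) != 0;
}

void reportFloatTextures()
{
    // nvidia path: float formats exist only for the rectangle target
    bool fpRect = hasExt("GL_NV_float_buffer") && hasExt("GL_NV_texture_rectangle");
    // ati path: float formats for the normal 1d/2d/3d/cube targets
    bool fpTex = hasExt("GL_ATI_texture_float");

    std::printf("float texture rectangles: %s\n", fpRect ? "yes" : "no");
    std::printf("float 1d/2d/3d/cube textures: %s\n", fpTex ? "yes" : "no");
}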


same for render targets, where they have fun limitations currently, too (but nobody really knows what is limited where, how, and why.. just.. some work, some don't :D)


yes, the gfFX hw is definitely flawed.. pixel shaders emulated on top of ps2.0+, but no direct ps2.0 support, and the floating-point support is.. laughable imho..

and they put out the "we have 128bit floating-point hw" line.. yes, they have it in the vertex and pixel shaders.. but just about nowhere else :|

oh, and if you don't want to use pixel shaders, as far as i know, you can forget about even using the floating-point textures..




for you, a radeon9800pro would be a much better choice, i think.. i haven't seen any HDR demo that i (owning a radeon9700pro) can't run..


i'm sorry that you fell for the false marketing..

me, as an opengl coder, i could at least switch to the NV_ proprietary extensions (though i would not like to..) and manually emulate the features i would have liked in gl..

you, as a dx coder, don't have that opportunity..


yes, imho, nvidia cheats a bit when writing dx9 on their card.. they can run dx9 vertex and pixel shaders.. but that's not all of dx9 yet..

"take a look around" - limp bizkit

www.google.com
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
there 'e is! =)

and there you go folks. absolute proof that Nvidia lied about their DX9 support

-------

<A HREF="http://www.quake3world.com/ubb/Forum1/HTML/001355.html" target="_new">*I hate thug gangstas*</A>
 

JHoxley

Distinguished
Oct 19, 2002
107
0
18,680
>> uhm.. yes, the gfFX cards are not capable of supporting everything in dx9 (and they emulate e.g. the pixel shaders with their own thing to quite some degree..)

thanks davepermen, great to get confirmation of these things (even tho it wasn't what I wanted to hear!).

I emailed nvidia and ATI developer relations, got a nice email back from ATI but nothing from Nvidia. hmm, wonder why now :)

>> you, as a dx coder, don't have that opportunity..

true, I'm not well versed in OpenGL - and the whole extension mechanism has put me off learning it, even tho that does seem to shoot me in the foot when it comes to these issues...!

Again, thanks very much for the informed reply... now I'll be off to see what I can do about it, which may require a phone call to the trading standards office (UK) :(

Jack
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
well, i don't bother with the extension mechanism; i just use extgl, which wraps all this for me, so i don't have to care.. i can use every ext just as if it were standard (with some if(!extgl_Extension.ARB_fragment_program) { throw Exception("ARB_fragment_program required but not supported.. get a radeon r300 based card!!! :D"); } at startup).
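as a sketch (the init call and flag-struct names are from memory, so check them against your extgl version):

#include <stdexcept>
#include "extgl.h"   // lev povalahev's extension-loading wrapper

void initRenderer()
{
    extgl_Initialize();   // load everything the driver exposes, once at startup

    // after this check, the ARB_fragment_program entry points can be
    // called just as if they were core opengl
    if (!extgl_Extension.ARB_fragment_program)
        throw std::runtime_error(
            "ARB_fragment_program required but not supported.. "
            "get a radeon r300 based card!!! :D");
}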

opengl is quite simple then.. using its new features..


else.. hm.. radeons are quite cheap.. ... :D

oh, and, yes, it's annoying, as it means i can support radeons without problems but have trouble with the fx cards.. though they have dx9, they don't have what i need from them..

"take a look around" - limp bizkit

www.google.com
 

JHoxley

Distinguished
Oct 19, 2002
107
0
18,680
>> i just use extgl which wraps this for me
interesting, didn't know such a thing existed. The extra checking ain't too bad really; you have to do the same in D3D - just checking against the device caps that are built in... something like the sketch below.
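(A minimal sketch of the D3D side, assuming you already have the IDirect3D9 interface created:)

#include <d3d9.h>

// returns true if the HAL device exposes at least ps_2_0
bool hasPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}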

I've always liked the GeForce cards, but I have to say I've been a little worried by Nvidia's latest moves - the CineFX and Cg specifications. On their own they're pretty cool - but they don't mesh so well with the other standards.

Maybe it's just me, but it seems they're going off on their own path - making their own (similar) standards, much as 3DFX did with the Glide API.

Jack