Nvidia Flares.

fragglefart

Distinguished
Sep 5, 2003
132
0
18,680
Well, the consensus is that the NV FX cards are a bit shite at flares. After running RTHDRIBL you can see they do seem a bit poop.
But then in Halo, check out the sunlight filtering through the tree branches, and check out Prince of Persia: The Sands of Time and the plethora, nay multitude(!) of effects found within (great game, shame about the bleeping camera).

Both real-game cases where the NV FX cards are doing OK.
(I'm running the 5900 Ultra.) In Prince of Persia at 1280x960, 4xAA, 4xAF, 100Hz... about 40-50 FPS average.

This again makes me wonder exactly how "limited" these cards will actually be in DX9... I seriously doubt there are too many floptimisations going on in Sands of Time, despite it being a TWIMTBP game.

Thoughts please (as long as they are not 1. yes but a 9500/9600 will do it for half price lol, or 2. yes but a 9800 will get 241FPS)

Anyone know if the NV flares/water/smoke/glows/particle effects are full-whack PS 2.0/DX9, or will they be kinda "mixed mode"?

I'm just trying to figure out what sort of experience I will have with this card for the next year or so, 'cos at the mo it's been a great 4 months!!!

............................................
Render times? You'll find me down the pub...
 

cleeve

Illustrious
It's all in the specs.

DirectX 9 calls for 24-bit precision. Nvidia hardware does its calculations in either 32- or 16-bit mode, and the Nvidia drivers default to 16-bit in cases where other true DX9 cards are calculating in 24-bit.

To put this into perspective, I have yet to see a convincing image quality comparison where this is noticeable in a game.
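Just to put a rough number on that, here's a quick back-of-the-envelope sketch (plain Python/NumPy on the CPU, not shader code and nothing to do with the actual drivers) of how much detail survives at 16-bit float versus 32-bit float; the Radeons' 24-bit format sits between the two:

import numpy as np

# Host-side illustration of the storage formats only, not GPU code.
# fp16 (s10e5) has a 10-bit mantissa, fp32 (s23e8) a 23-bit one;
# ATI's fp24 (s16e7) sits in between with a 16-bit mantissa.

x = 1.0 / 3.0
print(np.float32(x))       # 0.33333334  -> roughly 7 decimal digits
print(np.float16(x))       # 0.3333      -> roughly 3 decimal digits

# At larger magnitudes fp16 can no longer tell neighbouring values apart:
print(np.float16(2049.0))  # 2048.0 - the gap between adjacent fp16 values here is 2
print(np.float32(2049.0))  # 2049.0

Whether that coarseness ever shows up on screen is exactly the question, since most of what a pixel shader does doesn't need anywhere near 7 digits.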

A second issue... from what I understand, the GeforceFX architecture is incapable of displaying the high dynamic range of colors that the Radeon 9500+ cards can. This is a very noticeable effect that adds beautiful realism in scenes that have a lot of contrast (i.e. lens flares), but only if the software calls for it, and as far as I know the first commercial game to use this feature will be Half-Life 2.

The third issue is raw shader power, which drives DirectX 9 effects.
The Radeons have very powerful shaders, more powerful than the GeforceFX shaders by far. But recently, Nvidia has released the ForceWare recompiler, which increases shader performance substantially and makes it competitive.

This is the only real question mark with the FX cards, because it's not known whether the recompiler will hold up in upcoming DX9 titles that heavily use shaders... or whether Nvidia can (will have to) optimize their shaders without a noticeable quality loss for each title on a game-by-game basis.

Hope that helps,

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</b></font color=red> <i>(o/c 2400+ w/143Mhz fsb)</i>
<b>3dMark03: <font color=red>4,055</b></font color=red>
 

fragglefart

Distinguished
Sep 5, 2003
132
0
18,680
Nice answer, thanks.
What I'm really hoping for is that image quality will be on par with (or close to) that of the Radeons for shader effects in particular. I am aware of the NV FX hardware limitations, but so far I have not really seen any real-world "actual game" situations where this becomes glaringly obvious (I'm talking IQ, not FPS), and I am pretty sensitive to these things ;)
Perhaps HL2 will be the first title which really makes me want a Radeon, but I am increasingly hoping the NV FX will keep its end up (if not the framerates; 50 FPS is fine for me as long as the graphics are sexy!) just so that I do not get the urge to upgrade again for another year or so.
Come on ATi/NV, I wanna see those new spangly 2004 cards!

Please, please, HL2... run faster than Halo does lol! (If you can play at those speeds, I'm sure we'll cope with Valve's latest and greatest!)

............................................
Render times? You'll find me down the pub...
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
DirectX 9 calls for 24-bit precision. Nvidia hardware does its calculations in either 32- or 16-bit mode, and the Nvidia drivers default to 16-bit in cases where other true DX9 cards are calculating in 24-bit.
<b>Just to be clear, nVidia's fp32/fp16 partial precision mode is within DirectX 9 specifications. You make it seem as though ATi is the only one sticking to what DX9 calls for, when in fact both IHVs do...</b>
from what I understand, the GeforceFX architecture is incapable of displaying the high dynamic range of colors that the Radeon 9500+ cards can.
<b>Please document this...</b>


<b>Athlon XP 2100+ @2.02Ghz
MSI K7N2 Delta-L nForce2 Ultra 400
768mb of Generic DDR266 @310 6-3-3-2
Built by ATi R9800 @420/630</b>
 

kinney

Distinguished
Sep 24, 2001
2,262
17
19,785
GW. A man ready to kick ass or chew bubble gum. And he's all out of gum.

----
Yeah, that's right. I support the NV/AMD/IBM axis of evil.
 

cleeve

Illustrious
Just to be clear, nVidia's fp32/fp16 partial precision mode is within DirectX 9 specifications. You make it seem as though ATi is the only one sticking to what DX9 calls for, when in fact both IHVs do...

Maybe I'm wrong on this one. From what I understood, the DX9 minimum precision is 24 bits. So Nvidia hardware would only be compliant when the FX hardware calculates in 32-bit mode, because 16-bit is beneath the specified 24-bit precision.

When queried about DX9 and GeforceFX hardware, Carmack said:
"Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec." - John Carmack

Based on this and other stuff I've read, I've been under the assumption that the DX9 spec calls for shader calculations to be at least 24 bits. So if I've got the wrong idea about this I'd appreciate some clarification, GW.

Please document this...

To be honest, I said "from what I understand" because I heard this while arguing in favor of the FX 5900 on the Rage3D forums and wasn't totally sure about it. But looking back in context, I think the source might have been asserting that the GFX can't do it in Half-Life 2, which might be a limitation of the FX's special mixed codepath in that game?

If you're looking for a retraction from me: yes, I suppose it was stupid to post this at THG without further research. But like I said, that's why I worded it "from what I understand". Should have known better than to trust a Rage3D'er anyway, I guess.

I'm looking into this more now; if I find anything noteworthy, I'll post it.

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</b></font color=red> <i>(o/c 2400+ w/143Mhz fsb)</i>
<b>3dMark03: <font color=red>4,055</b></font color=red>
 

cleeve

Illustrious
Bump for GW, just wanted to see if you could clarify my mistake as far as the DX9 spec goes dude.

If I'm off base on this one, let me know how...

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</b></font color=red> <i>(o/c 2400+ w/143Mhz fsb)</i>
<b>3dMark03: <font color=red>4,055</b></font color=red>
 
GW's right about the partial precision, but it's the use of 12-bit that's the issue with DX9. How it goes about doing it depends on their use of the run-time compiler. The 12-bit precision level was the real issue, IIRC.

I had also heard of structural limitations around HDR on the current FX line. I'll see if I can find more about that. The Valve relationship I think you're thinking of is <A HREF="http://www.tech-report.com/etc/2003q3/hl2bench/index.x?pg=1" target="_new">THIS one</A>. I have read a more recent assessment from somewhere (like nVnews) which came to the conclusion that it is a hardware limitation, but I won't say anything definite about it until I can find that blurb.

In any case, I'm not sure it will matter much, as those HDR effects are limited to HL2 for the near term, and from what I've heard they are interesting but not overly impressive (from what I remember of initial reviewers' impressions during these early days of HL2).


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

cleeve

Illustrious
Ack... OK, I was way off. Help me out here, because I'm afraid I've been spreading false info like crazy:

From what I've understood up till now, Nvidia has been criticized for having mixed 16/32 precision and ATi has been praised for having 24-bit precision, and I'm pretty sure I have seen some info that talks about ATi's choice to go with 24-bit being a good one because the DX9 minimum spec calls for 24-bit.

I assumed the 24 vs. 16/32 argument was about shader precision.

Where does the FX use 12-bit precision? And what are hardware sites talking about when they talk about 24-bit vs. 16/32-bit precision?

God, you guys must have been biting your tongues at my ignorance for quite a while... help Cleevey out and let me know I'm off base a little earlier next time, eh fellows?

Now excuse me, I've got some reading to do...

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</b></font color=red> <i>(o/c 2400+ w/143Mhz fsb)</i>
<b>3dMark03: <font color=red>4,055</b></font color=red>
 

cleeve

Illustrious
OK, on Tom's Hardware:
<A HREF="http://www6.tomshardware.com/business/20030911/half-life-02.html" target="_new">http://www6.tomshardware.com/business/20030911/half-life-02.html</A>

Mixed Mode for NVIDIA cards

Facing the low performance results of the NVIDIA products, Valve decided to develop a so-called "Mixed Mode" code path for NV3x cards. In this code path the floating point precision is reduced to 16-bit instead of 32-bit wherever it was possible without reducing image quality. Valve stated that the development of that special code path took five times the development time of the standard DX9 code path. Special optimizations for ATI cards were not necessary.

I'm pretty sure this is talking about Pixel Shader 2.0 precision, which is spec'd at 24-bit... isn't it?

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</b></font color=red> <i>(o/c 2400+ w/143Mhz fsb)</i>
<b>3dMark03: <font color=red>4,055</b></font color=red>
 

cleeve

Illustrious
Please document this...

Well, I've found <i>something</i>, GW. But it doesn't really answer the question so much as imply that the dynamic range of the GeforceFX architecture is limited to 16-bit.

But mostly, it asks: "Is 16-bit enough to be called a high dynamic range?"

Some excerpts:

First, we have to discuss the dynamic range. A dynamic range is the ratio of the maximum intensity in an image to the minimum detectable intensity, according to the definition used at NVIDIA’s GDC 2002 presentation...

In a way, it’s best to think of HDR as a film camera with film that can adjust its exposure time dynamically to suit the situation...

Now that’s the $65,535 question! Is Gabe Newell right and 16-bit HDR is inadequate? Yes and no. Perhaps, for Half-Life 2’s purposes, there will be such drastic changes of light intensity that 16-bit has the same relative weakness that 16-bit rendering does...

Taking a look at Doom 3, which has far superior lighting effects when compared to Half-Life 2, it makes me wonder just what exactly Half-Life 2 does that it needs 24-bit HDR. Then again, Half-Life 2 combines outdoor and indoor scenes on the same level, and may require that higher scale.

<A HREF="http://www.firingsquad.com/features/nvidia_editors_day/page5.asp" target="_new">http://www.firingsquad.com/features/nvidia_editors_day/page5.asp</A>

So what I'd like to see is a GeforceFX VS. Radeon screenshot of an outdoor scene that has lots of bright/dark contrast. Then we should be able to see if it makes a real-world difference.
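As a rough check on that "$65,535 question" in the meantime, here's a little sketch (plain Python/NumPy, my own arithmetic, not anything from the article) of the dynamic range you get from fp16 versus a plain 8-bit integer channel, using the ratio-of-max-to-minimum-detectable definition quoted above:

import math
import numpy as np

# Dynamic range = maximum representable intensity / smallest detectable one,
# per the definition quoted from the FiringSquad piece above.

int8_range = 255.0 / 1.0                        # 8-bit integer channel: levels 0..255
fp16_max   = float(np.finfo(np.float16).max)    # 65504.0
fp16_tiny  = float(np.finfo(np.float16).tiny)   # ~6.1e-05 (smallest normal value)
fp16_range = fp16_max / fp16_tiny

print(f"8-bit int: {int8_range:12.0f}:1  (~{math.log2(int8_range):.0f} stops)")
print(f"fp16:      {fp16_range:12.3g}:1  (~{math.log2(fp16_range):.0f} stops)")

So fp16 gives you on the order of a billion-to-one (about 30 stops) against roughly 256:1 for plain 8-bit; whether that is still "inadequate" for what HL2 wants to do is exactly the question the article leaves open.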

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</b></font color=red> <i>(o/c 2400+ w/143Mhz fsb)</i>
<b>3dMark03: <font color=red>4,055</b></font color=red>
 
The way I've always read it, the nV run-time compiler works so that the DX9 32-bit effects are still present, but the functions that do not need such a high level of precision are replaced with lower-precision FP16 or FX12 versions. This results in it displaying what you would see on almost any other DX9-compliant card, but it allows the FXs to perform better than they would using all 32-bit. I do remember reading that the real issue was FX12, as there is NO wiggle room for that under the spec. I'll see what I have bookmarked at home.
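For what it's worth, here's a tiny sketch of how coarse those lower precisions get on an ordinary shader-sized value (plain Python/NumPy; the FX12 helper assumes the usual description of it as 12-bit fixed point covering roughly [-2, 2) in steps of 1/1024, which is my assumption, not something out of the spec):

import numpy as np

def quantize_fx12(x):
    # Assumed FX12 behaviour: 12-bit fixed point over [-2, 2), step 1/1024.
    step = 1.0 / 1024.0
    return max(-2.0, min(2.0 - step, round(x / step) * step))

x = 0.123456789
print(repr(np.float32(x)))    # ~0.12345679 (full FP32)
print(repr(np.float16(x)))    # ~0.1235     (FP16)
print(quantize_fx12(x))       # 0.123046875 (nearest 1/1024 step)

FP16 and especially FX12 are fine for plain colour math in the 0-1 range, which is presumably why the recompiler can swap them in without a visible difference, but they get risky fast for things like texture coordinates or HDR intensities.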

The only thing I have here at work is this interview @ B3D:
<A HREF="http://www.beyond3d.com/interviews/ps_precision/" target="_new">http://www.beyond3d.com/interviews/ps_precision/</A>

It does mention HDR(L) as well.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

Vimp

Distinguished
Jul 13, 2003
358
0
18,780
This is somewhat old info from The Tech Report. However, it may or may not have something to do with what you guys are talking about.
More precision everywhere — The watchword for DX9 is precision, as you might have gathered by now. DX9 calls for larger, floating-point datatypes throughout the rendering pipeline, from texture storage to pixel shaders, from the Z-buffers to the frame buffers. 128-bit floating-point color precision is the most complex color mode, but the DX9 spec calls for a range of color formats, including 32, 40, and 64-bit integer modes (with red, green, blue, and alpha channels of 8:8:8:8, 10:10:10:10, and 16:16:16:16), plus 16 and 32-bit floating-point modes.
The additional precision will, of course, help with color fidelity and dynamic range, but it will also help with visual quality in other ways. For example, depth-handling errors will be dramatically reduced with the addition of floating-point Z buffers. Bump mapping elevations will increase (and quantization error will be reduced) with added precision for normal maps.
The R300's 96-bit precision in its pixel shaders seem to run contrary to the 128-bit color precision available in other parts of the chip. These pixel shaders compute color data in four channels of 24 bits each, or 24:24:24:24 for red, green, blue, and alpha. That's not as good as 128-bits and 32:32:32:32 RGBA, but it's not exactly horrible, either. This is one of those compromises that happen in chip design, but given where we're coming from, I'm having a hard time complaining about 24 bits of floating-point precision per color channel. Still, technically, NVIDIA should have an edge here. I expect ATI to add more precision to its pixel shaders in future hardware. Whether or not it matters in R300 is another story.
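The arithmetic behind those format names is just channels times bits per channel; a quick sketch (my own tabulation, plain Python, not from the article):

# Total bits per pixel = 4 channels (R, G, B, A) x bits per channel.
formats = {
    "8:8:8:8 integer":                        8,
    "10:10:10:10 integer":                   10,
    "16:16:16:16 integer":                   16,
    "24:24:24:24 float (R300 pixel shader)": 24,
    "32:32:32:32 float (full FP32)":         32,
}
for name, bits in formats.items():
    print(f"{name:40s} -> {4 * bits:3d} bits/pixel, ~{2 ** bits:,} distinct values per channel")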
 

cleeve

Illustrious
A good thread indeed. This is going to take some time to digest, but I think I'm getting a clearer picture of the differences between architectures.

Here is some more stuff on the GeforceFX's HDR limitations I got from a learned friend:

NVidia has support for floating point textures and render targets. It's just that they are much more limited: they don't support wrap mode, can't be addressed with uniform coordinates, and have no mipmapping, which means they are NOT compliant with the DX9 specs. They are exposed in OpenGL, but only because OpenGL allows vendor-specific extensions to expose non-compliant hardware. Nvidia supports 4-channel 8-bit integer render targets, which are not DX9 compliant; ATI supports 4-channel 16-bit FP render targets.

Lighting in DX9 requires FP render targets, and let's not forget precision modes either. Although the newer Nvidia drivers expose FP32 (only on NV35 and up chipsets), on NV30/NV31/NV34 chipsets the driver exposes only s10e5, or FP16, which is not enough range for HDR lighting in some cases. Tests have shown that programs that ask for high precision return exactly the same FPS as low precision, which means the driver is 'cheating'... read this thread:

http://www.beyond3d.com/forum/viewt...525&start=0

An easy test is to tell someone with a 9800 and someone with a 5900 to run this demo and compare screenshots of the zoomed lighting (this demo needs high-precision FP render targets, similar to HL2's lighting model). A 9800 should be at least 40% faster than a 5900 here, even though the 5900 is rendering at lower precision and not even rendering everything.

http://www.daionet.gr.jp/~masa/rthdribl/

It seems that GPUs have different precisions all over the board. I didn't know that there were different precisions between GeforceFX GPUs, and from what it looks like here, there are different kinds of render targets that also have different precisions.

Time to do some research on render targets.
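Before I do, here's one last little sketch to make that "not enough range" point concrete (plain Python/NumPy standing in for the shader arithmetic, not real GPU code; the intensity numbers are arbitrary ones I made up, not from any game):

import numpy as np

# Pretend this is a pixel value being accumulated in the shader as several
# bright HDR light contributions get summed up.
contributions = [20000.0, 30000.0, 25000.0]   # arbitrary HDR intensities

fp16_buf = np.float16(0.0)
fp32_buf = np.float32(0.0)
for c in contributions:
    fp16_buf = np.float16(fp16_buf + np.float16(c))   # NumPy may warn about the overflow here
    fp32_buf = np.float32(fp32_buf + np.float32(c))

print(fp16_buf)   # inf     - fp16 tops out at 65504, so the value blows out
print(fp32_buf)   # 75000.0 - fp32 (or fp24) still has headroom to spare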

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</b></font color=red> <i>(o/c 2400+ w/143Mhz fsb)</i>
<b>3dMark03: <font color=red>4,055</b></font color=red>