DX9.1 + Dets 5x.xx = NV saved?

fragglefart

Distinguished
Sep 5, 2003
132
0
18,680
Well, you have seen the title. Does anyone with technical expertise know of any reasons why Nvidia's FX line may be saved by DX9.1 and the new Dets? I have seen rumours of 60% performance gains, which seem a little over the top, but does anyone have any exact knowledge?
Cheers, from a 5900U user who wants more FPS :)

............................................
Render times? You'll find me down the pub...
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
How would DX9.1 help the FX cards out? These are underdeveloped DX9.0-based cards.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

speeduk

Distinguished
Feb 20, 2003
1,476
0
19,280
ATI sucks, what you on about?!

<A HREF="http://service.futuremark.com/compare?2k1=7000747" target="_new"> 3D-2001 </A>
<A HREF="http://service.futuremark.com/compare?2k3=1284380" target="_new"> 3D-03 </A>
<font color=red> 120% overclocker </font color=red> (cheapskate)
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Okay, I'm going there now.

EDIT: Oh, & before I go, I had to point out that the name "nvnews" sounds just as fishy as "amdzone". I would feel the same way about any reviewer that included part of a brand name in their website name.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!<P ID="edit"><FONT SIZE=-1><EM>Edited by UFO_WARVIPER on 10/16/03 04:39 PM.</EM></FONT></P>
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Okay, there are like 50 gazillion articles there. I scanned over the list & did not see a thing pertaining to DX9.1 offhand.

<pre>(Maybe I just need a good skull bashing)</pre><p>My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
You got it wrong: it's DX9.1 - Nvidia = DX1.1, since Nvidia runs all of their cards on the DX8.1 standard.

Nvidia is going to have to scrap the whole project and come out with a different architecture, or they will not be able to utilize anything in their current line of cards. It's like every graphics vendor said, 'hey, let's go camping', and Nvidia was the only one that didn't pack anything but an inflatable turd... and god knows what good that will do them. But everyone else packed all of their DX9 camping gear and is living comfortably into the future... I feel bad for them... I loved my GF2 GTS... and still run that overclocked puppy.

----------
<b>I'm not normally a religious man, but if you're up there, save me, Superman! </b> <i>Homer Simpson</i>
 

Snorkius

Splendid
Sep 16, 2003
3,659
0
22,780
If Nvidia sucked long and hard on Mr. William Gates' willy, then MS could, theoretically, adopt Nvidia's standards as their own DX9.1 standards.

<font color=blue>
I will not add another word.
Horace </font color=blue>
 

fragglefart

Distinguished
Sep 5, 2003
132
0
18,680
GW, I know this is a pro-ATI forum; that's why I ask the guys here what they think. I'm not trolling, or after a flaming, but want to see what you guys say.
I pop into NVnews already, because they are quickest with new Det details and releases of betas etc., but for hardware understanding, this forum is better.
UFO - with DX9.1, there is supposed to be better handling of Nvidia's hardware, especially regarding Pixel Shader 2.0 (very vague, I know, but then I'm not a hardware manufacturer, and I chose the NV 5900U over the ATI 9800P, doh! Mind you, that was in the DX8 days.)
Anyway, a 60% performance increase sounds a bit OTT to me, so what do you reckon?
I am currently using the Det 52.13 betas and can confirm they have improved DX9 performance and image quality, although I believe that was at the expense of AF *sigh...*
But, I say again, I want this card to perform better, so what do you think of the DX9.1 rumours?
Here is a link to the thread: <A HREF="http://www.nvnews.net/vbulletin/showthread.php?s=b0a0a1cdef42309471d0e4307852292a&threadid=19613&perpage=25&pagenumber=2" target="_new">nvnews link</A>
Don't have a go just because I chose NV; the card is still fast as hell, and I run 3DS Max as much as games.

............................................
Render times? You'll find me down the pub...
 

Thor1182

Distinguished
Sep 2, 2003
36
0
18,530
It is possible to get a huge gain from a change in the software that deals with the hardware.

Ever compare the running times of bubble sort to quicksort? On my PIII 1.2, sorting 5,000 random numbers, bubble sort had a 1793 ms run time vs quicksort's 0.0 ms run time. The 0.0 means that it sorted faster than the timer could resolve. If you multithread the quicksort and have a multi-CPU computer, the threaded version can run all over the normal quicksort in run time. For you CS people, that's the difference between an O(N^2) algorithm and an O(N log N) algorithm. (Hang with me, I'm going somewhere with this.)
Here's a link comparing sorting algorithms: http://www.cs.rit.edu/~atk/Java/Sorting/sorting.html

Now scale this example up to drivers and hardware. NVIDIA made their cards for their Cg language because DX9 wasn't out yet. When DX9 came out, NVIDIA had to quickly change their overhead to convert DX9 to their Cg. Pressed for time, a lot of their code might have been the bubble-sort kind, and when the goal is to generate frames quickly, bubble sort is a bad way to go. The Det 50.xx could be the transition to quicksort-style code. Along with the new DX9.1, we could have the code properly written to use the whole chip, and not just part of it (like threading functions to use multiple CPUs).

Anyone who has tried to write threaded code, or the more complex but quicker sorting algorithms, knows that it takes time to get right, but when done right, huge performance gains can be made.

Sooo... 60% is possible, and if anyone can do it, MS and NVIDIA can. I just want my NV 5600 to be able to play Doom III and HL2.
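The bubble-vs-quick comparison above is easy to reproduce yourself. Here is a minimal sketch in Python (the original test's language and setup aren't stated, so this is just an illustration of the same O(N^2) vs O(N log N) gap, using Python's built-in Timsort as the fast sort):

```python
import random
import time

def bubble_sort(items):
    """O(N^2): repeatedly swap adjacent out-of-order pairs."""
    a = list(items)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

data = [random.randint(0, 100_000) for _ in range(5_000)]

t0 = time.perf_counter()
bubble_result = bubble_sort(data)
t_bubble = time.perf_counter() - t0

t0 = time.perf_counter()
fast_result = sorted(data)  # Timsort, O(N log N)
t_fast = time.perf_counter() - t0

print(f"bubble: {t_bubble * 1000:.1f} ms, built-in: {t_fast * 1000:.1f} ms")
```

On 5,000 elements, the quadratic version loses by a few orders of magnitude; the exact numbers depend on the machine, which is why the "0.0 ms" above just means "below timer resolution".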



I'm a Computer Science and Electrical Engineering Student, not English, nor will I ever claim to be.
 
Balderdash brought this up about a month ago.

<A HREF="http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=367915#367915" target="_new">HERE</A>'s the Post.

I'm still not convinced until it actually happens, and someone can provide screenies.

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

euphoria651

Distinguished
Nov 12, 2003
19
0
18,510
So Nvidia has released new drivers that improve performance with DirectX 9.0? And what's all this about DirectX 9.1? As yet, DirectX 9.0 is only just beginning to come into games, so why DirectX 9.1? Will this new driver save the FX5600 and the rest of their cards?


Judge me all you want; just keep the verdict to yourself!!!
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
"And the Whole in mode of precision 32bits on the floating ones in the PS2.0 please mister!"

lmao

Me: are you saying I can't provide?
Me: cause I know I can provide.
Me: oh and I can provide money too;)
Rachel:): why do we need money when we can just stay in our room and have sex all day?
 
The only possible benefit would be using both pixel pipes to their max in the TOP-end cards, not the FX5600/5700.

I don't expect much, because like WS said, you still have a major issue with the speed of 32-bit vs 24-bit precision.

I think that a lot of the benefits expected from DX9.1 are being derived right now from the ForceWare drivers.

But of course that's just a guess until we actually see it happen.
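For reference, the "32 vs 24" here is shader precision: NV3x runs PS2.0 math at FP32 (or FP16 in partial precision), while ATI's R3xx hardware uses FP24. A rough sketch of what the narrower format costs in precision, assuming a 16-bit mantissa for FP24 versus float32's 23 bits and ignoring the exponent-range difference:

```python
import struct

def quantize_fp24(x):
    """Truncate a float32 value to 16 mantissa bits, mimicking an
    FP24-style format (exponent range differences ignored)."""
    bits = struct.unpack('>I', struct.pack('>f', x))[0]
    bits &= 0xFFFFFF80  # clear the low 7 of 23 mantissa bits
    return struct.unpack('>f', struct.pack('>I', bits))[0]

pi32 = struct.unpack('>f', struct.pack('>f', 3.14159265358979))[0]
pi24 = quantize_fp24(pi32)
print(pi32, pi24, abs(pi32 - pi24))
```

The error per operation is tiny (around 1e-5 for values near pi), but it compounds over long shader programs, which is the usual argument for higher precision; the flip side is that FP24 units are cheaper and faster in silicon.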


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
G

Guest

Guest
Captain Obvious points out that DX9.1 - Nvidia(8.1) = DX1.0 (not DX1.1)!

<b><font color=red>Captain Obvious To The Rescue!!!</font color=red></b>
 

tombance

Distinguished
Jun 16, 2002
1,412
0
19,280
lol, just what I was thinking.

<A HREF="http://service.futuremark.com/compare?2k1=6752830" target="_new">Yay, I Finally broke the 12k barrier!!</A>
 

flamethrower205

Illustrious
Jun 26, 2001
13,105
0
40,780
Hehe, dude, that doesn't seem quite fair... 1793 vs 0? If that were true, the runtime difference would be something like O(N^2) to O(1), lol. What were the dataset sizes, how close to sorted was everything, what other processes were you running, etc.?

The one and only "Monstrous BULLgarian!"
 

flamethrower205

Illustrious
Jun 26, 2001
13,105
0
40,780
This is totally based on that article about how NV got screwed when they tried to do their own thing. If that is the case, and DX9.1 is tailored more to their GPUs, then that kind of increase may be attainable. Again though, this may be a whole load of sh!te :)

The one and only "Monstrous BULLgarian!"