D]|[ benchmark released at same time as NV40 PR?

This could be a repeat of the FX5900U's debut.

An early release of the D]|[ benchie as per an agreement with nV. Hmmm, wonder if it will show the cards as fairly *cough* equally *cough* as the previous special release benchmark from Id.

I wonder if it will allow the user to select code-paths and such?

Could be interesting. Perhaps start the campaign off with a loaded deck?

<A HREF="http://www.warp2search.net/modules.php?name=News&file=article&sid=16808&mode=thread&order=0&thold=-1" target="_new">http://www.warp2search.net/modules.php?name=News&file=article&sid=16808&mode=thread&order=0&thold=-1</A>


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

pauldh

Illustrious
I think part of the "Agreement" will be that it doesn't in any way show them "equal". :wink:

ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 

splenda20

Distinguished
Mar 2, 2003
422
0
18,780
Doom 3 is turning out to be one of the biggest whores in video game history.
First it was with ATI, showing off the 9700Pro at E3 2002. Next, I guess ID got upset with ATI for sleeping with Valve at E3 2003 and packaging HL with Radeon cards. So the ultimate payback, go sleep with ATI's enemy, NVidia! And now we have ID with NV, HL with ATI and still NOTHING HAS BEEN RELEASED!

Wow, the business of video cards is more dirty than an episode of Jerry Springer!
 

TheRod

Distinguished
Aug 2, 2002
2,031
0
19,780
As you said, Doom 3 really is HYPE!

From what I have seen, the game will look incredible on top systems, but I doubt this game will be fun! Do you know why? Because I actually think that id Software will release this game to show the potential of its 3D engine.

The latest id engine was great in games like MOHAA and RTCW... And I think it will be the same with Doom 3: the best games for this engine will be released in the future.

But there is a big problem... From what I have seen so far, the HL2 engine seems to have better physics. If I were a "Game Studio", I would probably license the HL2 engine to give a better experience to the players.

And, of course, maybe I'm wrong... Because nothing is released yet, and this makes me angry!!!

id and Valve are following the Blizzard path... Long delays before releasing games. I only hope they give us kick-ass games like Blizzard does every time!

--
Would you buy a GPS enabled soap bar?
 

splenda20

Distinguished
Mar 2, 2003
422
0
18,780
Like I said on other posts, I'll wait, and I think everyone else should wait until the game is actually released before saying how good it is.
In regards to the physics, having played the alpha myself, it is a very impressive physics engine and is very realistic. Great ragdoll effects too. What's different between D3 and HL2 is that the physics in HL2 seem to be an integral part of the gameplay, whereas with Doom, it just happens to be there.
 

Xeon

Distinguished
Feb 21, 2004
1,304
0
19,280
It's highly unlikely Doom 3 won't be fun. id Software has built quality engines along with quality games for years.

Physics will also be about the same, less the fine tuning that accompanies John Carmack’s work. I’m not convinced Valve has what it takes to make a good sequel to Half-Life. Last I checked, eye candy is what it is: eye candy.

You are also not a game studio, so your "if I were one, I would use the Half-Life 2 engine" argument has no merit. What you are is a consumer showing bias for an odd reason, which is the way it is, I suppose.

Last I checked, both engines had about the same capability; the only difference is John doesn’t like press shoots like Gabe does.

Blizzard also doesn’t make kick-ass games anymore; it was quite evident with War3. The game was flat and held nothing to StarCraft in the sense of strategy.

It’s been said that id, BioWare, and Blizzard sell games based on their name. They are elite development studios and as such get guaranteed sales with that. But it appears Blizzard was weakened by the recent conflict with its parent company over the sale of the company as a whole, and also by the loss of major founding players. BioWare is restricted from making anything D&D-based due to Wizards’ fear of losing sales in their P&P market. All that’s left is id, and time has shown that they don’t disappoint.

But all this is opinion anyway. I personally believe Doom 3 will be a better game than HL2, due to what Valve has done with CS, Steam, and marketing. But this will be the year that shows if id is still top dog, if NVIDIA can get back to No. 1, and if Valve hasn’t been sitting on their butts doing nothing but hyping something that isn’t up to par.

Xeon
 

TheRod

Distinguished
Aug 2, 2002
2,031
0
19,780
I liked your post. I know I'm not the best at game comparisons, because I don't play games much... I mean, I usually stick to 1 or 2 games at a time for a long time.

By the way, I don't agree with you about Blizzard. WC3 is a great game; I still play it at least once a week over BATTLE.NET and I still enjoy it a lot... And I know why! First, there is a lot of possible strategy and gameplay styles with the 4 races. Second, the "balanced matchmaking" of BATTLE.NET is incredible, you always play with people in your range, that is great!

Player matching is still a problem in today's shooters... I stopped playing CS because of this issue... I was annoyed at always being matched against good players. In shooters I'm not a good player; I would like to play with other not-so-good players, to have a fair chance to win.

I hope HL2/Doom 3 will have great multiplayer capabilities, because that's what makes a game last. I admire Blizzard for this; they know how to retain gamers with the multiplayer portion of their games.

I think in the "war" between id/Valve, the game that lasts longer will be the one that has the best multiplayer components. The best action, player matching, server availability, etc...

So, I'll stop here! And M. Schumacher just won again... F1 is getting boring...

--
Would you buy a GPS enabled soap bar?
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
nVidia is going to own this benchmark/game. They designed a chip around it, and had this game been released when nVidia predicted it would be, people's opinions about them would likely be different today. nVidia's UltraShadow isn't just a marketing gimmick (well, it is, but) it's a useful feature as far as this game is concerned. nVidia's programmers wrote an extension called GL_EXT_depth_bounds_test that works like this: you provide two reference values (min, max) which form a depth range. For every pixel/sample, the depth value in the depth buffer is checked against this range, and if it's outside, the pixel/sample is discarded.
In plain English, this is a bandwidth-saving procedure for stencil shadows. Just like smoke and explosions eat bandwidth and choke video cards, so do shadows, and nVidia has put a lot of effort into designing a card that can muscle through this. Radeons do not support checking the depth buffer contents against a reference range, so they'll run through shadows slower, almost like a ship without radar (the R3x0 only stores one value per tile in the hierarchical Z buffer).
If Carmack had programmed all cards to run the same ARB path (which he couldn't have anyway), ATi would own this game, but with nVidia running at only half the precision of ATi, taking advantage of their huge library of proprietary OpenGL extensions, and having the benefits of UltraShadow, it's not going to be much of a competition IMHO...
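The test described above can be sketched in a few lines. This is only a software simulation of the per-pixel check; in the real extension the range is set once with glDepthBoundsEXT(min, max) and the comparison runs in hardware before any shading or stencil work happens:

```python
def depth_bounds_discard(stored_depth, zmin, zmax):
    """Simulate the EXT_depth_bounds_test check: the value already in
    the depth buffer at this pixel is compared against the app-supplied
    [zmin, zmax] range, and the incoming fragment is discarded (never
    shaded or stencil-tested) when the stored depth falls outside it."""
    return stored_depth < zmin or stored_depth > zmax

# A shadow volume whose light can only touch geometry in a known depth
# range skips every pixel outside that range, saving fill/bandwidth:
depths = [0.10, 0.35, 0.60, 0.95]   # example depth-buffer contents
kept = [d for d in depths if not depth_bounds_discard(d, 0.30, 0.70)]
```

Here only the two samples inside the 0.30-0.70 band survive to the stencil pass; that early rejection is the whole bandwidth story.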


<A HREF="http://rmitz.org/AYB3.swf" target="_new">All your base are belong to us.</A>
<A HREF="http://service.futuremark.com/compare?2k3=2106920" target="_new"><b>3DMark03</b></A>
 
Dice aren't the only thing that get loaded, but I guess you're too young for that. :tongue:

It was simply an off the cuff reference off the cusp of my keyboard. But most people get the picture even if it wasn't crystal clear. :lol:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
Nah, nV and id have already shown that they don't use the standard path for their benchmarking. HL2 ran the standard unoptimized path for its benchies; id and nV used the nV-centric codepath which uses partial precision. Even Carmack admitted that the standard path would show the FXs as SLOWER, not faster, than the Radeons, and that even with the special code path the FXs take only a slight lead over the Radeons.

So in reality, the HL2 benchies are much more accurate and reliable (and show the same performance issues reflected in many other unoptimized games and benchies), whereas the D]|[ benchies use a bastardized code path (1 of 4) that cripples all other hardware in comparison; hardly a good benchmark for the hardware, and hardly a good benchmark for the game if you either can't choose the codepath or there is only one provided in this early demo release.

But then again, I'm not surprised you had a problem with the HL2 benchies; they simply backed up what all the previous benchies had been indicating: poor pixel shader strength in the FX line of cards, which you love so much.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
the only difference is John doesn’t like press shoots like Gabe does.
Sure he doesn't, SURE!

I've seen Carmack's face plastered all over TV in the last year, be it for D]|[ or the X-prize, whereas I've only seen Gabe on the .net and only at tailored events.

I'll wait for the games to ship to give my final word, they both look promising, but Carmack is far from the wallflower you make him out to be.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
Yeah, I think so too; heck, UltraShadow is supposed to be that 'gotta have' feature, which was basically only available on nV, until the R9800 series decided to extend their capabilities (whether it will be able to work is another story). I just wonder if there is an equivalent extension for ATI. I have the library list from the OpenGL forum, but I don't see an equivalent (no description of the extension in the list I have).

UltraShadow itself I don't think is so much the issue as it is sold (the R350/380/RV350/360 all have similar hardware capability); if it's only nV that has anything similar to _depth_bounds, then that would be a big advantage, much more so than UltraShadow itself. ATI has GL_ARB_fragment_program_shadow, which was only recently added (approved Dec 2003), so I don't know much about its functionality (not much talk, and I haven't looked into it). Admittedly I don't follow the developer side of things that closely (I just tend to read the release notes [for 1.5 and preview 2.0] and look for new stuff they are trumpeting).

From <A HREF="http://oss.sgi.com/projects/ogl-sample/registry/EXT/depth_bounds_test.txt" target="_new">the minimal description</A> in the extension lists, they don't seem to be too similar, as the nV one does seem to specifically bound and discard shadow fragments, whereas the ARB ones mention lighting in their targeted definitions. I don't think ATI's will do much discarding. I would say that ATI's hardware is definitely up to the task, as their results in most occlusion culling tests are far greater than nV's, but that exclusive extension will make a huge difference if the large number of shadows in D]|[ do overwhelm the VPU with extraneous instructions. It should be an interesting comparison.

Interesting that HP may indeed own the intellectual property behind the extension (from what little else I found about this at this late hour with a head full of ski/beer), as that may leave it open to ATI, unless nV has locked HP into a deal (more money for HP if they didn't, I would think). Interesting that the extension is no longer NV_XX but EXT_XXX (as described in that brief run down), doesn't that mean it's no longer proprietary?

It will be interesting to see if the partial precision is even close to being noticeable, I doubt it in this case.
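For a rough sense of what partial precision costs, you can round-trip a value through IEEE-754 half floats (Python's struct module supports the 'e' half-float format). This only illustrates FP16 rounding in the abstract, not the FX pipeline itself:

```python
import struct

def to_fp16(x):
    # Round-trip a double through IEEE-754 half precision, roughly
    # what a partial-precision hint costs a single shader value
    return struct.unpack('<e', struct.pack('<e', x))[0]

full = 1.0 / 3.0
half = to_fp16(full)
error = abs(full - half)   # small per value, but shader math chains
                           # many operations, so error can accumulate
```

The one-shot error is tiny relative to 8-bit output, which is why a short shader may look identical at partial precision; long dependent calculations are where the difference could start to show.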

I agree that the competition would favour nV, but I just hope they don't exacerbate the issue by only providing the nV-centric codepath in the demo, like they did with that demo/benchie they provided at the FX5900's release. If they win, let it be a clean win for a change. Like you say, they definitely have the tools for it.


<b>EDIT</b>; three other extensions I'd like to read up on (when I'm more able to) to compare are;

ARB_shadow
and
ARB_depth_texture
and
ARB_shadow_ambient

But for now, I wouldn't be surprised if it's currently mainly nV. I really need more info on that new ATI extension and what it does. Here's nV's take on it (read the caption);
<A HREF="http://www.nvidia.com/object/nzone_doom3_home.html" target="_new">http://www.nvidia.com/object/nzone_doom3_home.html</A>

Still I don't think all the extensions in the world will help the FX5200 series. :evil:

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
<P ID="edit"><FONT SIZE=-1><EM>Edited by TheGreatGrapeApe on 03/07/04 04:46 AM.</EM></FONT></P>
 
Oh yeah, here's an old article I bookmarked about UltraShadow and D]|[, 'cause it seemed to do a good job of describing it. Of all the ones I had bookmarked, it's the only one that mentions the <i>EXT_depth_bounds_test</i> extension (well, Digit-Life may have, but they seem to have been down the last 2 days [can't get my links to them to work at all]).

<A HREF="http://www.hothardware.com/hh_files/S&V/ushadow_doom3.shtml" target="_new">http://www.hothardware.com/hh_files/S&V/ushadow_doom3.shtml</A>

Now gonna go play some UT2K4 and go to bed. MMmm Avril Living! Lock On Dudes! :cool:

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Grape, the last link you provided is dead at the moment, but I'm going to keep checking it. Dave B said that nVidia lifted the licensing restrictions on their fragment program extensions a little while ago, because they were taking flak for it. Others, though, had commented in the same thread that even if ATi had licensed that particular extension for their own use, they couldn't implement it in hardware; their chips do not have the functionality (an actual explanation was given, but admittedly, it went over my head).

Interesting that the extension is no longer NV_XX but EXT_XXX (as described in that brief run down), doesn't that mean it's no longer proprietary?
I'm glad that you brought that up, because it confused the fck out of me through the discussion I was reading. That particular extension was talked about as being written/owned by nVidia, but it's lacking the "NV_" prefix. I personally do not believe nVidia is into writing extensions and not keeping them for themselves, or freely turning them over to the ARB for all to use as they please. I'm going to ask somebody about that, methinks.

** P.S.**
Avril is jailbait :smile:


<A HREF="http://rmitz.org/AYB3.swf" target="_new">All your base are belong to us.</A>
<A HREF="http://service.futuremark.com/compare?2k3=2106920" target="_new"><b>3DMark03</b></A>
<P ID="edit"><FONT SIZE=-1><EM>Edited by GeneticWeapon on 03/07/04 01:37 PM.</EM></FONT></P>
 

Xeon

Distinguished
Feb 21, 2004
1,304
0
19,280
That doesn’t make much sense; all the other paths run fine on the Radeon. The need for the special path was to allow partial precision on the FX. As explained <A HREF="http://www.webdog.org/plans/1/" target="_new">here.</A>

There are no underlying issues other than that Carmack is probably an NVIDIA fan. They do support OpenGL very well, and we all know he’s an uber OpenGL guy.

I have also never seen him on TV, so I guess it’s down to what you claim you have seen. I know he spends a great deal of time in Japan selling the engine technologies to Japanese studios.

But your logic is like saying a P4 shouldn’t be allowed to use an SSE2-compiled code path for testing purposes. It’s the same with the NV30 path. It doesn’t change the end result; the path just allows NVIDIA to get a leg up given the fact they built an iffy core. It’s not cheating unless the end result is skewed by "special" optimizations.
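The SSE2 analogy can be sketched as a runtime dispatch over the extensions the driver advertises. This is an illustrative toy, not id's actual selection logic; Carmack's .plan describes renderer paths named ARB, ARB2, NV10, NV20, NV30, and R200, and the preference order here is assumed:

```python
def pick_render_path(extensions):
    """Choose the most capable back end the hardware advertises,
    the way a CPU dispatcher falls back from SSE2 to plain x87."""
    if 'GL_NV_fragment_program' in extensions:
        return 'NV30'   # vendor path; may use partial (FP16) precision
    if 'GL_ARB_fragment_program' in extensions:
        return 'ARB2'   # standard path, full precision throughout
    if 'GL_ATI_fragment_shader' in extensions:
        return 'R200'
    return 'ARB'        # lowest common denominator

# A Radeon 9800 advertises only the ARB extension here, so it lands
# on the standard full-precision path:
path = pick_render_path({'GL_ARB_fragment_program'})
```

The whole argument in this thread is about that first branch: whether taking the vendor path with partial precision produces the "same end result" as the ARB2 path it bypasses.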

Hmm, it’s actually starting to look like I responded to the wrong post; oh well, I said my 2 cents either way.

Xeon
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
psh... why do u think Carmack is trying to build himself a rocket? So now he can have his face in space! This guy tunes Ferraris for fun! So he'll be well known as the rich-ass programmer!

anyways.... nowadays benchmarks don't mean much anymore; I only see them as a reference, and building a flame war upon them is stupid (except when NV40 and R420 are released). Personally I don't give a F*** if they used optimized paths or codes, as long as it runs smoothly on my comp with no IQ loss. WHO CARES?! So let's hope nVidia puts some kickass shaders and instructions in NV40......... pretty please?

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy
never tried to go crazy when it comes to o/cing.
THGC's resident Asian and Graphics Forum's resident nVboy :D
 

JoeB

Distinguished
Aug 8, 2003
139
0
18,680
<A HREF="http://www.hothardware.com/hh_files/S&V/ushadow_doom3.shtml" target="_new">http://www.hothardware.com/hh_files/S&V/ushadow_doom3.shtml</A>
working link... somehow there was < b r > (without the spaces) added to it.

_____________
whompiedompie
 

eden

Champion
NV40 is gonna be a Riva 128 core with 175 million more transistors specially made for more efficiency. :lol:


PS: I want your pic on the THGC album damn you.

--
<i>Ede</i>
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>More updates and added sites as over <font color=red>62</font color=red> no-lifers have their pics up on THGC's Photo Album! </b></font color=blue></A> :lol:
 

eden

Champion
*responds to stimuli*
Someone here mentioned Avril Lavigne?

--
<i>Ede</i>
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>More updates and added sites as over <font color=red>62</font color=red> no-lifers have their pics up on THGC's Photo Album! </b></font color=blue></A> :lol:
 
You are missing the points completely.

First, you haven't seen Carmack on TV, yet I'm sure anyone with TechTV in their home has, even if the name meant little to them.

Second, it's not a question of extensions, it's a question of quality/speed. If the partial precision does an equal job, then good on them; otherwise it's not a good comparison, now is it? If the benchmark is to mean anything other than "oh, my computer can play D]|[" (which a lot of cards can, thanks to the multiple paths), then fine, the FX series will likely get very close to the GF4Ti in speed. But then again, I guess it doesn't matter that the same level of fragment and vertex shaders won't be shown, 'cause it's all the same, right?

The point is that this will likely be used as nV PR and not actually to determine minimum requirements.

The main thing that annoys me is that these special optimizations (the FX has its own path) take time and resources away from releasing the games. How many anticipated games would be out sooner without the need to code for just ONE line of cards?

In the end the consumer loses, because it doesn't support ALL the mfrs out there, and it delays the game.



- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
Thanks for fixing the link, I just cut and pasted it, but I guess something got FUBAR'd!


<b>CoolC</b>, see my statement about delays and a slanted marketplace. That's why I don't like it. It amounts to making all the other cards slower to favour one card. Now you might say it just makes one card faster, but why not optimize for all cards by using a... hmmm, let's see, maybe a STANDARD. I hate EA doing it, and I'd hate it even if it were ATI. It's basically screwing everyone else. HL2 is nowhere near the same scenario.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

Xeon

Distinguished
Feb 21, 2004
1,304
0
19,280
First, you haven't seen Carmack on TV, yet I'm sure anyone with TechTV in their home has, even if the name meant little to them.
Could be the small fact I don’t watch TV, since it’s got nothing but re-runs and soaps on :smile: .

The point is that this will likely be used as nV PR and not actually to determine minimum requirements.
Don’t see me arguing there, but there must be a reason id is standing behind NVIDIA other than PR, wouldn’t you think?

The main thing that annoys me is that these special optimizations (the FX has its own path) take time and resources away from releasing the games. How many anticipated games would be out sooner without the need to code for just ONE line of cards?
Then why bother having more than one card manufacturer if they can’t take the path less traveled in their engineering? Why bother trying to enhance anything if everyone has to support it in their hardware before software will support it?

Custom paths will always be here; they are evident in CPUs, and you accept that. So why is it so hard to say: hey, NVIDIA decided to try something new, for better or for worse, and then find out it works well with this software but not with that, and yada yada.

As far as I am concerned, we should all be happy that they are still trying to innovate instead of just running with the tried and tested stuff. But that’s just my opinion; obviously yours is different, due to your distaste for NVIDIA’s "special" optimizations.

You’ve just got to look at it this way: even if it’s bad engineering, it’s still innovative, and that’s what this is all about in the end, who can be more clever with the 0's and 1's. Well, that’s what it is to me; it makes it all more exciting.

Xeon