Small observation on the FX5600 in DX8 for HL2

eden

Champion
Yep, yet another HL2 thread. ROFL, out of the 25 threads on the main page, there are very few actually asking for help!

Guys, if the drivers DO improve on the pipe architecture for the FXs, just how much can we expect? And if that improvement is meant to boost PS2.0 strength, what about DX8.1 mode, where the FX5600 already has a hard time performing?

What I am saying is: if the Det50s improve DX9 performance, but the FX5600 already has trouble managing HL2 in DX8, then just how much can we expect it to gain in DX8 performance?

Seems to me the FX5600 Ultra is pretty doomed there.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

bandikoot

Distinguished
Apr 23, 2003
423
0
18,780
Logically one would expect an increase in the FX5600's performance. I haven't bothered to look up any numbers, but simple economics dictate that your average computer user or enthusiast on a budget is more likely to have a mid-range card than a $400 one in their computer. Not targeting an increase for that market is kinda like shooting oneself in the foot. No?
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Looking at the performance of the 5600 in HL2, it seems like it's closer to the 5200 Ultra in performance than it is to a 5900 Ultra. Right now, the only FX card I expect to even have a chance at DX9 is the 5900 Ultra, which could compete with ATi's mid-range 9600 Pro if it gets something like a 20 - 30% boost. Such a boost could be possible, but EXTREMELY unlikely without cheaptimizations enabled.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

eden

Champion
Hehe, considering what's been going on, I think nVidia doesn't have feet anymore! :lol:

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

sargeduck

Distinguished
Aug 27, 2002
407
0
18,780
Notice how every Nvidia driver since the FX line came out exists to patch up old cheats and install new ones?

As each day goes by, I hug my 9600Pro just a little tighter.
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
It's gonna be a continuous, never-ending cycle of cover-ups. This reminds me of the "liar" analogy. You remember that standard stereotype of a liar? A liar will lie once. He will soon lie again to cover up his first lie. And he'll continue to lie endlessly until he forgets some of the things he's lied about.

I tell you, this is REALLY gonna backfire on Nvidia. BIG TIME!

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

endyen

Splendid
Nvidia is out to screw Valve. They pretended to help optimize HL2 for their cards. Perhaps what they did had the opposite effect. At the same time, they were developing Det 50 drivers that would play the game well. So what's the difference, so long as the game plays well? Now the HL2 engine sucks on Nvidia cards. Each game will need special Nvidia drivers to work well. The engine just lost most of its value. It also means fewer good games for us. I hate Nvidia again.
 
The engine just lost most of its value.
Actually, that has little to do with the game engine's value; it's really about the value of nV's hardware.

The game engine still looks revolutionary based on the preview demos (especially E3's), and games will still be created on it (some already have been, based on the early version).

The engine itself offers too much to be abandoned, and everyone knows that nV will HAVE to do something in their next-gen cards to address this issue, not just 'fl-optimize' (see, UFO, I can make one up too! :tongue: ). Games kill CARDS, not the other way around; well, except for that errant SiS HSF that killed Duke Nukem, for which we will get Revenge! Ba$tardz! :eek:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Actually I was wondering the same thing as Endyen. It seems like game developers would select an engine that will both run and look good on most gamers' systems. John Carmack, in order to make the DOOM ]|[ engine run well on everybody's system, had to cheaptimize/floptimize for Nvidia cards with the vendor-specific instruction sets we've discussed a lot. But DirectX doesn't have the advantage of being an open standard the way OpenGL does, so there was less chance of getting nVidia cards to run well. It seems like game developers would choose Doom ]|[ over Source because of this. I personally would have preferred the Source engine, because in many ways the mod community is, to an extent, almost synonymous with Valve's Half-Life. I would really be disappointed if the mod community flocks to the Doom ]|[ engine. I don't know why for sure, I guess it's just me and my bias towards Half-Life. :wink:

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
I think a lot of it will have to do with gameplay. The game physics in HL2 and their interactions with other parts of the environment are supposed to be the most revolutionary part of it. Those aspects will still be there on the nV cards, and REALISTICALLY it ONLY affects the FX line, and even that line will be short-lived and has been undersold due to the recent successes/fortunes of ATI. I think the FX performance factor will have minimal impact on the adoption of the game.

The main thing may be which one gets fully into the development market first. Remember that FAR more people have non-FX cards than have them; heck, the FXs are likely less than 10% of the gaming market. So for those people, too bad so sad, and even the FX5200 (and basic 5600) people had to know there was 'no REAL DX9 for you, little boy'. I doubt it will have much impact on the development market other than what we have already seen: the potential for spending 5X as much time floptimizing for a dead-end architecture (the NV40 will really have to be different). But in the end it really depends on the consumers.

Anywhoo, as always, that's just my two frames worth.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

eden

Champion
Honestly, the maniac HL players who probably got screwed with FXs won't care. HL2 is still something major, and returning the game because it runs badly won't be much of an issue. They knew that new games ask a lot, so they'll just figure it asks this much of any card.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

endyen

Splendid
But as a game developer, would you buy the HL2 engine, or wait for the Doom3 engine? After all, the D3 engine will give you a 10% greater potential market, and you will see fewer games returned because of unplayability. If you were Gabe, would you be worried about this?
 

spud

Distinguished
Feb 17, 2001
3,406
0
20,780
From a business point of view, yes. Also, going by Valve's track record, they didn't license out Half-Life's original engine much. It just got the mod community's attention due to the excellent SDK they made.

-Jeremy

:evil: <A HREF="http://service.futuremark.com/compare?2k1=6940439" target="_new">Busting Sh@t Up!!!</A> :evil:
:evil: <A HREF="http://service.futuremark.com/compare?2k3=1228088" target="_new">Busting More Sh@t Up!!!</A> :evil:
 
But the thing is that it's still not unplayable on the FX; it's just not as 'sexy' as full DX9. So I don't see why people think this is a bad thing for the game; it's really just a bad thing for the hardware, IMO. In any case, I think it will likely give people pause. We shall see. I still think the markets that WOULD have bought the engine in the first place (those focused on games that need certain physics or lip/dialogue aspects) will still go with that choice. Likely any mod on the HL2 engine will arrive to market AFTER the NV40, so there will always be that option for nV players, and for all the other FX players those games will be like playing any DX8 title. So it's not unplayable by a long shot. It's really like having built-in 'Get in the Game' technology, similar to the 'The Way It's Meant to be Played' stuff that only appears on nV cards.

We shall see.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

mulletkid

Distinguished
Dec 21, 2001
106
0
18,680
As I recall, the original Half-Life engine was a modified Quake 2 engine. Therefore Valve could not license it out, because it wasn't theirs.
 

kinney

Distinguished
Sep 24, 2001
2,262
17
19,785
Doesn't the rest of the FX series besides the 5900 perform around what the GF4 does in DX8?
If so, then they will be OK.
There are so many GF2s out there that they wouldn't dare alienate the DX8 crowd. And the most populous of those are the GF4 people.

So roundabout GF4 performance will be OK for HL2. And NV's slower DX9 cards should, IMHO, run in DX8 mode.

I accept NV's excuse that every card is different and needs its own code paths, but this is ridiculous; you should at least be able to half-ass your way to somewhere near what your competitor is doing in generic DX mode!

Instead, Dawn is taking it in her stinkhole, bent over ATI's Viking warlord feasting table.

The 5900 has next to no chance of running full DX9 mode in any DX9 game while keeping pace with the Radeons.
Looks like mixed mode it is for the 5900, and that, IMO, is only to save face.

Might as well run all the FXs in DX8 mode, if it were up to me.

Oh, sorry about the details about Dawn, just a small fantasy of mine. LOL

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Radeon 9800NP, Audigy, Z560s, MX500
 

eden

Champion
Doesn't the rest of the FX series besides the 5900 perform around what the GF4 does in DX8?
No, the FX5600's benchmarks in DX8 mode were also weak.
<A HREF="http://www.anandtech.com/video/showdoc.html?i=1863&p=8" target="_new">http://www.anandtech.com/video/showdoc.html?i=1863&p=8</A>

FAR from what I would call playable.

Hence why I was wondering: if the Det50s only affect DX9 shaders, then DX8 is unaffected and the 5600 will likely still suck badly, even in DX8. I could just be confused though. We need these Detonators ASAP.

Also, this is where I think you too should just give nVidia one last chance for the DX9 battle before writing them off completely as cheap bastards (for the DX9 era, again!) who screwed customers. That one last chance being: the Det50s better be honest, and better save all the FXs.
I know I am giving them this final one. Otherwise, from now until a new NV core arrives that is honest and actually performs GREATLY with the DX9 functions at full throttle, I am just gonna scream and curse at them. Everyone should join in, it'll feel good. :wink:

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A><P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 09/15/03 10:34 PM.</EM></FONT></P>
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Yeah, that's funny. The FX5200 and FX5600 seemed closer together in performance than the 5600 & 5900 did.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

kinney

Distinguished
Sep 24, 2001
2,262
17
19,785
Thanks for the linky.
I swear the benchmarks from different sites vary... but I'm probably imagining things.
I just thought the 5600 Ultra did a TAD bit better than THAT!
What a disgrace.

Yet they seemed pretty good in older DX7 games.

I hope it doesn't take that long for them to run as well in DX8/9 games as they do in DX7 games (or in anything that isn't lenient towards GeForces in the first place, like Q3A and fixed-function shaders).

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Radeon 9800NP, Audigy, Z560s, MX500