HL2 Video Stress Test

pauldh

Illustrious
Forgive me if this has been posted, but Firing Squad has a <A HREF="http://www.firingsquad.com/hardware/half-life2_vst/" target="_new">little review</A> using the new HL2 video stress test. They liken it more to a synthetic benchmark that shows off what the engine can do than to a performance test of gameplay.

Anyway, the X800 is doing to the GF6800 in this test what the GF6800s did to the X800 in Doom 3. But we'll know more once custom timedemos of real gameplay come out.

Also noted: the older ATI cards (9700 Pro, 9800 Pro, and 9800XT) are doing very well in this HL2 test. Hopefully that will be the case with actual gameplay too.



ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
Is it me, or do images 13 and 14 show the 5950 with MUCH better image quality than the 9800XT?

Great performance on all cards, but I bet anything we won't be seeing X800XTs with 4X AA and 8X AF playing at 1024*768 at 120FPS. More like 60, I'd bet. Nothing looks intensive from what I see, anyway. (Were there any characters or monsters on screen at all?)

EDIT: Holy crap, did anybody notice the major difference between the 128MB and 256MB Pros? If there's no clock speed difference, then this game seems as taxing on video RAM as Doom III! And the XT's mere 10% higher clock speeds are making it beat the 9800 Pros by much more than 10%. Is that even normal?!
Oh, and the 6800 non-Ultra's performance honestly sucks. It's only slightly better than the 9800XT's. That's just not that good, IMO.
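Quick back-of-the-envelope on why the extra 128MB would suddenly matter with AA on (a rough sketch, assuming plain 4X multisampled color and depth buffers; the engine's real allocation will differ):

    # Rough framebuffer sizes; drivers allocate differently, so treat
    # these numbers as ballpark only.
    def framebuffer_mb(width, height, samples, bytes_per_pixel=4):
        color = width * height * samples * bytes_per_pixel   # multisampled color
        depth = width * height * samples * bytes_per_pixel   # multisampled depth/stencil
        resolve = width * height * bytes_per_pixel if samples > 1 else 0
        return (color + depth + resolve) / (1024 * 1024)

    for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
        print(f"{w}x{h}: {framebuffer_mb(w, h, 1):.0f} MB plain, "
              f"{framebuffer_mb(w, h, 4):.0f} MB with 4X AA")

At 1280x1024 that's roughly an extra 35MB at 4X AA, which comes straight out of texture space on a 128MB card, while a 256MB card shrugs it off.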

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
Let us all post our Source Benchies, now that we can.

I got around 50 FPS: 6x AA (max), 16x AF (max in-game and forced), all reflections on, all settings on high, Vsync on, everything maxed.

SYS: P4 3.2, 2GB RAM, X800XT, 1280*1024, Cat 4.8 with the OpenGL driver from 4.9

<i> :evil: <font color=blue>Futile is resistance,</font color=blue><font color=red> assimilate you we will.</font color=red> :evil: </i>
<b>Hard work has a future payoff. Laziness pays off now.</b>
 
Ughhh, I don't really own any of the new Valve games, so I can't really try that out.

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
I'm not going to play Doom 3, but I will play HL2. Was the 6800 GT a bad buy?

"Go forward until the last round is fired and the last drop of gas is expended...then go forward on foot!" -Patton
 
Is it me, or do images 13 and 14 show the 5950 with MUCH better image quality than the 9800XT?
Must be either you or the DX8.1 codepath. :wink:

Holy crap, did anybody notice the major difference between the 128MB and 256MB Pros?
Yeah, but only with 4X AA enabled; otherwise they're equal. It's no longer the 1-2 fps difference we saw only at high res in older games. I'll be keeping an eye on that in future reviews, since I own a 128MB one. Hopefully it's just in the stress video and not gameplay. Or maybe it was a beta fluke 😱




ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
I don't think a 6800GT could be called a bad buy at all; it's one of this generation's best buys right now. But if HL2 is your thing and D3 meant nothing, the standing advice on these forums would have been to WAIT for HL2 and buy whatever proves best. Even if the X800 Pro turns out better in HL2, I still think the 6800GT is as good a buy for the same price. It will win in more games than it loses, and I doubt either will struggle in the final HL2.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
OK. I bought the 6800 GT because it was 400 bucks and it was in a store nearby. I would probably have ended up paying 430 if I had gotten it later.

I just hope it does well in next-gen games.

"Go forward until the last round is fired and the last drop of gas is expended...then go forward on foot!" -Patton
 
Must be either you or the DX8.1 codepath.
Why would an older codepath give better image quality?!

Yeah, but only with 4X AA enabled; otherwise they're equal. It's no longer the 1-2 fps difference we saw only at high res in older games. I'll be keeping an eye on that in future reviews, since I own a 128MB one. Hopefully it's just in the stress video and not gameplay. Or maybe it was a beta fluke
Yeah, you're right, I kinda overreacted, but it's still a major performance difference for an extra 128MB that once proved useless. AA performance is no longer dependent just on efficient memory bandwidth usage and color compression.

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
Sorry, I was just having fun and poking a joke at the FX series, hence the :wink: . They were getting framerates too low to be playable on the DX9 codepath. On DX8.1, though, they get good numbers and should play HL2 just fine.

Looking at pics 13 & 14, it does seem in some ways the FX looks better, but which is correct? Anyway, look at the pink rectangle below the center of the screen. On the Radeon it has clearly defined edges; on the FX it's just a blur, with no definition to its edges at all. In that respect, the Radeons look much better than the FX.

Edit: and I don't think you were overreacting; you brought up a good point. The 256MB version crushed the 128MB one with AA/AF. I hope actual gameplay doesn't show the same. Now that Gateway cancelled my $400 X800XT PE order (offering a $350 Pro instead 😡 ), I was planning on keeping this 128MB 9800 Pro in use for a long time, until next-generation cards come out and/or this generation's supply exceeds demand and prices fall.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
They also have <A HREF="http://www.firingsquad.com/hardware/counter_strike_source/default.asp" target="_new">CS:Source benchies</A> up for the new cards, with 9800/FX benchies to come later. Looks like the only real advantage the ATI cards have is at the ridiculous resolution of 2048x1536.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
They also have CS:Source benchies up for the new cards, with 9800/FX benchies to come later. Looks like the only real advantage the ATI cards have is at the ridiculous resolution of 2048x1536.
Is it me, or is the game at 1024*768 either CPU-limited or FPS-capped? They certainly can't all be running at equal performance otherwise.
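One quick way to sanity-check that from a review table (the numbers below are made up, just to show the idea): if every card lands within a few percent of the same figure at a resolution, whatever is setting the pace, it isn't the GPU.

    # If all cards cluster at the same fps, suspect a CPU limit or frame cap.
    def looks_cpu_bound(fps_by_card, tolerance=0.05):
        lo, hi = min(fps_by_card.values()), max(fps_by_card.values())
        return (hi - lo) / hi <= tolerance

    at_1024 = {"X800 XT": 99.8, "6800 Ultra": 99.1, "X800 Pro": 98.7}  # hypothetical
    at_1600 = {"X800 XT": 88.2, "6800 Ultra": 74.5, "X800 Pro": 69.9}  # hypothetical
    print(looks_cpu_bound(at_1024))  # True: everything pinned at ~99 fps
    print(looks_cpu_bound(at_1600))  # False: the cards pull apart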

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
They were getting framerates too low to be playable on the DX9 codepath.
Oh yeah, I know; hence why I was wondering how the heck the DX8.1 codepath gives it sharper image quality.
Looking at pics 13 & 14, it does seem in some ways the FX looks better, but which is correct?
I was mostly looking at texture sharpness, and if you look at the area below that pink rectangle, that's where the difference is most noticeable. Maybe AF was working better on the FX, seeing as its AF isn't as angle-dependent as the R300's?

On the Radeon it has clearly defined edges; on the FX it's just a blur, with no definition to its edges at all.
You're right. It almost looks crunched from all sides too! The 9800XT shows it as a "real" rectangle.

and I don't think you were overreacting; you brought up a good point.
Actually, it was because I forgot the non-IQ-enhanced results and for a second took the 256MB card's clear wins as laid out across all the tests, including the normal ones. So I kinda overreacted about it. I had to look again to remember that the normal-quality tests weren't the same as the IQ-enhanced ones.

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
<A HREF="http://www.xbitlabs.com/articles/video/display/counterstrike-source.html" target="_new">clicky</A>

<i> :evil: <font color=blue>Futile is resistance,</font color=blue><font color=red> assimilate you we will.</font color=red> :evil: </i>
<b>Hard work has a future payoff. Laziness pays off now.</b>
 
I've really become so lazy as to hope someone will make clickies out of URLs! :lol:

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
Seems like they're all roughly equal in most benchmarks, except the X800s start flexing their muscles at high resolutions and with IQ enhancements.
I was surprised to see the NV40s doing so well in certain benchies, though; in rare cases the 6800 Ultra even beat the X800XT.

Truly a powerful new generation from nVidia and ATi, and it's good that nVidia is competing so well with what seems to be no cheating so far in the new games. That's commendable.

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
Firing Squad has more to add today. Seems they figured out how to force the DX8, DX8.1, and DX9 paths. But even so, DX9 doesn't work with the FX cards: the midrange FXs default to DX8 like the GF4 Tis, and the high-end FXs default to DX8.1. That means the old Radeon 8500s default to a higher shader level than the FX5700s!
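For anyone who wants to try it themselves: if I remember right, the Source engine takes a -dxlevel launch option (added to the game's launch options in Steam). The exact values below are from memory, so double-check before relying on them:

    -dxlevel 70    (DX7 path)
    -dxlevel 80    (DX8.0 path)
    -dxlevel 81    (DX8.1 path)
    -dxlevel 90    (full DX9 path)

There's also supposed to be a mat_dxlevel console variable that reports/sets the same thing.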

I'd like to have seen 1600x1200 tests. At 1280x1024 with 4X AA/8X AF, the R9800XT equals the 6800GT in DX8.1 mode and isn't too far behind in DX9, yet at lower resolutions the 6800GT blows it away. I wonder if the gap would shrink even more at 1600x1200. Could the 9800XT pass the 6800GT? I'd never have expected the GT's lead over the last-gen ATI card to diminish at higher resolutions. I guess AA and AF at the same time are still slowing down the 6800s, as the Far Cry benchies showed.

<A HREF="http://www.firingsquad.com/hardware/half_life_2_fx/default.asp" target="_new">http://www.firingsquad.com/hardware/half_life_2_fx/default.asp</A>

<A HREF="http://www.firingsquad.com/hardware/half_life_2_fx/page6.asp" target="_new">http://www.firingsquad.com/hardware/half_life_2_fx/page6.asp</A>


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
Those are good links.

<i> :evil: <font color=blue>Futile is resistance,</font color=blue><font color=red> assimilate you we will.</font color=red> :evil: </i>
<b>Hard work has a future payoff. Laziness pays off now.</b>
 
Yeah, I still have a few of the screenshots to look over. I still can't get over the HUGE performance hit the GF6800GT takes going from 1024x768 to 1280x1024 with AA & AF on. Its FPS is cut in half! Seems odd. I know if I owned one of the newer cards, I'd want to be playing at 1280x1024 4X/8X or maybe even 1600x1200. 800x600 and 1024x768 on a $400 card is a joke.
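Just to put a number on how odd that is (simple pixel arithmetic, nothing more):

    # 1280x1024 vs 1024x768: how many more pixels per frame?
    print(f"{(1280 * 1024) / (1024 * 768):.2f}x")   # ~1.67x

So naive fill-rate scaling would predict roughly a 40% drop, not a clean halving; the rest presumably comes from the AA buffers eating extra bandwidth and memory, or from something else misbehaving.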

I expected to see them give more CS:Source benchies for the old cards. Maybe soon.



ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
Is it just me, or do the walls look BETTER in the DX8.1 path (on the 9800)?

Me: are you saying I can't provide?
Me: cause I know I can provide.
Me: oh and I can provide money too😉
Rachel:): why do we need money when we can just stay in our room and have sex all day?
 
There seems to be more contrast with the DX8.1 path, or as they said, deeper grooves. But I've been jumping back and forth between images 48 and 57, and I think the DX9 path looks better in that the edges of each rock are more defined, especially on the lower part of the wall. It's as if the edges have been sharpened, whereas DX8.1 just shows a dark area at the edges without looking sharp. Make sense?

<A HREF="http://www.firingsquad.com/media/hirez.asp?file=/hardware/half_life_2_fx/images/57.png" target="_new">http://www.firingsquad.com/media/hirez.asp?file=/hardware/half_life_2_fx/images/57.png</A>

<A HREF="http://www.firingsquad.com/media/hirez.asp?file=/hardware/half_life_2_fx/images/48.png" target="_new">http://www.firingsquad.com/media/hirez.asp?file=/hardware/half_life_2_fx/images/48.png</A>



ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
Image 48 shows more grooves, on the right, top, and left, and more cuts in the wall. But 57 looks more realistic, unless there was some termite problem.

Yeah... 57 is definitely much better.

<i> :evil: <font color=blue>Futile is resistance,</font color=blue><font color=red> assimilate you we will.</font color=red> :evil: </i>
<b>Hard work has a future payoff. Laziness pays off now.</b>
 
Very interesting article. It calls into question just how much of HL2 really is DX9. Seems like not much.

What really surprised me, though, was the HDR. It seemed to exist even on DX8 cards!

Visually, except for the Radeon vs. GeForce brightness change, the game seems to use HDR on DX7 without a single change in quality. It makes me wonder just how hyped this feature was and how recent it is. Maybe those were poor examples, but I thought HDR was all about powerful lighting effects against things that block light sources. Then again, I still wonder whether the expression High Dynamic Range really means special lighting effects.
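For what it's worth, my understanding of HDR (a generic description, not necessarily what Valve is doing): the scene is lit with brightness values allowed to go well above the displayable 0-1 range, and a tone-mapping step then compresses them back down so bright areas keep detail instead of clipping to white. A toy sketch with the common Reinhard-style curve:

    # Toy tone-mapping: compress unbounded HDR luminance into 0..1.
    # Reinhard-style operator; a generic technique, not Source's actual code.
    def tone_map(luminance):
        return luminance / (1.0 + luminance)

    for L in [0.25, 1.0, 4.0, 16.0]:
        print(f"scene {L:5.2f} -> display {tone_map(L):.2f}")

    # 16.0 maps to ~0.94 instead of clipping at 1.0, so very bright
    # spots still show gradation.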
I liked some of the bump mapping used in HL2 as well, but again, looking at those wall shots, I can't help but be reminded of other modern games' graphics and feel it looks very much like them. That's where I was coming from when I said Doom III currently holds the better technological advancements visually. If you can feel CG quality playing HL2, I will gladly retract my statement.

Personally though, I can't help but feel there's something wrong between the DX8 and DX9 shots of the wall textures. It just doesn't make sense that a detailed, sharp texture becomes a less detailed but slightly more "bumpy" one. Ask yourself this: if you first played in DX9 and saw the wall example, then switched to DX8, would you feel it had gotten worse or more detailed?

Overall this is one of the most interesting articles I've seen in a while, simply because for once we're shown in-depth examples of DirectX version differences! Finally DX8 shows itself as a serious evolution over DX7. And DX9 now reveals the truth: it simply ISN'T being used as much as we thought lately, nor is it displaying anything amazing so far. We need more serious programming to see that, and I feel if it's not happening now, it won't be for a while either. (Yet DX9 came out two years ago!)

As for the GeForce FX being forced to DX8.1, it's as if we're being told it never supported DX9. Doesn't anyone find that a tad misleading? I also don't like that Valve FORCES DX9 card owners (the GF FX ones) to run the game in DX8.1; even if DX9 would slow it down, they still paid for a DX9 card. Kinda disappointing, honestly, even though I can see Valve's intentions and they're not really wrong in nature.

BTW, it seems the rippled pink rectangle in the well pictures we talked about days ago, where I said the 9800XT had worse IQ than the FX, was maybe a true effect: that crunched pink rectangle in the middle was actually supposed to look rippled, because there was in fact water over it! :wink:

Bah I've talked too much. :tongue:

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
I still can't get over the HUGE performance hit the GF6800GT takes going from 1024x768 to 1280x1024 with AA & AF on. Its FPS is cut in half! Seems odd.
That's a bug, guaranteed.
The 6800s have nothing in common with the FXs under DX9, so this shouldn't happen at all.

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
 
