Half-Life 2 Performance Revealed

Yum.

I am not an AMD fanboy.
I am not a Via fanboy.
I am not an ATI fanboy.
I AM a performance fanboy.
And a low price fanboy.
Regards,
Mr no integrity coward.
I share your sentiments Master Poobaa!! :wink:



<b><font color=red>Psychiatrist says I'm crazy but the voices in my head say I'm not 😎 </font color=red></b>
 
I'm sorry, could you please rephrase that, dude? I'm not trying to be critical, but your last post was somewhat incoherently constructed and a tad vague in meaning, and I was completely unable to follow you.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
Haha DinoX. I guess you saw through all the conspiracy!

Even though Gabe stated they put in five times the amount of time to OPTIMIZE for Nvidia cards, while Ati cards got no special optimization. Yeah, he's obviously out to get Nvidia.

Just like the creators of the new Tomb Raider games. Yes, all the software developers of DirectX 9 games are out to KILL NVIDIA!

I mean, are you for real? It's in game developers' best interests for all hardware to work fast with their products.

But I tell you what, you're right. Buy yourself a superior and expensive GeforceFX 5900 ULTRA for $300.
You'll be showing me, what with my piece of $hit $80 Radeon 9500 PRO.

Even though I'll get double your framerates in Half-Life 2, and every other DX9 game, because Ati has bought every developer out there (even though Nvidia has more money than them).

lol conspiracy theorists are bad enough... but when you mix in some die-hard fanboi (the kind that is desperate to believe that their favorite company is perfect and never makes mistakes), it's always good for a chuckle.

------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2000+
3dMark03: 3529
 
"Even though Gabe stated they put in five times the amount of time to OPTIMIZE for Nvidia cards, while Ati cards got no special optimization. Yeah, he's obviously out to get Nvidia.

Just like the creators of the new Tomb Raider games. Yes, all the software developers of DirectX 9 games are out to KILL NVIDIA!"

When companies don't take the time to optimize their games to an equal quality - then yeah, there is something amiss.

"But I tell you what, you're right. Buy yourself a superior and expensive GeforceFX 5900 ULTRA for $300.
You'll be showing me, what with my piece of $hit $80 Radeon 9500 PRO.

Even though I'll get double your framerates in Half-Life 2, and every other DX9 game, because Ati has bought every developer out there (even though Nvidia has more money than them)."

Who said I was going to buy an FX card? Not me. Still running strong with my GF4 Ti4600, and I shall be till the next-gen nVidia cards are out.

"lol conspiracy theorists are bad enough... but when you mix in some die-hard fanboi (the kind that is desperate to believe that their favorite company is perfect and never makes mistakes), it's always good for a chuckle."

I'm glad your hypocrisy is apparent.

dinoX aka BlackDog
 
Hey dinoX, just a question: are you a joke, or really serious about what you're saying about Nvidia?





<b><font color=red>Psychiatrist says I'm crazy but the voices in my head say I'm not 😎 </font color=red></b>
 
<b>"When companies don't take the time to optimize their games to an equal quality - then yeah, there is something amiss."</b>

Haha. I tell you what, you can dump as much money into optimizing a game engine as you like, but you'll never get a TNT to run as fast as a Geforce4 Ti.

That doesn't mean the developers are biased, Dino... that means the hardware is simply not as good.

Like the Geforce FX.

Right now, it's painfully obvious to those who are not blind or ignorant that Ati's hardware is vastly superior to Nvidia's when it comes to DirectX 9 shaders.

The fact that you don't want to believe it does not make the GeforceFX's architecture any better.

<b>"I'm glad your hypocrisy is apparent."</b>

Which hypocrisy is that, Dino?
Because I'm not a conspiracy theorist, nor am I a fanboi.

Let me help you with the definition: A fanboi is someone who, regardless of anything else, blindly follows a manufacturer without question.

**You** like Nvidia and nobody else.

As for **Me**, I have owned both Nvidia and Ati cards... I buy what's best, for my money, when I need to upgrade. You see, that doesn't make me a fanboi. That makes me an informed consumer. And if Nvidia's next graphics chip is better than Ati's when it's time for me to upgrade, I'll buy it. Hell, if Matrox has the best card for the money, or S3... all will get equal consideration.

As for you, your next graphics card purchase will be Nvidia, regardless of price/performance.

That's what makes **you** a fanboi.

------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2000+
3dMark03: 3529
 
Now I'm annoyed. They're benchmarking on the Intel 2.8GHzC. I just purchased a 2.6GHzC about a month and a half ago in the belief it would still be up-to-date. The 2.8GHzC was only £40 more (but still £40 more than I could afford).
 
They invented standards like OpenGL and DX to let people develop one code path. But Nvidia doesn't follow the standard: for example, using 16-bit/32-bit precision when DX9 demands no less than 24 bits. So it's Nvidia's fault. As a developer I don't want to double-code my game because some A$$ company can't follow standards.
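The precision point is worth spelling out: DX9 pixel shaders require at least 24-bit floating point, while the FX's fast path runs fp16, which carries only an 11-bit significand. A stdlib-only Python sketch (the `to_fp16` helper is my own, purely illustrative) shows how little headroom that leaves:

```python
import struct

def to_fp16(x):
    # Round-trip a Python float through IEEE 754 half precision,
    # the 16-bit format the FX uses in its fast shader path.
    return struct.unpack('e', struct.pack('e', x))[0]

# fp16 carries an 11-bit significand, so consecutive integers are
# only exact up to 2048; beyond that, distinct values collapse.
print(to_fp16(2048.0))  # 2048.0 (exact)
print(to_fp16(2049.0))  # 2048.0 (rounded: out of significand bits)
print(to_fp16(0.1))     # ~0.0999755859375 (error in the 4th digit)
```

Small errors like these compound across a long shader program, which is presumably why the spec draws the line at 24 bits.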
 
"**You** like Nvidia and nobody else."

Glad to see you can speak for me even though you weren't officially delegated to do so.

Just b/c I speak out against ATi automatically makes me an nVidia fan? Gotta love your logic, man. Return to school, as you've obviously got a lot of learning to do in the logic department. A lifetime's worth, apparently.

As for ATi, I was seriously considering getting the Radeon 9800 Pro before I read this.

After reading it, I'll just stick with my GF4 Ti4600, which apparently doesn't do too badly in the benchmarks.

Just will wait till the next-gen ATI and nVidia cards come to buy a new card.

dinoX aka BlackDog
 
I'm still trying to decide whether you are: 1) a troll, 2) ignorant, 3) stoopid or 4) all of the above.

"When companies don't take the time to optimize their games to an equal quality - then yeah, there is something amiss."
Did you even bother to read the info (like the article from the Tech Report)? Valve specifically mentioned that they spent 5X as much time optimizing for the nV path as they had for the generic DX9 path. And the Radeons didn't need that extra optimization effort. The only reason that they even bothered with the optimizations for nV is that they realize that a large portion/majority of their potential customers are running crippled nV hardware. So that's actually HELPING nV, because they know that some of Valve's customers would be best served by that.

So I don't get your faulty logic. Either you haven't read the recent 'events' relating to nV (there are so many) or your bias has helped you ignore them. Try 'optimizing' your thinking.

I also love how you changed your last post to mention next-gen ATI or nVidia so as to not look so biased. Just to cover your butt. Yeah, and I'm not speaking for YOU, that's just my edjumakated op-onion, from a lifetime of experience.

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
Gotta say I'm disappointed in nvidia... this is like the nail that seals the coffin. I own a Gainward Ti4600 and swear by it, and I avoided upgrading to the next generation of gfx cards because of all the finger pointing over optimizations and performance scandals. After this I think I will buy the 9700 or 9800 when the game comes out so I can have fun in a DX 9.0 environment, since all previous DirectX 9.0 games were mediocre. To all nvidia loyalists: wake up and think with your head... times have fully changed, and the DirectX 9.0 market belongs to ati.
 
Not just that, but according to Newell, the R3xx cards are able to get the EXACT same frame rate in DirectX 9 mode as in DirectX 8 mode. So you get DX8-level performance at the higher DX9 quality for free!

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 
<b>"Just b/c I speak out against ATi automatically makes me an nVidia fan? Gotta love your logic, man. Return to school, as you've obviously got a lot of learning to do in the logic department. A lifetime's worth, apparently."</b>

Er... it was you who said that first, Dino. And I quote:

<b>"one more reason I won't buy ATI cards"</b>

Good lord. If you're going to pretend you're not a fanboi, go back a page and edit the evidence out of your posts first.

I mean, c'mon man.

------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2000+
3dMark03: 3529
 
Don't wanna hurt your feelings, but if the next card you purchase is an nVidia, it's safe to say you will be MAJORLY disappointed. I'm warning you: if you want to go the nVidia route, that's your choice, just as much as it is flamethrower's. I'm not angry or emotionally upset about their product choices, but I wish you two would give ATi a chance, though it probably won't happen. Lately, Nvidia's lineup becomes more & more embarrassing with each successive product launch.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
There is only one more user here sick enough to support nVidia, and I seriously hope he is not actually OK with the current results: Papasmurf.

I can't wait to tackle him on this one. He'll likely say: They'll get better, they just need time, I have faith...

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 
"He'll likely say: They'll get better, they just need time, I have faith..."

Smurf ought to realize that it's now FAR too late to have any hope for Nvidia. There is just way too much evidence. With all this proof floating around the net concerning Nvidia's current shortcomings, I wonder how one could rationally conjecture that Nvidia is doing well. Some people will never cease to amaze me. I've never met smurf before; now even CoolSquirtle, who touts his "120% Nvidia fanboy" badge in every post, doesn't recommend Nvidia's latest. I'm thinking though that the 5900 Ultra should be priced at about $100 - $130. Really, it might "smash" the 9600 Pro in 'old' DX8 games (where both cards offer high framerates at high resolutions anywayz), but chips that suck so badly at DX9 at such a premium price are totally unacceptable. I can't see any justification for charging an exorbitant premium to run merely DX8 games at uber-fast framerates.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
I understand what you're saying, dinoX. I'm an ATI user but it's not really my drug of choice. :smile:

You are dealing with a forum that has fully jumped on the ATI bandwagon, now that they have no fear that ATI is going to be better in DX9.
They watch and prey on people such as yourself and anyone supporting the infidels.
Here that is AMD, Nvidia, whoever our forum hawks don't prefer.

ATI has historically been shittier than NV. ATI has burned me financially and taken probably 100 hours of my time to get their products to work. NV works the first time, every time, in my experience with everything I've thrown at it.
I'm with you in that NV is the better of the companies overall.

The FX series sucks though. :smile:
You might want to skip this winter's NV products, because they are just higher-clocked versions of the current cards... they might have some other tweaks.

Do yourself a favor, wait for the NV40.

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Radeon 9800NP, Audigy, Z560s, MX500
 
These benchmarks are pathetic for both Nvidia and ATI cards...anyone who is happy playing FPS games with under 150 fps is a moron...

Don't expect current Nvidia (FX5900) or ATI (9800 pro) cards to meet your needs for HL2...
 
Only nV fans (trying to be nice, trying REAL hard) would wait to play a game simply because their cardmaker of choice's current line-up SUX in their choice of games (I know I didn't). But then again sure, go ahead, add another 4-6 months' wait to play HL2. I'll be happy playing it on my R9600P while Dino 'waits'. And I'll probably have an R420 or RV380 in my hands/machine before he sees his NV40 in 'coming soon' ads in stores, according to all reports.

If he can afford an NV40, then he can afford to buy an R9600P, and then re-sell it when his replacement arrives, but he's obviously more Ati-phobic than some. Oh well so be it. DX8 for him and all his kind!


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
"...anyone who is happy playing FPS games with under 150 fps is a moron..."
I guess most of us are morons then. I'd be quite happy playing with 1/3 of the "moron" amount. So I guess that makes me 3 times more a moron than the moron that is happy with 150 fps.

P.S.:Are morons capable of comprehending the fundamental mathematical principle of conversion factors as demonstrated above?


My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
dino, my friend. come here for a minute



You are exactly the reason why the GFFX was promoted, and people wasted their money. <b>Ignorance.</b>

I'm not going to bother debating with you, because everything you argue is BS and known to be BS by just about everyone who bothers to follow the news.

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 
"These benchmarks are pathetic for both Nvidia and ATI cards...anyone who is happy playing FPS games with under 150 fps is a moron..."


Well, seeing as your monitor can't display over 60 or 85 fps because of its refresh rate, I guess everyone's a moron then, eh?


Again, another example of <b>ignorance</b>, folks.

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 
I love the new sermon on <b>ignorance</b> that you're preaching this morning. :tongue:

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
Oh yeah well my monitor can GET 160hz refresh @ 640X480! :tongue:

SO there! Who's the moron now! :wink:

This is the exact reason I didn't bother replying to his post; he likely thinks it's advantageous having 400FPS in Quake 3 at 8x6: 'Cause I'm a master-fragger/fraggle and I need my FPS!'

Oh well, I'd prefer 1920x1440 at 75hz (or 1600x1200 @ 85hz) to having 160hz at 640x480. But maybe I'm CRAZY! 😱

The only way you'll notice the difference is if you don't adjust the strobe-a-scope you have sitting in your drug den.
Go do that now, Eh! [coool]


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
WTF??? Under 150fps? Hmm, so I'm a moron, eh, 'coz I play Counter-Strike @ 70fps to help my 64k ping. :/

IMO anyone who can tell the difference playing over 50fps must have the eyes of an eagle... I play GP4 @ 30fps, and because of the way the game works, a higher fps actually makes it run strangely... You know TV displays images at about 25fps or some fairly low number, but there's nothing wrong with the image...

OMG I'm gonna be playin HL2 @ only 60fps... Shoot me with a spud gun!!!!

<A HREF="http://service.futuremark.com/compare?2k1=6988331" target="_new"> 3D-2001 </A>
<A HREF="http://service.futuremark.com/compare?2k3=1284380" target="_new"> 3D-03 </A>
<font color=red> 120% overclocker </font color=red> (cheapskate)