Nvidia can't compete?

POPEGOLDX

Distinguished
Aug 26, 2001
well with HL2 being bundled with the 9800XT... and ATI slaughtering NVIDIA in the latest games because of NVIDIA's weak, weak pixel shaders, is this the curtain call for NVIDIA's reign on top?

I think so... ATI is firing on all cylinders and has a core that will keep up with the new games.

people who still look at Quake 3 numbers will preach NVIDIA pride... but the rest of us who are gearing up for HL2, Call of Duty and the like are buying ATI cards

just traded in the old faithful GF4 Ti 4400 and got a 9600 Pro... I may get a 9800XT later in the year to go with my Hammer system.
 

ufo_warviper

Distinguished
Dec 30, 2001
I must congratulate you, Pope. This is perhaps the most logical thread you have ever posted here, even though it still displays the good ol' PopeyX tradition of brand-loyalty posts. But then I could be wrong, since your last card was apparently an NVIDIA-based chip :tongue: Nevertheless, I couldn't find a thing I disagreed with, so again, congrats. Maybe you're improving after all.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

eden

Champion
LOL it is funny, I actually agree with him.

'xcept for the Hammer part; no one is sure about that yet anyway.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

ytoledano

Distinguished
Jan 16, 2003
I'd like to add that I've been seeing a trend when I look at THG's game benchmarks. Usually, ATI cards don't have the fastest FPS at low resolution, details, FSAA and anisotropic filtering, but there's no real difference there because the framerate is well above V-sync anyway. However, when resolution, details, FSAA and anisotropic filtering are turned up high, ATI is well above Nvidia - where it counts.

Roses are <font color=red>red</font color=red>, violets are <font color=blue>blue</font color=blue>, post something stupid and I won't reply to you!
 

sweatlaserxp

Distinguished
Sep 7, 2003
Don't disregard the Doom III Special Preview: the 5900 slaughtered the 9800 Pro in all but one setting. Granted, HL2 and D3 are two very different programs, but I feel that D3 is a better demonstration of next-gen graphics capability.
 

ufo_warviper

Distinguished
Dec 30, 2001
In many ways HL2 will look better than Doom III, though in other ways it's vice versa.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

POPEGOLDX

Distinguished
Aug 26, 2001
I have no brand loyalty... I won't overspend on any more Intel crap... but with video cards I couldn't care less about the name.

This is my first ATI card since the Rage Fury MAXX in '99, after a string of NVIDIA cards.

ATI is ripe to take over... no wonder NVIDIA is diversifying its product base with chipsets...
 

POPEGOLDX

Distinguished
Aug 26, 2001
Take away the AMD vs. Intel stuff and I am pretty much an OK guy.

After NVIDIA crushed 3dfx... the nvidiots got so smug they let ATI blindside 'em.
 

POPEGOLDX

Distinguished
Aug 26, 2001
Doom 3? Doom 3 is so unoptimized right now... any scores from it are dubious at best.

Carmack said they hadn't really tweaked the engine for ATI's rendering techniques just yet.
 

Gastrian

Distinguished
May 26, 2003
And the demo was brought to us in association with nVidia.

So let's see: ATI beat nVidia in Tomb Raider 3 and in HL2. nVidia beat ATI in D3, which had already been optimized for nVidia and had nothing optimized for ATI.

nVidia fans are clutching at straws with the D3 benchmarks.
 

ufo_warviper

Distinguished
Dec 30, 2001
I agree with conserving resources as well. You have been posting about HAMMER POWER lately. You realize even the cheapest Hammer CPU won't be less than $400 when they first debut, I would imagine. This is just my opinion, but if the 754-pin Hammer's gonna croak in about nine months to a year, it would probably be worth the wait for the 939-pin, don't ya think? I'm sure as heck not going to spend $600 on a mobo + CPU combo that will have long since croaked, had its funeral, been buried, and decayed into petroleum by the time an upgrade is warranted. Today, a decent Barton 2500+ and a good mobo can be had for $200 or less.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

coolsquirtle

Distinguished
Jan 4, 2003
awwww poppy's here :D

ATi might have won for now~~~~~~~ hehheheheheheheheheheheh
(pulls out secret weapon)


Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 
And subsequently get whacked in the head with a Matrox Super Nova (or whatever it will be called). HeHe! :tongue:

nV has a lot of work to do, but don't ever count anyone out in this industry, except maybe 3Dlabs/Creative. Where's my P10? Ba$tardz!


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
[Crazy pointles post]

*UFO_WARVIPER pulls out his Gene<b>r</b>ic Weapon* "Ooops wrong one!" [puts weapon back in sheath] *UFO_WARVIPER pulls out his Gene<b>t</b>ic weapon* "There we go! [BOOOM!!!] "Oh yeah!"

Wait I almost forgot...........


<font color=red>bumpbumpbumpbump</font color=red>**<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bump</font color=red>********<font color=red>bump</font color=red>**<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bump</font color=red>********<font color=red>bump</font color=red>**<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bumpbumpbumpbump</font color=red>**<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bump</font color=red>********<font color=red>bump</font color=red>**<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bump</font color=red>********<font color=red>bump</font color=red>**<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bumpbumpbumpbump</font color=red>**<font color=red>bumpbumpbumpbump</font color=red>

<font color=red>bumpbump</font color=red>**********<font color=red>bumpbump</font color=red>****<font color=red>bumpbumpbumpbump</font color=red>
<font color=red>bump</font color=red>*<font color=red>bump</font color=red>********<font color=red>bump</font color=red>*<font color=red>bump</font color=red>****<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bump</font color=red>**<font color=red>bump</font color=red>******<font color=red>bump</font color=red>**<font color=red>bump</font color=red>****<font color=red>bump</font color=red>********<font color=red>bump</font color=red>
<font color=red>bump</font color=red>***<font color=red>bump</font color=red>****<font color=red>bump</font color=red>***<font color=red>bump</font color=red>****<font color=red>bumpbumpbumpbump</font color=red>
<font color=red>bump</font color=red>****<font color=red>bump</font color=red>**<font color=red>bump</font color=red>****<font color=red>bump</font color=red>****<font color=red>bump</font color=red>************
<font color=red>bump</font color=red>*****<font color=red>bumpbump</font color=red>*****<font color=red>bump</font color=red>****<font color=red>bump</font color=red>************
<font color=red>bump</font color=red>******************<font color=red>bump</font color=red>****<font color=red>bump</font color=red>************


Boy, that was GOOD!

[/crazy pointless post]

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

sweatlaserxp

Distinguished
Sep 7, 2003
It's pretty close to finished. Even if id hasn't finished optimizing for ATi, Carmack has abandoned vendor-specific paths altogether. He's hinted repeatedly that the NV30/35 is just better suited to the particular demands of the graphics engine. Now, Half-Life 2? I wouldn't be surprised if the 9800 Pro came out on top, but nobody can say at this point. I'm hard-pressed to think that the final build of Doom will produce anything more than slightly different benchmarks.
 

Nicjac

Distinguished
Aug 15, 2003
We don't know anything about NV35 performance in HL2. The only thing we heard was from someone who presumably asked a dev about it... I'm personally waiting for real in-game benchmarks.
 

sargeduck

Distinguished
Aug 27, 2002
Well, I was reading over at gamersdepot, and they seem to have some interesting comments regarding this very issue. They have Gabe Newell and John Carmack making some comments, as well as some benches of the latest dx9 games. Read the article <A HREF="http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm" target="_new"> here </A>.

No matter where you go, there you are.
 

eden

Champion
Now that is downright low. Carmack has to adapt to nVidia by using 16-bit freakin' precision?

HA!

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

eden

Champion
Get your head out, man, it's plain obvious nVidia will SUFFER in DX9 games. Just wait till Dave comes over to explain this (if he feels like it again). He's our forum's graphics programmer, with extensive experience. He firmly believes nVidia will go down, and he's proven it, as have many sites, including the one Sarge linked you to here, which includes COMMENTS from the main producers of the two big games, both agreeing on the crap nV FX.

"Carmack has abandoned vendor-specific paths altogether."
Bullcrap. Prove it.
"He's hinted repeatedly at the fact that the NV30/35 is just better suited to the particular demands of the graphics engine."
BIGGER BULLCRAP! Yeah, if you like 16-bit instead of 32-bit.

You spewed FUD twice in two sentences in one post, and you're still a newbie here. I strongly suggest you start backing up your statements and really learn what is going on in the GFX industry before you lose what credibility you have.
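For anyone wondering what the 16-bit vs. 32-bit fuss actually means in numbers, here's a quick sketch in Python (purely illustrative; this uses Python's `struct` module to round-trip values through IEEE 754 half and single precision, nothing from any actual game or driver):

```python
import struct

def to_fp16(x):
    # Round-trip a float through IEEE 754 half precision (16-bit),
    # the fast path on NV30/NV35 hardware.
    return struct.unpack('e', struct.pack('e', x))[0]

def to_fp32(x):
    # Round-trip through single precision (32-bit) for comparison.
    return struct.unpack('f', struct.pack('f', x))[0]

# fp16 keeps only a 10-bit mantissa: roughly 3 decimal digits.
print(to_fp16(1.2345678))   # 1.234375
print(to_fp32(1.2345678))   # ~1.2345678 (error around 1e-8)

# The gaps between representable fp16 values grow with magnitude:
# around 1000 they are 0.5 apart, so per-pixel math drifts fast.
print(to_fp16(1000.1))      # 1000.0
```

That lost precision is exactly what shows up as banding and artifacts when a shader runs at reduced precision.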


--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

coolsquirtle

Distinguished
Jan 4, 2003
have u SEEN what the MSI nBox COMES WITH? (drools)

find an ATi partner that does that and I'll dump nVidia for ATi before u can say my name

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

eden

Champion
CoolSqu.......wow, that was fast! :eek:

One day I'll use that joke if you switch lol!

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

eden

Champion
Hey Ape, you haven't yet jumped on the guy here who believes in THG's DOOM III benches?

You've been passive lately, my young padawan.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

coolsquirtle

Distinguished
Jan 4, 2003
that day will come Eden, that day will come :D

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

Willamette_sucks

Distinguished
Feb 24, 2002
You're off a bit. Carmack said he was GOING TO abandon vendor-specific paths, because he's tired of programming them. Doom 3 is the last game he'll work on (so he says) that will have them. It does, in fact, have them.

No vendor-specific paths? Sounds like ARB and ARB2, in which ATI pwnz nVidia.

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd