ATI Inferior to NVidia?


wishmaster12

Distinguished
Jul 28, 2006
197
0
18,710
I found the NVIDIA card slower, and it had little pauses when I played games. For some reason the games looked different on my ATI card than on the NVIDIA one, and everything was smoother on the ATI card.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
Details, it is all in the details... I still believe Nvidia may currently hold the advantage for Linux.

As for the drivers, I've used a lot of both ForceWare and Catalyst. Both, as far as the drivers themselves go, are fine. I'll say that the settings panels for both kinda suck, though. As for Linux, which I'm fairly certain you (or somebody) are getting at, we all know how Linux-based games dominate the market. If you want to do professional 3D work, you don't buy a GeForce or Radeon; you go for a FireGL or QuadroFX, or possibly something made by another company like 3Dlabs.
 

pauldh

Illustrious
I like both manufacturers and probably buy about the same amount of each. But I agree with you on the games. For quite some time it seems ATI has (or had) the edge in the games I was playing (Far Cry, HL2, NFSMW, BF2, Oblivion). I hated Doom 3. But ATI is actually very close in Doom 3 and Quake 4 now; still way behind in Riddick. NV has actually managed to win HL2 sometimes too.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
While Half-Life 2 itself was a better game, the Doom 3 engine was more graphically complex, and when comparing games sponsored by a GPU company, that's one thing you should look for.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Actually, Doom 3's engine features a more advanced lighting model than Half-Life 2's, and with the add-ons seen in Enemy Territory: Quake Wars, it is a lot more sophisticated graphically. I'm not saying Half-Life 2 is bad, because it's one of the best first-person shooters on the PC; I'm just saying that graphically it's not quite as advanced as Doom 3.
 

DrNeil

Distinguished
Apr 26, 2006
45
0
18,530
Currently ATI cards can cope with HDR and AA at the same time. I think that means their product is better... Things like that are more important than a difference of 2 FPS in a benchmark...
 

KentEl

Distinguished
Jul 27, 2006
5
0
18,510
"don't listen to me as i know nothing.

thermaltake armour
seasonic 600W psu
A8R-MVP mobo
AMD X2 4400+
TWINX2048-3200C2PT
sapphire 512MB X1900XT
74GB WD raptor
X-FI fatality
Dell 2405fpw
creative gigaworks S750"

I like your sense of humour, and that is one freakin' kickass system... Ever RAIDed a couple of those Raptors and then run the page file in a RAM drive? Try it sometime, but be prepared to go very, very fast! :)
 

SuperG

Distinguished
Jul 21, 2006
28
0
18,530
ATI has had a big innovation, unified shaders, for a long time now: the Xbox 360 GPU. And the R600 is coming.
I guess nV hasn't had the time or know-how, or is just running behind, to match that. So their G80 doesn't have unified shaders, but instead the old way of separate VS & PS, although DX10 doesn't demand it.
ATI is the innovation leader. I think some people rate nVidia too highly, as if ATI were some hobbyists, because that innovation takes the whole architecture into account.
 

FITCamaro

Distinguished
Feb 28, 2006
700
0
18,990
yeah, like I'm gonna spend over $400 to get a 1900XTX to play at 800x600 or 1024x768......riiiiiight....especially now that I dropped almost a grand on my new flatpanel with an insanely high resolution.....riiiiiight

i think MORE companies should aim their flagship graphics cards at the 800x600 resolution crowd...riiiiiight

"Look mom, I just popped $1500 on a new 2400x1800 flat panel and UBER-graphics card....but I only play at 800x600 'cause that's its performance 'sweet spot'".......riiiiiight
As the others have said, you should've actually bothered to pay attention when skimming over that post.

The truth is, because most people don't have $1500 US for that flat panel, most (>90%) serious gamers play at one of two resolutions: 1024x768 or 1280x960/1024. Those stuck with even less expensive graphics cards MIGHT go for 800x600, but a larger number of people (close to 5-8%) run at 1600x1200. The number of people who actually go beyond that could be considered negligible; I'd estimate that at most 2% of serious gamers do that.

Well at least someone knows how to read.

Spacey, last I checked, 1280x1024 and 1600x1200 are both over 1024x768. Even back in my first post I stated that the XTX has outperformed the GX2 at both of those resolutions as well as 1024x768. You're the one who brought up playing at 800x600. I said up to 1600x1200. Yes, that includes 800x600, but there are two other standard resolutions between those points, plus a widescreen one.

So learn to actually read something instead of just briefly skimming it and spouting nonsense not even related to what someone said.
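For what it's worth, the pixel-count arithmetic backs this up. Here's a quick sketch (Python, purely for illustration; the resolution list is just the standard modes mentioned in this thread) ranking them by how many pixels the card actually has to push:

```python
# Pixel counts for the standard resolutions under discussion,
# to show how they actually rank.
resolutions = {
    "800x600": 800 * 600,
    "1024x768": 1024 * 768,
    "1280x960": 1280 * 960,
    "1280x1024": 1280 * 1024,
    "1600x1200": 1600 * 1200,
}

# Print them from lightest to heaviest workload.
for name, pixels in sorted(resolutions.items(), key=lambda kv: kv[1]):
    print(f"{name}: {pixels:,} pixels")

# 1280x1024 pushes roughly 2.7x the pixels of 800x600, so lumping
# everything below 1600x1200 together as "low res" hides a big gap.
assert resolutions["1280x1024"] / resolutions["800x600"] > 2.7
```

The point of the ratio at the end: the gap between 800x600 and 1280x1024 is larger than the gap between 1280x1024 and 1600x1200, which is why treating them as one "low res" bucket doesn't hold up.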
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
Actually, Doom 3's engine features a more advanced lighting model than Half-Life 2's, and with the add-ons seen in Enemy Territory: Quake Wars, it is a lot more sophisticated graphically. I'm not saying Half-Life 2 is bad, because it's one of the best first-person shooters on the PC; I'm just saying that graphically it's not quite as advanced as Doom 3.

I will agree that the Doom 3 lighting model is exceptional; in fact, probably the best in existence today. The completely real-time aspect of it all, with the pseudo-ray-tracing that it does, is incredible.

My disagreement is that the geometry sucks royal a$$ in that engine. Texturing at a distance looks terrible (Quake 4), and that engine cannot render a round-enough circle to save its life.

Witness the pentagon-shaped heads in any game, or just fire up the Prey demo and look at anything in that bathroom at the start. Even most creatures in Doom 3-based games have really blocky shapes. They are very well lit and bump-mapped to look amazing, as long as you ignore the sharp edges of the square that makes up a cross-section of an arm or leg.

JMO, but in that regard HL2 stomps it. Now, if you could put Doom 3's lighting model with HL2's HDR, geometry, and character models, and throw in Far Cry's or even Oblivion's draw distance, you would have one killer-looking game.
 

spacey

Distinguished
Jul 27, 2006
5
0
18,510
you're STILL on this???????

sigh.........


ok, i'll reply ONE MORE TIME to you (this will be the third)...

my post was a JOKE...a sarcastic joke....if you can't see that and MOVE ON WITH YOUR LIFE...then I can't do anything for you....

I already explained this in my LAST reply to you...which you apparently didn't even read....ironic, isn't it?
Mr. Read-the-whole-post-before-you-reply-back.....you missed a WHOLE entire post from me...but hey...whatever....

I did read your whole post...every word...it had lots of words...hooray for you....yes, I know you said UNTIL...DUH....that's why I've stated over and over again "LOW RES"...THAT would be all the resolutions BELOW 1600x1200...got that now?

for being so "good" at reading the WHOLE post instead of skimming before you reply...I'm having to repeat myself a lot to you...

this will be the last time...for I can't spend all my time trying to explain something this simple to you....and I see it WOULD take all my time...
 

cleeve

Illustrious
I've stated over and over again "LOW RES"...THAT would be all the resolutions BELOW 1600x1200...got that now?

1280x1024 = low res?

Wow. I have a feeling your definition only applies to your personal opinion, and not to reality...
 

KentEl

Distinguished
Jul 27, 2006
5
0
18,510
Why not do a RAID 0 with a couple of Raptors, 74 GB each, then load the mobo with as much of the fastest RAM it will recognize and make a RAM drive, then allocate a gig of your page file to the RAM drive? I've done this, and it is kind of scary how fast the thing goes.

And of course the best two ATI cards, overclocked and CrossFiring with NVidia :)
 
Spacey, last I checked, 1280x1024 and 1600x1200 are both over 1024x768. Even back in my first post I stated that the XTX has outperformed the GX2 at both of those resolutions as well as 1024x768. You're the one who brought up playing at 800x600. I said up to 1600x1200. Yes, that includes 800x600, but there are two other standard resolutions between those points, plus a widescreen one.

Actually, there are two standard widescreen resolutions too (not including my 14x9): 1680x1050 and 1366x768, and that doesn't even include 1280x720, which is native to many panels.

Despite spacey now claiming he was joking (I know jokes, and that didn't look like one), at the very least it was a bad joke in a thread that already creates tension.

Also, spacey, those emoticons on the left are meant to help tag such poor jokes for the audience, like an applause sign on SNL. This is especially useful for sarcasm, which doesn't convey itself well on the net.

Now, on to the serious aspect of the topic: it's interesting how the GX2 in SLI performs against the GTX in SLI and the XTX in Crossfire;

http://www.firingsquad.com/hardware/geforce_7900_7950_gx2_quad_sli_update/page4.asp
http://www.firingsquad.com/hardware/geforce_7900_7950_gx2_quad_sli_update/page5.asp
http://www.firingsquad.com/hardware/geforce_7900_7950_gx2_quad_sli_update/page9.asp

So unless 2560x1600 is considered 'low res', I'd say, joking or not, the sentiments expressed don't match the reality of the truly high end, where there are some scenarios in which the XTXs beat the GX2s.
 

Fursecul

Distinguished
Apr 27, 2006
204
0
18,680
Nvidia has a slight advantage when it comes to games, with just a few fps, but ATI has better image quality, so they are even.
 
Yeah, and I'm a little curious about those 8xAA results on the GX2s, since there seems to be no performance penalty going from 0xHDR to 8x no-HDR, yet a penalty going from resolution to resolution. Methinks there's some no-AA going on there, or else the penalty of using high AA is much greater on the ATi than HDR on the nVs, and based on my experiences with Oblivion, and others' with Oblivion, that doesn't match the norm.
 

DaveUK

Distinguished
Apr 23, 2006
383
0
18,790
I've got a fairly good perspective on the ATI vs nVidia thing...

last four graphics cards were

Radeon 9700 Pro (great card, motherboard problems so got a...)
GeForce FX5900XT (good card, AA performance not as good)
GeForce 7800GT (good card, but disappointing image quality and performance in Oblivion)
Radeon X1900XT (great IQ and performance, esp. Oblivion)

The latest comparison, 7800GT vs X1900XT, reveals a couple of interesting differences. In my opinion, nVidia's 4xAA is superior to ATI's in terms of quality.

However, High Quality Anisotropic Filtering on the ATI cards really needs to be seen to be believed, the level of detail is fantastic.

In a very new game like Oblivion, stark IQ differences can be seen. Colour rendering on ATI is better than on nVidia; on the nVidia card I had some interesting colour representation that just didn't seem a 'smooth' gradient, too many reds and greens when you look closely.

Also, (ignoring AA) HDR lighting looks MUCH better on ATI. I was shocked, HDR in oblivion looks how I imagined it should on the ATI hardware, looking very natural. However, in contrast on the 7800GT it was garish and looked 'artificial', like some parts of the image were overexposed and thus lacking in detail.

In all, I considered the 7950GX2, but I'm very happy with my move to ATI, and I've saved the price difference towards my next (no doubt DX10) graphics upgrade. It will take some pretty hefty performance figures and reviews to sway me back to nVidia from ATI after seeing the recent differences in image quality. It is *not* hype, I assure you.
 

Grinch123456

Distinguished
May 19, 2006
128
0
18,680
The 7900GX2s suck something that shall not be mentioned. I'm almost 90% sure the only PCs that ran them were the Dell Renegade XPS600s. We're talking about quad-SLI with 7950GX2s, which are totally different animals (well, not really, but the 7950s don't suck while the 7900GX2s do). I have yet to see a benchmark of 7950s in SLI, but I'm sure it would be a lot of wasted power.

Nearly any game can be played with today's cards at max, or near-max, settings at most normal resolutions. Screw CrossFire and SLI (except Voodoo2 SLI; that made Quake fly)! Screw the 4x4 system and the expensive doohickies that most every fanboy has been dreaming of!

Therefore, let's forget about SLI and CrossFire and look at what most people will actually have: a single card and a 19" monitor. At those resolutions, yes, an ATI card does well with filtering enabled, but when it's not enabled, NVidia rules the roost. Obviously, filtering is important, and therefore ATI, for practical reasons, barely wins. Both companies are good and are on nearly identical performance levels.

With the next-generation parts, however, it will be interesting, as NVidia will have an architecture similar to the current Radeon X1x00s, while ATI will be moving to unified shaders. 2007 looks like it's going to be quite the year.
 
