WOW my first ATI in seven years

Of course you are gonna see an IQ difference when you compare a 7x00 card with an 8x00 card. They're completely different architectures. You might as well say an X1900XTX looks better than a Voodoo Banshee.

EDIT: typo
 
We can call it a day, but shouldn't this load of horsecrap be put down first?



Games are programmed in DX9/10 or OpenGL. They can't stray from this, or else people won't be able to run them. If you are talking about "The Way It's Meant to Be Played", that is simply marketing. Take a look at this review from TweakTown.

http://www.tweaktown.com/reviews/1115/7

Here is an HL2: Lost Coast review showing the 2900XT and the 8800GTX basically equal. Depending on your resolution, either card might be ahead by a few FPS. Now take a look at Company of Heroes, which you think is programmed for Nvidia. Again, pretty much a tie here. The Nvidia card does best at 1920x1200, but might lead by only a single frame if you are playing at 1280x1024. I don't know about you, but I wouldn't call either of these a "stomping".

It used to be that ATI did best at DX while Nvidia was king of OpenGL games. Those days are gone. Half-Life looks good on either company's cards. AMD/ATI can compete in OpenGL games. Nvidia can do well in DX games. Things have changed.
 


I have that kind of problem with my 8800 GTS 320: very rare, but annoying. I will sell my card and try the ATI 3870. It's going to be my first ATI.
 


Ah ok, so newer = better, therefore the HD3K > GF8 series, but the GF9600 beats the HD3K because it's newer, and then the S3 Chrome 400 beats them all because it's newer still. :pfff:

I'd find these discussions more interesting if people didn't criticize others and then come up with their own harebrained ideas.
The reality is that for the issue you're referring to (HQ AF) in those pictures you keep posting, the R9700 outperforms the GF7900; but by the same token, the FX5900 had better AF quality than both the GF7 and the X1K-HD3K cards. That they didn't use it is another story.

People confuse specs with defaults.

The reality is that both now have very similar features, and the benefits of one are in line with those of the other; overall the pluses and minuses basically balance out, and what differs is usually a specific area, for a specific app, and usually in a specific situation.

Notice the 7900GTX has the worst image quality of the three graphics cards; in the end the 8800GTX has the better texture filtering. The 2900 Pro is most likely on par with the 8800GTX or slightly better.

Actually, for hardware AF support the GF8 series has not been bested by its AMD counterparts; however, the default usage of both is basically the same, where the differences are hard to notice and take an effort to find.

In general this is how it works, VERY minutely:
Top AF Qual: GF8 > R9700-HD3K by tiny margin
Top AA Qual: HD2-3K > GF8 by tiny margin.

But really they're so close it doesn't matter.

What most people notice is a change in gamma/alpha levels in games, as well as other defaults. Some people prefer A to B, just like some people liked the over-saturated Digital Vibrance.
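(If you want to picture what that over-saturation actually is, here's a rough Python sketch of what a vibrance-style boost does to a pixel. It's purely illustrative, my own toy example, not either vendor's actual driver code.)

```python
# Toy illustration (not Nvidia's or ATI's actual algorithm): a vibrance-style
# boost roughly pushes each colour channel away from the pixel's grey value,
# which is why a high setting looks "richer" but can clip and merge shades.

def boost_saturation(rgb, amount):
    """rgb: (r, g, b) in 0..255; amount: 0.0 = unchanged, 0.5 = strong boost."""
    grey = sum(rgb) / 3.0
    boosted = (grey + (c - grey) * (1.0 + amount) for c in rgb)
    # Clamping is where detail dies: channels pinned at 0 or 255 can make
    # distinct shades collapse into one colour at aggressive settings.
    return tuple(min(255, max(0, round(c))) for c in boosted)

olive = (110, 130, 60)
print(boost_saturation(olive, 0.0))   # (110, 130, 60) -- untouched
print(boost_saturation(olive, 0.5))   # (115, 145, 40) -- pushed further from grey
```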

Anywhoo, I think most people just see differences for the sake of differences nowadays, but hey if you like A or B then go with it.

Anywhoo, this topic has been discussed ad nauseam and there's very little to support the 'feelings' people are having.

-edited to fix open quote-
 


The thing to remember though is that while they have to meet the DX9/10 spec, they can change the way a game functions, or even how it handles shaders, to accomplish the same task.

Optimizing techniques for one or the other has been and will continue to be done.
Whether or not there would be much difference between optimizing for one or the other requires the option to optimize for both. Usually the title that gets IHV help has optimizations built in, while the other company(ies) usually have to compensate for that with later optimizations in drivers. The main thing is to avoid floptimizations to try to catch up.
 
This topic makes me laugh; it keeps coming around like the proverbial bad penny. Image quality is 100% subjective. If you like what you see on your screen then that's fine, but as TGGA says it's pointless trying to quantify your preference one way or another with reviews and graphs.
Some like widescreen, some like the old 4:3 format; some play with gamepads and some use keyboard and mouse. It all comes down to personal preference and, I believe, what you are used to at the end of the day.
Personally I like 4:3 and a keyboard and mouse, and I think the ATI picture is better than the Nvidia one.
Mactronix
 



You forgot Uranus. Goodnight everybody! Yessss....I knew watching Animaniacs would pay off in the end! I would think that the monitor being used would make a big difference too but after staring at the screen for an hour do you really notice?

Greatgrapeape...you really love the Chrome 400 don't you?
 
Hey, I'll pimp any new contender to the field.
(Edit: PS, if it sucks I won't recommend it, but until they arrive I'll give them free Promotion/Props/Word of Mouth, whatever, just for the hope they will have something to compete with.)

Until it's finally benched and tested (including AF quality), it's the most intriguing development in the market.

Since up until now, really, we've had A's method versus B's method (since Intel keeps fouling up so much they don't count).

I would love to see the AF choices, the custom AA options (8 modes according to the whitepaper), and more about their Chromotion.

Also think of the architecture difference for the shader hardware and their choice of texture and ROP options. I'd love to know more.
 
I noticed when I installed the latest Nvidia driver for my 8800 GTX it was automatically set to 16-bit colour. When I install any driver for my ATI products, they are always set to 32-bit colour. No doubt Nvidia's drivers are always designed to sacrifice image quality for those precious fps.

Indeed, I was absolutely blown away by the colour difference between the 8800 GTX and the 3870 in CrossFire. Maybe that's why the GTX is better at doing shaders: because its colour is naturally dull and faded, producing more grays in the picture.

Over the years I have owned mostly ATI products due to the fact that they produce a better-looking picture. Pro-Nvidia owners are too busy watching the fps counter to bother caring about the actual display quality.
 



Yeah, I am curious about it too...I want to know what all of the Chromotion is about...<badum dum ching> It would be most excellent to see a third player in the game. However, their lack of testing results leaves me a bit skeptical.
 
That's not the real pic, that's Beyond3D's logo on their photochopped mock-up.

Not much detail yet.
Remember, at this point it's just a product launch; they are still lining up players like Sapphire, ASUS, etc. to be board partners. There may be some already, but I think it's just the early stages of the return of S3, and while it's a 'paper launch' it is the first step when you don't have an existing supply chain on the level of AMD/nVidia.
 


I wish I had taken a screenshot of some 3D objects before and after my upgrade from my older 7800GTX to my 8800GTX. While using the default settings (right after you update your Nvidia drivers), I noticed a huge difference in the rotating 3D object in the Nvidia control panel. This isn't some made-up story; even without any AA or AF, the 3D object looked less jagged around the corners and sides than it did with the 7800GTX.

"I still have the same monitor that I've had for many years", its still the first monitor that I ever owned.
 


Yeah they would, but would you show it to everyone before you can sell it and tip your hand to nVidia and AMD?

-now taking off pimpin hat puttin' on serious hat-

I was disappointed on Tuesday learning it's essentially a business card for AIBs and OEMs, but it's a step in the right direction, whereas Matrox has only decided to re-number their line without any new features (not even enough to let them run Vista Aero!). As small an issue as that sounds, it took Matrox out of our buying options simply because some of these rigs last 3-5 years, and by then Vista may be ready for business.

For now I'm just glad they are at the point where they feel like launching anything. I just wish I had something more concrete, because it always struck me as if the S3 solutions would arrive just in time to match their counterparts from the last generation, only to get bowled over by the concurrent releases from AMD & nV.

If ever there was a time for them to strike, it was last summer, when the GF8600, the HD2600, and their offspring underwhelmed.

Now the mid range is contesting the upper level cards again, so it's gonna be tough for them when they finally do have full volume sales.

-put hat back on-

But Dang Son, you know it's gonna be Fan-Funkin-Tastyc, it's got a '40' in the name !! [:thegreatgrapeape:7]
 
I went from an X800XL to an 8800GT, and the only advantage I remember on my X800XL is that the colors were richer...however!

I just went into my 8800GT Nvidia Control Panel and changed some stuff -- let me just say you can make the colors MUCH MORE vibrant than they are on default...basically, default settings on Nvidia cards don't seem to be as good as ATI's -- but by going through all the options, I was able to make the colors significantly better and tweak other things around to get them to look just as good.

This has made me think that Nvidia cards are not the issue...the default settings in the driver are...right now my 8800GT's colors look much less "gray"/"foggy"/"dull" in the games I thought ATI had better color quality in.

My conclusion: tweak your color settings on your Nvidia card...and you will see that it is very much like ATI. This is just from my experience.

BTW: in my Nvidia Control Panel, "Digital Vibrance" was set to 0% as default...I changed it to 50% and it was a HUGE difference...anyone else have it default set at 0%?

This topic about color being better on ATI made me QQ a little at first because I also felt this to be true...but after tweaking some settings...I have to say the color quality is just as good, if not better, than what I remember on my ATI card. I am happy again! no more dullness! yayyyyy. =)
 
If I did not fear for my poor smart power 450's life I would add my 8800GTX to my X1900 system. I have a feeling, as said, it's all in the adjustments, but I ALWAYS adjust my screen to my preference (it's kind of a warm color....so not as vibrant as some would like)....

I also find that after you use it for even 15 min any color difference goes away.....

Cards i own and compared....

TNT2
Geforce 2mx 200 and 400
Geforce 4 ti4200
Geforce 4mx 440(notebook, its color(the screen on it) honestly looks great)
Rage2c 🙁
9800Pro
X850xt
X1900xt
8800GTX

@qmalik - Yes, mine is at 0 too, and that's just how I like it...I find it makes some colors mesh together...look at the green on the start bar or the back button in the Nvidia control panel (just select something else so it lights up) and you will see that after 50 the green shades just become one color of green....that could mess up Photoshop.
 


So in the end - you, as I do, believe this discussion about color quality comes down to the fact that ATI drivers have their colors set significantly differently than Nvidia drivers. But with tweaking on either the ATI or Nvidia driver settings, you can basically achieve the same color settings/quality - to your liking (given the same monitor) - on either? Correct?

Side Note: I have always believed that image quality (sharpness/detailing) has been better on generation-8 Nvidia...but that the color richness has been better on ATI on default driver settings.

My bottom-line opinion about image/color quality for the 8800GT vs. 3870: the 8800GT has sharper and more refined detailing when it comes to image quality (but not really that noticeable). But when it comes to vibrant/richer colors, the 3870 at first may seem to have the upper hand (on default settings) - only until you tweak the Nvidia driver settings to your liking and realize that you can get the same color quality with Nvidia that you can with ATI. In the end, this is just my opinion - and overall...I believe both cards are great for their price. I also believe that the difference in image quality between the two cards isn't really that noticeable.

Nvidia's better image quality:
http://www.pureoverclock.com/review.php?id=647&page=4
http://www.legitreviews.com/article/504/2/
 


Just FYI, Digital Vibrance is hardware-based and not available on most GF8 cards; they removed it after the GF7 series (probably to save transistors on the huge G80) and only recently added it back due to the outcry, but it is actually different from the old implementation because it can't be done the same way.

Anywhoo, like I mentioned before, some people like it, but most image pros find it pointless because you can do better with manual settings without over-saturating everything. Using a colourimeter/Spyder and tweaking will give you the proper picture. Of course some people like stuff over the top, like Sony's ViViD setting on their TVs.

But the point being, if you spend the time to tweak them, both are very similar. There are still small differences, like the choice of algorithms used to accomplish the same task (like AA), which results in slight alpha/gamma differences, but really even that becomes a preference issue, as both are close to, but never 100% the same as, the CPU refrast output.
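(To make the alpha/gamma point concrete: one reason two cards can resolve the exact same AA samples to slightly different pixels is whether the averaging happens in gamma (sRGB) space or in linear light. A quick Python sketch, purely illustrative and not either vendor's actual resolve code:)

```python
# Toy illustration: averaging AA samples in sRGB space vs. linear light gives
# different edge colours for the same coverage, one source of gamma-looking
# differences between implementations (and versus the CPU refrast).

def srgb_to_linear(c):   # c in 0..1
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

white, black = 1.0, 0.0   # a 50%-covered edge between white and black geometry

naive   = (white + black) / 2                                        # average in sRGB space
correct = linear_to_srgb((srgb_to_linear(white) + srgb_to_linear(black)) / 2)

print(round(naive * 255))     # 128
print(round(correct * 255))   # ~188, a visibly brighter edge pixel
```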

edit to combine quotes;

Side Note: I have always believed that image quality (sharpness/detailing) has been better on generation-8 Nvidia...but that the color richness has been better on ATI on default driver settings.

The difference is that prior to the GF8, the GeForce series just couldn't keep up either colour-wise (no 10-bit support) or with AF; now they are physically able to be equals, and it comes down to software more than hardware.
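(Side note on the 10-bit point: the practical difference is just finer quantisation per channel, so gradients band less. A trivial sketch, again only my own illustration:)

```python
# Toy illustration: 8 bits per channel give 256 steps, 10 bits give 1024, so
# the worst-case rounding error on a 0..1 colour value is about 4x smaller,
# which is what smooths out banding in slow gradients.

def quantize(value, bits):
    levels = (1 << bits) - 1
    return round(value * levels) / levels

shade = 0.1234   # some in-between shade from a gradient
for bits in (8, 10):
    q = quantize(shade, bits)
    print(bits, round(q, 6), round(abs(q - shade), 6))
# 8-bit error ~0.0018, 10-bit error ~0.0002 for this value
```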

Nvidia's better image quality:
http://www.pureoverclock.com/revie [...] 647&page=4
http://www.legitreviews.com/article/504/2/

So, after all that, you finally come to the conclusion that it's very much preference- and driver-setting-oriented, and then you compare a mature G80 driver to launch R600 drivers, which didn't have support for edge-detect AA and don't have the programmable tent coverage refinements? It's like writing a lot to say they are equal and then relying on a GF7 vs X1K comparison to show they aren't. C'mon!

After all that build-up, putting those two links at the end of your post makes it look like you haven't learned anything from your own experiences. :pfff:
 
I like pushing my Digital Vibrance up about 25% on my 8600 GT. It makes the games look a lot more "rich". But I could see this being a problem for anyone doing any sort of graphic design.
 



I also set mine at 25%; that seems not to be overkill.
 



Most definitely. Even if they couldn't surpass that performance, coming within the ballpark would have got them a lot of attention.
 
Well, I think qmalik has hit the nail on the head here; really, there are a fistful of built-in/third-party tweaking apps for either make of card. Now you can believe this or not, but my old AGP system runs an X1650 XT with an Athlon 3000+ CPU and 1 gig of RAM. Crysis stretches it to breaking point, and it uses just about all it's got to run Oblivion at decent quality and FPS. I won't go into details about settings (unless you want), but after tweaking the card with ATT I achieved a better-quality picture and a 9-10% increase in FPS.
I have always wondered why there hasn't been some sort of subjective benching done involving these tweaks, or just for the hell of it. I would love to see what sort of FPS an ATI card could get if it was optimised to do so, and at what cost to the image compared to an Nvidia card. And vice versa: what, if anything, it would cost the Nvidia card FPS-wise to match the ATI card for colour depth and image quality. These two things seem to be the basis of these discussions, so why can't we get someone at Tom's to do a little Myth Busting? :)
mactronix
 


No no no, you misunderstood me...those 2 links were to show that there have been reviews that favor Nvidia image quality over ATI (unlike what most have stated in this thread)...that was just some evidence to show the "other side". What you read above those links is how I feel...both new-gen cards have basically the same (non-noticeable) image quality, and with tweaking you can achieve very similar color/picture quality.