ATI vs Nvidia...Big difference for LCDs?

stephpar

Distinguished
Apr 22, 2002
Just trying to get a sense of whether there is a big difference between the two, since it's been suggested to me that, as a rule, ATI is much better for LCDs than Nvidia. I'm a moderate gamer looking at getting a 15-inch LCD with a 25ms response time. It will likely be an analog LCD. Would ATI serve me that much better than Nvidia? Lots of thoughts/experiences would be really appreciated here.

Thx
Steph


--------------------------
Just a broke Canadian Boy!
 
Tecchannel.de tested the signal quality of graphics cards a couple of months ago. They found that cards with Nvidia chips in particular have problems here. The reduced signal quality decreases image sharpness and adds ghosting:
http://www.tecchannel.de/hardware/583/index.html

David Kirk, Nvidia's chief scientist, recently confirmed these findings in an interview with tecchannel and The Inquirer:

http://www.theinquirer.net/29040212.htm
http://www.tecchannel.de/news/20020424/thema20020424-7375.html

This is why I decided on an ATI Radeon...
 
Are you sure bad signal quality causes ghosting? I've heard of problems with the GeForce and one ViewSonic TFT, but is this a general problem?
Anyway, what Kirk admits is that some manufacturers equip their cards with cheap filters. Nothing new, in other words. The Leadtek Ti500 TDH has been said to outshine even Matrox when it comes to 2D quality.
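To put rough numbers on what those output filters have to pass (my own back-of-the-envelope arithmetic assuming standard VESA timings, not figures from the review): at 1280x1024 and 85 Hz the visible pixel rate is

1280 \times 1024 \times 85\,\text{Hz} \approx 111\,\text{MHz}

and with blanking overhead the actual pixel clock is about 157.5 MHz, i.e. one pixel every ~6.3 ns. A cheap low-pass filter that rolls off anywhere near that frequency smears each pixel into its neighbours, which is exactly the loss of sharpness these tests measure.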
 
About ghosting... tecchannel mentions "shadow images" and reduced sharpness. This is not the kind of "ghosting" caused by the slow response time of LCD panels; that effect is better described as "motion blur".
For me it is clear that bad signal slopes, signal over- and undershoot, and bad line termination inducing signal reflections all reduce image quality. As long as people are talking about this in connection with Nvidia chips, I'll go with ATI. Though my top preference is still Matrox when it comes to 2D quality, they are a bit expensive.
Maybe DVI is a different story here. But since I have a triple-CRT monitor setup which I want to replace with LCD displays, I have to look for good analog performance.
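
For anyone wondering why the termination matters: as a rough sketch (standard transmission-line theory, not something measured in the tecchannel test), the fraction of the signal reflected at the monitor end is

\Gamma = \frac{Z_L - Z_0}{Z_L + Z_0}

where Z_0 is the cable's characteristic impedance (nominally 75 ohms for VGA) and Z_L is the load at the display input. With a proper 75-ohm termination, \Gamma = 0 and nothing bounces back; a mismatch reflects part of every pixel edge back down the cable, which is exactly the kind of shadow image tecchannel describes.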




<P ID="edit"><FONT SIZE=-1><EM>Edited by blexxun on 05/03/02 05:56 AM.</EM></FONT></P>
 
My German isn't the best (there isn't an English version of this test, is there?), but did all of those GF4 cards score equally badly? Weren't the MX line in general worse than the Ti's? In the reviews of GF4s I've seen so far, they've been pleased with the signal quality. I think the Nvidia reference design is said to be on par with ATI/Matrox, and the Leadtek TDH even better.
HardOCP for one was happy with the Gainward Ti4600:
http://www.hardocp.com/reviews/vidcards/gainward/gf4ti4600_ultra750xp/index5.html
 
Ya... not all Nvidia cards scored badly. It seems the Ti cards performed better than the MX cards. BTW, the last column in the test results table (entitled "Wertung") is the total score. I haven't found an English version of this, but you can just read the numbers. The test is really quite impressive. As usual, though, it's hard to translate the measurements into image quality (they do have some examples in the article), and things depend strongly on your monitor (and cable) as well.
But again... if even Nvidia admits they have an issue here... well, what can I say... one must be really careful in picking the right Nvidia card. There are actually some good ones (GF4 Ti4600 based), even according to this test. BTW, the Matrox G550 was the best.