
Graphics card buying guide!

Before I write anything else, I should thank you for making such a good, suggestion-filled post.

Removing the FX cards from the buyers' guide is a joke. Of the 200+ popular games on the shelves being bought, the FX cards have trouble with only a few. I wouldn't even call it having trouble; I would merely say that they don't perform as well as the competition's.
FX 5900s are under $250 in stores now, and I personally think that's a good deal.
The whole GF-FX series is a joke. Who currently buys a midrange or high-end card to play games that are a year or two old? R3x0 series cards can also play older games at the same good speed, and they look MUCH better for near-future games. So why should anybody bother with GF-FX cards? Don't forget about the cheaptimized and buggy drivers. On the other hand, R3x0 series cards have stable, cheat-free drivers.

I'm not telling people who own GF-FX cards to sell their current card and spend extra money on an R3x0 card. But how can I recommend GF-FX cards for a new purchase? (unless I'm an nVidiot)

This doesn't make the guide biased. Neutrality doesn't mean I have to pick things from both companies.

The Radeon 9600 Pro is as low as $139 in stores; this should be at the top of the list.

should read full DX8.1 support

DX8.1
I'll modify those parts ASAP.

The MX440 shouldn't be in this category; it's a $50 card, and it performs well for that price. Not everyone reading this is a hardcore gamer.
A non-gamer usually sticks with integrated graphics. If a non-gamer buys a video card, then it should be a Radeon 9000 (non-Pro) or Radeon 9200. When I upgraded from an integrated GeForce2 MX to a Radeon 9000, the 2D quality improvement was noticeable. Moreover, ATI cards are well known for better DVD playback.

It does too overclock well
The GPU may overclock well, but how much can you expect from 5.0 ns memory? Moreover, they don't cost significantly less than the Radeon 9600 Pro.

Yes, I'll add a few lines here.

These cards can be had for cheap. I disagree that they aren't worthy of consideration.
Are they cheaper than the Radeon 9600 Pro? And how much cheaper, so that I can ignore DX9 support and 2x better AA+AF performance?

I guess dynamic overclocking and temperature monitoring aren't new features?....c'mon dude
I was taking core enhancements into account. Basically, the R9800XT is a slightly higher-clocked Radeon 9800 Pro with slightly faster memory and nothing else.

I would take Sapphire over Hercules any day.....better RAM
Collect a few more votes, and Sapphire will be promoted. 100% guaranteed.

Many people have had BAD experiences with PowerColor cards. Daytona Palit is the "Mr. 64-bit memory" company: they ship cards that should have 128-bit SDR memory as 64-bit SDR cards and, of course, cards that should have 128-bit DDR memory as 64-bit DDR cards. This is also true of many unknown/less-known nVidia based card manufacturers.

While telling people which card is better is good, it still doesn't tell them why one is better than another. When you tell an average person that one card has DX8 support and one has DX7 support, you aren't telling that person a whole lot. Somewhere in this guide, there needs to be a brief description of pixel/vertex shaders, memory bus width, and what each DX spec calls for (not just DX9, but DX8.1, DX8, DX7). I wouldn't go as far as trying to describe the fundamentals of floating-point precision in modern architectures, but there's a lot that needs to be covered if you guys want to make this guide really great. I think that a full description of what each generation of graphics card is capable of (fill rate/shader performance/poly count/which shaders the card uses/memory bandwidth, etc.) would be of more use to the new people than just telling them which card is better. I would want to know why the card you recommended is better. Unless a person can get that answer in the buyers' guide, they're only going to start a thread asking the question anyway....and stopping the same redundant questions from being asked is the reason we wanted the buyers' guide in place to begin with.
Yeah, we are considering a clickable link with every card. The clickable link would open a page with the specs of that particular card.

We will also add a clickable link at the top of the thread. That page will have explanations of basic technical things such as anti-aliasing, anisotropic filtering, etc.

I will host these webpages. I'm not gonna leave THGC anytime soon, and I won't leave without letting everybody know. So don't worry about hosting these pages in my webspace.<P ID="edit"><FONT SIZE=-1><EM>Edited by Spitfire_x86 on 10/14/03 05:24 AM.</EM></FONT></P>
 
Cards in this buyers' guide are going to range from DX8 to DX9. Some of you guys know a little about vertex shaders and pixel shaders, but because these elements of rendering are rarely discussed on these forums, I've compiled a page of what I would expect anyone who considers themselves a knowledgeable enthusiast to already know. This stuff is really just the basics.
The vertex shader = Objects in a 3D scene are typically described using triangles, which in turn are defined by their vertices. A vertex shader is a graphics processing function used to add special effects to objects in a 3D environment by performing mathematical operations on the objects' vertex data. Before DX8, vertex shading effects were so computationally complex that they could only be processed offline using server farms. Now, developers can use vertex shaders to breathe life and personality into characters and environments, such as fog that dips into a valley and curls over a hill, or true-to-life facial animation such as dimples or wrinkles that appear when a character smiles. People talk about pixel shaders a lot these days, but the vertex shader gave birth to the pixel shader, and it was just as important a discovery as hardware transform and lighting a few years earlier.
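Those "mathematical operations on the objects' vertex data" are mostly 4x4 matrix transforms applied to each vertex. A rough sketch of the idea in Python (illustration only, not real shader code, which runs on the GPU in a shading language):

```python
# Illustrative sketch: the core math a vertex shader performs is a
# 4x4 matrix transform applied to each vertex (here, a simple translation).

def transform_vertex(matrix, vertex):
    """Multiply a 4x4 matrix by a homogeneous (x, y, z, w) vertex."""
    return tuple(
        sum(matrix[row][col] * vertex[col] for col in range(4))
        for row in range(4)
    )

# Translation by (1, 2, 3): an identity matrix with offsets in the last column.
translate = [
    [1, 0, 0, 1],
    [0, 1, 0, 2],
    [0, 0, 1, 3],
    [0, 0, 0, 1],
]

v = (5.0, 5.0, 5.0, 1.0)               # a vertex in homogeneous coordinates
print(transform_vertex(translate, v))  # -> (6.0, 7.0, 8.0, 1.0)
```

A real card runs this same math in hardware for every vertex of every frame, which is why moving it off the CPU was such a big deal.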

Pixel shaders = A pixel shader is a graphics function that calculates effects on a per-pixel basis. Depending on resolution, in excess of 2 million pixels may need to be rendered, lit, shaded, and colored for each frame, at 60 frames per second. That in turn creates a tremendous computational load. Modern cards process this load through pixel shaders. Per-pixel shading brings out a high level of surface detail, allowing you to see effects beyond the triangle level. Rather than simply choosing from a precompiled palette of effects, developers can create their own. Pixel shaders give developers control over the lighting, shading, and color of each individual pixel, allowing them to create some really cool effects.
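The pixel load quoted above is easy to verify with quick arithmetic (the resolution here is my own example figure, chosen to land near the 2 million pixel mark):

```python
# Back-of-the-envelope pixel load for a high resolution of the era, at 60 fps.
width, height = 1600, 1200
fps = 60

pixels_per_frame = width * height            # 1,920,000, near the "2 million" quoted
pixels_per_second = pixels_per_frame * fps   # 115,200,000 per-pixel computations/s

print(f"{pixels_per_frame:,} pixels/frame -> {pixels_per_second:,} pixels/s")
```

And that is before overdraw, where parts of the scene get shaded more than once per frame, so the real load is even higher.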
DirectX 8 brought us pixel shaders 1.0, 1.1, 1.2, and 1.3.
DirectX 8 allowed programmers to write pixel shaders up to 12 instructions in length. Immediately after DX8's release, programmers and Microsoft themselves determined that 12 instructions weren't quite enough. That's why, right after DX8's release, MS gave birth to DirectX 8.1 and introduced us to a new shader...PS 1.4
PS 1.4 allowed programmers to write shaders at nearly twice the size of DX8 shaders...up to 22 instructions in length. So now you have to remember that cards that are DX8 (Ti series) support pixel shaders 1.0, 1.1, 1.2, and 1.3, but not 1.4. Cards that are DX8.1 (Radeon 8500 and up) support all of the shaders required by DX8, but also support PS 1.4.
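The support rules in the last two paragraphs reduce to a simple lookup. A hypothetical helper (the function name and card groupings are mine, taken only from this thread, not an official table):

```python
# Max pixel shader version per card family, following the DX8 vs. DX8.1 split:
# DX8 cards (GeForce3/4 Ti) stop at PS 1.3; DX8.1 cards (Radeon 8500 up) add PS 1.4.
MAX_PS = {
    "GeForce3": "1.3",
    "GeForce4 Ti": "1.3",
    "Radeon 8500": "1.4",
    "Radeon 9100": "1.4",
}

def supports_ps14(card):
    """True if the card can run PS 1.4 shaders, per the table above."""
    return MAX_PS.get(card) == "1.4"

print(supports_ps14("GeForce4 Ti"))  # -> False
print(supports_ps14("Radeon 8500"))  # -> True
```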
PS 1.4 support is why the Radeon 8500 was such a revolutionary card compared to the GeForce 3 & 4.
I've also seen some people struggle with understanding pixel shaders and OpenGL. OpenGL has pixel shaders, but they aren't called pixel shaders; they're called ARB fragment shaders, but they basically perform the same functions.
I'm starting to lose track of what I'm doing, so I know that I'm getting tired :)
Tomorrow we'll tackle DX9 and all of the fun stuff that entails.
CoolSquirtle and UFO, if I'm typing all of this info out and you guys don't somehow use it in your project, I will personally rip off your heads and shiit down your windpipes :)
This will be an extremely valuable resource for the Technical FAQ. Thanks again!


----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
 
Dang Spitfire, do you HAVE to do all the work? Some others, including myself, would be delighted to share the burden of the responsibility for all this work. Plz give me some tasks or an assignment, & I'll get to work.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
UFO....you're the cheerleader :)



Spitfire, looking back on what I wrote, I don't really see how useful any of it would be for the buyers' guide. Maybe you can do something with it though...

<b>I help because you suck</b>
 
GW, I've got to hand it to you. When I started reading this thread, I thought you were going to go after UFO. I know he sounds awfully anxious, but he's young. I go, Oh No! 😱

Then you posted what you thought instead of just going on & on without doing anything. It takes a Big Man to do something about something, rather than give lip service. I think how you handled this is a great example to the forum. It ratcheted up my respect for you a bit.

BTW, your sig is true. :wink:

Dazzle them with Brilliance, or Baffle them with BS! :wink:
 
The contents of your second post won't be used directly in the buyers' guide; they'll be used in the linked "Technical FAQ".

 
Oh yes, of course you'll get some work. For the moment, I'm giving you a data-collection job.

Collect data in the following format:

Core Clock:
Memory Clock:
Memory Bus:
Memory Type:
Memory Bandwidth:
Fill rate:
Pixel Shader version:
Vertex Shader version:
DX support:
Max. AA:
Max. AF:
RAMDAC:
AGP Interface:
Other features: (video capabilities, etc.)
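For whoever collects these: the Memory Bandwidth field can be derived from the Memory Clock, Memory Bus, and Memory Type fields above it. A sketch of the relationship (the function name is my own):

```python
# Peak memory bandwidth = effective transfers/s x bytes moved per transfer.
# DDR memory transfers data twice per clock; SDR only once.

def memory_bandwidth_gbps(mem_clock_mhz, bus_width_bits, memory_type="DDR"):
    """Theoretical peak bandwidth in GB/s (decimal gigabytes)."""
    pumps = 2 if memory_type.upper() == "DDR" else 1
    bytes_per_transfer = bus_width_bits / 8
    return mem_clock_mhz * 1e6 * pumps * bytes_per_transfer / 1e9

# Radeon 9700 Pro: 310 MHz DDR memory on a 256-bit bus
print(round(memory_bandwidth_gbps(310, 256, "DDR"), 1))  # -> 19.8
```

That 19.8 GB/s figure matches the spec sheets, so the formula is a good sanity check on any numbers you collect.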



Collect data for the following cards:

GeForce3 / Ti200 / Ti500
GeForce4 MX 420/440SE/440/460
GeForce4 Ti 4200/4200-8x/4400/4800SE/4600/4800
GeForce FX 5200/5200 Ultra
GeForce FX 5600/5600 Ultra/5600 Ultra rev.2
GeForce FX 5800/5800 Ultra
GeForce FX 5900/5900 Ultra
GeForce FX 5950

Xabre 400/600

Radeon 8500/8500LE
Radeon 9000/9000 PRO
Radeon 9100
Radeon 9200
Radeon 9500/9500 Pro
Radeon 9600/9600 Pro
Radeon 9700/9700 Pro
Radeon 9800/9800SE/9800 Pro
Radeon 9800XT



After collecting the data, send it via PM.

 
I'm on it. Just one question though. Since some manufacturers deviate from stock speeds and use slower memory on certain cards, do you want me to include info on those deviations as well?

 

GW you're a knowledge whore. :wink:

Great new info there for most of us btw.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>This just in, over 56 no-lifers have their pics up on THGC's Photo Album! </b></font color=blue></A> :lol:
 
Good morning RC. I was surprised to see the sudden change of heart in GW too, especially after I non-constructively called him a pessimist. It really meant a lot to me that he was willing to analyze the whole guide in its current state and propose what needed to be done with it.

 
You?
BOING?

:wink:
You can boink me anytime!

 
Boring? Are you kidding? Technical info is FAR from boring, it's AWESOME. Not only that, technical information is something we need a lot more of around here. If you feel like posting technical stuff, don't hold back at all, I love it.

 
I'm on it. Just one question though. Since some manufacturers deviate from stock speeds and use slower memory on certain cards, do you want me to include info on those deviations as well?
Include the official specs from nVidia/ATI/SiS. I'll leave a note at the end of every page regarding this issue.

 
For example, the GeForce4 MX440SE memory spec should be 166 MHz / 128-bit DDR, though most manufacturers make MX440SE cards with 64-bit memory.
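That 64-bit cut matters because bus width scales peak bandwidth linearly; with the 166 MHz DDR spec above (the arithmetic is my own illustration):

```python
# Bandwidth of 166 MHz DDR memory at the official 128-bit width vs. the 64-bit cut.
clock_hz = 166e6
for bus_bits in (128, 64):
    gbps = clock_hz * 2 * (bus_bits / 8) / 1e9   # DDR = 2 transfers per clock
    print(f"{bus_bits}-bit: {gbps:.1f} GB/s")
# The 64-bit cards have half the memory throughput of the official spec.
```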

 
Don't forget TruForm with the 8500...it looked sooo friggin' sweet with Q3

----------
<b>I'm not normally a religious man, but if you're up there, save me, Superman! </b> <i>Homer Simpson</i>
 
Are you going to take on all the manufacturers?
Would it be OK if I take on Asus for you? 'Cause right now, they're the only ones releasing both ATI and nVidia cards, that's why I'm interested. Would it be OK?

System Integration...yeah right, thanks to marketing, more confusion
 
Instead of asking, I would simply put something together and turn it in to them. If you present them with something useful, they'll more than likely incorporate it into one aspect or another of the buyers' guide.

<b>I help because you suck</b>
 
*cough* you guys just won't let me rest, won't you -_-
lemme fix this up



Special GeForce FX notes: All GeForce FX cards are temporarily removed from this buyers' guide due to their serious performance problems in Half-Life 2 and unimpressive performance in other DX9 games. If the Detonator 50 drivers solve the problems without cheating, we will add the GeForce FX series cards back into this buyers' guide.

Until then, avoid all GeForce FX cards by all means.
Removing the FX cards from the buyers' guide is a joke. Of the 200+ popular games on the shelves being bought, the FX cards have trouble with only a few. I wouldn't even call it having trouble; I would merely say that they don't perform as well as the competition's.
FX 5900s are under $250 in stores now, and I personally think that's a good deal.

*cough* didn't I say I'll modify it when the DET 52s come out?

FAQ
~~~

#1) What does "FAQ" mean?
Ans: Frequently Asked Questions

#2) Should I read this guide before I ask a question about buying a graphics card?
Ans: YES!

#3) I don't understand the technical things in this guide.
Ans: A technical FAQ is coming; it should make everything clear.

#4) There's no list of recommended card manufacturers.
Ans: See at the bottom of the guide.

#5) Where can I search for prices?
Ans: See at the bottom of the guide.

#6) I think this guide is "x" brand biased!
Ans: Of course not!

#7) I have comments/suggestions/flames about this guide.
Ans: See at the bottom of the guide.



_______________________________________________________
Start of the FAQ



_____________________

Value Cards (up to $150)
_____________________



Recommended Cards


Best buy: GeForce4 Ti4200 (64 MB version)
The Radeon 9600 Pro is as low as $139 in stores; this should be at the top of the list.

*cough* I'm sorry GW, not everyone is as lucky as Americans. The R9600 Pro costs an arm and a leg in Canada as well as other places. Change it back please!

Other good cards in this price category: Radeon 8500LE/Radeon 9100, Radeon 9200 Pro/non-Pro


GeForce4 Ti4200

Pros- Fastest of all these cards, cheap
Cons- No DX9 support, AA/AF can hurt performance
Sidenote: There are three types of Ti4200
Ti4200 64 MB 4x/8x- cheapest, second fastest (clocked higher), better-quality RAM
Ti4200 128 MB- slowest of the three, said to use cheap RAM to reduce cost (clocked lower)
Ti4200-8x 128 MB- 8X AGP, slightly faster than the Ti4200 64 MB, uses good RAM

Radeon 8500/8500LE/9100

Pros- Cheap, full DX8.1 support, good image quality
Cons- No DX9 support, not so fast, 8500/8500LE are discontinued products


Radeon 9000 Pro/9200 Pro

Pros- Cheap, full DX8.1 support, good image quality
Cons- No DX9 support, not so fast



Cards to Avoid


SiS Xabre 600/400

Pros- Cheap
Cons- Speed comes at the cost of heavy quality loss, very slow when quality on par with ATI/nVidia cards, poor driver support


GeForce4 MX440
The MX440 shouldn't be in this category; it's a $50 card, and it performs well for that price. Not everyone reading this is a hardcore gamer.

*cough* again! Not everyone here is American!

Pros- Cheap
Cons- No DX8 support, basically a turbocharged GeForce2 MX with a video processing engine. Slow in DX8 games


Radeon 9500 (non-Pro)/Radeon 9600 (non-Pro)
Pros- DX9 support, best AA and AF performance among value cards
Cons- The Radeon 9500 (non-Pro) can sometimes be very slow due to its 4-pixel-pipeline design. The Radeon 9600 (non-Pro) is bandwidth-limited and, unlike the Pro version, not a good overclocker.
It does too overclock well

*cough* why did you even put it in "cards to avoid"?

Special notes for budget graphics card buyers:

Avoid the 128 MB versions and save money. None of these budget cards is powerful enough to make use of 128 MB of memory. Moreover, many 128 MB Radeon 8500LE/9100 cards are slower than the 64 MB versions, because they come with slower memory.




__________________________

Midrange Cards ($150 to $300)
__________________________


Recommended Cards


Best buy: Radeon 9700 (non-Pro)/9800 (non-Pro)

Other good cards in this price category: Radeon 9500 Pro, Radeon 9600 Pro, GeForce FX5600 Ultra rev2.0


Radeon 9700 (non-Pro)/9800 (non-Pro)
Pros- DX9 support, fastest of all these cards, good price/performance ratio, very good AA/AF performance
Cons- Most expensive among the listed cards, the 9700 (non-Pro) is quite hard to find
www.pricewatch.com

*cough* AGAIN! not everyone is american

Radeon 9500 Pro

Pros- DX9 support, inexpensive, speedy, good price/performance ratio, very good AA/AF performance
Cons- Discontinued product, hard to find
not where I live

*cough* OMFG! SORRY! IT'S NOT OUR FAULT WE'RE NOT AMERICAN!

Radeon 9600 Pro

Pros- DX9 support, inexpensive, speedy, very good overclocker (some are), good price/performance ratio, very good AA/AF performance
Cons- Slower than the Radeon 9500 Pro



Cards to Avoid

GeForce4 Ti4800/4600/4800SE/4400
These cards can be had for cheap. I disagree that they aren't worthy of consideration.

*cough* speechless

Pros- Inexpensive, runs most games smoothly without AA and AF
Cons- No DX9 support, get wasted by Radeons, not really worth the price anymore


Special notes for midrange graphics card buyers:


Don't think about buying the 256 MB GeForce FX 5600 cards. They aren't fast enough to utilize 256 MB of VRAM.

GeForce4 Ti4800 = GeForce4 Ti4600 + AGP 8X; GeForce4 Ti4800SE = GeForce4 Ti4400 + AGP 8X







__________________________

High-End Cards ($300 and up)
__________________________



Recommended Cards

Radeon 9800XT

Pros- DX9 support, fastest card, very good AA/AF performance
Cons- Very expensive, only a little faster than the Radeon 9800 Pro 128 MB, no new features compared to the Radeon 9800 Pro
I guess dynamic overclocking and temperature monitoring aren't new features?....c'mon dude


Radeon 9800 Pro (128 MB version)

Pros- DX9 support, very fast, very good AA/AF performance
Cons- Expensive


Radeon 9700 Pro

Pros- DX9 support, not much slower than the Radeon 9800 Pro, best price/performance ratio in this category, very good AA/AF performance
Cons- none



/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\



Recommended Brands: (sorted alphabetically)


For ATI based cards:
Best: Built by ATI, Hercules
Good: Gigabyte, Sapphire
I would take Sapphire over Hercules any day.....better RAM
For nVidia based cards:
Best: ASUS, MSI, Leadtek, PNY
Good: ABIT, Gainward


Brands to avoid: (sorted alphabetically)

For ATI based cards: Gigacube, PowerColor (huh?)

For nVidia based cards: Daytona Palit (huh?), and any unknown/less-known brand
While telling people which card is better is good, it still doesn't tell them why one is better than another. When you tell an average person that one card has DX8 support and one has DX7 support, you aren't telling that person a whole lot. Somewhere in this guide, there needs to be a brief description of pixel/vertex shaders, memory bus width, and what each DX spec calls for (not just DX9, but DX8.1, DX8, DX7). I wouldn't go as far as trying to describe the fundamentals of floating-point precision in modern architectures, but there's a lot that needs to be covered if you guys want to make this guide really great. I think that a full description of what each generation of graphics card is capable of (fill rate/shader performance/poly count/which shaders the card uses/memory bandwidth, etc.) would be of more use to the new people than just telling them which card is better. I would want to know why the card you recommended is better. Unless a person can get that answer in the buyers' guide, they're only going to start a thread asking the question anyway....and stopping the same redundant questions from being asked is the reason we wanted the buyers' guide in place to begin with.

*cough* god.........

BORING? Sorry guys, not everyone wants to read all that stuff! That's what the technical FAQ is for? Hmm, it's for people who actually want to learn about video cards vs. people who want to buy a card that can run XXX game/program (the majority)

and spitfire...... just lemme review stuff before you go changing it 😀


RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy
bring it fanATics~~ nVidia PWNS all!
SCREW aBOX! LONG LIVE nBOX!!