What are Pixelshaders?

ufo_warviper

<b>THIS STICKY ROX DUDE!</b>

<b>EXCEPTIONAL</b> work GW! I will comment more later, because I am hard at work on the chip/card specs project. I kind of see what you're getting at towards the end of your document. To an extent, I definitely consider DX8 and DX9 part of the same era, because of the focus on pixel and vertex shaders in both generations. I can't believe everyone else passed this up! 😡

<A HREF="http://forumz.tomshardware.com/community/modules.php?name=Forums&file=viewtopic&p=37267#37267" target="_new"><b>OFFICIAL PETITION - REPEAL SPUD'S BAN?</b></A>
 
I tried out Omega's drivers a while back, and that's when I realized what the drivers actually are. The Omegas are actually nothing but the standard ATi-programmed and compiled driver code as found in the Catalysts (the same is true for the nVidia Omegas). The difference is that in the Omega driver sets, the Omega Guy mixes and matches driver file components from differing driver versions according to his own personal preferences and formula. The rest of the differences are accomplished by making text changes to the install .inf that comes with the standard drivers. In other words, what the Omegas actually are is a set of "repackaged" standard driver files drawn from more than one set of an IHV's officially released drivers, with a customized install .inf file (written in text as per the standard .inf format) which he sets up to do simple scripts and things, such as his "softmod" scripts.

You or I sitting at home could do the very same thing, since it has nothing to do with changes to the driver's source code in any manner. To be fair, the Omega Guy has never claimed on his site to actually code any of the drivers he distributes, and has always truthfully stated that what he does is simply repackage the standard IHV driver sets into custom groups of files. An Omega driver set "based on" the Cat 3.8's might include various files from the Cat 3.0's, the Cat 3.4's, or really any version of the Catalysts, if the Omega Guy thinks a given file is "better" than the actual component ATi puts in the official 3.8's. I can't see how that approach won't cause more problems, somewhere, than it solves--and that's why I wasn't tempted by them.

When I tried the Omegas with my 9500 Pro, I was pretty unimpressed. The Omega Guy had written his driver readme to express his belief that his drivers improved on the standard official ATi drivers in both performance and IQ. What I discovered after installing them was that performance definitely improved, but only because the Omega Guy had decided it was a good thing to disable trilinear filtering for ATi products at that time, for whatever Cat's that Omega set was "based on." What I think happened is that in the file mixing between driver versions, something simply broke trilinear in the Omegas, and he apparently didn't notice. I'm an IQ kind of guy, so I went back to the standard drivers and have never tried a set of "modded" drivers since. As I said, I can't say for certain, but I would imagine W1zzard does much the same thing as Omega: mixing and matching driver file components, and writing custom install scripts into the standard install .infs that come with the official IHV drivers. Not having the source code from any IHV means that even if they knew how to write driver code on even a rudimentary level, they could not do so for any IHV's driver set--which is a *good* thing. What they are doing now, as potentially harmful as it might be to someone's configuration, is not as bad as what could happen if these guys were able to monkey around with the driver code itself at the source level.

You know, I really did agree on one level with nVidia when it decided a couple of months ago to try to discourage the Omega Guy from continuing to produce nVidia Omegas, and this is one time I believed them when they said many users of their products were reporting problems. The Omega approach to "modding drivers" has many strikes against it from the point of view of an IHV, since it can completely defeat the exhaustive and expensive bug-fixing work an IHV does on a continuing basis. On the level of customer relations, however, it's a tough spot for an IHV, which is why nVidia backed off its "hard line" against Omega Detonator driver modding when it perceived the stance was having a negative effect among actual and potential customers. I would also imagine that fewer than 5% of nVidia's and ATi's customers use such "modded" drivers, so there's really no sense in making the issue seem larger than it is. As long as people understand what these driver packages are, and what they aren't, I think it's fine if people want to play around with them or use them. I don't think using them is advisable, but that's just my own opinion.


The words "mixed and matched driver files from various official IHV driver releases, along with custom text-based driver install .infs, which themselves are simply the standard IHV install .infs to which text scripts have been added and various other things added and deleted," best describe what the drivers are. "Repackaged IHV driver sets," is the way I would describe them. Someone using such drivers should clearly understand that they can easily defeat all of the beta-testing both in-house and external that an IHV does before releasing an official set of drivers, they absolutely do defeat any "WHQL" testing certification the manufacturer's drivers have that they work correctly with Windows, and depending on which driver files ATi ships with the Catalysts that a modded driver replaces with files from earlier Catalyst release, there may be several bug fixes in the IHV release which are not in the modded driver version which is "based on" that release, if the bug fixes were made in the driver files that Omega or Wizzard choose to replace with older IHV driver files.

By comparison with what Omega is doing, look at what Unwinder, the RivaTuner guy, does. There's a big difference: RivaTuner accesses driver functionality based on a knowledge of the driver source code, which the author does not pretend to have obtained through "official" channels, and in the cases where he couldn't obtain what he needed that way, his overall knowledge of the source structure and functionality allowed him to reverse-engineer the rest. Indeed, it was largely because of RivaTuner and its efforts to render nVidia driver "optimizations" nil that nVidia began encrypting its D3D drivers--to defeat such efforts in the future. It seems RivaTuner has still been able to work out a few things anyway with respect to the encrypted Detonators... (BTW, I have no first-hand knowledge that they are currently encrypted; I'm simply taking the word of those who say such encryption exists.) The point for me is that RivaTuner approaches driver configuration at a much deeper level than the Omegas do, and does it without having to change any of the current release driver files, and without rewriting or customizing the standard driver install .inf--such scripts as RivaTuner uses are internal to the RivaTuner program, AFAIK. Of course this example is only relevant to nVidia drivers, since Unwinder has been unable to get very far with the ATi driver source code... But I'd also like to say that I stopped using third-party 3D-driver tweaker software myself a while ago, because I found myself often experiencing general stability problems while playing games, which I eventually traced to two things:

(1) Tweakers were activating functions in the 3D card drivers via registry switches--functionality that was often unfinished or buggy, and turned off by default as the IHV intended--so the features, when turned on by the tweakers, often did not work, or else introduced stability problems that were difficult to pin down.

(2) Tweakers (RivaTuner is pretty good about this, though) often uninstall themselves while leaving many if not all of their registry and .inf file settings changed in the system. This caused me a lot of initial perplexity, as I had naively assumed that the tweakers I installed would "naturally" do a thorough job of uninstalling themselves along with any settings changes they had made. Even with the offending tweaker removed from the system, I kept having problems caused by the driver settings it had changed and failed to reset. Often it's not clear where in the registry or the system a tweaker makes all of its changes (unlike official IHV drivers, which do a good job of uninstalling themselves and their settings), and I'm sure this accounts for that certain percentage of people who claim they "have to" reformat and reinstall Windows to ensure the proper installation of a new official driver set.
I myself have begun making my own drivers, like Omega and W1zzard, and am having fun doing it. There are a lot of different combinations to try out if you have a large enough library of drivers to use as parts. I recommend people try it if they ever have the time--just don't redistribute them. :)


<b>I help because you suck</b>
 
What's a pixel?


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
LOL!

You need a job where you can do both! :wink:

Luke, I am your... Brother.

The one they don't speak of who uses his jedi powers to open beers and lift skirts. :evil:

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
The purpose of this thread is to briefly familiarize you with the vertex and pixel shader functions in modern graphics cards. Anyone reading a graphics card review these days has come across these two terms numerous times, and unless you have a basic understanding of what these functions are and what they do for your video games, reading a review can be a frustrating experience. Providing an accurate description of these functions without getting too technical is rather tough in itself, but I am going to try to do it to the best of my limited knowledge...here we go...

<b>The Vertex Shader</b> = Objects in a 3D scene are typically described using triangles, which in turn are defined by their vertices. A vertex shader is a graphics processing function used to add special effects to objects in a 3D environment by performing mathematical operations on the objects' vertex data. Before DX8, vertex shading effects were so computationally complex that they could only be processed offline using render farms. Now developers use vertex shaders to breathe life and personality into characters and environments: fog that dips into a valley and curls over a hill, or true-to-life facial animation such as dimples or wrinkles that appear when a character smiles. People talk about pixel shaders a lot these days, but the vertex shader gave birth to the pixel shader, and it was a breakthrough just as important as hardware transform and lighting a few years earlier.
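To make that concrete, here's a rough sketch in plain C (my own illustration, not real GPU shader code or any particular game's effect) of the kind of per-vertex math a vertex shader runs, using a simple sine-wave ripple:

```c
#include <math.h>
#include <stdio.h>

/* A vertex position: the basic input a vertex shader works on. */
typedef struct { float x, y, z; } Vec3;

/* Toy "vertex shader": runs once per vertex and displaces the
   y coordinate with a sine wave, so a flat mesh ripples like
   water. A real vertex shader does this same kind of math on
   the GPU, for every vertex, every frame. */
Vec3 wave_vertex_shader(Vec3 v, float time)
{
    v.y += 0.1f * sinf(4.0f * v.x + time);
    return v;
}

int main(void)
{
    Vec3 v = {1.0f, 0.0f, 0.0f};
    Vec3 moved = wave_vertex_shader(v, 0.0f);
    printf("displaced y = %f\n", moved.y); /* 0.1 * sin(4.0) */
    return 0;
}
```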

<b>Pixel Shaders</b> = A pixel shader is a graphics function that calculates effects on a per-pixel basis. Depending on resolution, in excess of 2 million pixels may need to be rendered, lit, shaded, and colored for each frame, at 60 frames per second, which creates a tremendous computational load. Modern cards process this load through pixel shaders. Per-pixel shading brings out a high level of surface detail, allowing you to see effects beyond the triangle level. Rather than simply choosing from a precompiled palette of effects, developers can create their own. Pixel shaders give developers control over the lighting, shading, and color of each individual pixel, allowing them to create some really cool effects.
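Here's the same kind of rough plain-C sketch for the per-pixel side (again my own illustration; simple diffuse lighting is just one classic per-pixel effect, not any particular game's shader):

```c
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float r, g, b; } Color;

static float dot3(Vec3 a, Vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Toy "pixel shader": diffuse (Lambert) lighting for one pixel,
   computed from its surface normal and the direction toward the
   light. The GPU runs logic like this once per drawn pixel; at
   1600x1200 that is 1,920,000 pixels per frame, or about 115
   million shader runs per second at 60 fps. */
Color diffuse_pixel_shader(Vec3 normal, Vec3 to_light, Color base)
{
    float n_dot_l = dot3(normal, to_light);
    if (n_dot_l < 0.0f)
        n_dot_l = 0.0f; /* pixels facing away from the light get none */
    Color out = { base.r * n_dot_l, base.g * n_dot_l, base.b * n_dot_l };
    return out;
}

int main(void)
{
    Vec3 n = {0.0f, 1.0f, 0.0f};  /* surface facing straight up */
    Vec3 l = {0.0f, 1.0f, 0.0f};  /* light directly overhead    */
    Color red = {1.0f, 0.2f, 0.2f};
    Color lit = diffuse_pixel_shader(n, l, red);
    printf("lit color = (%.2f, %.2f, %.2f)\n", lit.r, lit.g, lit.b);
    return 0;
}
```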
DirectX 8 brought us pixel shaders 1.0 and 1.1.
DirectX 8 allowed programmers to write shader programs up to 12 instructions in length. After DX8's release, programmers and Microsoft themselves determined that 12 instructions weren't quite enough. So soon after DX8's release, MS gave birth to DirectX 8.1, which introduced a few new shader versions: PS 1.2, 1.3, and 1.4.
PS 1.4 let programmers write shaders nearly twice the size of DX8 shaders: up to 22 instructions in length. So remember that DX8 cards (the GeForce Ti series) support pixel shaders 1.0 through 1.3, but not 1.4, while DX8.1 cards (the Radeon 8500 and up) support all of the shaders required by DX8, plus PS 1.4.
Now that we have DirectX 8 and 8.1 covered briefly, let's talk a little about DirectX 9.
DirectX 9 brings us some new features in the way of pixel and vertex shaders version 2.0. These new shaders have a much higher instruction count than their DirectX 8.1 brethren, and allow game programmers to pull off even cooler effects than they could before. However, the real key feature of DirectX 9 is the introduction of RGBA values with 64-bit (16-bit FP per channel) as well as 128-bit (32-bit FP per channel) floating-point precision. This large increase in color precision allows a surprising new range of visual effects and picture quality (see the sketch below for what those bit counts mean).

For those of you reading graphics reviews on the web, or visiting tech forums to find out what the latest buzz is, DirectX 9 cards are definitely the topic of discussion. People with cards from only one generation ago are more often than not made to feel that they are in desperate need of an upgrade. While this is an expensive pattern to fall into, it happens easily to most people who consider themselves enthusiasts. To the writer of this thread, it can all seem absurd at times: the vast majority of games available today barely take advantage of the DirectX 8 API, much less DirectX 9. You could probably count the number of publicly available DX9 titles on one hand....
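To show what those bit counts mean in practice, here's a small plain-C sketch (my own illustration, not DX9 API code) contrasting the old integer color format with DX9's floating-point one:

```c
#include <stdio.h>
#include <stdint.h>

/* Pre-DX9 framebuffer color: 8 bits per channel, 32 bits total.
   Every intermediate result gets squeezed into 0..255, so long
   chains of shader math lose precision and can band visibly. */
typedef struct { uint8_t r, g, b, a; } ColorInt8;  /* 4 x 8  = 32 bits  */

/* DX9-class color: a 32-bit float per channel, 128 bits total
   (the 64-bit mode uses 16-bit floats instead: 4 x 16 = 64).
   Intermediate values can go above 1.0 or stay tiny without
   being crushed, and are only rounded at the very end. */
typedef struct { float r, g, b, a; } ColorFP32;    /* 4 x 32 = 128 bits */

int main(void)
{
    printf("integer RGBA: %zu bits\n", 8 * sizeof(ColorInt8)); /* 32  */
    printf("float   RGBA: %zu bits\n", 8 * sizeof(ColorFP32)); /* 128 */
    return 0;
}
```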
So why all the fuss? Because that's how it's always worked. Hardware manufacturers count on it. The online communities beat it into you. You will believe that you need the latest and greatest. After all, it sucks never to be able to watch all of those cool demos, or join in on discussions about one of the few DX9 games that may have just been introduced. I myself have never seen the Dawn demo run on my own, or anybody else's, computer. As silly as it sounds, I feel like a half geek instead of a full-fledged geek because of it. :)



<b>I help because you suck</b>
 
What's a pixel?



A pixel is one of the many tiny dots that make up the representation of a picture in a computer's memory. Each such information element is not really a dot, nor a square, but an abstract sample. With care, pixels in an image can be reproduced at any size without the appearance of visible dots or squares; but in many contexts, they are reproduced as dots or squares and can be visibly distinct when not fine enough. The intensity of each pixel is variable; in color systems, each pixel typically has three or four dimensions of variability, such as Red, Green, and Blue, or Cyan, Magenta, Yellow, and Black.

http://en.wikipedia.org/wiki/Pixel

Hope I helped. :wink:
 
http://en.wikipedia.org/wiki/Sarcasm

 
<i>To the writer of this thread, it can all seem absurd at times: the vast majority of games available today barely take advantage of the DirectX 8 API, much less DirectX 9.</i>

I think you'll find that the vast majority of games use at least one 2.0 shader, and most use quite a few.

<i>You could probably count the number of publicly available DX9 titles on one hand....</i>

That's because it's suicidal to release a DX9-only game while so many people are stuck with DX8 chips (or, God forbid, a DX7 GeForce MX), but plenty of games rely on DX9 for eye candy. I imagine that will change pretty quickly with Xbox 360 games being ported to the PC.

Anyone who bought a 9700 Pro or 9500 Pro three years ago for the DX9 features is probably still getting usable performance out of it today in most games: that's pretty good going in the graphics world. Anyone who said 'DX9 isn't important, there are no DX9 games' at that time is probably either running current games in crappy mode or has had to upgrade by now.
 
hey, someone up there asked: what is a pixel? hahahahahahahhah!

a pixel is a dot that displays only one color

and thousands of pixels make up your screen
 
<i>hey, someone up there asked: what is a pixel? hahahahahahahhah!

a pixel is a dot that displays only one color

and thousands of pixels make up your screen</i>

(Lack of logical thinking + an eye-scanning error, attributable to not viewing two whole posts) ftl!
 
OK, but does anyone know what anti-aliasing is?
Anti-aliasing is the smoothing of jagged lines. For example, if you look at the edge of a wall while playing a game, it might appear jagged; that's because no anti-aliasing is enabled. With anti-aliasing enabled, you will notice that the edge is much smoother. The more anti-aliasing you enable (2x, 4x, 6x, and so on), the more of an effect it has on objects, making your overall video quality much better. It comes at a great price, though: your overall performance will drop a decent amount, which is why older cards should not enable anti-aliasing on newer games.
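Real cards use smarter multisampling patterns, but here's a minimal plain-C sketch (my own illustration, not driver or API code) of the basic supersampling idea behind anti-aliasing, and of why it costs performance:

```c
#include <stdio.h>

typedef struct { float r, g, b; } Color;

/* Stand-in scene: white above the diagonal y = x, black below.
   The hard diagonal edge is exactly the kind that looks jagged. */
static Color sample_scene(float x, float y)
{
    Color white = {1.0f, 1.0f, 1.0f}, black = {0.0f, 0.0f, 0.0f};
    return (y > x) ? white : black;
}

/* Toy 4x supersampling: each final pixel is the average of four
   sub-samples taken inside it, so a pixel the edge passes through
   comes out gray instead of pure black or white, and the edge
   looks smoother. This is also why AA hurts performance: the card
   does several times the sampling work for every pixel. */
static Color shade_pixel_4x(int px, int py)
{
    static const float offs[4][2] = {
        {0.25f, 0.25f}, {0.75f, 0.25f},
        {0.25f, 0.75f}, {0.75f, 0.75f}
    };
    Color sum = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < 4; i++) {
        Color s = sample_scene(px + offs[i][0], py + offs[i][1]);
        sum.r += s.r; sum.g += s.g; sum.b += s.b;
    }
    sum.r /= 4.0f; sum.g /= 4.0f; sum.b /= 4.0f; /* average them */
    return sum;
}

int main(void)
{
    Color c = shade_pixel_4x(10, 10); /* a pixel the edge cuts through */
    printf("edge pixel = (%.2f, %.2f, %.2f)\n", c.r, c.g, c.b);
    return 0;
}
```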
 
Very informative - thanks!

But a couple of related questions that I've always wanted to ask: GPU core clock speeds are typically in the 400 to 600 MHz range, which seems slow compared to the clock speeds of AMD and Intel processors. Do you know why GPUs don't run at much higher clock speeds? Is a pixel shader the equivalent of a core in a multi-core processor? If a GPU has 16 pixel shaders, does that mean it has the equivalent of 16 cores that concurrently process pixels? Would that be a reasonable explanation for the lower clock speed?

Thanks and Best