I'm not saying there's <i>no</i> difference between 16-bit and 32-bit color, I'm just saying that if your graphics card takes a hit on high detail with 32-bit, dropping the color to 16-bit is a much better compromise than sacrificing texture resolution. The difference from each color upgrade seems to make less & less of an impact as time goes on.
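To put some numbers behind that, here's a quick back-of-the-envelope Python sketch (just my own illustration; the bit depths are the standard PC video modes, and "32-bit" really only carries 24 bits of color plus 8 bits of alpha/padding):

<pre>
# Rough sketch: how many colors each common PC color depth can show.
# "32-bit" color is really 24 bits of RGB plus 8 bits of alpha/padding,
# so it displays the same 16.7 million colors as 24-bit.
modes = [
    ("Monochrome",          1),
    ("VGA 16-color",        4),
    ("256-color (8-bit)",   8),
    ("High color (16-bit)", 16),
    ("True color (32-bit)", 24),
]

for name, bits in modes:
    print(f"{name:22s} {2 ** bits:>12,} colors")
</pre>

Each jump multiplies the color count enormously, which is exactly why the later jumps are harder and harder to actually see.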
Take monochrome: when graphics went to 16-color VGA, that was way better than monochrome. The bump from 16 colors to 256 colors was also a giant leap in color quality. The bump from 256 colors to 16-bit wasn't as big as the other two leaps, but it was still a nice enhancement. I even remember playing Rebel Assault II in the mid-90s, and the FMV sequences looked terrific in 256 colors. I thought the graphics were getting so good back then, but FMV-based games don't use 3D hardware rendering. The graphics in Rebel Assault II were surprisingly good for the time, but the game was limited in nature because it did not offer the freedom of movement of a 3D engine.

I've compared UT (original UT GOTY) in 16-bit vs. 32-bit many, many times. I see a difference in color quality, but it's just not that great or noticeable if you are concentrating on actually killing your opponents. Yes, I do play in 32-bit color, but I also definitely think that playing in 16-bit color is worthwhile if framerate or texture resolution must be compromised to play in 32-bit. Did you see my comments about the Matrox card? I can play at high quality at great framerates on that card IF AND ONLY IF I use 16-bit color. If I use 32-bit color on the Matrox card, the game is unplayable at any resolution or level of detail except really low ones.

If you think about it, 16-bit is thousands & thousands of colors, and how many wavelengths are there on the visible light spectrum in increments of 1 nanometer? Not more than a thousand, I think. Here, let me go dig it up in my series of "Barron's EZ-101 Study Keys":
<b>Visible light</b> is nothing more than that part of the electromagnetic spectrum to which the human eye is sensitive. The visible region is only a small portion of the electromagnetic spectrum, ranging from about 4 x 10^-7 m (400 nm) to 7 x 10^-7 m (700 nm).
A length of even 1 nanometer is invisible to the human eye. With a small range of about 300 nanometers, how many different wavelengths can you cram into that range? In theory, millions; infinite, in fact. But if you were to look at the visible light spectrum at 650 nm red, would you see the difference between that and 651 nm red? Maybe so, but there won't be much difference. 16-bit color covers about 65,536 colors in this small range. So if you divide the 300 nm visible light range by 65,536, you find that each increment of color on a 16-bit digital visible light spectrum is only about 0.0046 nanometers apart.
The calculation I used was:

300 nm / 65,536 ≈ 0.0046 nanometer increments.
As you can see, each increment is less than 0.5% of a nanometer, which is actually pretty good. The more bits you add, the more you are merely fine-tuning the color quality. The increment size of 16-bit color is much better than the 300 nm gap of monochrome. I assume the gap is 300 nm because neither black nor white exists on the visible light spectrum; they're supposed to be void of color.
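If anyone wants to check the math, here's the same division done for each color depth (again just a rough sketch that treats every displayable color as an evenly spaced step across the 300 nm visible range, which isn't really how RGB color maps to wavelengths):

<pre>
# Back-of-the-envelope: visible light spans roughly 400-700 nm (a 300 nm range).
# Pretend every displayable color is an evenly spaced step across that range
# and see how far apart the steps land at each color depth.
# (Monochrome is left out, since black and white don't sit on the spectrum at all.)
VISIBLE_RANGE_NM = 700 - 400  # 300 nm

for bits in (4, 8, 16, 24):
    colors = 2 ** bits
    step_nm = VISIBLE_RANGE_NM / colors
    print(f"{bits:>2}-bit: {colors:>10,} colors -> {step_nm:.6f} nm per step")
</pre>

Going from 16-bit to 24/32-bit only shrinks the step from about 0.005 nm to about 0.00002 nm, which is why I say the extra bits are mostly fine-tuning.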
I'll be very interested to read replies from Williamette, or anyone else for that matter, on this issue.
My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!