Newbie needs video card help!


ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
That's quite possibly right. According to the "crystal ball", the future is uncertain. Why is AMD saying the Hammer chipset is going to change the integrated graphics market?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 


Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
UFO, I have to tell you, you are SOO SOO wrong (about 16 vs 32bit color).

Now when ur surfing teh intarweb, or staring at ur desktop, sure, you may not notice much if any difference.

Play any game with fog or culling effects and you will see a huge difference.

Tell me UFO, if there wasn't much of a diff between 16 and 32bit color, why the hell would developers and vid card manufacturers be moving to 16/24/32 bit FLOATING POINT color? Because there just aren't enough colors for a truly realistic image, and smooth transitions.

Maybe your eyes suck ass.

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
 
I would tend to agree with WS, even in the day-to-day stuff. I don't mind the 24-bit on my laptop, but 16-bit just bugs me. It looks not quite as 'clean' (hard to describe).

There's just something about it. Especially since some of the colours, even when surfing, cause strange banding artifacts due to colour interpolation.

Anywhoo, I prefer 24+, but hey, I'm just glad to be beyond the era of EGA/MCGA, and even that was a nice boost from just CGA. 256 colours SUX for everything!


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
I'm not saying there's <i>no</i> difference between 32-bit and 16-bit color, I'm just saying if your graphics card takes a hit on high detail with 32-bit, lowering the color to 16-bit is a much better compromise than sacrificing texture resolution. The differences in color between each color upgrade seem to make less & less of an impact as time goes on.

Take monochrome: when graphics went to 16-color VGA, that was way better than monochrome. The bump from 16 colors to 256 colors was also a giant leap in color quality. The bump between 256 colors & 16-bit wasn't as big as the other two leaps, but it was still a nice enhancement. I even remember playing Rebel Assault II in the mid-90s, and the FMV sequences looked terrific in 256 colors. I thought the graphics were getting so good back then, but FMV-based games don't use 3D hardware rendering. The graphics in Rebel Assault II were surprisingly good for the time, but the game was limited in nature because it did not offer the freedom of movement of a 3D engine.

I've compared UT (original UT GOTY) in 16-bit vs. 32-bit many, many times. I see a difference in color quality, but it's just not that great or noticeable if you are concentrating on actually killing your opponents. Yes, I do play in 32-bit color, but I also definitely think that playing in 16-bit color is worthwhile if framerate or texture resolution must be compromised to play in 32-bit. Did you see my comments about the Matrox card? I can play at high quality at great framerates on that card IF AND ONLY IF I use 16-bit color. If I use 32-bit color on the Matrox card, the game is unplayable at any resolution or level of detail, except really low ones.

If you think about it, 16-bit is thousands & thousands of colors, and how many wavelengths are there on the visible light spectrum in increments of 1 nanometer? Not more than a thousand, I think. Here, let me go dig it up in my series of "Barron's EZ-101 Study Keys":

<b>Visible light</b> is nothing more than that part of the electromagnetic spectrum to which the human eye is sensitive. The visible region is only a small portion of the electromagnetic spectrum, ranging from about 4 x 10^-7 m (400 nm) to 7 x 10^-7 m (700 nm).
The length of even 1 nanometer is invisible to the human eye. With a small range of like 300 nanometers, how many different wavelengths can you cram into that range? In theory, millions; infinite, in fact. But if you were to look at the visible light spectrum at 650 nm red, would you see the difference between that and 651 nm red? Maybe so, but there won't be much difference. 16-bit color covers about 65,000 colors or so in this small range. So if you divide the 300 nm visible light range by approximately 65,000, you will find that each increment of color on the 16-bit digital visible light spectrum is 0.004615 nanometers wide.

The calculation I used was:

300 nm/65,000 = 0.004615 nanometer increments.

As you can see, each increment is less than 0.5% of a nanometer. This is actually pretty good. The more bits you add, the more you are merely fine-tuning the color quality. The increment size of 16-bit color is much better than the (300 nm) gap of monochrome. I assume the gap is 300 nm because neither black nor white exists on the "visible light" spectrum; they're supposed to be void colors.
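
For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope Python sketch. The linear "one wavelength per color value" mapping is just my simplification for this argument, not how RGB color actually works:

# Split the ~300 nm visible range into 16-bit increments.
# NOTE: mapping all 2^16 values linearly onto wavelengths is a
# simplification for the argument above, not how RGB color works.
VISIBLE_RANGE_NM = 700 - 400   # visible spectrum is roughly 300 nm wide
LEVELS_16BIT = 2 ** 16         # 65,536 distinct 16-bit values

step_nm = VISIBLE_RANGE_NM / LEVELS_16BIT
print(round(step_nm, 6))   # 0.004578 nm (0.004615 if you round down to 65,000)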

I'll be very interested to await replies from Willamette, or anyone else for that matter, on this issue. :smile:

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
We have yet to see an article that compares how well an IGP440 with dual-channel memory stacks up against a GeForce4 MX 440 add-in card. I really think it was necessary for them to have included that in the integrated chipset article; the article would have been more logical that way. Then we could see how much of a "true" performance hit an integrated chip that borrows on-board memory takes compared to an exact equivalent add-in card. That would have gone in nicely with Grape's suggestion to include the IGP9100 chip in the article. That article just left too many variables unresolved as far as integrated chipsets go.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Bumped because an important reply remains unanswered by Willamette & Grape. Sorry guys, I've just been anticipating your answer all day. Oohh, I can't wait.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
Xeen, that spectrum only represents colors, not intensity.

There's infinity times infinity possibilities within that spectrum.

16 bit color sux.

Don't have a lot of time and don't know offhand (I'm sure someone does), but if you look at JUST the shades of, say, blue, that 16-bit color can represent, it's not a lot. L00X like crap.

Someone elaborate:)

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
[quote]There's infinity times infinity possibilities within that spectrum.[/quote]
Agreed, I even mentioned that in my post.

65,000 colors, or 65,000 wavelengths, exist on the 16-bit digital visible light spectrum. There are 6 basic colors on the digital light spectrum. Even though the width of the range varies from color to color, if you divide 65,000 by 6 you get the average number of different wavelengths that exist for each color within the parameters of the spectrum I specified. This means there is an average of 10,833.333 varying wavelengths per basic color on the visible light spectrum. This figure, however, isn't even remotely accurate for colors like yellow that only cover 10 nm of the visible light spectrum. If you want more precise values for each basic color, you can calculate them from the ranges of wavelength for each color below (see the sketch after the list for the worked-out numbers).

1. Violet 400 - 424 nm
2. Blue 424 - 491 nm
3. Green 491 - 575 nm
4. Yellow 575 - 585 nm
5. Orange 585 - 647 nm
6. Red 647 - 700 nm
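
If anyone wants the per-band counts instead of the average, here's a small Python sketch under the same simplified linear mapping; the band boundaries are just the table above:

# How many of the 65,536 16-bit steps land in each named band,
# using the same simplified linear wavelength mapping as before.
BANDS_NM = {
    "violet": (400, 424),
    "blue":   (424, 491),
    "green":  (491, 575),
    "yellow": (575, 585),
    "orange": (585, 647),
    "red":    (647, 700),
}
RANGE_NM = 700 - 400
LEVELS = 2 ** 16

for name, (lo, hi) in BANDS_NM.items():
    steps = (hi - lo) / RANGE_NM * LEVELS
    print(name, round(steps))
# Yellow gets only ~2,185 steps while green gets ~18,350, so the
# ~10,833 "average per band" hides a big spread.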

[quote]16 bit color sux.[/quote]

I definitely agree that 16-bit color in 3D games is not quite as good as 32-bit, but I don't think it looks "downright horrid" or "not even worth playing on".

I think my argument has been taken a little out of context. It's true I'm overemphasizing the wrong points, but I think they're worth a look.

The crux, if you will, of my original argument before your first reply was that "sacrificing color quality is a better compromise than sacrificing texture detail."

So let me put it this way; all I am trying to ask is the following:

Would you rather play a game at full detail @ 16-bit color or low detail @ 32-bit color?

<b>There is indeed a noticeable difference between 16-bit and 32-bit.</b> But what I was trying to say is that this difference becomes less and less apparent with each color-bit upgrade.

For example: 32-bit makes a more noticeable improvement over 16-bit than 64-bit will to 32-bit.

64-bit makes a more noticeable improvement over 32-bit than 128-bit will to 64-bit.

128-bit makes a more noticeable improvement over 64-bit than 256-bit will to 128-bit.

Aww, I'm confusing myself! But basically I am saying that the difference between each progressive color upgrade won't make as big of a noticeable impact as the last color upgrade.

Yes, I know 256 colors sucks by today's standards, but it had a much greater effect against 16 colors than 16-bit had against 256 colors. But 16-bit was a really nice upgrade too.

Mathematically, the visible difference between each color upgrade is logarithmic, IMO, even though the number of wavelengths available in the palette increases exponentially with each upgrade.
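
To put that diminishing-returns idea in numbers, here's a tiny Python illustration. Treating the whole palette as one 0-to-1 shade scale is my simplification; real formats split the bits across R, G, and B channels:

# Each bit-depth upgrade multiplies the palette, but the gap between
# adjacent shades halves per added bit, which is roughly what the eye
# stops being able to notice.
for bits in (8, 16, 24):
    total_colors = 2 ** bits
    gap = 1.0 / total_colors    # spacing between adjacent shades on a 0..1 scale
    print(bits, total_colors, gap)
# "32-bit" on a video card is really 24 bits of color plus 8 alpha/unused
# bits, so 24 is the top row here. The palette grows exponentially
# (256, 65536, 16777216) while the per-step gap shrinks toward nothing:
# exponential palette, roughly logarithmic visible payoff.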

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

Willamette_sucks

Distinguished
Feb 24, 2002
1,940
0
19,780
I don't think that 32-bit color makes a bigger difference over 16-bit color than 24/32-bit FP will/does over 32-bit. Period. It's necessary.

32-bit color over 16-bit color DOES NOT make a large performance impact on current cards! So ur hypothetical question would never be 2 real-world choices. And I would always choose 32-bit, as 16-bit color does look horrid. HL2 in 16-bit color? Turn that beautiful pixel-shaded water into a blocky piece of crap.

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
You're absolutely correct, Willamette, about that being a non-real-world hypothetical situation on current cards. It was a real-world situation on that crappy Matrox card. I've got a feeling that 64-bit color gaming is on the horizon pretty soon. When the first 64-bit-color-capable 3D accelerators hit the market, they probably won't have acceptable framerates in games that provide options for 64-bit color. So we're back to the hypothetical situation, except this time around it's 64-bit vs. 32-bit. You'll be able to run all your games with awesome framerates in 32-bit color on high detail, but when you switch to 64-bit mode, your framerate takes a big dive. Would you still go to a lower texture res @ 64-bit rather than using hi texture & hi screen res @ 32-bit color + better framerates? I respect your opinions & all; if you disagree with me on this last point, I can live with that. I'm just curious, that's all. However, I can almost assure you that you will see a much greater difference in 16-bit vs. 32-bit than you will ever see in 32-bit vs. 64-bit.

One last note: I've noticed that 16-bit usually looks more garbled on newer cards than on older ones. In fact, my Voodoo2 didn't have an "airy" or "noisy" appearance in 16-bit like my 8500LE does. I even thought my 8500LE was defective because of the garbled image in 16-bit compared to older cards that I owned. This could be happening on some of the other not-so-old cards too. If you've seen this "airy" appearance that looks like television static on newer cards, I'd definitely understand why you'd probably think I'm downright crazy. I never thought about this until now, when you mentioned the word "blocky". But 16-bit doesn't look so "noisy" on other cards. Maybe Grape has an explanation for this. What I am saying is, you might be experiencing the same effect on your newer graphics card as well.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
Well, as it wasn't directed at me, I didn't answer it. However, since you wanted a follow-up, I will say this, as someone who has been downgrading 36/48-bit images all week: the difference is there and it's perceptible, but only in the sense that the grass doesn't look quite right, the flowers and people look a little duller, and even some colours look a SLIGHT bit different. Someone who didn't see the originals wouldn't notice at all and would say they look perfect. It's usually the transitions that suffer.

Now when it comes to games, it all depends on what you are playing. Like I said about creeper gamers versus jump-n-frag FPS: the creeper doesn't need as high a framerate, but would benefit from better IQ at the cost of fps, whereas a shooter that involves a lot of movement doesn't need the same level of precision to look good because most of the images are moving, but by the same token, losing fps will be very noticeable because of the way the brain perceives motion.

Once again it all comes down to personal preferences and what YOU can perceive and what you find acceptable when you are doing a given task.

BTW, just FYI, there is a VERY rare condition (in women only, and like 1 per ~100 million) where they have an extra cone, which gives them even sharper colour differentiation, and also sharper vision (usually associated with rods, oddly enough). The case I read about was a woman working for a paint maker. So I'm sure her perception would be different.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
What about in standard apps under Windows 98? I know it's an old OS, but it's commonly used and commonly supported. 3 of my systems use Windows 98 SE because we wished to have full compatibility with older DOS programs, like games. Do you ever get a hunger for nostalgia that you can't seem to fill? It happens all the time on my laptop that uses Windows XP. The only way I am able to play classic DOS titles is if they have a Windows 32-bit port for 3D acceleration, which sadly leaves some of them, like Wolfenstein, Rise of the Triad, & Dark Forces, out of the camp.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!