performance questions

lamer_gamer

Distinguished
Apr 28, 2001
312
0
18,780
I'm new to the graphics card world and have a couple of questions. I have an Elsa Gladiac GeForce2 MX card with 32 megs of SDRAM.

First off, what is the screen rez stuff everyone talks about? I run at 800x600, but I see people here running way higher resolutions, like in the 1000x1000 type range. Why? For me, it makes my desktop icons and everything else soooo small. How does it affect frame rates in games? Do games really look better at higher rez? And does "higher resolution" mean higher numbers (i.e. 1000x1000) or lower numbers (i.e. 800x600)?

Also, what about color depth? My old card didn't support 32-bit. This one apparently does, but I haven't noticed any real difference when I run it in 32-bit. How does running at 32-bit affect frame rates?

And last but not least: I paid less than $100 for this card, but it seems that the next step up is gonna cost well over $200. Isn't there something in between, say in the $150 range, from a reputable card manufacturer?

Sorry for what I'm sure are *really* lame-ass questions, but other than just plain asking them, I don't know where to start learning. Thanks in advance!

My brain has performed an illegal operation and will be shut down
 

noko

Distinguished
Jan 22, 2001
2,414
1
19,785
The MX's (nVidia's) 16-bit is pretty good compared to the Radeon's 16-bit, which can be pretty horrid at times. In 32-bit you shouldn't see any banding of colors, just smooth transitions. It's really a matter of preference: if you can enjoy a game as much in 16-bit or 8-bit color as I do in 32-bit, then more power to you. Resolution is also a preference. I like higher resolutions because they make the game more realistic, if there is such a thing: fewer jaggies, a sharper image. Other people couldn't care less and worry more about the gameplay than the candy on the walls. I guess I am a candy person.
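
(If you want to see where the banding comes from, here's a quick Python sketch. The RGB565/RGB888 bit layouts are the standard ones for 16-bit and 32-bit color; everything else is just illustration, not measurements from any card.)

    # Why 16-bit color bands and 32-bit doesn't: 16-bit (RGB565) gives each
    # channel 5 or 6 bits, 32-bit (RGB888 plus 8 alpha/unused bits) gives 8.
    def quantize(value, bits):
        # Reduce an 8-bit channel value (0-255) to 'bits' of precision and back.
        levels = (1 << bits) - 1
        return round(value / 255 * levels) * 255 // levels

    gradient = range(256)  # a smooth black-to-red ramp
    print(len({quantize(v, 5) for v in gradient}))  # 32 distinct reds -> visible bands
    print(len({quantize(v, 8) for v in gradient}))  # 256 distinct reds -> smooth
    print(f"{2**16:,} total colors at 16-bit vs {2**24:,} at 32-bit")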
 
G

Guest

Guest
I have a Radeon SE 64MB, and if you call what I get (a constant 99+ fps in Counter-Strike, in 16-BIT COLOR) horrid, then you are a moron.
 
G

Guest

Guest
He wasn't talking about speed, he was talking about quality. Some people like quality over speed. Why don't you run Counter-Strike in 32-bit mode? You'd see no noticeable loss in speed, and it looks quite a bit better, I must say.
 

Crashman

Polypheme
Former Staff
For only $70 you could have gotten the Radeon LE from a pricewatch.com vendor such as NewEgg. That card is by far better than the MX, in both speed and video quality.
16-bit is OK for many games simply because they don't make good use of color. But if you look at a photo in 16-bit, it looks pretty bad compared to 32-bit. And newer games are making better use of the additional color.
The GeForce2 GTS is a GREAT card for under $150 and almost twice as fast as the MX in many games. It is also much faster than the Radeon LE in 16-bit, but only modestly faster than the LE in 32-bit.
The resolution you feel comfortable with really depends on your monitor's quality and size, and your eyesight. I use 800x600 on 14- and 15-inch monitors, 1024x768 on 19-inch monitors, and 1280x1024 or more on a 21-inch monitor. My personal computer has a 19-inch monitor; my desktop is at 1024x768, but I play games at 1600x1200 in 32-bit color to reduce "jaggies" and improve visual quality.
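
(To put numbers on the resolution trade-off, here's the raw pixel arithmetic in Python; nothing card-specific, just multiplication.)

    # Pixels the card must draw per frame, relative to 800x600.
    base = 800 * 600
    for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
        print(f"{w}x{h}: {w * h:>9,} pixels ({w * h / base:.1f}x the work of 800x600)")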

Cast not thine pearls before the swine
 

lamer_gamer

Distinguished
Apr 28, 2001
312
0
18,780
This is a perfect example of what I'm asking. Isn't there a point where higher fps really becomes meaningless? I read somewhere here that TV runs at 24fps (fact?), and you don't see jerkiness or stalls. Also, a friend of mine who is into computers told me that the further you get above 30fps (as a minimum, not an average), the less noticeable it is. In other words, if a game runs smooth at 30fps, is it smoother at 40, 50, 100fps? Isn't there a point where it just becomes bragging rights, not a real improvement in gameplay? I see the point of 32-bit color from noko's post (btw noko & kelder, you're two posters I respect). It just seems sometimes that there's an obsession with getting the latest and greatest of everything (and shelling out big bucks for it) just to get a 10-point improvement in benchmark scores.

My brain has performed an illegal operation and will be shut down
 

lamer_gamer

Distinguished
Apr 28, 2001
312
0
18,780
Ah, Crashman, dude, you're another poster who has earned my respect (not that it means a whole lot to you, I'm sure!). You say your desktop is at 1024x768 but you play games at higher rezzes. I've avoided switching resolutions because I thought you have to restart the computer any time you do that. Is that true, or is Windows lying to me? One last thing: I'll search again for the GeForce GTS, since you're saying it can be found for <$150, but I've always seen it at $200+.

My brain has performed an illegal operation and will be shut down
 
G

Guest

Guest
If a game doesn't drop below 30fps I'm happy, but I'd rather be seeing 40-50fps because you can tell the difference (at least I can). For me, 60fps is the point of no return; some say they can tell the difference between 60fps and 80fps, but I can't. The way I usually set things up is to turn the polys and all the eye candy to max, then check the maximum resolution I can run at for around 40-50fps. I then check to make sure the game was made properly and looks nice at high resolution; some use bad textures or don't let you choose the resolution of the textures (a big no-no to me). After that, I check how it performs with 4x FSAA on (2x just doesn't reduce the jaggies for me) at 800x600 and 1024x768, and compare the performance and looks against the higher resolution. I usually weigh towards quality, whether that be high resolution or lower resolution with FSAA. And one thing: always, always 32-bit. I dropped 16-bit when I had my TNT2. Of course some games may look OK at 16-bit, but as Crashman said, most games today make way better use of the higher bit depth.
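
(A quick way to see why the gains shrink as fps climbs: convert fps to frame time. Pure arithmetic in Python, no measurements from any game.)

    # fps is the reciprocal of frame time; the higher you go, the less each
    # extra frame per second actually shaves off.
    for fps in (24, 30, 40, 60, 80, 100):
        print(f"{fps:>3} fps = {1000 / fps:5.1f} ms per frame")
    # Going 30 -> 60 fps saves 16.7 ms per frame; 60 -> 80 saves only 4.2 ms,
    # which is part of why differences above 60 are so hard to see.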

If you're concerned about price (who isn't? If you're not, hand some over to me, OK?), then you should really do some research before you buy. With any kind of card, you want the best (lowest ns) RAM you can get for the price. The card I ended up getting was the Leadtek MX SH PRO 32MB. It retails for about $90 (which was my limit) but comes with 5ns RAM, which actually runs at 245MHz without RAM heatsinks. The core goes to 215MHz, and I ended up getting performance that nearly touched a standard GTS. I know there are GeForce2 Pro cards out there with 5ns DDR RAM, and Ultra cards with 4ns RAM; those will be the ones to try to find for a good price.
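
(For the RAM ratings above, the rule of thumb is that the ns figure is one clock cycle, so rated MHz = 1000 / ns. A small Python sketch; running above the rated figure, like the 5ns chips doing 245MHz, is overclocking.)

    # Convert a RAM chip's ns rating to its rated clock speed.
    for ns in (6, 5, 4):
        mhz = 1000 / ns
        print(f"{ns}ns RAM: rated {mhz:.0f} MHz ({mhz * 2:.0f} MHz effective as DDR)")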
 
G

Guest

Guest
"One more thing, what performance gain does one get from 64 megs vs. 32?"

More memory means fewer times the graphics card has to fall back on main memory, which can be slow, especially if you don't have much to spare. You usually only see an improvement at higher resolutions.

Here's a good page showing how it only helps at 1600x1200:
http://www.tomshardware.com/graphic/00q3/000816/suma-05.html#test_results
Note: the SE is clocked higher; the 64MB and 32MB versions are the same clock.

Edited by m_kelder on 06/27/01 00:19 AM.
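
(Some rough framebuffer math for why the extra RAM only pays off up high. This assumes double buffering plus a z-buffer at the same bit depth; that's a simplification, but good enough for a ballpark.)

    # What's left for textures on a 32MB card after the frame/depth buffers.
    def buffers_mb(w, h, bytes_per_pixel=4, buffer_count=3):  # front + back + z
        return w * h * bytes_per_pixel * buffer_count / (1024 * 1024)

    for w, h in [(800, 600), (1024, 768), (1600, 1200)]:
        used = buffers_mb(w, h)
        print(f"{w}x{h} @ 32-bit: {used:4.1f}MB of buffers, "
              f"{32 - used:4.1f}MB left for textures")
    # At 1600x1200 barely 10MB remains, so textures spill into main memory
    # over the slower AGP bus; exactly where a 64MB card pulls ahead.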
 

lamer_gamer

Distinguished
Apr 28, 2001
312
0
18,780
Thanks for such a reasoned and well-put response. It was just what I needed (I seem to be on a roll tonight!).

My brain has performed an illegal operation and will be shut down
 

Crashman

Polypheme
Former Staff
Are you Canadian? Because I'm quoting U.S. currency. Check Pricewatch.com; lots of places carry the 32MB GTS for well under $150 U.S. Most GTS cards come with the specified 6ns memory, which is clocked at 333MHz DDR but can accept 380-400.
I use a Leadtek 32MB card and have not had any problems; it is far more powerful than any of my games require.
Or you could go cheap and get the Radeon LE for around $70 U.S. My wife has one. It scores much lower than the GTS in 16-bit color, but almost as well in 32-bit.
No, you do not have to reboot. Under Display Properties>Settings>Advanced>General you get several options: restart before applying the new settings, apply without restarting, or ask me. Some cards don't support this feature, but the ones that don't are all older cards that don't support 3D gaming anyway. All nVidia cards and Rage/Radeon cards support switching on the fly. And you don't need to change your desktop resolution to make it higher in games; you simply set the resolution in the game's options.

Cast not thine pearls before the swine
 
G

Guest

Guest
Hey there. I think resolution is mainly dependent on what size monitor you have; someone else basically covered this already. If you think the icons look too small, you can always increase their size in the display properties.

As for 32-bit vs. 16-bit: you won't notice much when you're just putzing around on the desktop, but in games (Deus Ex comes to mind) you will definitely notice if you switch between the two. With the GF2 MX, running in 32-bit mode will cause a slight performance hit at the higher resolutions.

Which brings us to your question of how resolution affects frame rates ;) Resolution will *definitely* affect your frame rates: the higher the resolution, the lower your average frame rate will be.
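
(A crude illustration in Python: frame rate can't exceed the card's fill rate divided by pixels drawn per frame. The 350 Mpixel/s figure is the GF2 MX's rated fill rate as I recall it, and the overdraw factor of 2 is a guess; treat both as assumptions.)

    FILL_RATE = 350e6  # pixels/second (assumed GF2 MX rating)
    OVERDRAW = 2       # each screen pixel drawn ~twice per frame (assumption)
    for w, h in [(800, 600), (1024, 768), (1600, 1200)]:
        print(f"{w}x{h}: fill-rate ceiling ~{FILL_RATE / (w * h * OVERDRAW):.0f} fps")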

And about your TV-related question: in the United States, TV runs at 29.97 fps (NTSC), while in Europe it's 25 fps (PAL); film is what runs at 24 fps (can anyone else back me up on this?). The reason you want higher frame rates in a game is that the number we see as the frame rate is actually an average, so the higher the average, the less chance it dips below the minimum of 30 (at which point you will notice it isn't smooth). I usually think anything beyond 60 or 70 is just for bragging, but obviously people will argue about that ;)
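
(Here's the "average hides the dips" point as a Python sketch; the frame times are made up, but both runs have the exact same average.)

    def avg_and_worst_fps(frame_times_ms):
        avg = len(frame_times_ms) * 1000 / sum(frame_times_ms)
        worst = 1000 / max(frame_times_ms)
        return avg, worst

    smooth = [20] * 50              # a steady 50 fps
    spiky = [12] * 45 + [92] * 5    # mostly fast, with occasional stalls
    for name, run in (("smooth", smooth), ("spiky", spiky)):
        avg, worst = avg_and_worst_fps(run)
        print(f"{name}: average {avg:.0f} fps, worst moment {worst:.0f} fps")
    # Both average 50 fps, but the spiky run dips to ~11 fps; a higher
    # average is really about keeping the worst moments above 30.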

BTW: with the GeForce cards, you should never have to restart for a new resolution to take effect. I can't think of any reason why your computer would be asking you to, though... off the top of my head... hmmm...

And to answer your question on 64MB vs 32MB: I believe the main difference is the amount of textures you can keep on the card. Some games use a whole lot of textures; I believe there is one point in MDK2 where over 90MB of textures are visible. Having more memory is always better ;) But it usually won't have that much of an effect on game performance, since most games are designed around the lowest-end graphics cards.

Anyhoo, I hope this answers most of your questions. And my apologies to everyone in advance if I've made any incorrect assumptions. Later all!

Sample...