Are screensavers needed with today's monitors?

BLiTZ

Basically, as the subject says, I'm curious to find out whether one really needs to run a screensaver anymore, or is screen burn-in a thing of the past? If it is still a problem, how long can you keep the same image on the screen without damaging the monitor?

I personally don't think this problem exists anymore, as there are a fair number of games (strategy especially) that keep a static menu on part of the screen at all times, and often I've played for hours on end with no screensaver and without switching to another program.

This is just for curiosity's sake, really... :)
 

GoSharks

To my knowledge, color monitors today use the same (P22) medium-short-persistence phosphor that was used many years ago, so monitors today are just as susceptible to phosphor burn as they ever were. With every technology there is a trade-off. One of the benefits of aperture-grille technology is that more of the electron beam current hits the phosphors; as a result, aperture-grille monitors may be more susceptible to screen burn than shadow-mask monitors.

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com

 

BLiTZ

Thanks for your reply Jim!

I'm sure it varies, but is there a typical minimum time before damage occurs, say 10 hours with the same static image, no screensaver, and no switching to another program? Or is it a matter of a few hours? Maybe days?

I suppose there would be a way to check a monitor for this: display a blank white screen and look for... what, ghost images that won't go away, I suppose...?
 

max_clif

GoSharks, I was wondering: if a monitor is rated as 21" but you can only view around 19.8" of that, why do manufacturers persist in covering up part of the tube with black paint at the edges?
 

GoSharks

Max_clif

Because the focus and convergence fall off rapidly as you get to the edge of the screen. You would notice a large difference and complain that the monitor was out of focus in the corners and around the edges. You can only deflect a beam a certain amount before it spreads out and becomes fuzzy. Take a flashlight and aim it at a wall; notice the nice round dot. Now aim it at the corner of the wall and notice how the beam shape changes (spreads out). This is what happens inside the monitor as you project the beam up into the corners and around the edges.

Mattburklund

The Q910 is a great little monitor, and I mean that literally. The Q910 is the physically smallest 19" monitor on the market today. If you have space restrictions and cannot afford a flat-panel LCD monitor, the Q910 is a perfect fit. It supports 1600 x 1200 at 85 Hz and uses a flat shadow-mask 0.2 mm CRT.

If I were doing text-based applications, this would be a unit I would consider; see this link.

http://shop.monitorsdirect.com/product.asp?sku=1928023

If I were a game player, I would go for the Cornerstone p2450, which uses the Mitsubishi Natural Flat aperture-grille CRT. The p2450 is $225 with free shipping this month; see this link.

http://shop.monitorsdirect.com/product.asp?sku=1858365

Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com


 

max_clif

Thanks for the reply.

And just wondering: under Windows 98 power management, what do you suggest as a setting for turning off the monitor, in minutes?

I heard that if you turn it off too often you can destroy the electronic components inside, but if you never turn the monitor off, you wear out the tube.

As an expert in monitors, what do you suggest as a good setting?
 

GoSharks

The argument of turning off vs. leaving on has been debated many times, and there are two camps: one says to turn the monitor off when not in use to save its life span; the other says to leave it on to reduce the strain of the components heating up and cooling down. I do not know of any magic number of hours used to determine which is better.

The popular argument is that power-on is a stressful time for any electronic device, and reducing the number of power cycles will prolong its life. The thermal cycling that results from turning a monitor on and off can contribute to various kinds of failures and shorten its life; however, this is almost never the case in normal use.

Leaving the unit on for long periods of time has some downsides as well. For example, the heat generated inside a monitor tends to dry out parts like electrolytic capacitors, thus shortening their life.

Modern monitors use a lot less energy than older models. LCDs consume about half the energy of a comparable-size CRT monitor, and even in power-save mode they still consume 3-5 watts. Every little bit helps, especially here in California; it doesn't seem like much, but add up all the monitors in use, multiply by 3 watts, and the number is quite substantial.
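As a back-of-the-envelope illustration of that last point (the monitor count below is a made-up number, purely for the sake of the arithmetic):

```python
# Rough arithmetic behind the "every little bit helps" point above.
# The monitor count is a hypothetical assumption, not a real statistic.
monitors_in_use = 1_000_000   # assumed number of monitors sitting idle
standby_watts = 3             # low end of the 3-5 W standby draw quoted above

total_kw = monitors_in_use * standby_watts / 1000
print(f"{total_kw:,.0f} kW drawn just to sit in power-save mode")  # 3,000 kW
```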

I lean toward the "turn the unit off" (or use the power-save modes after 30 minutes) camp, especially when we are talking about monitors.

Monitor longevity top-ten list

1. Run at the manufacturer's recommended resolution and refresh rate.
2. Use indirect lighting, and turn the brightness and contrast down (this will also improve focus on a CRT).
3. Turn the unit off when not in use for long periods; use power-save modes, or a screensaver, for shorter periods (over 15 minutes).
4. Keep the unit in a well-ventilated area (fans are not necessary if you follow steps 1-3).
5. Keep your fingers off the face of the tube.
6. Keep the monitor a sufficient distance from anything that creates a magnetic field (CRTs especially).
7. Degauss once a day (not necessary if you turn the unit off; CRT only).
8. Do not place anything on top of the monitor, especially containers of water, and do not block the ventilation slots on the top, bottom, and sides.
9. Use a surge protector (and unplug the unit during thunderstorms).
10. Avoid direct sunlight.

Turning the unit off when not in use for long periods of time (> 30 minutes) will extend the life of the monitor (and your investment) and is good for the environment as well.

Jim Witkowski
Chief Hardware Engineer
Cornerstone / Monitorsdirect.com



 

max_clif

I turn mine off every 25 minutes using power management.

OK, thanks. One last question: is there ANY chance at all that an inexperienced end user like me can fix a misconvergence of the electron guns (on a Trinitron), or fuzziness in the corners, without special equipment? I'm pretty good with electronics but have never tried opening up a monitor before. Yes, I do know about the shock hazards :p

I'm very curious about this, and I had to ask you since you are an engineer.
 

GoSharks

Odds are you will only make it worse if you do not have the training and the right equipment.

 

orbz

Thanks for all the tips.

Do the RGB color temperature settings affect the life of the monitor?

The only difference between 200 fps and 300 fps is the blink of the eyes.
 

compuhan

Really?! Isn't increasing the color temp really just increasing the brightness of the color(s), and doesn't increasing brightness decrease phosphor life? I've noticed that increasing R, G, and B together at the same rate definitely produces brighter images. Could you expound on the not-so-simple answer to increasing color temperature? Thanks.

Quality is better than name brand, even regarding beloved AMD.
 

GoSharks

Brightness, or luminance output, is controlled by the contrast control. Black level is controlled by the brightness control. Yes, I know this is backwards, but it comes from the first TV sets.

Color temperature is simply how the red green and blue colors are mixed.

Most monitors have preset settings for three color temperatures; check your user guide.

9300 deg. K is the standard "computer monitor" white, and gives the highest absolute luminance but is distinctly bluish in color. Every common desktop monitor that I know of defaults to 9300 deg. K color temperature.

6500 is generally considered a "whiter" white, and is often referred to as "daylight" white. It's the standard white point used in the TV industry, and so is the best choice if you're displaying video on your computer monitor. You gamers may want to check this out; flesh tones look better, etc.

Another common "standard" white is 5500 or 5000 deg. K, a bit redder than 6500 and often called "paper" white. As you might guess, it's most often used in document review or photographic and pre-press applications.

Jim Witkowski
Chief Hardware Engineer
Cornerstone



 

compuhan

I already know that 9300K, 6500K, and 5000K are some of the most common temperature standards. There are two things going on when you talk about overall temperature. First is the relative ratio of red, green, and blue (this causes tints and is what you mean by "mixing"); mixing creates tints as you emphasize or de-emphasize R, G, or B. Second, the higher the overall temperature value, the higher the absolute level of each color. My question deals with this second notion of temperature.

I didn't ask about the brightness and contrast defs also, as I already read that somewhere.

You still haven't answered my question. Keeping the rate of each color's increase constant (take the ratios out of it), what is the effect of increasing temperature? Obviously the color becomes "brighter," and if raised collectively this results in brighter colors of the same tint (since the ratio hasn't changed), but what causes this? I think the overall strength of the emitted light (sorry, I don't know the technical term, but I think it's brightness or luminance) per color is increased as you increase the temperature. Is this wrong? Please explain technically what happens, for just one color such as red, when you increase the temperature. As I said, I suspect you are increasing the intensity of that component color, which is tantamount to increasing brightness. Now, if you mess with the ratios, you'll get different tints. But if you raise all of them at the same time, everything becomes brighter at the same rate and a brighter image is achieved, similar to just raising the overall "brightness" (not the technical definition, just higher light output).

Please ask me questions if my question isn't clear.

Thank ya.

Quality is better than name brand, even regarding beloved AMD.
 

compuhan

Jim, I know you are a monitor engineer, but maybe you can help me with color correction of monitors through the video board or operating system (it's related). Also, this question is not pertinent to the thread, so please forgive the forum faux pas, but I've tried the other forums (video editing and graphics card) to no avail.

Does the GeForce4 Ti4600 RAMDAC support transfer-table editing to correct the monitor's compression of mid-tone colors and consequently grays (grays are just a mix of colors, after all)? Alternatively, can you correct this compression for the monitor in the operating system (XP or Win98)? I already know that you can do it for certain programs (i.e. Photoshop), but I'm looking for a global correction (the whole monitor, not just per image). I'm not a graphics pro or anything; I am simply interested in getting the highest fidelity of colors and grayscale possible, mainly out of simple curiosity. I've searched the web and contacted the manufacturers of the relevant components, but to no avail.

I'm not talking about gamma either, as it is only an approximation.

Thanks.

Quality is better than name brand, even regarding beloved AMD.
 

GoSharks

Here is a writeup I did on color temp, this may help explain.

The Correlated Color Temperature of a light source is usually referred to in the specs as CCT, and in common speech as “color temperature.” This rating indicates the color of a tube or bulb when it’s lit. It also indicates the general atmosphere (warm or cool) created by a particular bulb or tube.

Color temperature is measured on the Kelvin scale. Zero degrees Kelvin is equal to minus 273 degrees Centigrade. Without getting into the more boring details, the CCT basically refers to the temperature, in degrees Kelvin, to which an iron bar would have to be heated in order to duplicate the color of a particular bulb or computer monitor tube.

Strangely, a higher color temp indicates a "cooler" color. This is because iron heats first to red-hot (around 1500 to 2000 K), then to white-hot (3000 to 5000 K), and finally to bluish-hot (6500 K and above).

Incandescent bulbs have color temps between 2000K and 3000K. The most common incandescent color temp is 2700K. Halogens have a CCT of 3000K. That’s why Halogens look a little “whiter” than standard incandescent. Fluorescent bulbs can be manufactured with almost any CCT - depending on which phosphors are used, and their relative proportions.

The color of the light used to view an image is probably the most important factor affecting its appearance. If you have ever compared a paint chip in a hardware store to the actual paint in the can versus the same paint on the wall, you know they all look different. One of the biggest factors behind these variations is the different light sources: typical store light is a cool fluorescent, home light is a warm incandescent, and daylight streaming through an open window changes according to the time of day.

Color temperature is a way to define the illumination of a light source. The chromaticity of a computer monitor is usually described in terms of its color temperature or, more accurately, by its "X" and "Y" coordinates on the CIE Chromaticity Diagram. We can measure a computer monitor's "X" and "Y" color coordinates very accurately with a color meter.
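If you want to turn measured "X" and "Y" coordinates into a temperature figure yourself, here is a small sketch using McCamy's published approximation for correlated color temperature (the sample coordinates are typical published white-point values, not measurements from any particular monitor):

```python
# Estimate correlated color temperature (CCT) from CIE 1931 (x, y)
# chromaticity coordinates using McCamy's 1992 approximation.

def cct_from_xy(x: float, y: float) -> float:
    """Good to within a few tens of kelvin near the Planckian locus,
    which is plenty for checking a monitor's white point."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(cct_from_xy(0.3127, 0.3290)))  # D65 "daylight" white -> ~6500
print(round(cct_from_xy(0.2831, 0.2970)))  # 9300 K-class white   -> ~9200
```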

Color adjustments give the user the flexibility to customize color settings for different types of applications. Text-based applications that use light backgrounds with black text benefit from cooler, bluish-white "paper-like" colors. Color/graphics applications like web-page development and pre-press benefit from warmer, reddish color adjustments. Optimizing the monitor's color settings will provide more accurate color reproduction.

So the bottom line is: color temp does not have a real effect on, or degrade, the phosphors; it is the beam current applied by the contrast and brightness controls that has the larger effect.

Jim Witkowski
Chief Hardware Engineer
Monitorsdirect.com


 

GoSharks

When I was designing video cards back in the early 90s I was up on all the current RAMDAC technology, but I have no clue what Nvidia is doing on their cards for color correction or gamma correction. Sorry.

Jim Witkowski
Chief Hardware Engineer
Monitorsdirect.com

 

compuhan

Thanks for the details on what color temp means, how it is used, and how it applies to light bulbs. But you still haven't answered my question.

Light bulbs are more akin to just the cathode of a monitor. They are designed with certain materials that give off certain wavelengths of light at a certain current. Generally speaking (depending on the material), increasing the thermal temperature of a material will increase its energy and shift the wavelengths toward the ultraviolet (shorter wavelengths). Now, if you use a dimmer switch, you are basically controlling the current that runs through the bulb. By turning it down, you effectively lower the current to the filament (less heat), which not only lowers the brightness but also causes its spectrum to emit more red relative to the other visible colors. This simple analogy makes me think the same thing happens in the cathode of the monitor.

Now, in computer monitors, color isn't strictly defined by the wavelength emitted by the cathode. The phosphor triads' emission, which is driven by the cathode for each component color, determines the color on the screen. What I'm asking is: what physically happens when you increase or decrease the temperature of, say, red? Does the red cathode increase in brightness (as if a dimmer switch were turned up), or is it something else? Also, what is the relationship between the phosphors and the cathode beam? Can each phosphor emit slightly different hues (wavelengths) depending on the energy level of the electron beam, or does the beam only increase or decrease the intensity of the same hue? Also, is the phosphor sensitive to different wavelengths from the cathode; that is, does it change its emission (intensity and/or hue) based on the excitation (intensity and/or wavelength)?


Thanks


Quality is better than name brand, even regarding beloved AMD.
 

GoSharks

9300 is the whitest white point a monitor can produce. At this temp, with the contrast set at max, the monitor luminance output is max. When you change color temp and want a more reddish color, the monitor simply decreases the beam current that hits the blue and green phosphors; this gives the monitor a more reddish tone. It's not an additive process, it is a subtractive process.

I hope this answers your question.



 

compuhan

Sorry, it doesn't answer my question. You are not reading carefully.

Color temperature is controlled by three hue controls (some monitors just have a single temperature control, while others have individual hue controls that together make up what is known as temperature). Adjusting the ratio will give different tints, but what if you kept the ratio constant and just increased or decreased all the colors at the same rate? What is happening when the ratio is controlled for? I've noticed that the tint looks the same when the ratios are constant, but the whole image looks brighter or darker depending on whether you increased or decreased the hues at that constant ratio.

I was hoping you could answer my question since you are an expert, though your evasions suggest otherwise.

This statement is ambiguous.

"At this temp, with the contrast set at max, the monitor luminance output is max"

Are you talking about the contrast, the temperature, or both as contributing to max luminance? If it is either the temp or both, then you are saying that color temperature does indeed at least contribute to an increase in luminance, and therefore will technically increase screen wear, much as just turning up the contrast (white) will. What if the color temp is low (for each of the hues) and the contrast is set high? Will the monitor's luminance output be less or more than in the scenario you described? If less, then, assuming by luminance you mean the amount of light emitted by the phosphors, you have to admit that increasing the temperature (to what degree, I don't know) will negatively affect phosphor life relative to a decreased temperature (assuming all else is equal).

The statement below is not strictly true, if I understand it correctly.

"When you change color temp and want a more reddish color, the monitor simply decreases the beam current that hits the blue and green phosphors"

5000K is reddish, but the red hue in absolute value is lower than at 9300K, even though in proportion to blue and green it is higher than at 9300K! I have a feeling you are not so familiar with the component nature of color temperature, or else you are deliberately misleading.

Now, it could be that each component color control doesn't affect one color but instead controls a ratio itself, namely subtracting the other hues to relatively raise its own. So increasing the red hue is in fact lowering green and blue. Let's say it's by -1 each per hue. If this is true, then increasing red, green, and blue by the same amount will actually result in R of -2, G of -2, and B of -2. The ratio will be the same after increasing every component hue by the same amount, but since it is subtractive for the "adversary" hues, the overall luminance has gone down! Isn't white light based on combinations of R, G, and B, and doesn't increasing the luminance of the hues at the same rate result in a brighter white (of the same ratio)? By your explanation above, the highest temperature would result in the lowest luminance.

Maybe you could just answer this one repeated question.

When adjusting a hue (of which three make up "temperature"), what physical parts of the monitor are involved (please talk about cathodes, phosphors, or whatever is relevant), and how do they change?

Again, I value your expert knowledge for my selfish curiosity.

Quality is better than name brand, even regarding beloved AMD.
 

GoSharks

You obviously do not understand the concept that I have tried to explain. If you think I'm trying to avoid the question, wrong! You just don't get it. Go back and read my previous posts; it's all there.



 

compuhan

I'm asking for a really technical answer (physics, mechanics, etc.) in all its glory, as I really want to know what actually happens.

What physically happens when you change the red hue up or down? Is that too much to expect from a monitor expert to answer? You still haven't answered this simple question!

Please answer or tell me you don't want to.

I've read your posts. They only circumscribe the concept of temperature without really going into the details.

Also, you don't seem to distinguish between hues and color temp, which makes it hard for me to understand what you are talking about. Please answer in terms of just one hue control, for clarity's sake.

Please explain or not.

Thanks.

Quality is better than name brand, even regarding beloved AMD.
 

GoSharks

First you must understand how color is made on a CRT. There are three electron beams, and each beam varies in beam current according to the input signal voltage. Varying the beam current that hits the phosphors makes the colors change. This variation changes the color hues and thus the color temperature. For example, all three beams on full = white, or 9300 degrees color temp. Turn off the blue and green guns and you get pure red, thus a color temp shift to around 3000 degrees. It's that simple.

This is why I said in the beginning that changing the color temp would not harm the unit; in fact, lowering it from 9300 will extend the life of the phosphors and cathode.
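To put some toy numbers on that subtractive mixing (the gain values below are made-up assumptions, not real calibration data for any monitor):

```python
# Toy illustration of the subtractive white-point adjustment described above:
# starting from full drive on all three guns (the monitor's native ~9300 K
# white), a warmer white is produced by reducing only the green and blue
# beam currents. The gain values are hypothetical.

def apply_gains(rgb, gains):
    """Scale each gun's drive level (0-255) by a per-gun gain (0.0-1.0)."""
    return tuple(round(level * gain) for level, gain in zip(rgb, gains))

full_white = (255, 255, 255)                                 # all guns at full current
native_white = apply_gains(full_white, (1.00, 1.00, 1.00))   # native ~9300 K class
warmer_white = apply_gains(full_white, (1.00, 0.94, 0.82))   # hypothetical warmer setting
warmest_white = apply_gains(full_white, (1.00, 0.87, 0.67))  # hypothetical warmest setting

print(native_white, warmer_white, warmest_white)
# Red is never pushed above full drive; the warmer settings only subtract
# green and blue, so total light output (and phosphor wear) goes down.
```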

Jim Witkowski
Chief Hardware Engineer
Monitorsdirect.com

 

compuhan

You still don't seem to understand my question, but I can pick some facts out of what you said. I already understand how color depends on three phosphor elements and three electron guns.

What you fail to understand is that some monitors can go above 9300K, and that some monitors include individual hue adjustments rather than preset color temps. From 9300K I can leave blue and green the same and just lower the red.

By default, most monitors are at 9300K, which is the highest value, so you can only turn the temperature down. So you are saying that although color temp does affect phosphor life, it is practically a moot point since monitors are already set at the highest point. That is different from saying "color temp does not affect phosphor life at all." I can increase my monitor's life at a lower temp, as you say. You had previously said that color temp has no effect on phosphor and cathode life.

Even though you haven't fully answered my question, you have finally said that color temps do have an effect on phosphor life.

Thanks.

Quality is better than name brand, even regarding beloved AMD.