Adjusting Samsung 213T Monitor

ted

Archived from groups: comp.sys.ibm.pc.hardware.video

Does anyone know how to change the contrast on a Samsung 213T monitor?
When using the digital connection most of the adjustments are disabled
on the front of the monitor. I have an ATI Radeon 9700 Pro card but I
can't find any way to set the contrast on that either, just color &
brightness.

What is the best way to calibrate a monitor so that digital images will
come out the way they are supposed to? Any calibration I have seen
starts out telling me to adjust the contrast first, so I don't get far.
Is there a calibration program that can adjust all of this stuff within
the program itself, without having to use the monitor buttons (which are
disabled)?

Thanks!
 

"Ted" <txguy972@aol.com> wrote in message news:419A376C.1857A7A6@aol.com...
> Does anyone know how to change the contrast on a Samsung 213T monitor?
> When using the digital connection most of the adjustments are disabled
> on the front of the monitor. I have an ATI Radeon 9700 Pro card but I
> can't find any way to set the contrast on that either, just color &
> brightness.

"Contrast" as used on monitors (esp. CRT monitors) is actually
a video gain control - it basically sets the difference between
"white" and "black" (i.e., the maximum and minimum input levels
expected during active video on each video channel). With
a digital connection, these are already fixed (with an 8 bit/color
interface, for instance, the minimum value is simply 0, while the
maximum is 255) - and since they ARE known, and basically
fed straight through to the panel (in the case of an LCD monitor)
or at least with some level of processing which is also known
in advance, there's no real need for a "contrast" control.

"Brightness" in a CRT monitor is also somewhat misnamed, as
it's really setting the cutoff level of the CRT relative to the video
signal. In an LCD monitor, "brightness" is generally just that -
it directly controls the brightness of the backlight and hence the
overall brightness of the display.
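
To make that concrete, here's a toy model in Python (purely
illustrative - real monitors do this in analog circuitry or firmware,
not anything like this):

    # Toy model of the two classic CRT controls, with the video level
    # normalized so 0.0 is "black" and 1.0 is "white".
    def crt_drive(level, contrast=1.0, brightness=0.0):
        # "Contrast" is a gain: it scales the span between black and
        # white.  "Brightness" is an offset: it shifts the whole range
        # up or down (i.e., it really sets the black level).
        return contrast * level + brightness

    # On an 8-bit digital interface there's nothing left to adjust:
    # black and white are fixed by the format itself.
    DIGITAL_BLACK, DIGITAL_WHITE = 0, 255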


> What is the best way to calibrate a monitor so that digital images will
> come out the way they are supposed to? Any calibration I have seen
> starts out telling me to adjust the contrast first, so I don't get far.
> Is there a calibration program that can adjust all of this stuff within
> the program itself, without having to use the monitor buttons (which are
> disabled)?

Many of the calibration procedures you'll come across are written
with the CRT in mind. With an LCD, especially with a digital interface,
the "contrast" is fixed as noted above, and the "brightness" at least
shouldn't have much effect on the colors. That just leaves a couple
of possible adjustments - the color balance or "white point," which
is controlled by setting the proportions of red, green, and blue
used to make "white" (where "white" is just whatever color you get
when all three input channels are at maximum) - and the response
or "gamma" (which may or may not be adjustable in a given LCD
monitor product, and in any case is adjusted by varying the contents
of a look-up table between the input signal and the input to the
panel - see the sketch below).

With the exception of a very few high-end products, LCD monitors
at this stage of the game aren't very "calibratable" for color. You
get to choose the white point to match whatever the original image
data expects (6500K is often a good starting point, as it's a
commonly used standard), you set the brightness to whatever looks
OK to you, and that's about all you get to do. Some LCD monitors
will provide an "sRGB" setting, which supposedly sets up the monitor
to match the requirements of the sRGB standard. I say "supposedly"
as there are varying degrees of how well the end result complies
with the actual standard, but you most likely won't get to change
it in those areas where it doesn't match.
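
As for that look-up table: a minimal sketch in Python, assuming an
8-bit input and a simple power-law response (real panel tables are
more involved, and gamma = 2.2 is just an example value):

    # Build a 256-entry table mapping 8-bit input codes to 8-bit
    # panel drive values for an assumed "gamma" of 2.2.
    gamma = 2.2
    lut = [round(255 * (code / 255) ** gamma) for code in range(256)]

    # A monitor's "gamma" adjustment, where offered, amounts to
    # rewriting a table like this; the panel never sees the original
    # input codes directly.
    print(lut[128])   # mid-gray input drives the panel well below half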

Bob M.
 

"Bob Myers" <nospamplease@address.invalid> wrote:

>"Contrast" as used on monitors (esp. CRT monitors) is actually
>a video gain control - it basically sets the difference between
>"white" and "black" (i.e., the maximum and minimum input levels
>expected during active video on each video channel). With
>a digital connection, these are already fixed (with an 8 bit/color
>interface, for instance, the minimum value is simply 0, while the
>maximum is 255) - and since they ARE known, and basically
>fed straight through to the panel (in the case of an LCD monitor)
>or at least with some level of processing which is also known
>in advance, there's no real need for a "contrast" control.
>
>"Brightness" in a CRT monitor is also somewhat misnamed, as
>it's really setting the cutoff level of the CRT relative to the video
>signal.

I'd say grossly misnamed. I figured out all of the above many years
ago, but I'd wager that 90% of the population doesn't have a clue,
which partially explains all the horribly adjusted TVs in people's
homes (I'd say "mostly explains," but the REAL culprit is that most
people just don't give a rip).

For CRTs, I can't understand why they just don't label the
"brightness" control "black level".
 

ted

Thanks for your help; the info that comes with the monitor doesn't even talk
about any of that, as far as I can see. How can I be sure that the images I
see on my computer look the same on other people's computers and, more
importantly, print out the same?


"chrisv" <chrisv@nospam.invalid> wrote in message
news:n1nmp09gjqslkefhcgo2sejadr2ga5biuv@4ax.com...
>
> For CRTs, I can't understand why they just don't label the
> "brightness" control "black level".
>

I'm tempted just to say "tradition" - the term was first used
on TV sets, from which the early PC monitors were derived -
but in large part it's because, once established, no one was willing
to rock the boat by renaming the control - and, as you noted,
the vast majority of users simply didn't know what the thing
did under ANY name, so changing to "black level" wouldn't
really have helped much. It was difficult enough to find someone
who knew enough to set the thing to just extinguish the
"blanked" regions.

Bob M.
 

"Ted" <txguy972@aol.com> wrote in message news:419BEBA4.285BF33E@aol.com...
> Thanks for your help, the info that comes with the monitor doesn't even
talk
> about any of that as far as I can see. How can I be sure that the images I
see
> on my computer look the same on other peoples computer and more
importantly
> print out the same?

That's a subject that deserves an entire book to address,
and in fact several have been written to try to tackle it
(although none that I'm aware of get the concepts across
at a truly accessible level for most users). And it's
certainly not something that can even remotely be addressed
in its entirety in a posting here. Which, of course, will not
stop me from trying (and likely failing spectacularly! :).

You can't ever be sure that the images you see are going to be the
same on other people's computers, let alone that they are
"right" (i.e., reflect the "real" colors of the original scene, as if
you were seeing the real thing, or at least were the colors that
the creator of the original intended) without having at least three
or four key pieces of information regarding that original image.
First, what "white" is supposed to be (in other words, what color
should the display produce when all the inputs - all the primaries -
are set to their maximum allowable level). Next, what the colors
of those primaries were assumed to be in creating the original
information (what "color space" was assumed by the creator, or
used in the original image sensor), and then also what response
or "gamma" was assumed in encoding the image. Finally, to get
it really right (or at least close), you also need to know the intended
"brightness" (luminance) of the image, and the ambient lighting
conditions under which it was intended to be viewed.
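
In other words, every image would ideally carry a little bundle of
facts like this (a hypothetical Python sketch - the record and its
field names are made up, purely to make the list concrete):

    from dataclasses import dataclass

    # Hypothetical record of the information listed above; none of
    # these names come from any actual standard or file format.
    @dataclass
    class ImageEncodingInfo:
        white_point: str     # what "white" should be, e.g. "6500K"
        primaries: str       # color space assumed for the R/G/B channels
        gamma: float         # response assumed when the image was encoded
        luminance: float     # intended image brightness, in cd/m^2
        ambient: str         # intended viewing conditions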

There are a couple of ways that this is done in computer imagery.
The best would be to include all of the above information along
with the image data, and then also know the relevant characteristics
about the display device to be used. That's what the "ICC profile"
approach attempts to do - add information to the image data file
regarding its encoding, and then also always have "profile" information
available describing the display device such that you can translate
as well as possible between the two. Unfortunately, this approach
is rarely used to the extent that it might be, except for some high-end
or professional applications where this information is maintained
and properly used all along the process.
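
(As a quick illustration of the first half of that - just checking
whether an image file even carries an embedded profile - a minimal
sketch using the Python Imaging Library, with a hypothetical file name:)

    from PIL import Image

    img = Image.open("photo.jpg")        # hypothetical file
    icc = img.info.get("icc_profile")    # raw profile bytes, if present
    print("embedded profile found" if icc else "no embedded profile")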

Another, simpler approach is just to assume a standard set of
conditions to be used throughout the chain, and try to get everything
to match that standard as closely as possible. This is what's done
in television, and in such things as the "sRGB" standard in the PC
world. You standardize a set of primaries, a white point, the
"gamma" curve, and a few other things, and images will look
reasonably consistent as long as those settings are maintained.
This is currently the most widely used method in "mainstream"
PC apps, with the "sRGB" model being among the most popular,
if not the most popular. If your display is set to the sRGB specs,
then images created with that standard in mind should look "correct"
(at least as much as possible), and should be reasonably consistent
on any sRGB-compliant display.

With an LCD monitor, about the only thing the user can generally
adjust to match this spec is the white point (which should be 6500K),
and possibly the gamma. The primaries of pretty much any monitor are
non-adjustable (they're set by the phosphors or the color filters),
and the only things left you're likely to be able to control would
be the brightness and the black level (although as already noted,
the latter is generally not controllable by the user in an LCD
monitor). The sRGB standard is very CRT-oriented, though - for one
thing, if you were to apply it strictly, you'd be using a brightness
(luminance) setting that is unreasonably low for most LCD products -
only 80 cd/m^2 - and as already noted, most LCD products currently
available will not do a great job of matching a CRT-like gamma
(under sRGB, the response is supposed to be a 2.4 power curve with
a slight positive offset).
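
For reference, that "2.4 with a slight positive offset" response is
simple enough to write down exactly - here it is in Python, straight
from the sRGB definition, with all values normalized to 0..1:

    def srgb_encode(linear):
        # Linear light -> sRGB-encoded value: a short linear segment
        # near black, then a 2.4-power curve offset so the pieces meet.
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    def srgb_decode(encoded):
        # The inverse: sRGB-encoded value -> linear light.
        if encoded <= 0.04045:
            return encoded / 12.92
        return ((encoded + 0.055) / 1.055) ** 2.4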

Making the images you see on your monitor match what comes
out of the printer is a separate but similar problem, further complicated
by the fact that the two use completely different color systems
(the additive RGB vs. the subtractive CMYK primary sets), with
color spaces that don't really match up all that well.
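
(Even the crudest textbook conversion between the two makes the
mismatch obvious - a naive Python sketch, ignoring ink behavior,
dot gain, and everything real color management has to deal with:)

    def rgb_to_cmyk(r, g, b):
        # r, g, b in 0..1.  K soaks up the common darkness; C, M, Y
        # cover whatever remains of each channel.
        k = 1.0 - max(r, g, b)
        if k == 1.0:
            return 0.0, 0.0, 0.0, 1.0   # pure black
        c = (1.0 - r - k) / (1.0 - k)
        m = (1.0 - g - k) / (1.0 - k)
        y = (1.0 - b - k) / (1.0 - k)
        return c, m, y, k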

I'm not sure this has been all that helpful, but as noted this is an
extremely complicated subject and a problem which really has
only been partially addressed in the mainstream PC market
to date.

Bob M.
 

"Bob Myers" <nospamplease@address.invalid> wrote in message
news:bsUmd.3100$%A3.799@news.cpqcorp.net...
>
> "chrisv" <chrisv@nospam.invalid> wrote in message
> news:n1nmp09gjqslkefhcgo2sejadr2ga5biuv@4ax.com...
>>
>> For CRT's, I can't understand why they just don't label the
>> "brightness" control "black level".
>>
>
> I'm tempted just to say "tradition" - the term was first used
> on TV sets, from which the early PC monitors were derived -
> but in large part because once established, no one was willing
> to rock the boat by renaming the control - and, as you noted,
> the vast majority of users simply didn't know what the thing
> did under ANY name, so changing to "black level" wouldn't
> really have helped much. It's difficult enough to find someone
> who knew enough to set the thing to just extinguish the
> "blanked" regions.
>
> Bob M.
>
I go along with the tradition idea, amplified by "social inertia". I
can remember having intense discussions with TV marketing types
(I was in the technical arena) on this subject, way back in the early
'70s... Nobody was brave enough then, or since, to take the first
step. Notwithstanding that the BRT and CONT controls on earlier
TVs did not act then as they do in a typical CRT monitor, it must
have been just too easy to keep the status quo, compared to the
work necessary to try to educate "Joe Public".

NGA