flat panel monitors

Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> They are way behind in 3D perf, and only just
> announced their first PCI-Express card.

But are they ahead in 2D performance and image quality? I have a
Millennium II card in my oldest PC, which has always served very well.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> They are way behind in 3D perf, and only just
>> announced their first PCI-Express card.

> But are they ahead in 2D performance and image quality? I have a
> Millennium II card in my oldest PC, which has always served very well.

It depends on your applications, operating system,
PC, and graphics slot (AGP, PCI, PCI-X or PCIe).
You need to hit some forums devoted to your key
apps and get advice.

The two most graphics-intensive things I do, Photoshop
and IMSI TurboCAD, seem to get no particular benefit
from the accelerations available on ATI and Nvidia cards,
and perform quite adequately on a Matrox Parhelia.

Photoshop is compute and bus-bound.

TC uses OpenGL, but only for modes where performance isn't
an issue anyway. In fully-rendered mode, it's doing that
entirely in host software, and is purely compute-bound.

If I ran games, the config might have been different.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> I was actually a bit startled by how crisp
> the screen was using the DVI-D connection. In my CAD work,
> I now always see stair-casing of angled and curved lines,
> whereas on the CRT monitor (same res), they were smooth.

I doubt that this is a result of switching to a digital connection.

Note also that aliasing is usually a sign of lower resolution, not
higher resolution.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> I was actually a bit startled by how crisp
>> the screen was using the DVI-D connection. In my CAD work,
>> I now always see stair-casing of angled and curved lines,
>> whereas on the CRT monitor (same res), they were smooth.

> I doubt that this is a result of switching to a digital connection.

Re-running the comparison, I see that it was partly due
to going digital, but mostly due to switching to LCD.
The former CRT (same res) was providing some additional
de-crisping :-)

> Note also that aliasing is usually a sign of lower
> resolution, not higher resolution.

In this case, I'm making no changes to the video setup
when I switch between CRT and LCD, or analog and digital
on the LCD.

Just playing around in analog mode on the LCD, I see
not only the pink halo on black-on-white objects, but
also some ghosting (or ringing). Likely a result of the
KVM switch and extra cable in that path.

And painting a test pattern with alternating single-pixel
white-black, the white is not pure (but, impressively,
the alignment of the data and display rasters is perfect);
no gray moire.
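
(For anyone who wants to repeat that test: a minimal Python sketch that
writes such an alternating single-pixel pattern as a PGM file. The
1280x1024 size is just an assumed native resolution; change it to
match the panel.)

  # Write a test pattern of alternating single-pixel white/black
  # columns as a binary PGM file. W and H are an assumed native
  # panel resolution; adjust them to match your display.
  W, H = 1280, 1024
  row = bytes(255 if x % 2 == 0 else 0 for x in range(W))
  with open("altpix.pgm", "wb") as f:
      f.write(b"P5\n%d %d\n255\n" % (W, H))   # PGM header
      for _ in range(H):
          f.write(row)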

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0je4o1ft8z8r@news.individual.net...

> The monitor knows that the incoming data will be
> pre-compensated to a gamma (power curve) in the 1.8 ... 2.6
> range, or maybe be linear (no re-comp).

No, the monitor knows nothing about how the incoming
video data is biased; the video source (the host PC) MAY
apply a pre-compensation based on what it knows of the
monitor's response curve (based on the gamma value given
in EDID). But the "correction" the host applies to the
video data is not the issue here. (Whether or not any
correction SHOULD be applied is another matter, and one
that probably deserves some attention later on.) But all
the monitor really knows is that it's getting such-and-such
an input level.

The problem is that while the CRT provides, just by its
nature, a nice "gamma" curve (it's nice for a number of
reasons, not the least of which is that it's a very good match
to the inverse of the human eye's own response curve -
the bottom line result being that linear increases in the input
video level LOOK linear to the eye, even though the actual
output of light from the tube is varying in an objectively
non-linear fashion), the LCD does not do this. The LCD's
natural response curve, from a perceptual standpoint, is
ugly - an S-shaped curve which is sort of linear in the
middle and flattens out at both the black and white ends.


> Why doesn't the look-up more fully adjust-out the
> S-curve, so that the color errors can be corrected
> with the simple exponent adjustment of typical graphics
> card gamma control menus?
>
> My guess is that because LCD subpixels are just barely
> 8-bit, a full correction might minimize color errors at
> the expense of introducing visible terracing in gradients.

Even if they're fully eight bits, that's not enough IF you
are also advertising to the outside world (i.e., to those
devices ahead of the LUT) that you're providing a true
eight-bit accuracy. You've already mapped some of those
values off what they're expected to be, which in effect
will compress the curve in some areas and cause, for
instance, two successive input values to result in the same
ONE output value. You need finer control of the pixel
gray level, relative to the specified accuracy of the input
data, to be able to both compensate the response curve
AND provide that specified accuracy at all levels.
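
(To put numbers on that: a small Python sketch, assuming a pure
power-law correction with an exponent of 2.2, that counts how many of
the 256 input codes collide once the look-up table output is itself
limited to 8 bits.)

  # Build an 8-bit-in, 8-bit-out gamma-correcting look-up table
  # and count collisions: distinct input codes that land on the
  # same output code. The 2.2 exponent is an assumed correction
  # curve for illustration only.
  gamma = 2.2
  lut = [round(255 * (i / 255.0) ** (1 / gamma)) for i in range(256)]
  distinct = len(set(lut))
  print("distinct output codes:", distinct)      # fewer than 256
  print("input codes merged away:", 256 - distinct)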

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Bob Myers <nospamplease@address.invalid> wrote:

>> The monitor knows that the incoming data will be
>> pre-compensated to a gamma (power curve) in the 1.8 ... 2.6
>> range, or maybe be linear (no re-comp).

> No, the monitor knows nothing about how the incoming
> video data is biased; the video source (the host PC) MAY
> apply a pre-compensation based on what it knows of the
> monitor's response curve (based on the gamma value given
> in EDID).

I was using "know" in the metaphorical sense. The
monitor maker knows that the signal is apt to be
either linear, or pre-comped in the 1.8 - 2.6 gamma
range ...

... and that if the user has any tool for dealing with
a mismatch of expectations, it's apt to be just a simple
exponent control, and maybe ganged (can't separately
adjust R, G and B).

> (Whether or not any correction SHOULD be applied is
> another matter, and one that probably deserves some
> attention later on.)

Is a gamma standard a topic of any of the follow-on
standards to DVI? Packet? Send-changed-data-only?

> Even if they're fully eight bits, that's not enough IF you
> are also advertising to the outside world (i.e., to those
> devices ahead of the LUT) that you're providing a true
> eight-bit accuracy. You've already mapped some of those
> values off what they're expected to be, which in effect
> will compress the curve in some areas and cause, for
> instance, two successive input values to result in the same
> ONE output value. You need finer control of the pixel
> gray level, relative to the specified accuracy of the input
> data, to be able to both compensate the response curve
> AND provide that specified accuracy at all levels.

No problem, just do error-diffused dithering in the
monitor's full-frame buffer :-)

Now this could be done in the host, but then we'd need
some new VESA standard for reading back the tables of
stuck values.
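
(Half in jest, but the idea fits in a few lines of Python: quantize an
assumed 10-bit gradient down to 8 bits while carrying each pixel's
rounding error into the next, so local averages preserve the extra
precision.)

  # One-dimensional error-diffused quantization of a 10-bit ramp
  # down to 8 bits: the residual from each pixel is pushed into
  # the next one, so local averages keep the 10-bit information.
  ramp10 = list(range(1024))            # assumed 10-bit gradient
  out8, err = [], 0.0
  for v in ramp10:
      ideal = (v + err) / 4.0           # 1024/256 = 4 input steps per code
      q = min(255, max(0, int(round(ideal))))
      out8.append(q)
      err = (v + err) - q * 4.0         # diffuse the residual forward
  print(out8[:12])                      # averages track the 10-bit ramp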

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:acjur0la3u84nchrn17juprb3d0iuadssb@4ax.com...
> The incoming data might be 8-bit, but there's no reason why the internal
> correction of the monitor can't be carried out with much higher
> granularity.

The "granularity" of the look-up table data is not the
limiting factor; it's the number of bits you have at the
input to the panel, vs. the number of bits you claim to
have at the input to the overall system. If I map 8-bit
input data to, say, 10-bit outputs from the look up
table, I don't get as good a result as I want if the panel
itself has only 8 bits of accuracy. I need to at the very
least call in some additional tricks (which ARE available
- some frame-to-frame dithering can help, for example)
to be able to take advantage of the greater accuracy
in the middle of the chain.
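
(A sketch of that trick in Python: an assumed 10-bit target level is
approximated on an 8-bit panel by showing the two nearest 8-bit codes
in the right ratio from frame to frame, so the eye's temporal
averaging sees the in-between value.)

  # Frame-to-frame dithering: approximate a 10-bit level on an
  # 8-bit panel by alternating the two nearest codes so that the
  # time-average equals the intermediate intensity.
  target10 = 513                        # assumed 10-bit code (0..1023)
  base8 = target10 // 4                 # nearest-below 8-bit code
  frac = target10 % 4                   # quarters toward the next code
  frames, acc = [], 0
  for _ in range(8):                    # eight successive frames
      acc += frac
      if acc >= 4:
          frames.append(base8 + 1)
          acc -= 4
      else:
          frames.append(base8)
  print(frames, "average =", sum(frames) / len(frames))  # 513/4 = 128.25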

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0bkbqcft8z8r@news.individual.net...

> But there are opportunities for the signal to get
> visibly degraded if it goes to analog before it gets
> to the LCD panel lattice. In the entirely unscientific
> test I just ran, where I saw exactly what I expected to
> see, the analog happened to be running through two 2m
> lengths of HD15 cable and a KVM switch. The LCD image
> went from pixel-perfect to slightly fuzzy, and perhaps
> also reduced "contrast".

Oh, sure - but then, that's a bad thing to do to any connection.
Have you tried the corresponding experiment with a
digital interface running at its max. pixel rate? (Nope -
because passive switchboxes and the like simply don't
work with digital interfaces.) In an apples-to-apples
comparison, say a VGA vs. a DVI over the standard
2-3 meters of good quality cable in each case, the
differences you will see are due to sampling errors in the
analog case. Or in other words, the advantage of the digital
interface is that it brings its "sampling clock" along with
the data.


> Umm, if the bits in the frame buffer are going thru a
> DAC (which can introduce noise and distortion), then
> thru a cable (which <ditto>), even if the LCD is not using
> an ADC, and is using the analog signal directly, that
> extra noise and distortion may show up on screen.

Sure; the question is always going to be whether or not
that "noise and distortion" is below the level we care
about. Digital interfaces are not error-free, either; that
they are acceptable, when they are, is the result of the bit
error rate being below perceivable levels. Similarly, if the analog
interface delivers a stable image with the video data to
the desired level of amplitude accuracy (in most cases here,
to an 8 bit/sample level, or an accuracy of about +/- 1.5 mV
in "analog" terms), the difference between the two interfaces
will not be distinguishable. It is ALWAYS a matter of how
good is good enough, and neither type of connection is
ever truly "perfect."
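
(For the record, the arithmetic behind that figure, assuming the
standard 0.7 V black-to-white analog video swing:)

  # One least-significant bit at 8 bits/sample over the standard
  # 0.7 V analog video swing, and the +/- half-LSB accuracy that
  # implies.
  full_scale_v = 0.7                    # standard analog video swing
  levels = 2 ** 8                       # 8 bits/sample -> 256 levels
  lsb_mv = full_scale_v / levels * 1000.0
  print("1 LSB = %.2f mV, so +/- %.2f mV" % (lsb_mv, lsb_mv / 2))
  # prints: 1 LSB = 2.73 mV, so +/- 1.37 mV (roughly the 1.5 mV cited)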


> I sorta suspected that, but in the DVI-D model, the
> signal remains digital until it hits the rows & columns, no?

Well, until it hits the column drivers, yes. On the other hand,
there HAVE been LCD panels made, notably by NEC,
which preserved the analog video signal in analog form clear
through to the pixel level.


> Does the typical analog-only LCD have a DAC? Or does it
> just sample the analog signal and route values to drivers?

It has an ADC right up front - it generally has to, especially
if it supports any sort of image scaling, which is definitely
something best done in the digital domain. Scaling does
not necessarily imply a full frame buffer; modern scalers
make do with a few lines' worth of buffering, unless
frame rate conversion is also required - in which case at
least a good deal of a frame's worth of data must be stored,
and in the best versions a full frame buffer or two of memory
is used.



> Even if the clocks align, there's also the matter of
> whether or not the analog signal has completely slewed
> to the value needed. If the DAC-cable-ADC path has
> bandwidth-limited (softened) the transitions, or
> introduced color-to-color skews, that will show up.
> I see it, or something like it, doing analog on my LCD.

Sure - but you can't really lay the blame for having a BAD
analog interface on analog connections in general. The
point is that a very good interface is still most definitely possible
in the analog domain, and is in fact achieved quite often. There
are also analog systems which take advantage of the rather
forgiving nature of analog to enable truly cheap and nasty
cables, connectors, etc., at the expense of performance.
Digital, as noted, either works or it doesn't - which is a big
part of the reason that digital interfaces are not as inexpensive
as the cheapest (and lowest quality!) of the analog types.
You simply HAVE to meet a certain minimum level of
performance with digital, or you don't get to play AT ALL.

> > ... the New Analog Video Interface standard, or simply
> > NAVI. ... It's not clear yet how well NAVI will be
> > accepted in the industry, but it IS available if
> > anyone chooses to use it.
>
> I suspect it's irrelevant at this point. Analog is
> the "economy" graphics connect now, and what we have
> is sufficient for the market.

Possibly; we'll see how it plays out. While digital
interfaces are becoming a lot more popular, analog
connections still account for well over 80% of the
video actually being used in the desktop monitor
market, even though LCDs took over from CRTs
as the unit volume leader this past year. As you know,
a gargantuan installed base has certain advantages
(or problems, which is often a different word for the
same thing! :-) ).

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Bob Myers <nospamplease@address.invalid> wrote:

>> > ... the New Analog Video Interface standard, or simply
>> > NAVI. ... It's not clear yet how well NAVI will be
>> > accepted in the industry, but it IS available if
>> > anyone chooses to use it.

>> I suspect it's irrelevant at this point. Analog is
>> the "economy" graphics connect now, and what we have
>> is sufficient for the market.

> Possibly; we'll see how it plays out. While digital
> interfaces are becoming a lot more popular, analog
> connections still account for well over 80% of the
> video actually being used in the desktop monitor
> market, even though LCDs took over from CRTs
> as the unit volume leader this past year. As you know,
> a gargantuan installed base has certain advantages
> (or problems, which is often a different word for the
> same thing! :-) ).

Does NAVI bring any benefits to the installed base of
CRTs? Does it matter if it does?

If it does bring benefits to LCD via analog connect,
does that matter? I suspect the users who care about
whatever NAVI promises, will tend to go digital.

And I have a suspicion that the temptation on entry-
level PCs in the near future will be an analog-free
connection. A dumb UMA frame buffer, exposed thru a
TMDS chip thru a DVI-D (only) port on the back panel,
thru a DVI-D (only) cable, to a DVI-D (only) monitor.
Omits a couple of buffers, a DAC, an ADC (maybe) and
some copper. Maybe only runs at native res. Does DVI
allow captive cable at display?

The entire concept of "high end CRT" is already dead,
and increasingly what remains of new CRTs in the market
will tend toward junk (or be seen as so). The momentum
to flat panel (LCD or not) may cause the entire analog
graphics connection to go the way of the impact printer
before NAVI can get a foothold.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:voiur0ltbpk88i1aehf4g17qsotq7cumb5@4ax.com...
> Analog can also be more perfect than digital. In fact, it is always
> possible to build an analog system that is superior to any given digital
> system--if money is no object.

Exactly. Both are simply means of encoding information
for transmission; when comparing "analog" to "digital," the
best that you can ever do is to compare one given
implementation of "analog" vs. a given implementation of
"digital." Neither "analog" nor "digital" is inherently
superior to the other, per se. Each has its own advantages
and disadvantages, and there is a lot of misunderstanding
as to just what those are in each case.

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:fdiur09qghsmlp0pa3bscnpk7ts7iidocb@4ax.com...
> The best analog system will always beat the performance of the best
> digital system.

Unfortunately, I'm going to have to disagree with that, as
well; as I noted in another response here, neither type of
interface, per se, is inherently superior to the other.
Both are ultimately limited by the Gospel According to
St. Shannon, which puts strict limits on how much data
you can get through a given channel REGARDLESS of
how that data is encoded. Now, a particular sort of
a digital interface may or may not be superior to a
particular sort of analog; it depends on the specific
characteristics of the interfaces in question, and just what
is important, in a given application, in determining
"superior."


> This is why the _best_ analog audio systems can consistently beat the
> best digital systems.

That's not the only reason for this; high-end audio also
incorporates huge dollops of what can only be seen as
"religious" beliefs, with no basis in reasoning or evidence,
re a given individual's views on what is "superior." (I
mean no disrespect to religion in saying this; I am simply
noting that there is a difference in kind between a belief
held solely on faith, and one arrived at through a careful
and objective consideration of evidence.) In the case of
audio, an awful lot of what has been claimed for the various
"digital" and "analog" systems is quite simply wrong.
(This isn't the place for that discussion - I'm sure it
continues, unfortunately quite healthy after all these years,
over in rec.audio.high-end, a group I left a long time ago
for just this reason. There's just no sense in discussing
something when very few are interested in anything
other than argument by vigorous assertion.)



>
> > Oddly enough, the LCD is NOT inherently a "digital"
> > device as is often assumed - fundamentally, the control
> > of the pixel brightness in any LCD is an analog process.
>
> Every interface between the digital world and the physical world is
> analog, so all input and output devices are ultimately analog devices.

No. This is a common misconception regarding what is
meant by the term "analog." It does NOT necessarily mean
a system which is "continuous," "linear," etc., even though
in the most common forms of analog systems these are
often also true. "Analog" simply refers to a means of encoding
information in which one parameter is varied in a manner
ANALOGOUS TO (and hence the name) another - for
example, voltage varying in a manner analogous to the original
variations in brightness or sound level. The real world is
not "analog" - it is simply the real world. "Analog" points
to one means of describing real-world events, as does
"digital."

> "Digital" only means something in the conceptual world of information
> representation.

"Digital" is simply another means of representing information;
one in which the information is described as a series of
"digits" (numbers), and again, this is reflected in the name.
It is neither inherently less accurate or more accurate than
"analog" per se - that comparison always depends on the
specifics of the two implementations in question.

If you want a truly painful and detailed treatment of this
question (well, it HAS been one of my hot buttons), I
spent a whole chapter on the subject in my book
"Display Interfaces: Fundamentals & Standards."


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0l3q0rft8z8r@news.individual.net...
> Is anyone prepared to argue that using an HD15 analog
> connection to an LCD monitor provides a "better" presentation?

Sure - but first, you have to define "better." :-)

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> Re-running the comparison, I see that it was partly due
> to going digital, but mostly due to switching to LCD.
> The former CRT (same res) was providing some additional
> de-crisping :-)

Remember that, in theory, there's no fixed upper limit to horizontal
resolution on a CRT, although the mask or grille spacing imposes some
practical restrictions. So you could be seeing additional detail on the
CRT that the LCD cannot display, in some cases.

> Just playing around in analog mode on the LCD, I see
> not only the pink halo on black-on-white objects, but
> also some ghosting (or ringing). Likely a result of the
> KVM switch and extra cable in that path.

It has to be distortion of the signal. The panel is just going to
sample the signal, so if there's a pink halo on the screen, there's one
in the signal.

I'm happy to say that I see no such artifacts on my screen. I just have
a simple 2-metre cable betwixt PC and panel (the cable supplied with the
panel).

> And painting a test pattern with alternating single-pixel
> white-black, the white is not pure (but, impressively,
> the alignment of the data and display rasters is perfect);
> no gray moire.

Maybe you just need to remove the switch and cable.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> Re-running the comparison, I see that it was partly due
>> to going digital, but mostly due to switching to LCD.
>> The former CRT (same res) was providing some additional
>> de-crisping :-)

> Remember that, in theory, there's no fixed upper limit to
> horizontal resolution on a CRT, although the mask or grille
> spacing imposes some practical restrictions.

Not to mention circuit bandwidth, beam spot size,
beam focus and grille diffraction.

> So you could be seeing additional detail on the
> CRT that the LCD cannot display, in some cases.

My impression is less detail on the CRT. Each LCD triad
definitely represents one graphics card frame buffer pixel.
On the CRT, each fb pixel gets smeared into its neighbors
a bit, via one or more of the above mechanisms.

>> Just playing around in analog mode on the LCD, I see
>> not only the pink halo on black-on-white objects, but
>> also some ghosting (or ringing). Likely a result of the
>> KVM switch and extra cable in that path.

> It has to be distortion of the signal. The panel is just
> going to sample the signal, so if there's a pink halo on
> the screen, there's one in the signal.

I've little doubt that the artifacts are due to the analog
connection outside the monitor. And they probably would
improve if I used a single shorter run of HD15 cable.

> Maybe you just need to remove the switch and cable.

Normally, it's only used for temporary PC connections,
so it's not an on-going issue.

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:k0kur0l41j26inl877ikc58uoqpnpi160s@4ax.com...
> Note also that aliasing is usually a sign of lower resolution, not
> higher resolution.
>

Well, no - "aliasing," if that's truly what a given observation
is all about, is always a sign of improper sampling, whether
it's in an analog situation or a digital one. See "Nyquist
sampling theorem" for further details.

The classic sort of "aliasing" in displays is the good ol'
Moire pattern common to CRTs. What few people realize
is that such patterns were in the past seen as GOOD things
when a CRT maker was testing a new tube, as being able
to see the Moire pattern was a visible indication that the
tube was able to focus sufficiently well!
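
(A quick numerical illustration of the Nyquist point in Python: sample
a 3 Hz sine at 8 Hz and at 4 Hz and compare what comes back; the
frequencies are arbitrary, picked only to make the alias obvious.)

  import math

  # Sample a 3 Hz sine at 8 Hz (above the 6 Hz Nyquist rate) and
  # at 4 Hz (below it). At 4 Hz the samples satisfy
  # sin(2*pi*3*n/4) = -sin(2*pi*1*n/4): the 3 Hz tone becomes
  # indistinguishable from a phase-flipped 1 Hz alias.
  f_signal = 3.0
  for f_sample in (8.0, 4.0):
      samples = [math.sin(2 * math.pi * f_signal * n / f_sample)
                 for n in range(8)]
      print("fs = %g Hz:" % f_sample,
            " ".join("%+.2f" % s for s in samples))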

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:1cpur0po7frloek9q1vn306asak5ogji42@4ax.com...
> Remember that, in theory, there's no fixed upper limit to horizontal
> resolution on a CRT,

No, there is ALWAYS an upper limit to the resolution of
a CRT - for the simple reason that, even in theory, an
infinite bandwidth channel is not possible. Any limitation
on bandwidth in the video signal path represents a
resolution limit. And with respect to the CRT specifically,
other resolution limits come in due to the lower limits on the
physical spot size and the ability of the tube to switch the
beam on and off (i.e., you can't make a CRT without
capacitance in the gun structure, so you can never get an
infinitely short rise/fall time unless you can come up with a
video amp that's a perfect voltage source, capable of
delivering infinite current when needed).

> although the mask or grille spacing imposes some
> practical restrictions. So you could be seeing additional detail on the
> CRT that the LCD cannot display, in some cases.

And the mask or grille, along with the phosphor dot structure,
places some very similar limits on the resolution available
from the CRT as does the physical "pixel" structure of the
LCD or other FPD type. (Whether or not the limits are the
SAME for a given pixel pitch is really more a question of
such things as whether or not the LCD in question permits
sub-pixel addressing, which few so far do.)


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> Possibly; we'll see how it plays out. While digital
> interfaces are becoming a lot more popular, analog
> connections still account for well over 80% of the
> video actually being used in the desktop monitor
> market, even though LCDs took over from CRTs
> as the unit volume leader this past year.

If my memory serves me correctly, the earliest monitor connection
interfaces for PCs (CGA and EGA, for example) were _digital_ connection
interfaces. VGA went "backwards" to analog to provide higher
resolutions and color depths, and greater flexibility.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> The entire concept of "high end CRT" is already dead ...

Not for the most critical uses. A high-end CRT is still the best image
quality overall, if you really need the absolute best.

CRTs also still dominate at the low end, since they are ten times
cheaper than flat panels.

As in so many other domains, the advantages of digital do not involve
actual quality, but instead they involve convenience. And in the
imaging field, the usual cost advantage of digital doesn't exist,
either--digital imaging equipment is at least as expensive as analog
equipment, because of the bandwidths required.

> and increasingly what remains of new CRTs in the market
> will tend toward junk (or be seen as so).

CRTs are projected to remain the clear market leaders for years to come.
Flat panels receive all the media hype, but they are not actually
running the show. It all reminds me of the very similar situation in
"digital" (electronic) photography vs. film photography.

> The momentum
> to flat panel (LCD or not) may cause the entire analog
> graphics connection to go the way of the impact printer
> before NAVI can get a foothold.

Not likely any time soon. The inertia of the computer industry today is
enormous; things no longer change overnight. The VGA interface may be
around indefinitely, and some users are still using earlier interfaces
(which, ironically, were digital).

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

>> The entire concept of "high end CRT" is already dead ...

From a market standpoint, I hasten to add.

Sony, for example, has ditched all but one of
their CRTs, most recently the GDM-FW900 24" wide,
even though it sold for less than the 23" LCD
that replaced it. The entire Sony entry-level,
mid-range and hi-end consumer and business CRT
product line is done for. Sony was selling CRTs
using a "higher quality" positioning. The customers
took the extra cash and spent it on LCD.

> Not for the most critical uses. A high-end CRT
> is still the best image quality overall, if you
> really need the absolute best.

And you pay dearly for that. The remaining Sony GDM-C520K
is a $2000 product. But customers other than graphics
professionals, who have $2K to spend, are spending
it on LCD. The wider market for "quality" CRTs is gone.

> CRTs also still dominate at the low end, since
> they are ten times cheaper than flat panels.

Not 10x. LCD prices have been collapsing. Using Wal-Mart
as a low-end reseller, their low-end 17" LCD is only
1.3x the price of their low-end 17" CRT. True, you can get into a
CRT for $70, and their cheapest LCD is $188, but that's
still only 2.7x.

You can watch the Asian press lament the near-daily LCD
pricing collapse at: <http://www.digitimes.com/>

> As in so many other domains, the advantages of digital
> do not involve actual quality, but instead they involve
> convenience.

It has ever been thus. In addition to being trendy and
cool, LCDs are cheaper to ship, use less power, turn on
faster, are easier to install and move around, take up
less space and are less of a problem at disposal time.
The small premium they still command is something an
increasing number of average users are willing to pay.

> CRTs are projected to be clear leaders on the market for
> years to come.

Only if someone is still making them.

> It all reminds me of the very similar situation in
> "digital" (electronic) photography vs. film photography.

Yep. I dumped all my 35mm gear on eBay last year, went
all-digital, and haven't regretted it for a moment.
Silver halide is racing CRT to the exit, but both will
be around for a while yet.

> The VGA interface may be around indefinitely, and some
> users are still using earlier interfaces (which,
> ironically, were digital).

Yep, we've come full circle to CGA and EGA :-)

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> Not to mention circuit bandwidth ...

Circuit bandwidth places an even greater restriction on digital
transmission. For any given channel speed, the real-world capacity of
the channel is always lower for digital transmission than for analog
transmission.

Remember that digital transmission is nothing more than declaring an
arbitrary signal level as a noise threshold, and considering anything
below it as noise and anything above it as information. Inevitably,
this reduces the information-carrying capacity of the channel.

> ... beam spot size, beam focus and grille diffraction.

True, but CRT manufacture is extremely mature, and amazing things can be
done.

There was a time when NTSC meant "never the same color," but even NTSC
is amazingly precise these days--more so than many people would have
ever thought possible.

> My impression is less detail on the CRT. Each LCD triad
> definitely represents one graphics card frame buffer pixel.
> On the CRT, each fb pixel gets smeared into its neighbors
> a bit, via one or more of the above mechanisms.

The total information content on the screen is the same, though.

Some high-end CRTs for broadcast video use have a mode that deliberately
reduces bandwidth in order to produce a more natural-looking image
through the filtering of high-frequency signal that bandwidth
restriction produces. CRTs can handle extremely high resolutions if
need be.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> No, there is ALWAYS an upper limit to the resolution of
> a CRT - for the simple reason that, even in theory, an
> infinite bandwidth channel is not possible.

But I said no _fixed_ upper limit. The upper limit depends on the
performance of all the components in the chain. Ideally it is equal to
or better than the design limit of those components.

So a CRT might be designed to provide x resolution, but in fact it might
stretch to x+10% resolution. Of course, when digital elements are
present in the chain, the extra resolution, if any, is wasted.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> Unfortunately, I'm going to have to disagree with that, as
> well; as I noted in another response here, neither type of
> interface, per se, is inherently superior to the other.

But all digital systems are simply analog systems operated in a
predefined way that declares anything below a certain threshold to be
noise. So the capacity of a digital system is always inferior to that
of an analog system with similar components and bandwidth.

Furthermore, the physical interface at either end of any system is
_always_ analog, so the system as a whole is never better than the
analog input and output components.

It's possible to surpass analog if you are building a system that does
not interface with the physical world. For example, if the system
handles _only_ information (such as accounting data), then you can
easily surpass analog performance with digital methods. But for any
system that requires a physical interface--audio, video, etc.--no
digital system can ever be better than the best possible analog system.
This is inevitable because all digital systems of this kind are just
special cases of analog systems.

> Both are ultimately limited by the Gospel According to
> St. Shannon, which puts strict limits on how much data
> you can get through a given channel REGARDLESS of
> how that data is encoded.

Yes. If the channel is analog, the limit of the channel's capacity is
equal to the limit imposed by Shannon. But if the channel is digital,
the limit on capacity is always below the theoretical limit, because you
always declare some portion of the capacity to be noise, whether it
actually is noise or not. This is the only way to achieve error-free
transmission, which is the advantage of digital.

In analog systems, there is no lower threshold for noise, but you can
use the full capacity of the channel, in theory, and in practice you're
limited only by the quality of your components. In digital systems, you
declare _de jure_ that anything below a certain level is noise, so you
sacrifice a part of the channel capacity, but in exchange for this you
can enjoy guaranteed error-free transmission up to a certain speed.

> That's not the only reason for this; high-end audio also
> incorporates huge dollops of what can only be seen as
> "religious" beliefs, with no basis in reasoning or evidence,
> re a given individual's views on what is "superior."

Not necessarily. Ultimately, audio systems (and imaging systems) depend
on analog devices for input and output. So no system can ever be better
than the best analog system. This is inevitable for any system that
requires interfaces with the physical world, such as displays,
microphones, speakers, etc., all of which _must_ be analog.

The real problem with analog is not its ability to provide quality
(which is limited only by the limits of information theory) but the
extremely high expense and inconvenience of obtaining the best possible
quality. Digital provides a slightly lower quality for a dramatically
lower price.

Just look at flat panels: they provide defect-free images at a fixed
resolution, but they don't provide any higher resolutions. CRTs have no
fixed upper limit on resolution, but they never provide defect-free
images.

> No. This is a common misconception regarding what is
> meant by the term "analog." It does NOT necessarily mean
> a system which is "continuous," "linear," etc., even though
> in the most common forms of analog systems these are
> often also true. "Analog" simply refers to a means of encoding
> information in which one parameter is varied in a manner
> ANALOGOUS TO (and hence the name) another - for
> example, voltage varying in a manner analogous to the original
> variations in brightness or sound level. The real world is
> not "analog" - it is simply the real world. "Analog" points
> to one means of describing real-world events, as does
> "digital."

Analog reduces to using the entire channel capacity to carry
information, and tolerating the losses if the channel is not noise-free.
Digital reduces to sacrificing part of channel capacity in order to
guarantee lossless transmission at some speed that is below the maximum
channel capacity. With digital, you sacrifice capacity in order to
eliminate errors. With analog, you tolerate errors in order to gain
capacity.

Only analog systems can reach the actual limits of a channel in theory,
but ironically digital systems usually do better in practice. Part of
this arises from the fact that analog systems introduce cumulative
errors, whereas digital systems can remain error-free over any number of
components in a chain, as long as some of the theoretical capacity of
the chain is sacrificed in exchange for this.

I used to go with the "analogy" explanation for digital vs. analog, but
since everything in reality can be seen as _either_ a digital or analog
representation, this explanation tends to break down under close
examination. The explanation I give above does not, and it is
compatible with other explanations (for example, representing things
with symbols is just another form of the arbitrary threshold for noise
that I describe above).

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> The "granularity" of the look-up table data is not the
> limiting factor; it's the number of bits you have at the
> input to the panel, vs. the numer of bits you claim to
> have at the input to the overall system. If I map 8-bit
> input data to, say, 10-bit outputs from the look up
> table, I don't get as good a result as I want if the panel
> itself has only 8 bits of accuracy.

But the panel is driving analog pixels. If you get a 10-bit value from
the LUT, why can't you just change this directly to an analog voltage
and drive the pixels from it? You'll still be limited to 256 discrete
luminosity levels for a pixel, but each of those levels can be chosen
from a palette of 1024 steps between black and white. So you have more
precise control of gamma on output. You could use more bits to make it
even more precise.
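
(Sketching that in Python: 256 input codes, each mapped through an
assumed power-law correction onto a 10-bit, 1024-step output grid
before becoming a drive voltage. Compare the 8-bit-output table
earlier in the thread, where input codes collide.)

  # 8-bit input codes, each placed on a 10-bit (1024-step) output
  # grid, so all 256 drive levels can sit precisely on the desired
  # response curve. The 2.2 exponent and 0.7 V swing are assumed
  # values for illustration.
  gamma = 2.2
  lut10 = [round(1023 * (i / 255.0) ** (1 / gamma)) for i in range(256)]
  volts = [0.7 * code / 1023.0 for code in lut10]
  print(len(set(lut10)), "distinct levels from 256 inputs")  # all 256 survive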

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Niland writes:

> From a market standpoint, I hasten to add.

Even that I wonder about. Flat panels are the rage in developed
countries, but CRTs still have a market elsewhere, since they are so
cheap.

> Sony, for example, has ditched all but one of
> their CRTs, most recently the GDM-FW900 24" wide,
> even though it sold for less than the 23" LCD
> that replaced it.

I'm not sure that this was a good decision on Sony's part, but then
again, Mr. Morita has been dead for quite a while now.

> The entire Sony entry-level,
> mid-range and hi-end consumer and business CRT
> product line is done for. Sony was selling CRTs
> using a "higher quality" positioning. The customers
> took the extra cash and spent it on LCD.

So all the Artisan buyers chose LCDs instead? That's hard to believe.

> And you pay dearly for that. The remaining Sony GDM-C520K
> is a $2000 product.

About the same as any decent mid-range LCD. My little flat panel cost
that much.

> Not 10x. LCD prices have been collapsing.

You can get CRTs for $60 or so.

> True, you can get into a
> CRT for $70, and their cheapest LCD is $188, but that's
> still only 2.7x.

For a large segment of the market, that's a lot.

> You can watch the Asian press lament the near-daily LCD
> pricing collapse at: <http://www.digitimes.com/>

Why do they have a problem with it? I thought margins were small.

> Only if someone is still making them.

They will likely be made in Asia for quite some time. There are still
several billion people there without monitors.

> Yep. I dumped all my 35mm gear on eBay last year, went
> all-digital, and haven't regretted it for a moment.

I still shoot film.

> Silver halide is racing CRT to the exit, but both will
> be around for a while yet.

The demise of CRTs has been predicted for forty years, and we are still
waiting.

> Yep, we've come full circle to CGA and EGA :-)

A lot of the people making the decisions today are too young to remember
CGA and EGA, so they think they're inventing something new.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

> Mxsmanic <mxsmanic@hotmail.com> wrote:

> So all the Artisan buyers chose LCDs instead?

No, the last remaining Sony CRT, the GDM-C520K,
is an Artisan.

> You can get CRTs for $60 or so.

Even though CD audio media was higher priced than
LP, and CD players were substantially higher priced
than turntables, CD still killed LP surprisingly rapidly.
Just because the old stuff is cheaper, and arguably
"better", may not save it. Market forces have a
logic of their own that isn't necessarily logical.

> The demise of CRTs has been predicted for forty
> years, and we are still waiting.

Well, flat panel TV has been only ten years away
for the last 50 years. It's here now. When the
existing TVs in this household fail, they'll get
replaced by something flat, for any number of
reasons.

Note Bob Myers' observation that LCD sales eclipsed
CRT within the last year. That's a fairly important
event, and won't go unnoticed by industry planners.

Curiously, I also note that Apple has entirely
dropped CRTs from their product line. That really
surprised me, because I'm not convinced that LCD
is really ready yet for pre-press, broadcast DCC,
video post and movie post (entirely apart from
the recent user complaints about the color
uniformity and stability of the Cinema 23).

--
Regards, Bob Niland mailto:name@ispname.tld
http://www.access-one.com/rjn email4rjn AT yahoo DOT com
NOT speaking for any employer, client or Internet Service Provider.