flat panel monitors

Archived from groups: comp.sys.ibm.pc.hardware.video (More info?)

Bob Niland writes:

> No, the last remaining Sony CRT, the GDM-C520K,
> is an Artisan.

I had read elsewhere that even production of that had stopped.

> Even though CD audio media was higher priced than
> LP, and CD players were substantially higher priced
> than turntables, CD still killed LP surprisingly rapidly.

But that's not a valid analogy.

CDs and LPs are storage media, not input or output devices. There are
tremendous practical advantages to digital storage over analog storage;
these advantages ensured success for the CD format.

For input and output devices, the situation is different. For one
thing, they are all analog, whether they are called digital or not. And
because of this, there's no intrinsic advantage to moving to "digital"
devices such as electronic cameras or flat-panel displays. You're
really just exchanging one analog technology for another. The
advantages of a newer technology cannot be taken for granted; it may or
may not be superior to the old technology. And even in the best of
cases, it may take a very long time to become dominant over the older
technology. And most importantly of all, none of it is really
"digital." LCDs depend on variable voltages just as CRTs do.
Permanently dividing the screen into discrete pixels does help for
things like geometry, but it hurts for things like resolution (only one
resolution works if the pixels are fixed on the screen).
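(A quick illustration of the fixed-pixel point, with made-up sizes and a hypothetical helper, not anything from actual panel firmware: displaying a non-native width on a fixed grid forces some source pixels to be duplicated or dropped.)

```python
# Sketch: why only one resolution maps 1:1 onto a fixed pixel grid.
# Nearest-neighbor mapping of a source row onto a native-width row;
# sizes are illustrative.

def scale_row(src, native_width):
    """Map a source row onto a fixed native pixel grid (nearest-neighbor)."""
    return [src[int(i * len(src) / native_width)] for i in range(native_width)]

print(scale_row([10, 20, 30], 4))  # non-native: a pixel gets duplicated
print(scale_row([10, 20, 30], 3))  # native width maps 1:1
```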

> Just because the old stuff is cheaper, and arguably
> "better", may not save it. Market forces have a
> logic of their own that isn't necessarily logical.

That doesn't mean that one must throw up one's hands and follow the
market.

> Well, flat panel TV had been only ten years away
> for the last 50 years. It's here now. When the
> existing TVs in this household fail, they'll get
> replaced by something flat, for any number of
> reasons.

Not unless the flat panels cost about the same as the tubes. The
majority of TV owners in the world can barely afford a tube TV, much
less a flat panel.

And in computerland, there are still people out there running Windows
3.1 on 80386 machines. They aren't going to rush out and buy flat
panels.

> Note Bob Myers observation that LCD sales eclipsed
> CRT within the last year. That's a fairly important
> event, and won't go unnoticed by industry planners.

It's important not to overestimate the significance of short-term
trends. CRTs are a replacement market; flat panels are often new
purchases (either unnecessary replacements or completely new
acquisitions). Digital photography is seeing the same thing.

> Curiously, I also note that Apple has entirely
> dropped CRTs from their product line. That really
> surprised me, because I'm not convinced that LCD
> is really ready yet for pre-press, broadcast DCC,
> video post and movie post (entirely apart from
> the recent user complaints about the color
> uniformity and stability of the Cinema 23).

Professionals who need more quality probably weren't buying their
monitors from Apple to begin with. There are lots of specialized
manufacturers who probably do a better job than Apple in this domain.

Come to think of it, I can't remember the last time I saw an Apple
CRT--the iMac maybe? I don't look much at Apple machines, though.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:onhvr0pjlhis5ksvaes5glcv4jnheja16l@4ax.com...
> But the panel is driving analog pixels. If you get a 10-bit value from
> the LUT, why can't you just change this directly to an analog voltage
> and drive the pixels from it?

That's exactly the problem; there are as yet no
10-bit column drivers (which are the components within
the LCD panel that convert the digital input information
into an analog voltage to drive the pixels) in mainstream use.
10-bit drivers have only recently been introduced at ALL,
and I have heard from some manufacturers that so far they're
not seeing acceptable noise performance from these. And
obviously, they're also more expensive than 6-bit or 8-bit
drivers, which are the mainstream components right now.
But these problems very clearly are being addressed, and it's
only a matter of time before we have 10-bit control at the
pixel level.
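(To see concretely what gets lost when a 10-bit LUT value meets a lower-resolution driver, here's a sketch; the function is hypothetical, but the arithmetic is just bit truncation.)

```python
# Sketch: feeding a 10-bit LUT code to a column driver with fewer bits.
# The low-order bits are simply unrepresentable by the driver.

def quantize(code10, driver_bits):
    """Truncate a 10-bit code to the driver's resolution, then re-expand."""
    shift = 10 - driver_bits
    return (code10 >> shift) << shift

print(quantize(515, 8))   # 8-bit driver: the 2 LSBs are lost
print(quantize(515, 6))   # 6-bit driver: 4 LSBs lost
print(quantize(515, 10))  # a true 10-bit driver preserves the code
```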

Bob M.
 

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0vp5n6ft8z8r@news.individual.net...
> I was using "know" in the metaphorical sense. The
> monitor maker knows that the signal is apt to be
> either linear, or pre-comped in the 1.8 - 2.6 gamma
> range ...

Still "no," with the exception for TV products intended
for use with a given broadcast TV standard. It doesn't
really matter much for the purposes of this discussion,
since the bottom line remains the same - no matter what
application we're talking about, we'd LIKE it for LCD
monitors to deliver a "CRT-like" response curve with a
gamma of around 2.2-2.5. The problem is how that
might be achieved.

> ... and that if the user has any tool for dealing with
> a mismatch of expectations, it's apt to be just a simple
> exponent control, and maybe ganged (can't separately
> adjust R, G and B).

Most likely, and in fact some products do provide
such controls to the user. The problem remains that
there is not a sufficiently fine degree of control provided
by the current hardware to allow the curve to be matched
well while NOT impacting the accuracy of the response
elsewhere.
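(A numerical sketch of why a single ganged exponent can't do the job: assume a hypothetical S-shaped native LCD response — the tanh curve below is invented for illustration — and pick the one exponent that matches a 2.2-gamma target at mid-scale. The corrected curve is exact there and wrong everywhere else.)

```python
# One global exponent can only force agreement at one point when the
# native response is not itself a pure power law.
import math

def native(x):                       # hypothetical S-curve LCD response
    return 0.5 * (1 + math.tanh(4 * (x - 0.5)))

def target(x):                       # CRT-like gamma 2.2
    return x ** 2.2

# Choose the exponent that makes the corrected curve exact at x = 0.5.
g = math.log(target(0.5)) / math.log(native(0.5))
corrected = lambda x: native(x) ** g

for x in (0.25, 0.5, 0.75):
    print(round(corrected(x) - target(x), 3))  # zero only at x = 0.5
```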


> Is a gamma standard a topic of any of the follow-on
> standards to DVI? Packet? Send-changed-data-only?

Not with regard to the interface standards themselves, no.
There ARE, of course, various standards which define the
gamma that "should" be provided by the display, such as
sRGB and the aforementioned broadcast TV signal standards.
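(For reference, the sRGB response mentioned above is defined piecewise — a short linear toe plus a 2.4-exponent segment — which works out to roughly an overall 2.2 gamma:)

```python
# The sRGB electro-optical transfer function, per the sRGB standard.

def srgb_to_linear(c):
    """Convert an sRGB-encoded value (0..1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

print(round(srgb_to_linear(0.5), 4))  # close to 0.5 ** 2.2
```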


Bob M.
 

"Bob Niland" <email4rjn@yahoo.com> wrote in message
news:opsi0trvx5ft8z8r@news.individual.net...
> Does NAVI bring any benefits to the installed base of
> CRTs? Does it matter if it does?

With respect to the first question - yes, there are some
features in NAVI that would be useful to CRTs, if any were
to implement them. Among these are what amounts to an
"automatic gain control" feature (i.e., the display could
automatically compensate for signal amplitude errors,
including cable losses) and a channel for carrying digital
audio information over the VGA interface.

I'm not sure how to answer the second question.


> If it does bring benefits to LCD via analog connect,
> does that matter? I suspect the users who care about
> whatever NAVI promises, will tend to go digital.

Oddly enough, one of the other features that the NAVI
standard provides is a means to do true "digital"
transmission of the video information over the VGA
interface.

> And I have a suspicion that the temptation on entry-
> level PCs in the near future will be an analog-free
> connection. A dumb UMA frame buffer, exposed thru a
> TMDS chip thru a DVI-D (only) port on the back panel,
> thru a DVI-D (only) cable, to a DVI-D (only) monitor.
> Omits a couple of buffers, a DAC, an ADC (maybe) and
> some copper. Maybe only runs at native res. Does DVI
> allow captive cable at display?

I think so - but the biggest problem in the above scenario
is that the PC industry has yet to get away from the model
that says the display is a very flexible device in terms of the
range of formats and timings it should be expected to accept.
God knows we ARE trying to get there, though...


Bob M.
 

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:2plvr0d0hnf7c5gck0sektqo2t5bdnm7lq@4ax.com...
> Even that I wonder about. Flat panels are the rage in developed
> countries, but CRTs still have a market elsewhere, since they are so
> cheap.

The question, though, is to a very large degree NOT
whether a market exists, but if there will be anyone left
willing to make the CRTs. CRT production is not something
that some little niche player will be able to crank up and keep
going in their garage; it takes a pretty sizable commitment of
capital. Not as much as an LCD fab, I grant you, but it's
still something that's going to be the domain of the big boys
- and once they're tired of making them, they're GONE.

>
> So all the Artisan buyers chose LCDs instead? That's hard to believe.
>

Who says that they will have a choice? You can't buy a
product which is no longer manufactured.




> > You can watch the Asian press lament the near-daily LCD
> > pricing collapse at: <http://www.digitimes.com/>
>
> Why do they have a problem with it? I thought margins were small.

Exactly...and the farther the prices collapse, the worse the
margins get. And SOMEONE has to pay for those nice
shiny new billion-dollar fabs that are driving these prices
down in the first place.

> > Only if someone is still making them.
>
> They will likely be made in Asia for quite some time. There are still
> several billion people there without monitors.

That CRTs are made SOMEWHERE is no guarantee that
CRTs of the type and quality we've been speaking of here are
still available in the market. What you're going to see coming
out of the "new" Asian sources (e.g., mainland China) are almost
certainly going to be entry-level products only. The high end
WILL go to non-CRT technologies, by necessity.


Bob M.
 

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:0huvr01vmh6464sus2d1h2j7p2t3i97afs@4ax.com...
> I had read elsewhere that even production of that had stopped.

I believe you have read correctly.

>
> > Even though CD audio media was higher priced than
> > LP, and CD players were substantially higher priced
> > than turntables, CD still killed LP surprisingly rapidly.
>
> But that's not a valid analogy.

I disagree; even though the CD and LP are both
examples of storage technologies, the example DOES
apply in the sense of a new, incompatible technology
displacing an older one. The discs themselves may be
simply "storage devices," but to move to CD the user also
had to buy a new sort of player that FUNCTIONALLY
did exactly what the old one (the turntable) did - it "read"
the information from the storage medium and converted
it to an audio signal. That's a very close analogy to the
LCD vs. CRT situation. The LCD fundamentally DOES
what a CRT does - it presents images to the viewer -
but through a completely different technology, and one which
does not co-exist in the same manufacturing environment
as its predecessor. It is a displacement sort of change in the
market, rather than simply an "upgrade" to a "new and better"
version of the same sort of thing (as occurs, for instance,
every time a new model of car is introduced).


> For input and output devices, the situation is different. For one
> thing, they are all analog, whether they are called digital or not.

Well, no, not really, but then I've already said enough about that
elsewhere. There are most certainly "digital" display devices (i.e.,
those that fundamentally deal with "numbers" rather than analog
control) - the LCD just doesn't happen to be one of them.


> Permanently dividing the screen into discrete pixels does help for
> things like geometry, but it hurts for things like resolution (only one
> resolution works if the pixels are fixed on the screen).

What do you think the phosphor triads of a CRT do?


> That doesn't mean that one must throw up one's hands and follow the
> market.

Actually, in many cases it DOES. If the forces of the market
as a whole wind up dictating that the product you want to buy
is no longer produced, and you yourself do not have the resources
to continue to make that product on your own, you ARE forced
to follow the market. That situation is very rapidly coming to be
in the case of the high-end CRT market. Very soon, there simply
won't be any such things to be had. (An analogy: no matter how
much you might want to buy a factory-fresh Ford Model T, there
simply is no such thing on the market these days.)


> Not unless the flat panels cost about the same as the tubes. The
> majority of TV owners in the world can barely afford a tube TV, much
> less a flat panel.

Which is why the CRT will continue for quite some time as the
entry-level display of choice, and similarly will continue to dominate
the under-30" classes of TV products. But again, that's not the
market we've been talking about here.


> > Note Bob Myers observation that LCD sales eclipsed
> > CRT within the last year. That's a fairly important
> > event, and won't go unnoticed by industry planners.
>
> It's important not to overestimate the significance of short-term
> trends.

But this is not a short term trend. There is NO ONE in
the business today who expects the CRT to regain any
market share in the desktop monitor arena, and the trend
of growing LCD share and declining CRT has been going
on for ten years now (it simply accelerated a lot in the
last few years, as LCD prices came down through certain
"magic" levels). Things are looking a bit better for the CRT
in the TV space, where it will continue to hold on to significant
market share at least through this decade and likely into the
next, but again no one is expecting the trend line to change
direction.


Bob M.
 

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:tsfvr05ef6ripvc9nor1pa9voi32mnmrlt@4ax.com...
> Bob Myers writes:
>
> > Possibly; we'll see how it plays out. While digital
> > interfaces are becoming a lot more popular, analog
> > connections still account for well over 80% of the
> > video actually being used in the desktop monitor
> > market, even though LCDs took over from CRTs
> > as the unit volume leader this past year.
>
> If my memory serves me correctly, the earliest monitor connection
> interfaces for PCs (CGA and EGA, for example) were _digital_ connection
> interfaces. VGA went "backwards" to analog to provide higher
> resolutions and color depths, and greater flexibility.

Technically, the very earliest monitor connections for
PCs were analog - since the very earliest PCs, not having
an established "monitor" market of their own, simply provided
either TV outputs on an RF connection or baseband video,
and used the home TV as the display. CGA and EGA were
"digital", but then why would VGA be "going backwards" simply
because it was analog? It can be argued that the very earliest
form of electronic (well, "electric," at least) communication was
digital in nature - everyone remember Mr. Samuel F. B. Morse?

Bob M.
 

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:upgvr095kjccjbt1orku8q2hed32f79mjo@4ax.com...
> But all digital systems are simply analog systems operated in a
> predefined way that declares anything below a certain threshold to be
> noise. So the capacity of a digital system is always inferior to that
> of an analog system with similar components and bandwidth.

No. Fundamentally, there is no such thing as an "analog"
system or a "digital" system - there is just electricity, which
operates according to the same principles whether it is
carrying information in either form. The capacity of a given
communications channel is limited by the bandwidth of the
channel and the level of noise within that channel, per Shannon;
but Shannon's theorems do NOT say what the optimum form
of encoding is, in the sense of whether it is "analog" or
"digital." My favorite example of a device which pushes channel
capacity limits about as far as they can go is the modem - and
do you call the signals that such a device produces "analog"
or "digital"? The answer is that they are purely digital - there
is absolutely nothing in the transmitted signal which can be
interpreted in an "analog" manner (i.e., the level of some
parameter in the signal is directly analogous to the information
being transmitted). The signal MUST be interpreted as symbols,
or in simplistic terms "numbers," and that alone makes it
"digital." The underlying electrical current itself is neither analog
nor digital from this perspective - it is simply electricity.
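(For anyone who wants the Shannon limit in concrete terms: C = B * log2(1 + S/N). The bandwidth and SNR below are illustrative phone-line ballpark figures, not measurements.)

```python
# Shannon channel capacity: depends only on bandwidth and SNR,
# not on whether the encoding is "analog" or "digital".
import math

def capacity_bps(bandwidth_hz, snr_db):
    snr = 10 ** (snr_db / 10)          # convert dB to a power ratio
    return bandwidth_hz * math.log2(1 + snr)

print(round(capacity_bps(3100, 37)))   # a few tens of kbit/s
```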


> Furthermore, the physical interface at either end of any system is
> _always_ analog, so the system as a whole is never better than the
> analog input and output components.

This is also an incorrect assumption; I can give several examples
of I/O devices which behave in a "digital" manner.

> Yes. If the channel is analog, the limit of the channel's capacity is
> equal to the limit imposed by Shannon. But if the channel is digital,
> the limit on capacity is always below the theoretical limit, because you
> always declare some portion of the capacity to be noise, whether it
> actually is noise or not. This is the only way to achieve error-free
> transmission, which is the advantage of digital.

No. Shannon's theorems set a limit which is independent of the
signal encoding, and in fact those real-world systems which come
the closest to actually meeting the Shannon limit (which can never
actually be achieved, you just get to come close) are all currently
digital. (The digital HDTV transmission standard is an excellent
example.) Conventional analog systems do not generally approach
the Shannon limit; the reason for this becomes apparent once the
common misconception that an analog signal is "infinitely"
variable is disposed of.


> In analog systems, there is no lower threshold for noise,

And THAT is the fancy version of the above misconception.
There is ALWAYS a noise floor in any real-world channel,
and there is always a limit, set by the various noise/error floors,
to the accuracy with which an analog signal can be produced,
transmitted, and interpreted. It is simply NOT POSSIBLE,
for example, for common analog video systems over typical
bandwidths to deliver much better than something around 10-12 bit
accuracy (fortunately, that's about all that is ever NEEDED, so
we're OK there). I defy you, for instance, to point to an example
in which analog information is transmitted at, say, 24-bit (per
component) accuracy over a 200 MHz bandwidth.
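(The "10-12 bit" figure follows from the usual rule of thumb that each effective bit of analog accuracy costs roughly 6 dB of SNR: ENOB = (SNR_dB - 1.76) / 6.02.)

```python
# Effective number of bits implied by a given channel SNR.

def effective_bits(snr_db):
    return (snr_db - 1.76) / 6.02

print(round(effective_bits(62), 1))   # ~10 bits
print(round(effective_bits(74), 1))   # ~12 bits
```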


> but you can
> use the full capacity of the channel, in theory, and in practice you're
> limited only by the quality of your components.

But even in theory, those components cannot be "perfect." A
transistor or resistor, for example, MUST produce a certain
level of noise at a minimum. You can't escape this; it's
built into the fundamental laws of physics that govern these
devices. The short form of this is There Is No Such Thing As
A Noise Free Channel EVER - not even in theory.
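(The physics behind that claim is Johnson-Nyquist thermal noise: any resistance at temperature T generates v_rms = sqrt(4kTRB). The 75-ohm/200 MHz numbers below are just a typical video-termination example.)

```python
# Thermal noise floor of a resistor: unavoidable even in theory.
import math

K_BOLTZMANN = 1.380649e-23   # Boltzmann constant, J/K

def thermal_noise_vrms(r_ohms, bandwidth_hz, temp_k=290):
    return math.sqrt(4 * K_BOLTZMANN * temp_k * r_ohms * bandwidth_hz)

# A 75-ohm termination over 200 MHz already sits in the microvolt range.
print(thermal_noise_vrms(75, 200e6))
```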


> > That's not the only reason for this; high-end audio also
> > incorporates huge dollops of what can only be seen as
> > "religious" beliefs, with no basis in reasoning or evidence,
> > re a given individuals' views on what is "superior."
>
> Not necessary. Ultimately, audio systems (and imaging systems) depend
> on analog devices for input and output. So no system can ever be better
> than the best analog system.

That does not logically follow, for a number of reasons. What
DOES logically follow is that no system can ever be better
than the performance of the input and output devices (which
we are assuming to be common to all such systems), but this
says nothing about the relative merits of the intermediate components.
If it is possible for the best "digital" intermediate to better the
best "analog" intermediate, then the digital SYSTEM will be
better overall, unless BOTH intermediates were already so good
that the limiting factors were the common I/O devices. This is
not the case in this situation. (For one thing, it's not quite a
case of the chain being quite as good as its weakest link -
noise is ADDITIVE, just to note one problem with that model.)


> Just look at flat panels: they provide defect-free images at a fixed
> resolution, but they don't provide any higher resolutions. CRTs have no
> fixed upper limit on resolution, but they never provide defect-free
> images.

As has already been shown, CRTs most definitely have fixed
upper limits on resolution.

> Analog reduces to using the entire channel capacity to carry
> information, and tolerating the losses if the channel is not noise-free.
> Digital reduces to sacrificing part of channel capacity in order to
> guarantee lossless transmission at some speed that is below the maximum
> channel capacity.

No. Here, you are actually comparing two particular versions
of "analog" and "digital," not fundamental characteristics of these
encodings per se. And the most common examples of "analog"
signalling do NOT, in fact, use the full channel capacity. (Even
if they did, a "digital" signalling method can also be devised which
does this - again, see the example of the modern modem or
HDTV transmission.)

> With digital, you sacrifice capacity in order to
> eliminate errors. With analog, you tolerate errors in order to gain
> capacity.

"Capacity" is only meaningful if stated as the amount of information
which can be carried by a given channel WITHOUT ERROR.
Any error is noise, and represents a loss of capacity. What I
THINK you mean to say here is probably something like "quality,"
but in a very subjective sense.

> I used to go with the "analogy" explanation for digital vs. analog, but
> since everything in reality can be seen as _either_ a digital or analog
> representation,

NO. Let me be more emphatic: HELL, NO. Reality is reality;
it is neither "digital" nor "analog." Those words do NOT equate
to "discrete" or "sampled" or "linear" or "continuous" or any other
such nonsense that often gets associated with them. They have
quite well-defined and useful meanings all on their own, and they
have to do with how information is encoded. Nothing more and
nothing less. The world is the world; "analog" and "digital" refer to
different ways in which we can communicate information ABOUT
the world (and they are not "opposites," any more than saying that
"red is the opposite of blue" is a meaningul statement).


Bob M.
 

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:vjgvr0hikf8s3du6vij9nl140va7mmro31@4ax.com...
> But I said no _fixed_ upper limit. The upper limit depends on the
> performance of all the components in the chain. Ideally it is equal to
> or better than the design limit of those components.

We must be using different meanings for the word "resolution",
then. I most certainly see, for instance, the phosphor dot and
shadow mask structure of the typical color CRT as imposing a
very fixed limit on resolution. Can that CRT be used to display
image formats which supposedly provide a greater number of
"pixels"? Sure - but that's not the same as RESOLVING them.

Bob M.
 

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:magvr0pad3nvjr58786c1s9crpomftod3o@4ax.com...
> Bob Niland writes:
>
> > Not to mention circuit bandwidth ...
>
> Circuit bandwidth places an even greater restriction on digital
> transmission. For any given channel speed, the real-world capacity of
> the channel is always lower for digital transmission than for analog
> transmission.

You are assuming the "digital transmission" must always
equate to simple binary encoding, one bit per symbol
for a given physical channel. That is not the case.

> Remember that digital transmission is nothing more than declaring an
> arbitrary signal level as a noise threshold, and considering anything
> below it as noise and anything above it as information. Inevitably,
> this reduces the information-carrying capacity of the channel.

But no more than the capacity is reduced by the channel anyway;
if a given level of noise exists in the channel, then the level of an
analog signal cannot be determined more precisely than the limit
imposed by the noise. It is exactly the same limit, for exactly the
same reasons, no matter what form of encoding is used. (It's
interesting to note that the Shannon limit is most meaningfully
expressed in units of bits/second or similar, but that the use of
such units does NOT imply that a "digital" system must be used
to transmit the information.)
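(The multi-bit-per-symbol point in numbers — the symbol rate below is illustrative, not taken from any particular modem standard: with multi-level signaling, bit rate = symbol rate * log2(levels), so symbol rate and bit rate are different things.)

```python
# "Digital" does not mean one bit per symbol.
import math

def bit_rate(symbols_per_sec, levels):
    return symbols_per_sec * math.log2(levels)

print(bit_rate(3000, 2))     # binary: 1 bit per symbol
print(bit_rate(3000, 1024))  # 10 bits per symbol, same symbol rate
```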

>
> True, but CRT manufacture is extremely mature, and amazing things can be
> done.

To quote my favorite fictional engineer: "Ye canna change the
laws o' physics, Cap'n!" 🙂 The limits of what you can do with a
CRT are pretty well known at this point, in large part because it
IS such a mature technology. We understand pretty well what can
be done in terms of shaping, accelerating, and directing a beam
of electrons.


Bob M.
 

Bob Myers writes:

> I disagree; even though the CD and LP are both
> examples of storage technologies, the example DOES
> apply in the sense of a new, incompatible technology
> displacing an older one.

But CDs were successful because of the clear superiority of digital
storage over analog storage. Flat-panel displays and similar devices
are not digital but analog, and they are input-output devices, not
storage devices, and so their advantages, if any, are far less patent,
and one cannot plausibly predict that they will succeed simply because
there is something "digital" about them in some respect (the only thing
they have in common with CDs).

> The LCD fundamentally DOES
> what a CRT does - it presents images to the viewer -
> but through a completely different technology, and one which
> does not co-exist in the same manufacturing environment
> as its predecessor.

But the LCD has no "digital advantage." It does have other advantages
(and disadvantages), but being "digital" is not among them. It's still
an analog device, like any other display.

> Well, no, not really, but then I've already said enough about that
> elsewhere. There are most certainly "digital" display devices (i.e.,
> those that fundamentally deal with "numbers" rather than analog
> control) - the LCD just doesn't happen to be one of them.

Every display device eventually produces an analog output, and the
quality of this analog output usually determines most of the quality of
the displayed image.

> What do you think the phosphor triads of a CRT do?

They are tiny, and the raster that excites them is adjustable. By
adjusting the raster you can perform very smooth interpolation with no
special circuitry. On an LCD display, you need special logic to
explicitly perform interpolation for non-native resolutions.

This was never much of an advantage for me because I always ran CRTs at
the highest resolution I could, but for people who like to run lower
resolutions, it's very handy.
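(A one-dimensional sketch of the "special logic" an LCD scaler needs for non-native modes — a linear interpolation standing in for the bilinear resampling a real panel performs; sizes are illustrative.)

```python
# Linearly interpolate a source row to the panel's native width.

def resample_row(src, out_width):
    out = []
    for i in range(out_width):
        pos = i * (len(src) - 1) / (out_width - 1)  # position in source
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

print(resample_row([0, 100], 5))  # intermediate pixels are synthesized
```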

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 

Bob Myers writes:

> The question, though, is to a very large degree NOT
> whether a market exists, but if there will be anyone left
> willing to make the CRTs.

From what I hear, places like China can produce them dirt cheap.

> CRT production is not something
> that some little niche player will be able to crank up and keep
> going in their garage; it takes a pretty sizable commitment of
> capital.

Nothing compared to building LCDs. So in poor markets with
underdeveloped industrial infrastructures, it would make sense to
continue building CRTs. I expect the Third World may continue to do
this for some time; indeed, they may be the only ones building
Trinitron-style CRTs soon (and to think how exclusive Sony used to
consider their technology!).

> Who says that they will have a choice?

Why would anyone stop producing something that people are buying,
especially something with fat margins like a professional CRT?

> Exactly...and the farther the prices collapse, the worse the
> margins get. And SOMEONE has to pay for those nice
> shiny new billion-dollar fabs that are driving these prices
> down in the first place.

Someone must be making money if someone is still building the
fabrication facilities.

> That CRTs are made SOMEWHERE is no guarantee that
> CRTs of the type and quality we've been speaking of here are
> still available in the market.

Maybe. But tomorrow's Chinese CRTs might be the equal of today's
Artisan CRTs. There's no fundamental obstacle preventing this. And the
demand might well exist in China or India.

Of course, if and when LCDs can be made to match or equal the best
possible CRTs at similar price points, it won't matter. Certainly in
that case I won't care what happens to CRTs.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 

Bob Myers writes:

> CGA and EGA were
> "digital", but then why would VGA be "going backwards" simply
> because it was analog?

It wouldn't, at least not in my mind. But for many people today, analog
= backward, and digital = perfection.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 

Bob Myers writes:

> You are assuming the "digital transmission" must always
> equate to simple binary encoding, one bit per symbol
> for a given physical channel. That is not the case.

No, I'm stating that the distinguishing characteristic of a digital
communications channel is that it places an arbitrary dividing line
between noise and useful signal. Anything below the line is noise;
anything above it is signal. The advantage is that zero loss can be
achieved up to a certain bandwidth. The disadvantage is that the full
bandwidth of the physical channel can never be used.

> But no more than the capacity is reduced by the channel anyway;
> if a given level of noise exists in the channel, then the level of an
> analog signal cannot be determined more precisely than the limit
> imposed by the noise.

True, but analog channels are extremely variable in their
characteristics. If the noise drops dramatically in an analog channel,
you can communicate more information. If it drops dramatically in a
digital channel, nothing changes--your upper limit on channel capacity
does not increase.

> To quote my favorite fictional engineer: "Ye canna change the
> laws o' physics, Cap'n!" 🙂 The limits of what you can do with a
> CRT are pretty well known at this point, in large part because it
> IS such a mature technology. We understand pretty well what can
> be done in terms of shaping, accelerating, and directing a beam
> of electrons.

And what can be done still beats flat panels, for now.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 

Bob Myers writes:

> We must be using different meanings for the word "resolution",
> then. I most certainly see, for instance, the phosphor dot and
> shadow mask structure of the typical color CRT as imposing a
> very fixed limit on resolution.

Sure, but a lot of CRTs are of such poor quality that they can never hit
that limit, anyway.

And for years I actually used my CRT at a resolution that was just
slightly higher than that permitted by the grille spacing. It wasn't
high enough to produce obvious artifacts, though, especially under
normal viewing conditions. The pixel size was slightly smaller than a
phosphor stripe triad.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 

Bob Myers writes:

> Conventional analog systems do not generally approach
> the Shannon limit ...

Only because systems cannot be made noise-free. If you declare an
arbitrary noise limit (thus making the system digital), you know when
errors occur or you can at least design systems that are error-free (and
thus do not accumulate errors). If you set the limit at zero (an analog
system), you never really know.

> I defy you, for instance, to point to an example
> in which analog information is transmitted at, say, 24-bit (per
> component) accuracy over a 200 MHz bandwidth.

You can approach 200 MHz as closely as you wish with either digital or
analog. With analog, you can't be sure how many errors you'll get.
With digital, you can predict how many errors you'll get.

> If it is possible for the best "digital" intermediate to better the
> best "analog" intermediate ...

It isn't.

Furthermore, in systems that do not involve a chain of components that
can accumulate errors, there's virtually no advantage to digital.

> As has already been shown, CRTs most definitely have fixed
> upper limits on resolution.

You've never used a monochrome CRT?

> "Capacity" is only meaningful if stated as the amount of information
> which can be carried by a given channel WITHOUT ERROR.
> Any error is noise, and represents a loss of capacity. What I
> THINK you mean to say here is probably something like "quality,"
> but in a very subjective sense.

Digital is simply a way of keeping errors within a predictable range.
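As a back-of-the-envelope illustration of that predictability (the modulation choice and noise figures here are hypothetical), the error rate of a simple digital scheme such as BPSK over a noisy channel has a closed-form prediction:

```python
import math

def bpsk_ber(ebn0_db):
    # Theoretical BPSK bit-error rate over an AWGN channel:
    # BER = 0.5 * erfc(sqrt(Eb/N0))
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

for db in (4, 8, 10):
    print(f"Eb/N0 = {db:2d} dB -> predicted BER = {bpsk_ber(db):.2e}")
```

Given the channel's noise level, the designer knows in advance roughly how many bit errors to expect; no comparable closed-form error bound exists for an arbitrary analog signal.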

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:qoa1s0tbeg6as6lqhk0vjg0tk4rvfoa7it@4ax.com...
> From what I hear, places like China can produce them dirt cheap.

Yes, they can. But, having recently visited a number
of mainland-China display manufacturers, I would have
to also note that they're a long way from producing
a competitive high-end CRT display. Entry-level
stuff, sure.

>

> Nothing compared to building LCDs. So in poor markets with
> underdeveloped industrial infrastructures, it would make sense to
> continue building CRTs. I expect the Third World may continue to do
> this for some time; indeed, they may be the only ones building
> Trinitron-style CRTs soon (and to think how exclusive Sony used to
> consider their technology!).

The latest Trinitron technology IS still exclusive; the only
other manufacturer to ever produce it was NEC (as it
appeared in the NEC-Mitsubishi monitors), and that was
under license from Sony. I don't know if Sony still holds
any patents that would prohibit someone else from making
ANY sort of aperture-grille product, but they certainly still
own the IP that covers that technology in its recent forms.
And it is not that easy a tube to make...

>
> > Who says that they will have a choice?
>
> Why would anyone stop producing something that people are buying,
> especially something with fat margins like a professional CRT?

Because of the high overhead necessary to continue to make
the basic technology (in this case, the CRT) in light of a rapidly
diminishing market. You don't think Sony and NEC have stopped
making these things just because they were tired of playing
in the professional market, do you?

CRT plants ARE fairly expensive things to keep running, and it
simply is not feasible to run them for production runs that are
going to be measured in the tens of thousands per year, tops.

> > Exactly...and the farther the prices collapse, the worse the
> > margins get. And SOMEONE has to pay for those nice
> > shiny new billion-dollar fabs that are driving these prices
> > down in the first place.
>
> Someone must be making money if someone is still building the
> fabrication facilities.

More accurately, someone has convinced their board of
directors that the investment in the new fab is a good idea,
financially, over the long haul. New fabs are money LOSERS
at the start - that sort of investment is a big hole to climb out
of, and it takes some time to get through enough revenue-
generating production to pay it all off. Recently, there have
been a LOT of companies making the bet that they can make
a new, large-size fab pay off, and as a result the industry may
be facing an oversupply situation. That, though, drives prices
down to the point where the less-financially-secure of these
companies may not be able to survive in the market, or at
least be able to continue to operate their nice, shiny new fab
on their own. LCDs are NOT guaranteed moneymakers for
everyone who gets into the market, and certainly are not
moneymakers at all in terms of getting a really quick
return on your investment. You have to be willing to commit
to the long haul, and then be able to actually survive over
that period.

>
> > That CRTs are made SOMEWHERE is no guarantee that
> > CRTs of the type and quality we've been speaking of here are
> > still available in the market.
>
> Maybe. But tomorrow's Chinese CRTs might be the equal of today's
> Artisan CRTs. There's no fundamental obstacle preventing this. And the
> demand might well exist in China or India.

True, there's nothing fundamentally blocking this - but I
doubt that it will actually happen, for a number of reasons.
For one thing, remember that China is very keen on developing
their own high-end LCD capabilities - and the LCDs in general
keep getting more and more competitive in all markets, all the
time.

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:gda1s09vuo7q0a2loougt9a73pu06ghq5k@4ax.com...
> But CDs were successful because of the clear superiority of digital
> storage over analog storage. Flat-panel displays and similar devices
> are not digital but analog, and they are input-output devices, not
> storage devices, and so their advantages, if any, are far less patent,
> and one cannot plausibly predict that they will succeed simply because
> there is something "digital" about them in some respect (the only thing
> they have in common with CDs).

Perhaps, but if you've read carefully here you'll note that
I'm not the one (if there IS anyone) claiming that LCDs
will succeed simply because "there is something digital
about them."

> > Well, no, not really, but then I've already said enough about that
> > elsewhere. There are most certainly "digital" display devices (i.e.,
> > those that fundamentally deal with "numbers" rather than analog
> > control) - the LCD just doesn't happen to be one of them.
>
> Every display device eventually produces an analog output, and the
> quality of this analog output usually determines most of the quality of
> the displayed image.

Well, in this we are still disagreeing as to what "analog" means. I
don't see anything "analog" about the actual image - it's just an
image; it is neither digital nor analog per se.

>
> > What do you think the phosphor triads of a CRT do?
>
> They are tiny, and the raster that excites them is adjustable.

They are just about exactly the same size as the physical pixels
of an LCD. Now, assume that the LCD in question provides
sub-pixel addressing (i.e., the image is not restricted to have its
"pixels" align with the pixel boundaries of the LCD, but rather
on the sub-pixel boundaries) - how is the effect of the "pixelated"
LCD screen on the image any different from the effect of the
phosphor triad structure? (This is not to say that the two images
have the same "look", but in terms of the limits on what can
be resolved, is there any real difference?)
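A minimal sketch of the idea, with made-up sample values (real sub-pixel renderers such as ClearType also filter across sub-pixels to suppress color fringing, which is omitted here):

```python
def subpixel_render(samples):
    # Sub-pixel addressing: drive each R, G, B sub-pixel from its own
    # sample of a luminance signal at 3x the nominal pixel resolution.
    assert len(samples) % 3 == 0
    return [(samples[i], samples[i + 1], samples[i + 2])
            for i in range(0, len(samples), 3)]

def whole_pixel_render(samples):
    # Conventional rendering: average each group of three samples and
    # drive all three sub-pixels of a pixel with the same value.
    assert len(samples) % 3 == 0
    return [(sum(samples[i:i + 3]) / 3,) * 3
            for i in range(0, len(samples), 3)]

# A sharp luminance edge that falls inside a pixel boundary:
signal = [0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
print(subpixel_render(signal))     # edge position kept within pixel 0
print(whole_pixel_render(signal))  # edge smeared to 1/3 gray in pixel 0
```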


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:gfb1s0l3ipii6lna2bd5biu6er2mj5sdc5@4ax.com...
> Bob Myers writes:
>
> > Conventional analog systems do not generally approach
> > the Shannon limit ...
> Only because systems cannot be made noise-free. If you declare an
> arbitrary noise limit (thus making the system digital), you know when
> errors occur or you can at least design systems that are error-free (and
> thus do not accumulate errors). If you set the limit at zero (an analog
> system), you never really know.

An analog system never ever has "the limit at zero"; that's the
whole point. You do not "declare an arbitrary noise limit" in
"making a system digital"; digital systems can and have been
made which adapt themselves (through varying the number of
transmitted bits per symbol) to the level of noise in the channel
at any given moment, so there's no real reason that either mode
suffers from a capacity limitation before the other.
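A rough sketch of that adaptation rule (the 6 dB gap factor is an illustrative assumption, not a figure from any particular system):

```python
import math

def bits_per_symbol(snr_db, gap_db=6.0):
    # Largest whole number of bits per symbol supportable at this SNR,
    # using the standard log2(1 + SNR/gap) rule of thumb, where the
    # "gap" accounts for a practical scheme falling short of Shannon.
    snr = 10 ** (snr_db / 10)
    gap = 10 ** (gap_db / 10)
    return max(0, math.floor(math.log2(1 + snr / gap)))

for snr_db in (5, 15, 25, 35):
    print(f"SNR {snr_db:2d} dB -> {bits_per_symbol(snr_db)} bits/symbol")
```

As the channel quietens, the link packs more bits into each symbol, so a well-designed digital system tracks the available capacity rather than sitting at a fixed ceiling.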

>
> > I defy you, for instance, to point to an example
> > in which analog information is transmitted at, say, 24-bit (per
> > component) accuracy over a 200 MHz bandwidth.
>
> You can approach 200 MHz as closely as you wish with either digital or
> analog.

Sorry, you missed the important part of that: can you maintain 24-bit
accuracy in an analog system which fills a 200 MHz bandwidth,
in any current practical example? (And yes, it IS meaningful to
speak of the capacity of an analog system in bits/second, or
the accuracy of such a system in bits; this is basic information
theory.)
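For a sense of scale, here is the arithmetic behind that challenge, assuming Nyquist-rate sampling of the 200 MHz channel:

```python
import math

bits = 24              # bits per component, as posed above
bandwidth_hz = 200e6   # channel bandwidth, as posed above

# Information rate if you sample at the Nyquist rate (2B) with
# 24-bit accuracy on every sample:
rate = 2 * bandwidth_hz * bits               # bits per second

# Shannon: C = B * log2(1 + SNR)  =>  SNR = 2^(C/B) - 1
snr_required = 2 ** (rate / bandwidth_hz) - 1
snr_db = 10 * math.log10(snr_required)

print(f"required rate: {rate / 1e9:.1f} Gbit/s")
print(f"required channel SNR: {snr_db:.0f} dB")  # roughly 144 dB
```

A channel with ~144 dB of SNR across 200 MHz is far beyond any practical analog link, which is the point of the challenge.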

> > If it is possible for the best "digital" intermediate to better the
> > best "analog" intermediate ...
>
> It isn't.

But so far, this is merely an assertion on your part; you have
not given any real theoretical or practical reason why this
should be so. Again, practical examples exist of "digital"
systems which come very, very close to the Shannon limit
for their assumed channels. So what's this basic, inherent
limit that you seem to be assuming for "digital" transmissions?

> > As has already been shown, CRTs most definitely have fixed
> > upper limits on resolution.
>
> You've never used a monochrome CRT?

Not only used them, I've designed around them. There
is most definitely an upper limit on the resolution of ANY
CRT; it's just more obvious in the case of the standard
tricolor type.


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:2ab1s0pobga5hno7d5qn2u03u9evjnivgn@4ax.com...
> And for years I actually used my CRT at a resolution that was just
> slightly higher than that permitted by the grille spacing. It wasn't
> high enough to produce obvious artifacts, though, especially under
> normal viewing conditions. The pixel size was slightly smaller than a
> phosphor stripe triad.

Sure - which means that you DID NOT fully resolve those
pixels. Did you get a nice, stable image? Sure. Was it
even a very good-looking image? Perhaps. But were you
past the resolution limit of the tube? Definitely. Resolution
has a very well-defined, long-accepted meaning, and there
are a number of tests you can perform to see if a given
display actually "resolves" an image to a given level. You
DO NOT get to go beyond the limit imposed by, for
instance, the phosphor triad size without introducing
artifacts (errors). In general, you're actually getting errors
well before you even approach that size (aliasing, color
and luminance errors, and so forth). Resolution is NOT
just a question of whether or not you still think the image
looks "acceptably sharp."
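As a rough illustration of that limit (the pitch and screen width below are assumed round numbers, not measurements of any particular tube):

```python
# Resolution ceiling implied by the phosphor pitch for a hypothetical
# monitor: 0.24 mm aperture-grille pitch, 365 mm viewable width.
pitch_mm = 0.24
viewable_width_mm = 365.0

triads_across = viewable_width_mm / pitch_mm
print(f"~{triads_across:.0f} triads across the screen")

# Addressing more horizontal pixels than that cannot be fully resolved:
# each displayed pixel then covers less than one triad, and the tube
# starts introducing the aliasing and color errors described above.
for addressed in (1280, 1600, 1920):
    ok = addressed <= triads_across
    print(f"{addressed:>5} px: {'within' if ok else 'beyond'} the triad limit")
```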

Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

"Mxsmanic" <mxsmanic@hotmail.com> wrote in message
news:k3b1s05lb7sdjuqpv6ush9fm9jlqmmpied@4ax.com...
> > To quote my favorite fictional engineer: "Ye canna change the
> > laws o' physics, Cap'n!" 🙂 The limits of what you can do with a
> > CRT are pretty well known at this point, in large part because it
> > IS such a mature technology. We understand pretty well what can
> > be done in terms of shaping, accelerating, and directing a beam
> > of electrons.
>
> And what can be done still beats flat panels, for now.

In some respects; in others, the CRT doesn't come close to
matching the LCD's performance, and hasn't for some time.
What is "best" is very much a matter of your personal preferences,
with respect to your use of these devices and the applications
and images you are dealing with. Which is all I have said, all
along, in the ongoing "CRT vs. LCD" debates.


Bob M.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> They are just about exactly the same size as the physical pixels
> of an LCD. Now, assume that the LCD in question provides
> sub-pixel addressing (i.e., the image is not restricted to have its
> "pixels" align with the pixel boundaries of the LCD, but rather
> on the sub-pixel boundaries) - how is the effect of the "pixelated"
> LCD screen on the image any different from the effect of the
> phosphor triad structure? (This is not to say that the two images
> have the same "look", but in terms of the limits on what can
> be resolved, is there any real difference?)

Pixels (or subpixels) on a flat panel have a constant luminosity
throughout their dimensions. On a CRT, the luminosity will vary
depending on the colors of the adjacent pixels and the bandwidth of the
monitor. This can make aliasing less obvious on a CRT. I'm not saying
that this is good or bad, just that it happens.
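A toy model of that difference, with an assumed Gaussian beam-spot width (the 0.5-pixel sigma is illustrative, not a measured figure):

```python
import math

def gaussian_spot(x, center, sigma=0.5):
    # Idealized CRT beam-spot luminance profile at position x
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def crt_scanline(pixels, samples_per_pixel=10, sigma=0.5):
    # Luminance along a scanline: the sum of one Gaussian spot per
    # pixel, so each pixel's light spills into its neighbours. An
    # idealized LCD pixel, by contrast, is a uniform box.
    n = len(pixels) * samples_per_pixel
    return [sum(v * gaussian_spot(i / samples_per_pixel, p + 0.5, sigma)
                for p, v in enumerate(pixels))
            for i in range(n)]

# Alternating on/off pixels: the CRT profile never drops to zero
# between lit pixels, which visually softens the transitions.
line = crt_scanline([1, 0, 1, 0])
print(f"luminance in the dark gap between lit pixels: {min(line[10:30]):.2f}")
```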

It's interesting that ClearType makes text on an LCD look much better in
most cases, even though it makes the pixels "messier" with its low-level
antialiasing. I haven't tried ClearType on a CRT, so I don't know what
that does (I doubt that it works very well).

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> The latest Trinitron technology IS still exclusive; the only
> other manufacturer to ever produce it was NEC (as it
> appeared in the NEC-Mitsubishi monitors), and that was
> under license from Sony. I don't know if Sony still holds
> any patents that would prohibit someone else from making
> ANY sort of aperture-grille product, but they certainly still
> own the IP that covers that technology in its recent forms.

What recent changes have been made to Trinitrons? The original patents
expired in 1990, I think. I recall that the Trinitron was important
enough to merit a special technical Emmy award for Sony in 1973. Of
course, in those days Mr. Morita was still around ...

In any case, it always seemed to blow all the competition away.

I was considering the Diamondtron for a time, but I understand that,
although it's an aperture grille like a Trinitron, it apparently just
doesn't have the quality of a true Trinitron. I guess that's all
becoming increasingly academic now.

> Because of the high overhead necessary to continue to make
> the basic technology (in this case, the CRT) in light of a rapidly
> diminishing market. You don't think Sony and NEC have stopped
> making these things just because they were tired of playing
> in the professional market, do you?

I think they stopped making them out of misguided business decisions.

> CRT plants ARE fairly expensive things to keep running, and it
> simply is not feasible to run them for production runs that are
> going to be measured in the tens of thousands per year, tops.

The vast majority of monitors being sold today are still CRTs. This is
even more true for television sets.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Mxsmanic wrote:

> Bob Myers writes:
>
>> The latest Trinitron technology IS still exclusive; the only
>> other manufacturer to ever produce it was NEC (as it
>> appeared in the NEC-Mitsubishi monitors), and that was
>> under license from Sony. I don't know if Sony still holds
>> any patents that would prohibit someone else from making
>> ANY sort of aperture-grille product, but they certainly still
>> own the IP that covers that technology in its recent forms.
>
> What recent changes have been made to Trinitrons? The original patents
> expired in 1990, I think. I recall that the Trinitron was important
> enough to merit a special technical Emmy award for Sony in 1973. Of
> course, in those days Mr. Morita was still around ...
>
> In any case, it always seemed to blow all the competition away.
>
> I was considering the Diamondtron for a time, but I understand that,
> although it's an aperture grille like a Trinitron, it apparently just
> doesn't have the quality of a true Trinitron. I guess that's all
> becoming increasingly academic now.
>
>> Because of the high overhead necessary to continue to make
>> the basic technology (in this case, the CRT) in light of a rapidly
>> diminishing market. You don't think Sony and NEC have stopped
>> making these things just because they were tired of playing
>> in the professional market, do you?
>
> I think they stopped making them out of misguided business decisions.

And of course you are a better marketing guy than the people at Sony. So why
ain't you running a 50 billion dollar corporation if you're such an expert?

>> CRT plants ARE fairly expensive things to keep running, and it
>> simply is not feasible to run them for production runs that are
>> going to be measured in the tens of thousands per year, tops.
>
> The vast majority of monitors being sold today are still CRTs. This is
> even more true for television sets.

That's odd; it was my understanding that this year LCDs outsold CRTs for the
first time. Or maybe that was last year.

Television sets are not computer monitors. And there are very few CRT
televisions that can display full HD.
>

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers writes:

> Sure - which means that you DID NOT fully resolve those
> pixels.

Yes. But often important elements on the screen were composed of
multiple pixels, so the lack of clearly visible individual pixels wasn't
that much of a problem. And, unlike a flat panel, you can still see
pixels smaller than triads on a CRT screen--they are just a bit blurry
or partially resolved.

--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.