Archived from groups: comp.sys.ibm.pc.hardware.video
Bob Myers writes:
> Sorry, you missed the important part of that: can you maintain 24-bit
> accuracy in an analog system which fills a 200 MHz bandwidth,
> in any current practical example?
I'm not sure what you mean by "24-bit accuracy." How many bits per
second?
You can always maintain at least the accuracy of the equivalent digital
system.
If you can push 200 Mbps through a digital channel, you can also get at
least 200 Mbps through the same channel with analog encoding (and
typically more). However, the analog equipment may cost more.
> But so far, this is merely an assertion on your part; you have
> not given any real theoretical or practical reason why this
> should be so.
Information theory proves it.
> Again, practical examples exist of "digital"
> systems which come very, very close to the Shannon limit
> for their assumed channels. So what's this basic, inherent
> limit that you seem to be assuming for "digital" transmissions?
The basic limit is the fact that you declare anything below a certain
level to be noise. You thus sacrifice any actual signal below that
level, and in doing so you also sacrifice part of your bandwidth. You
don't make this arbitrary distinction in an analog system, so your
bandwidth is limited only by the _actual_ noise in the channel.
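(The effect being described — a quantizer discarding everything below its decision threshold — can be sketched with a toy uniform quantizer; the bit depth and amplitude here are assumed values for illustration:

```python
import math

def quantize(x: float, bits: int, full_scale: float = 1.0) -> float:
    """Mid-tread uniform quantizer: round to the nearest of 2**bits levels."""
    step = 2 * full_scale / (2 ** bits)
    return round(x / step) * step

# A sine whose amplitude sits below half an LSB of an 8-bit quantizer
# (step/2 = 1/256 ~ 0.0039) is rounded to zero at every sample.
amplitude = 0.001
samples = [amplitude * math.sin(2 * math.pi * k / 32) for k in range(32)]
quantized = [quantize(s, bits=8) for s in samples]
print(all(q == 0.0 for q in quantized))  # prints True: the sub-LSB signal is gone
```

Whether that lost sub-threshold signal could have carried usable information depends on the actual noise floor of the channel, which is the point in dispute.)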
> Not only used them, I've designed around them. There
> is most definitely an upper limit on the resolution of ANY
> CRT; it's just more obvious in the case of the standard
> tricolor type.
What limits resolution in a monochrome CRT? Scanning electron
microscopes prove that electron beams can be pretty finely focused.
--
Transpose hotmail and mxsmanic in my e-mail address to reach me directly.