Analog signal to Digital signal conversions

metallica435435

Dec 30, 2011
I know that for a computer connection you can only convert the signal one way without the use of a box: if the signal leaving the computer is digital, it can be converted to analog without a problem. But I am writing because I am wondering why...

What I learned in school was that an analog signal was a wave, and a digital signal was basically just a stream of yes's and no's, denoted by either a "1 block" (meaning yes) or a "0 block" (meaning no).

They said that it would be really easy for an analog signal to be converted into a digital signal, because you can just "round the wave signal over a specified time interval to create either a 1 block or a 0 block," which would make it digital. BUT, with that in mind, they said that to create an analog signal from a digital signal you would have to synthetically create analog waves from the digital 1 and 0 blocks that the digital signal provides. They said that would be extremely difficult and would require a box to do so.

All of this info checks out if you look at local broadcast TV, for example. When you try to get the now-digital broadcast TV on your old analog TV, you have to get a converter box to take the digitally aired signal and create a synthetic analog signal for your old piece of equipment.

BUT with computers it's the exact opposite for some reason!!! With a computer you can take the digitally created signal from your laptop video card and turn it into an analog monitor signal with a 4-dollar cable and no converter box! I just don't get it, but any help would be greatly appreciated :)

Sorry for the lengthy and complicated post, but it's been driving me nuts for years and I can't ask my professor anymore :/

Thanks again!
 
You got many things mixed up...

To start with your last observation: you are able to plug an analog monitor (VGA) into a "digital" (DVI-I) port using a cheap cable just because the interface on your laptop carries both digital and analog signals.

You need "a box" for ADC or DAC (Google/Bing these, you'll learn a lot) because it is not just "squaring the analog" or "smoothing the digital" signal. In a couple of words:
- the analog signal is sampled many times, and each resulting value is converted into a number which is encoded into the digital stream. This sampling is done at a frequency at least twice the highest frequency present in the analog signal (check the Nyquist–Shannon sampling theorem)
- the reverse conversion is done by first extracting the number representing the analog signal from the digital stream, then converting that number into a voltage which goes out as the analog signal.
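The two bullet points above can be sketched in a few lines of Python. All the numbers here are made up just for illustration: a 1 kHz sine wave sampled at 8 kHz (comfortably above the 2 kHz Nyquist rate) and quantized to 8 bits, roughly the way a simple ADC would do it, then mapped back to voltages the way a DAC would.

```python
import math

SIGNAL_HZ = 1000   # frequency of the "analog" wave (made up for the example)
SAMPLE_HZ = 8000   # sampling rate, well above the 2 kHz Nyquist rate
BITS = 8
LEVELS = 2 ** BITS  # 256 quantization levels for an 8-bit ADC

def adc(t):
    """Sample the analog wave at time t and encode it as an 8-bit integer."""
    voltage = math.sin(2 * math.pi * SIGNAL_HZ * t)   # analog value in [-1, 1]
    return round((voltage + 1) / 2 * (LEVELS - 1))    # map to a code 0..255

def dac(code):
    """Reverse step: turn the 8-bit code back into a voltage in [-1, 1]."""
    return code / (LEVELS - 1) * 2 - 1

# One period of samples: the "digital stream" is just this list of integers.
codes = [adc(n / SAMPLE_HZ) for n in range(SAMPLE_HZ // SIGNAL_HZ)]
print(codes)   # [128, 218, 255, 218, 128, 37, 0, 37]

# The DAC side recovers approximate voltages from the codes.
rebuilt = [dac(c) for c in codes]
```

The round trip is close but not exact: the peak sample (code 255) comes back as exactly 1.0, but the zero crossings come back as about 0.004 instead of 0, because 8 bits can only represent 256 distinct levels.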
 
Hello! Thank you for your very fast and insightful response! I googled ADC and DAC like you noted! Very insightful!

A couple of quick questions, though:
When you say my laptop carries both digital and analog signals, do you mean that it has both a VGA port and a digital port (like HDMI or DVI)? Or did you mean something else? Because mine only has an HDMI port.

As far as the ADCs and DACs go (or "boxes" lol), is there a reason why an analog TV requires a DAC box but an analog computer monitor doesn't (assuming they're both receiving a digital signal)? Because (and I'm sure I'm wrong) it feels like they should both require the same thing, unless the computer video card or some other piece of the computer already has a DAC built in or something..
 
If you want to use an HDMI to VGA cable, your laptop must support it. There are some laptops that have an integrated DAC on that port, but they're veeeeeery rare; most people who buy those cables end up with a bad purchase.

Analog to digital works like this:
An analog signal is sampled and its amplitude is stored as a binary representation. Each sample takes only a tiny amount of time, but in the gap before the next sample is taken the amplitude of the analog signal can change, so instead of getting a nice, curvy signal you get a squareish-looking signal.
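A tiny sketch of that "squareish" effect, with a deliberately coarse, made-up sampling rate: between two samples the real wave keeps moving, but the digital side only has the held value, so the reconstructed curve is a staircase.

```python
import math

# Deliberately coarse: only 8 samples per period of a sine wave.
SAMPLE_HZ = 8
samples = [math.sin(2 * math.pi * n / SAMPLE_HZ) for n in range(SAMPLE_HZ)]

# Halfway between sample 0 and sample 1 the real wave has already risen to
# sin(pi/8) ~ 0.383, while the held digital value is still 0.0 -- that gap
# is one "step" of the staircase.
midpoint = math.sin(2 * math.pi * 0.5 / SAMPLE_HZ)
print(round(samples[0], 3), round(midpoint, 3))   # 0.0 0.383
```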

Digital to analog works the opposite way: you take your sampled binary representation and give it a voltage amplitude. Because of the nature of the DAC, you also end up with a squareish-looking signal; it can be filtered afterwards, but it will never be exactly the same as the original.
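A minimal sketch of that DAC step, using a simple zero-order hold (the codes and bit depth below are invented for illustration): each binary code becomes a voltage that is held flat until the next sample arrives, which is exactly why the raw output looks square-ish before filtering.

```python
LEVELS = 8                         # a hypothetical 3-bit DAC: codes 0..7
codes = [4, 6, 7, 6, 4, 1, 0, 1]   # one rough period of a sine, as integers

def hold(codes, repeat=4):
    """Expand each code into `repeat` identical voltage steps (the staircase)."""
    out = []
    for c in codes:
        voltage = c / (LEVELS - 1) * 2 - 1   # map 0..7 back to -1..+1
        out.extend([voltage] * repeat)
    return out

staircase = hold(codes)
# Each group of 4 output values is perfectly flat -- a step, not a smooth
# curve: four values near 0.143, then four near 0.714, and so on.
print(staircase[:8])
```

An analog low-pass filter after the DAC smooths those steps into a curve, but as the post above says, some detail is lost for good.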
 
Yeah, I sort of know how they work... I was just taught that it is easier to convert analog to digital rather than the other way around. Since TVs required a complicated box to convert digital to analog, I thought maybe my professors were right, but if that were true then it would be really easy to convert an analog signal to a digital signal on a computer. And from the cables I'm looking at, it's actually impossible.

Also, if what I was taught were true, then converting the digital computer signal to an analog monitor signal would require a fancy box, but it doesn't. I'm mostly just asking whether my teacher was wrong and, if so, what the right explanation is, and also why it works one way with a TV and a totally different way with a computer.
 
Actually it's easier to convert from digital to analog, and connecting a computer to a VGA monitor doesn't require an external converter because the computer already has one built in. VGA is actually an analog signal, and a VGA signal is different from an RCA signal; maybe that's where your confusion lies.
 
Well, my computer doesn't have a VGA port, it only has HDMI; that's why I was surprised it didn't need a box. I can also only assume it's easier to convert a digital signal to analog because it doesn't require a box, but I just don't understand why. Or why it does require a box to convert the digital signal into an analog one for TVs.
 
Converting a digital signal to an analog one does require a converter; it's easier than analog to digital, but it still requires a converter.