Does VGA support 1080p?

Status
Not open for further replies.
Solution
VGA can indeed support 1080p. Because the signal is analogue, its quality begins to drop off above 1920x1080 (1080p), which causes a loss of image quality, but with a good enough cable and a good transceiver on either end it can be used for resolutions up to and including 2048x1536.

vampyiere6
Ok, thanks. So that monitor would be good even though it just has a VGA connector, with some colour calibration I guess.

If VGA can support 1080p, why are people so against it then, and why do they want HDMI or DVI instead (like me, I want DVI)? I thought VGA couldn't do 1080p, since people hate it so much.

 


VGA is an older signalling standard. The analogue components are the red, green, and blue colour values that describe each pixel. Each of these values (one of each is needed for each pixel) is represented as a voltage level that sits between 0 volts, or no intensity, and 0.7 volts, or full intensity. So a pure black pixel will have an ideal signal level of 0v, 0v, 0v on the three lines, a pure white pixel 0.7v, 0.7v, 0.7v, a pure red pixel 0.7v, 0v, 0v, a pure blue pixel 0v, 0v, 0.7v, and so on.

The pixel data is produced in a digital format at the source and converted into the appropriate analogue representation for transmission to the display. Most consumer products use a standard 8 bits per channel and 4 channels per pixel (Alpha, Red, Green, Blue), of which the first is used for transparency blending and the latter three for the actual colour representation. This means that 24 bits, arranged as three 8-bit channels (red, green, and blue), are converted into three analogue values for each pixel. The digital symbols, which range from 0 to 255, are mapped onto the analogue range of 0.0v to 0.7v: 0 maps to 0v, 255 maps to 0.7v, and all intermediate values are spread linearly in between. Take a look at digital-to-analogue conversion theory if you wish to know more.
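To make that mapping concrete, here is a minimal Python sketch of the linear conversion described above. The 0-255 codes and the 0.0v-0.7v range come straight from this post; the function name is just for illustration:

```python
def channel_to_voltage(code: int) -> float:
    """Map an 8-bit colour channel value (0-255) to its ideal VGA line voltage."""
    if not 0 <= code <= 255:
        raise ValueError("channel code must be in 0..255")
    return (code / 255) * 0.7

# Ideal voltages for the example pixels above (R, G, B order):
for name, rgb in [("black", (0, 0, 0)),
                  ("white", (255, 255, 255)),
                  ("red",   (255, 0, 0)),
                  ("blue",  (0, 0, 255))]:
    print(name, tuple(round(channel_to_voltage(c), 3) for c in rgb))
# black (0.0, 0.0, 0.0)
# white (0.7, 0.7, 0.7)
# red   (0.7, 0.0, 0.0)
# blue  (0.0, 0.0, 0.7)
```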

Transmission cables are lossy low-pass filters: they attenuate the signal between the source and the destination, and the attenuation increases as the frequency components increase or as the cable gets longer. For the system above, a pixel sent from the VGA transmitter with intensity values of 0.565v, 0.665v, 0.332v may arrive at the display receiver with intensity values of 0.560v, 0.662v, 0.330v. This isn't a huge issue for a system with a good quality cable and good transceivers, but at higher resolutions the low-pass and capacitive nature of the cable shows more obviously: a high-frequency edge between symbols on a channel, such as a jump from a low intensity of 0.032v to a high intensity of 0.66v, may be attenuated, pulling the actual signal value up or down from where it should be. The effect is that when using VGA, high contrast images appear blurry at high resolutions, and very bright/dark colours appear washed out.
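As a rough illustration of that edge attenuation, here is a toy sketch that models the cable as a first-order low-pass filter and checks how far the 0.032v-to-0.66v edge from above gets within a single pixel period at two pixel clocks. The time constant is an assumed, made-up figure rather than a measurement of any real cable; the pixel clocks are the approximate standard values for each mode:

```python
import math

# Toy model: a first-order low-pass filter with time constant TAU.
# A step from V_LOW towards V_HIGH settles as
# v(t) = V_HIGH - (V_HIGH - V_LOW) * exp(-t / TAU).
TAU = 2e-9                    # assumed cable time constant of 2 ns (illustrative only)
V_LOW, V_HIGH = 0.032, 0.66   # the example edge from the post

def level_after_one_pixel(pixel_clock_hz: float) -> float:
    """Voltage the rising edge reaches within one pixel period."""
    t = 1.0 / pixel_clock_hz
    return V_HIGH - (V_HIGH - V_LOW) * math.exp(-t / TAU)

# 1080p60 uses a ~148.5 MHz pixel clock; 2048x1536@60 needs roughly 267 MHz.
for label, clock in [("1080p", 148.5e6), ("2048x1536", 267e6)]:
    print(f"{label}: edge reaches {level_after_one_pixel(clock):.3f}v of {V_HIGH}v")
# 1080p: edge reaches 0.638v of 0.66v
# 2048x1536: edge reaches 0.563v of 0.66v
```

The higher the pixel clock, the less time each symbol has to settle, so the edge falls further short of its target value, which is exactly the blurring/washing-out effect described above.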

Digital signal standards such as HDMI and DVI-D send the pixel data not as a triplet of three analogue intensities, but as a triplet of 8-bit bytes. Each bit of each byte is sent using a differential digital signal rather than a single-ended analogue signal. This greatly reduces the effect of signal integrity loss over the cable, because the transceivers only send and receive 1s and 0s, nothing in between. As a consequence, any errors in transmission are much more pronounced, but they are extremely rare given a good cable.
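Here is a hedged toy sketch of why that matters: the digital receiver only has to decide whether each differential sample is above or below zero, so a moderate loss leaves every recovered bit intact, whereas the same loss applied to an analogue intensity shifts the colour itself. The voltage levels are made up for illustration, and this is not the actual TMDS encoding that HDMI/DVI use:

```python
def send_differential(bits, attenuation):
    """Transmit bits as +/-0.25v differential levels through a lossy cable."""
    return [(0.25 if b else -0.25) * attenuation for b in bits]

def receive_differential(levels):
    """Recover bits by thresholding: above 0v is a 1, below is a 0."""
    return [1 if v > 0.0 else 0 for v in levels]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
recovered = receive_differential(send_differential(bits, attenuation=0.9))
print(bits == recovered)   # True: 10% signal loss, zero bit errors

# The same 10% loss on an analogue VGA intensity shifts the colour itself:
print(0.565 * 0.9)         # 0.5085v arrives instead of 0.565v
```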
 

TerribleRedMonster
Pinhedd, thanks for taking the time and going to the effort to so clearly and thoroughly reply regarding the differences between VGA and HDMI or DVI-D technologies. Google brought me to this thread after I searched for clarity regarding the HD via VGA question, and I hadn't dared hope for such a high quality explanation.

Thanks again!
 

Archaicsword


HDMI cables can do audio and video, but VGA cables can only do video.
 