What bandwidth are QXGA cables?

Noob333

Nov 27, 2014
I would like to know the maximum number of hertz a QXGA cable (or VGA, as many people still like to call them) can support. Or, another way of wording it: what refresh rate can it handle? Everyone is always saying how horrible these cables are, but I fail to see how they are so bad. Seems like unless you have a $500 monitor you won't need anything better than QXGA.
 
The reason you're struggling to get an answer to your question is that VGA is an analogue signal, not a digital one, which means you don't have the hard and fast rules about things like bandwidth that you do with a digital signal.

Digital signals are sent as a series of 0s and 1s. They either arrive as readable 0s and 1s, which means the signal is received, OR the signal degrades to the point where the receiving device can't determine whether certain bits are 0s or 1s... and the signal is lost. It either works or it doesn't. That means when standards like HDMI are released, they can do extensive testing in various environments to determine the maximum bandwidth of the cable, how long the cable can be, what types of contacts are required... etc. That then gives you a "standard" which, provided everything in the signal chain (output device, cable, receiving device) follows the standard, basically guarantees that the 0s and 1s will stay intact and the signal will work.
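
To make the digital side concrete, here's a rough back-of-the-envelope sketch in Python. The blanking overhead and the cable rating are round numbers I'm assuming for illustration, not figures from any spec, but it shows the basic idea: add up the bits per second a mode needs and compare it against the link's hard rated maximum.

    # Rough sketch: why digital links can have hard "it works or it doesn't" limits.
    # Blanking overhead and the cable rating are assumptions for illustration, not spec values.

    def digital_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_overhead=1.25):
        """Approximate raw data rate a digital link must carry for a given video mode."""
        pixel_clock_hz = width * height * refresh_hz * blanking_overhead  # blanking adds ~20-25%
        return pixel_clock_hz * bits_per_pixel / 1e9

    CABLE_RATED_GBPS = 10.0  # hypothetical rated maximum for an early-HDMI-class link

    for width, height, refresh in [(1920, 1080, 60), (2560, 1440, 60), (3840, 2160, 60)]:
        rate = digital_data_rate_gbps(width, height, refresh)
        verdict = "fits" if rate <= CABLE_RATED_GBPS else "does NOT fit"
        print(f"{width}x{height} @ {refresh}Hz needs ~{rate:.1f} Gbps -> {verdict} a {CABLE_RATED_GBPS:.0f} Gbps link")

Either the mode fits under the rated maximum and the picture is perfect, or it doesn't and you get no usable picture at all. There's no in-between with digital.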

VGA is analogue: it sends the video signal as a wave rather than as a set of 0s and 1s. The receiving device simply displays the (wave) signal it receives... in whatever state it arrives in. So you can run extremely high resolutions over VGA, or extremely long cable runs, but the signal will always degrade to some extent, and this will affect the image on the display.

As an example, I used to work in a school where, from time to time, we would hold a full school meeting in an indoor/outdoor area. We found we could run VGA cables about 40 metres, daisy-chaining 3 projectors together. The first projector, however, looked much, much better than the last (which had the longest cable run). The signal degraded very badly over that distance. It was fine for large white text on a black background (the image was poor quality but the text was readable), but any photo or video content was almost unrecognizable. Had we done the VGA run right next to "noisy" cables drawing a lot of power (like a lighting rig, for example), I suspect our signal would have been even worse. This is where you get into signal degradation, which is extremely complex and I don't claim to know much about.

The second point is that the more complex the signal being sent (i.e. the higher the resolution or refresh rate), the more sensitive it will be to signal degradation. Had we tried to run our school projectors at 1080p instead of the 1024x768 signal we sent, I suspect we would have got even worse results. Even a 30cm VGA cable feeding a 1024x768 monitor @ 30hz will have some signal degradation, meaning the image the display receives will be somewhat poorer in quality than the one being sent. The question becomes: at what point is the signal degradation high enough to be a problem? At the school we were prepared to accept the massive signal loss from the 40m daisy-chained cable run. But if you were doing colour-critical Photoshop work, that would be worlds away from an acceptable image.
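
For a rough feel of how quickly the analogue bandwidth requirement climbs, here's a small Python sketch. The blanking overhead is a rule-of-thumb assumption rather than exact VESA timing, but the ratios are what matter: the higher the frequency the cable has to carry, the more any loss or noise shows up as blur and ghosting.

    # Rough comparison of the analogue pixel clock different modes push through a VGA cable.
    # The 1.3 blanking overhead is a rule-of-thumb assumption, not exact VESA timing.

    def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.3):
        """Approximate pixel clock (MHz) the analogue RGB lines must carry."""
        return width * height * refresh_hz * blanking_overhead / 1e6

    modes = [(1024, 768, 60), (1920, 1080, 60), (2560, 1440, 60), (1920, 1080, 144)]
    base = approx_pixel_clock_mhz(*modes[0])

    for width, height, refresh in modes:
        clock = approx_pixel_clock_mhz(width, height, refresh)
        print(f"{width}x{height} @ {refresh}Hz -> ~{clock:.0f} MHz ({clock / base:.1f}x the 1024x768 signal)")

Going from 1024x768 @ 60hz to 1080p @ 60hz roughly doubles the frequency on the wire, and 1440p or high refresh rates push it several times higher again, which is why the same cable that looks fine at a low mode can look noticeably soft at a high one.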

This is why you can't have hard and fast rules for analogue signals. The long answer is... it depends on lots and lots of different things! When someone says something like "I used a VGA cable for my 1440p monitor and it worked just fine", what you need to read is that the signal degradation that resulted from that person's equipment in that person's environment did not affect the image to the point where that particular person deemed it to be a "problem". Your-mileage-absolutely-will-vary!

The TL;DR answer: there is no set "bandwidth" for VGA... it's analogue. It will send whatever signal you give it, and that signal will *always* degrade based on things like the length of the cable, the quality and shielding of the cable, and the amount of signal noise the cable encounters. You need to decide whether the level of degradation is a "problem" or not.
 
I don't have a projector, and my monitor isn't far away from my PC, so signal degradation is not something I am concerned with. I had already heard a little bit about it, but it was nice to have it explained.
 


Glad to help. Even on short runs, high resolutions or refresh rates are still subject to noticeable (in my opinion) image degradation. It's been a long time since I've done it, but I used to run two old (but, at the time, very high quality) 21" CRT displays. While they could run very high resolutions and refresh rates, the image went fuzzy and noisy, which bothered me, even though I was using a high quality 2m cable. I eventually decided I preferred to run them at 1600x1200 @ 60hz (if I remember rightly) rather than the higher modes they were capable of, simply because the noise at those higher settings was unacceptable, for me at least. Of course, others will have had a different response and may have chosen different settings.
 
Seems like there would be a set degradation for each cable length, but I guess they haven't come out with those kinds of numbers. So, what I am hearing is they can't handle anything higher than 60hz?
 


No, they haven't, because degradation depends on other things besides just length. It's affected by cable quality, signal noise (are you running it near a power board or another monitor, etc.), and probably lots of other things that I'm not aware of too.

All I can say is that I've seen 1080p @ 60hz screens on VGA and I didn't like them personally. They were probably okay, but the bit of blur bothered me. Most recently my father was running two 1080p screens and one was on VGA. I walked straight up and said, "what's wrong with that monitor?", and after some fishing around, gave him an HDMI cable. He, on the other hand, had been working on it for 6 months and hadn't even noticed.

There's no way I'd want to run higher than 60hz over VGA, because I'm not even happy with 60hz over VGA. My father, on the other hand, would probably neither notice nor mind. You can always try it and see. But for sure, go digital if you can.

What's the problem you're actually trying to solve here?
 