The reason you're struggling to answer your question is that VGA is an analogue signal, not a digital one, which means you don't get the hard and fast rules about things like bandwidth that you do with a digital signal.
Digital signals are sent as a series of 0s and 1s. They either arrive as readable 0s and 1s, which means the signal is received, OR the signal degrades to the point where the receiving device can't determine whether certain bits are 0s or 1s... and the signal is lost. It either works or it doesn't. That means when standards like HDMI are released, extensive testing can be done in various environments to determine the maximum bandwidth of the cable, how long the cable can be, what types of contacts are required... etc etc. That then gives you a "standard" which, provided everyone in the signal chain (output device, cable, receiving device) follows the standard, basically guarantees that the 0s and 1s will stay intact and the signal will work.
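To make the "it either works or it doesn't" point concrete, here's a toy Python sketch (not real HDMI signalling, just a hypothetical 1 V link with a 0.5 V decision threshold). Below a certain noise level every bit is recovered perfectly; past it, errors suddenly appear and the link effectively breaks:

```python
import random

def send_digital(bits, noise_amplitude):
    """Crude model of a digital link: 0 is sent as 0.0 V, 1 as 1.0 V,
    random noise is added in transit, and the receiver decides each
    bit against a fixed 0.5 V threshold."""
    received = []
    for bit in bits:
        voltage = float(bit) + random.uniform(-noise_amplitude, noise_amplitude)
        received.append(1 if voltage > 0.5 else 0)
    return received

random.seed(1)
original = [random.randint(0, 1) for _ in range(10_000)]

for noise in (0.2, 0.4, 0.6, 0.8):
    errors = sum(a != b for a, b in zip(original, send_digital(original, noise)))
    print(f"noise ±{noise:.1f} V -> {errors} bit errors out of {len(original)}")
```

In this toy model, noise up to ±0.4 V produces zero bit errors, while ±0.6 V and above starts corrupting bits in bulk: that's the digital "cliff" described above.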
VGA is analogue: it sends the video signal as a continuous wave rather than as a set of 0s and 1s. The receiving device simply displays the (wave) signal it receives... in whatever state it arrives in. So you can run extremely high resolutions over VGA, or extremely long cable runs, but the signal will always degrade to some extent and this will affect the image on the display.
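And a matching toy sketch for the analogue case (same hypothetical 1 V scale as before, nothing VGA-specific): the received values are never exactly what was sent, they just drift further from the original as the noise grows, and there is no threshold step to clean them back up.

```python
import random

def send_analogue(samples, noise_amplitude):
    """Crude model of an analogue link: the receiver just displays
    whatever voltage arrives, noise and all; nothing restores the
    original values."""
    return [v + random.uniform(-noise_amplitude, noise_amplitude) for v in samples]

random.seed(1)
# Pretend these are pixel brightness levels along one scanline
# (0.0 = black, 1.0 = white).
scanline = [0.0, 0.0, 1.0, 1.0, 0.5, 0.5, 0.0, 1.0]

for noise in (0.05, 0.2, 0.4):
    received = send_analogue(scanline, noise)
    print(f"noise ±{noise:.2f} V:", [round(v, 2) for v in received])
```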
As an example, I used to work in a school where, from time to time, we would hold a full-school meeting in an indoor/outdoor area. We found we could run VGA cables about 40 metres, daisy-chaining 3 projectors together. The first projector, however, looked much, much better than the last (which had the longest cable run). The signal degraded very badly over that distance. It was fine for large white text on a black background (the image was poor quality but the text was readable), but any photos or video content were almost unrecognizable. Had we done the VGA run right next to "noisy" cables drawing a lot of power (a lighting rig, for example), I suspect our signal would have been even worse. This is where you get into the details of signal degradation, which are extremely complex and which I don't claim to know much about.
The second point is that the more complex the signal being sent (i.e. the higher the resolution or the higher the frequency), the more sensitive it will be to signal degradation. Had we tried to run our school projectors at 1080p instead of the 1024x768 signal we sent, I suspect we would have got even worse results. Even a 30cm VGA cable for a 1024x768 monitor @ 30Hz will have some signal degradation, meaning the image on the receiving display will be somewhat poorer in quality than the one being sent. The question becomes: at what point is the signal degradation high enough to become a problem? At the school we were prepared to accept the massive signal loss from the 40m daisy-chained cable run. But if you were doing colour-critical Photoshop work, that would be worlds away from an acceptable image.
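To put rough numbers on "more complex signal": the rate at which pixel values have to be pushed down the wire (the pixel clock) grows with resolution and refresh rate. The sketch below assumes a flat ~25% blanking overhead and 60 Hz refresh purely for illustration; real VESA timings use specific tables, but the trend is the point: the faster each pixel flies past, the less noise it takes to smear it.

```python
# Rough back-of-envelope: how many pixel values per second must travel
# down the wire for a few common modes. The flat 25% blanking overhead
# and 60 Hz refresh are assumptions for illustration only.
BLANKING_OVERHEAD = 1.25

def approx_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock in MHz for a given video mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for width, height, refresh in [(1024, 768, 60), (1920, 1080, 60), (2560, 1440, 60)]:
    mhz = approx_pixel_clock_mhz(width, height, refresh)
    print(f"{width}x{height} @ {refresh} Hz: roughly {mhz:.0f} million pixel values per second")
```

Going from 1024x768 to 1080p roughly two-and-a-half-times the rate, and 1440p is nearly five times it: same cable, same noise, but far less time per pixel for the voltage to settle, which is why the higher modes degrade sooner.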
This is why you can't have hard and fast rules for analogue signals. The long answer is... it depends on lots and lots of different things! When someone says something like "I used a VGA cable for my 1440p monitor and it worked just fine", what you need to read is that the signal degradation that resulted from that person's equipment in that person's environment did not affect the image to the point where that particular person deemed it to be a "problem". Your mileage absolutely will vary!
The TL;DR answer: there is no set "bandwidth" for VGA... it's analogue. It will send whatever signal you give it, and that signal will *always* degrade based on things like the length of the cable, the quality and shielding of the cable, and the amount of signal noise the cable encounters. You need to decide whether the level of degradation is a "problem" or not.