SteelCity1981 :
Why no HDMI 2.0 connection?
Because it's useless. What advantage would it provide if the older/cheaper/proven spec ports can already push the panel to its limits?
I wouldn't say it's useless...
In practice, HDMI 1.4 inputs on monitors top out at 1080p (most implementations I've seen don't even support 1200p). This monitor's primary input was meant to be DisplayPort. It doesn't even have a DL-DVI connector, which is (was?) standard on any monitor with a resolution greater than 1080p.
They included HDMI 1.4 for AV connectivity. There's probably a scaler built into the thing upscaling 1080p -> 1440p for consoles, DVD players, etc. The HDMI port was not meant to be the primary input for this monitor.
HDMI 2.0 would allow the HDMI input to run at native resolution. However, this is a gray area. With the old HD standards, you would think that HD+ 1600x900 (something between HD 1280x720 and FHD 1920x1080) would be supported, but most devices have no idea what to do with that resolution. Some of the better-designed devices will upscale/downscale to 720p/1080p to give you an image, but many devices just blank out. Devices were designed to output only 720p or 1080p, and so devices were also designed to accept only inputs at those specific resolutions.
I suspect something similar will happen with this new generation of ultra-high-definition devices. They will be designed assuming FHD (1080p), UHD (2160p), or 4320p, and will have no idea what to do with QHD (1440p). Thus, adding an HDMI 2.0 port could cause confusion when AV devices potentially display nothing at all when plugged in. HDMI 2.0 would only benefit computers, which can output arbitrary resolutions, but any computer with HDMI 2.0 would also have a DisplayPort output anyway...
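For anyone curious about the actual link budgets, here's a rough Python sanity check. The timing totals are the usual CEA/CVT-RB figures hard-coded as approximations, so treat the exact MHz values as ballpark numbers. It suggests 1440p60 technically fits inside HDMI 1.4's 340 MHz TMDS ceiling (so the usual blocker is the scaler/EDID on either end, not the cable spec), while UHD at 60 Hz is where HDMI 2.0 becomes a hard requirement.

# Rough link-budget check: pixel clock needed for common 60 Hz modes vs. the
# maximum TMDS clock each digital interface can carry.  Timing totals below
# (active + blanking) are the usual CEA / CVT-RB figures, so the MHz results
# are approximate.

MODES = {
    "1280x720  @60 (CEA)":    (1650, 750),
    "1920x1080 @60 (CEA)":    (2200, 1125),
    "2560x1440 @60 (CVT-RB)": (2720, 1481),
    "3840x2160 @60 (CEA)":    (4400, 2250),
}

LIMITS_MHZ = {
    "DL-DVI":   330.0,   # two 165 MHz links
    "HDMI 1.4": 340.0,   # max TMDS clock
    "HDMI 2.0": 600.0,   # max TMDS clock
}

for name, (h_total, v_total) in MODES.items():
    clock = h_total * v_total * 60 / 1e6   # pixel clock in MHz
    fits = [spec for spec, limit in LIMITS_MHZ.items() if clock <= limit]
    print(f"{name}: {clock:5.1f} MHz -> {', '.join(fits) or 'needs something faster'}")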
Now, with that out of the way, WHY VGA?!? (max resolution ~1920x1200) Why can't that interface just die? DL-DVI would have been much more useful...