There are two ways to have what you described: three monitors with the same native resolution being run at non-native resolutions, or three monitors each being run at its native resolution. The GPU doesn't care whether the selected resolution is native or not. It builds an image and sends it to the display. If the image is at the native resolution, the display maps it easily. If it is not, the display logic in the monitor has to map what it gets onto the pixels it actually has.
At native resolution there is one pixel per addressable unit of display resolution. When switching to a lower, supported, but non-native resolution, the logic in the monitor has to interpolate a solution. The monitor does this dirty work, not the GPU. It's called upscaling, and it has no relation to display scaling, which defines how large icons, text, and window frames are drawn.
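To make the distinction concrete, here's a rough sketch in Python of the two ideas. It's my own illustration with made-up function names, not anything the GPU, the monitor, or the OS actually runs:

```python
# Upscaling: the monitor stretches a lower-resolution frame onto its fixed
# pixel grid. Display scaling: the OS still renders at native resolution but
# draws UI elements larger, shrinking the logical workspace instead.

def upscale_ratio(native, rendered):
    """How many physical pixels the monitor must fake per rendered pixel."""
    return native[0] / rendered[0], native[1] / rendered[1]

def logical_workspace(native, scale_percent):
    """Effective desktop area when the OS scales the UI at native resolution."""
    factor = scale_percent / 100
    return round(native[0] / factor), round(native[1] / factor)

native_4k = (3840, 2160)

print(upscale_ratio(native_4k, (1920, 1080)))  # (2.0, 2.0) -> clean 2x2 blocks
print(upscale_ratio(native_4k, (2560, 1440)))  # (1.5, 1.5) -> fractional mess
print(logical_workspace(native_4k, 150))       # (2560, 1440) logical workspace, still 3840x2160 physical pixels
```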
When the new resolution is an exact, or clean, divisor of the native resolution, the display logic has an easy job. For example, if the monitor is a 4K unit, the native resolution is 3840x2160 (UHD). If you want to display 1920x1080 (FHD), which is exactly the native divided by 2 in each direction, the logic simply lights a 2x2 block of physical pixels for each pixel of the desired resolution. That effectively means the visible pixel covers 4x the area of a native pixel, which in most cases causes visible pixelation that only gets worse the bigger the screen.

When the new display resolution is not a clean divisor of the native resolution, the logic has to make a compromise. Take 2K, aka QHD, aka 1440p: the addressable display resolution is 2560x1440 (note that some companies use 2K when referring to UWFHD, 2560x1080, or WQXGA, 2560x1600, both of which have similar conversion issues). 2560 goes into 3840 1.5 times, and 1440 goes into 2160 1.5 times. As you might guess, the display cannot light up half a pixel, so the logic has to approximate the desired image onto the available pixels. It has to do this every refresh, which is why the image looks soft or slightly off.
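Here's a toy nearest-neighbor version of that mapping in Python, just to show why 1.5x is awkward. Real scaler chips use fancier filtering, so treat this as an illustration only:

```python
# Map each physical column of a 3840-wide panel back to a column of the
# rendered image, picking the nearest source pixel.

def source_column(physical_x, native_width, rendered_width):
    """Which rendered-image column a given physical column ends up showing."""
    return int(physical_x * rendered_width / native_width)

NATIVE_W = 3840

# Clean 2:1 case (1920 rendered): every source column gets exactly 2 physical columns.
print([source_column(x, NATIVE_W, 1920) for x in range(8)])  # [0, 0, 1, 1, 2, 2, 3, 3]

# Fractional 1.5:1 case (2560 rendered): some source columns get 2 physical columns,
# others only 1, so identical widths in the image come out uneven on screen.
print([source_column(x, NATIVE_W, 2560) for x in range(8)])  # [0, 0, 1, 2, 2, 3, 4, 4]
```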
But if you are using different monitors with different native resolutions, all displaying at their native resolutions, then the display has no guesswork to do. Unless your GPU is from the digital stone age, different resolutions on each monitor are not a problem. You can even set up monitors with the same native resolution but different scales, such as a 32in 4K scaled at 100% alongside a 28in 4K scaled at 125%. When you change the scale, ClearType or similar font-smoothing software smooths out the pixels of the magnified text to keep it sharp. The GPU doesn't care. It doesn't care if the refresh rates are different on each display either. Mind you, since each display is different, windows spanning displays can look very strange.
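If you're wondering why the same 4K resolution might want 100% on a 32in panel but 125% on a 28in one, pixel density is the usual reason. A quick back-of-the-envelope calculation (my own numbers; the scale choices themselves are still a matter of taste):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 32)))  # ~138 PPI -> text is readable at 100%
print(round(ppi(3840, 2160, 28)))  # ~157 PPI -> text gets small, hence 125%
```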
I've run multiple monitors on my systems at home and work for many years, most of the time with different sizes and resolutions. In fact, right now my system has three monitors with wildly different sizes and resolutions: a Huion 16in FHD graphics drawing tablet, a 28in Samsung 4K monitor, and a 55in Samsung 8K TV, which is my primary display. The 16in is scaled at 100%, the 28in at 150%, and the 55in at 200%.