I'm using either a VGA with an HDMI adapter or DVI to HDMI.
Are you using a VGA or DVI output from an old GPU and feeding it into an adapter that outputs HDMI? If so, you may be disappointed if your adapter cannot handle all modes. Cheap adapters have their weaknesses and don't always work as you expect (or as the manufacturer claims).
When running old computers, I connect the VGA output from the GPU directly to the VGA input on an old monitor. Similarly, I connect the DVI output from the GPU directly to the DVI input on a monitor. I do not use any fancy signal converters.
On an old PC, the initial POST screen tends to run at 640x480 or 800x600; the GPU then usually switches to a higher resolution for the Windows desktop, e.g. 1024x768 or 1280x1024. The refresh rate might change too, e.g. from 60Hz to 72Hz, 75Hz, etc.
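If you can still boot the machine into Windows on a working screen, one way to see exactly which modes the driver offers and which one the desktop will switch to is to query the driver's mode table. This is only a rough sketch (it assumes a Windows build environment such as MSVC or MinGW, and the primary display); EnumDisplaySettings simply walks the list of modes the GPU driver reports:

/* Sketch: list the modes the GPU driver reports, so you can see which
   resolution/refresh combinations Windows might switch to after POST. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Increasing mode indices walk the driver's mode table;
       EnumDisplaySettings returns FALSE when there are no more. */
    while (EnumDisplaySettings(NULL, i, &dm)) {
        printf("%lux%lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
        i++;
    }

    /* ENUM_CURRENT_SETTINGS shows what the desktop is set to right now. */
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("Current: %lux%lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    }
    return 0;
}

The "Current:" line is the one to compare against what your monitor or adapter is rated for.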
Failure to display anything after POST sometimes happens because Windows was set to a fairly high resolution in the past with a different monitor, e.g. 1600x1280, 1920x1200 or 2560x2048. Or it could be that Windows has set the GPU to output a 75Hz refresh rate and your monitor/adapter can only cope with 60Hz.
If you then connect a monitor with a maximum resolution of only 1280x1024, or an adapter that can't cope with higher resolutions, all you get when Windows switches the GPU to 1600x1280, etc. is a blank screen. The Windows 'screen resolution' and/or 'refresh rate' might simply be outside the range of your monitor/adapter.
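If you suspect that is what has happened and you can get into Windows at all (for example on a known-good monitor), a small Win32 program can force the desktop back to a conservative mode. Again, only a sketch under the same assumptions as above; the 1024x768 @ 60Hz values are my own choice, so substitute a mode your monitor is known to handle:

/* Sketch: drop the desktop back to a conservative 1024x768 @ 60 Hz so an
   older monitor or a cheap adapter can sync again. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth        = 1024;
    dm.dmPelsHeight       = 768;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_UPDATEREGISTRY makes the change stick across reboots. */
    LONG rc = ChangeDisplaySettings(&dm, CDS_UPDATEREGISTRY);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        printf("Switched to 1024x768 @ 60 Hz\n");
    else
        printf("Mode change failed, code %ld\n", rc);
    return 0;
}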
I recommend using a monitor with native VGA or DVI inputs during the testing phase, to rule out any problems with inexpensive video standard adapters (VGA to HDMI, DVI to HDMI). Old monitors can usually be picked up cheaply on eBay, and you may find that more of your old GPUs are still working than you thought.
I still have a bunch of old monitors with VGA and DVI inputs. It makes sense to pair old GPUs with old screen technology when fault finding. Reduce the number of variables.