Basic equipment for troubleshooting complex electronics would be a multimeter with a decent input impedance (10 MOhms or more; the higher the impedance, the less the meter loads the circuit, so the more accurate the reading), a logic probe (less useful these days, but you can still find discrete hex inverters or optical isolators that can be checked for operation), and an oscilloscope as the ultimate tool.
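To see why input impedance matters, here is a quick sketch of the loading error a meter introduces when reading the midpoint of a resistive divider. The resistor and voltage values are made-up illustrative numbers, not anything from a real GPU board:

```python
# Hypothetical example: measuring the midpoint of a 100 kOhm / 100 kOhm
# divider across 12 V. The meter's input impedance appears in parallel
# with the bottom resistor and pulls the reading down.
R_TOP = 100e3
R_BOTTOM = 100e3
V_SUPPLY = 12.0

def divider_reading(r_meter):
    # Effective bottom-leg resistance: R_BOTTOM in parallel with the meter.
    r_eff = (R_BOTTOM * r_meter) / (R_BOTTOM + r_meter)
    return V_SUPPLY * r_eff / (R_TOP + r_eff)

ideal = V_SUPPLY * R_BOTTOM / (R_TOP + R_BOTTOM)  # 6.0 V with no meter attached
print(divider_reading(10e6))  # ~5.97 V with a 10 MOhm meter (~0.5% low)
print(divider_reading(1e6))   # ~5.71 V with a cheap 1 MOhm meter (~4.8% low)
```

On low-impedance nodes (like a VRM output) this barely matters, but on high-impedance sense or feedback nodes a cheap meter can visibly skew the measurement.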
Luckily, most of the switching done in a GPU is not super high speed, so you can get by with more reasonably priced scopes. This would let you observe the actual waveforms coming out of the VRMs, and fancier ones will do RMS calculations for you, etc. I believe you might need one with single-shot trigger capture if you want to catch the start-up behavior of circuits.
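The RMS calculation a scope performs is simple enough to sketch by hand. Here is a minimal version run over a synthetic capture (a made-up 1.0 V rail with 50 mV of sinusoidal ripple, not a real measurement):

```python
import math

def rms(samples):
    # Root-mean-square: square each sample, average, take the square root.
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Synthetic "capture": 1.0 V DC plus 50 mV of ripple, 1000 samples
# spanning exactly one ripple period.
N = 1000
samples = [1.0 + 0.05 * math.sin(2 * math.pi * i / N) for i in range(N)]

total_rms = rms(samples)                    # ~1.0006 V (DC dominates)
mean = sum(samples) / N
ripple_rms = rms([s - mean for s in samples])  # ~0.0354 V (AC component only)
print(total_rms, ripple_rms)
```

Note the distinction: the total RMS is dominated by the DC level, while subtracting the mean first gives the ripple RMS, which is usually what you care about when judging VRM output quality (scopes call this AC-coupled RMS).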
As to how PCBs work: without schematics it is going to be mostly visible traces and guesswork, and multi-layer boards hide a lot of detail. Really, they are just a compact form of wiring. Some basic knowledge of common features is necessary, like knowing the components of a typical VRM. All these devices conform to the PCIe standards, so a lot of the circuitry there will come right out of the PCIe specs, which are fully documented.
Many discrete parts are labeled, so you can look up the design specs and usually even application example schematics from the manufacturer. Often, companies will take that reference schematic and implement it directly on their circuit boards. (Hence the old joke that the only real engineers are the application engineers; everyone else is just playing with adult Lego.)
As for where to learn this stuff: outside of taking electronics engineering and really getting into it, maybe the best resources for GPUs are going to be the modding community, specifically the extreme overclocking community.
If you want to know more about how the internals of GPUs (the silicon) work, both AMD and Nvidia release white papers, along with all the information about how to code for their GPUs. Those two together will teach you about the inner workings, at least at a high level. Obviously, a lot of the rest will be IP and not subject to public release.