I'm fairly well versed in gaming PC components and can swap out things like power supplies, upgrade memory, install or swap out graphics cards, etc.
But I don't know much about power usage beyond always going overkill on the overall power supply. My current gaming rig has a Corsair 1200W, which is way, way overkill; I probably only need about 700-750W for my current rig. But I had a brand-new Corsair, which I'd bought a few years back, sitting in its box for a long time, so I put it into my current rig. So the PC as a whole has plenty of power supply headroom.
But how do graphics cards work with regard to how much power they actually draw?
I have an Nvidia GeForce RTX 2070 that has been running apparently fine for about eight months. Average gaming temps with games at the highest graphics settings are about 60-65°C, sometimes as high as 70°C, but that's it. Idle temps are in the mid-to-high 30s °C.
In my hardware monitor window and in Task Manager, idle GPU usage shows at about 1-3%, with an idle power draw of about 27-28 watts.
What exactly does that 27-28 watts mean? Is that normal? Is that what the card is drawing in TOTAL from the power supply (I don't know if I'm using the right terminology)? I know it goes higher when playing games, of course, but how high should it go? I've read on a number of other forums that manufacturers have tried to improve idle power usage and that some newer cards draw much, much less. I take that to mean that older video cards tended to use more wattage when idling?
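For what it's worth, besides Task Manager I've seen that the NVIDIA driver ships a command-line tool, nvidia-smi, that can report the card's current board power draw alongside its configured power limit (this assumes the standard NVIDIA driver install on Windows or Linux):

```shell
# Show the GPU's current power draw and its board power limit (requires the NVIDIA driver)
nvidia-smi --query-gpu=name,power.draw,power.limit --format=csv
```

Comparing the reported draw against the limit should show how close the card gets to its rated power under load.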