Look up any power comparison chart that lists an 8800GT or 9800GT. Take that chart's idle consumption and subtract 2.5A x 12V (30 watts); that's the PC's overall power consumption at idle without the graphics card.
Now that you know what the rest of the system consumes at idle, take the load reading they had and subtract 6A (the max I've ever seen my 8800GT draw) x 12V (72W); that's the PC's overall power consumption at load without the graphics card.
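To make the subtraction method concrete, here's a quick sketch. The 2.5A/6A card draws are from my own readings above; the chart totals are made-up placeholder numbers, not from any actual review:

```python
# Back out the rest-of-system power from a review's total-system reading
# by subtracting the 8800GT's 12V rail draw (amps from my own monitoring).
RAIL_VOLTS = 12.0
IDLE_AMPS_8800GT = 2.5   # card's 12V draw at idle
LOAD_AMPS_8800GT = 6.0   # max 12V draw I've seen under load

def system_baseline(chart_watts, card_amps):
    """Chart's total-system watts minus the card's 12V-rail watts."""
    return chart_watts - card_amps * RAIL_VOLTS

# Hypothetical chart numbers, just to show the arithmetic:
idle_total = 120.0   # placeholder idle reading from a review, watts
load_total = 250.0   # placeholder load reading, watts

print(system_baseline(idle_total, IDLE_AMPS_8800GT))  # 120 - 30 = 90.0 W
print(system_baseline(load_total, LOAD_AMPS_8800GT))  # 250 - 72 = 178.0 W
```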
Now, assuming the hardware is identical except for the video cards, you can figure out how much that GTX 260 uses. If you can't find a benchmark featuring both an 8800GT AND your chosen card, things get a bit more complicated. Find some common ground: the 260 is almost always compared to a 9800GTX, so compare the 8800GT to that 9800GTX, then the 9800GTX to the GTX 260. Tom's Hardware claims ~15A for the 260, but then again somehow their 1GB 8800GT draws significantly less power than the 512MB version... My 8800GT 512MB uses slightly less power at stock clocks than their tested 1GB model.
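The "common ground" chaining works out like this (all the watt figures here are hypothetical placeholders, just to show the idea):

```python
# If no single review tests both the 8800GT and the GTX 260, chain the
# deltas through a card both reviews DID test (the 9800GTX).
# Placeholder numbers below, not real review data.
w_8800gt    = 178.0  # 8800GT system load in review A
w_9800gtx_a = 200.0  # 9800GTX system load in review A
w_9800gtx_b = 210.0  # 9800GTX system load in review B (different test rig)
w_gtx260_b  = 230.0  # GTX 260 system load in review B

# Estimated extra draw of the GTX 260 over the 8800GT:
delta = (w_9800gtx_a - w_8800gt) + (w_gtx260_b - w_9800gtx_b)
print(delta)  # 22 + 20 = 42.0 W more than the 8800GT, roughly
```

Using per-review deltas rather than raw totals matters, since the two reviews probably ran different CPUs, boards, and PSUs.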
Overclocking is a different story though: at 740/1840/980 (core/shader/mem) my Zotac (non-AMP edition) 8800 runs around 3.2A idle and 7.6A under load. That's at stock voltage too, so I'm curious what the 1.1V or 1.15V mod will do.
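In watts, those overclocked rail readings (my numbers from above) come out as:

```python
# Convert my overclocked 8800GT's 12V rail readings to watts.
RAIL_VOLTS = 12.0
idle_w = 3.2 * RAIL_VOLTS   # roughly 38.4 W at idle
load_w = 7.6 * RAIL_VOLTS   # roughly 91.2 W under load
print(idle_w, load_w)
```

So even overclocked on stock voltage it stays well under the ~15A (180W) Tom's claims for the 260.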
That xbitlabs image looks pretty close though, as the 9800GTX is just an 8800GT(S) with higher clocks and extra voltage. The GTX+ would likely come pretty close to the GT.