Honestly, under normal gaming that card tends to draw 200-250 watts. It is connected for 300 watts (75 from the board slot, 75 from the 6-pin PCIe cable, and 150 from the 8-pin PCIe cable). If you run FurMark (but you bought the card for games, not FurMark) I am sure you can hit those limits and go past them.
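A rough sketch of that connector math (the per-connector limits come from the PCIe spec; the function name is just for illustration):

```python
# PCIe power budget limits, in watts
PCIE_SLOT = 75  # power delivered through the motherboard slot itself
PIN6 = 75       # 6-pin PCIe auxiliary power cable
PIN8 = 150      # 8-pin PCIe auxiliary power cable

def board_power_budget(connectors):
    """Total rated power available to a card: slot plus aux connectors."""
    return PCIE_SLOT + sum(connectors)

# A card wired like the one above, with one 6-pin and one 8-pin:
budget = board_power_budget([PIN6, PIN8])
print(budget)  # 300 watts available, vs the ~200-250 W typical gaming draw
```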
Many cards have power supply requirements listed, so it is best to check those. Even two identical cards may draw slightly different power doing the same thing.
You will see recommendations ranging from maybe 600 to 750 watts for some of the heavily factory-overclocked cards.
In the end quality matters, and manufacturers sometimes over-recommend because some "650 watt" power supplies deliver less current on the 12 volt rail than a quality 500 watt unit (or worse).
If you check your current power supply, it should have a sticker listing the 12 volt current rating. Also, having the required 6-pin and 8-pin (or even dual 8-pin) PCIe connectors is normally a sign it was designed with that kind of load in mind.
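A quick sketch of why the 12 volt sticker rating matters more than the number on the box (the amp figures here are made up for illustration):

```python
def rail_watts(volts, amps):
    """Power available on a rail: W = V * A."""
    return volts * amps

# Hypothetical units: a cheap "650 W" PSU with only 30 A on 12 V
# versus a quality "500 W" unit with 40 A on 12 V.
cheap_650 = rail_watts(12, 30)    # 360 W on the rail the GPU actually uses
quality_500 = rail_watts(12, 40)  # 480 W
print(cheap_650, quality_500)
```

So the "smaller" quality unit actually has more usable 12 volt power for the video card.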
This review covers the 290 reference card's power consumption. Note that all cards have brief spikes above these values, but most power supplies handle that without issue.
http://www.techpowerup.com/reviews/AMD/R9_290/24.html
Thermaltake has many power supplies, so a model number would help. 80 Plus has nothing to do with power output; it rates how much input power is needed to produce that output. It is all about wasted energy (heat generated converting wall power to PC-friendly power).
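A quick sketch of what that efficiency rating means in practice (the efficiency figures are illustrative; real numbers vary by load level and 80 Plus tier):

```python
def wall_draw(dc_output_watts, efficiency):
    """Power pulled from the wall to deliver a given DC output."""
    return dc_output_watts / efficiency

# Delivering 300 W of DC power to the PC:
print(wall_draw(300, 0.80))  # 375.0 W from the wall, 75 W lost as heat
print(wall_draw(300, 0.90))  # ~333 W from the wall at 90% efficiency
```

Either way the PC still gets its 300 watts; the rating only changes how much extra comes out of the wall as heat.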