If you're going for modern AAA gaming I wouldn't bother with a low-clocked i5 like that, barely reaching 3GHz. You may as well get an i3-6100; it would probably perform better in some games. Going by PassMark scores, the 6500T runs around 18% slower than the standard 6500. Some games aren't very CPU-dependent, so there it may not matter much, but for games that are, and that scale with clock speed, it will be a detriment, just like a stock 6500 vs. an overclocked 6600K. Given the $60 higher cost up front, I think pretty much any power savings in terms of cost just went out the window. CPUs themselves don't use a ton of power, far less than GPUs do.
The difference in power consumption between an i3-6100 and an i5-6500 is only around 10-13W under load. Converting that to kWh: if both CPUs ran flat out at 100% load (Prime95-level draw) 24 hours a day, 365 days a year, the i5-6500 would use about 113.88 kWh more over the year than the i3. At the average US price of $0.12/kWh, that's a whopping $13.67/yr cost increase. It's less in practice, since nobody runs their CPU at 100% load 24/7/365. Using the same math for gaming on the i5 vs. the i3 six hours a day, every day, all year, the difference is about $3.42.
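If you want to check the arithmetic yourself, here's a quick sketch (assuming the worst-case 13W load delta and the $0.12/kWh average price from above):

```python
# Yearly cost of a constant extra power draw.
# Assumptions: 13 W worst-case load delta between the i3-6100 and i5-6500,
# $0.12/kWh average US electricity price.
def yearly_cost(delta_watts, hours_per_day, price_per_kwh=0.12):
    # watts * hours/day * days/year -> Wh/year, /1000 -> kWh/year
    kwh_per_year = delta_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

print(round(yearly_cost(13, 24), 2))  # 24/7 at full load -> 13.67
print(round(yearly_cost(13, 6), 2))   # 6 hrs/day gaming  -> 3.42
```

Plug in your own local $/kWh rate to see the numbers for where you live.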
Maybe you're still interested in the low-power aspect, but to put it in perspective, there's very little real-world difference. You'd save more power by turning off one CFL bulb, since a 75W-equivalent runs around 17-18W.