Niavlys77 :
The only thing missing here is power/efficiency: AMD cards generally use quite a bit more power than the NVidia ones. It would be nice to see that comparison, and the potential $/month or something of the sort.
Depends on the price segment, actually, as AMD has some efficient stuff, too.
But from a realistic standpoint, let's say there's a 50-watt difference under load (at idle and active idle, very little is drawn by the graphics card) and you play games an average of 2 hours a day, five days a week. That's about 40 hours a month times 50 watts = 2,000 watt-hours = 2 kWh per month. In the most expensive states I think electricity costs somewhere in the 16 cents per kWh range, so that's a 32-cent-a-month difference in power usage.
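The arithmetic above can be sketched as a quick script; all the inputs (50 W delta, 2 h/day, five days a week, 16 ¢/kWh) are the illustrative assumptions from the comment, not measured figures:

```python
# Back-of-envelope extra electricity cost from a GPU drawing more power
# under load. All numbers are illustrative assumptions, not measurements.

extra_watts = 50                 # assumed extra draw under load (W)
hours_per_month = 2 * 5 * 4      # 2 h/day, 5 days/week, ~4 weeks/month
price_per_kwh = 0.16             # assumed rate in an expensive state ($/kWh)

extra_kwh = extra_watts * hours_per_month / 1000   # kWh per month
monthly_cost = extra_kwh * price_per_kwh           # dollars per month
print(f"~{extra_kwh:.1f} kWh/month, ~${monthly_cost:.2f}/month")
```

Change any of the three inputs to match your own usage; the cost scales linearly with each of them.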
Not really worth calculating unless you're a corporation running thousands of machines.