60 FPS on High in modern titles might be a bit of a stretch.
A GTX 960M sits between a GT 1030 and a GTX 1050 in the desktop space, although much closer to a 1030 than a 1050.
Low at 1080p is probably going to be its sweet spot, with Medium in some titles.
FWIW though, have you truly isolated your desktop as the cause of the increased utility bill?
An i7-870 is a 95W TDP chip, and a GTX 980 is a ~165W TDP card.
Overall, you're looking at something under 300W of total component draw.
Even assuming a full 300W, and a basic 80+ Bronze PSU (~80% efficient at the wall):
300W / 0.80 = 375W
x 24 hours
x 365 days
= 3,285,000 watt-hours annually
÷ 1000
= 3,285 kWh annually, or ~274 kWh/month
From what I can find, Denmark has one of the highest electricity costs, at (the equivalent of) $0.41/kWh.
The worldwide average would be more like (the equivalent of) $0.20/kWh.
So, using those figures:
HIGH
3,285 kWh annually x $0.41 = $1,347 annual or ~$112/month
AVERAGE
3,285 kWh x $0.20 = $657 annual or ~$55/month
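If you want to sanity-check that math, here's a quick Python sketch of the same arithmetic (the 0.80 PSU efficiency is my read of a baseline 80+ unit; Bronze is a bit better):

```python
# Desktop, worst case: running 24/7 at full load (all figures are rough estimates)
component_draw_w = 300           # i7-870 + GTX 980 + everything else, generous
psu_efficiency = 0.80            # baseline 80+ assumption
wall_draw_w = component_draw_w / psu_efficiency   # ~375W at the wall

annual_kwh = wall_draw_w * 24 * 365 / 1000        # 3,285 kWh (~274 kWh/month)

for label, rate in (("HIGH (Denmark)", 0.41), ("AVERAGE", 0.20)):
    annual = annual_kwh * rate
    print(f"{label}: ${annual:,.0f}/year, ~${annual / 12:,.0f}/month")
```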
And that's running 24/7 at 100% load.
Realistically, your actual usage is probably less than half of that (12 hours/day).
Taking half of that max, your power "cost" would be somewhere around $27-$56/month IF you're running 100% load, 12 hours a day, every day.
If you only use it for, say, 4 hours a day (after school?) and some more on weekends, then you're down to roughly $9-$19/month.
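Since the scaling is linear, it's easy to make hours-per-day a parameter (same assumed 375W wall draw and the two rates above):

```python
def monthly_cost(watts, rate_per_kwh, hours_per_day):
    # kWh/year = W x hours/day x 365 / 1000, spread over 12 months
    return watts * hours_per_day * 365 / 1000 * rate_per_kwh / 12

for hours in (24, 12, 4):
    low = monthly_cost(375, 0.20, hours)
    high = monthly_cost(375, 0.41, hours)
    print(f"{hours:>2}h/day: ${low:.0f}-${high:.0f}/month")
```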
Compare that to the GL752V with its max 120W power draw.
That's not accounting for efficiency losses, which I assume exist.
Using 120W and the same formula as above:
120W
x 24 hours
x 365 days
= 1,051,200 watt-hours annually
÷ 1000
= 1,051 kWh annually, or ~88 kWh/month
1,051 kWh x $0.41 = $431 annual, or ~$36/month (or half for 12 hours/day = ~$18)
OR
1,051 kWh x $0.20 = $210 annual, or ~$17.50/month (or half for 12 hours/day = <$9)
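And the head-to-head, reusing the same (assumed) figures: 375W at the wall for the desktop vs the GL752V's 120W maximum:

```python
def monthly_cost(watts, rate_per_kwh, hours_per_day):
    return watts * hours_per_day * 365 / 1000 * rate_per_kwh / 12

hours = 4  # the "after school" scenario
for rate in (0.20, 0.41):
    saving = monthly_cost(375, rate, hours) - monthly_cost(120, rate, hours)
    print(f"At ${rate:.2f}/kWh you'd save ~${saving:.0f}/month")
```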
Considering all that, using the "typical" 4-hours-a-day workload, your electricity savings from dropping to the laptop would work out to somewhere between $6 and $13 a month.
I'm not trying to trivialize $6-$13, but with that context, is there not some way you could raise $10 per month & give it to your parents?