If your electricity rate is the U.S. average of 11.5 cents/kWh, then each watt a device draws costs almost exactly $1 worth of electricity over a year if the device is left on 24/7 (1 watt × 8,760 hours/year = 8.76 kWh, and 8.76 kWh × $0.115 ≈ $1.01). So a 20-watt monitor left on 24/7 for a year would use about $20 worth of electricity ($20.15, to be exact).
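As a quick sanity check, here's that arithmetic in Python (just a sketch; the 0.115 rate is the average quoted above):

```python
RATE = 0.115  # U.S. average electricity rate, $/kWh

# One watt running 24/7 for a year:
kwh_per_year = 1 * 24 * 365 / 1000   # = 8.76 kWh
print(kwh_per_year * RATE)           # ~$1.01, i.e. roughly $1 per watt-year
```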
From there you can scale by the fraction of the day the monitor is actually on.
If you use the computer about 8 hours a day, that's 1/3 of a day, so the cost is $20/3 ≈ $6.67 for the year.
If you use the computer only 2.5 hours a day, that's about 1/10 of a day. So $20/10 = $2 for the year.
If your electricity rate is higher or lower, you just multiply by the ratio. So if your rate is 20 cents/kWh, you multiply by 20/11.5 ≈ 1.74, and the always-on monitor would cost about $20 × 1.74 ≈ $35 for the year.
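To put the whole recipe in one place, here's a small Python sketch (the function name and default values are just for illustration):

```python
def annual_cost(watts, hours_per_day=24, rate=0.115):
    """Yearly electricity cost in dollars for a device.

    watts: average power draw while on
    hours_per_day: how long the device runs each day
    rate: electricity price in $/kWh
    """
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate

# The examples from the text:
print(annual_cost(20))                     # ~$20.15: 20 W monitor, 24/7
print(annual_cost(20, hours_per_day=8))    # ~$6.72:  on 8 hours a day
print(annual_cost(20, hours_per_day=2.5))  # ~$2.10:  on 2.5 hours a day
print(annual_cost(20, rate=0.20))          # ~$35.04: 24/7 at 20 cents/kWh
```

The exact outputs differ by a few cents from the shortcut figures above because the shortcuts round the 24/7 cost to an even $20 before dividing.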