If you want more specifics, we'd need to know the model of MB, RAM, SSD, HDD, keyboard, etc., as well as your cost per kWh.
But generally you could see it as (I've assumed fairly typical peak wattages for unspecified items):
GTX 1080 = 180W
Ryzen 2600x = 95W
RAM = Approx 6W
Typical Samsung EVO 850 500GB SSD = 6W x 2 = 12W
Typical HDD = 8W x 4 = 32W
Typical High End MB = 80W
General case fan = 2W x 2 = 4W
Total just there is 409W - this excludes peripherals, extensions, and also the efficiency of your components themselves, so theoretically, if you ran just the above at full load for 1hr:
Convert to kW: 409W / 1000 = 0.409kW, which over that hour is 0.409kWh.
Multiply that by your electricity rate: 0.409 x (hypothetically) 13 cents per kWh = 5.317 cents per hour.
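If it helps, here's the same arithmetic as a quick sketch (the wattages and the 13 cents/kWh rate are the same hypothetical figures as above, not real measurements):

```python
# Hypothetical peak wattages from the list above
components = {
    "GTX 1080": 180,
    "Ryzen 2600X": 95,
    "RAM": 6,
    "2x SSD": 12,
    "4x HDD": 32,
    "Motherboard": 80,
    "2x case fans": 4,
}

total_watts = sum(components.values())          # 409 W
kwh_for_one_hour = total_watts / 1000           # 0.409 kWh at full load for 1 hr
rate_cents_per_kwh = 13                         # hypothetical US rate
cost_cents = kwh_for_one_hour * rate_cents_per_kwh

print(f"{total_watts} W -> {cost_cents:.3f} cents per hour")
```

Swap in your own wattages and local rate to get a rough ceiling for your build.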
I've assumed American cents, but in reality I'm in GB so I have no clue what electricity costs over there.
I'm also sure plenty can disagree with my W numbers above, as I know I haven't considered the efficiency of your components, heat dissipation, etc., but I've just tried to speak generally and hypothetically so you can get the drift.
It completely depends on the components and what you are running, as the PSU will only draw the power required for the task. That's how I see general power consumption anyway, but happy for someone to disagree.
The best way to measure would be to literally read the power being used (e.g. with a plug-in wall meter), work out how many kWh you used in an hour, then multiply that by your electricity cost per kWh.
Edit: Want to clarify, this is assuming peak loads. Your components won't actually be drawing anywhere near this much most of the time.
None of the components would run at full tilt for any game except for maybe the GPU if you hand picked the settings to make it run as close to 100% as possible.
On the other hand, a PSU running at 80% efficiency would draw about 25% more power from the wall than what the components need (wall draw = load / efficiency, so 409W / 0.8 ≈ 511W).
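To make the efficiency point concrete, here's a tiny sketch using the same hypothetical 409W load and an assumed 80% PSU efficiency:

```python
load_watts = 409                       # hypothetical power the components need
efficiency = 0.80                      # assumed PSU efficiency at this load
wall_watts = load_watts / efficiency   # actual draw from the wall

extra = wall_watts / load_watts - 1    # fraction above the component load
print(f"Wall draw: {wall_watts:.2f} W ({extra:.0%} more than the load)")
```

Note that real PSU efficiency varies with load (that's what the 80 Plus rating curves describe), so treat the flat 80% here as a worst-case illustration.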