Apparently HDD manufacturers (at least Seagate) use the decimal definition of "kilo," meaning 1,000, versus the binary definition of 1,024. So to your computer, 1 kilobyte is 1,024 bytes, but by the decimal definition it's 1,000 bytes. In decimal notation, then, 1 megabyte is 1,000,000 bytes and 1 gigabyte is 1,000,000,000 bytes (versus 1,048,576 bytes and 1,073,741,824 bytes, respectively, in binary notation).

So, if you go by decimal notation, 500 GB equals 500,000,000,000 bytes. That's what Seagate uses to market their hard drives. When you hook up the drive, though, those 500,000,000,000 bytes come out to ~465 gigabytes, because again, in binary there are 1,024 bytes in a kilobyte (not 1,000).

500,000,000,000 bytes (decimal notation of 500 GB) / 1,073,741,824 bytes (binary notation of 1 GB) = 465.66

...which is why my 500 GB shows up as only 465 GB.
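If you want to check the math yourself, here's a quick sketch in Python (the function name is just mine, not anything official):

```python
# Convert a marketed (decimal) drive capacity in GB to the binary
# GB/GiB figure your operating system reports.

def marketed_gb_to_binary_gb(gb_decimal: float) -> float:
    bytes_total = gb_decimal * 1_000_000_000   # decimal: 1 GB = 10^9 bytes
    return bytes_total / (1024 ** 3)           # binary: 1 GB = 2^30 bytes

print(round(marketed_gb_to_binary_gb(500), 2))  # → 465.66
```

Same arithmetic as above, just spelled out.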

Am I the only one who didn't know this, or is this common knowledge?

If you want the links, here are two:

http://www.seagate.com/ww/v/index.jsp?locale=en-US&name=Storage_Capacity_Measurement_Standards_-_Seagate_Technology&vgnextoid=9493781e73d5d010VgnVCM100000dd04090aRCRD

http://seagate.custhelp.com/cgi-bin/seagate.cfg/php/enduser/std_adp.php?p_faqid=336