hardware gigabyte versus software gigabyte

Um. They're not terms I've heard used. Any context?

It's possible you're referring to the difference between a decimal gigabyte (10^9 bytes) and a binary gigabyte (2^30 bytes). Technically the latter should be called a gibibyte, but Microsoft doesn't bother using the term correctly. The difference between the two is about 7.3%.
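For what it's worth, here's a quick back-of-the-envelope check (a minimal Python sketch, just to show where the numbers come from):

```python
# Quick check of where the ~7.3% figure comes from (illustrative only).
decimal_gb = 10**9   # gigabyte as drive manufacturers count it: 1,000,000,000 bytes
binary_gb = 2**30    # gibibyte, which Windows labels "GB":      1,073,741,824 bytes

gap = (binary_gb - decimal_gb) / decimal_gb
print(f"Binary unit is {gap:.2%} larger than the decimal one")  # 7.37%

# Why a "500 GB" drive shows up smaller in the OS:
print(f"500 decimal GB = {500 * decimal_gb / binary_gb:.1f} GiB")  # ~465.7 GiB
```

That last line is the practical upshot: a drive sold as "500 GB" (decimal) appears as roughly 465.7 GiB once the OS divides by 2^30.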
 
Thank you for your prompt reply! The whole context is a question I am trying to answer for a class assignment:
"Describe the difference between a hardware gigabyte and a software gigabyte and why it is important to be aware of the difference."
I first went on Wikipedia and indeed found the explanation of the decimal versus binary systems behind the gigabyte/gibibyte distinction. I wasn't sure that was the answer to the rather confusing question, so I started this thread :)

Nicole
 
Thank you very much for replying!
This may also be what the question refers to, so I will include this difference in my answer too.

Nicole

I will do that.
Thank you again!

Nicole