[citation][nom]Camikazi[/nom]No, they are right, gigabyte ethernet (1GB) would be 8 gigabit ethernet (8Gb), since 8 bits make one byte.Going by the article it's kind of obvious they meant gigabit ethernet, but using the word gigabyte so much they just slipped.[/citation]
No, that doesn't make any sense; you have it backwards. It takes 8 bits to make a byte, so you have to divide the 1 Gb by 8, and that's your actual number. You even have GB and Gb swapped. For example, 10 Mb is actually more like 1.31 MB (using rounded numbers, of course); my 20 meg line's real speed is approximately 2.62 MB/s.
I did this a clumsy way, but I hope it shows better how to convert bits into bytes.
1,048,576 bits in a Mb (call this value B; A will be the actual bytes)
10 * B (B = 1,048,576) / 8 = A (A = 1,310,720 bytes)
Now, my numbers could be off, but so far they've proven accurate for me.
Now 1 Gb would be:
1000 * 1,048,576 / 8 = 131,072,000 bytes, or roughly 131 MB
This assumes binary (1024-based) megabits, i.e. 1,048,576 = 1024 * 1024 bits per Mb, rather than decimal (1000-based) ones.
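The arithmetic above can be sketched as a quick script. It follows the same convention used here (1 Mb = 1,048,576 bits, 8 bits per byte); the function name is just for illustration:

```python
# Convert a megabit figure to bytes, using the same
# 1 Mb = 1,048,576 (binary) bits convention as above.
BITS_PER_MB = 1_048_576
BITS_PER_BYTE = 8

def megabits_to_bytes(mb: float) -> float:
    """Return the number of bytes in `mb` binary megabits."""
    return mb * BITS_PER_MB / BITS_PER_BYTE

print(megabits_to_bytes(10))    # 1310720.0 bytes, i.e. ~1.31 MB
print(megabits_to_bytes(20))    # 2621440.0 bytes, the "20 meg line"
print(megabits_to_bytes(1000))  # 131072000.0 bytes, ~131 MB
```

Same numbers as the worked examples: 10 Mb gives 1,310,720 bytes, and 1000 Mb gives 131,072,000 bytes.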
Even if the network card were capable of 1 GB/s, which it's not (it's only doing 1 Gb/s), it still wouldn't be capable of 8x its rated speed.