Interpretation of Values for Download Speeds

For example, a commonly used web utility tests your broadband connection and reports final speed values such as these:

Download Speed: 37680 kbps (4710 kB/sec)
(transfer rate if 1 byte = 8 bits, using decimal calculation)

Upload Speed: 31650 kbps (3956.3 kB/sec)
(transfer rate if 1 byte = 8 bits)

The unit symbol kB is commonly used for kilobyte, as it is here, but it may be confused with kb, the common symbol for kilobit.  A bit is the smallest unit of data, holding a value of either 1 or 0.

The IEEE 1541 standard specifies the lower-case character b as the symbol for bit; however, IEC 60027 and the Metric Interchange Format specify the full word bit (e.g., Mbit for megabit) as the symbol, which is sufficient to distinguish it from byte.

A byte is normally defined as 8 bits in modern computing, so dividing the download speed of 37680 kbps (kilobits per second) by 8 gives a decimal value of 4710 kB/sec.
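
As a quick check, the conversion can be written out as a short calculation. The sketch below (in Python) simply repeats the arithmetic with the figures from the example above:

    download_kbps = 37680              # reported download speed in kilobits per second
    bits_per_byte = 8                  # 1 byte = 8 bits

    download_kBps = download_kbps / bits_per_byte
    print(download_kBps)               # 4710.0 kB/sec, matching the reported figure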

The kilobyte (symbol: kB) is a multiple of the unit byte for digital information. Although the prefix kilo- means 1000, the term kilobyte and symbol KB have historically been used to refer to either 1024 (2 ^ 10) bytes or 1000 (10 ^ 3) bytes, dependent upon context, in the fields of computer science and information technology.

Dividing 37680000 bps by 8 to give bytes per second, and then by 1024, gives 4599.6 kB/sec, a slightly lower figure.  In the end exactly the same amount of information is transferred in the same time.
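
The binary version of the same figure can be checked in the same way (again a small Python sketch, not the output of any particular utility):

    download_bps = 37680000              # 37680 kbps expressed in bits per second
    bytes_per_sec = download_bps / 8     # 4710000 bytes per second

    binary_kBps = bytes_per_sec / 1024   # using 1 kB = 1024 bytes
    print(round(binary_kBps, 1))         # 4599.6 kB/sec, the slightly lower figure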

Either way, a speed of over 4 MB per second is not bad!  If you are comparing values, make sure you use the same units; that is the main point.

On modern systems, Mac OS X Snow Leopard represents a 65,536 byte file as "66 KB" (dividing by 1000 and rounding), while Microsoft Windows 7 represents the same file as "64 KB" (dividing by 1024).
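
The two displayed sizes follow directly from the two conventions. The Python sketch below reproduces them; the exact rounding rule each operating system applies is an assumption here (rounding to the nearest whole KB in the decimal case):

    file_size = 65536                        # file size in bytes

    decimal_kB = round(file_size / 1000)     # 66 -> shown as "66 KB" under the decimal convention
    binary_kB = file_size // 1024            # 64 -> shown as "64 KB" under the binary convention

    print(f"{decimal_kB} KB vs {binary_kB} KB")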

Storage device manufacturers measure capacity using the decimal system (base 10), so 1 gigabyte (GB) is calculated as exactly 1,000,000,000 bytes. The capacity of storage media in your Mac (Mac OS X v10.5 or earlier) and other Apple hardware is measured using this decimal system, and this is described on product packaging and websites with the statement "1 GB = 1 billion bytes."

Capacity stated in Mac OS X
However, when you view the storage capacity of your Mac (Mac OS X v10.5 or earlier), or other electronic devices, within its operating system, the capacity is reported using the binary system (base 2) of measurement. In binary, 1 GB is calculated as 1,073,741,824 bytes (1024 ^ 3; since 1024 is 2 ^ 10, this is 2 ^ 30). This difference in how the decimal and binary numeral systems measure a GB is why a 32 GB storage device appears as roughly 29.8 GB when detailed by its operating system (32,000,000,000 / 2 ^ 30 ≈ 29.8), even though the storage device still has 32 billion bytes, as reported. You will also see this difference in the "About" menu on your device.

The important point to understand is that the available storage capacity is the same no matter which system (decimal or binary) is used. Nothing is missing.
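
The same conversion can be written out explicitly; the short Python sketch below uses the 32 GB example from above:

    advertised_bytes = 32 * 1000**3          # 32 GB as stated by the manufacturer (decimal GB)
    binary_GB = advertised_bytes / 1024**3   # the same bytes, with 1 GB taken as 2^30 bytes

    print(round(binary_GB, 1))               # about 29.8, the figure a binary-based operating system reports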