What’s the difference between a Bit and a Byte?

By Turrito Networks | 14th Aug 2017

Gigabit and gigabyte are terms that are often confused. Both are measurements of digital data, but what they measure and how they are used differ.

A Bit

This is the most basic unit of digital measurement. A bit is one binary unit, meaning it can either have a value of “0” or “1”. With computers, this can indicate “true” or “false”.
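As a minimal sketch, a bit can be modelled in Python as the value 0 or 1 and read directly as a truth value:

```python
# A bit holds one of exactly two values: 0 or 1.
bit = 1
print(bool(bit))   # a set bit reads as True

bit = 0
print(bool(bit))   # a cleared bit reads as False
```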

A Byte

A byte is a collection of 8 bits. The term was coined in 1956 by Werner Buchholz, and the spelling was chosen deliberately to avoid confusion with "bit".
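Because a byte is 8 bits, it can take on 2 to the power of 8 distinct values. A quick Python sketch of this, using the byte 01000001 (which ASCII maps to the letter "A") as an example:

```python
# 8 bits give 2**8 = 256 distinct values (0 through 255).
values_per_byte = 2 ** 8
print(values_per_byte)          # 256

# The byte 01000001 is 65 in decimal; ASCII maps 65 to "A".
byte = 0b01000001
print(byte, chr(byte))          # 65 A
```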

So, what’s the difference?

As bits or bytes get larger we use prefixes from the metric system. To tell the two apart in abbreviations, bits get a lower-case b and bytes get an upper-case B. The table below can also help with understanding the conversions:

Prefix      Multiplier              Bits to bytes     Bytes to bits
kilo- (K)   1,000x                  1 Kb = 125 B      1 KB = 8 Kb
mega- (M)   1,000,000x              1 Mb = 125 KB     1 MB = 8 Mb
giga- (G)   1,000,000,000x          1 Gb = 125 MB     1 GB = 8 Gb
tera- (T)   1,000,000,000,000x      1 Tb = 125 GB     1 TB = 8 Tb
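The conversions in the table above boil down to multiplying or dividing by 8. A small Python sketch (the function names are illustrative, not a standard API):

```python
# Metric prefixes used in the table, as powers of ten.
PREFIXES = {"K": 10**3, "M": 10**6, "G": 10**9, "T": 10**12}

def bits_to_bytes(bits):
    """8 bits make one byte, so divide by 8."""
    return bits / 8

def bytes_to_bits(n_bytes):
    """One byte contains 8 bits, so multiply by 8."""
    return n_bytes * 8

# 1 Kb = 1,000 bits = 125 bytes
print(bits_to_bytes(1 * PREFIXES["K"]))                      # 125.0
# 1 KB = 1,000 bytes = 8,000 bits = 8 Kb
print(bytes_to_bits(1 * PREFIXES["K"]) / PREFIXES["K"])      # 8.0
```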


Bits and bytes also differ in their uses. Bits are normally used when measuring the rate of data transfer, such as bandwidth; here we use megabits or gigabits per second. Bytes are normally used to describe data storage/capacity: we measure file sizes in bytes, and the drives that store them in gigabytes and terabytes.
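This distinction matters in practice: a "100 megabit" connection does not download 100 megabytes per second. A rough worked example in Python (the link speed and file size are made-up figures for illustration):

```python
# A 100 Mb/s link downloading a 500 MB file.
link_speed_bits_per_s = 100_000_000   # 100 megabits per second
file_size_bytes = 500_000_000         # 500 megabytes

# Convert the file size to bits so both figures use the same unit.
file_size_bits = file_size_bytes * 8

seconds = file_size_bits / link_speed_bits_per_s
print(seconds)   # 40.0 seconds, not the 5 you might naively expect
```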

So now you know the difference. Fun fact: 4 bits, or half a byte, is called a nibble.
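A nibble corresponds exactly to one hexadecimal digit, which is why hex is such a convenient notation for bytes. A quick Python illustration:

```python
# A nibble is 4 bits = one hex digit. The byte 0xA7 is 10100111 in binary.
byte = 0xA7

high_nibble = byte >> 4     # top 4 bits:    1010 -> 0xA -> 10
low_nibble = byte & 0x0F    # bottom 4 bits: 0111 -> 0x7 -> 7

print(high_nibble, low_nibble)   # 10 7
```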
