A unit of information is a unit of measure for digital data size. In digital computing, such units describe the capacity of data storage devices; in telecommunications, the throughput of a communication channel; and in information theory, the information content of messages and the entropy of random variables.
Because data sizes range from the very small to the very large, units of information span a correspondingly wide range. Each unit is defined as a multiple of a smaller unit, except for the smallest unit, which is fixed by convention and hardware design. Multiplier prefixes are used to denote relatively large sizes.
For binary hardware, by far the most common hardware today, the smallest unit is the bit, a portmanteau of binary digit,[1] which represents one of two possible values, typically written as 0 and 1. The nibble, 4 bits, represents the value of a single hexadecimal digit. The byte, 8 bits (2 nibbles), is probably the most widely known and used base unit for describing data size. The word is a size that varies with, and has special importance for, a particular hardware context. On modern hardware a word is typically 2, 4 or 8 bytes, but the size varies dramatically on older hardware. Larger sizes can be expressed as multiples of a base unit via SI metric prefixes (powers of ten) or the newer and generally more accurate IEC binary prefixes (powers of two).
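As a brief illustration of the difference between the two prefix systems, the following Python sketch (the function name and prefix tables are chosen for illustration, not taken from any standard library) converts a prefixed size to a byte count, assuming the standard definitions of SI prefixes as powers of 1000 and IEC prefixes as powers of 1024.

    # Illustrative sketch: byte counts denoted by SI (decimal) and IEC (binary) prefixes.
    SI_PREFIXES = {"kB": 1000**1, "MB": 1000**2, "GB": 1000**3, "TB": 1000**4}
    IEC_PREFIXES = {"KiB": 1024**1, "MiB": 1024**2, "GiB": 1024**3, "TiB": 1024**4}

    def to_bytes(value: float, unit: str) -> int:
        """Convert a prefixed size, e.g. (1.5, "GiB"), to a number of bytes."""
        factor = SI_PREFIXES.get(unit) or IEC_PREFIXES.get(unit)
        if factor is None:
            raise ValueError(f"unknown unit: {unit}")
        return int(value * factor)

    # The same numeric label denotes different byte counts under the two systems:
    print(to_bytes(1, "GB"))   # 1 000 000 000 bytes (SI, power of ten)
    print(to_bytes(1, "GiB"))  # 1 073 741 824 bytes (IEC, power of two)

The gap between the two interpretations grows with the prefix: a decimal kilobyte is about 2% smaller than a binary kibibyte, while a decimal terabyte is about 9% smaller than a tebibyte.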