CHAPTER FIVE

NUMBER BASES

5.2.1. Bit

The term bit was introduced in 1946 by the American statistician and computer scientist John Tukey as an abbreviation of the term binary digit. A bit is the smallest unit of data or information that can be stored in a computer. It is represented by the numbers 1 and 0, the binary digits. In computing, these numbers correspond to the states on and off, true and false, or yes and no.
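
A minimal Python sketch makes the two states concrete:

    # Each bit stores one of two values: 0 (off/false) or 1 (on/true).
    bit_on = 1
    bit_off = 0
    print(bool(bit_on), bool(bit_off))   # True False

    # bin() shows the individual bits that make up a number.
    print(bin(5))                        # 0b101 -- the bits 1, 0, 1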

5.2.2. Nibble

A nibble is a group of 4 bits. The smallest value a nibble can hold is 0000 (0 in decimal) and the largest is 1111 (15 in decimal), giving 16 possible values in all. The nibble was the natural data unit of early 4-bit processors such as the Intel 4004.
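
The following short Python sketch illustrates the full range of a nibble and one common use, splitting a byte into its high and low halves:

    # A nibble is 4 bits, so it holds 2**4 = 16 values: 0000 (0) to 1111 (15).
    for value in range(16):
        print(value, format(value, '04b'))

    # A common use: splitting one byte into its high and low nibbles.
    byte = 0xA7          # 1010 0111 in binary
    high = byte >> 4     # 1010 -> 10
    low = byte & 0x0F    # 0111 -> 7
    print(high, low)     # 10 7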

5.2.3. Byte

The term byte was coined by Dr. Werner Buchholz of IBM in 1956. A byte is composed of eight bits and is the unit most computers use to represent a character such as a letter, number, or typographic symbol (for example, "g", "5", or "?"). Computer storage is usually measured in byte multiples; for example, a 520 MB hard drive holds a nominal 520 million bytes, or megabytes, of data.
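
A brief Python sketch shows how a single character maps onto the eight bits of a byte:

    # One byte (8 bits) encodes a single character such as "g", "5", or "?".
    for ch in ("g", "5", "?"):
        code = ord(ch)                        # the character's numeric code
        print(ch, code, format(code, '08b'))  # shown as an 8-bit pattern

    # len() of a bytes object counts bytes directly.
    data = "g5?".encode("ascii")
    print(len(data))                          # 3 characters -> 3 bytes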

5.2.4. Word

Bytes are combined into groups of one to eight bytes called words. The word size a computer’s central processing unit (CPU) uses depends on the bit-processing ability of the CPU. A 32-bit processor, for example, works with words that are four bytes (32 bits) long.
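
As a rough illustration, the Python sketch below treats a 32-bit word as four bytes; the standard struct module is used here only to display the raw bytes of one word:

    import struct

    # A 32-bit word is 4 bytes; its unsigned range is 0 to 2**32 - 1.
    WORD_BITS = 32
    print(WORD_BITS // 8)         # 4 bytes per word
    print(2 ** WORD_BITS - 1)     # 4294967295, the largest unsigned 32-bit value

    # struct packs an integer into a fixed-size word of raw bytes
    # ("<I" means a little-endian, 32-bit unsigned integer).
    word = struct.pack("<I", 0x12345678)
    print(len(word), word.hex())  # 4 78563412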

5.2.5. Kilobyte, Megabyte And Gigabyte

5.2.5.1. Kilobyte

The kilobyte is used for measuring the amount of data or information a computer can store, and is abbreviated KB. A kilobyte is 1,024 bytes, not one thousand bytes as might be expected, because computers use binary (base-two) arithmetic instead of the decimal (base-ten) system: 2^10 = 1,024.
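
The arithmetic can be checked with a short Python sketch:

    # In base two, 1 KB = 2**10 = 1,024 bytes and 1 MB = 2**20 bytes.
    KB = 2 ** 10
    MB = 2 ** 20
    print(KB, MB)              # 1024 1048576

    # Converting a raw byte count into kilobytes and megabytes:
    size_in_bytes = 3_145_728
    print(size_in_bytes / KB)  # 3072.0 KB
    print(size_in_bytes / MB)  # 3.0 MB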

