Wednesday, December 26, 2007

Wednesday Math, Vol. 8: powers of two

A single switch has two positions, on and off. If there are two switches mounted on a wall, they have four possible combinations.

on on
on off
off on
off off

Computers are made up of millions, even billions, of tiny switches, originally known as binary digits. For a time that name was shortened to binits, but the term that eventually stuck was bits, and that became the standard word for the little switches. A set of bits clumped together is called a bitstring, though every spell checker I use tells me bitstring isn't really a word. There are four possible bitstrings of length two, which we can create by substituting 1 for on and 0 for off in the patterns above.

11
10
01
00

Every time you add a bit to the string, the number of patterns doubles. This makes the powers of two very important to computer programmers. The number of bits clumped together as the standard size in a particular design of computer is called the architecture of the machine, as in terms like 4-bit architecture or 8-bit architecture. There was no standard size in the early era, kind of like the early days of railroading. But eventually, putting eight bits together and calling that a byte became the standard smallest useful clump of bits, and larger bitstrings tended to be multiples of eight bits, like 16-bit architecture, 24-bit architecture, 32-bit architecture, etc.
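The doubling rule is easy to check for yourself. Here's a quick sketch in Python (my own illustration, not anything from the original post) that builds every bitstring of a given length and confirms there are 2 to the n of them:

```python
from itertools import product

def bitstrings(n):
    """Return every bitstring of length n as a string of 0's and 1's."""
    return ["".join(bits) for bits in product("01", repeat=n)]

# Length two gives the four patterns from the on/off list above.
print(bitstrings(2))  # ['00', '01', '10', '11']

# Adding one bit doubles the count: 2**n patterns for n bits.
for n in range(1, 9):
    assert len(bitstrings(n)) == 2 ** n

print(len(bitstrings(8)))  # 256, the number of patterns in a byte
```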

2^1 = 2
2^2 = 4
2^3 = 8
2^4 = 16
2^5 = 32
2^6 = 64
2^7 = 128
2^8 = 256
2^9 = 512
2^10 = 1,024

As we can see in the table, 2 to the 8th power is 256, so there are 256 different patterns of 0's and 1's that can be represented in the 8 bits that make up a byte. This makes a byte a good size for storing a pattern for every possible thing you can type on a keyboard: 26 lowercase letters, 26 uppercase letters, 10 digits and a few dozen different types of punctuation and special symbols. The standard way to line up each of these with a different bit pattern is called ASCII. It's pronounced ask-ee, and it was adopted as the standard way back in the computer Stone Age, which is to say the 1960s. There are 95 printable ASCII characters, so there is a lot of extra space in the ASCII table, and a lot of special characters you can't type on a standard keyboard have their own ASCII codes, just like standard letters, numbers and punctuation symbols do.
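Python exposes these codes directly through its built-in ord and chr functions, so you can poke at the ASCII table yourself (again, my own sketch, not from the post):

```python
# ord() gives a character's code; chr() goes the other way.
print(ord('A'), ord('a'), ord('0'))   # 65 97 48
print(chr(65))                        # A

# The 95 printable ASCII characters occupy codes 32 through 126.
printable = [chr(code) for code in range(32, 127)]
print(len(printable))                         # 95
print(repr(printable[0]), repr(printable[-1]))  # ' ' '~'
```

The codes below 32 are the unprintable special characters mentioned above, things like tab, line feed and the bell character.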

Those familiar with the metric system know that prefixes on the words meter, liter and gram deal with different powers of 10: kilo means 1,000, mega means 1,000,000, giga means 1,000,000,000, etc. It just so happens that 2 to the 10th power is 1,024, very nearly 1,000. In common computer parlance, a kilobyte, usually shortened to K, is 1,024 bytes, not 1,000 bytes. Likewise, a meg is 2 to the 20th power, slightly more than a million, and a gig is 2 to the 30th, larger than a true billion.
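The near-miss between the binary and decimal prefixes is easy to see with a few lines of Python (my own sketch):

```python
# Binary "kilo", "mega" and "giga" versus their decimal namesakes.
for name, bin_power, dec_power in [("kilo", 10, 3),
                                   ("mega", 20, 6),
                                   ("giga", 30, 9)]:
    binary = 2 ** bin_power
    decimal = 10 ** dec_power
    print(f"{name}: 2**{bin_power} = {binary:,} vs 10**{dec_power} = {decimal:,}")

# The gap compounds as the powers grow: a gig is about 7.4%
# larger than a true billion.
print(2 ** 30 / 10 ** 9)  # 1.073741824
```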

Now playing: David Bowie - Thru' These Architects Eyes
via FoxyTunes


dguzman said...

I understand the binary on/off thing, but I still can't for the life of me figure out how people invented computers. Amazing.

FranIAm said...


Matty Boy said...

To dg: It was the work of a lot of smart folks, including John Von Neumann, who most mathematicians put at top of list o' smart folks in the 20th Century.

To fran: What what? (Translation: be specific, please.)

FranIAm said...

Oh Matty. It was just my ironic cry for not being able to understand one thing you say here.

If I lived in your area, I swear that I would hire you as my private maths tutor. When I even hear or see anything relating to math, I just freeze.

Deep sigh.

Distributorcap said...

are you sure i wasnt in your computer class at rutgers?