Throughout this column, we have relied on the idea of randomness to seed all of our
unconventional computing experiments. In this month’s article, we will take a brief
detour from code and hardware to examine what the concept of "random"
actually means, how our microcontroller implements it, how this differs
from a desktop computer, and some schemes for creating "true" random
number generators.
First, a little history. The concept of chance appears in
most ancient cultures, dating back at least as far as
the ancient Egyptians' use of the "talus," an early
predecessor to the die. Fashioned from the knuckle or heel
bone of a hoofed animal, the talus had four possible
outcomes. The talus has been found alongside tomb illustrations
and scoreboards at Egyptian sites, lending support to the
idea that playing dice and gambling were popular pastimes.
In the words of Ian Hacking, a historian of probability,
"It is hard to find a place where people use no randomizers,
yet theories of frequency, betting, randomness, and
probability appear only recently. No one knows why.”
An intriguing mystery! And indeed it is not until 1654
that what we know as probability today began to take
shape. French nobleman Chevalier de Mere had a gambling
problem. He wanted to know if it would be profitable to
bet that double-sixes would appear at least once in a set of
24 dice throws. Luckily for him, he was a friend of one of
the most brilliant mathematicians of his day, Blaise Pascal.
De Mere posed the problem to Pascal and the theory of
probability emerged from the ensuing correspondence
between Pascal and his good friend Pierre de Fermat,
another brilliant mathematician.
What Fermat and Pascal realized is what we all learned
in grammar school — that with each throw of the dice,
each possible outcome is equally likely. The probability of
throwing a six is therefore 1/6. They further realized that
the probabilities of independent events could be multiplied,
so the probability of throwing two sixes on a single throw is
1/6 x 1/6 = 1/36. The chance of missing double-sixes on any
one throw is therefore 35/36, and the chance of missing on all
24 throws is 35/36 x 35/36 x 35/36 ... 24 times, or about
0.5086. The probability of at least one double-six in 24
throws is thus 1 - 0.5086 = 0.4914, slightly worse than even
odds. So it was not advisable for Chevalier de Mere to bet on
throwing two sixes within 24 throws after all, and probability
theory was born!
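If you would like to check de Mere's odds yourself, the arithmetic takes only a few lines. (This is a standalone Python sketch for verification, not code from this column's microcontroller projects.)

```python
# De Mere's wager: chance of at least one double-six in 24 throws of two dice.
p_double_six = 1 / 36          # probability of double-six on a single throw
p_miss_all = (35 / 36) ** 24   # probability of missing on all 24 throws
p_win = 1 - p_miss_all         # probability of at least one double-six

print(round(p_win, 4))         # → 0.4914, just under even odds
```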
So what does any of this probability stuff have to do
with random number generation?
Games of chance are, as Hacking called them
“randomizers.” They are a way of abdicating responsibility
for decision-making to the world of probabilities. In turn,
results returned from such activities are random outcomes.
At any given moment of the game, you cannot predict with
certainty which number will appear next. And a set of random outcomes produces a sequence of random numbers
containing no repeating patterns — a sequence which
Algorithmic Information Theory would call incompressible.
AIT and Compressibility
As Gregory Chaitin, founder of algorithmic information
theory, describes it, compression occurs if data can be reproduced by a
computer program with fewer bits than the original data contains.
than the original data contains. In other words, if I have a
data set of 500 one-byte measurements, the data contains
500 x 8 = 4,000 bits. If I can find a repeating pattern in that
data, I can simplify it by creating a simpler symbol to represent
each repeating subset of the sequence. I can then code an
algorithm to process the input string and replace the
symbols based on a key. In this way, I can shrink the size of
the data itself without losing any of the original information.
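As a toy demonstration of this idea, we can compare how well a general-purpose compressor does on patterned versus random data. (This Python sketch uses the standard zlib library as the compressor, an assumption for illustration; it is not part of the article's own code.)

```python
import os
import zlib

# 500 bytes with an obvious repeating pattern -- highly compressible
patterned = b"ABAB" * 125

# 500 bytes from the operating system's entropy source -- essentially random
random_bytes = os.urandom(500)

# The patterned data shrinks to a handful of bytes; the random data
# stays at roughly its original size (or slightly larger, due to overhead).
print(len(zlib.compress(patterned)))
print(len(zlib.compress(random_bytes)))
```

A truly random sequence has no repeating subsets to replace with shorter symbols, so no compressor can make it meaningfully smaller.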
SERVO 07.2008 67