there's a sale on? The scientific concept of 'information' is a measure of how much message you're sending. In human affairs, it seems to be a fairly universal principle that for any given medium, longer messages cost more than short ones. At the back of the human mind, then, lurks a deep-seated belief that messages can be quantified: they have a size. The size of a message tells you 'how much information' it contains.
Is 'information' the same as 'story'? No. A story does convey information, but that's probably the least interesting thing about stories. Most information doesn't constitute a story. Think of a telephone directory: lots of information, strong cast, but a bit weak on narrative. What counts in a story is its meaning. And that's a very different concept from information.
We are proud that we live in the Information Age. We do, and that's the trouble. If we ever get to the Meaning Age, we'll finally understand where we went wrong.
Information is not a thing, but a concept. However, the human tendency to reify concepts into things has led many scientists to treat information as if it were genuinely real. And some physicists are starting to wonder whether the universe, too, might be made from information.
How did this viewpoint come about, and how sensible is it?
Humanity acquired the ability to quantify information in 1948, when the mathematician-turned-engineer Claude Shannon found a way to define how much information is contained in a message (he preferred the term 'signal') sent from a transmitter to a receiver using some kind of code. By a signal, Shannon meant a series of binary digits ('bits', 0 and 1) of the kind that is ubiquitous in modern computers and communication devices, and in Murray's semaphore. By a code, he meant a specific procedure that transforms an original signal into another one. The simplest code is the trivial 'leave it alone'; more sophisticated codes can be used to detect or even correct transmission errors. In engineering applications, codes are a central issue, but for our purposes here we can ignore them and assume the message is sent 'in plain'.
Shannon's information measure puts a number to the extent to which our uncertainty about the bits that make up a signal is reduced by what we receive. In the simplest case, where the message is a string of 0s and 1s and every choice is equally likely, the amount of information in a message is entirely straightforward: it is the total number of binary digits. Each digit that we receive reduces our uncertainty about that particular digit (is it 0 or 1?) to certainty ('it's a 1', say) but tells us nothing about the others, so we have received one bit of information. Do this a thousand times and we have received a thousand bits of information. Easy.
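To see this count in action, here is a minimal Python sketch (ours, not Shannon's; the function name is invented). It assumes each digit is equally likely a priori, so, as in the text, every received digit is worth exactly one bit:

```python
import math

def bits_of_information(message: str, p_one: float = 0.5) -> float:
    # Shannon's measure for a binary string: a received digit whose
    # prior probability was p resolves -log2(p) bits of uncertainty.
    # With 0 and 1 equally likely (p = 0.5), each digit is worth
    # exactly one bit, so the total is simply the message length.
    total = 0.0
    for digit in message:
        p = p_one if digit == "1" else 1.0 - p_one
        total += -math.log2(p)
    return total

print(bits_of_information("1" * 1000))  # 1000.0: a thousand digits, a thousand bits
```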
The point of view here is that of a communications engineer, and the unstated assumption is that we are interested in the bit-by-bit content of the signal, not in its meaning. So the message 111111111111111 contains 15 bits of information, and so does the message 111001101101011. But Shannon's concept of information is not the only possible one. More recently, Gregory Chaitin has pointed out that you can quantify the extent to which a signal contains patterns. The way to do this is to focus not on the size of the message, but on the size of a computer program, or algorithm, that can generate it. For instance, the first of the above messages can be created by the algorithm 'every digit is a 1'. But there is no simple way to describe the second message, other than to write it down bit by bit. So these two messages have the same Shannon information content, but from Chaitin's point of view the second contains far more 'algorithmic information' than the first.
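Chaitin's measure (the length of the shortest program that generates a message) cannot be computed exactly in general, but an off-the-shelf compressor gives a rough proxy for it. The sketch below, again Python and again with invented names, compares a patterned and a patternless message of equal length; only the patterned one collapses to a short description:

```python
import random
import zlib

def compressed_size(message: str) -> int:
    # Length in bytes of the zlib-compressed message: a crude upper
    # bound on the size of the shortest "program" that regenerates it.
    return len(zlib.compress(message.encode("ascii"), 9))

patterned = "1" * 15000  # generated by the rule "every digit is a 1"

random.seed(0)  # patternless stand-in: coin-flip digits
patternless = "".join(random.choice("01") for _ in range(15000))

print(compressed_size(patterned))    # a few dozen bytes
print(compressed_size(patternless))  # on the order of two thousand bytes
```

The random string still shrinks somewhat, because eight-bit characters are carrying one-bit symbols, but nothing like the patterned one: the two kinds of message have the same Shannon content per digit, yet very different algorithmic content.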
Another way to say this is that Chaitin's concept focuses on the extent to which the message is 'compressible'. If a short program can generate a long message, then we can transmit the program instead of the message and save time and money. Such a program 'compresses' the message. When your computer takes a big graphics file (a photograph, say) and turns it into a much smaller file in JPEG format, it has used a standard algorithm to compress the information in the original file. This is possible because photographs contain