This nOde last updated December 1st, 2001 and is permanently morphing...
(9 Ik (Wind) / 0 Mak - 188.8.131.52.2)
- _Tools For Thought_ by Howard Rheingold
Bergson: _Matter and Memory_
from the liner notes for track _Bitstream_ by ClockDVA off of _Man-Amplified_ CD on Contempo (1992)
Information Theory, theory concerned with the mathematical laws governing the transmission, reception, and processing of information. More specifically, information theory deals with the numerical measurement of information, the representation of information (such as encoding), and the capacity of communication systems to transmit, receive, and process information. Encoding can refer to the transformation of speech or images into electric or electromagnetic signals, or to the encoding of messages to ensure privacy. Information theory was first developed in 1948 by the American electrical engineer Claude E. Shannon. The need for a theoretical basis for communication technology arose from the increasing complexity and crowding of communication channels such as telephone and teletype networks and radio communication systems. Information theory also encompasses all other forms of information transmission and storage, including television and the electrical pulses transmitted in computers and in magnetic and optical data recording.
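
(A minimal sketch, in Python, of the "numerical measurement of information" mentioned above: estimating the entropy of a message, in bits per symbol, from its symbol frequencies. The function name and the sample strings are illustrative choices, not taken from any source quoted here.)

# Estimate Shannon entropy H = -sum(p * log2(p)) from symbol frequencies.
from collections import Counter
from math import log2

def entropy_bits_per_symbol(message: str) -> float:
    """Bits per symbol implied by the frequencies of the symbols in the message."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A highly repetitive (redundant) message carries fewer bits per symbol
# than one whose symbols are spread evenly.
print(entropy_bits_per_symbol("aaaaaaab"))   # ~0.54 bits per symbol
print(entropy_bits_per_symbol("abcdefgh"))   # 3.0 bits per symbol
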
When a message is transmitted through a channel, or medium, such as a wire or the atmosphere, it becomes susceptible to interference from many sources, which distorts and degrades the signals. Two of the major concerns of information theory are the reduction of noise-induced errors in communication systems and the efficient use of total channel capacity.

Efficient transmission and storage of information require the reduction of the number of bits used for encoding. This is possible when processing English texts because letters are far from being completely random. The probability is extremely high, for example, that the letter following the sequence of letters informatio is an n. This redundancy enables a person to understand messages in which vowels are missing, for example, or to decipher unclear handwriting. In modern communications systems, artificial redundancy is added to the encoding of messages in order to reduce errors in message transmission.
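
(A minimal sketch of the "artificial redundancy" just described: a threefold repetition code sent through a noisy binary channel and decoded by majority vote. The 5% flip probability and the message length are arbitrary assumptions, chosen only for illustration.)

import random

def encode(bits):
    """Add redundancy: repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_prob=0.05):
    """Flip each transmitted bit independently with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

message = [random.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message)))
errors = sum(m != r for m, r in zip(message, received))
print(f"residual errors after majority decoding: {errors} of {len(message)}")

The code spends three transmitted bits for every message bit, trading channel capacity for a lower error rate - the same trade the passage describes.
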
When a system can take in information from the environment, it can autocorrect errors and increase self-organization.
Along with the technological expansion of the Information Age, the 20th century has also seen an expansion in our understanding of the nature of information.
Through theoreticians such as Claude Shannon, humanity has begun to understand the fundamental relationship that appears to exist between language, information, energy, and entropy. A "physics of information" has begun to develop which suggests that information relationships are as important as material, causal ones mediated in space and time.
Some cosmologists now look at the cosmos as a system of various kinds of information-processing, perhaps even an "infoverse."
Thus, the Information Age marks a change in our worldview, as well as our technology. The mechanistic view of the Industrial Era is giving way to something new.
A chess-playing program was also interesting because it was a relative of the kind of informational entities known as automata that John von Neumann and Turing had been toying with. Once again, like Turing's universal machines, these automata were theoretical devices that did not exist at that time, but were possible to build, in principle. For years, Shannon experimented with almost absurdly simple homemade versions--mechanical mice that were able to navigate simple mazes.
- Howard Rheingold - _Tools For Thought_
Through copying itself, DNA sends its instruction code through messenger RNA, which delivers the message to factories in the cell, which then copy the code into a sequence of amino acids. This linear string of amino acids then literally folds into three-dimensional proteins. These proteins are the building blocks for the life of the world.
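
(A toy sketch of the coding step described above: an mRNA string read one three-letter codon at a time and mapped to a chain of amino acids. Only a small, hand-picked subset of the standard genetic code is included here, purely for illustration.)

# A few entries from the standard genetic code (mRNA codon -> amino acid).
CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "UUC": "Phe", "AAA": "Lys", "AAG": "Lys",
    "GGU": "Gly", "GGC": "Gly", "GCU": "Ala", "GAA": "Glu",
    "UAA": "STOP", "UAG": "STOP", "UGA": "STOP",
}

def translate(mrna: str) -> list:
    """Read codons from the start of the string until a stop codon or the end."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TABLE.get(mrna[i:i + 3], "???")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("AUGUUUGGUAAAUAA"))   # ['Met', 'Phe', 'Gly', 'Lys']
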
Some DNA messages code for specific proteins, while others concern structural commands and other information: how to edit and arrange the information, when to start and stop, etc. And 97% of the original DNA seems to be junk, random noise that means and produces nothing. Recently, in order to retrieve the meaningful "words" buried in the babble of "AGGCAGCTTGA...", some scientists began using linguistic techniques originally developed to decipher ancient languages, many of which, like DNA, are written without spaces between words. These techniques involve different kinds of statistical probability analysis developed out of information theory.
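
(A rough sketch of the kind of statistical analysis mentioned above: recovering the most probable way to cut a spaceless string into known "words", using a simple unigram model and dynamic programming. The vocabulary and its probabilities are invented for this example; real genomic work uses far richer statistics.)

from functools import lru_cache
from math import log2

# Made-up "words" and probabilities, purely for illustration.
VOCAB = {"AGG": 0.4, "CAG": 0.3, "CTT": 0.2, "GA": 0.1}

def best_segmentation(text: str):
    """Try every cut point, keeping the segmentation with the highest total log-probability."""
    @lru_cache(maxsize=None)
    def solve(i):
        if i == len(text):
            return 0.0, []
        best = (float("-inf"), None)
        for j in range(i + 1, len(text) + 1):
            word = text[i:j]
            if word in VOCAB:
                score, rest = solve(j)
                if rest is not None and log2(VOCAB[word]) + score > best[0]:
                    best = (log2(VOCAB[word]) + score, [word] + rest)
        return best
    return solve(0)[1]

print(best_segmentation("AGGCAGCTTGA"))   # ['AGG', 'CAG', 'CTT', 'GA']
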
Old Claude is really grinning at this point, but he has one more trick up his sleeve. At first scientists thought any given strand of DNA contained only one sequence of instructions. But in _Grammatical Man_, Campbell writes about a virus which mystified scientists because its DNA was much too short to code for all the proteins that it was commanding the host cell to produce. Scientists then discovered that the viral DNA superimposed different instruction sequences on top of one another, so that its sequence would have different meanings depending on where you started reading the letters. As the kabbalists reasoned long ago, living language is a language of layered meanings.
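
(A small sketch of the layered readings described above: the same string of bases split into codons in three different reading frames, each frame yielding a different message. The sample sequence is invented; the only point is that the meaning depends on where you start reading.)

def codons(seq: str, frame: int):
    """Split seq into three-letter codons, starting at offset 0, 1, or 2."""
    return [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]

SEQ = "AUGGCUUACGAU"
for frame in range(3):
    print(frame, codons(SEQ, frame))
# 0 ['AUG', 'GCU', 'UAC', 'GAU']
# 1 ['UGG', 'CUU', 'ACG']
# 2 ['GGC', 'UUA', 'CGA']
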
- Erik Davis - _Tongues of Fire, Whirlwinds of Noise_
When everyone talks louder, no one can hear very well. Today, the favored regions at the bottom of the spectrum are so full of spectrum-hogging radios, pagers, phones, television, long-distance, point-to-point, aerospace, and other uses that heavy-breathing experts speak of running out of "air."
Anticipating this predicament, Claude Shannon offered a new paradigm, redefining the relationship of power, noise, and information. He showed that a flow of signals conveys information only to the extent that it provides unexpected data - only to the extent that it adds to what you already know. Shannon termed this information content "entropy." In a digital message, another name for a stream of unexpected bits is random noise, termed Gaussian, or white, noise. The more a transmission resembles this form of noise, the more information it can hold, as long as it is modulated to a regular carrier frequency. In the esoteric language of Shannon, you need a low entropy carrier to bear a high entropy message.
- George Gilder - _Telecosm_
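
(A hedged sketch of the measure Gilder is summarizing above: a received symbol carries -log2(p) bits of surprise, so a nearly certain symbol adds almost nothing to what you already know, while a maximally unpredictable, noise-like stream carries the most information per symbol.)

from math import log2

def surprisal_bits(p: float) -> float:
    """Bits of information carried by an event of probability p."""
    return -log2(p)

print(surprisal_bits(0.999))   # ~0.001 bits: an almost-expected symbol tells you little
print(surprisal_bits(0.5))     # 1.0 bit: a fair coin flip
print(surprisal_bits(1 / 26))  # ~4.7 bits: one letter drawn uniformly from 26
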