

Claude Shannon, 1948
This nOde last updated December 1st, 2001 and is permanently morphing...
(9 Ik (Wind) / 0 Mak - 12.19.8.14.2)

fusion telex
Claude Shannon, a lone-wolf genius, is still known to his neighbors in Cambridge, Massachusetts, for his skill at riding a motorcycle. In 1937, as a twenty-one-year-old graduate student, he showed that Boole's logical algebra was the perfect tool for analyzing the complex networks of switching circuits used in telephone systems and, later, in computers. During the war and afterward, Shannon established the mathematical foundation of information theory. Together with cybernetics, this collection of theorems about information and communication created a new way to understand people and machines--and established information as a cosmic fundamental, along with energy and matter.

- _Tools For Thought_ by Howard Rheingold 

the Network


fusion telex
Claude E. Shannon, father of information theory, wrote _The Mathematical Theory of Communication_, one of the greatest works in the annals of technological thought. It showed how an algebra invented in the mid-1800s by the British mathematician George Boole (Boolean algebra) could represent the workings of switches and relays in electronic circuits; its implications were profound. He defined the overall potential for information in a system of messages as its entropy, a term which in thermodynamics denotes the randomness of a system. Shannon also defined the basic unit of information, which came to be called a "bit". Information could then be encoded as bits, and code compresses information into its most compact form. Shannon's ideas were almost too prescient to have an immediate impact: vacuum tube circuits simply could not calculate the complex codes needed to approach the Shannon limit. Not until the early 1970s, with the advent of high-speed integrated circuits, did engineers begin to fully exploit information theory. Today Shannon's insights have shaped virtually all systems that store, process, or transmit information in digital form. This obviously applies to the applications above, but science and computer technology are also returning to the much older concept of connectionism. "Does not the fiction of an isolated object imply a kind of absurdity, since this object borrows its physical properties from the relations which it maintains with all others and owes each of its determinations, and consequently its very existence, to the place which it occupies in the universe as a whole?"

- Bergson, _Matter and Memory_ (1910)
from the liner notes for the track _Bitstream_ by ClockDVA, off of _Man-Amplified_ CD on Contempo (1992)
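The entropy the liner notes mention is easy to state concretely. A minimal sketch in Python (the function name and the test strings are ours, not Shannon's): H = -sum(p * log2(p)) over the symbol frequencies of a message, measured in bits per symbol.

from collections import Counter
from math import log2

def entropy_bits(message: str) -> float:
    """Average information per symbol, in bits: Shannon's H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(entropy_bits("abababab"))  # 1.0 bit/symbol: two equally likely symbols
print(entropy_bits("aaaaaaab"))  # ~0.54 bits/symbol: highly predictable, so it compresses

The more predictable the message, the lower the figure, and the more compact the code that can carry it.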



fusion telex
Information Theory

Information theory is the theory concerned with the mathematical laws governing the transmission, reception, and processing of information. More specifically, information theory deals with the numerical measurement of information, the representation of information (such as encoding), and the capacity of communication systems to transmit, receive, and process information. Encoding can refer to the transformation of speech or images into electric or electromagnetic signals, or to the encoding of messages to ensure privacy. Information theory was first developed in 1948 by the American electrical engineer Claude E. Shannon. The need for a theoretical basis for communication technology arose from the increasing complexity and crowding of communication channels such as telephone and teletype networks and radio communication systems. Information theory also encompasses all other forms of information transmission and storage, including television and the electrical pulses transmitted in computers and in magnetic and optical data recording.

When a message is transmitted through a channel, or medium, such as a wire or the atmosphere, it becomes susceptible to interference from many sources, which distorts and degrades the signals. Two of the major concerns of information theory are the reduction of noise-induced errors in communication systems and the efficient use of total channel capacity. Efficient transmission and storage of information require the reduction of the number of bits used for encoding. This is possible when processing English texts because letters are far from being completely random. The probability is extremely high, for example, that the letter following the sequence of letters informatio is an n. This redundancy enables a person to understand messages in which vowels are missing, for example, or to decipher unclear handwriting. In modern communications systems, artificial redundancy is added to the encoding of messages in order to reduce errors in message transmission.
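That last idea - adding artificial redundancy to beat noise - can be sketched with the simplest error-correcting scheme there is. A hedged Python sketch (the (3,1) repetition code is a textbook example, not anything specific to Shannon's paper; the names and the toy channel are ours): each bit is sent three times, and the receiver takes a majority vote, correcting any single flipped bit per triple.

import random

def encode(bits):
    # (3,1) repetition code: artificial redundancy, three copies of each bit
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_prob=0.05):
    # flip each bit independently with a small probability
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    # a majority vote over each triple recovers the original bit
    triples = [bits[i:i + 3] for i in range(0, len(bits), 3)]
    return [1 if sum(t) >= 2 else 0 for t in triples]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(encode(message))
print(decode(received) == message)  # usually True: the redundancy absorbs the noise

The price is bandwidth - three bits sent for every one meant - which is exactly the trade between channel capacity and error rate that information theory quantifies.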



fusion telex
Shannon noted certain features (such as redundancy) which reduce errors and occur in natural information systems (such as the genome or human languages). Some of these same features are used in today's electronic communications and media technologies to preserve the signal and reduce noise. What he and cyberneticist Norbert Wiener came to realize is that the relationship between information and entropy is as direct as that between matter and energy. If a system can be described by a certain number of statements, a more ordered system requires a more complex description than a more disordered one: its information content is inversely related to its entropy. And through communication or information exchange with other systems, an open system can raise its information content and reduce its entropy.

Through feedback from the environment, it can autocorrect errors and increase self-organization.
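One way to make that inverse relation concrete - a hedged sketch, not Wiener's own formalism - is to measure a system's "order" as the gap between the maximum possible entropy and its actual entropy. In Python (the function names and example distributions are ours):

from math import log2

def entropy(p):
    # Shannon entropy in bits of a probability distribution
    return -sum(x * log2(x) for x in p if x > 0)

def negentropy(p):
    # information-as-order: how far below the maximum-entropy (uniform) case we sit
    h_max = log2(len(p))
    return h_max - entropy(p)

print(negentropy([0.25, 0.25, 0.25, 0.25]))  # 0.0: maximal disorder, no order to report
print(negentropy([0.97, 0.01, 0.01, 0.01]))  # ~1.76 bits: low entropy, high information content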
  

Norbert Wiener


fusion telex
The Techgnostic Worldview

Along with the technological expansion of the Information Age, the 20th century has also seen an expansion in our understanding of the nature of information.

Through theoreticians such as Claude Shannon, humanity has begun to understand the fundamental relationship that appears to exist between language, information, energy, and entropy. A "physics of information" has begun to develop which suggests that information relationships are as important as material, causal ones mediated in space and time.

Some cosmologists now look at the cosmos as a system of various kinds of information-processing, perhaps even an "infoverse."

Thus, the Information Age marks a change in our worldview, as well as our technology. The mechanistic view of the Industrial Era is giving way to something new.



fusion telex
Alan Turing and Claude Shannon were altogether serious in their interest in chess, because of the complexity of the game in relation to the simplicity of its rules, and because they suspected that the shortcut needed to perform this kind of time-consuming search-procedure would also be a clue to the way brains solved all sorts of problems.

A chess-playing program was also interesting because it was a relative of the kind of informational entities known as automata that John von Neumann and Turing had been toying with. Once again, like Turing's universal machines, these automata were theoretical devices that did not exist at that time, but were possible to build, in principle. For years, Shannon experimented with almost absurdly simple homemade versions--mechanical mice that were able to navigate simple mazes.

- Howard Rheingold, _Tools For Thought_
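The search procedure in question is minimax, which Shannon laid out for chess in his 1950 paper. A toy Python sketch (the tree here is hand-built, not real chess positions): score the leaf positions, then back the values up the tree on the assumption that each side picks its best reply.

def minimax(node, maximizing=True):
    # leaves are numeric position scores; interior nodes are lists of children
    if isinstance(node, (int, float)):
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# a tiny two-ply game tree: our move (max), then the opponent's (min)
tree = [[3, 5], [2, 9], [0, 7]]
print(minimax(tree))  # 3: the best outcome we can guarantee against best play

In real chess the tree is astronomically large, which is why Shannon and Turing suspected the interesting problem was the shortcut, not the search.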

Alan Turing, 1954 / John von Neumann


fusion telex
Let's imagine Claude Shannon in a heavy, dark coat with twinkling eyes, sidelocks, and a Bible opened to Ezekiel. He might say that the kabbalist's living language of code was DNA, and if we allow ourselves to ride the analogy, we might agree. DNA is an information system, with a sender-message-receiver form. DNA's basic code consists of four different nucleotides which are multiplied along the linear strand of the double helix (while YHVH contains only three letters, it also includes four units). The arrangement of these four "letters" (AGCT) produces "words" (called codons) which combine into instructions for the cell.
 
 
Kabbalah DNA

Through copying itself, DNA sends its instruction code through messenger RNA, which delivers the message to factories in the cell, which then copy the code into a sequence of amino acids. This linear string of amino acids then literally folds into three-dimensional proteins. These proteins are the building blocks for the life of the world.

Some DNA messages code for specific proteins, while others concern structural commands and other information: how to edit and arrange the information, when to start and stop, etc. And 97% of the original DNA seems to be junk, random noise that means and produces nothing. Recently, in order to retrieve the meaningful "words" buried in the babble of "AGGCAGCTTGA...", some scientists began using linguistic techniques originally developed to decipher ancient languages, many of which, like DNA, are written without spaces between words. These techniques involve different kinds of statistical probability analysis developed out of information theory.

Old Claude is really grinning at this point, but he has one more trick up his sleeve. At first scientists thought any given strand of DNA contained only one sequence of instructions. But in _Grammatical Man_, Campbell writes about a virus which mystified scientists because its DNA was much too short to code for all the proteins that it was commanding the host cell to produce. Scientists then discovered that the viral DNA superimposed different instruction sequences on top of one another, so that its sequence would have different meanings depending on where you started reading the letters. As the kabbalists reasoned long ago, living language is a language of layered meanings.

- Erik Davis, _Tongues of Fire, Whirlwinds of Noise_
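That layered-message trick is a shift in reading frame, and it can be seen directly. A small Python sketch (the sequence is the fragment quoted above; everything else is ours): parse the same string into codons from three different starting offsets and three different "sentences" fall out.

seq = "AGGCAGCTTGA"

for offset in range(3):
    # chop the strand into 3-letter codons, starting one letter later each time
    frame = seq[offset:]
    codons = [frame[i:i + 3] for i in range(0, len(frame) - 2, 3)]
    print(offset, codons)

# 0 ['AGG', 'CAG', 'CTT']
# 1 ['GGC', 'AGC', 'TTG']
# 2 ['GCA', 'GCT', 'TGA']  (TGA even happens to be a stop codon)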



fusion telex
Every age defines itself by the resource it wastes. Our agrarian forefathers wasted human time. The Victorians wasted coal and iron; the twentieth century wasted electricity. Over the past decade, the world had to learn to waste transistors. Now it needs to learn how to waste bandwidth, and begin rebuilding the world yet again.

[...]

When everyone talks louder, no one can hear very well. Today, the favored regions at the bottom of the spectrum are so full of spectrum-hogging radios, pagers, phones, television, long-distance, point-to-point, aerospace, and other uses that heavy-breathing experts speak of running out of "air."

Anticipating this predicament, Claude Shannon offered a new paradigm, redefining the relationship of power, noise, and information. He showed that a flow of signals conveys information only to the extent that it provides unexpected data - only to the extent that it adds to what you already know. Shannon termed this information content "entropy." In a digital message, another name for a stream of unexpected bits is random noise, termed Gaussian, or white, noise. The more a transmission resembles this form of noise, the more information it can hold, as long as it is modulated to a regular carrier frequency. In the esoteric language of Shannon, you need a low-entropy carrier to bear a high-entropy message.

- George Gilder, _Telecosm_
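The paradigm Gilder describes has a famous quantitative core, the Shannon-Hartley capacity theorem: C = B log2(1 + S/N) bits per second for a channel of bandwidth B and signal-to-noise ratio S/N. A minimal Python sketch (the voice-line figures are standard textbook values, not Gilder's):

from math import log2

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    # Shannon-Hartley limit: the most bits/second any coding scheme
    # can push through a channel of this bandwidth and signal-to-noise ratio
    return bandwidth_hz * log2(1 + snr_linear)

# a voice-grade phone line: roughly 3 kHz of bandwidth at about 30 dB SNR
snr = 10 ** (30 / 10)  # 30 dB expressed as a linear power ratio: 1000
print(capacity_bps(3_000, snr))  # ~29,900 bits/second

That ceiling is roughly the limit dial-up modems eventually crowded up against, and it shows why wasting bandwidth - widening B - buys more than shouting louder ever can.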
