April 1, 2004 Doug Blank (BMC Computer Science) and Jim Marshall (Pomona College Computer Science) "A Bit about Bits"
Summary prepared by Anne Dalke. Additions, revisions, and extensions are encouraged in the Forum.
Jim and Doug sought, in this presentation, to help us all understand how information is measured in computer science, in ways that are entirely dissociable from meaning. Information, in their field, refers to numbers of bits; a bit is defined both as a binary digit (1 or 0) and as "how uncertain one is before one receives a message." Information, as they use the term, is a measure of the reduction of uncertainty. This understanding of information presumes a defined, finite input and output and a fixed relation between them; the issue, and the presumptive task of the system (as defined by Shannon, whose work was in radio transmission), is to achieve fidelity. Information, in other words, involves faithful pattern recognition, and an assumption that there are right answers. This is very different, for instance, from the "yeastiness of the Pi story," the way in which literary language aims to evoke a wide range of meanings rather than a single, unambiguous decoding.
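One way to make "information as reduction of uncertainty" concrete is Shannon's entropy formula, H = -sum(p * log2 p), which gives the average number of bits needed to identify an outcome. The short Python sketch below is our illustration, not part of the talk; the function name and example probabilities are chosen for demonstration.

    import math

    def shannon_entropy(probabilities):
        """Average uncertainty, in bits, of a distribution over outcomes."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin is maximally uncertain: 1 bit per toss.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    # A heavily biased coin resolves less uncertainty per toss.
    print(shannon_entropy([0.9, 0.1]))   # ~0.469

The fair coin comes out at exactly one bit, the biased coin at less than half a bit, which is the sense in which a more predictable message carries less information.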
Jim and Doug also suggested that we join them in considering the possibility that "the physical world needs another law" (in addition to the first, that the total amount of energy in a closed system remains constant, and the second, that the total amount of disorder in a closed system increases); what is needed, they proposed, is a third law of thermodynamics that explains "why the conservation of information is compelled to happen." Why are there local cases of decreased entropy? Is there a law of matter, energy, and information which might demonstrate the "inevitability of coagulation" in a closed system? (Is information physical?) Or is the anthropic principle in physics a "non-principle," given the likelihood that there are innumerable universes, of which ours is only one? Might we say that we already have three guiding principles available to us: "you can't win, you can't break even, and you can't get out of the game"?
Jim and Doug closed with the observations that it is not computing itself but the destruction of information that requires energy, and that perhaps only reversibility can "give meaning" to information.
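The claim that only erasing information (not computing with it) costs energy is usually stated as Landauer's bound: erasing one bit dissipates at least kT ln 2 joules. The back-of-the-envelope calculation below is our own illustration, assuming room temperature, and was not part of the presentation.

    import math

    # Landauer's bound: erasing one bit dissipates at least k*T*ln(2) joules.
    k = 1.380649e-23           # Boltzmann constant, J/K
    T = 300                    # assumed room temperature, K
    min_energy_per_bit = k * T * math.log(2)
    print(min_energy_per_bit)  # ~2.87e-21 J per erased bit

At room temperature that minimum is a few zeptojoules per bit, vastly below what current hardware dissipates, which is why the bound matters in principle rather than in practice.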
Our next meeting will be on Thursday, April 8, when Tamara Davis of the Biology Department will discuss "Genotype and Phenotype." Tamara is planning to talk about information contained (and not contained) in a genome. Some of the ideas to open up for discussion include
- what information is actually obtained by genome projects
- how we interpret the information (guesswork vs. actual knowledge)
- whether or not the sequence of the DNA is really a blueprint for development
- what kind of information can't be gained from knowledge of the DNA sequence
- the plasticity of information in the genome.
All are welcome to join in the ongoing conversation on "Information, Meaning and Noise" continuing on-line.