
April 29, 2004

Paul Grobstein (Biology and Center for Science in Society)
"Information Update: Searching for the Third Law"

Summary
Prepared by Anne Dalke
Additions, revisions, extensions are encouraged in the Forum
Participants

Paul opened his presentation by saying that, rather than "drawing together all the threads" from the semester's series, he was going to examine the way the discussion related to a technological problem that had been bothering him for a long time. He intended to show us where the problem lies, and then trace some directions that might be profitable for us to mine together.

He began by referencing an article he'd published sixteen years ago, "From the head to the heart: Some thoughts on similarities between brain function and morphogenesis, and on their significance for research methodology and biological theory," in which he noted a property that seemed to be common both to developing embryos and to nerve cells: when a part is removed, the remaining part self-organizes to do its job. Assuming that the explanation could not be idiosyncratic to either developing embryos or nerve cells, he made an effort to explain this similarity in the epigenetic process (did it entail the unfolding of something stored within? was there an intent distributed throughout the organism?) by exploring the possibility that things outside the developing organism have an impact on things inside. He asserted then that the existing definition of information (which focused on faithful communication across transmission channels) was not useful for working on this problem.

As the result of more recent thinking, both in the Working Group on Emergent Systems and in a course on "The Story of Evolution," Paul has been drawing on the first and second laws of thermodynamics to re-conceive information theory:

  • The first law says that, in any closed system, the sum total of mass and energy is constant.
  • The second law says that, although the total amount remains constant, the amount of useful energy is always declining.
  • In short, in the game of thermodynamics, you can't win and you can't come out ahead.
This claim poses a problem for biologists: what is happening in biological systems which so consistently demonstrate increases in order? Some biologists explain the discrepancy by saying that the earth is not an isolated system. Paul has been exploring the question in another direction, by suggesting that the relations between life and the second law are both intimate and dynamic. The substantial enhancement of order occurs not despite but because of the second law: because "something is falling apart" (the big bang, the sun...) order is created. The local increase in complexity (the movement, over 13.4 billion years, from quarks to people...) is an effect of the global increase in entropy. The creation of order, in other words, is "not a side branch," but directly related to the laws of thermodynamics. Although not required by the first and second laws, life and culture are compatible with them.
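The talk gave no equations, but the point that local order rides on a global entropy increase can be sketched in one line of standard thermodynamic arithmetic (the reservoir temperatures below are illustrative round numbers, not from the talk):

```python
# Minimal sketch of the second law: when heat Q flows spontaneously from a
# hot body to a cold one, the hot body loses entropy Q/T_hot and the cold
# body gains Q/T_cold. Because T_cold < T_hot, the total entropy of the
# closed system rises, even while either body on its own may become more
# "ordered" -- which is the sense in which local order is an effect of
# global falling-apart.

def entropy_change(q_joules, t_hot_k, t_cold_k):
    """Total entropy change (J/K) when heat q flows from t_hot to t_cold."""
    delta_s_hot = -q_joules / t_hot_k    # hot reservoir loses entropy
    delta_s_cold = q_joules / t_cold_k   # cold reservoir gains more
    return delta_s_hot + delta_s_cold

# 100 J flowing from roughly the sun's surface (~5800 K) to the earth (~290 K):
print(entropy_change(100.0, 5800.0, 290.0))  # positive: ~0.328 J/K
```

The sign of the result is the whole argument: it is positive for any flow from hot to cold, so the "falling apart" of the sun is exactly what pays for local increases in order here.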

So: what's all this have to do with information? The first and second laws regarding matter and energy were originally developed to explain problems that arose with the invention of the steam engine: how much work was required, and how much waste occurred, in the production of how much movement? What has since come to be increasingly recognized is the causal link between the laws of thermodynamics and the idea of information (most easily demonstrated in the thought experiment of Maxwell's Demon, in which information plays a central role in the creation of usable energy). During the past 100 years, numerous explanations have been offered for why the demon's perpetual motion machine cannot exist:

  • it takes energy for the demon to open and close the door (allowing faster moving particles to pass through to one side of the partition, slower moving particles to the other);
  • it takes energy to visualize the moving object (that is, to collect information);
  • the current best explanation is that, to keep collecting information, you have to throw information away--and erasing information carries an unavoidable energy cost.
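That last bullet is usually stated quantitatively as Landauer's principle: erasing one bit of information dissipates at least k_B·T·ln(2) of energy. The formula and constant are standard physics rather than part of the talk itself, but they make "throwing information away costs something" concrete:

```python
import math

# Landauer's principle: the minimum energy dissipated when one bit of
# information is erased at absolute temperature T is k_B * T * ln(2).
# This is the modern resolution of Maxwell's Demon: the demon's memory
# must eventually be erased, and that erasure pays back the energy gained.

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit(temperature_k):
    """Minimum energy in joules to erase one bit at the given temperature."""
    return K_B * temperature_k * math.log(2)

# At room temperature (~300 K) the cost per erased bit is tiny but nonzero:
print(landauer_limit(300.0))  # ~2.87e-21 J
```

Tiny as that number is, it is strictly positive at any nonzero temperature, which is why there is "no way to manipulate information without energy."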
What is important here--if as yet not entirely understood--is the necessary relation between information and the first and second laws. We know, both from Darwin's theory of evolution and our own activities, that it costs us to control information: it takes electricity, which is not free, to run our computers; there is no way to manipulate information without energy. What is perhaps missing from the laws of thermodynamics, then, is a third law--one of information.

According to Doug Blank and Jim Marshall's presentation on A Bit About Bits, "information in a closed system tends to coagulate" (to use a technical term, it "mushes up"); that is, order increases. In Paul's terms, although the total information in a closed system remains constant (which leads indirectly to "mushing up"), it costs to make use of that information. As Paul began to sketch a path towards a more usable theory (law?) of information, he was asked to take into consideration the fact that other planets experience the falling apart of the sun without any significant correlation to the creation of order. So far as we know at the moment, the local creation of order is not universal, and we need to find a way to account for such local variation.

In this initial attempt to offer a "third law," Paul was taking very seriously its (not simple) relation to degrees of probability, following an intuitive feeling that, if a new law exists, it will turn out to be related to organized matter and energy--and increased order is an improbable state. We need to conceive of information not as a separate entity ("word," plan, design) but as the organization of matter and energy. For our purposes, randomly distributed matter and energy has the lowest possible information content. Information is, in some limited sense, not dissociable from matter and energy, although one can express information using different forms of organization (Beethoven's Third Symphony, for instance, is representable in numerous forms other than as a certain set of grooves on a plastic disk). The same information is representable in different sets of matter and energy, an ensemble of different states rather than any particular one.

With the assistance of Eric Raimy, Paul is also suggesting that a new theory of information must be related to--indeed, can only exist in the presence of--a decoder. Trees which fall in a forest make vibrations in the air. Sounds are the significance placed on those vibrations by a receiver, and there is no information without such an interpreter. Information, which is fundamentally dependent on a decoder, may actually be defined as that which is transformed, with some degree of predictability, by a decoder (for instance, a tape recorder) from one form of organized matter and energy to another. (So: what motivates the decoder?) Information only exists, then, if there is a decoder--but does the decoder have to exist in actuality, or only potentially? We found the distinction between potential and actual information a useful one, along with the notion that new decoders can enable us to switch from the first to the second category.
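The decoder-dependence of information can be sketched with a toy example (entirely invented, not from the talk): the same fixed arrangement of bytes yields different "information" depending on which decoder is applied to it, and none at all without one.

```python
# A single physical arrangement -- here, a fixed pair of bytes -- carries
# no determinate information by itself; what it "means" depends on the
# decoder that transforms it, with some predictability, into another form.

signal = bytes([72, 105])  # one particular organization of matter/energy

def ascii_decoder(data):
    """Interpret the bytes as ASCII text."""
    return data.decode("ascii")

def number_decoder(data):
    """Interpret the same bytes as a single big-endian integer."""
    return int.from_bytes(data, "big")

print(ascii_decoder(signal))   # 'Hi'
print(number_decoder(signal))  # 18537
```

Each decoder is predictable (the same signal always yields the same output), yet the two extract different information from identical matter and energy--which is the sense in which information is relational rather than essential.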

This seemed akin to the ways in which literary theorists discuss implied readers: there are assumptions and expectations which get reworked by future generations, and movies are interpreted differently by different cultures, etc. Tamara Davis had provided this group, a few weeks ago, with another example by suggesting that, in the absence of a decoder, DNA is not information. All this implies that information is not an essential but rather a relational characteristic, one productive of further organization (and what is meant by "further"?).

Discussion concluded with a range of observations about how complexity is best defined, not by its end state, but rather by the complications involved in "getting there"; according to a range of theories (including Kolmogorov's) complexity is most effectively measured in relation to its history. The effort to come up with a measure of how difficult it is to make something, starting with randomness, relates to the number of steps it takes in time. An implied hierarchy, based on how "hard it is to get there," is not yet part of the new theory/third law.
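Kolmogorov complexity itself is uncomputable, but compressed size is a standard practical stand-in for "how hard something is to describe," and it gives a quick feel for the distinction the discussion was drawing (the strings below are illustrative, not from the talk):

```python
import random
import zlib

# Rough proxy for Kolmogorov complexity: the length of a string's
# compressed form. A highly ordered string has a short description
# ("repeat 'ab' 500 times") and compresses far better than a
# random-looking string of the same length and alphabet.

def compressed_size(s):
    """Length in bytes of the zlib-compressed form of s."""
    return len(zlib.compress(s.encode("ascii"), level=9))

ordered = "ab" * 500  # 1000 characters, trivially describable

random.seed(0)        # fixed seed so the example is reproducible
scrambled = "".join(random.choice("ab") for _ in range(1000))

print(compressed_size(ordered) < compressed_size(scrambled))  # True
```

Note that compression measures description length rather than construction history, so it captures only part of the "hard to get there" intuition; a history-sensitive measure would be closer to what the discussion was after.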

Further conversation is invited on the on-line forum.

This was our final meeting of the Spring 2004 semester.
Tune your receivers for information regarding the fall series....

Return to Brown Bag Home Page



Director: Liz McCormack -
emccorma@brynmawr.edu | Faculty Steering Committee | Secretary: Lisa Kolonay
© 1994- , by Center for Science in Society, Bryn Mawr College and Serendip

Last Modified: Wednesday, 02-May-2018 10:51:20 CDT