Story of Evolution, Evolution of Stories
Bryn Mawr College, Spring 2004
Second Web Paper
On Serendip
Nature is a fluid coalescence of complex magnificence resulting from an algorithmic mastery of simplicity. It is no doubt an awe-inspiring entity, one that evokes both great curiosity and bafflement in those who attempt to account for its existence and splendor. It is often seen as overly reductionistic, if not "dangerous," to try to condense the (mindless?) brilliance of nature through any sort of mechanistic or logical means. And here we are faced with what Daniel Dennett calls Darwin's dangerous idea: "that all the fruits of evolution can be explained as the products of an algorithmic process" (Dennett, 1995, p. 60). It is no surprise that this idea might present a problem for the Homo sapiens ego, as it jeopardizes our egocentric concept of "natural" superiority and fails to satisfy our almost insatiable need to directly account for the expansiveness of the world around us. That is, for many of us it is somehow pessimistic, if not fatalistic, to be satisfied with the idea that we are products of nothing more than a mindless mechanical process (what a dangerous idea this is!) (Dennett, 1995, p. 60). The question then inevitably arises: is Darwin's theory of natural selection really "powerful enough" to account for all of the world's design work (i.e., the time, energy, and development needed to produce a complex outcome)? (Dennett, 1995) The answer is yes, but only after nature has been unraveled in terms of an algorithmic design, and only after the many misunderstandings of Darwin's fundamental ideas have been rectified.
If we are to discuss nature and natural selection as an algorithmic process, we must first define what is meant by an algorithm. An algorithm is "a certain sort of formal process that can be counted on-logically-to yield a certain sort of result whenever it is 'run' or instantiated" (Dennett, 1995, p. 50). In other words, if the compulsory steps of an algorithmic process are executed without error or deviation, a predetermined outcome will result without exception. In this way, an algorithm can be seen as a "foolproof and somehow mechanical procedure: a recipe of sorts" (Dennett, 1995, p. 51). Characterizing nature as an algorithmic process defined in this extremely deductive and causal manner can easily be seen as problematic. However (and here the unraveling begins), it must be understood that this definition does not capture the fact that both chance and randomness may be integrated into, and allowed for by, an algorithm. Dennett uses the example of long division:
"Because most mathematical discussions of algorithms focus on their guaranteed or mathematically provable powers, people sometimes make the elementary mistake of thinking that a process that makes use of chance or randomness is not an algorithm" (Dennett, 1995, p. 52).
But long division remains a demonstrable algorithm; it is simply an algorithm (like natural selection) that allows for randomness. In other words, when faced with a difficult long division problem, the inevitable result is found only after trial numbers are chosen and tested, sometimes at random. The algorithm is foolproof, but the time it takes and the path by which it reaches its result are variable, owing to randomness and chance as well as to attributes inherent in the variability of the substrate/animal with which the algorithm is concerned (e.g., the proficiency with which an individual chooses numbers to test bears not on the eventual outcome, but on the way in which that outcome is reached).
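To make this concrete, consider a minimal sketch in Python (my illustration, not Dennett's) of such a randomized-but-guaranteed procedure: the trial values are chosen at random, like guessed digits in long division, but an exact check guarantees the same correct answer on every run; only the path and running time vary.

    import random

    def randomized_divide(dividend, divisor):
        """Divide by guess-and-check, as with trial digits in long division.
        The guesses are random, but the test is exact, so the result is
        guaranteed; only how long it takes varies from run to run."""
        assert dividend >= 0 and divisor > 0
        while True:
            q = random.randint(0, dividend)   # random trial quotient
            r = dividend - q * divisor
            if 0 <= r < divisor:              # foolproof correctness check
                return q, r

    print(randomized_divide(1037, 12))        # always prints (86, 5)

Randomness here affects how the answer is found, never what the answer is: exactly the distinction the definition above obscures.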
Taken in terms of natural selection and evolution, catastrophic events can certainly have a substantial impact on a species' population, in that such interruptions often affect the results of competition within that species. In other words, there exists the possibility (chance) that any event (at random) might affect a population and thereby bias the results of competition within it. Chance and randomness might thus undercut a naive "survival of the fittest" and instead allow otherwise less-adept members of the species greater reproductive success, and so a greater genetic impact on subsequent generations. That events can alter, or in part determine, which members of a species are more likely to survive and reproduce means that the specific outcome of the algorithmic process (e.g., which particular genes are passed on to the next generation) is not fixed; only the inherent nature of the causal procedure is. The algorithmic process of natural selection provides no mold by which to predict the most viable members of a species, only the guarantee that the best-adapted members of a species will necessarily correspond to the most viable.
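A hypothetical toy simulation (again my sketch, not the paper's or Dennett's) shows the point: the selection procedure below never changes, yet random "catastrophes" help decide which lineages persist, so the particular winners differ from run to run even though the procedure is fixed.

    import random

    def generation(population, catastrophe_rate=0.1):
        """One round of selection. Fitter individuals tend to leave more
        offspring, but a chance catastrophe can strike any individual,
        regardless of fitness."""
        survivors = [ind for ind in population
                     if random.random() > catastrophe_rate]
        if not survivors:
            return []
        # Fitness-weighted reproduction: the causal procedure is fixed,
        # but which genes get passed on is not.
        return random.choices(survivors,
                              weights=[f for _, f in survivors],
                              k=len(population))

    pop = [(i, random.uniform(0.5, 1.5)) for i in range(20)]  # (id, fitness)
    for _ in range(10):
        pop = generation(pop)
    print(sorted({i for i, _ in pop}))   # surviving lineages vary per run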
We now return to the question of how the complexity and intricacy of nature can be accounted for by an algorithmic process constituted by an inundation of simple steps. William Paley's well-known watchmaker analogy is particularly useful here, in that it illustrates the problem of complexity of design (i.e., of a watch) independent of a purposeful designer (i.e., a watchmaker). If a watch represents a complexity of design that could not possibly have arisen out of purposeless chance or randomness alone, it must therefore have required a significant amount of "design work" (i.e., work done) (Dennett, 1995, p. 68). And if complexity of design presupposes a great amount of design work, we must then ask who or what is responsible for that work being done. One answer popular amongst creationists is that nature's incredibly complex design is the result of a purposeful designer/creator, or more specifically, God: "before Darwin, the only model we had of a process by which this sort of (R-and-D) work could be done was an Intelligent Artificer" (Dennett, 1995, p. 68).
When discussing the ways in which natural selection can account for nature's complex design, and therefore all of its presupposed design work, we come across a common misunderstanding of Darwinian thought. The mistake is frequently made that evolution by natural selection is a process for producing certain results. Our own egocentric exploration of the evolutionary process, tempting as it may be, often misreads Darwinian processes as synonymous with progress, culminating of course with our own sophisticated arrival on the scene. This is a misunderstanding of the characteristics of evolution as well as of algorithms (not to mention of evolution as an algorithmic process). Algorithms, like evolution by natural selection, "don't have to have points or purposes... they just do what they do..." (Dennett, 1995, p. 56). It therefore stands that the world's design work did not unavoidably result from a purposeful design or designer (whether God, or evolution aimed at any particular end). So what did it result from? It resulted from Darwin's natural selection as an algorithmic process, accounted for by the Principle of Accumulation of Design.
The "Principle of Accumulation of Design" refers to the fact that the complexity of design work found in nature can be accounted for, not by a definite design process performed by a designer, but by "a different sort of process that distributed that work over huge amounts of time, by thriftily conserving the design work that had been accomplished at each stage, so that it didn't have to be done over again" (Dennett, 1995, p. 68). This idea of distributed design work is certainly in line with nature's slow advancement in complexity and "order of organisms" (Dennett, 1995, p. 69). Furthermore, the Principle of Accumulation of Design applies not to work done by a single unifying algorithmic process, but to the work done by a "large class of related algorithms," the conglomeration of which is responsible for the complexity found in nature today (Dennett, 1995, p. 51).
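The conserving of accomplished work can be made concrete with a small cumulative-selection sketch in the spirit of Richard Dawkins' well-known "weasel" program (my illustration, layered onto the text rather than drawn from Dennett): because each generation keeps its best result, letters already matched never have to be re-won, and design accumulates instead of being redone from scratch.

    import random
    import string

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    CHARS = string.ascii_uppercase + " "

    def mutate(parent, rate=0.05):
        """Copy the parent, occasionally miscopying a character."""
        return "".join(random.choice(CHARS) if random.random() < rate else c
                       for c in parent)

    def score(phrase):
        return sum(a == b for a, b in zip(phrase, TARGET))

    phrase = "".join(random.choice(CHARS) for _ in TARGET)
    generations = 0
    while phrase != TARGET:
        generations += 1
        # Conserve accomplished work: keep the best of parent and offspring,
        # so matches already achieved are never thrown away.
        phrase = max([phrase] + [mutate(phrase) for _ in range(100)], key=score)
    print(generations, phrase)

Random mutation alone would almost never produce the target phrase; it is the thrifty conservation of each stage's gains that lets an unguided process accumulate apparent design quickly.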
CLICK "SUBMIT YOUR PAPER" BELOW.
| Course Home Page
| Forum
| Science in Culture
| Serendip Home |