# Serendip's Bookshelves

Stephen Wolfram, A New Kind of Science, Wolfram Media Inc., Champaign, Illinois, 2002

Commentary by David Berger. David, Swarthmore College 2004 (Economics major with minors in History and Math), attended the New Kind of Science 2003 Conference in Boston in July. David's interest was stimulated by conversations with Mark Kuperberg, Professor of Economics at Swarthmore, and Steve Sigur, a high school math teacher. These notes were prepared for a meeting of the Emergence Working Group at Bryn Mawr College, and are also available as a PDF file.

A New Kind of Science (NKS) is Stephen Wolfram's attempt to revolutionize the theoretical and methodological foundations of science. Though this endeavor is incredibly ambitious and thus should be approached critically, Wolfram's book is not all talk: it contains many startling and thought-provoking results and conjectures.

• enormous complexity can be generated from simple rules
• every system can be viewed as a computation: implications for philosophy and AI (visual evidence and rule 110 universality)
• simple classification of randomness and complexity
• threshold beyond which one gets no new fundamental complexity

Notion of Computation

• underlies all of NKS
• a computation is an operation that begins with some initial conditions and gives an output that follows from a definite set of rules. The most common examples are computations performed by computers, in which the fixed set of rules may be the functions provided by a particular programming language.
• classify systems according to what types of computations they can perform
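This notion of computation can be made concrete with an elementary cellular automaton, the kind of simple system the book studies throughout. The sketch below is my own illustration, not code from the book: the rule number encodes the fixed set of rules, and the cell array is the initial condition.

```python
def ca_step(cells, rule):
    """Apply one step of an elementary cellular automaton with rule number 0-255."""
    n = len(cells)
    out = []
    for i in range(n):
        # Each cell's new value is looked up from the rule number's bits,
        # indexed by the 3-cell neighborhood (with wrap-around boundaries).
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right
        out.append((rule >> index) & 1)
    return out

# Initial condition: a single black cell; the rule number is the "definite set of rules".
print(ca_step([0, 0, 0, 1, 0, 0, 0], 30))  # one step of rule 30
```

The "output" in Wolfram's sense is the state of the cells after some number of such steps.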

Classification of computation (output)

1. Almost all initial conditions lead to one homogeneous state
2. Almost all initial conditions lead to repetitive or nesting structures
3. Almost all initial conditions lead to output that looks random
4. Almost all initial conditions lead to output that has aspects that are random but with definite localized structures that interact in random ways
• Analogy: initial conditions as inputs to the system; state of system after n steps is "output" (637)
• KEY: think purely abstractly about the computation that is being performed (the output) and do not necessarily worry about how the system actually does the computation.
• allows one to compare systems with very different internal structures but very similar outputs, making this a useful classification scheme
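The four classes can be seen by evolving a few well-known elementary rules. The sketch below is my own illustration, using conventional class assignments (rule 254: class 1, rule 90: class 2 nested, rule 30: class 3 random-looking, rule 110: class 4 localized structures), and prints each evolution as text.

```python
def evolve(rule, width=31, steps=15):
    """Return the rows produced by an elementary CA started from one black cell."""
    row = [0] * width
    row[width // 2] = 1
    rows = [row]
    for _ in range(steps):
        # New cell value = the rule-number bit indexed by the 3-cell neighborhood.
        row = [(rule >> ((row[(i - 1) % width] << 2)
                         | (row[i] << 1)
                         | row[(i + 1) % width])) & 1
               for i in range(width)]
        rows.append(row)
    return rows

# One representative rule per class; '#' marks a black cell.
for rule in (254, 90, 30, 110):
    print(f"rule {rule}")
    for row in evolve(rule, steps=8):
        print("".join("#" if c else "." for c in row))
```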

Prevalence of Universality

• universal system is one where the underlying construction is fixed, but it can be made to perform different tasks by being programmed in different ways

• Main implication: a universal system is capable of emulating any other system and thus must be able to produce behavior as complex as that of any other system.

• historically, universality was thought to be rare and complicated to construct, until rule 110 was shown to be universal.

• implies that universality might be much more common than expected and indeed Wolfram demonstrates that basically every simple system he discusses in his book is universal. He believes that almost all systems of class 3 or 4 behavior will turn out to be universal.

Implications
• universality associated with all instances of general complex behavior
• a wide variety of systems (CA, mobile CA’s, substitution systems, Turing machines…), though internally different, are all computationally equivalent.
• thus studying computation at an abstract level allows one to say meaningful things about a lot of systems

• there is a threshold of complexity that is relatively low. Once this threshold is passed, nothing more is gained in a computational sense

• Because the threshold is so low, almost any system can generate an arbitrary amount of complexity. Wolfram believes the threshold lies between class 2 and class 3 behavior.

Principle of Computational Equivalence

The major assumption of the book: "all processes, whether they are produced by human effort or occur spontaneously in nature, can be viewed as computations." (715)

"it is possible to think of any process that follows definite rules as being a computation- regardless of the kinds of elements involved." (716)

- thus we can view processes in nature as computations

Therefore "almost all processes that are not obviously simple can be viewed as computations of equivalent sophistication." (717)

- this follows because Wolfram believes that all systems of class 3 and 4 behavior are universal and therefore computationally equivalent.

- implies that there is an upper limit to complexity that is relatively low

More specifically, the principle of computational equivalence says that systems found in the natural world can perform computations up to a maximal ("universal") level of computational power, and that most systems do in fact attain this maximal level of computational power. Consequently, most systems are computationally equivalent. For example, the workings of the human brain or the evolution of weather systems can, in principle, compute the same things as a computer. Computation is therefore simply a question of translating inputs and outputs from one system to another.

- if some system seems complicated then it is most likely universal

The PoCE allows for an explanation of complexity. Systems seem complex when they are of sophistication equivalent to the systems we use for perception and analysis. Thus,

"observers will tend to be computationally equivalent to the systems they observe- with the inevitable consequence that they will consider the behavior of such systems complex." (737)

Our processes of perception and analysis are of equivalent sophistication, and even the most powerful computers or our own minds must obey the PoCE. "Can these (computer and brain) be more sophisticated? Presumably they cannot, at least if we want actual results, and not just generalities." The difference is that results require definite physical processes and are therefore subject to the same limitations as any process.

Generation of Randomness

1. Externally imposed

2. Random initial conditions

3. Intrinsically generated

- Wolfram believes that the third option underlies most of the generation of complexity in the physical world because of the ease with which nature seems to do it. Now that we know that enormous complexity can be generated from very simple rules, it seems plausible that this is what nature does as well. This fits with the idea of natural selection: nature generates all possible outcomes, and the ones that are feasible survive over time.
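The center column of rule 30 is Wolfram's best-known example of intrinsically generated randomness: a fully deterministic rule producing a random-looking bit stream. The sketch below is my own implementation, with parameters chosen for illustration.

```python
def rule30_center_bits(n_bits, width=None):
    """Run rule 30 from a single black cell and collect the center-column bits."""
    # Width chosen so the pattern's edges never wrap around into the center.
    width = width or (2 * n_bits + 1)
    row = [0] * width
    row[width // 2] = 1
    bits = []
    for _ in range(n_bits):
        bits.append(row[width // 2])
        # One rule 30 step: new bit = rule-number bit indexed by the neighborhood.
        row = [(30 >> ((row[(i - 1) % width] << 2)
                       | (row[i] << 1)
                       | row[(i + 1) % width])) & 1
               for i in range(width)]
    return bits

print(rule30_center_bits(16))
```

The sequence is deterministic, yet it passes the standard statistical tests of randomness mentioned above.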

Problem of Randomness

• how can one have a deterministic emergence of randomness? Does the answer lie in the introduction of interaction between localized structures? No, because there are no localized structures to speak of.
• Randomness is colloquially defined as "lots of complicated stuff going on" and "without any patterns." Though the complexity in rule 30 fits this definition as well as the statistical, mathematical, and cryptographic definitions of randomness, it seems that there is something more to be explained.

Computational Irreducibility and the Problem of Prediction

• most major triumphs of theoretical science involved finding a shortcut that allows one to determine the outcome of the evolution of a system without explicitly following its evolution
• but for many phenomena in nature no such shortcut has ever been found. This could be a temporary state of affairs, but Wolfram believes it is a consequence of the PoCE that he calls computational irreducibility.

If one views the evolution of a system as a computation, where each step in the evolution can be thought of as taking a certain amount of computational effort, then traditional mathematics suggests that there must be a way to find a shortcut and know the outcome with less computational effort. Wolfram's examples suggest that this is not always true. Such behavior is computationally irreducible.

- This implies that even if one knows the underlying rules of a system, predicting what the system will do can still require an irreducible amount of computational work.
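The contrast between reducible and irreducible behavior can be made concrete. Rule 90 is computationally reducible: row n of its evolution is Pascal's triangle mod 2, so any row can be computed directly without simulating the rows before it. No comparable shortcut is known for rule 30. The sketch below is my own illustration of that shortcut.

```python
from math import comb

def rule90_row_simulated(n, width):
    """Reach row n the slow way: simulate all n steps (rule 90 = left XOR right)."""
    row = [0] * width
    row[width // 2] = 1
    for _ in range(n):
        row = [row[(i - 1) % width] ^ row[(i + 1) % width] for i in range(width)]
    return row

def rule90_row_shortcut(n, width):
    """Reach row n directly: cell at offset k from center is C(n, (n+k)/2) mod 2."""
    row = [0] * width
    for k in range(-n, n + 1, 2):
        row[width // 2 + k] = comb(n, (n + k) // 2) % 2
    return row

# The shortcut agrees with explicit simulation (width kept > 2n to avoid wrap-around).
assert rule90_row_simulated(16, 64) == rule90_row_shortcut(16, 64)
```

For rule 30, by contrast, the only known way to obtain step n is to perform all n steps of the computation.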

- To make meaningful predictions, "it must at some level be the case that the system making the predictions outruns the system it is trying to predict. But for this to happen the system making the predictions must be able to perform more sophisticated computations than the system it is trying to predict." (741)

The PoCE says this is not possible; thus there are many systems where systematic prediction cannot be done because the behavior is computationally irreducible.

• implies that explicit simulation may be the only way to attack a large class of problems, rather than merely a convenient alternative to mathematical analysis
• helps explain how one can have a deterministic emergence of randomness: even though the rules are simple, it is irreducibly complex to predict the system's evolution
• the general question of what the system will do can be considered formally undecidable. Undecidability is common (755)
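A hedged illustration of this kind of undecidability (my example, not the book's code): a question such as "does this pattern ever die out?" can in general only be checked up to a step bound by explicit simulation; there is no general procedure that decides it.

```python
def dies_out_within(rule, cells, max_steps):
    """Simulate an elementary CA; return the step at which all cells become 0,
    or None if that does not happen within max_steps (the question stays open)."""
    n = len(cells)
    for step in range(1, max_steps + 1):
        cells = [(rule >> ((cells[(i - 1) % n] << 2)
                           | (cells[i] << 1)
                           | cells[(i + 1) % n])) & 1
                 for i in range(n)]
        if not any(cells):
            return step
    return None  # undecided: the pattern may die later, or never

# Example: run rule 110 from an arbitrary initial condition; a None result
# leaves the question open no matter how large max_steps is.
print(dies_out_within(110, [0, 1, 1, 0, 1, 0, 0, 0, 1, 0], 200))
```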

Implications

• though one can know the underlying rules for a system's behavior, there will almost never be an easy theory for any behavior that seems complex. (748)
• essential features of complex behavior can be captured with models that have simple underlying structures. Thus one should use models whose underlying rules are simple
• Even though a system follows definite underlying rules, its overall behavior can still have aspects that cannot be described by reasonable laws
• Free Will? "For even though all the components of our brains presumably follow definite laws, I strongly suspect that their overall behavior corresponds to an irreducible computation whose outcome can never in effect be found by reasonable laws." (750)
• Strong A.I. is possible
