Evolving Systems: The Emergence of Form, Meaning, and Aesthetics
Open Conversations
Chance: its meaning and significance
31 March 2010 | 7 April | 20 April | 5 May | 19 May | 2 June | 16 June
I still have a question about fortune - the aleatory, chance. I'm not sure why it feels like a problem: I guess it's to do with understanding the mechanism by which potential becomes actual ... What about C.S. Peirce's tychism? Did that notion go anywhere - do evolutionary biologists still use Peirce's coinage? ... Karla Mallette
What particularly intrigues me is that chance "feels like a problem" not in one discourse community (eg literature) but in many. Within physics there is a long and continuing debate about whether or not "God throws dice" (cf http://serendip.brynmawr.edu/exchange/node/4882) and the same is true of biology (http://serendip.brynmawr.edu/exchange/node/5993), mathematics and philosophy (http://www.umcs.maine.edu/~chaitin/sciamer3.html), and ordinary everyday human life (http://serendip.brynmawr.edu/exchange/node/1954). Problems like this one that surface in a variety of contexts intrigue me because they suggest there is something involved that can't be made sense of in terms of the idiosyncrasies of particular contexts (cf http://serendip.brynmawr.edu/complexity/hth.html). In this particular case, my guess is that what is at issue is whether one thinks of inquiry (and evolution) as moving toward an answer or as creating new possibilities (http://serendip.brynmawr.edu/exchange/node/6199). If the former, chance is a problem (a bug); if the latter, it's appealing (a feature) ... Paul Grobstein
Chance, the aleatory, indeterminacy, stochasticity, tychism. Are these all the same thing? Are they expressions of existing limitations of human understanding, or features of what we are trying to understand? Can one answer this question with any certainty? Does it matter in science, in other forms of inquiry, in practical terms, in day to day life?
Starting points:
- Alternative perspectives on randomness and its significance
- On beyond an algorithmic universe
- Evolution/science: inverting the relationship between randomness and meaning
Some directions for future exploration:
- Ways of making sense of the world: from primal patterns to deterministic and non-deterministic emergence
- The limits of reason (Gregory Chaitin)
- Anti-determinism, tychism, and evolutionism (Charles Saunders Peirce)
- Bayesian probability
And some directions emerging from the conversation
- Godel, incompleteness
- Liar's paradox
Additional relevant materials on Serendip
- The magic Sierpinski triangle
- A voyage of exploration: find Serendip
- From random motion to order: diffusion and some of its implications
March 31 meeting summary (Paul)
A group of nine faculty and students with interests in biology, physics, chemistry, computer science, literature, and education was intrigued by the notion that "chance" presented similar issues in a variety of fields and interested in taking on its exploration as a transdisciplinary issue. For all of them, including several with immediate interests in evolutionary biology and/or Peirce, "tychism" (see "The issue" above) came as a surprise. The scientists present were also generally unfamiliar with the term "aleatory." Some explanation of Mallette's particular concern, the apparent randomness of the survival of texts from old Mediterranean cultures, led to agreement that issues related to randomness were indeed important not only in scholarly inquiry but also in the development of canonical knowledge, in the design of curricula, and in teaching more generally. The notion that too great an organization of course content, in the absence of some context acknowledging randomness, leaves students with a sense that they cannot engage constructively with the material themselves was noted and put on the table for further consideration.
Some discussion of different disciplines, most particularly physics and biology, led to the recognition of a distinction between approaches that seek only to make sense of observations and approaches that attribute some broader meaning to the ways that one makes sense of observations, and a distinction between randomness as an acknowledgement of a lack of complete knowledge and randomness as a phenomenon in its own right that could in turn be used to account for other phenomena. The issue of whether either of these two distinctions entails the other was briefly discussed and added to a list of issues for further consideration.
With regard to the second distinction, randomness as a consequence of what we don't know and randomness as an irreducible unpredictability, a consensus seemed to be reached that neither of the two perspectives could be ruled out by existing observations and, more generally, that both would remain viable given any conceivable finite set of future observations. It was noted that people nonetheless do in practice often choose between the two alternative perspectives and that such choices have implications for future action. Two additional possibilities were raised: that one could "toggle" back and forth between the two perspectives, and that one could treat the distinction itself as meaningless. The group agreed to explore further the reasons people have for choosing among these four different approaches and the implications of each for the future.
The notion that randomness as ignorance was a "natural" ontological position was offered, and countered by the arguments that not all people adopt it, and that cultural factors clearly can influence the choice. It was then suggested that randomness as ignorance led to "falsifiable" hypotheses and hence further progress, whereas randomness as irreducible unpredictability did not. The resulting discussion suggested that there was something significant in this argument but that it required a little more specificity. Hypotheses involving some specific "stochastic" process could indeed be tested but, because of the nature of stochasticity itself, might require an infinite set of observations to be fully falsified. This in turn led to the question of whether complete ("digital") falsifiability was actually necessary for science (or for inquiry more generally). Could one conceive a kind of inquiry (scientific or otherwise) that would retain a form of evaluation and a measure of directedness/progress without complete falsifiability? Bayes, Schrödinger, and Peirce (see above) were alluded to in this regard, and it was agreed to take that question as the starting point for future conversation.
April 7 meeting summary (Paul)
Gregory Chaitin's The limits of reason was the background reading for this session. A suggested background context was that Kurt Gödel established the existence of formal limitations of logical processes, that Alan Turing did the same for computability, and that Chaitin's "incompressible numbers" tie this line of development ("algorithmic undecidability") to randomness. It was suggested that a further link among the three is the dependence of each on problems that arise in self-referential formalisms.
Chaitin's Ω is a well-defined infinite number sequence that has no internal pattern (is "random") and additionally cannot, as Chaitin established through a proof closely related to the earlier work of Gödel and Turing, be generated by any computer program shorter than the number itself. To put it differently, the number sequence exists but is not "computable" from any set of starting conditions and rules; the sequence has no underlying explanation or reason. In the context of our discussion, the number is significant since there seems to be no way to get to it except by employing non-deterministic processes, ie processes that involve some element of randomness.
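As a very loose illustration of the compressibility idea behind Chaitin's argument (a toy in Python, not his construction of Ω), one can compare how well an off-the-shelf compressor shrinks a highly patterned string versus a pseudo-random one of the same length; the compressor stands in, quite imperfectly, for the "shortest generating program" measure.

# A rough illustration of "randomness as incompressibility" (not Chaitin's proof):
# a patterned sequence can be regenerated from a short description, while a
# (pseudo)random sequence of the same length resists compression.  zlib is only a
# crude stand-in for the shortest-program measure that Chaitin actually uses.
import random
import zlib

n = 100_000
patterned = ("0123456789" * (n // 10)).encode()            # produced by a tiny rule
random.seed(0)
pseudo_random = bytes(random.randrange(256) for _ in range(n))

print("patterned     :", len(zlib.compress(patterned)), "bytes after compression")
print("pseudo-random :", len(zlib.compress(pseudo_random)), "bytes after compression")
# The patterned string collapses to a few hundred bytes; the pseudo-random one stays
# close to its original length.  Randomness in Chaitin's sense would resist every
# possible program, not just this one compressor.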
The significance of Ω depends on, among other things, how willing one is to accept its existence; the group didn't yet feel competent to follow and evaluate Chaitin's argument adequately for themselves. But it was agreed that, for at least some participants, Ω sharpens the question of whether science (and inquiry generally?) depends on an assumption that everything being inquired into must have well-defined and deterministic underlying causes. Must science (inquiry generally) fall into "one of two camps" with regard to randomness (it doesn't exist as a cause, or everything is accounted for by it), or is there a "third space," one that can make productive use of some involvement of underlying randomness?
Among the issues posed for further discussion were
- the relation between Gödel/Turing/Chaitin and "quantum computing." Does quantum computing get around the limitations of formal systems and, if so, how? (The Bit and the Pendulum: The New Physics of Information by Tom Siegfried is accessible and helpful along these lines)
- the relation between "self-referential" and "recursive"
- the necessity for inquiry of reducing observations to get theory
- the necessity for inquiry of both self-referentiality and noticing limitations
- the relation of noticing limitations to Peirce's triad of induction, deduction, and abduction
- the issue of "causes" that may be neither probable nor computable and its relation to Aristotelian concepts of causation
- the possibility that all meaning is statistical, that singularities lack meaning
It was agreed to invest more time in some of the details of the Gödel/Turing/Chaitin sequence, beginning at our next meeting with Gödel.
April 20 meeting background: Gödel's theorem and its significance
April 20 meeting summary (Paul)
Conversation largely focused on the usefulness or lack thereof of the suggested Gödel/Turing/Chaitin analysis of the limitations of formal systems (see first section of Evolving systems, Gödel's theorem, and its significance).
Many humanists think of Gödel as "on their side," ie they think of work of this kind as establishing what they already know, that formal/rational systems are not the only, perhaps not even the best, foundation for inquiry. From this perspective, it's not clear that there is anything to be gained by a closer look at the Gödel/Turing/Chaitin sequence. Is there any reason to believe it is relevant to such humanistic perspectives? Is there anything there that would add to them, point in new directions relevant to them?
A different challenge comes from those who value the formal/rational methodology. Economists, for example, are familiar with Kenneth Arrow's "impossibility theorem," a formal demonstration that a perfectly "fair" voting system cannot exist (with "fair" carefully and formally defined). Such findings (and by extension Gödel/Turing/Chaitin), it is argued, show what cannot be done but have no broader significance. Limitations are limitations; there remains plenty of useful work that can be done using formal/logical forms of analysis. Significantly, here too the same questions arise as in the case of the humanistic challenge: is there anything to be gained from a closer analysis of instances of formal impossibility or incompleteness?
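For readers who want a concrete taste of what such a formal impossibility looks like, the sketch below shows the simpler Condorcet paradox (a cousin of Arrow's full theorem, not the theorem itself, and with made-up voters): three individually consistent rankings yield a majority preference that cycles, so no consistent collective ranking exists.

# Condorcet's paradox in miniature: three consistent individual rankings produce a
# cyclic majority preference.  (Illustrative only; Arrow's theorem is a stronger and
# more general result.)
from itertools import combinations

voters = [["A", "B", "C"],     # voter 1: A > B > C
          ["B", "C", "A"],     # voter 2: B > C > A
          ["C", "A", "B"]]     # voter 3: C > A > B

def majority_prefers(x, y):
    wins = sum(1 for ranking in voters if ranking.index(x) < ranking.index(y))
    return wins > len(voters) / 2

for x, y in combinations("ABC", 2):
    winner, loser = (x, y) if majority_prefers(x, y) else (y, x)
    print(f"a majority prefers {winner} over {loser}")
# Prints: A over B, C over A, B over C -- the collective preference runs in a circle.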
The development of non-Euclidean geometries was offered as an example of how formal analysis and a recognition of its limitations could indeed have productive outcomes. The formalization of properties of space as it is normally perceived into a set of axioms and methods for proving theorems led in turn to a recognition that by challenging aspects of the formal system one could achieve a more general understanding of what is meant by space, one that included spaces and properties not previously considered possible.
It was further noted that the issue of the values and limitations of formal/logical systems repeatedly arises in a variety of contexts, even for those who are otherwise inclined to dismiss its significance. One example was the feeling among most academics that they need to "justify" grades. A second, related example had to do with the "check out counter" phenomenon, a feeling (desirable or undesirable?) that one has to justify one's activities at particular times and places by a formal accounting. Another was the occasional feeling among people with a commitment to rational processes (not usually openly admitted to) that they might actually need a "new religion." In milder form, it is the recognition that their own work in fact depends on occasional "flashes of insight" that they find it difficult to account for rationally/formally.
Against this background, it was agreed to proceed with a closer look at Gödel/Turing/Chaitin to find out whether it could indeed open new directions not only for science and the humanities but for inquiry in general. Along these lines, it was noted that formal systems can play several different roles. They can be used to summarize empirical observations and, by so doing, suggest new directions for exploration. In these terms, demonstrations of impossibility or incompleteness are not a problem, and may indeed be a virtue. It is only when formal systems are treated as primarily anticipatory, ie as certain predictors of what can or will be, that demonstrations of impossibility or incompleteness come to be seen as threats.
In looking more closely at Gödel/Turing/Chaitin, we will bear in mind the question of whether it is in fact "self-referentiality" that leads to "kookiness" (is self-referentiality inherent in the formal system or something Gödel added?), and the point that Gödel's proof does not itself show the existence of something beyond what is possible in formal systems, only that particular formal systems cannot exhaust the range of possibilities. It is Turing and Chaitin who went on to exhibit particular meaningful things of this kind.
May 5 meeting background: Gödel's theorem and its significance
May 5 meeting summary (Paul)
Some introductory conversation started with the notion that there is a bit of the positivist in all of us, as well as a bit of ... something else, and with efforts to better understand what that dichotomy is, why some people find logical thinking off-putting and others are attracted to it. Such a split occurs in day to day life and in many disciplines (analytic vs continental philosophy, for example), and is sometimes characterized as comfort or lack thereof with mathematics but seems to have a deeper origin. One suggestion was that it has to do with a distaste for versus tolerance of inconsistency. Another was that it had to do with a preference for formal systems versus a preference for more fluid, associational, and indeterminate thinking. It was more or less agreed that finding a better way to characterize "something else," a way that didn't depend on opposition to "logical," was a desired product of the continuing deeper exploration of Gödel/Turing/Chaitin. The issue of "consistency" in quantum logic/quantum computing was flagged for future discussion.
Also part of the introductory conversation was what is meant by a "formal system." Baseball games and chess both have a starting position and proceed by a set of rules. The former, it was suggested, don't constitute a formal system because participants don't make choices "mindlessly," ie following a set of rules that exists prior to the choice and would always yield the same behavior for a given situation. Chess players may or may not make choices "mindlessly." Those who do may be quite successful but tend to be characterized by others as "technically sound" but lacking ... soul? This in turn led to concerns that the agenda of these discussions was aimed at establishing the existence of "mind" or "God." Here too it was more or less agreed that one desired product of the conversations was instead to come to a better understanding of what exists (or doesn't exist) beyond the realm of the "mindless." For present purposes, a "formal system" is understood to mean a system with a well-defined set of presumptions and well-defined rules of operation that will always yield the same outcome for a given starting point. The rules of operation, or "algorithm," are presumed to be deterministic, in order to assure both consistency and reproducibility.
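To make the working definition concrete, here is a minimal toy in Python (a hypothetical rewrite system, loosely in the spirit of Hofstadter's MIU puzzle, not anything discussed at the meeting): a fixed starting string and a fixed, deterministic set of rewrite rules, so the same start always yields the same derivation.

# A toy "formal system": a fixed axiom (starting string) plus deterministic rewrite
# rules.  With no choice and no chance anywhere, the same starting point always
# yields the same outcome -- the consistency and reproducibility described above.
RULES = [("MI", "MII"),        # hypothetical rules for illustration only
         ("III", "U"),
         ("UU", "")]

def step(s):
    # Apply the first rule whose left-hand side occurs in s (leftmost occurrence).
    for lhs, rhs in RULES:
        if lhs in s:
            return s.replace(lhs, rhs, 1)
    return s                    # no rule applies: the derivation stops changing

def derive(axiom, n_steps=8):
    trace = [axiom]
    for _ in range(n_steps):
        trace.append(step(trace[-1]))
    return trace

print(derive("MI"))
print(derive("MI"))             # identical output: determinism guarantees reproducibility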
We moved on to a closer look at Gödel's proof, beginning with his "Gödelization" or "arithmetization" of arithmetic statements. Such statements constitute a "countably infinite" set, ie they are listable in a linear sequence like the natural numbers. While there are an infinite number of such statements, "infinity" is explicitly not understood as "everything included." The natural numbers (and hence arithmetic statements) are a subset of larger infinities, such as the real numbers. This was established by Cantor using a "diagonalization" argument, essentially showing that any conceivable countably infinite list left off some real numbers and necessarily continued to do so even when particular missing numbers were added to it. A similar argument for the existence of things beyond the "countably infinite" is at the core of Gödel's proof, as well as Turing's and Chaitin's.
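The diagonal move itself is short enough to write down. Below is a hedged sketch in Python: the enumeration is a made-up example (any proposed listing of infinite binary sequences would do), and the diagonal sequence is constructed to differ from the n-th listed sequence at position n, so it cannot appear anywhere on the list.

# Cantor's diagonal argument in miniature.  example_enumeration is a hypothetical
# "complete" list of infinite binary sequences: digit k of the n-th sequence.
# The diagonal sequence flips the k-th digit of the k-th sequence, so it differs
# from every listed sequence somewhere and cannot itself be on the list.

def example_enumeration(n, k):
    return (n >> k) & 1          # stand-in listing: the binary digits of n

def diagonal(enumeration, k):
    return 1 - enumeration(k, k)

for n in range(10):              # spot-check: the diagonal disagrees with row n at position n
    assert diagonal(example_enumeration, n) != example_enumeration(n, n)
print("first digits of the diagonal sequence:",
      [diagonal(example_enumeration, k) for k in range(10)])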
The Gödel proof holds not only for arithmetic but for any formal symbolic system involving a deterministic set of elements and rules of generation with a fixed set of symbols and finite sentence length. English sentences (or for that matter books) according to this argument are also "countably infinite," and hence a complete catalogue of "expressible" human understandings in any language leaves open the possibility of additional understandings inexpressible in that symbolic system. At the same time, it was pointed out that the process of formalization can itself bring into existence things not previously conceived, the transfinite numbers being a case in point. To put it differently, formal systems can be thought of as not only demonstrations of the limitations of existing understandings and methods of generating understandings but also as a mechanism by which to bring into existence possibilities that didn't previously exist (as per non-Euclidean geometries mentioned above). In other words, formal systems should not be understood as constraints on what can be understood but as tools to expand the range of possible understandings.
As earlier noted, Gödel's proof provides a reason to believe there may be important statements/understandings outside the "countably infinite" number formally generated, but does not give an example of such a statement. One can still entertain the possibility that the Gödel limitation is a "singularity" that can be noticed but need not be regarded as a general problem. To address this concern, we'll look next at the work of Turing and Chaitin. Also not fully examined and to be returned to was the issue of where self-referentiality comes from and the role it plays in the limitations of formal systems.
May 19 meeting background: Between Gödel and Turing
May 19 meeting summary (Paul)
Discussion of Turing and Chaitin was deferred to a subsequent meeting in order to first consider a question that arose during the last meeting and subsequent discussion: the relevance (or lack thereof) of formal systems and their limitations for things other than logic, mathematics, and computing. See Between Gödel and Turing.
It was suggested (Between Gödel and Turing) that one could not "opt out" of the use of formal systems, that whatever one's familiarity with or attitude towards them the process of trying to reduce one's experiences to underlying "principles and rules" was an inherent part of thought in all people (a feature of one aspect of brain architecture common to all human beings). A parallel was offered between the Gödel limitation for logical systems and Wittgenstein's notion that human language was limited in its scope, that some things are "inexpressible". Hence, an appreciation of the origins and significance of the Gödel limitation is relevant for understanding human thought quite generally.
It was further suggested that one might think of the Gödel limitation not in terms of what is absolutely "expressible" or "knowable" but rather in terms of what is knowable/expressible given a particular formal system (a particular set of properties and rules). From this perspective, the Gödel limitation isn't an acknowledgement of any fundamental distinction between the expressible and the inexpressible but rather a recognition that particular formal systems themselves create a distinction between expressible and inexpressible, and that the latter can become expressible by a change in the formal system. In short, the Gödel limitation is not an argument against using formal systems but rather provides a strategy for their more effective use: recognize the limitations of any given formal system and alter one's use of formal systems to expand the range of expressible things. This might be done by altering a particular formal system and/or by making use of several different formal systems (see Forms of Inquiry). An appealing feature of this perspective is that it makes the distinction between the inexpressible and the expressible a function of the inquiry process itself rather than a characteristic of things "out there," assures that there will always be an "inexpressible" to inquire into since formal systems themselves contribute to creating the inexpressible, and encourages "chatter" rather than silence in the face of the inexpressible.
This suggested way of appreciating Gödel's work in a broader context in turn raised a number of issues, including that of whether it was actually a direct or only a metaphorical extension of Gödel's incompleteness theorem, and how it related to issues of self-referentiality, consistency, provability, and simpler logical systems for which the incompleteness theorem does not hold. It also raised issues about whether one does or does not need an "outside observer" to make the distinction between "expressible" and "inexpressible," the relation between that distinction and "Nature is what we are put on earth to overcome" (Katharine Hepburn in The African Queen), the preservation or loss of a line of "demarcation" between various forms of human activity (science, art, religion), the relation between human and "domain specific" computer languages, and the relation of human thought and the process of inquiry.
The expectation is that the exploration of a number of these issues, as well as of the broader significance of the Gödel limitation, will be usefully focused and advanced by talking next about Turing and Chaitin. Is a computer an inquirer? Is there inquiry other than "trying to get to the bottom"? Is inquiry "creative"? Is the brain a computer? Are there things that brains can conceive that computers can't? And, if so, why?
June 2 background notes and background reading:
Demarcation?
- Crossing the lines of science and formal systems to ... ?
- ten thousand questions!
- More on demarcation
And on from formal systems to randomness
- What computers can't do (re Turing)
- The limits of reason, by Chaitin (pdf)
June 2 meeting summary (Paul)
Rather than focusing on the limitations of Turing computability and its relationship to randomness, as originally planned, this conversation focused largely on formal systems, science, and the "demarcation" problem. Science, it was pointed out, should not be equated with "formal systems." Whatever role formal systems play in science, there is an at least equally strong reliance on empirical observations both to motivate and to test hypotheses/statements/expressibles. The limitations of particular formal systems are not necessarily limitations of science either in practice or in theory.
At this point, "science" and the demarcation problem (what distinguishes science from other things, including "pseudoscience") rather than "inquiry" became a matter of central concern (for continuing thoughts along these lines, see for example Parascience, Beyond demarcation, and related comments in the forum below). It was suggested that science is distinctive in being committed to "value free" inquiry and that this was important to avoid problems of "disbelief in evidence." This in turn provoked challenges about the actual practice of science as opposed to the aspiration, as well as about whether evidence was actually value free and whether a distinction between practice and principle was sustainable. The "pursuit of consistent, reproducible findings," it was suggested, itself introduced "values" into scientific aspiration/practice.
An inclination to demarcate not only with regard to science but generally was itself both attacked and defended. Demarcation tends to create antagonisms and power imbalances, and so can contribute to oppression. On the other hand, demarcation can contribute to the productive existence of multiple, specialized approaches to inquiry that can in principle be more productive than any single generalized approach. Perhaps an appropriate resolution is to accept the usefulness of demarcation but work to assure integration rather than conflict among the various specialized approaches. The suggestion here, as earlier with regard to formal systems specifically, is not to feel a need to pick between demarcated things but rather to value each for what they individually and distinctively bring to a larger task.
Perhaps the issue of the role of "values" in science, and inquiry generally, like the issues of the role of "mind" and "soul" raised earlier, can be illuminated by coming to grips (albeit belatedly) with Turing computability and randomness.
June 16 background notes and background reading:
- What computers can't do (re Turing)
- The limits of reason, by Chaitin (pdf)
June 16 meeting summary (Paul)
It was agreed that issues of the nature of science and of the uses and problems of demarcating science raised in the earlier discussion had not been resolved, but that they might usefully be returned to in light of planned discussions of Turing, computability, Chaitin, and algorithms/non-compressibility. Against this background, discussion began with the notion of a Turing machine and a universal Turing machine. See "Computers as formal systems and their limitations" and the following section of Beyond Incompleteness II (/exchange/node/7642). It was also noted that while the focus here, as in the case of formal systems earlier, is on limitations or "incompleteness," in the case of both formal systems and Turing computability there is, and continues to be, an enormous array of useful things that can be done within those limits.
Turing computability was likened to formal systems in the sense that there was a fixed and finite starting condition together with a fixed and finite set of construction rules from which everything else followed deterministically. It was noted that the suggested parallel between the two was worth looking at in finer detail, both to test it more fully and to understand more completely the relation between them. For present purposes, the presumed equivalence is, more or less, the Church-Turing hypothesis: that what is logically deducible is what is Turing computable and vice versa.
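For concreteness, here is a minimal Turing machine simulator in Python (the machine and its transition table are hypothetical toys, not anything from Turing's papers): a finite set of states, a tape, and a deterministic rule for each (state, symbol) pair, so the whole computation follows from the starting condition.

# A minimal Turing machine: a tape of symbols, a head position, and a deterministic
# transition table mapping (state, symbol) -> (symbol to write, move, next state).
# The example table below just flips 0s and 1s and then halts; the same skeleton
# runs any machine whose table you supply.

def run(tape, transitions, state="start", pos=0, max_steps=1000):
    cells = dict(enumerate(tape))                 # sparse tape; '_' means blank
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = transitions[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),           # blank cell reached: nothing left to do
}

print(run("0110100", flip_bits))                  # same input always gives "1001011"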
Among the initial questions about Turing machines was to what degree one could make such machines themselves the target of inquiry: could one "reverse engineer" a Turing machine, determine its starting condition and construction rules from observing its output? In some cases (reversible machines) the answer is yes (and there are energetic issues that might favor focus on such cases) but in most cases the answer is no. Most computation involves irreversible steps, with a given output resulting from more than one possible prior state. In addition, paralleling the problem of inference from finite samples, there are always an infinite number of ways a given output state might have been achieved. The latter problem might be overcome by starting with the presumption that the observed output results from some Turing machine process (rather than, for example, from a random number generator).
There was interest also in the issue of whether the "state" of a Turing machine could be equated to a brain "state," and whether the "meaning" of an input changes for different Turing machine states, as one imagines the meaning of an input does for different brain states. The issue of the relation between brain states and Turing machine states is directly related to the comparison between the two we are headed for. In some ways the relation is close: each brain state, like each Turing machine state, has the previous state as a major determinant. In other ways the parallel probably fails (indeterminacy, no inner "experience" of state in the case of a Turing machine). The lack of parallelism will, for these reasons, probably turn out to be greater with regard to "meaning." The state of a Turing machine affects what it does with a particular symbol, but it is not clear that there is anything comparable to the human sense of "meaning" in the Turing machine, so it is not clear what it would mean to say that changes in the state of the Turing machine yield changes of "meaning."
With this background on Turing machines, the notion of the limitations in possible outputs was illustrated with respect to the halting problem. It was noted that the proof has close parallels to the incompleteness of formal systems and relates to the outputs being countably infinite rather than larger, and that there are additional close parallels in Chaitin's proof of the existence of non-computable, non-compressible, truly "random" numbers, but neither proof was closely examined. Discussion proceeded instead to the question of the relation between formal systems, Turing computability, and the brain (a problem outlined by Roger Penrose in his Emperor's New Mind).
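Although the proofs were not examined in the meeting, the shape of the halting-problem argument is compact enough to sketch in code. The function halts below is hypothetical; the whole point is that no such function can exist, because the "diagonal" program defined from it behaves oppositely to whatever the function predicts.

# The standard halting-problem sketch.  Suppose, for contradiction, that
# halts(program, argument) could always correctly report whether program(argument)
# eventually stops.  The following program then defeats it.

def halts(program, argument):
    # Hypothetical perfect halting oracle -- cannot actually be written.
    raise NotImplementedError

def diagonal(program):
    # Do the opposite of whatever the oracle predicts about running program on itself.
    if halts(program, program):
        while True:              # loop forever exactly when the oracle says we halt
            pass
    else:
        return                   # halt exactly when the oracle says we loop

# Asking about diagonal(diagonal) yields a contradiction either way: if the oracle
# says it halts, it loops; if the oracle says it loops, it halts.  So no universal
# halting test exists, closely paralleling the diagonal argument used for Gödel.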
In this regard, the question was posed of whether non-computable things have any practical significance as opposed to being simply mathematical/philosophical oddities. Are they of significance only because some (many) people don't understand that "twentieth century positivism should have died"? It was suggested that a disinclination to take non-computability/"causelessness" seriously is actually at the root of many contemporary problems, including the current oil spill. The assertion that someone/something must have been at fault, that there are ways to control all variables so that such things wouldn't happen, wouldn't be possible if people actually took non-computability/"causelessness" seriously, and this might in turn have a significant impact on energy policy discussions. A similar argument was made with regard to the financial crisis: one needs to design systems not in an effort to preclude the possibility of unanticipated occurrences but rather to minimize the impact of such events, on the assumption that they will always occur with some probability.
This digression in turn highlighted a long-term concern of the present series of conversations (and of the evolving systems project generally): in what ways would taking seriously the notion that things are never fully predictable, that they always have some element of indeterminacy, change not only approaches to energy and financial policy but also the process of inquiry itself? There followed an intriguing contrast. On the one hand, an assertion that we are largely already there in practice, in "habitus." On the other, a reminder that we still routinely separate "theory" and "practice," as if we persist in the belief in two worlds, one ideal and deterministic, the other perhaps messier. Is there another way to think about understanding and inquiry, one that doesn't use that distinction? Offered as a possibility was recognizing that formal systems don't describe what is eternally and deterministically "out there" but instead exist "in here" as transient "stories," of practical use at any given time but always revisable/discardable. And that it is at least as much conflicts between stories, rather than conflicts between particular stories and any external "reality," that should be regarded as the driving force and rationale of inquiry.
This in turn offers a perspective for thinking about how brains differ from Turing machines and embody formal systems without being fully constrained by their limitations that will be returned to when the group next convenes in the fall.
Continuing conversation, in on-line forum below
Comments
more on computers/brains
Apropos of "the human brain is not designed by evolution to achieve a particular outcome but rather to respond adaptively to a somewhat unpredictable world" and its implications for future developments in computer science ("information processing strategies that are implemented differently and more straightforwardly in a human brain"), see
A chip that digests data and calculates the odds
More on the computer/brain issue ...
this time with brains (at least collectively) doing better than existing computers ....
In a video game, tackling the complexities of protein folding
"Never Pure"
The book review's not laudatory, but worth a look just for the subtitle; check out Steven Shapin's new, 552-pp. volume, Never Pure: Historical Studies of Science as if It Was Produced by People With Bodies, Situated in Time, Space, Culture, and Society, and Struggling for Credibility and Authority. Of particular interest (to me) is the account of the shift in how scientific credibility, which originally derived from first hand observations, came to depend on more abstract (less reliable?) criteria: academic affiliation, consensus among other scientists, openness about methodology. This account put me in mind of a talk Paula Viterbo gave here a number of years ago, following a "scientific fact" from the laboratory into the social world: the movement included multiple levels of filtration, discussion, conversation, translation, and adaptation to accessible language.
listening to self and listening to/with others
"the shift in how scientific credibility, which originally derived from first hand observations, came to depend on more abstract (less reliable?) criteria: academic affiliation, consensus among other scientists, openness about methodology"
Amuses me to notice that one can replace "scientific" with several other adjectives ("academic," "democratic," "spiritual") and, with appropriate later changes in the sentence, get the same description. And to think, in all realms, about the value of "first hand observations" in relation to various criteria for "shared subjectivity." Maybe it's time to acknowledge/embrace the notion that not only science but all forms of inquiry necessarily oscillate between "first hand observations" ("individual subjectivity") and "shared subjectivity."
Free will..and constraints?
From a different angle: for some more thinking about how brains embody formal systems without being fully constrained by their limitations, see on demarcation and cluelessness.
chance and jazz
"It's almost impossible to tell when the repetition is going to occur. You know it's going on, but you can't predict it." ... Henry Threadgill
adding in free will
Mike emailed several of us an article that usefully connects some of our own going conversation to the "free will" problem, among other things. For some of the thoughts triggered by that article, and continuing on-line conversation, see Randomness, the brain, free will, science, "pseudo-science," justice, and demarcation: a conversation.
more on the limitations of formal systems
From a conversation with Bill Huber re The conjecture, for which many thanks ....
The possible outputs of a classical one-dimensional cellular automaton with an infinite number of cells are countably infinite, regardless of its rules and starting conditions (since they occur sequentially, they can be associated one to one with the positive integers). Among such CAs is a universal Turing machine. Using the diagonalization argument, it can be shown that the number of possible states of an infinite number of cells is always greater than countably infinite, ie that there will always be possible states of an infinite linear array that are not among the outputs of a classical one-dimensional infinitely long cellular automaton, or among the possible outputs of a universal Turing machine.
From this it follows that
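For concreteness (and acknowledging that the argument above concerns the infinite-cell case), here is a minimal sketch of a classical one-dimensional cellular automaton in Python, truncated to a finite row of cells for display; the rule is the standard two-state, nearest-neighbour Rule 110, which is known to be capable of universal computation.

# A one-dimensional, two-state cellular automaton with nearest-neighbour rules.
# Each new row is computed deterministically from the previous one, so the sequence
# of rows can be put in one-to-one correspondence with the positive integers -- the
# countability that the diagonalization argument above exploits.

def step(cells, rule=110):       # Rule 110: a 1-D CA known to be Turing-universal
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 40 + [1] + [0] * 40  # a single live cell as the starting condition
for _ in range(20):
    print("".join(".#"[c] for c in row))
    row = step(row)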
Can a computer play Jeopardy as well as a human?
Apropos of our current and anticipated conversations, see What is IBM's Watson?
Yes, a universal Turing machine may in fact play Jeopardy at least as well as a human. Assuming so, playing Jeopardy well is within the Gödel limit of a universal Turing machine running a deterministic algorithm, and will have been shown to be so with currently available (if somewhat unusual) memory capacity and processor speed.
Perhaps a point on the side of the brain as a universal Turing machine, but there's lots of race yet to be run. Moreover, it's worth noting that the algorithm in this case has some characteristics that would have surprised programmers working in Turing's era, as well as those working as recently as fifteen or twenty years ago. And those characteristics may prove to be ways to simulate on a universal Turing machine information processing strategies that are implemented differently and more straightforwardly in a human brain.
"no single algorithm can simulate the human ability to parse language and facts. Instead, Watson uses more than a hundred algorithms at the same time to analyze a question in different ways, generating hundreds of possible solutions. Another set of algorithms ranks these answers according to plausibility; for example, if dozens of algorithms working in different directions all arrive at the same answer, it’s more likely to be the right one. In essence, Watson thinks in probabilities. It produces not one single “right” answer, but an enormous number of possibilities, then ranks them by assessing how likely each one is to answer the question."
A universal Turing machine can't actually "use more than a hundred algorithms" at the same time, but a single properly written algorithm can give the appearance of doing so and, within limits, produce results comparable to a system, like the brain, that actually does use several different algorithms simultaneously. Similarly, a universal Turing machine can't actually "think in probabilities" but a properly written deterministic algorithm can give the appearance of doing so and, within limits, produce results comparable to a system, of which the brain may well be one, in which the basic currency of information processing is not deterministic 1's and 0's but instead probabilities.
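A hedged toy of that point (nothing to do with Watson's actual code, and the scorers below are invented): one deterministic program can interleave several candidate-generating routines and rank the pooled answers by an accumulated plausibility score, giving the appearance of many algorithms running at once and of "thinking in probabilities."

# One deterministic program giving the appearance of many algorithms and of
# probabilistic thinking.  Three hypothetical scorers each propose answers with a
# confidence; agreement across scorers raises a candidate's plausibility.
from collections import defaultdict

def scorer_keywords(question):   return [("Toronto", 0.4), ("Chicago", 0.3)]
def scorer_dates(question):      return [("Chicago", 0.5)]
def scorer_categories(question): return [("Chicago", 0.2), ("Boston", 0.2)]

SCORERS = [scorer_keywords, scorer_dates, scorer_categories]   # stand-ins for "hundreds"

def answer(question):
    plausibility = defaultdict(float)
    for scorer in SCORERS:                 # run one after another, not truly in parallel
        for candidate, confidence in scorer(question):
            plausibility[candidate] += confidence
    return sorted(plausibility.items(), key=lambda kv: kv[1], reverse=True)

print(answer("a hypothetical clue"))
# [('Chicago', 1.0), ('Toronto', 0.4), ('Boston', 0.2)] -- a ranked list, not one answer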
So, the issue of whether the brain is a universal Turing machine, and constrained by the limits of such a machine, remains open. Yep, human performance at Jeopardy may well prove to be within the computational limits of a universal Turing machine. But it may well be that establishing this will turn out to be more significant in clarifying the differences between a brain and a universal Turing machine than in establishing an equivalence between the two.
Children as Turing Machines?
A connection between the Turing machine, discussed this morning, and education: The group discussed the possibility of "working backwards" with the Turing machine and, from studying a certain output, discovering what kind of formal system created it. There are "an infinite number of explanations for any output," and many fewer options for any one input. This directly correlates with the problem in education of trying to decipher what kind of learner a child is, or diagnosing whether or not they have a learning disorder or other "disability." When looking at a child and their learning/behavioral patterns ("outputs"), many explanations are possible. If a teacher, parent, or doctor chooses the wrong explanation to justify the child's actions, the child will almost certainly remain at a disadvantage for at least the duration of their learning career (for example, Donna Williams, an accomplished autistic author, was not diagnosed until late in her life and was treated accordingly: like a "psychopath"). Based on the caretaking adult's decision about what is "wrong" with the child or what "category" they should be placed into, the child will be brought up in an environment where certain influences will result in a specific type of behavior (certain "inputs" will result in certain "outputs")--this is a self-fulfilling prophecy waiting to happen.
The comparison above between children and formal systems such as the Turing machine is useful, but is it also dehumanizing these students? Or is it natural to think of things and processes as "inhuman" in order to treat them objectively?
metaphors for education
I appreciate this interesting connection between the "outputs" of a machine and the diagnosis of a child. As my colleague Alison Cook-Sather has written about (see, for example, "Movements of Mind: The Matrix, Metaphors, and Re-imagining Education" -- I can send you a pdf if you like), educators and schools have used (consciously or not) a wide range of metaphors for teaching -- from producing students like products in a factory, or printouts from a computer, to tending them like plants and flowers, to treating and curing them like the sick and wounded . . . and more. So . . . I do think it's dehumanizing (de-creaturifying, as well) to regard/theorize/describe/interact with children as if their being, or even just what they need in/from school, is fully knowable. Rather than try to diagnose and then treat/cure the learner, what about a model of ongoing learning of the learner, not to complete it, but to be engaged with it, and with processes (via curriculum, not only adjunct "testing") that help the learner teach you him- or herself?
My question about the brain as a universal Turing machine is how does the UTM account for the unconscious and also for change over time?
The unconscious etc.
Your question about the unconscious, Alice, brings forward an important distinction among levels of description of the behavior of a Turing Machine. A TM as we know it acts on individual cells in a tape filled with letters of an abstract alphabet, etc. The analog in humans might be a description of the brain as an assemblage of molecules and ions interacting according to chemical and physical laws. The "unconscious," however, is a description based on observing general patterns exhibited on the scale of an entire system comprising a heptillion or so pieces. To make progress with this question, which is fundamental, we might want to begin by clarifying what we mean by the "unconscious" so that it becomes a concept that can be unambiguously discussed and researched. The challenge then is to show how the defining markers of unconscious behavior can emerge from the (theoretically) simple interactions of the ions and molecules. It's a daunting task that we can hope to carry out by studying the system at several natural levels, ranging through larger molecules, neurons and ganglia, assemblies of neurons, and so on through the major structures of the brain. This analytical approach, though, risks not finding elements of system behavior that don't exist "in" any level but are solely properties of the whole big interacting mess.
The easier part of your question is about change over time. One way to use a Turing Machine as a (simplified) model of human thinking is to use a very long sequence of symbols on the tape to represent the constant stream of sensory inputs a person receives during their lifetime. Recall that one part of the machine's reaction to these symbols is by (optionally) changing its internal state. It can be helpful to think of this "state" a little less abstractly: it can consist of a huge database of information, for example, which the machine "updates" (that's just a change of state) in response to what it "sees" on the tape. I have in mind a caricature of the absent-minded professor who writes everything down in a notebook: to find out whether she likes chicken soup for breakfast, for example, she would look it up in the notebook. The caricature becomes less obvious when the professor is able to look things up quickly and surreptitiously and to record new observations just as easily. At that point, would an observer recognize the professor's decisions as being absent-minded anymore? Or even being purely mechanical?
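A small sketch of that less abstract reading of "state" (a caricature in Python, assuming nothing about real professors or real brains): the internal state is a notebook, here a dict, that is consulted and updated in response to each symbol that comes in, so change over time is nothing but accumulated changes of state.

# The "absent-minded professor" reading of machine state: the state is a notebook
# (a dict) that is looked up and updated in response to each observation on the tape.
def react(notebook, observation):
    topic, value = observation
    remembered = notebook.get(topic, "no entry yet")
    notebook = {**notebook, topic: value}        # the state change: replace the old entry
    return notebook, f"previously: {remembered}; now noted: {value}"

state = {}                                       # start with an empty notebook
tape = [("chicken soup for breakfast", "yes"),
        ("chicken soup for breakfast", "no"),    # preferences drift over time
        ("umbrella when it rains", "always")]

for observation in tape:
    state, report = react(state, observation)
    print(report)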
where's the unconscious?
Scattered all over Serendip are collections of metaphors for teachers/classrooms/students.... Applying Alice's questions, I'd ask where the unconscious lies in each of these conceptions, and how well each invites change over time....
From Turing machines to brains to education
Hmmmm. Let me try and dissect this and the above a bit, drawing on How Babies Think, a recent article by Alison Gopnik, a developmental psychologist.
"children's brains ... must be unconsciously processing information in a way that parallels the methods of scientific discovery."
What contemporary research is suggesting is that babies are born with unconscious presumptions about the world and an unconscious mechanism for revising/updating those presumptions. The mechanism involves somewhat randomly generated outputs ("play" or "experiments") that result in new inputs. These new inputs are compared with expectations based on the existing presumptions, and the presumptions are revised when the input doesn't match expectations. To put it differently, babies are born with a pared down version of loopy scientific method, and it is this that accounts for their quite remarkable learning capabilities.
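A hedged toy of that loop in Python (an illustration only, not Gopnik's model and not a claim about real infants): the agent holds a probabilistic presumption about the world, spends some trials on random "play," and nudges the presumption whenever an observed input doesn't match its expectation.

# A toy "loopy scientific method": start with a presumption (a probability that an
# action produces an effect), act, compare the observed result with the expectation,
# and revise the presumption in proportion to the surprise.
import random

random.seed(1)
true_rate = 0.8          # the hidden regularity in this made-up world
presumption = 0.2        # the agent's initial expectation
learning_rate = 0.1

for trial in range(50):
    if random.random() < 0.3:                    # somewhat random "play"/exploration
        continue                                 # this trial generates no usable input
    observed = 1.0 if random.random() < true_rate else 0.0
    surprise = observed - presumption            # mismatch between input and expectation
    presumption += learning_rate * surprise      # revise the presumption accordingly

print(f"revised presumption: {presumption:.2f} (the world's actual rate is {true_rate})")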
So where is the Turing machine, or formal systems, or "properties and rules," or consciousness, or free will? Or the brain/neurons for that matter? It's actually fairly easy to construct neural networks that will implement all of the features of the loopy scientific method, so that's not a big problem. And one could, in principle, mimic all this on a universal Turing machine if one knew in advance all the outputs a baby will generate and all the inputs it will get as a result. But that may well be missing the point (see Can a computer play Jeopardy as well as a human?). Arguably the human brain is not designed by evolution to achieve a particular outcome but rather to respond adaptively to a somewhat unpredictable world. And that in turn suggests that the brain is designed to be open to the world (see Doug) rather than, like a Turing machine, to be self-contained. And it suggests that the brain is designed to work in terms of probability, to both detect and make internal use of stochasticity, rather than, like a Turing machine, to operate deterministically in terms of true and false. And it suggests that the brain is designed to process information simultaneously in multiple modules, without worrying too much about consistency among them.
If all this is so, where do formal systems, or properties and rules, come from? My guess is that these, rather than being the essence of how the brain works, are add-ons, things that appear when one begins to make sense of what the unconscious is doing, when one becomes conscious, when one starts to reflect on what one is oneself doing. And they clearly add substantial exploratory capability: by constructing formal systems, by trying to reduce one's experiences to properties and rules, one opens up new possible worlds that would not have been reached by one's starting presumptions and experiences alone. Conscious thought is useful. But thinking of things in terms of formal systems, of properties and rules, can also get in the way. As Gopnik notes
"There is a trade-off between the ability to explore creatively and learn flexibly, like a child, and the ability to plan and act effectively, like an adult."
Perhaps the most important idea that comes out of this is that we don't have to choose between formal systems and something else, between properties and rules and chaos, between the conscious and the unconscious. It's not a choice but a tradeoff, one we can make and remake depending on what's going on at any given time. And perhaps that's the opening to free will? We're not bound by any of the multiple ways of doing things that we have inside ourselves but instead have the capability to switch among them, if we learn about them all and about each of their strengths and weaknesses. Maybe that should be conceived as the objective of education? To enhance the capability of adaptive change over time?
Parascience!
When I called Mike, last week, on his use of the term "pseudoscience" (since something that is "not science" isn't necessarily "false science") he offered, as substitute, the word "parascience." I have been mulling over that intriguing alternative for a while, and as part of that process, just looked up the etymology of the prefix "para-." There are two: the first one, most commonly known, means "by the side of," "past or beyond." So a "parascience" (like literary study or history) might just exist "alongside" science, but it could also have come "before" science, or represent that which lies "beyond it," encompassing a larger sphere.
The second--and to me now even more intriguing--etymology comes from "parare: to prepare, defend from, shelter" (think "parapluie," which protects from the rain, or "parasol," which protects from the sun). I've had fun playing w/ this sense, in which the humanities might be said to "protect" science from itself (perhaps from its illusions that it exists w/out human investments, or outside of power relations....?) This is what much contemporary science studies does; I'm reading right now an intriguing book by Patti Lather called Getting Lost: Feminist Efforts Toward a Double(d) Science, which asks "how research-based knowledge remains possible after so much questioning of the very ground of science." It's only possible (no surprise) if inquiry is acknowledged as a social practice; it's less the nature of science, than its effects, that are now seen to be @ stake.
Along these lines, and in preparation for a new course on emergent systems that Paul and I will be teaching this fall, I've also just finished a monolithic study, Jared Diamond's Guns, Germs and Steel: The Fates of Human Societies, which sets the challenge of developing "history as a science." In his conclusion, Diamond catalogues the difficulties facing scholars in the historical sciences (among which he includes not just students of human history, but astronomers, climatologists, ecologists, evolutionary biologists, geologists, and paleontologists): these difficulties include "the impossibility of performing replicated, controlled experimental interventions, the complexity arising from enormous numbers of variables, the resulting uniqueness of each system, the consequent impossibility of formulating universal laws, and the difficulties of predicting emergent properties and future behavior." In other words, in the language we've been using here, it's all about the impossibility of using formal systems.
I was raised in a rural area, by a farming family who ofttimes feels themselves @ the mercy of unpredictable weather systems. I'm quite sure, looking back, that a good part of what drew me to a bookish life was an escape from the unpredictable, the desire to have the kind of work where I could be more in control of outcomes (and, perhaps paradoxically, freer of the dictates of the material world, more able to go exploring imaginatively). Of course what I've learned over the past few decades is that teaching and learning is not predictable (and that it's all the more interesting, the more unpredictable it is!). But I do understand the strong draw of formal systems.
And now, given our recent complex conversations, I'm wondering how widespread this phenomenon is. I'm thinking, for example, of David Ross's account of his distressed realization, as an economist, that his work of "counting changes what is counted"; or of Wil Franklin's repeated queries about how, "in a sea of change," we might "find any ground upon which to base action?" How many of us became (and continue to practice as) academics in search of the (illusive?) security that formal systems offer? How many of us come to recognize that search as illusory, and make adjustments....?
Against ambiguity
I was, and am struck, by Anne’s comment last week that words have meanings. Note the plural form of the terminal noun in the previous sentence. There is a nuance here that is worth pointing out, one which is relevant to demarcation and ways of knowing, and one which I think leads to misconceptions regarding science and what it is or is not.
During the discussion, Anne specifically called out my use of the term, pseudoscience, as derogatory. This attribution of my intentions is interesting and worth exploring. First, let’s look at a few definitions...
pseudo-, comb. form (from the Oxford English Dictionary): Forming nouns and adjectives with the sense ‘false, pretended, counterfeit, spurious, sham; apparently but not really, falsely or erroneously called or represented, falsely, spuriously’.
So how might this term get used? Here’s an example from biology:
pseudopodium, n.: (from the Oxford English Dictionary) A protrusion of part of the protoplasm of an amoeboid cell, typically in the form of a blunt lobe, by which it moves, ingests particles, etc.; (from Mike Sears) not really a foot...in the evolutionary sense that it is not a derived character shared by other organisms with ‘real’ feet.
Now, I have to wonder if biologists have been insensitive to the nature of amoeboids, given our terminology? I’m guessing we’ve been more sympathetic than most other categorizations of humans have been. My point is that, as a scientist, the term pseudo didn’t imply any ill feelings toward amoebas (my apologies to amoebas if any feelings were hurt).
Now to the bogeyman...
pseudoscience, n.: (from the Oxford English Dictionary) 1. As a count noun: a spurious or pretended science; a branch of knowledge or a system of beliefs mistakenly regarded as based on scientific method or having the status of scientific truth. 2. As a mass noun: spurious or pretended science; study or research that is claimed as scientific but is not generally accepted as such. Chiefly derogatory.
Interestingly, I attached myself to the first definition, but Anne attached me to the second. I’ll stick to my meaning and avoid the implications of the second, if others will allow. As per the speech by Lakatos, there is a reason for needing to distinguish pseudoscience (i.e., ‘a system of beliefs mistakenly regarded as based on scientific method’) from science ([from the OED] 4. a. In a more restricted sense: A branch of study which is concerned either with a connected body of demonstrated truths or with observed facts systematically classified and more or less colligated by being brought under general laws, and which includes trustworthy methods for the discovery of new truth within its own domain). Note, the last part of the definition of science that I’ve used, ‘a branch of study...which includes trustworthy methods for the discovery of new truth within its own domain’. Oddly, I find myself valuing the trustworthiness of science versus what I feel as the untrustworthiness of pseudoscience. (Others might be comforted in knowing that I am pretty agnostic to the trustworthiness of nonscience).
An example that comes to mind is that of Intelligent Design (ID). This is a clear example of an instance of pseudoscience that, in particular, has used an untrustworthy manipulation of terminology and methods to establish (de novo) the authority of a subdiscipline within biology. I won’t dive into all of the documentation to support this statement (I can if pressed), but much can be found with regard to this assertion by study of the documents and evidence pertaining to the Kitzmiller v. Dover Area School District Case (if one is interested). Some would ask, what do I fear from adding to the ‘story of evolution’? First, I or others have no such fear. We do have a regard, though, for the methodology, or should I say story, of science.
This regard is why I raise the example of ID. To add to the story of science, one needs to operate within the existing story. ID purports to do this. One proposition of their story is to suggest that life is complex, so irreducibly complex, that it must have had a designer. Well, scientists can deal with such a statement, and in fact have. Systems purported to be irreducibly complex have been reduced (again, you can search the scientific literature for this if interested). So what can be gained is that, despite what seems to be a human need to the contrary, there is no scientific reason that ‘complex’ life can’t arise and evolve by a random, nondirected process. (By the way, this is a very powerful statement that has big implications across disciplines). The problem with ID gets back to the original premise regarding the meaning of words. Proponents of ID have used the multiplicity of meanings that become attached to words as a means of attempting to establish ID as legitimate science. (Fortunately, there has been an extensive paper trail to document this, and the scientific method has not failed us when it comes to invalidating the claims of such a ‘theory’). Such multiplicity has been used not only in establishing an assumed authority for ID, but also, through disingenuous terminology such as ‘teach the controversy’, in attempts to sneak ID into our teaching traditions. This phrase is very loaded to an evolutionary biologist not because we don’t want to teach any controversies in our field (because I do teach controversies, very intentionally), but because the controversies can’t be contrived ones that use a play on words as a means to gain legitimacy.
The difference between Anne’s use of words (having built a career on the premise that words have meanings, to paraphrase) and the use of words by science is that science does try to restrict the definitions. To frame this in the conversation about formal systems: a consistent system cannot be built if the terms on which the system is constructed have multiple meanings. Seriously, what better way would there be to develop an inconsistent (incoherent) system than to attach multiple meanings to any given statement? Now, that said, exploring the different meanings of words might be very important in other disciplines (English, Political Science, Education, International Studies, etc.) for very practical reasons. For instance, meanings have the potential to reveal the motivations behind the actions of humans. That said, please re-read the definitions that I have laid out with regard to science. They are necessary to understand if one wishes to add to or edit the story of science.
An interesting and relevant path to follow for discussion comes from a paper published in 2001 by Joel Brown entitled “Ngongas and ecology: on having a worldview”. Here is an excerpt from its abstract:
“Ngongas provide a metaphor for some of the opportunities and challenges facing the science of ecology and evolution. Ngongas, the traditional healers of the Shona culture, Zimbabwe, fail in the delivery of quality health by today’s standards. Their outdated worldview makes most health related issues seem more complicated and more multi-factorial than when viewed through the worldviews of modern medicine. With the wrong worldview, one can work very hard, be very bright and dedicated, and still be ineffective. With the right worldview, one can work much less hard and still be extremely effective. As ecologists, we should be opinionated and possess clearly articulated worldviews for filtering and interpreting information. As ecologists, we are also a bit like ngongas – we often fail to provide answers for society’s ecological questions and problems, and we excuse ourselves with a belief that ecological systems are too complex and have too many factors. Unlike ngongas, this invites us to pay a lot of attention to promoting and assessing competing worldviews. We should be open-minded to the anomalies in our worldview and the successes of alternative viewpoints.”
Multiplying meanings
Hey, Mike--
I didn't say that I had built my career on the premise that words have multiple meanings, but rather that I had staked my life on it, actually taking every possible opportunity to multiply them...
...which probably explains why (!) I'm still trying to understand better why you find it useful to limit the number of meanings available-and-applicable in any given case. I'm perhaps even more curious about how you go about making such selections. So let's pick up here? When--having acknowledged that several alternative meanings are listed in the dictionary, and so @ least theoretically possible--you say "I'll stick to my meaning and avoid the implications of the second, if others will allow," I actually find myself not quite willing to "allow" that move, at least not without your explaining its back story and motivation. I'm curious about the process that goes into such a selection and limitation of one meaning among several possible alternatives: on what grounds do you make such a choice? On what presumptions does it rest? Are they acknowledged? What purposes does that particular definition serve for you? What does it allow you to argue, that the second definition, for example, might not enable? If you "need" a particular definition, to achieve a certain end, mightn't you be begging the question you want to answer? Constructing a circular argument?
I think I understand how the scientific method works as well as it does: restricting the meaning of words and limiting the scope of reference enables scientists to construct and work within formal, consistent, (more) controllable systems. But another way science works, surely, is through the counter-move of the kind of open-mindedness to anomalies and alternatives w/ which your posting concludes. I'd say that it's actually not the case that "to add to the story of science one needs to operate within the existing story." I'm thinking, specifically, of Galileo. I'm thinking, more recently, of a series of recent brown bag discussions, in which it was suggested that scientists dismiss outliers as "procedural errors," until enough of them accumulate to actually change the existing story (think: the growing recognition of the growing hole in the ozone layer). I'm thinking, more generally, of Thomas Kuhn's paradigm shifts: extraordinary science, constituted by explicit refusals to operate within the existing story. I'm thinking, more popularly, of Lisa Belkin's wonderful 2002 NYTimes Magazine article on "The Odds of That," which asks how we decide what's random and what's relevant: when are we "making up" highly improbable patterns in large data sets? When might they become the key to a new story or pattern not yet seen?
Finally, I'm thinking of the ethics of using the life experiences--aka the so-called "wrong world views"-- of others as metaphors, or vehicles, for constructing our own "right ones," rather than trying to understand why others may construct the world as they do: what useful purposes might such constructions (such multiple meanings) serve--both for them and for us? (For an experiment in this direction, see Adjectives of Order...)
Translating the language of science
For the new ESem on (biological, cultural, individual) evolution that Paul and I will be teaching this fall, I've been reviewing some studies of the evolution of language. The final essay in a special issue of Science (303, 5662: February 27, 2004) on Evolution of Language, Scott Montgomery's "Of Towers, Walls, and Fields: Perspectives on Language in Science," includes some observations that might be useful in our discussion about precision of language use in science:
"Language in science is in the midst of change and appears dominated by two contradictory trends. Globalization of scientific English seems to promise greater international unity, while growth of field-specific jargon suggests communicational diaspora....technical Englishes everywhere face a fate of inaccessiblity....
An important goal would be to increase tolerance toward variation in scientific English--to avoid the imperial attitude that one standard must be obeyed in order that any and all threats of semantic chaos be met....It may help, in this arena of work, to recognize the linguistic context involved: Explaining science qualifies as a form of translation, the movment of knowledge from one linguistic context to another....
Language in science is a historical reality, evolving @ every turn....the full landscape...reflects..the complexity and fertle change that are central to scientific practice itself."
On the Difference Between Philosophy and Science
Here is an interesting essay on the Difference Between Philosophy and Science. Maybe it has some bearing on our discussions?
questions
What is useful in the cognitive achievement of reducing apparent complexity?
Can the possibility of irreducibility also be a useful cognitive tool?
Is it possible, and, if so, might it be sometimes desirable, to exercise a degree of control/restriction over the definitions of words to achieve chosen objectives, and at the same time expect that these definitions will tend to slip/oscillate/mix with other discourses?
embracing the limitations of formal systems
"the strong draw of formal systems" is indeed, I suspect, related to a wish for certainty, "security," as well as a reliable sense of "meaningfulness" (see Evolution/Science: inverting the relationship between meaning and randomness) And I too think we all experience/act out of it to varying degrees, and could usefully learn to recognize the limitations of the formal systems approach. But, I'd argue, its not "all about the impossibility of using formal systems." To the contrary, its about "the impossibility of using formal systems to achieve any fixed and unchallengeable certainty/security/meaningfulness." We all can and should (and inevitably will) go on using formal systems to various degrees for various purposes. They are very useful, not only as a tool to summarize observations to date but also to open for exploration new ways of thinking about things. What we need to do is not to get rid of them but to accept their limitations ... and, perhaps, to give up the search for absolute certainty/security/meaning, and settle for making the best sense we can now of what we have to make sense of. Maybe we could even learn to enjoy uncertainty/insecurity/the lack of absolute meaning as the space that allows us to be meaningful participants in the creation of "opportunities that weren't there before and ... meaning that had yet to occur to us" (Evolution/Science).
constraint as scaffold
The poem and comment I recently posted speak to this shift -- thinking of formal systems as enabling constraints, not full stops on the horizon of expanding possibility. So the poem can be read (and perhaps written?) not as a rejection of the formal system of English, in this case, or of language instruction, but as an expression of the necessary creativity engaged by their uptake.
On the complex ways words keep each other's company
Following up on that...
some thoughts about the limits of thinking of grammars as formal, fully "accountable" systems.
search vs. settle? Can both be "science?"
I am intrigued by the idea of embracing limitations -- accepting/welcoming/affirming incompleteness/blockage/barriers/edges. It's interestingly counter-intuitive, as I typically think of the term "embrace" as applying to something positive, rather than to its edges or endings. I can see how the choice not to get rid of something (either in mind or in experience) but instead to accept its limits is actually a way to embrace more of what it is. For example, I'm thinking of one of my daughters and her persistent discomfort with aspects of schooling. If I embrace this discomfort, I let it in as something to build on or with . . .
"Making the best sense we can now of what we have to make sense of" sounds like the daily/hourly/minutely work of evolution and growth, right? It's the way eggs become chickens . . . It's closer to the conditions of organismic life than a search for TRUTH -- for answers that resolve questions in a trans-organismic way . . ?
Of course, PART of OUR organismic life, as humans, is to create abstractions of organismic life (and other things). So embracing the limitations of formal systems means in a sense accepting that they are among the experiences and tools that we have for creating and making sense of experiences -- no more and no less. As Adrienne Rich put it, "We must use what we have to invent what we desire." So "settling" would not mean compromising, giving up on desired states, but settling into what we are MORE -- yoga-wise. Settling into our surroundings so we can see/sense and better use "what we have."
settling as a tool in searching
Yep, "the daily/hourly/minutely work of evolution and growth." And very much one that "embraces" more fully what is, existing limitations very much included.
A similar perspective reached in a different way and expressed in different terms:
I am, and I can think, therefore I can change what I am/think
And another way to get there/express it
a world in which everything follows from a set of first principles wouldn't, in any case, be a very appealing one in which to live, for me at least. I would prefer a universe in which there is enough uncertainty to assure that the future has new things in it, and enough to allow me to conceive of rectifying existing wrongs and perhaps even contribute to doing so
And another ...
We don't need something out there to motivate communities any more than we need something out there to motivate ourselves. Rather than agreeing on common principles to create communities of individuals we could instead simply rely on our existing motivations, on our experiences, and on our abilities to conceive alternatives via stories.
And another
Out beyond ideas of wrongdoing and rightdoing, there is a field. I will meet you there ... Jelaluddin Rumi
embracing limitations vs? meeting beyond
One of my unconscious assumptions is likely that going beyond (finding a way out/a more expansive field) doesn't fit with accepting current limits. I think the frog story goes something like: "if you accept current limits, they will control you and your thoughts. Resist and refuse them! Shake y/our frog fist at them!" So . . . if this is just one story among many that are possible, another might be: "You are free to work with and from whatever and wherever you are. You don't have to right the world before you enter it. You are already t/here. So accept what is around you as what you have to work with -- don't assume you have to make it right or good before you get started; you've already started." There is something here about there being enough time/lack of urgency, too.
where one moves from and to ...
"enough time/lack of urgency" makes an interesting connection between this conversation and one going on about evolving systems in general (see "A meeting summary" at Reflections after the first year). Going "beyond current limits" (beyond formal systems) requires not only being able to "accept what is around you as what you have to work with" but also a long term ("deep time") perspective on what one is aspiring to? what one regards as "important"?
It's an interesting question how much "If you accept current limits ... Resist ..." is a "frog story." My guess is that for some people, the more immediate in relation to the ideal is the more powerful unconscious driver, while for others a resistance to limits and a deep time perspective is more so. And that this reflects in part both experience and thought, and so can change in the same person over time.
And
what about "a resistance to limits" combined with an urgent perspective?!
Beyond demarcation
I joined this conversation several months ago after discussing with Paul our mutual interests in the role (or lack thereof) of stochasticity in ultimate understandings of natural systems. Recent discussions with this group have led me to some better understandings of what it is that I know, don't know, and can't know. For instance, could we distinguish a random process as truly random? Well, probably we can't. If the process is nonrandom and we identify some model that approximates that nonrandomness, then we can discard randomness. If we set up randomness as our hypothesis, given modern experimental approaches, we are testing for randomness against the null, which is by definition random. Maybe this is where Godel comes in...there can be true statements in our system, e.g., 'this process is random', which cannot be shown to be true in that system. So there's a potential limitation on science given a formal system, I think. Is there any way around this? Maybe not, and I'm fine with that. This limitation certainly does not mean that randomness cannot be a useful property through which models or theories can be constructed, nor does it mean that science as one means of inquiry is a broken or failed system. I'm optimistic on this point.
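To make the asymmetry above concrete -- a test can reject randomness, but it can only ever fail to reject it, never establish it -- here is a minimal sketch in Python. Everything in it (the simulated coin flips, the hand-rolled chi-square helper, the 5% critical value of about 3.84 for one degree of freedom) is my own toy illustration, not anyone's actual analysis.

    import random

    def chi_square_uniform(counts):
        """Chi-square goodness-of-fit statistic against a uniform expectation."""
        total = sum(counts)
        expected = total / len(counts)
        return sum((c - expected) ** 2 / expected for c in counts)

    # Stand-in data: 1000 simulated coin flips.
    flips = [random.randint(0, 1) for _ in range(1000)]
    counts = [flips.count(0), flips.count(1)]
    stat = chi_square_uniform(counts)

    # With 1 degree of freedom, the 5% critical value is roughly 3.84.
    if stat > 3.84:
        print("reject the null: the flips look biased")
    else:
        print("fail to reject: consistent with randomness, but not proof of it")

Note that the second branch is the most such a test can ever deliver: consistency with randomness, not a demonstration of it.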
Now it seems that the discussion has moved from the above sorts of questions to ones of values. Again, as I stated, ideally science operates without values (consistency, I would argue, is a property of scientific inquiry, not a value). Science is a process that makes statements regarding the consistency of observations in the natural world based on evidence. Other disciplines can attach values as to whatever that means, or what implications that might have. And yes, as practitioners, our personal values affect our practice of science. For instance, a value goal for my research is to ask questions of ecological systems without causing the death or increasing the suffering of the organisms involved. Does this value system limit my science? Yes, there would be more efficient means to answer questions if I did not value life and devalue suffering. That said, science doesn't care one way or the other, and my research will proceed despite my values complicating (and slowing) the matter. Ethicists, on the other hand, should and do care, and they (along with elected and appointed officials in both government and private sectors) limit just what it is that science can investigate given our evolved social system as Western humans. I'm appreciative and fine with that as well.
So where is the conversation moving? I think that discussing the differences between science and other disciplines is a good and interesting discussion to have, as long as it remains constructive. Science offers answers to particular sets of questions and is agnostic to other questions. The same can be said for other fields of inquiry with respect to science. Communication amongst disciplines would certainly go a long way toward understanding what types of approaches are relevant or efficient for addressing particular types of questions, or where cooperation amongst disciplines would be fruitful. As a part of my training in the liberal arts, I have benefitted from looking across disciplinary boundaries and engaging others for the general sake of learning. I realize, though, from recent discussions that my optimism (that specialization and conflict across disciplinary boundaries are good things) is not necessarily shared by all. In fact, there seems to be suspicion amongst disciplines competing for primacy within some perceived (if not in many respects real) power structure. I don't think this structure is an inherent property of science, though, nor of any other discipline, but a consequence of what it means to be a human individual vying for position within an evolving social system. Hopefully, the conversation can move beyond this awkward aspect of human nature, and maybe acknowledge where demarcation can be a good thing, something that we can all take advantage of to become better learners and teachers.
Where are we going?
I, too, have been participating in this group because of my interests in understanding the relationship between randomness and order in connected systems. Where we are now seems very far from that. I'm fine with the distance, but I am troubled by the direction.
It seems like we are trying to "move beyond science" based on the limitations of formal systems. This has led us in a number of directions, and I am completely lost. I understand Godel's and Turing's arguments, but I find that they lend no support to moving beyond science. I don't even think that they warrant moving beyond formal systems.
beyond formal systems rather than "beyond science"
It's never been my intent to move "beyond science," but it certainly has been my intent to explore moving beyond "formal systems" as an unshakeable foundation for science or any other form of inquiry. And I continue to see the problem of the relationship between randomness and algorithmic processes as significant in that exploration, as per notes for our upcoming meeting.
more thoughts
Hi All,
I appreciate the discussion we had last week about what science is and why the question is important to us.
I’m glad the conversation has continued on Serendip and I appreciate the opportunity to join it here.
When it comes to the consideration of a question at once so basic and so abstract, it’s useful to use stories and propositions/arguments as communication tools. So I am going to write in both registers, and ask others to consider doing likewise. Our own personal, idiosyncratic connections to this question are part of what is in play as we converse -- not as hurdles to overcome or as touchstones, but important nevertheless in ongoing, changeable ways. Coming to know one another, and ourselves, better is entwined with sharing an inquiry into the nature of science.
I’m pretty sure that my fidelity to postmodern ideas -- about the contingency of thought, the multiplicity of identity, and the nonfoundational character of knowledge -- is a consequence of their offering relief from and an alternative to modes of knowing and living that struck me in my coming of age as frustratingly and at times dangerously rigid and impermeable. Postmodern perspectives also offer richer conceptions of authority beyond the “master narratives” -- and masters -- that hurt and limit so many people, and so much learning, insofar as they distort, discount, control and erase experiences of human life.
It’s from a postmodern perspective that I question terms:
What is evidence? Is it something everyone can see from whatever their perspective is? Are “data” found or created? Does the natural world include humans? What is natural?
What are values? Are they fully separable from knowledge? What might we see if we imagine that it is not possible fully to disentangle the ways we know from the ways we live (Parker Palmer), or love and action (Rich) or making things work and changing things? What if this impossibility is not a limit but a condition of science, and of life?
What is identity? Can people be and believe a lot of different, and at times contradictory, things? What kinds of education could foster such multiplicity in ways useful to individual and social growth, including the growth of science?
What is progress? What if it’s not a single narrative? From whose perspective is progress measured and justified? At what cost?
It’s also from a postmodern perspective that I am interested in blurring boundaries between categories as another thinking tool, as demarcation is. Feminist post-structuralists have long shown that categories in the West are often understood in oppositional terms: science and religion; “basic” and “applied” research; teaching and learning; ideal and real; academic and practical. To my mind, there is freedom and possibility in unfixing boundaries between such terms and seeing what happens when we let them run. I think this interest in loosening the ascribed meanings of categories also arises from my interest in words, how they work, how they move and change, how they shape as well as name our worlds.
A power-sensitive approach to knowledge generation made up an important part of my doctoral studies, and I continue to believe that it is a useful lens -- one among many -- on learning and discovery. Rather than think of power struggles or academic politics as purely social issues, disconnected from the core work of science or any other form of scholarship, I proceed with the possibility in mind that they are often fully embedded in that core. At the same time, I am working on figuring out how to see more easily through a lens that doesn’t take them as given or primary. For me, an attraction of this group in the Evolving Systems project is what it offers to that effort.
My first organizing framework for exploring oppression was a critique of patriarchy. My second, entwined with the first, was my experience as the daughter of a schizophrenic mother whose journey in the mental health system, as in marriage and work, was for me a potent education in the often ignorant and unhelpful and sometimes outright oppressive and dangerous character of scientific knowledge and practice. I’d like to draw on, and offer others, what is useful in this story, but not be limited to or by it. I can imagine that for others, there may be parallel stories -- perhaps about religion as authoritarian/suffocating, or about belief as a limit on intellectual growth -- that could orient and re-orient -- and be re-oriented by -- the investigation we are sharing.
avoiding the "authoritarian/suffocating"
I like a lot the suggestion that we "draw on, and offer others" our own experiences with the "authoritarian/suffocating" and the stories we make of them ... but "not be limited to or by" that. There certainly are "parallel stories" about things other than science being "authoritarian/suffocating." Indeed "science" itself arose to a significant extent as a way to contend with other forms of "authoritarian/suffocating," and still serves that function for many. Significantly, the same is so of many humanistic and religious traditions. What all of that suggests to me is that it is actually the "authoritarian/suffocating," rather than any particular instantiation of it, that needs to be resisted. And that people with common experiences of being made uncomfortable by the "authoritarian/suffocating" might do better by making common cause rather than contending with each other.
Along these lines, I'm further intrigued by the notion that "science," and "religion," and the "humanities" ("postmodernism" included) are each in some contexts "authoritarian/suffocating" and in others ... liberating. Perhaps that's a general characteristic of all socio-cultural "instantiations" of human activity? And perhaps it's also a general characteristic of human thought, of our tendency to make our experiences more manageable by clustering what are always diverse things under a single umbrella name, and then attaching a single value judgement to that single name?
It's for that reason that I particularly like "not be limited to or by." The naming and valuing is not itself a problem; it serves a useful function and can be liberating. The problem arises when we forget that the names and values are relevant only in the particular contexts in which they arose, and should not be presumed to have universal significance (see Deconstructing and reconstructing cultures and individuals). We are at risk of oppressing each other when we do this, but equally at risk of oppressing ourselves, of becoming for ourselves "authoritarian/suffocating." Perhaps we could help not only each other but ourselves meet the general challenge of the "authoritarian/suffocating" by continual reminders that our stories may be helpful in that they reflect experiences, individual and collective, to date, but are otherwise significant only insofar as they serve as the grist for ourselves and others to conceive as yet unconceived stories? Rather than resisting each other's stories as incompatible with our own, perhaps we should value them as the means by which we can, individually and collectively, escape existing oppression in whatever form it takes?
Doing justice to the inexpressible
So: I was mostly confused by yesterday morning's conversation. I'm happy to let go of truth; not so sure about knowledge (but then what do we know, if we don't know what is true?). I'd certainly like to understand more (much more) about the relation between the expressible, the communicable, and the represented.
"Expressible" only makes sense to me if it is "communicable" (I'm thinking here of a concept in linguistics, the "basic economy of response" needed for a conversation to be socially functional: language is intelligible if speaker and listener "connect," if what one says is "not heard as a non sequitur" by the other). I think, however, that expressible communication only makes sense to me, as a concept, if it is also understood to be "representative," if the words or numbers stand in for something, are signs for that which is not present....
Which gets us to the "non-representable," that which cannot be represented. Is that which is not representable not worth our bother? Or is it the only thing worth bothering about? Or is it simply that which we cannot handle--so we turn our attention, somewhat helplessly, to what we can manipulate? So we ask the questions that can be answered, and let the bigger, more important and pressing and real ones, go?
Wittgenstein's cryptic "Of that which we cannot speak we must remain silent" was evoked several times yesterday. In Goldstein's novel, 36 Arguments for the Existence of God, Professor Klapper invokes a less-cryptic passage from Nietzsche's Twilight of the Idols, which gives a very clear answer to these questions: "Our true experiences are not at all garrulous. They could not communicate themselves even if they tried. Whatever we have words for, that is already dead in our hearts. In all talk there is a grain of contempt."
Rebecca Goldstein got a MacArthur prize for her ability to "dramatize the concerns of philosophy without sacrificing the demands of imaginative storytelling." Having just finished her newest novel, I'd say that's not quite the case; she's George-Eliot-like in weighing her fiction down with philosophy and religion; as a novel, 36 Arguments for the Existence of God doesn't quite sing, though some of the arguments w/in it certainly do.
Speaking of which, here's one that brings us (@ least it brought me) back to another very compelling example of the limits of formal systems. The math prodigy @ the center of the text is also the son of the Rebbe of a Hasidic sect, whose life--as he himself describes it @ 16--presents this dilemma:
"I know the formula, but I can't see my way clear to the solution. I try out every permutation, and nothing comes out right. How can that be? How can there be no solution? The only thing I seem to be able to prove is that there is no solution....If I leave...I break the heart of...a community that remade itself through the efforts of my grandfather....I break the heart of my father, whom I love more than anything in the world. So that has impossible consequences and can be ruled out. so I stay. But if I stay, then...I live among people who love me more than they love themselves...and I'll never be able to share a single thought with them. I'll live the life of my father, and of his father before him, and of all the fathers who lived only to repeat the lives of their fathers. Where's the sense in that? How can one chose such a meaningless life? So I leave. And so it goes. Going to a university is necessary but impossible. Staying...is impossible but necessary...."
He decides to stay, and is observed, years later, by another, once "astonished by a little boy's genius," and now "astonished by the way in which that genius has been laid aside....Still, if to be human is to inhabit our contradictions...to be unable to find a way of reconciling the necessary and the impossible, then who is more human....? And if [his] prodigious genius has never found the solution, then perhaps that is proof that no solution exists, that the most gifted among us is feeble in mind against the brutality of incomprehensibility that assaults us from all sides. And so we try, as best we can, to do justice to the tremendousness of our improbable existence..."
What might doing justice look like, in such a context (which is, of course, our own!)?
Beyond Wittgenstein (and Descartes?)
I've been reading this great graphic narrative, Logicomix, which recounts the life of Bertrand Russell. Wittgenstein makes an extended appearance in these pages, and on one of them (p. 288), he has an exchange w/ the members of the Vienna Circle (wish you could see the comics--the pipes, mustaches, beards and chandeliers...):
"To celebrate our first meeting, we offer you our 'manifesto of the scientific worldview.'"
"...written in the wake of your 'Tractatus'..."
"...inimitably encapsulated in its last line..."
"...'what we cannot speak of, we must pass over in silence.'"
"Where 'speak,' naturally, means 'speak logically!'"
"Your work gave us the means to expel religion, metaphysics, ethics, etc. from rational discourse!"
"Since 'what cannot be spoken about logically' is, quite literally, non-sense..."
"...and obviously, beneath the dignity of serious minds!"
[Wittgenstein]: "Just wait a minute!
The meaning of the 'Tractatus' has completely escaped you!"
?
"Its point is the exact opposite: the things that cannot be talked about logically...
...are the only ones which are truly important!!!"
?
!
Later, in a summary lecture to a group of isolationists, demonstrating against U.S. involvement in Europe in 1939, Russell says, "Wittgenstein has a point, you see: 'All the facts of science are not enough to understand the world's meaning!'...taken my story as a cautionary tale, a narrative argument against ready-made solutions. It tells you that applying formulas is not good enough--not, that is, when you're faced with really hard problems!"
?
!
If I'm following what's going on 60 years later in the evolving systems circle, though, Paul seems to be nudging us past this standoff (sure, logical systems have limits...) to say that the range of what is inexpressible is never fixed, and so always explorable, via a variety of ever-moving systems of what is expressible. This puts me strongly in mind of a move made, years ago, to write and tell Descartes that he hadn't been skeptical enough. To say that formal systems are incomplete--that they don't express all that is--is not to limit (in fact, it is to enable?) our exploration of what lies beyond them....
"we can think, therefore we can change what we think"?
Doing justice to the inexpressible AND the formal/expressible
Many thanks. Indeed where I was trying to nudge myself (and offer as a possibility to others) was "past this standoff." Formal systems, what can be talked about "logically," are limited. But the limitation is specific to particular formal systems. The "inexpressible" is not inevitably so, but only so in relation to a particular formal system (language). By changing the formal system/language, one brings into the realm of the expressible things that were not previously so. By understanding the limitations of formal systems one gains a tool to transcend the limitations of any particular formal system. And a useful appreciation that the inexpressible is itself a product of formal systems and so will always exist in some form. Human understanding itself creates new challenges for human understanding. There is no standoff between "formal systems" and an interest in the "inexpressible;" they are co-dependent, co-evolving. "We are, and we can think, therefore we can change both what we are and what we think."
"Why should these exacting sciences exact anything from me?"
As a little (not-so-fictional?) evidence for Paul's musings that we're still in a mode in which some scientists and some humanists seriously mistrust one another, here's a judgment that comes straight from the mouth of Jonas Elijah Klapper, Extreme Distinguished Professor of Faith, Literature and Values. A wacky character in Rebecca Goldstein's new novel, 36 Arguments for the Existence of God, Klapper is here explaining why he has no interest in the child math prodigy he's just encountered:
"I'm not impressed by the slide-rule mentality. I remain unimpressed with the mathematical arts in general. What are the so-called exact sciences but the failure of metaphor and metonymy?...
mathematics...is a form of torture for the imaginatively gifted, the very totalitarianism of thought, one line being made to march strictly in step behind the other, all leading inexorably to a single undeviating conclusion. A proof out of Euclid recalls to my mind nothing so much as the troops goose-stepping before the supreme Dictator. I have always delighted in my mind's refusal to follow a single line of any mathematical explanation offered to me. Why should these exacting sciences exact anything from me?....'what do I care about the laws of nature and arithmetic if...I don't like these laws....?' Dostoevsky spurned the hegemonic logic, and I can do no less."
evolving systems and Gödel: the countably infinite and beyond
I'm seriously intrigued both by some of the ideas that emerged in conversation this morning, and by some of the social dynamics underlying that. Some thoughts along both lines, from the session as well as conversations after.
I'm happily getting more familiar with the details of Gödel as well as the context of his work. Glad to have Alan around this morning to help with both Godelization and the diagonalization argument first used by Cantor. Together it seems to me they give us tools that will not only help with Turing and Chaitin but can be used more generally, as suggested in my summary above. The distinction between the "countably infinite" and still larger infinities, and the ability to make that distinction, seems to me quite important in thinking generally about inquiry. And interestingly related to Wittgenstein's "of that which we cannot speak we must remain silent." Human expressions of understanding are at most countable, since they rely on formal symbol systems, but there may exist human understandings beyond those, "inexpressible" in the particular formal symbol systems used by particular people at particular times. Rather than "remaining silent" about those, one might deliberately alter one's formal symbol system in an effort to make them expressible. The point here is not that there isn't at any given time an "unknown" which must remain unknown, but very much the opposite: that there is at any given time an unknown/inexpressible that one might in principle reach by a change in one's language/formal system. The limitations that become apparent when one develops formal systems become, on this view, an incentive to develop ways to go beyond them, to develop new formal systems, to appreciate their limitations, and so on. A nice emergent perspective on inquiry.
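For anyone who wants the diagonalization move written out, here is one standard compact rendering (a textbook sketch of Cantor's argument, set in LaTeX purely as an aide-memoire; it is not anything specific from the meeting):

    % Cantor's diagonal argument, in brief.
    Suppose we had a countable list of infinite binary sequences
    \[
      s_1, s_2, s_3, \ldots, \qquad s_i = (s_{i1}, s_{i2}, s_{i3}, \ldots).
    \]
    Define the diagonal sequence $d = (d_1, d_2, d_3, \ldots)$ by
    \[
      d_k = 1 - s_{kk}.
    \]
    Then $d$ differs from each $s_i$ at position $i$, so no countable list can
    exhaust the set of infinite binary sequences; that set is strictly larger
    than the (countable) set of finite expressions over any finite alphabet.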
Yet to be fully clarified, in my mind, is how self-referentiality and the consistency problem relate to this. Are they correlates of it, or additional characteristics, more relevant in some contexts, less in others?
The dynamics underlying the conversation continue to be ones of concern about whether an appreciation of formal systems is helpful to those less inclined to them, contrasted with concern that recognizing the limitations of formal systems is either practically irrelevant or represents a slippery slope in the battle against irrationality. Maybe with the legitimization of the "inexpressible" we're closer to answering the first set of concerns? With regard to the second set, my hope is that the Turing/Chaitin discussions will help with the problem of relevance. As for the slippery slope, I continue to feel that "risk" is not a problem unique to either position; the world has as much to fear from rationality as it does from irrationality. And biological evolution provides a nice model of how to get this less wrong: use both formal systems and some indeterminacy, and trust the dynamics of interaction with external pressures to keep the whole thing under control. "Risk" cannot be eliminated along any productive/generative path, but it can be treated as a virtue rather than a deficiency (cf Evolution/science: inverting the relationship between randomness and meaning).
The dynamics is also interesting since it reminds me of old "Two Cultures" conversations, and suggests we're still in a mode in which some scientists and some humanists seriously mistrust one another. The origins of that mistrust have never been entirely clear to me but an interest in/distaste for "formal systems" may have something to do with it. Formal systems seem "inhuman" and therefore threatening to some people, whereas their very formality is appealing to others. Interestingly, both sides defend their positions in terms of contributions/threats to social well being. I wonder whether one can't do better for the latter by making common cause rather than by treating an alternate possibility as a distinctive threat. And to what extent the apparent "inhumanity" of formal systems is a matter of how it is talked about rather than the thing itself.
"Infinities come in various sizes"
As one of the resisters (or, more accurately, not quite engaged-ers), let me suggest that disinterest, rather than fear, might also be a motivator, here: I understand that formal systems have their uses (and am learning more about such uses from our conversations); I also understand that they have their limits (ditto). I feel neither "threatened" by nor "afraid" of them, but rather find myself more interested in the unpredictable complexities of the worlds that lie outside their purview. What more I might learn from a continued working-through of the proofs and argumentation...?
I guess I'll see!
So (just to keep some other spaces in play), here's one place my mind went wandering to this morning, when Alan asked Paul what he meant by "mindless," and Mike asked what alternatively "soulful" activities he had in mind. It occurs to me that the practices of "mindfulness"--which encourage practitioners not to "think" into the past or future, but rather to "stop" thinking--might also well be called "mindless," because they involve letting go of skepticism, of speculation, of thinking "beyond" what is. They might thus be said to be not "soulless," but "soulful": absolutely present, absolutely bountiful. Unbounded. Infinite.
Crossing the lines of science and formal systems to...?
More and more, the 'demarcation problem' comes to mind...what are the bounds of science, nonscience, and pseudoscience?...what is it and is it not that science can inform us about the natural world? Discussions of 'mindless' or 'soulful' seem to be attaching values to actions. These values fall outside the realm of science (and formal systems)...there is no experiment that could be performed by which one could determine a 'soulful' or 'mindless' action. If there could be such an experiment, then maybe science could make a statement about the success of one or the other strategy (e.g., through differential survivorship or reproductive success) in a particular context, but it would not attach any aesthetic value. My intuition tells me that such analogies of 'mindlessness' and 'soulfulness' are falling into the realm of pseudoscience, if we are trying to use formal systems to justify them. That said, I do like the concept and practice of mindfulness, but I prefer the sort of mindfulness embodied in some schools of Buddhist thought where skepticism is at the core of its teaching...i.e., living through (and by) experience, not by (or through) faith. This seems to be the core of science, with the caveat that science seeks consistency in that experience.
A few thoughts on what you
A few thoughts on what you said in the meeting this morning:
Addressing a few thoughts
Thanks for your comments.
First, science is very much unlike religion in the most fundamental of ways. Science, if performed well, does not seek observations to support its theories or assumptions, but instead seeks observations that invalidate them. Religions (of the sky gods) typically do not do this (though many Eastern religions, or maybe better stated, philosophies, do). For a practitioner of religion to act scientifically, they should not seek affirmation of their beliefs, but instead seek the one observation that would cause them to give up their beliefs. In science, there are no miracles, just unknowns (which may present themselves as outlying observations) with explanations to be discovered. Of many religious practices, I would ask: why is it that if one child out of thousands is spared a death from cancer, this is evidence of, say, a miracle, but the other thousands of deaths have no bearing as observations that can make statements for or against belief? Religion, then, seems to discount many, if not most, observations of the natural world, whereas science has an obligation to confront all observations. To say 'I don't know' or claim a miracle would effectively end the process of science, and thus be self-defeating. I actually don't categorize religion as pseudoscience, though. I categorize it as nonscience.
Second, as individuals, scientists might have some insight as to what is ethical or not, but in another sense there is a real conflict of interest in having them tell others what is acceptable behavior on their part or is not. For instance, would you believe a study from a drug company that said their drug was harmless and cured all disease if the funding for the study of that drug came from the drug company itself? I would not, largely because the result would seem self serving. You should hold the same skepticism of scientists.
Lastly, science is ideally value free. As you point out, scientists are human and bring along all of the baggage that comes with being human. So, of course there are biases. That said, as humans, we can contemplate those biases and try to minimize them. I disagree with your stated assumption that humans are creatures with a moral and ethical center. Morals and ethics are human constructs as well (by the same reasoning you give to science). Acting morally or ethically probably boils down to acting within a set of evolved social rules, which can be shown to change with time (and aren't morals supposed to be universal truths?). Given the likely consequences of violating evolved ways of doing things, it isn't likely that many individuals, regardless of their occupation, would act immorally (their genes might not be represented very well in the next generation of humans if they did).
ten thousand questions!
Hi -- Reading Mike's post brings so many questions to my mind. Mike, I share some in earnest hopes of further conversation about these things, given how different our starting places, ideologies, and rhetorics seem to be for and about science. I'm going to share some reactions in the form of questions in hopes of opening dialogue, rather than in hopes of arguing or persuading. Here goes:
Is the term "demarcation problem" a term of art in your field?
Why is demarcation the thing to do? I've been thinking (apart from this context) about how so often in institutions, and academia, people set up categories, departments, programs and then spend so much time defining what is inside and what is outside of them. In this case, demarcation is a problem, not a solution.
What do we gain by thinking of actions apart from values, and what do we lose?
Similarly, what do we gain from seeking, or prizing consistency in experience (including in the practice of science), and what do we lose? Do you think there are senses and contexts (including those of concern to science) in which experiences, as such, defy consistency?
What can we see when we set off things from one another, and what can we see when we attend to their "grading into" one another (/exchange/node/6279#12)?
Do you know Adrienne Rich's collection, 21 Love Poems (1974)? In it is a poem called "Splittings," in which Rich writes, "I refuse these givens, the splitting / Between love and action . . . " This is cryptic, I know, but I wonder if the idea of refusing splittings has something to offer us here. Can there be forms, formal systems, that don't follow or cross lines?
More on demarcation
Alice, these are all good questions that will ultimately generate much discussion, I think. That said, before diving into some of these, I'd like to further clarify my thoughts on demarcation, some of which might seem a bit odd, if not inconsistent...though ultimately I don't think that they are. So why demarcation, and is it art?
Well, the demarcation problem is a core problem for science because, largely, of the goals of science...mainly, to understand things, and I'll go a step further to say, to construct things. And science is useful to most folks largely because of the latter. Pick your favorite piece of technology, your favorite drug, your favorite mode of transportation. They work because of the consistency of the processes that govern their action. Imagine the issues that inconsistencies could cause with regard to any one of your choices above. Basically, what this means is that for science to progress, the patterns of nature that it uncovers need to be consistent (or predictably inconsistent); otherwise, science would cease to be useful.
Demarcation amongst science, non-science, and pseudo-science, then, is a real societal problem from a practical standpoint. Because time and resources are limited, where do we apply shared wealth to tackle technological problems (say in the areas of energy, disease, etc.)? Probably toward approaches that are likely to come to solutions efficiently and consistently...scientific approaches seek, and tend, to do this. The problem is, and here is where the art comes in, when confronted with the task of distinguishing science from the other categories... the lines can become blurry quickly. What tests can distinguish these types, and do the same rules apply consistently and objectively? This is a much longer discussion to have.
All of this being said, here is where I imagine folks will think I'm stating an inconsistency (and maybe this idea will need fleshing out a bit)...formal mathematical systems do not have much to say about the scientific process of discovery (through induction). Math and science are different. Math is a tool that can be leveraged by scientists to form (or in some cases validate) hypotheses, but mathematical statements themselves are merely statements of logic with well defined symbols and grammars. Certain statements can be proven, and in some circumstances, statements can be generated that reveal inconsistencies in the system (and thus our discussions of Godel). To equate these statements with science is a little problematic, I think. What we are doing is making an assumption that we have captured the behavior of some scientific observation through a particular formulation of a mathematical statement...we are making an analogy. Maybe the analogy holds, maybe it does not, but to equate the process of science with the field of math is a bit dangerous.
Science makes progress through successive feedbacks between induction and deduction. Take a set of observations, attempt to generalize consistent patterns from them, apply these generalizations in new ways to make predictions that otherwise could not have been made. Math does not say much about this process. Mathematical statements do help make some hypothetical statements tractable, and from a deductive standpoint, help to formulate testable hypotheses. But we shouldn't assume that the math is real. Which gets me to Godel...the limitations of formal arithmetical systems are not fatal for science or put bounds on what we can learn from science. It may say something about what might be computed in a scientific context, but that is a different problem entirely.
For example, I was reading an article last night in the NY Times discussing some recent discoveries in physics having to do with the question, why do we exist? The 'why' that science addresses is fairly unique here. If the mathematics were correct, the Big Bang should have created equal parts of matter and anti-matter, and these would cancel each other out...yet here we are...and there is matter in the universe (oh yeah, that kind of 'why'). Science as a process (of observation and experimentation) is discovering that certain types of physical reactions, which should produce this balance between matter and anti-matter, tend to leave a little more matter than expected...good for us, but this is an example of where the math is used as a means to codify thought in a tractable manner. The arithmetic behind these physical calculations is surely consistent, probably correct, and likely provable within the system to which it has been applied. But the math is only a representation of the science...an analogy....not reality (whatever that is). To equate the two, I think, starts to travel along the borderlands of science and pseudoscience. Further, making the same analogies of science to other disciplines is performing the same operation, and can be quite dangerous (e.g., intelligent design, social Darwinism, or astrology come to mind).
This all said, with respect to other disciplines, I have no problem with (and in fact want) inconsistencies. Ask me about Coltrane or Miles (or more modernly Rudresh or Vijay) and I'll be glad for the inconsistencies of the world, both external and internal to their individual beings. For whatever reason, my nervous system is stimulated in ways that I experience as good when I hear how they are different than others, and how they tend to evolve ( or mutate) with respect to themselves. Outside of the sciences, inconsistencies (or our perception thereof) have much to offer from the perspective of aesthetics. That said, I do like Paul's suggestion of how these questions could be posed in such a way that science can make statements about them...but these might be really boring conversations for some.
And might I suggest...
...some further reading of Imre Lakatos on the subject of demarcation:
http://www.lse.ac.uk/collections/lakatos/scienceAndPseudoscienceTranscript.htm
I think the group's discussion might be well served by considering some of the solutions to demarcation proposed here. I have a hunch, given Lakatos's criticism, where expressivity might fall as a scientific paradigm...though that doesn't dismiss its utility elsewhere.
And for a slightly different take, some might appreciate:
http://www.huffingtonpost.com/michael-zimmerman/religion-and-science-resp_b_583460.html
formal systems and infinities
Point well taken. One may be "disinterested in" rather than "threatened" by or "afraid" of formal systems. Part of the objective on my mind is to make the case that formal systems are actually quite interesting, whoever one is. The problem here, I think, is how they are taught/talked about, as if they were a special activity of significance only in a specialized world and accessible only to those trained in that world. Instead, they should be talked about as a significant component of all of us, one we all use all of the time. And better understanding them, both their strengths and their limitations, can be not only interesting but empowering for everyone. It is through an appreciation of formal systems that we gain the ability to identify and transcend our own presumptions at any given time.
Very much agree that the connection of formal systems to "mindless"/"mindful" and "soulless"/"soulful" is worth exploring further. And expect that to be a significant component of the next, Turing/computers phase of our conversations. To anticipate a bit, I'll almost certainly argue that the "mindless"/"soulful" state you describe is indeed "absolutely present" but is not actually "absolutely bountiful. Unbounded. Infinite." It has a lot to be said for it (cf interconnected vastness, oceanic feeling, the taoist story teller), but there is lots beyond it, worlds/universes/states of being that can't be reached except via skepticism and thinking. Trees and computers are "absolutely present." And limited, relative to the human brain, as a result. Formal systems provide a way of transcending those limitations. Piece-wise, never absolutely. All infinities, as per Cantor, are embedded in still larger infinities; none are "unbounded."
"piece-wise, never absolutely"
Lately, I've been relating to some of these threads with reference to comedy and tragedy. It seems as if tragedy is more a realm of absolutes, of form as limiting, of the way the story ends, while comedy is more a realm of the "piece-wise," of story after story, of "often, but a little at a time." I am trying to nudge myself into a comic outlook on the impossibilities of being a person -- if to be a person is to want and at the same time to [want to] see beyond human connection -- and to feel a good deal of discomfort and isolation all along this spectrum of desire!
Choosing Between the Tragic and the Comedic
My first thought here is of Woody Allen's definition of comedy as "tragedy plus time": all we need is a little distance on what seems tragically inevitable to see that it's...
not.
And now I'm also noting the connections to the concurrent discussion on From Evolving Systems to World Literature and back again about whether we might more profitably think of literature, writ large (not to mention the world, writ large; or ourselves, written however!) as tragic or comedic. In entry #45, Wai Chee Dimock observed,
"A tragic account of this would say that the outcome is inevitable, fully scripted. A comic account would say that things could have gone the other way, that it was no more than fortune -- or luck -- that produces this particular outcome... it's a potentially reversible narrative, one that allows for a 'backward propagation' ... to a prior state of amplitude and fluidity"--
though, then, in entry #47, she adds "a word of caution: while it's tempting to think that such a world is 'comic' (and that comedy itself is the most capacious of genres, encompassing tragedy and subordinating it), there's no guarantee that this is indeed the case...."
more on logic AND "irrationality"
Getting there from a different direction ...
"no fixed center ... unconstrained by the need for consistency" is indeed a good characterization of Minsky, the frog brain, and probably in those respects a good metaphor for World Literature as well.
There is an interesting cross-connection between this conversation and an exploration of "Chance," in which it is emerging that "consistency" is an important element of some mind/brain/inquiry processes and a less important part of others.
Maybe there's a general principle here, equally relevant in several different contexts: one doesn't have to choose, one can have both rationality and inconsistency/randomness? One uses the "irrational" at the outset and then "filters"/generates additional possibilities with formal systems, instead of doing the filtering with formal systems at the start (see the toy sketch below)? A couple of relevant books along these lines that might be useful for sorting out various meanings of the "irrational" and their relation to formal systems and indeterminacy ...
Gut Feelings: The Intelligence of the Unconscious by Gerd Gigerenzer
The Bit and the Pendulum: The New Physics of Information by Tom Siegfried
Also relevant along these lines ...
Making sense of understanding: the three doors of Serendip
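To make the "irrational first, formal filter after" pattern mentioned above a bit more concrete, here is a minimal toy sketch in Python. It is entirely my own illustration, not anything drawn from the books or discussion cited here; the "formal" rule (Pythagorean triples) is just a stand-in for whatever explicit constraints a formal system would supply:

```python
import random

def random_candidates(n, lo=1, hi=50, seed=0):
    """'Irrational'/unconstrained step: propose candidate triples at random."""
    rng = random.Random(seed)
    return [(rng.randint(lo, hi), rng.randint(lo, hi), rng.randint(lo, hi))
            for _ in range(n)]

def formally_consistent(triple):
    """Formal step: keep only candidates satisfying an explicit rule
    (here a stand-in rule: the triple is Pythagorean)."""
    a, b, c = sorted(triple)
    return a * a + b * b == c * c

if __name__ == "__main__":
    candidates = random_candidates(100_000)
    keepers = {tuple(sorted(t)) for t in candidates if formally_consistent(t)}
    print(f"{len(keepers)} distinct formally consistent triples found at random:")
    print(sorted(keepers)[:10])
```

The order of operations is the point: unconstrained generation comes first, and the formal system is used afterward to sort, check, and extend, rather than to decide in advance what may be proposed.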
Formal systems/rationality AND inconsistency/illogic
Lots of interesting thoughts from last week, from continuing conversations in the corridors, and from below. Is helpful to be challenged on why certain things interest me, why I think others might be interested as well. In this case, I came away with a better sense of why Gödel/Turing/Chaitin might not on the face of it seem interesting to a variety of other people, and an enhanced sense of why I think it's significant, both for myself and for others. We all have a bit of formal systems (Enlightenment positivism) in us, whether we want to admit it or not, and we all, in the same way, have a bit of inconsistency and a drive for completeness. Whichever we're more comfortable and familiar with, it behooves each of us to enhance our recognition of not only the limitations but also the strengths of the other. By doing so we can not only more effectively resist advocates of either approach as the exclusive foundation of inquiry but enhance our own capabilities as inquirers, and as participants in communities of inquiry.
That "we all have a bit of ... in us" has a platitudinous ring to it. Its the sort of thing one might say with some regret about one's interest in something one doesn't really think one ought to be interested in. But I actually mean something more significant. The brain is actually organized to process information in two quite different ways, logically/rationally/positivistically and "intuitively," and to achieve and test understandings not by the exclusive use of one or the other but rather by "looping" back and forth between them (see Making sense of understanding). We may each of us try to live by one and ignore the other but that leads to inconsistencies of the kind pointed out in the discussion and to a compromised use of brains that actually have richer capabilities. We are all better off recognizing and accepting both the limitations and the strengths of both ways of processing information.
And that in turn relates to some of the broader issues that have surfaced in one way or another in these conversations from the outset as well as more recently. People have a tendency to see an interest in randomness as antithetical to an interest in "properties and rules." And an interest in inconsistency/irrationality as antithetical to an interest in rationality/order. The turf I'm interested in exploring is not one of trying to decide which of these is "better" than the other (or to defend existing decisions of this sort) but rather one of asking how they fit together, and what one can do with the combination of them that one can't do with either alone.
Along these lines, Wil and I have been going back and forth over the issue of whether "rationality" is the best/only defense against "Evil," ie dogmatism, fundamentalism, Holocaust/global warming denial and the like. I certainly don't want to be identified with the latter, but I do think it's time to recognize (contra Enlightenment thinking and logical positivism) that "rationality" has already proven not in fact to be a very good defense against such things, that it's time to try something different, and that a blended approach is promising in this regard. As well as in teaching.
I'm also, like Anne, struck by how easily we all assume a "check out" mentality about our activity as inquirers. Maybe a blended approach would provide a better model for inquiry (and life?) in general, one that positions us not as completers of any set of explorations but rather as contributors to continuing exploration?
Changing the goal of grocery shopping
I left my house this morning to find a truck dumping several tons of dirt @ the end of my street, blocking it entirely. I live on a "cartway" in midcity Philadelphia, and could not get out. That seemed not such a bad image for what I saw Paul dealing w/ a bit later in the morning: trying to get somewhere as all sorts of obstacles--queries about how going in this direction might be useful--were placed in his way.
I stayed quiet during most of this conversation, in part because I was trying to allow some space-and-time for Paul to lay out "his" proof, and to see what payoff that might give us. But, while silent, I was entertaining obstructionist thoughts quite similar to what was being said by others: of what use value will this be, for me and my discipline? What application can this global claim possibly have for my local particulars? -- particularly since "the inherent limits of formal systems," and the concomitant "need for less constrained approach to inquiry," seem to me exactly the space occupied by humanists (in company with the more qualitatively inclined social scientists). Certainly one payoff might be a better understanding of the use value of formal systems, a better understanding of what they might do, and just where and how they fall short: for things like grading, mentioned this morning; or for the larger structures of assessment in which we all increasingly find ourselves; or for some of the more positivist projects in which both humanists and qualitative social scientists find themselves involved. For my current favorite example, see the open review issue of Shakespeare Quarterly on "Shakespeare and New Media," in particular an essay on "iterative criticism" that uses multivariate statistics and a text tagging device known as Docuscope to "create a portrait of Shakespearean genre @ the level of the sentence," a process that begins by sorting several million English words (and strings of words) into grammatical, semantic and rhetorical categories.
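I can't speak to how Docuscope itself actually works; purely as a hypothetical illustration of the kind of first step described there -- sorting words and strings into hand-defined categories before any statistics are run -- a sketch might look like this (the category lexicon below is invented for the example, not Docuscope's):

```python
from collections import Counter
import re

# Hypothetical, hand-made category lexicon -- NOT Docuscope's real categories,
# just a stand-in to illustrate tagging word strings into bins.
LEXICON = {
    "first_person": {"i", "me", "my", "we", "our"},
    "negation":     {"not", "no", "never", "nor"},
    "uncertainty":  {"perhaps", "maybe", "might", "chance"},
}

def tag_counts(text):
    """Count how many tokens of the text fall into each category."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for tok in tokens:
        for category, words in LEXICON.items():
            if tok in words:
                counts[category] += 1
    return counts

if __name__ == "__main__":
    sample = "Perhaps we might never know; I do not think chance is our enemy."
    print(tag_counts(sample))
    # e.g. Counter({'uncertainty': 3, 'first_person': 3, 'negation': 2})
```

The "portrait @ the level of the sentence" would then come from tallying such counts sentence by sentence across a large corpus and handing the per-category frequencies to multivariate statistics.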
Given my own location in the world of academia, and my own particular angle of vision on the world, of particular interest to me in this morning's session was Tim's asking what happens after we name "that," and then name what is "not that." What's next: "'not-not-that'--and so forth, in seriatim? Is this like changing lines in the supermarket, never getting to the cash register? Why check out @ all? Why not reconsider the goal of getting to the checkout counter? Why not change the goal of grocery shopping: you can just go on having interesting conversations (and you never have to pay...)?"
I found myself laughing out loud @ this image: it's such a vivid description of a non-goal-directed sort of teaching-and-learning, such a nice counter to the conventional "check-out counter" model of education that is now largely operative.
But perhaps the most interesting question I heard, while sitting around in this morning's ever-shifting checkout line (to mix my metaphors a bit), was Alice's great query whether--instead of just noting the difference between "that" (say: Euclidean geometry, or quantum physics) and "not that" (say non-Euclidean geometry, or non-quantum physics)--we mightn't imagine (or bring into being?) a formal system to put these two different perspectives on the world into dialogue w/ one another. Would you say that that is what Hegelian synthesis did? Or did you (or do others) have other formalisms in mind?
that and that
I too really appreciate the keeping-on-changing-lines-in-the-grocery-store-and-never-reaching-the-checkout metaphor. If there be no checkout counter in the grocery store of inquiry, or if the fact that some people make/work at/or need checkout counters needn't exhaust the meaning of groceries, then it seems to me (by the so dim light of my scant knowledge of logic) that Hegelian synthesis is not what I am after, since that looks to trend into another checkout counter. I was wondering about how endlessly differing standpoints might be "programmable" -- but maybe the answer was here at home all along. Maybe novels, plays, and poems are the forms I am looking for.
Two things
These are two things that have crossed my mind since last time:
1. There is a philosophical stance known as "compatibilism", in which it is claimed that determinism and free will are not opposed and are in fact compatible. Some philosophers in this camp have gone further and asserted that determinism is in fact a prerequisite for free will (Malebranche and David Hume are among these, but I'm sure there are many more). On close inspection, however, my impression is that these philosophers ultimately tend to reinvent "free will" to be something very different from the common conception, often dispensing with the element of choice (which most of us take for granted as an aspect of "free will"). I'm not sure how I side in this debate, but I will say that I do find it a little ridiculous that some thinkers seem to grab hold of inherent stochasticity almost as if it IS free will. Stochasticity alone doesn't get us to free will, not by a long shot. But I am still willing to entertain the possibility that it is a prerequisite or necessary condition.
2. I was surprised to hear Paul say that he was very skeptical of "natural ontological attitudes" and their ilk or, in the particular case of stochasticity, the claim that we (us humans) are simply "naturally" inclined to interpret the world in deterministic terms. I would have thought that Paul (given his neurobiological and evolutionary perspective) would be sympathetic to the idea that we've evolved certain neurological inclinations (I'm avoiding the metaphor "hard-wired" but that's sort of where I'm going with this) to see the world in this causal-deterministic way simply because these sorts of inclinations ultimately enhanced reproductive success in the early evolution of mammals or primates. This is of course different from making a claim about the way things actually are. Perhaps if we'd evolved on the quantum level we would have ended up with very different inclinations. (Paul: please correct me if I am misinterpreting you--the comment went by fast in the conversation and we didn't get a chance to return to it.)
Randomness: "free will" and "human nature"
Glad to have both issues raised/flagged for future discussion. My preference would be to defer the "free will" issue for a bit. I think it is indeed relevant in the longer term but, as you say, it is a bit bedeviled by problems of definition in its own right (what it means to a compatibilist is different from what it means to a non-compatibilist) and, at least as importantly, there is no direct relation between randomness and free will (one can have the former and not have the latter, even if one is pursuing a non-compatibilist approach).
The issue of "natural ontological attitudes" (or what I would call "human nature") is, I think, more immediately approachable. Yes, of course, as both a biologist and a neurobiologist, I think humans are born with "inclinations" that reflect our evolutionary history.
But I would, as you suggest, avoid the "hard wired" metaphor, for several reasons. One is that, like all organisms, we have quite substantial individual-to-individual variation in what we are born with; one can speak of averages or norms, but doing so misses the fundamental point of variation. A second is that, as a deeply social species, it is incredibly hard to say with any certainty how much of any observed similarity among us reflects genetic similarity as opposed to similarity in experiences, for which culture plays a significant role. A third is that the metaphor itself is flawed; there is no distinction in the nervous system between "hardware" and "software" in the sense that those terms are used with regard to computers. All parts of the nervous system are simultaneously "hardware" and expressive of "software" in the computer sense.
On top of all this, I would argue that brains in general, and human brains in particular, have been designed by evolution to cope with a continually changing and somewhat unpredictable world. To put it differently, among the inclinations we are born with is an ability to identify and potentially alter our own inclinations. Given this, I would not only do away with the hardwiring metaphor but with the underlying concept of "natural ontological attitudes." Yes, there is at any given time a statistically meaningful "human nature," but to describe it is to ignore individual variation as well as the potential/likelihood that it has been different in the past and will be different in the future.
Peirce, deriving?
I had a particular interest in the role that C.S. Peirce played in kicking off our discussion this past Wednesday morning, because I just "happened" to be discussing two of his essays in my course on the James family that afternoon. It was quite striking for me to learn about his belief in "tychism," particularly because, in the essay we focused on later in the day, he lays out four methods for "fixing belief": tenacity, authority, the a priori method, and the method of science.
What was especially striking to me in our discussion was the challenge laid down that the scientific method, as Peirce defines it--gathering empirical evidence to arrive @ a certain conclusion--is not going to help us in this particular meta-case of deciding whether chance is a fundamental condition of the universe, or simply a description of our ignorance of how things work. Peirce, who identified chance as having an objective existence in the world, nonetheless thought there was a way to arrive, experimentally, at fixed beliefs. I'm breathless to find out how far we might go in that direction.
In the interim, Alice: because of your call, months ago, to add "love" to the key forces circling in the universe, as a key contributor to the evolutionary process, you might be particularly interested in the reminder that, for Peirce, "the most fundamental engine of the evolutionary process ... is nurturing love" (which he called agapasm); tychasm is its degenerate form.
Which would seem to make it not fundamental, but derivative...?
tychism, empiricism, and fixed belief ... and agapasm
Yep, I too, despite a long-standing interest in Peirce, only recently "happened" onto the "tychism" aspect of his work. My current understanding is that Peirce regarded the scientific method as itself inevitably reflecting randomness, since "empirical evidence" is always and necessarily a partial and somewhat random selection of relevant observations. My guess is that, like other pragmatists, Peirce saw "fixed beliefs" not as something that can be justified by the scientific method (or in any other way) but rather as a human state of mind that itself requires explanation. His "methods" represented a catalogue of ways in which humans might generate a state of fixed belief, not things that "objectively" justify that state. It would, of course, be well worth looking more into Peirce's thinking along these lines, as well as into how he saw the relation between tychism and "agapasm" (a term that is as obscure to me as tychism was).
centering the non-human?
I appreciate the discussion and look forward to more. Among other things, I'd like to learn more about what it's like to do science/seek knowledge in a way that makes non-human dimensions of life and the world more central (than I usually make them).
finding "non-human dimensions"?
Yep, me too. It's a nice thought that there is something out there that doesn't reflect, or even care about, any of the presuppositions/preoccupations that I normally carry around with me .... something that will give me a new way to look at things, and myself as well.
Fragments of one of my favorite poems, by Federico Garcia Lorca:
¿Por qué nací entre espejos?
El día me da vueltas.
Y la noche me copia
en todas sus estrellas.
Quiero vivir sin verme.
Líbrame del suplicio
de verme sin toronjas.
Why was I born among mirrors?
The day spins around me
And the night copies
Me in all its stars
I want to live without seeing myself
Free me from the barrenness
Of seeing myself without fruit
randomness, falsifiability, and scale dependent science/inquiry?
Rich conversation in the session this morning, and in the halls afterward. My own take on where we started/got to is provided in the summary above. That is, of course, my particular idiosyncratic picture, filtered through my own perspectives/aspirations. Delighted to have here alternate/supplementary/conflicting pictures of what others saw.
Glad we got to a shared sense (at least I think we did) that it is worth looking into some alternative ways to conceive science/inquiry, ways that are less dependent on definitive "falsifiability" and more open to the existence of multiple viable alternative paths to explore, including ones that presume significant indeterminacy at the outset. Looking forward to seeing where we go with that, whether or not we get to randomness as "persistent and generative condition, one that allows us a creative role in shaping a future that will in turn always have new things into which to inquire" (Alternative perspectives on randomness and its significance).
Along these lines, Mike, Wil, and I had a post-session bull session worth recording some thoughts from. Will trust Mike/Wil to correct any distorted picture of that presented here. The issues posed had to do with local versus more expansive sets of questions, when and why one does or doesn't think about changing methodologies, and the relations between phenomena at different levels of organization. On the one hand (I won't say which hand) was the notion that one ought to stay with methodologies that have local success and presume that understandings at local scales will sum up to reasonable understandings at larger scales and higher levels of organization. On the other hand was the idea that local successes won't sum linearly, that methodologies that work locally shouldn't be assumed to work either at larger scales or higher levels of organization, that they work precisely because they have been developed to deal with the particular characteristics of phenomena at particular scales/levels of organization. If there are significant non-linearities, perhaps involving some degree of randomness, in moving to different scales/levels of organization, then one will need different methodologies at those scales/levels. One guess (mine) is that that is indeed the case, that there are often "phase transitions" intervening. Hence perhaps a need to accept indeterminacy and non-falsifiability more at some levels of organization/scales than at others?
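As a toy numerical illustration of that last point (my own construction, not anything from the session): fit a rule where it works locally, then extrapolate it across a "phase transition" and watch the local methodology stop summing up:

```python
import numpy as np

def response(x, threshold=10.0):
    """Toy system: linear at small scales, then a different regime
    ('phase transition' to near-saturation) above the threshold."""
    return np.where(x < threshold, 2.0 * x, 2.0 * threshold + 0.1 * (x - threshold))

# "Local" methodology: fit a line using only small-scale observations.
x_local = np.linspace(0, 5, 50)
slope, intercept = np.polyfit(x_local, response(x_local), 1)

# Extrapolate the locally successful rule to a much larger scale.
x_large = 100.0
predicted = slope * x_large + intercept
actual = float(response(np.asarray(x_large)))

print(f"local fit: y ~= {slope:.2f} * x + {intercept:.2f}")
print(f"at x = {x_large}: predicted {predicted:.1f}, actual {actual:.1f}")
```

In the toy, the locally fitted slope of about 2 predicts a response near 200 at x = 100, while the actual post-threshold response is about 29 -- a crude stand-in for why methodologies tuned to one scale/level of organization may need replacing rather than extrapolating.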
All this is interestingly reminiscent of a discussion of Against Interpretation, Against Method? Perhaps there is a continuing need to match methods of inquiry to the materials at hand? With the only constancy being skepticism, responsiveness to the challenges set by the materials one is working with, and an interest in "shared subjectivity"?
"the keynote is hope"
I'm looking forward to this discussion. In my new course on the James family, we arrive this week @ William James's magisterial book, The Varieties of Religious Experience. His postscript to the volume concludes that "no fact in human nature is more characteristic than its willingness to live on a chance. The existence of the chance makes the difference ... between a life of which the keynote is resignation and a life of which the keynote is hope ...."