The conceptual idea of nesting boxes within boxes to represent the various levels of organization in the nervous system does seem to increase my comfort with the notion of the identity of the brain with behaviour, because it implies an inherent uncertainty about the limits to which the nesting process can be taken. We currently believe that the smallest box, the smallest autonomous unit, is the neuron, so we don't need to define a mathematical limit for the divergence of the "box" series. However, we don't really have any concrete evidence about how many levels the hierarchy of boxes within boxes contains. In a physical sense, perhaps we do; but in the sense of "mind", with all the trappings of the "mind-body dichotomy", we have no clear idea of the levels to which it can be reduced.

Why does uncertainty imply greater comfort? It seems to me that if everything were completely clear-cut, sharply defined, with zero uncertainty, then we ought to be able to explain everything we claim to understand, completely and absolutely. An uncertainty inherent in a theoretical model makes the model much more believable in terms of its correlation with empirical evidence. Perhaps sometime in the future the uncertainty will disappear, but contemporary theories need to account for it.

Nice set of ideas. And interesting that you want some uncertainty somewhere. I agree (which isn't to say it wouldn't be wise to wonder WHY we want that). I think, though, that we can find the uncertainty without having to entertain an infinite regress of smaller and smaller boxes. We'll see. PG