**Summary** The **c**onstructivist **e**volutionary **e**pistemology (**CEE**) has taken up the demand of modern physics that theoretical terms have to be operationalizable (i.e. the description of nature should comprise only quantities, variables or notions which are defined by means of measurement facilities or other physical processes) and extended it by the idea that operationalisation is something general which must also be the constituting basis for observational terms. This is realised by considering the regularities we perceive, and which we condense into the laws of nature, as the invariants of phylogenetically formed mental cognitive operators. Experimental operators (i.e. measurement facilities) can be seen as extensions of these inborn operators. This will lead to the consolidation of the classical world picture if the mental and the experimental operators involved are commutable. Otherwise there will be invariants which cannot be described in classical terms and which, therefore, will require non-classical approaches such as the uncertainty principle in quantum mechanics enunciated by Heisenberg. As the development of experimental facilities will never be completed and, therefore, will continue to bring about novel invariants, the evolution of science cannot converge towards what many physicists envisage as the "theory of everything" describing definitively the structure of reality (Feynman, 1965; Hawking, 1979). So both organic and scientific evolution are entirely open and non-deterministic. When mathematical objects and structures are also seen as invariants of mental operators, we must expect similar phenomena.
Indeed: just as experimental operators, though constructed entirely according to the rules of classical physics, may lead to results which cannot be described in classical terms, there are also mathematical calculi which, though based entirely on well-tested axioms, can lead to statements which cannot be proven within the context of these axioms, as shown by Gödel.

From a formal point of view it can be said that Heisenberg and Gödel have in common that they discovered phenomena which are beyond the scope of their own constituting theories and therefore would require a new theoretical framework. The construction of physical apparatuses even for subatomic applications, i.e. for applications which fall under the laws of quantum mechanics, is entirely classical. Mechanical, electrical and optical elements are combined according to the rules of classical physics and nevertheless lead to results which can no longer be interpreted in classical terms. Something similar applies to mathematics. From Gödel (see the summary of Nagel and Newman, 1958) we know that there are mathematical procedures which, though entirely constructed by means of well-proven classical methods, will lead to true statements whose truth can no longer be derived from the axioms concerned.

According to all our classical thinking the two phenomena, despite all formal similarity, have nothing to do with each other. Quantum mechanics is derived from empirical research, i.e. from observations of an independently existing outside world, whereas mathematics is a matter exclusively of thinking and reflection, without any link to visually perceived data.

On the other hand, the success of physical theories obviously depends on their appropriate mathematical formulation. It is indeed unclear why mathematics does so remarkably well in describing the regularities we perceive, or why the world, as Davies (1990a) asked, is algorithmically compressible, i.e. why the world, despite all its vast complexity, can be described by relatively modest mathematical means. Wigner (1960) wrote a paper called "The unreasonable effectiveness of mathematics in the natural sciences".

We will try to show here that what could be called the "constructivist evolutionary epistemology" (CEE) (Diettrich 1991a, 1991b, 1994) could provide both a common root for non-classical physical and mathematical phenomena and a better understanding of the role of mathematics in the natural sciences.

The CEE is based on the now nearly classical evolutionary epistemology (EE) of Riedl (1980), Vollmer (1975) and others, extended by some physical considerations.

Methodologically the CEE refers to the demand of modern physics to formulate the laws of nature exclusively by means of operationalizable terms. This demand results from the insight that classical physics failed vis-à-vis the phenomena of quantum mechanics and special relativity mainly because it became involved with a non-verifiable syntax, brought about by the use of terms which were not checked as to whether they can be defined by means of physical processes. It was taken for granted that all physical quantities can be measured independently of each other, which does not always apply in subatomic regions. Events, too, were expected always to be classifiable in an unambiguous linear order of time, which is possible only in cases where the running time of signals can be neglected.

In our day-to-day life this does not matter. We have a very clear understanding of what the length or the weight of a body means, and we do not need confirmation from a yardstick or a balance that it could be measured. The situation is different, however, in microscopic regions which are smaller than the atoms of the yardstick. Here we still have to decide what kind of experimental facility we will apply in order to define the quantities length or momentum. Physicists say that properties are defined as invariants of measurement devices.

We can generalise this by saying: properties of whatever kind have no ontological quality of their own. They are rather defined by the fact that they are the invariants of certain measurement operators. This contrasts with classical thinking insofar as we use properties for the objective characterisation of objects. One of the most important properties we usually attribute to properties, namely that they exist independently of each other, is based precisely on the assumption of their independent ontological quality. In day-to-day life this is incontestable. The length of a body and its colour exist independently of each other and can be measured separately. As we know, this does not necessarily apply in subatomic regions. The position and momentum of microscopic particles cannot be measured independently of each other. Physicists learned from this and decided that theoretical terms have to be operationalised, i.e. when describing nature by means of theories one should accept only those terms which are defined by certain experimental facilities, rather than quantities and categories which are defined by nothing but common sense.

This heuristically well-proven concept has been picked up by the CEE and extended by the idea that operationalization must be something very general that is the basis not only for successful non-classical theoretical terms but also for classical observational terms and for all logical and mathematical terms as well.

As to the observational terms, the CEE realizes this as follows. It is usual to call the part of the brain that deals with cognition the cognitive apparatus. This apparatus can be said to act by means of cognitive operators which act upon the sensory input and transform it into the specificities of our perception, i.e. into the regularities of which we say that we perceive them. From the technical point of view we can consider the cognitive operators as measurement devices whose indicator is not a number or the position of a pointer but rather pictures, sounds or other perceptions. Then we can say, in analogy to physical measuring processes, that properties, or perceptions, are nothing but invariants of cognitive operators. That being so, the observational terms we use as the basis for our description of nature are operationalised through inborn cognitive operators. As these operators are inborn and, therefore, more or less the same for all humans, we all interpret the sensory input in a comparable way and, therefore, can communicate and discuss what we see or hear.

Here are some examples of what the invariant of an operator means. Let us consider a measurement device which acts upon an object or a system. The result is, say, a certain pointer position. If the system is what physicists call an eigenstate of the device in question (which is the normal case in classical physics) then the result is an invariant of the measuring process, i.e. it is well defined and reproducible any time we repeat the measurement. If the system is not an eigenstate (as can happen in quantum mechanics), it might be changed by the measurement process and, therefore, will lead to an unpredictable result.
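The eigenstate distinction can be mimicked in a small linear-algebra toy model, which is not part of the original argument; the matrix `M` and the two states below are arbitrary illustrative choices of ours:

```python
import numpy as np

# Toy "measurement operator": a Hermitian matrix. Its eigenvectors are the
# states it leaves invariant (its eigenstates); the eigenvalues play the
# role of the pointer readings.
M = np.array([[1.0, 0.0],
              [0.0, -1.0]])

def is_invariant(op, state):
    """True if `state` is an eigenstate of the 2x2 operator `op`,
    i.e. if op @ state is parallel to state (the state survives the
    measurement unchanged up to a scale factor)."""
    out = op @ state
    # two 2-vectors are parallel iff their 2x2 determinant vanishes
    return np.isclose(out[0] * state[1] - out[1] * state[0], 0.0)

eigenstate = np.array([1.0, 0.0])                    # eigenvector of M
non_eigenstate = np.array([1.0, 1.0]) / np.sqrt(2)   # superposition of both eigenvectors

# The eigenstate is reproduced by the measurement: a well-defined,
# repeatable result.
assert is_invariant(M, eigenstate)

# The non-eigenstate is rotated into a different state: the measurement
# changes the system, so repetition is not guaranteed to agree.
assert not is_invariant(M, non_eigenstate)
```

The sketch captures only the structural point made in the text: "invariant of an operator" means the operator maps the state onto itself, and only such states yield reproducible readings.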

Something similar applies to cognitive operators. If the object in question (i.e. the neural input) is an eigenstate of one of our (visual) cognitive operators, the interaction will result in certain patterns in space or time. Otherwise we see no regularity at all, i.e. we see chaos. The distinction between chaos and order, therefore, is not based on objective properties of the system but refers to whether one of the pattern-generating mental processes has been stimulated. Order, like any other property, is defined only by cognitive operators. This picture does not allow us to say that the perceptions produced by mental operators depict the causing stimulus, because it is entirely a matter of the operator how it will react. In this respect they resemble measurement apparatuses, which also "decide" for themselves, entirely on the basis of their construction, how they will respond to contact with the test object. One could well consider perceptions as the "reading" of cognitive operators and, conversely, the reading as the "perception" of the measurement apparatus.

Besides measurement and cognitive operators, where we are interested mainly in their invariants, there are others made deliberately to modify something. This is what we call acting in the usual sense, either by means of our hands or feet or by means of artificial tools. A hammer, for example, is an instrument designed to modify certain objects. But a hammer, too, in its quality as an operator, has invariants and, therefore, can be seen as a measurement device in the following sense. Eigenstates of a hammer are those objects which resist the hammer's strokes of a given strength. The hammer, then, can be used to measure the mechanical properties of materials and objects. That being so, the distinction between perception and action, one of the central dichotomies of our classical world picture, reduces to a rather minor detail. Both perception and action mean applying operators. In the case of perception we ask for the invariants of the operator in question, i.e. what remains unchanged under the operator's application, whereas in the case of action we ask for what is changed under the operator's influence.

A special class of perceptions are the regularities we see and which we condense into what we call the laws of nature. Since all perceived regularities, as we have seen, are human-specific, the laws of nature derived from them must be human-specific too. Indeed, they are, in the following sense.

Let us consider the law of energy conservation. Physicists have shown that this law is based on the homogeneity of time, i.e. on the fact that time proceeds in the same way everywhere and at all times. Only under this assumption is it meaningful to say that a force-free body will move uniformly in time - and just this is one of the formulations of the law of energy conservation. Uniformity in time, however, is nothing absolute. It can be defined only by means of a given metric, i.e. uniformity has to be operationalised by a metric generator, i.e. by a clock or a pulse-generating device realised somewhere in our brain. It is this metric generator - and nothing else - which decides what processes we consider to be uniform in time. Living beings from another planet with a different metric generator, one depending, say, on temperature or position or on light, would consider the motion of force-free bodies not to be uniform in time. Energy, therefore, would not be an invariant of their perception. Instead they would perceive as uniform those processes which are physically related to the mechanisms constituting their internal clock, i.e. processes which are invariants of their metric generator. They would have come to entirely different laws of conservation and, therefore, to an entirely different way of describing nature. They would live in a system of cognitive coordinates which is completely different from ours, though in itself as consistent as ours.
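The link between homogeneity of time and energy conservation invoked above is Noether's theorem. A standard one-dimensional sketch, in conventional textbook notation rather than the author's terminology, runs as follows: if the Lagrangian has no explicit time dependence, the energy function is conserved.

```latex
% Lagrangian L(q, \dot q) with \partial L/\partial t = 0 (homogeneity of time):
\frac{dL}{dt}
  = \frac{\partial L}{\partial q}\,\dot q
  + \frac{\partial L}{\partial \dot q}\,\ddot q
% insert the Euler--Lagrange equation
% \partial L/\partial q = \frac{d}{dt}\frac{\partial L}{\partial \dot q}:
  = \frac{d}{dt}\!\left(\dot q\,\frac{\partial L}{\partial \dot q}\right),
\qquad\text{hence}\qquad
\frac{d}{dt}\underbrace{\left(\dot q\,\frac{\partial L}{\partial \dot q} - L\right)}_{E}
  = 0.
```

The derivation makes the text's point concrete: the conservation law holds only relative to a time parameter $t$ in which $L$ is homogeneous, and that parameter is supplied by a metric generator, not by anything absolute.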

This applies not only to uniformity in time but to all the regularities we perceive. They are nothing but invariants of inborn cognitive operators. This goes even beyond what we might expect at first. We see our world divided into subjects which have a temporally unchanging identity and, on the other hand, into properties which we attribute to the objects and which may well change. The notion of identity, too, can be operationalised. In the course of evolution identity must have developed as the invariant of motion (Piaget 1970), i.e. identity is what does not change wherever the subject concerned goes, or, as Üxküll (1921) said: "a subject is what moves together". If our biological ancestors had decided in favour of a different operator in order to define the most basic elements of our world picture, this world picture would have become entirely different. This view raises two questions.

- The first is: how can "home-made" laws of nature help us to survive in a world which is not home-made? Or, in other words: how can we profit from laws which do not refer to the structure of our environment? Or: what could be the criterion for the evaluation of theories if not consistency with the world in which we live?
- The second is: if the laws of nature are based on nothing but our physical and cognitive constitution, why, then, do we have to deal with empirical sciences? Would it not be sufficient to investigate and analyse the hardware and software implemented in our brain?

The first question can be answered as follows:

The laws of nature comprise in condensed form all the knowledge we have derived from past experiences in order to predict future experiences. Or, more generally: the laws of nature are representations of operators which transform experiences into each other. Experiences, however, are necessarily always human-specific, and so must be the laws linking them together. In order to be universal, a law of nature must refer to universal experiences. But these do not exist. No experience is so general that every kind of living being must share it. Even the fact that we cannot go through walls or other solid obstacles is not a universal experience. It is rather due to the fact that we, by our physical and chemical constitution, are solid bodies ourselves. If we were beings of the kind Fred Hoyle (1957) invented in his famous science fiction as interstellar "black clouds", which realise their internal functional complexity by means of intermolecular electromagnetic interaction, then we might have to respect certain radio waves but not solid rocks getting in our way. In other words, there is no experience general enough that any kind of organism has to consider it and which, therefore, could be the basis for a universal law of nature.

The same applies to perceptions in the narrower sense. The regularities we perceive, as we have seen, are nothing but invariants of cognitive operators. Theories describing these regularities are then correct or "true" if and only if they emulate the mental processes generating the regularities in question. As these processes are the outcome of human phylogeny, the criterion theories have to meet is consistency with the respective physical and cognitive phylogeny, rather than consistency with the structures of an external and previously defined reality.

The rejection of the classical notion of reality as proposed here does not mean that we could ignore our actual environment or the facts of our actual situation, such as the timetables at the airport or the traffic lights in the street. It only means that the various life strategies of all possible living beings have no common denominator which could be represented as universal laws of nature in the sense that everybody could profit from them, independently of his physical and cognitive constitution.

That we nevertheless use the notion of reality as the most central category of our life strategy probably has practical reasons. A mutually profitable communication about our perceptions and experiences is possible only if the way we interpret sensory data is accepted as a general human standard. Evolution has immunised this standard by a remarkable trick. Evolution "told" us that the regularities we perceive have their origin in a region to which we have no access, as it is "outside" ourselves, so that the structure of what we see is independent of us. We are told to call this region reality and to call everyone who does not believe in the universal character of reality an illusionist.

Furthermore, reality does not comply with the criterion of operationalisability. According to our understanding, the structures of reality and the laws of nature reflecting these structures are something upon which we have no influence. Reality, therefore, must be invariant under all our doing and acting, not only now but also under everything we may do in the future. So an operator which is to characterise the category of reality must be commutable with all possible operators. Unfortunately, the only operator which is able to do this is the trivial unity operator.

Let us come to the second question, as to how far we can replace the empirical search for the physical laws of nature by an analysis of our brain:

Probably we could do this for all laws which can be derived from perceptions of the unaided sense organs, i.e. mainly for the area of classical mechanics. How this is to be understood we have discussed in the context of the law of energy conservation.

Doing physics, however, does not only mean looking at what is going on in nature. It means, first of all, doing measurements by means of experimental facilities. These measurement operators, as one could call them, can be considered as extensions of our sense organs, i.e. as extensions of the inborn cognitive operators. Here we have to distinguish between quantitative and qualitative extensions in the following sense:

- we will call such an extension quantitative if the experimental operators concerned are commutable (in the sense of operator algebra) with the cognitive operators, i.e. if the invariants of the experimental operators can be expressed in terms of the invariants of the cognitive operators. This applies to all measurements which contributed to the progress of classical physics.
- we will speak of qualitative extensions if the experimental and the cognitive operators are not commutable. Then we will get invariants which can no longer be described in terms of classical physics. We then either have to operate with additional assumptions from outside the theories concerned, or we have to formulate non-classical theories, as we have done in the case of quantum mechanics and the theory of relativity.
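The two kinds of extension can be sketched with matrices standing in for operators. This is a toy model of ours, with arbitrarily chosen matrices: operators that commute share a complete set of eigenvectors, so the invariants of one can be expressed through the invariants of the other; non-commuting operators cannot be related in this way.

```python
import numpy as np

def commutes(a, b):
    """True if the commutator [a, b] = a @ b - b @ a vanishes."""
    return np.allclose(a @ b - b @ a, 0.0)

# Stand-ins for an inborn "cognitive" operator and two experimental
# extensions of it (purely illustrative choices).
cognitive = np.diag([1.0, 2.0])
quantitative_ext = np.diag([3.0, 5.0])       # diagonal too: same eigenvectors
qualitative_ext = np.array([[0.0, 1.0],      # off-diagonal: its eigenvectors
                            [1.0, 0.0]])     # differ from the cognitive ones

# Quantitative extension: commutes with the cognitive operator, so its
# invariants are expressible in terms of the cognitive invariants.
assert commutes(cognitive, quantitative_ext)

# Qualitative extension: does not commute; its invariants have no
# description in terms of the cognitive operator's invariants.
assert not commutes(cognitive, qualitative_ext)
```

The non-commuting case is the formal analogue of the position/momentum pair mentioned earlier: measuring one disturbs the eigenstates of the other, which is why its invariants resist classical description.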

This has an important consequence: as the set of possible experiments is in principle not closed, we always have to be prepared to find new invariants which cannot be described in terms of what has been developed before. This means that the development of our theories cannot converge towards an ultimate end or towards a "theory of everything", as physicists say. With this, the evolution of cognition and science is as open as organic evolution. Neither will there be a definitive species of absolute fitness (the pride of creation), nor will there be a definitive physical theory (the pride of science, so to say).

Here we have an open question: from what point on is an experimental measurement facility no longer commutable with the inborn operators of sense perception, and how could we identify such an operator other than by the results it produces? A quantum mechanical measurement device is an entirely classically constructed instrument. Mechanical, electrical and optical parts are combined with each other and nevertheless produce results which can no longer be interpreted within the framework of classical physics. So the structure of the operator concerned does not provide us with the clue we need. To refer to subatomic regions and their different laws of nature does not help either, as these are concluded from theories which are based on instruments which have already passed this "point of no return". As to the frontier crossing itself, these theories are entirely descriptive, i.e. they cannot say why non-classical theories are as they are. Existing theories, therefore, cannot provide us with any hint as to which future inventions may lead to similar surprises.

This has an interesting mathematical analogue.

If we want to carry the concept of operationalisation to its full consequence we have to apply it to mathematical and logical terms just as we did to observational terms, i.e. we have to consider mathematical and logical structures, too, as invariants of special mental operators. If we start from the suggestive idea that the operators concerned are related to each other (for phylogenetic reasons), then the mathematical structures and the sensually perceived structures themselves must show similarities. This would explain why mathematics does so well in describing the regularities we perceive, or why the world, as Davies (1990a) asked, is algorithmically compressible (i.e. why the world, despite all its vast complexity, can be described by relatively modest mathematical means, or, in other words, why induction is so successful): the physical world - which is the world of our perceptions - is itself, on the ground of its mental genesis, algorithmically structured. Perceived regularities and mathematical structures are phylogenetic homologues. This is the reason why the formulation of (physical) theories in terms of the mathematics we are acquainted with is an essential prerequisite for their capability to emulate the genesis of perception and, therefore, for their truth. From the classical point of view (i.e. within the theory of reality) the algorithmic compressibility of the world or, what is the same, the success of induction cannot be explained.

Under these circumstances we must accept the possibility of qualitative extensions in mathematics as well. This sounds strange, but such extensions really exist (Diettrich 1993). Similar to the operators generating sensory perception, which can be extended by physical facilities, the mental operators generating our elementary mathematical conceptions can be extended through higher and more complex mathematical calculi. This is what mathematics does as a science. Insofar as the higher mathematics used is based on appropriate axioms, i.e. (in CEE parlance) on axioms which correctly emulate the cognitive operators concerned, there is no reason from the classical point of view to believe that this will lead to "non-classical" statements, i.e. to statements which can no longer be formulated within the syntax constituted by the axioms concerned. This view substantiated the confidence in Hilbert's program of the complete axiomatization of mathematics - or, in the terms used here, the confidence that mathematics can extend itself only quantitatively.

From Gödel, however, we know (see the summary of Nagel and Newman, 1958) that there are mathematical procedures which, though entirely constructed by means of well-proven classical methods, lead to true statements whose truth can no longer be derived from the axioms concerned. Mathematics (of the kind we know) turned out to be as incomplete as classical physics. In both cases nothing but the application of well-tried and sound methods and procedures can lead to results which cannot be extracted from the foundations of these methods and procedures, and, as we must conclude, we cannot be sure that there will be no surprises of a similar kind in the future.

The only difference between the physical and the mathematical situation is that in physics we already have two non-classical theories (quantum mechanics and special relativity) and that we can say precisely under what conditions we have to apply them, namely (simply put) in subatomic regions and at very high speeds. In mathematics we only know from Gödel that there must be non-classical phenomena, but we do not know what they are and, in particular, we cannot say which operations would expel us from the classical domain. Is it the notion of cardinal or ordinal numbers, or the notion of set or of infinity, or is it the combined application of these notions which constitutes the cause of non-classical mathematical phenomena? What may await us if we continue to formalise logic in order to find solutions for overly complex cases? And what will happen if we deal with more and more powerful computers? We do not know - at least not yet!

The astonishment of mathematicians with respect to Gödel's proof continues unbroken. The literature is full of expressions of it. Among others, the explanation was proposed that the brain's action cannot be entirely algorithmic (Lucas, 1961; Penrose, 1989). Apart from the fact that it is not quite clear what in a neural network such as the brain could be non-algorithmic, this kind of reasoning is not necessary at all. What follows from Gödel's proof is only that what mathematical calculi can bring about is not necessarily the same as what a certain combination of them could generate. It is as if physicists were to believe that physics cannot be entirely natural because apparatuses constructed according to the laws of classical physics do not necessarily reproduce the laws of classical physics.

In contrast to physicists, who suggested as an explanation for their respective experiences that they had happened to come into domains of nature where other laws rule, mathematicians hesitated to develop the idea that mathematical research would lead to really new discoveries which could in no way have been expected, not even a posteriori. If mathematics had its own specificity at all, as comprised in the notion of Plato's reality, then this must be something which is included in the very rudiments and which from there would determine all possible consequences. In other words: if there is such a thing as Plato's reality, it must reflect itself in the fact that a consistent mathematics can be based only on particular axioms (the analogue to the laws of physical reality, so to say). Once they have been found - so Hilbert's conviction - they would settle once and for all the "phenotype" of all future mathematics. Mathematics, then, would be nothing but a kind of craft filling up the possibilities defined by the axioms identified - similar to physics which, according to the prevailing understanding, could do nothing but look for the applications of the "theory of everything" once it has been found.

Hilbert's famous program was based on the belief that mathematics is completely axiomatisable, i.e. that there is a complete set of axioms from which everything in past, present and future mathematics can be derived - axioms for everything, so to say, similar to the theory of everything the physicists are envisaging. In the beginning there were hopes that extending or modifying the axioms according to the unprovable statements concerned could solve the problem. Unfortunately, the new axioms would be in no better situation, as for any set of axioms unprovable statements can be found. In physics, too, we can modify theories according to 'unprovable' phenomena, i.e. new phenomena which cannot be explained by the existing theories, as we did for example when establishing quantum mechanics - but this provides no guarantee that something similar will not happen again and again. So neither in physics nor in mathematics can a 'tool for everything' be found by means of which all problems concerned - even future ones not yet known - can be solved in a purely technical or formalistic manner.

We are fostering here the idea that the success of mathematical extrapolation of observed data as a prognostic tool must be due to the phylogenetically based affinity between the mental genesis of perceptional and mathematical patterns. In a special case this can be illustrated by a model (Diettrich 1991b) which would reduce on the one hand the spatial metric to the category of motion (a view first presented by Piaget (1970) through research into the time perception of children) and on the other hand the algebraic metric (as expressed in the transitivity of addition) to the process of counting. It suggests considering moving and counting as analogous notions within the mental genesis of homologous algebraic and geometric structures. This connection may contribute to the hopes of the CEE that a possibly successful study of non-classical mathematical phenomena could be a clue to a better understanding of non-classical phenomena in physics too - and vice versa. Mathematics, then, would not only help us to extrapolate physical data successfully; it could also contribute to the conception of novel physical theories (as was already the case with Dirac). So mathematics could outgrow the role of an auxiliary science, in which we have seen it since the beginnings of the empirical sciences, into the role of a heuristic partner with equal rights. Strictly speaking, this has already happened. Of course, the fact that we consider the world to be algorithmically compressible reflects nothing but the suitability of mathematics for prognostic purposes in physics. However, the relationship between perceptional and mathematical patterns was never seen this way. Physicists rather speak of "the unreasonable effectiveness of mathematics in the natural sciences". In the light of the CEE this effectiveness is indeed reasonable.

Davies, P. C. W. (1990): Why is the physical world so comprehensible? In Complexity, Entropy and the Physics of Information, Santa Fe Institute Studies in the Sciences of Complexity, ed. W. H. Zurek, Vol. VIII, Addison Wesley, p. 61-70.

Diettrich, O. (1991a): Realität, Anpassung und Evolution. Philosophia Naturalis, Bd. 28, p. 147-192

Diettrich, O. (1991b): Induction and evolution of cognition and science. In Gertrudis Van de Vijver (Ed.): Teleology and Selforganisation. Philosophica Nr. 47/II, p. 81-109

Diettrich, O. (1993): Das Weltbild der modernen Physik im Lichte der konstruktivistischen evolutionären Erkenntnistheorie. In proceedings of the 2nd Int. Symposium "Die EE im Spiegel der Wissenschaften", Wien, April 1993.

Diettrich, O. (1994): Cognitive and communicative development in reality free representation. In Communication and Cognition - Artificial Intelligence, Vol. 11, Nr. 1-2, p. 55-89

Feynman, R. P. (1965): The character of physical law. BBC Publication.

Hawking, S. W. (1979): Is the end in sight for theoretical physics? Inaugural Lecture for the Lucasian Chair. University of Cambridge.

Hoyle, F. (1957): The Black Cloud.

Lucas, J. R. (1961): Minds, Machines and Gödel. Philosophy 36, p. 120-124

Nagel, E. and Newman, J. (1958): Gödel's Proof. London: Routledge

Penrose, R. (1989): The Emperor's new Mind. Oxford University Press

Piaget, J. (1970): Genetic Epistemology. New York: Columbia University Press.

Riedl, R. (1980): Biologie der Erkenntnis. Berlin, Hamburg: Parey

Üxküll, J. von (1921): Umwelt und Innenleben der Tiere. Berlin: Springer

Vollmer, G. (1975): Evolutionäre Erkenntnistheorie. Stuttgart: S. Hirzel

Wigner, E. (1960): The unreasonable effectiveness of mathematics in the natural sciences. Comm. Pure Appl. Math. 13, 1.