## Research

## 1. Interpretative problems in quantum mechanics

One of the basic features of the standard interpretation of quantum mechanics (QM) is the so-called nonobjectivity of the properties of individual physical systems. It is indeed well known that a property of a physical system cannot generally be considered as either possessed or not possessed by the individual examples of the system if a measurement of that property is not performed. Nonobjectivity is an inexhaustible source of logical and epistemological problems, from which a number of paradoxes follow. In particular, nonobjectivity is incompatible with some natural requirements on physical theories, such as realism, locality and causality, that are fulfilled by classical physics and relativity theory but are problematic in standard QM. This explains the enormous difficulties that scholars still encounter in trying to describe the transition from the quantum to the classical world and to construct a consistent quantum theory of gravity, thus unifying QM and relativity theory. Moreover, nonobjectivity leads to paradoxical results (for example, Schrödinger's cat paradox and Wigner's friend paradox) whenever the quantum formalism and its standard interpretation are applied to the description of the measurement process.

The above difficulties induced scholars concerned with the foundations of QM to propose alternatives to the standard interpretation which could provide a more intuitive and manageable picture of the world (e.g., the elements of reality introduced by Einstein, Podolsky and Rosen, or EPR, the hidden variables research program, the proposals of modifying QM in order to reconcile it with "macroscopic realism", etc.). All the proposals put forward till now must nevertheless come to terms with some basic results (in particular, the Bell-Kochen-Specker and Bell theorems) which are held to show directly, without any resort to epistemological choices, that QM exhibits some typical features (contextuality and nonlocality, respectively) that prohibit a local and realistic interpretation of it. These features are today often exploited in some new disciplines derived from QM, such as quantum computation and information, quantum teleportation, etc., but they are also propounded as a support to the standard interpretation, hence they strengthen, rather than solve, the problems raised by this interpretation. Consequently, recent research in QM has partially abandoned these problems, accepting their insolubility.

A seemingly different attempt to overcome the difficulties expounded above goes back to a famous 1936 paper by Birkhoff and von Neumann, in which the authors suggested that QM presupposes a logic that is different from classical logic, i.e., a quantum logic (QL). Quantum paradoxes would then follow because we use classical logic when reasoning about quantum phenomena that intrinsically require a nonclassical logic. The proposal by Birkhoff and von Neumann is rather controversial, since the interpretation of QL as a real logic is highly problematical from a logical and epistemological point of view. However, Birkhoff and von Neumann's proposal is connected with another difficulty of standard QM, i.e., the problem of founding its mathematical structure on more intuitive and elementary axioms, explicitly related to empirical facts. It is indeed well known that in the standard formulation of QM there is no physical motivation for representing states and observables in a Hilbert space, nor for the fact that such a Hilbert space is defined over the field of complex numbers. Moreover, the use of the tensor product of Hilbert spaces in order to represent systems made up of subsystems raises serious problems (strictly connected with nonobjectivity of properties) whenever one tries to describe the subsystems separately. A physically well founded axiomatization of QM could allow one to solve these problems. More important, it could lead to the elaboration of a new interpretation or of a more satisfactory theory going beyond QM and avoiding the paradoxes of the standard interpretation. Thus, many axiomatizations of QM have been propounded in the last thirty years. These proposals have inspired in particular two nonstandard approaches to the foundations of QM, i.e., the Brussels approach (briefly, BR approach), elaborated by Aerts and his collaborators in Brussels, and the Lecce approach, developed in Lecce in the last two decades.
These approaches have been critically analysed in the PhD thesis "Lecce and Brussels: Two Proposals for a Realistic and Objective Interpretation of Quantum Mechanics", and the results that have been obtained are expounded in the following section.

## 2. Lecce and Brussels approaches: a comparison

The BR approach continues the work started by Jauch and Piron in the sixties and seventies in Geneva, with the aim of providing a physical justification of the mathematical apparatus of QM by means of an operational foundation of this theory, and of better describing compound systems by suitably modifying the theory itself. The Lecce approach is based instead on a careful analysis of the language of QM and of the epistemological premises that are implicit in the standard interpretation and in the theorems proving nonobjectivity of the theory (in particular, the Bell-Kochen-Specker and Bell theorems, see Sec. 1). This analysis led the authors to criticize the above theorems, to adopt an epistemological perspective called Semantic Realism, and to elaborate a nonstandard interpretation (briefly, SR interpretation) of QM within this perspective. This interpretation is objective, hence noncontextual and local, but preserves the mathematical apparatus of QM and its statistical interpretation, avoiding the paradoxes that afflict the standard interpretation (in particular, those connected with the quantum theory of measurement). Moreover, the Lecce approach also suggests that a broader theory can exist that embodies QM but says more than it does. The two approaches have a number of common features, and seem to come to the same conclusions about many problematical issues of QM. These similarities led us to start a critical comparison between them, with the aim of constructing a unified general perspective embodying QM and interpreting it in a realistic sense. The results that have been obtained can be summed up as follows.

We first provided a preliminary comparison between the BR and the Lecce approaches, analysing their similarities. We noted that both approaches distinguish between the mathematical and the conceptual problems of standard QM and support EPR's reasoning in favour of the incompleteness of QM. Moreover, we stressed that both approaches can be classified as realistic and operational, that they interpret quantum probabilities as epistemic rather than ontological, and that they neatly distinguish states from properties both from a conceptual and from a mathematical point of view. Finally, we observed that the two approaches criticize the widespread conviction that QL represents a logic in the standard sense. However, our comparison also pointed out some differences between the two approaches. Indeed, we pointed out that they come to seemingly different conclusions about the crucial issues of contextuality, nonlocality and separability. In particular, we considered Aerts' separated quantum entities theorem, which states that the standard quantum formalism cannot describe separated systems: this result is important in the BR approach, since it allows Aerts to conclude that QM must be modified if one wants to take into account this kind of entities. Our analysis of the theorem shows, however, that it depends on an implicit assumption that is not introduced in the Lecce approach. Therefore, Aerts' conclusion cannot be attained in this approach. Notwithstanding this, we observed that the SR model, which was devised in order to prove the consistency of the SR approach, can be integrated in some cases with the BR approach, which leads one to wonder whether a unified general perspective can be constructed. We left this question open, but we explicitly proved that it is possible to connect the BR and the Lecce approaches in a simple case, constructing a macroscopic version of the microscopic SR model for quantum measurements which embodies Aerts' quantum machine.
This result overcomes, in a particular situation, the seeming incompatibility between the two models (in particular, their seemingly different behaviour with respect to contextuality) and shows that it is possible, at least in principle, to connect the two approaches also in more complex situations.

In the framework of the comparison between the BR and the Lecce approaches, we studied the subentity problem in QM. It is well known that the density operators obtained by performing partial traces on the projection operator representing a pure entangled state of a compound physical system cannot be considered as representing either pure states or mixed states (proper mixtures) of the component subsystems: they represent instead improper mixtures, because the coefficients in the convex sums expressing these density operators never bear the ignorance interpretation. Hence, the attribution of states to the subsystems of a compound physical system is problematical in standard QM. We therefore discussed two alternative proposals that can be developed in the BR and the Lecce approaches. We observed that partial traces represent true states of the subsystems in both approaches. However, they represent new pure states in the BR approach, which may entail a breakdown of the linearity of QM, while improper mixtures can be regarded as nonpure states according to the Lecce approach. Moreover, it is reasonable in the Lecce approach to suppose that there exists a deeper distinction than the probabilistic one between the preparation devices producing proper and improper mixtures, and this provides an intuitive support to a typical conjecture in the Brussels approach on the different temporal evolution of the two kinds of mixtures. Hence, despite their different terminologies, the two proposals seem compatible, which suggests that it may be possible to provide a unified nonstandard solution of the subentity problem.
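The phenomenon described above can be illustrated with a short numerical sketch (a standard textbook computation, not specific to either approach): tracing out one qubit of a pure entangled Bell state yields a reduced density operator whose purity is strictly less than one, i.e., an improper mixture.

```python
import numpy as np

# Bell state |phi+> = (|00> + |11>)/sqrt(2), a pure entangled state
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi.conj())           # projection operator (pure state)

# Partial trace over the second qubit: rho_A[i, j] = sum_k rho[ik, jk]
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

purity_total = np.trace(rho @ rho).real   # 1.0: the compound state is pure
purity_sub = np.trace(rho_A @ rho_A).real # 0.5: the subsystem is maximally mixed

print(rho_A)                              # 0.5 * identity
print(purity_total, purity_sub)
```

The coefficients 1/2, 1/2 in the convex decomposition of `rho_A` arise from the entanglement of the compound state, not from ignorance about a pure state actually possessed by the subsystem, which is exactly why the mixture is called improper.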

It is important to observe that the solution of the subentity problem that arises from the SR interpretation is qualitative, in the sense that it allows one to distinguish only from a conceptual point of view proper and improper mixtures which are still represented by the same density operators in complex QM. We have therefore considered the more general formulation of QM in quaternionic Hilbert spaces. In particular, we have proved that proper and improper mixtures can be associated with different density operators in quaternionic QM, and that this distinction is compatible with their different time evolutions in complex QM. The new representation seems interesting since it allows one to discriminate between these conceptually different kinds of mixtures also from an empirical point of view.

Furthermore, we have obtained some original results in the Lecce approach. They are summarized in the following sections.

## 3. A semantic approach to the completeness problem in quantum mechanics

The problem of the completeness of QM was raised by EPR in a famous 1935 paper aiming to prove that QM is incomplete. It is known that Bohr replied with two papers in which the completeness of QM was asserted. The debate on this subject involved many scholars and is not completely exhausted even today, although Bohr's position largely prevails among physicists. Completeness of QM was meant both by EPR and by Bohr in an ontological sense, hence the debate was largely affected by the different philosophical positions of the competitors and could hardly lead to indisputable conclusions. On the contrary, an accurate analysis of the constitutive elements of a physical theory (its observative language, its partial interpretation and the theory of truth that is adopted) led us to conclude that, for every physical theory, the completeness problem becomes more tractable if it is preliminarily discussed from a semantic viewpoint. Indeed, once the notion of semantic completeness of a physical theory has been introduced, the reference to the language of the theory, rather than to a problematic external reality, provides rigorous criteria for establishing whether this kind of completeness occurs or not. We applied the semantic approach to QM and proved that QM is semantically complete (but nonobjective, see Sec. 1) if one adopts the theory of truth which is typical of the standard interpretation (empirical verificationism), which accords with Bohr's thesis. But we also proved that QM is semantically incomplete if a classical (Tarskian) theory of truth is adopted, which accords with EPR's thesis. However, the SR interpretation (see Sec. 2) can reconcile the different conclusions of EPR and Bohr by adopting an integrationist perspective that reinterprets non-Tarskian theories of truth (such as the one implicitly adopted by Bohr) as theories of metalinguistic concepts that are different from truth.
According to this perspective QM, although semantically incomplete, is complete with respect to the subset of all sentences of its observative language whose truth values are testable, or epistemically accessible (pragmatic completeness), consistently with Bohr's thesis.

## 4. Recovering nonstandard logics within an extended classical framework

An analysis of the various notions of realism occurring in physical theories has been performed which shows that, at variance with a widespread belief, all existing interpretations of QM (except for the statistical interpretation) presuppose a minimal form of realism which consists in assuming that QM deals with individual objects and their properties. It has been demonstrated that the standard arguments supporting the contextuality and the nonlocality (see Sec. 1) of QM are a significant clue to the implicit adoption of stronger and more compelling forms of realism (realism of theoretical entities and realism of theories), notwithstanding the asserted "antimetaphysical" character of standard QM. If these kinds of realism are replaced by the simpler and more intuitive semantic realism adopted in the SR interpretation, several fundamental problems of standard QM are avoided.

Furthermore, a procedure has been proposed which allows one to recover classical and nonclassical logical structures as concrete logics associated with physical theories expressed by means of classical languages. This procedure consists in choosing, for a given theory T and a classical language L expressing T, an observative sublanguage L' of L with a notion of truth as correspondence, introducing in L' a derived and theory-dependent notion of C-truth (true with certainty), defining a physical preorder induced by C-truth, and finally selecting a set of sentences that are verifiable according to T, on which a weak complementation is induced by T. The triple consisting of the set of verifiable sentences, the physical preorder and the weak complementation is then the desired concrete logic. By applying our procedure we have recovered a classical logic as the concrete logic associated with classical mechanics and standard QL as the concrete logic associated with QM. These results then show that some nonstandard logics can be obtained as mathematical structures formalizing the properties of different notions of verifiability in different physical theories. More generally, they strongly support the idea that many nonclassical logics can coexist without conflicting with classical logic (global pluralism), for they formalize metalinguistic notions that do not coincide with the notion of truth (described by Tarski's truth theory).

## 5. The ESR model: a noncontextual framework for quantum mechanics

We have recently elaborated an improved version of the SR model introduced in Sec. 2 (extended semantic realism, or ESR, model). The ESR model embodies the mathematical formalism of QM in a noncontextual framework reinterpreting quantum probabilities as conditional (in a nonconventional sense) rather than absolute. The ESR model consists of a microscopic and a macroscopic part.

The microscopic part can be considered as a new kind of noncontextual, hence local, hidden variables theory for QM which reinterprets quantum probabilities and provides a justification of the assumptions introduced in the macroscopic part. As a consequence, it yields some predictions that are formally identical to those of QM but have a different physical interpretation, and further predictions that differ also formally from those of QM. In particular, we proved that a modified Bell-Clauser-Horne-Shimony-Holt (BCHSH) inequality holds in the model if one takes into account all individual systems that are prepared, hence the ESR model is falsifiable. We moreover proved that standard quantum inequalities hold in the model if one considers only the individual systems that are detected, and that a BCHSH inequality holds at a microscopic (purely theoretical) level. Finally, we observed that these results admit an intuitive explanation in terms of an unconventional kind of unfair sampling.
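The ESR model's modified inequality is not reproduced here, but the standard situation it generalizes is easy to sketch numerically: for the singlet state, the quantum correlation E(a, b) = -cos(a - b) drives the BCHSH combination to magnitude 2√2, above the classical bound of 2. This is a textbook computation, independent of the ESR model's specific predictions.

```python
import numpy as np

def E(a, b):
    # Singlet-state correlation for spin measurements along angles a and b
    return -np.cos(a - b)

# Standard angle choices giving the maximal quantum violation
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

# BCHSH combination; local hidden variables require |S| <= 2
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))   # 2*sqrt(2) ~ 2.828, above the classical bound 2
```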

The macroscopic part can be considered as an autonomous theory that embodies the mathematical formalism of standard Hilbert space QM in a noncontextual framework, reinterpreting quantum probabilities as conditional. We showed that each generalized observable introduced by the ESR model is represented by a family of commutative positive operator valued (POV) measures parametrized by the pure states of the physical system that is considered. By using this representation, we obtained a generalization of the projection postulate for pure states (generalized projection postulate), provided rules for evaluating conditional and absolute probabilities in the case of both pure states and mixtures, and deepened the results discussed above on Bell's inequalities. Then we proved that a new mathematical representation of mixtures must be introduced which does not coincide with the standard representation in QM and avoids some deep problems that arise in the interpretation of mixtures provided by QM. Finally, we obtained a nontrivial generalization of the Lüders postulate (generalized Lüders postulate), which was justified in a special case by introducing a reasonable physical assumption on the evolution of the compound system made up of the measured system and the measuring apparatus.
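The role of POV measures and of conditional-on-detection probabilities can be illustrated with a minimal sketch. The efficiency-style parametrization below (a two-outcome spin measurement with a no-detection outcome governed by a parameter `eta`) is our illustrative assumption, not the ESR model's actual family of POV measures; it only shows how probabilities conditional on detection can recover the standard quantum values while absolute probabilities differ.

```python
import numpy as np

# Illustrative three-outcome POVM for an unsharp spin-z measurement with
# detection probability eta (this parametrization is an assumption made
# for illustration, not the ESR model's specific construction)
eta = 0.8
E = {
    "up": eta * np.diag([1.0, 0.0]),
    "down": eta * np.diag([0.0, 1.0]),
    "no_detection": (1 - eta) * np.eye(2),
}
assert np.allclose(sum(E.values()), np.eye(2))   # POVM normalization

# Pure state |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)

# Absolute probabilities: p_i = Tr(rho E_i)
probs = {k: np.trace(rho @ e).real for k, e in E.items()}
# Conditional (on detection) probabilities recover the quantum value 1/2
cond = {k: probs[k] / eta for k in ("up", "down")}
print(probs, cond)
```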

We stress that the approach put forward by the Lecce research group in the foundations of quantum mechanics has an interdisciplinary character, in the sense that it tackles the problems and the conceptual difficulties of QM from different (logical, physical, mathematical, epistemological, etc.) points of view. Furthermore, the ESR model developed in Lecce constitutes the first step toward the elaboration of a more general theory recovering the mathematical formalism of QM, providing new predictions, and constituting a completion of it.

## 6. Quantum structures in nonphysical domains

A research activity on the application of quantum structures in disciplines different from physics has been developed in collaboration with the Brussels research group. The BR approach indeed supplies a general framework (consisting of a SCoP formalism and a hidden measurement formalism) for the mathematical description of systems which cannot be treated independently of their environment, so that context effects become fundamental, while the application of classical structures (Boolean logic, Kolmogorovian probability, etc.) is problematical. Situations of this kind have been recognized in psychology, biology and economics.

(i) Psychology. The BR approach has been employed to model the data collected in some experiments carried out in cognitive science to estimate typicalities of exemplars of concepts and their combinations. The empirical results contradicted the predictions that could be obtained by interpreting combinations of concepts in terms of classical logic and set theory. In particular, an exemplar such as Guppy gave rise to a typicality with respect to the conjunction Pet-Fish which was much bigger than would be expected if the conjunction of Pet and Fish were treated from the perspective of classical logic and set theory (the Pet-Fish problem). Guppy-like effects were also identified in the membership weights of exemplars with respect to concepts and their combinations. By adopting the SCoP formalism and elaborating a quantum-based model one can instead describe and explain the experimental data in terms of contextual influence between concepts. Furthermore, it has been proved that the Guppy effect is also present on the World Wide Web whenever data on concepts and their combinations are collected by using search engines. A quantum-based model has been propounded which agrees with empirical data also in this case.
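The classical constraint that such experiments test can be sketched in a few lines. The membership weights below are illustrative placeholders, not the published experimental values; the point is only the test itself: any classical (fuzzy-set or Kolmogorovian) model requires the weight of a conjunction not to exceed the weight of either conjunct.

```python
# Overextension test for concept conjunctions (Pet-Fish problem).
# The membership weights are illustrative placeholders, not real data.
mu_pet, mu_fish, mu_pet_fish = 0.5, 0.7, 0.9

# A classical set-theoretic model requires mu(A and B) <= min(mu(A), mu(B));
# a violation ("overextension") rules out any such model for these data.
overextended = mu_pet_fish > min(mu_pet, mu_fish)
print(overextended)   # True: no classical model fits these weights
```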

The formulation above has allowed us to identify the presence of typically quantum effects in the mechanisms of concept combination, i.e., contextual influence, superposition, interference, emergence and entanglement. In particular, we have shown that some Bell's inequalities can be deduced by considering coincidence experiments and gathering data on real test subjects. The violation of these inequalities reveals the presence of entanglement between concepts. Moreover, the obtained results suggest the hypothesis that two structured and superposed layers can be identified in human thought: a classical logical layer, which can be modeled by a classical Kolmogorovian framework, and a quantum conceptual layer, which instead requires the formalism of QM. Some applications of our quantum cognition approach to information retrieval, artificial intelligence and robotics have also been discussed. More precisely, we have pointed out that the conceptual and technical problems of the latter disciplines could be due to the fact that they use the paradigm of classical computation to simulate processes occurring in the human mind which should instead require a quantum-based computation.

(ii) Biology. Population ecology is mainly based on the nonlinear Lotka-Volterra equations, which rule the dynamics of interacting species. But, for many interacting populations, these equations entail complex dynamical behavior and long-term unpredictability, and give rise to known problems (e.g., the plankton and enrichment paradoxes). A careful analysis shows that an ecological system is an intrinsically contextual system, while existing approaches, based on classical physics and probability theory, introduce contextuality as an external effect, hence they cannot generally explain the main features of ecosystems. For this reason, the BR approach has been adopted to work out a contextual formalism for ecosystems and to construct an extension of the Lotka-Volterra equations for contextual systems. The analytic solutions of these generalized equations entail an alternative explanation of the plankton paradox in terms of contextual interactions among individuals, species, populations and communities.
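For reference, the classical two-species Lotka-Volterra system that the contextual extension generalizes can be sketched as follows; the parameter values and initial populations are illustrative, and a simple Euler integration is used.

```python
# Classical predator-prey Lotka-Volterra dynamics (prey x, predator y);
# parameter values are illustrative placeholders.
def lotka_volterra(x, y, alpha=1.0, beta=0.4, delta=0.1, gamma=0.4):
    dx = alpha * x - beta * x * y    # prey growth minus predation
    dy = delta * x * y - gamma * y   # predator growth minus mortality
    return dx, dy

def integrate(x, y, dt=0.001, steps=20000):
    # Forward Euler integration over steps*dt time units
    for _ in range(steps):
        dx, dy = lotka_volterra(x, y)
        x, y = x + dt * dx, y + dt * dy
    return x, y

x_end, y_end = integrate(10.0, 5.0)
print(x_end, y_end)   # populations oscillate around the equilibrium (4, 2.5)
```

The closed orbits of this system are the starting point for the paradoxes mentioned above: small parameter changes or enrichment can produce large, persistent oscillations rather than convergence to equilibrium.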

The violation of the Lotka-Volterra equations could suggest that the neo-Darwinian paradigm is too rough to describe the evolution of biological species, and that a more general paradigm should be sought which connects biological, physical and cultural evolution.

(iii) Economics. The expected utility hypothesis is violated in real-life decisions, as shown by the Allais and Ellsberg paradoxes. The popular explanation in terms of ambiguity aversion is not completely accepted. To overcome these difficulties a distinction between risk and ambiguity has been introduced which depends on the existence of a Kolmogorovian structure modeling these uncertainties. On the other hand, context plays a relevant role in human decisions under uncertainty, and any probabilistic structure modeling contextual interactions between systems structurally needs a non-Kolmogorovian framework admitting a (generalized) quantum representation. We have thus proposed a notion of contextual risk to mathematically capture situations in which ambiguity occurs. The contextual risk approach has then been applied to the Ellsberg paradox: a sphere model has been elaborated within the hidden measurement formalism which reveals that it is the overall conceptual landscape that is responsible for the disagreement between actual human decisions and the predictions of expected utility theory which generates the paradox.
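The classical structure of the Ellsberg paradox can be checked mechanically. In the standard single-urn setup (30 red balls, 60 black-or-yellow balls in unknown proportion), typical subjects prefer a bet on red to a bet on black, yet prefer a bet on black-or-yellow to a bet on red-or-yellow. The scan below (a standard exercise, not our sphere model) shows that no single probability assignment reproduces both preferences under expected utility, which is what motivates a non-Kolmogorovian treatment.

```python
# Ellsberg single-urn paradox: 30 red balls, 60 black-or-yellow balls in
# unknown proportion. Check whether any probability for "black" makes both
# typical preferences consistent with expected utility.
def consistent(p_black):
    p_red = 30 / 90
    p_yellow = 60 / 90 - p_black
    prefers_red_over_black = p_red > p_black                        # bet A over bet B
    prefers_by_over_ry = p_black + p_yellow > p_red + p_yellow      # bet D over bet C
    return prefers_red_over_black and prefers_by_over_ry

# Scan p_black over its whole admissible range [0, 2/3]
assignments = [k / 1000 * (60 / 90) for k in range(1001)]
print(any(consistent(p) for p in assignments))   # False: no assignment works
```

The two preferences require p_red > p_black and p_black > p_red simultaneously, so expected utility fails for every Kolmogorovian assignment.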

Interesting results have also been attained in quantitative finance. Modern approaches to stock pricing are typically founded on the Black-Scholes equations and the underlying random walk hypothesis. Empirical data indicate that this hypothesis works well in stable situations, but fails in abrupt transitions (market crashes, economic crises, etc.), hence alternative descriptions are needed. By using the SCoP formalism we have demonstrated that a stock market is an intrinsically contextual system in which agents' decisions globally influence the market system and stock prices. More specifically, a given stock does not generally have a definite value, e.g., a price, but its value is actualized as a consequence of the contextual interactions in the trading process. This contextual influence is responsible for the non-Kolmogorovian quantum behavior of the market at a statistical level. Finally, we have proposed a sphere model within the hidden measurement formalism that describes the buying/selling process of a stock and provides an intuitive support to the employment of quantum structures in finance.
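The random walk hypothesis mentioned above amounts to modelling prices as geometric Brownian motion; a minimal simulation (with illustrative drift and volatility parameters) makes explicit the assumption that fails in abrupt transitions: independent, identically distributed Gaussian log-returns.

```python
import numpy as np

# Geometric Brownian motion, the random-walk model underlying Black-Scholes
# pricing; mu, sigma and the initial price are illustrative placeholders.
rng = np.random.default_rng(0)
mu, sigma = 0.05, 0.2
dt, n = 1 / 252, 252          # one trading year of daily steps

# i.i.d. Gaussian log-returns: exactly the assumption that breaks in crashes
log_returns = (mu - sigma**2 / 2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
prices = 100.0 * np.exp(np.cumsum(log_returns))

print(prices[-1])             # one simulated year-end price
```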