Weekly Papers on Quantum Foundations (12)

This is a list of this week’s papers on quantum foundations published in various journals or uploaded to preprint servers such as arxiv.org and the PhilSci Archive.

Bell’s Theorem and the Issue of Determinism and Indeterminism

Latest Results for Foundations of Physics

on 2015-3-21 12:00am GMT

Abstract

The paper considers the claim that quantum theories with a deterministic dynamics of objects in ordinary space-time, such as Bohmian mechanics, contradict the assumption that the measurement settings can be freely chosen in the EPR experiment. That assumption is one of the premises of Bell’s theorem. I first argue that only a premise to the effect that what determines the choice of the measurement settings is independent of what determines the past state of the measured system is needed for the derivation of Bell’s theorem. Determinism as such does not undermine that independence (unless there are particular initial conditions of the universe that would amount to conspiracy). Only entanglement could do so. However, generic entanglement without collapse on the level of the universal wave-function can go together with effective wave-functions for subsystems of the universe, as in Bohmian mechanics. The paper argues that such effective wave-functions are sufficient for the mentioned independence premise to hold.

A Reconstruction of Quantum Mechanics

Latest Results for Foundations of Physics

on 2015-3-20 12:00am GMT

Abstract

We show that exactly the same intuitively plausible definitions of state, observable, symmetry, dynamics, and compound systems of the classical Boolean structure of intrinsic properties of systems lead, when applied to the structure of extrinsic, relational quantum properties, to the standard quantum formalism, including the Schrödinger equation and the von Neumann–Lüders Projection Rule. This approach is then applied to resolving the paradoxes and difficulties of the orthodox interpretation.

Reply to Fleming: Symmetries, observables, and the occurrence of events

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: Available online 5 October 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Thomas Pashby
In this article I reply to Fleming’s response to my ‘Time and quantum theory: a history and a prospectus.’ I take issue with two of his claims: (i) that quantum theory concerns the (potential) properties of eternally persisting objects; (ii) that there is an underdetermination problem for Positive Operator Valued Measures (POVMs). I advocate an event-first view which regards the probabilities supplied by quantum theory as probabilities for the occurrence of physical events rather than the possession of properties by persisting objects.

Ontological aspects of the Casimir Effect

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part A
Author(s): William M.R. Simpson
The role of the vacuum, in the Casimir Effect, is a matter of some dispute: the Casimir force has been variously described as a phenomenon resulting “from the alteration, by the boundaries, of the zero-point electromagnetic energy” (Bordag, Mohideen, & Mostepanenko, 2001), or a “van der Waals force between the metal plates” that can be “computed without reference to zero point energies” (Jaffe, 2005). Neither of these descriptions is grounded in a consistently quantum mechanical treatment of matter interacting with the electromagnetic field. However, the Casimir Effect has been canonically described within the framework of macroscopic quantum electrodynamics (Philbin, 2010). On this general account, the force is seen to arise due to the coupling of fluctuating currents to the zero-point radiation, and it is in this restricted sense that the phenomenon requires the existence of zero-point fields. The conflicting descriptions of the Casimir Effect, on the other hand, appear to arise from ontologies in which an unwarranted metaphysical priority is assigned either to the matter or the fields, and this may have a direct bearing on the problem of the cosmological constant.

No superluminal propagation for classical relativistic and relativistic quantum fields

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part B
Author(s): John Earman
A criterion is proposed to ensure that classical relativistic fields do not propagate superluminally. If this criterion does indeed serve as a sufficient condition for no superluminal propagation it follows that various other criteria found in the physics literature cannot serve as necessary conditions since they can fail although the proffered condition holds. The rejected criteria rely on energy conditions that are believed to hold for most classical fields used in actual applications. But these energy conditions are known to fail at small scales for quantum fields. It is argued that such a failure is not necessarily a cause for concern about superluminal propagation in the quantum regime since the proffered criterion of no superluminal propagation for classical fields has a natural analog for quantum fields and, further, this quantum analog condition provably holds for some quantum fields despite the violation of energy conditions. The apparatus developed here also offers a different approach to treating the Reichenbach–Salmon cases of “pseudo-causal processes” and helps to clarify the issue of whether relativity theory is consistent with superluminal propagation.

Can the ontological models framework accommodate Bohmian mechanics?

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part A
Author(s): Benjamin Feintzeig
The ontological models framework has been proposed as a tool to prove general results about many competing interpretations of quantum mechanics at once. I argue that the ontological models framework is at best ambiguous, and at worst unable to accomplish its task of representing even the best-known interpretations of quantum mechanics. I show that when the framework is made mathematically precise, it cannot accommodate Bohmian mechanics, a well-known interpretation of quantum mechanics in terms of hidden variables.

Comment on Ashtekar: Generalization of Wigner’s principle

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: Available online 11 September 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Bryan W. Roberts
Ashtekar’s generalization of Curie’s principle and Kabir’s principle in this volume shows that these principles are robust, obtaining in a variety of modifications of quantum theory. In this note, I illustrate how Wigner’s principle can be similarly generalized.

Maudlin’s challenge refuted: A reply to Lewis

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: August 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 47
Author(s): Ruth E. Kastner
Lewis has recently argued that Maudlin’s contingent absorber experiment remains a significant problem for the Transactional Interpretation (TI). He argues that the only straightforward way to resolve the challenge is by describing the absorbers as offer waves, and asserts that this is a previously unnoticed aspect of the challenge for TI. This argument is refuted in two basic ways: (i) it is noted that the Maudlin experiment cannot be meaningfully recast with absorbers described by quantum states; instead the author replaces it with an ordinary which-way experiment; and (ii) the extant rebuttals to the Maudlin challenge in its original form are not in fact subject to the alleged flaws that Lewis ascribes to them. This paper further seeks to clarify the issues raised in Lewis’ presentation concerning the distinction between quantum systems and macroscopic objects in TI. It is noted that the latest, possibilist version of TI (PTI) has no ambiguity concerning macroscopic absorbers. In particular, macroscopic objects are not subject to indeterminate trajectories, since they are continually undergoing collapse. It is concluded that the Maudlin challenge poses no significant problem for the transactional interpretation.

Modular localization and the holistic structure of causal quantum theory, a historical perspective

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: February 2015
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 49
Author(s): Bert Schroer
Recent insights into the conceptual structure of localization in QFT (modular localization) led to clarifications of old unsolved problems. The oldest one is the Einstein–Jordan conundrum which led Jordan in 1925 to the discovery of quantum field theory. This comparison of fluctuations in subsystems of heat bath systems (Einstein) with those resulting from the restriction of the QFT vacuum state to an open subvolume (Jordan) leads to a perfect analogy; the globally pure vacuum state becomes upon local restriction a strongly impure KMS state. This phenomenon of localization-caused thermal behavior as well as the vacuum-polarization clouds at the causal boundary of the localization region places localization in QFT into a sharp contrast with quantum mechanics and justifies the attribute “holistic”. In fact it positions the E–J Gedankenexperiment into the same conceptual category as the cosmological constant problem and the Unruh Gedankenexperiment. The holistic structure of QFT resulting from “modular localization” also leads to a revision of the conceptual origin of the crucial crossing property which entered particle theory at the time of the bootstrap S-matrix approach but suffered from incorrect use in the S-matrix settings of the dual model and string theory. The new holistic point of view, which strengthens the autonomous aspect of QFT, also comes with new messages for gauge theory by exposing the clash between Hilbert space structure and localization and presenting alternative solutions based on the use of string-local fields in Hilbert space. Among other things this leads to a reformulation of the Englert–Higgs symmetry breaking mechanism.

Measurements according to Consistent Histories

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part A
Author(s): Elias Okon, Daniel Sudarsky
We critically evaluate the treatment of the notion of measurement in the Consistent Histories approach to quantum mechanics. We find such a treatment unsatisfactory because it relies, often implicitly, on elements external to those provided by the formalism. In particular, we note that, in order for the formalism to be informative when dealing with measurement scenarios, one needs to assume that the appropriate choice of framework is such that apparatuses are always in states of well-defined pointer positions after measurements. The problem is that there is nothing in the formalism to justify this assumption. We conclude that the Consistent Histories approach, contrary to what is claimed by its proponents, fails to provide a truly satisfactory resolution for the measurement problem in quantum theory.

Niels Bohr as philosopher of experiment: Does decoherence theory challenge Bohr’s doctrine of classical concepts?

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: February 2015
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 49
Author(s): Kristian Camilleri, Maximilian Schlosshauer
Niels Bohr’s doctrine of the primacy of “classical concepts” is arguably his most criticized and misunderstood view. We present a new, careful historical analysis that makes clear that Bohr’s doctrine was primarily an epistemological thesis, derived from his understanding of the functional role of experiment. A hitherto largely overlooked disagreement between Bohr and Heisenberg about the movability of the “cut” between measuring apparatus and observed quantum system supports the view that, for Bohr, such a cut did not originate in dynamical (ontological) considerations, but rather in functional (epistemological) considerations. As such, both the motivation and the target of Bohr’s doctrine of classical concepts are of a fundamentally different nature than what is understood as the dynamical problem of the quantum-to-classical transition. Our analysis suggests that, contrary to claims often found in the literature, Bohr’s doctrine is not, and cannot be, at odds with proposed solutions to the dynamical problem of the quantum–classical transition that were pursued by several of Bohr’s followers and culminated in the development of decoherence theory.

Retrocausal models for EPR

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: February 2015
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 49
Author(s): Richard Corry
This paper takes up Huw Price’s challenge to develop a retrocausal toy model of the Bell-EPR experiment. I develop three such models which show that a consistent, local, hidden-variables interpretation of the EPR experiment is indeed possible, and which give a feel for the kind of retrocausation involved. The first of the models also makes clear a problematic feature of retrocausation: it seems that we cannot interpret the hidden elements of reality in a retrocausal model as possessing determinate dispositions to affect the outcome of experiments. This is a feature which Price has embraced, but Gordon Belot has argued that this feature renders retrocausal interpretations “unsuitable for formal development”, and the lack of such determinate dispositions threatens to undermine the motivation for hidden-variables interpretations in the first place. But Price and Belot are both too quick in their assessment. I show that determinate dispositions are indeed consistent with retrocausation. What is more, I show that the ontological economy allowed by retrocausation holds out the promise of a classical understanding of spin and polarization.

Time in fundamental physics

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: Available online 3 October 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Abhay Ashtekar
The first three sections of this paper contain a broad brush summary of the profound changes in the notion of time in fundamental physics that were brought about by three revolutions: the foundations of mechanics distilled by Newton in his Principia, the discovery of special relativity by Einstein and its reformulation by Minkowski, and, finally, the fusion of geometry and gravity in Einstein’s general relativity. The fourth section discusses two aspects of yet another deep revision that waits in the wings as we attempt to unify general relativity with quantum physics.

Causality and chance in relativistic quantum field theories

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part B
Author(s): Richard A. Healey
Bell appealed to the theory of relativity in formulating his principle of local causality. But he maintained that quantum field theories do not conform to that principle, even when their field equations are relativistically covariant and their observable algebras satisfy a relativistically motivated microcausality condition. A pragmatist view of quantum theory and an interventionist approach to causation prompt the reevaluation of local causality and microcausality. Local causality cannot be understood as a reasonable requirement on relativistic quantum field theories: it is unmotivated even if applicable to them. But microcausality emerges as a sufficient condition for the consistent application of a relativistic quantum field theory.

Four tails problems for dynamical collapse theories

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: February 2015
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 49
Author(s): Kelvin J. McQueen
The primary quantum mechanical equation of motion entails that measurements typically do not have determinate outcomes, but result in superpositions of all possible outcomes. Dynamical collapse theories (e.g. GRW) supplement this equation with a stochastic Gaussian collapse function, intended to collapse the superposition of outcomes into one outcome. But the Gaussian collapses are imperfect in a way that leaves the superpositions intact. This is the tails problem. There are several ways of making this problem more precise. But many authors dismiss the problem without considering the more severe formulations. Here I distinguish four distinct tails problems. The first (bare tails problem) and second (structured tails problem) exist in the literature. I argue that while the first is a pseudo-problem, the second has not been adequately addressed. The third (multiverse tails problem) reformulates the second to account for recently discovered dynamical consequences of collapse. Finally the fourth (tails problem dilemma) shows that solving the third by replacing the Gaussian with a non-Gaussian collapse function introduces new conflict with relativity theory.
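The core of the tails problem described above can be made concrete in a few lines. The following is a minimal numerical sketch (our illustration, not taken from the paper, and all parameter values are hypothetical): a superposition of two wave packets is multiplied by a GRW-style Gaussian localization function centred on one branch, and the weight surviving in the other branch is computed.

```python
import numpy as np

# Illustrative sketch (not from the paper): a GRW-style Gaussian "hit"
# suppresses the distant branch of a superposition but never annihilates
# it, leaving a small "tail". Parameters are arbitrary illustrative values.
x = np.linspace(-20.0, 20.0, 4001)
a, s = 5.0, 1.0                 # half-separation and width of the two packets
psi = np.exp(-(x - a)**2 / (2 * s**2)) + np.exp(-(x + a)**2 / (2 * s**2))

rc = 2.0                        # localization width of the collapse function
hit = np.exp(-(x - a)**2 / (4 * rc**2))   # Gaussian hit centred on the +a branch
post = hit * psi
post /= np.sqrt(np.trapz(post**2, x))     # renormalize the post-collapse state

# probability weight remaining in the branch the collapse was meant to remove
tail = np.trapz(post[x < 0]**2, x[x < 0])
print(f"residual tail weight: {tail:.2e}")  # tiny, but strictly positive
```

However small `rc` is made, the tail weight is suppressed exponentially but never reaches zero, which is why the problem admits the several distinct formulations the paper distinguishes.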

Does the Reeh–Schlieder theorem violate relativistic causality?

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part B
Author(s): Giovanni Valente
The Reeh–Schlieder theorem is a purely relativistic result in local quantum field theory, which is often regarded as raising a conflict with relativistic causality, namely the requirement that causal processes cannot propagate faster than light. Allegedly, under an operational interpretation, the theorem would entail non-local effects, in that by performing an operation within a region of Minkowski spacetime one could instantaneously change the state of the field over another spacelike separated region. Here, we argue that such a conflict is only apparent. Indeed, a suitable understanding of the notion of local operations helps one dissolve the puzzle. Accordingly, even if one does not exclude superluminal signalling, the latter cannot be controlled, and thus it may not be used to give rise to causal paradoxes. On the other hand, we maintain that relativistic causality is expressed by the axiom of local primitive causality, assuring no superluminal propagation of a field. The Reeh–Schlieder theorem can be shown to be fully consistent with such a condition, and hence it does not imply that matter and energy carried by a quantum field can travel faster than light. Therefore, there is no violation of relativistic causality at all.

Response to Dr. Pashby: Time operators and POVM observables in quantum mechanics

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: Available online 22 September 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Gordon N. Fleming
I argue against a general time observable in quantum mechanics except for quantum gravity theory. Then I argue in support of case specific arrival, dwell and relative time observables with a cautionary note concerning the broad approach to POVM observables because of the wild proliferation available.

On the relation between the probabilistic characterization of the common cause and Bell’s notion of local causality

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: February 2015
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 49
Author(s): Gábor Hofer-Szabó
In this paper the relation between the standard probabilistic characterization of the common cause (used for the derivation of the Bell inequalities) and Bell’s notion of local causality will be investigated in the isotone net framework borrowed from algebraic quantum field theory. The logical role of two components in Bell’s definition will be scrutinized; namely that the common cause is localized in the intersection of the past of the correlated events; and that it provides a complete specification of the ‘beables’ of this intersection.

Entanglement and disentanglement in relativistic quantum mechanics

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2015-3-19 10:33am GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part B
Author(s): Jeffrey A. Barrett
A satisfactory formulation of relativistic quantum mechanics requires that one be able to represent the entangled states of spacelike separated systems and describe how such states evolve. This paper presents two stories that one must be able to tell coherently in order to understand relativistic entangled systems. These stories help to illustrate why one’s understanding of entanglement in relativistic quantum mechanics must ultimately depend on the details of one’s strategy for addressing the quantum measurement problem.

The Wave Function as Matter Density: Ontological Assumptions and Experimental Consequences

Latest Results for Foundations of Physics

on 2015-3-19 12:00am GMT

Abstract

The wavefunction is the central mathematical entity of quantum mechanics, but it still lacks a universally accepted interpretation. Much effort is spent on attempts to probe its fundamental nature. Here I investigate the consequences of a matter ontology applied to spherical masses of constant bulk density. The governing equation for the center-of-mass wavefunction is derived and solved numerically. The ground state wavefunctions and resulting matter densities are investigated. A lowering of the density from its bulk value is found for low masses due to increased spatial spreading. A discussion of the possibility to experimentally observe these effects is given and the possible consequences for choosing an ontological interpretation for quantum mechanics are commented upon.

How the Weak Variance of Momentum Can Turn Out to be Negative

Latest Results for Foundations of Physics

on 2015-3-19 12:00am GMT

Abstract

Weak values are average quantities, therefore investigating their associated variance is crucial in understanding their place in quantum mechanics. We develop the concept of a position-postselected weak variance of momentum as cohesively as possible, building primarily on material from Moyal (Mathematical Proceedings of the Cambridge Philosophical Society, Cambridge University Press, Cambridge, 1949) and Sonego (Found Phys 21(10):1135, 1991). The weak variance is defined in terms of the Wigner function, using a standard construction from probability theory. We show this corresponds to a measurable quantity, which is not itself a weak value. It also leads naturally to a connection between the imaginary part of the weak value of momentum and the quantum potential. We study how the negativity of the Wigner function causes negative weak variances, and the implications this has for a class of ‘subquantum’ theories. We also discuss the role of weak variances in studying determinism, deriving the classical limit from a variational principle.
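The “standard construction from probability theory” the abstract invokes can be sketched as follows (our notation, not necessarily the authors’): treat the Wigner function W(x, p) as if it were a joint density, and form the conditional moments of momentum at the postselected position x,

```latex
\langle p \rangle_x \;=\; \frac{\int p\, W(x,p)\, \mathrm{d}p}{\int W(x,p)\, \mathrm{d}p},
\qquad
\operatorname{Var}_x(p) \;=\; \frac{\int p^{2}\, W(x,p)\, \mathrm{d}p}{\int W(x,p)\, \mathrm{d}p}
\;-\; \langle p \rangle_x^{2}.
```

Because W(x, p) is only a quasi-probability distribution and may take negative values, Var_x(p), unlike a true variance, carries no guarantee of being non-negative, which is the possibility the title refers to.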

Weak values as interference phenomena

PRA: Fundamental concepts

on 2015-3-18 2:00pm GMT

Author(s): Justin Dressel

Weak values arise experimentally as conditioned averages of weak (noisy) observable measurements that minimally disturb an initial quantum state, and also as dynamical variables for reduced quantum state evolution even in the absence of measurement. These averages can exceed the eigenvalue range of …

[Phys. Rev. A 91, 032116] Published Wed Mar 18, 2015

Is spontaneous wave function collapse testable at all? (arXiv:1503.04681v1 [quant-ph])

quant-ph updates on arXiv.org

on 2015-3-17 12:48am GMT

Authors: Lajos Diósi

Mainstream literature on spontaneous wave function collapse never reflects on, or profits from, the formal coincidence and conceptual relationship with standard collapse under time-continuous quantum measurement (monitoring). I propose some easy lessons from standard monitoring theory which would lead spontaneous collapse models to revise some of their claims. In particular, the objective detection of spontaneous collapse remains impossible as long as the correct identification of what corresponds to the signal in standard monitoring is missing from spontaneous collapse models, the physical detectability of the “signal” is not stated explicitly and, finally, the principles of physical detection are not revealed.

Quantum Cloning using Protective Measurement. (arXiv:1503.04547v1 [quant-ph])

quant-ph updates on arXiv.org

on 2015-3-17 12:48am GMT

Authors: C. S. Sudheer Kumar

In this paper we show that, in principle, it is possible to clone an arbitrary unknown quantum state of a spin-1/2 particle (an electron) using protective measurement. However, we do not make any claims about the attainable precision (i.e., how close the cloned copy will be to the original), as that requires further thorough analysis. Since nonorthogonal state discrimination is a subclass of cloning, the first half of the paper (up to finding $\theta_{m}$) is sufficient for discrimination.

Quantum gravity: Spacetime fuzziness in focus

Nature Physics – AOP – nature.com science feeds

on 2015-3-16 12:00am GMT

Nature Physics. doi:10.1038/nphys3293

Author: Agnieszka Jacholkowska

Photons emitted by extragalactic sources provide an opportunity to test quantum gravity effects that modify the speed of light in vacuum. Studying the arrival times of these cosmic messengers further constrains the energy scales involved.

A Planck-scale limit on spacetime fuzziness and stochastic Lorentz invariance violation

Nature Physics – AOP – nature.com science feeds

on 2015-3-16 12:00am GMT

Nature Physics. doi:10.1038/nphys3270

Authors: Vlasios Vasileiou, Jonathan Granot, Tsvi Piran & Giovanni Amelino-Camelia

Wheeler’s ‘spacetime-foam’ picture of quantum gravity (QG) suggests spacetime fuzziness (fluctuations leading to non-deterministic effects) at distances comparable to the Planck length, L_Pl ≈ 1.62 × 10^−33 cm, the inverse (in natural units) of the Planck energy, E_Pl ≈ 1.22 × 10^19 GeV. The resulting non-deterministic motion of photons on the Planck scale is expected to produce energy-dependent stochastic fluctuations in their speed. Such a stochastic deviation from the well-measured speed of light at low photon energies, c, should be contrasted with the possibility of an energy-dependent systematic, deterministic deviation. Such a systematic deviation, on which observations by the Fermi satellite set Planck-scale limits for linear energy dependence, is more easily searched for than stochastic deviations. Here, for the first time, we place Planck-scale limits on the more generic spacetime-foam prediction of energy-dependent fuzziness in the speed of photons. Using high-energy observations from the Fermi Large Area Telescope (LAT) of gamma-ray burst GRB 090510, we test a model in which photon speeds are distributed normally around c with a standard deviation proportional to the photon energy. We constrain the model’s characteristic energy scale beyond the Planck scale at >2.8 E_Pl (>1.6 E_Pl), at 95% (99%) confidence. Our results set a benchmark constraint to be reckoned with by any QG model that features spacetime quantization.
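The model being tested is simple enough to simulate. Below is a toy Monte Carlo sketch of the stochastic-dispersion picture (our illustration; the distance D and the choice E_QG = E_Pl are hypothetical assumptions, not the paper’s fit): fractional speed fluctuations are drawn from a normal distribution with standard deviation E / E_QG, and the resulting spread in arrival times over a cosmological distance is estimated.

```python
import numpy as np

# Toy Monte Carlo of the stochastic-dispersion model in the abstract:
# dv/c ~ Normal(0, E / E_QG). D and E_QG = E_Pl are illustrative
# assumptions, not values fitted in the paper.
rng = np.random.default_rng(0)
c = 2.998e10          # speed of light, cm/s
E_Pl = 1.22e19        # Planck energy, GeV
D = 3.0e27            # source distance, cm (~1 Gpc, cosmology ignored)

def arrival_jitter(E_GeV, n=20000, E_QG=E_Pl):
    """Std. dev. of arrival times when dv/c ~ Normal(0, E/E_QG);
    to first order the arrival-time shift is -(D/c) * (dv/c)."""
    frac = rng.normal(0.0, E_GeV / E_QG, size=n)
    return np.std((D / c) * frac)

lo, hi = arrival_jitter(0.1), arrival_jitter(10.0)
print(f"arrival-time jitter: {lo:.2e} s at 0.1 GeV, {hi:.2e} s at 10 GeV")
```

The jitter grows linearly with photon energy, which is why the highest-energy LAT photons from the burst dominate the constraint.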

Contextuality in Three Types of Quantum-Mechanical Systems

Latest Results for Foundations of Physics

on 2015-3-15 12:00am GMT

Abstract

We present a formal theory of contextuality for a set of random variables grouped into different subsets (contexts) corresponding to different, mutually incompatible conditions. Within each context the random variables are jointly distributed, but across different contexts they are stochastically unrelated. The theory of contextuality is based on the analysis of the extent to which some of these random variables can be viewed as preserving their identity across different contexts when one considers all possible joint distributions imposed on the entire set of the random variables. We illustrate the theory on three systems of traditional interest in quantum physics (and also in non-physical, e.g., behavioral studies). These are systems of the Klyachko–Can–Binicioglu–Shumovsky-type, Einstein–Podolsky–Rosen–Bell-type, and Suppes–Zanotti–Leggett–Garg-type. Listed in this order, each of them is formally a special case of the previous one. For each of them we derive necessary and sufficient conditions for contextuality while allowing for experimental errors and contextual biases or signaling. Based on the same principles that underlie these derivations we also propose a measure for the degree of contextuality and compute it for the three systems in question.
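The Einstein–Podolsky–Rosen–Bell-type system mentioned above is the best-known of the three, and its contextuality condition reduces (in the error-free, no-signaling case) to the familiar CHSH inequality. A minimal sketch of that special case, not of the paper’s general framework:

```python
import itertools
import numpy as np

# CHSH sketch for the EPR-Bell-type system: consider the expression
#   S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Deterministic noncontextual assignments bound |S| by 2, while the
# quantum singlet correlations E = -cos(angle difference) exceed it.

# noncontextual bound: enumerate every +/-1 assignment to A, A', B, B'
S_nc = max(abs(A * B - A * Bp + Ap * B + Ap * Bp)
           for A, Ap, B, Bp in itertools.product([-1, 1], repeat=4))

def E(a, b):
    return -np.cos(a - b)       # singlet-state correlation

# measurement angles that maximize the quantum value
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S_q = abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

print(S_nc, S_q)                # 2 and 2*sqrt(2) ≈ 2.828
```

Random variables whose joint distribution can reproduce all four correlations while keeping each observable’s identity fixed across contexts exist exactly when |S| ≤ 2, which is the sense of “preserving identity across contexts” used in the abstract.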
