Weekly Papers on Quantum Foundations (5)

How to Build an Infinite Lottery Machine

We propose two novel alternatives: 'no-leaking' (roughly, that information gain causes disturbance) and 'purity of cups' (roughly, the existence of entangled states). Interestingly, these turn out to be equivalent in any process theory with cups and caps. Additionally, we show how the standard purification postulate can then be seen as an immediate consequence of the symmetric purification postulate and purity of cups. Other tangential results concern the specific frameworks of generalised probabilistic theories (GPTs) and process theories (a.k.a. CQM). Firstly, we provide a diagrammatic presentation of GPTs, which can henceforth be subsumed under process theories. Secondly, we characterise the additional axioms needed for a process theory to correspond to the Hilbert-space model, and in particular show that a 'sharp dagger' is indeed the right choice of dagger structure.
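The 'cups and caps' mentioned in this abstract have a concrete realisation in finite-dimensional Hilbert spaces: the cup is the unnormalised maximally entangled state Σᵢ|ii⟩, the cap is its adjoint, and together they satisfy the 'snake' (yanking) equation. A minimal numerical sketch (toy dimension d = 3; not taken from the paper):

```python
import numpy as np

d = 3
I = np.eye(d)

# 'Cup': the unnormalised maximally entangled state sum_i |ii>,
# as a vector in C^d (x) C^d. Reshaping the identity gives exactly this.
cup = I.reshape(d * d)
cap = cup  # the 'cap' is its adjoint; real entries, so same array

# Snake ("yanking") equation: (cap (x) I)(I (x) cup) = I.
lhs = np.kron(cap.reshape(1, d * d), I) @ np.kron(I, cup.reshape(d * d, 1))
print(np.allclose(lhs, I))  # True
```

Diagrammatically, this says a bent wire can be pulled straight, which is the basic identity behind the equivalence of 'no-leaking' and 'purity of cups' in theories possessing such structure.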

Authors: Christian de Ronde, César Massri In this paper we consider quantum superpositions from the perspective of the logos categorical approach presented in [26]. We will argue that our approach allows us not only to better visualize the structural features of quantum superpositions, providing an anschaulich content to all terms, but also to restore, through the intensive valuation of graphs and the notion of immanent power, an objective representation of what QM is really talking about. In particular, we will discuss how superpositions relate to some of the main features of the theory of quanta, namely contextuality, paraconsistency, probability and measurement.
Authors: Werner A. Hofer I revisit the reply of Bohr to Einstein. Bohr's implication that there are no causes in atomic-scale systems is, as a closer analysis reveals, not in line with the Copenhagen interpretation, since it would contain a statement about reality. What Bohr should have written is that there are no causes in mathematics, which is universally acknowledged. The law of causality requires physical effects to be due to physical causes. For this reason, any theoretical model which replaces physical causes by mathematical objects is creationism; that is, it creates physical objects out of mathematical elements. I show that this is the case for most of quantum mechanics.
Authors: Joseph Samuel The double-slit experiment is iconic and widely used in classrooms to demonstrate the fundamental mystery of quantum physics. The puzzling feature is that the probability of an electron arriving at the detector when both slits are open is not the sum of the probabilities when the slits are open separately. The superposition principle of quantum mechanics tells us to add amplitudes rather than probabilities, and this results in interference. This experiment defies our classical intuition that the probabilities of exclusive events add. In understanding the emergence of the classical world from the quantum one, there have been suggestions by Feynman, Diósi and Penrose that gravity is responsible for suppressing interference. This idea has been pursued in many different forms ever since, predominantly within Newtonian approaches to gravity. In this paper, we propose and theoretically analyse two 'gedanken' (thought) experiments which lend strong support to the idea that gravity is responsible for decoherence. The first makes the point that thermal radiation can suppress interference. The second shows that in an accelerating frame, Unruh radiation plays the same role. Invoking the Einstein equivalence principle to relate acceleration to gravity, we support the view that gravity is responsible for decoherence.
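The "add amplitudes, not probabilities" point in this abstract is easy to make quantitative. A toy calculation (illustrative amplitudes, not from the paper): with equal-weight paths through the two slits and a relative phase φ between them, the two-slit probability differs from the sum of the one-slit probabilities by an interference term:

```python
import numpy as np

# Toy amplitudes for reaching a fixed detector point via slit 1 and slit 2,
# with a relative phase phi between the two paths.
phi = np.pi / 3
a1 = 1 / np.sqrt(2)
a2 = np.exp(1j * phi) / np.sqrt(2)

p1 = abs(a1) ** 2           # probability with only slit 1 open
p2 = abs(a2) ** 2           # probability with only slit 2 open
p_both = abs(a1 + a2) ** 2  # both open: add amplitudes, THEN square

print(p1 + p2)              # 1.0  (classical expectation)
print(p_both)               # 1.5  (= 1 + cos(phi))
print(p_both - (p1 + p2))   # 0.5  interference term, 2*Re(conj(a1)*a2)
```

Decoherence mechanisms (thermal or Unruh radiation, in the paper's scenarios) suppress exactly this cross term, restoring the classical sum rule.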
Authors: Ehtibar N. Dzhafarov, Janne V. Kujala The Contextuality-by-Default theory is illustrated on contextuality analysis of the idealized double-slit experiment. The system of contextually labeled random variables describing this experiment forms a cyclic system of rank 4, formally the same as the system describing the EPR/Bohm paradigm (with signaling). Unlike the EPR/Bohm system, however, the double-slit experiment is always noncontextual, i.e., the context-dependence in it is entirely attributable to direct influences of contexts (closed-open arrangements of the slits) upon the random variables involved. The analysis presented is entirely within the framework of abstract classical probability theory (with multiple domain probability spaces). The only physical constraint used in the analysis is that a particle cannot reach a detector through a closed slit. The noncontextuality of the double-slit system does not generalize to systems describing experiments with more than two slits: an example shows that a triple-slit system may very well be contextual.
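For readers unfamiliar with cyclic systems: in the Contextuality-by-Default literature, a cyclic system of rank n is judged contextual when the maximum, over sign patterns with an odd number of minus signs, of the signed sum of pairwise correlations exceeds n − 2 plus a term Δ quantifying direct influences (signaling). The sketch below is an illustrative rendering of that criterion under these assumptions; the exact form should be checked against the paper:

```python
from itertools import product

def s_odd(xs):
    # Max over all sign patterns with an ODD number of minus signs.
    return max(sum(s * x for s, x in zip(signs, xs))
               for signs in product([1, -1], repeat=len(xs))
               if signs.count(-1) % 2 == 1)

def contextual(corrs, deltas):
    """Rank-n cyclic system: contextual iff s_odd(corrs) > (n - 2) + Delta,
    where Delta sums the magnitudes of direct context influences."""
    n = len(corrs)
    return s_odd(corrs) > (n - 2) + sum(deltas)

# PR-box-like correlations with no signaling: contextual (s_odd = 4 > 2).
print(contextual([1, 1, 1, -1], [0, 0, 0, 0]))  # True
# Same correlations, but large direct influences absorb the excess.
print(contextual([1, 1, 1, -1], [1, 1, 0, 0]))  # False
```

This is the sense in which the double-slit system is "always noncontextual": its context-dependence is fully absorbed by the Δ term.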
Authors: Jason Pollack, Ashmeet Singh Field theories place one or more degrees of freedom at every point in space. Hilbert spaces describing quantum field theories, or their finite-dimensional discretizations on lattices, therefore have large amounts of structure: they are isomorphic to the tensor product of a smaller Hilbert space for each lattice site or point in space. Local field theories respecting this structure have interactions which preferentially couple nearby points. The emergence of classicality through decoherence relies on this framework of tensor-product decomposition and local interactions. We explore the emergence of such lattice structure from Hilbert-space considerations alone. We point out that the vast majority of finite-dimensional Hilbert spaces cannot be isomorphic to the tensor product of Hilbert-space subfactors that describes a lattice theory. A generic Hilbert space can only be split into a direct sum corresponding to a basis of state vectors spanning the Hilbert space; we consider setups in which the direct sum is naturally decomposed into two pieces. We define a notion of direct-sum locality which characterizes states and decompositions compatible with Hamiltonian time evolution. We illustrate these notions for a toy model that is the finite-dimensional discretization of the quantum-mechanical double-well potential. We discuss their relevance in cosmology and field theory, especially for theories which describe a landscape of vacua with different spacetime geometries.
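The dimension-counting point behind "the vast majority of finite-dimensional Hilbert spaces cannot be isomorphic to a tensor product" is simple to illustrate: a tensor-product split into two subsystems requires the total dimension to factor as d₁·d₂, so a prime-dimensional Hilbert space admits no nontrivial split at all. A toy sketch (illustrative helper, not from the paper):

```python
def tensor_factorizations(dim):
    """Nontrivial ways to write dim = d1 * d2, i.e. candidate splits of a
    dim-dimensional Hilbert space into two tensor-product subsystems."""
    return [(d, dim // d) for d in range(2, dim)
            if dim % d == 0 and d <= dim // d]

# A composite dimension admits lattice-like tensor splittings...
print(tensor_factorizations(12))  # [(2, 6), (3, 4)]
# ...but a prime-dimensional space admits none:
print(tensor_factorizations(13))  # []
```

Direct-sum decompositions, by contrast, exist for every dimension, which is why the paper turns to a notion of direct-sum locality.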
ROVELLI, Carlo (2017) “Space is blue and birds fly through it”. [Preprint]
Author(s): Emily Adlam and Adrian Kent Bob has a black box that emits a single pure state qudit which is, from his perspective, uniformly distributed. Alice wishes to give Bob evidence that she has knowledge about the emitted state while giving him little or no information about it. We show that zero-knowledge evidencing of such knowledge… [Phys. Rev. Lett. 120, 050501] Published Tue Jan 30, 2018

Abstract

In the standard formalism of quantum gravity, black holes appear to form statistical distributions of quantum states. Now, however, we can present a theory that yields pure quantum states. It shows how particles entering a black hole can generate firewalls, which, however, can be removed and replaced by the ‘footprints’ they produce in the outgoing particles. This procedure can preserve the quantum information stored inside and around the black hole. We then focus on a subtle but unavoidable modification of the topology of the Schwarzschild metric: antipodal identification of points on the horizon. If it is true that vacuum fluctuations include virtual black holes, then the structure of space-time is radically different from what is usually thought.

De Haro, Sebastian (2018) The Heuristic Function of Duality. [Preprint]
Higashi, Katsuaki (2018) A no-go result on common cause approaches via Hardy’s paradox. [Preprint]
Villacrés, Juan (2018) Ontological Motivation in Obtaining Certain Quantum Equations: A Case for Panexperientialism. [Preprint]
Healey, Richard (2018) Pragmatist Quantum Realism. [Preprint]
