Latest Papers on Quantum Foundations - Updated Daily by IJQF

Authors: N.S. Kirsanov, A.V. Lebedev, I.A. Sadovskyy, M.V. Suslov, V.M. Vinokur, G. Blatter, G.B. Lesovik

The Second Law of Thermodynamics states that the temporal evolution of an isolated system occurs with non-diminishing entropy. In the quantum realm, this holds for energy-isolated systems whose evolution is described by a so-called unital quantum channel. The entropy of a system evolving under a non-unital quantum channel can, in principle, decrease. We formulate a general criterion of unitality for the evolution of a quantum system, enabling a simple and rigorous approach for finding and identifying processes accompanied by decreasing entropy in energy-isolated systems. We discuss two examples illustrating our findings: the quantum Maxwell demon and a heating-cooling process within a two-qubit system.
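The unitality criterion mentioned here has a simple concrete form: a channel with Kraus operators K_i is unital exactly when sum_i K_i K_i† equals the identity, and for unital channels the von Neumann entropy cannot decrease. A minimal NumPy sketch, with illustrative channels (phase damping vs. amplitude damping) that are not the paper's specific construction:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), in nats, from the eigenvalues of rho."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

def apply_channel(kraus, rho):
    """rho -> sum_i K_i rho K_i^dagger."""
    return sum(K @ rho @ K.conj().T for K in kraus)

def is_unital(kraus):
    """Unitality criterion: sum_i K_i K_i^dagger == identity."""
    s = sum(K @ K.conj().T for K in kraus)
    return np.allclose(s, np.eye(s.shape[0]))

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
p, g = 0.25, 0.5

# Phase damping (unital): Kraus {sqrt(1-p) I, sqrt(p) Z}.
dephasing = [np.sqrt(1 - p) * I2, np.sqrt(p) * Z]
# Amplitude damping (non-unital): relaxes |1> toward |0>.
damping = [np.array([[1.0, 0.0], [0.0, np.sqrt(1 - g)]]),
           np.array([[0.0, np.sqrt(g)], [0.0, 0.0]])]

rho_mixed = 0.5 * I2                            # maximally mixed qubit
rho_coh = np.array([[0.5, 0.3], [0.3, 0.5]])    # mixed state with coherence

assert is_unital(dephasing) and not is_unital(damping)
# Unital channel: entropy never decreases.
assert von_neumann_entropy(apply_channel(dephasing, rho_coh)) >= von_neumann_entropy(rho_coh)
# Non-unital channel: entropy of the maximally mixed state strictly decreases.
assert von_neumann_entropy(apply_channel(damping, rho_mixed)) < von_neumann_entropy(rho_mixed)
```

The last assertion is the point of the abstract: the energy-extracting, non-unital map lowers the entropy of an initially maximally mixed qubit, which no unital channel can do.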

Saatsi, Juha (2017) Scientific Realism meets Metaphysics of Quantum Mechanics. [Preprint]
Roy, Sudipto (2017) Theoretical Models of the Brans-Dicke Parameter for Time Independent Deceleration Parameters. International Journal of Mathematics and Physical Sciences Research, 5 (1). pp. 94-101. ISSN 2348-5736
Schroeren, David (2018) The Metaphysics of Invariance. [Preprint]
Menon, Tushar and Linnemann, Niels and Read, James (2018) Clocks and Chronogeometry: Rotating Spacetimes and the relativistic null hypothesis. [Preprint]


A number of recent theories of quantum gravity lack a one-dimensional structure of ordered temporal instants. Instead, according to many of these views, our world is either best represented as a single three-dimensional object, or as a configuration space composed of such three-dimensional objects, none of which bear temporal relations to one another. Such theories will be empirically self-refuting unless they can accommodate the existence of conscious beings capable of representation. For if representation itself is impossible in a timeless world, then no being in such a world could entertain the thought that a timeless theory is true, let alone believe such a theory or rationally believe it. This paper investigates the options for understanding representation in a three-dimensional, timeless, world. Ultimately it concludes that the only viable option is one according to which representation is taken to be deeply non-naturalistic. Ironically then we are left with two seemingly very unattractive options. Either a very naturalistic motivation—taking seriously a live view in fundamental physics—leads us to a very non-naturalistic view of the mental, or else views in the philosophy of mind partly dictate what is an acceptable theory in physics.

Manchak, JB and Weatherall, James Owen (2018) (Information) Paradox Regained? A Brief Comment on Maudlin on Black Hole Information Loss. [Preprint]
Inexplicable lab results may be telling us we’re on the cusp of a new scientific paradigm


Authors: Ding-fang Zeng

The goal of looking for errors in the information-loss reasoning is not to find the error itself, but to find hints of quantum gravity theories that can unify general relativity and quantum mechanics harmoniously. Building on the works [NPB917,178] and [NPB930,533], we provide in this paper a very clear and concrete answer to the title question and speculate on its relevance to string theory counterparts and gravitational-wave phenomenology.

Authors: R. Rossi Jr., Leonardo A. M. Souza

Bell inequalities, or Bell-like experiments, are supposed to test hidden-variable theories based on three intuitive assumptions: determinism, locality, and measurement independence. If one of these assumptions is suitably relaxed, the probability distribution of the singlet state, for example, can be reproduced by a hidden-variable model. Models that relax one of the conditions above using more than one hidden variable have been studied in the literature. In this work the relation between the number of hidden variables and the degree of relaxation necessary to reproduce the singlet correlations is investigated. For the examples studied, it is shown that increasing the number of hidden variables does not make the reproduction of quantum correlations more efficient.
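The gap that such relaxations must close can be seen in a toy Monte Carlo: the textbook single-hidden-variable model (outcomes given by the sign of the detector axis dotted with a shared random unit vector) matches the singlet correlation at aligned and orthogonal settings but falls short in between. A hedged sketch, not any of the paper's specific models:

```python
import numpy as np

rng = np.random.default_rng(0)

def lhv_correlation(theta, n=200_000):
    """E(a,b) for the deterministic local model A = sign(a . lam),
    B = -sign(b . lam), with hidden variable lam uniform on the unit
    sphere; analytically this equals -(1 - 2*theta/pi)."""
    lam = rng.normal(size=(n, 3))
    lam /= np.linalg.norm(lam, axis=1, keepdims=True)
    a = np.array([0.0, 0.0, 1.0])
    b = np.array([np.sin(theta), 0.0, np.cos(theta)])
    A = np.sign(lam @ a)
    B = -np.sign(lam @ b)
    return float(np.mean(A * B))

def singlet_correlation(theta):
    """Quantum prediction for the singlet state."""
    return -np.cos(theta)

theta = np.pi / 4
print(lhv_correlation(theta), singlet_correlation(theta))
# The local model gives about -0.5 here, while quantum mechanics
# predicts about -0.707: no relaxation, no match.
```

At theta = 0 both give exactly -1 (perfect anticorrelation), so the discrepancy only appears at intermediate angles, which is where relaxing determinism, locality, or measurement independence earns its keep.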

Authors: Luca Mancino, Marco Sbroscia, Emanuele Roccia, Ilaria Gianani, Valeria Cimini, Mauro Paternostro, Marco Barbieri

The emergence of realistic properties is a key problem in understanding the quantum-to-classical transition. In this respect, measurements represent a way to interface quantum systems with the macroscopic world: these can be driven in the weak regime, where a reduced back-action can be imparted by choosing meter states able to extract different amounts of information. Here we explore the implications of such weak measurement for the variation of realistic properties of two-level quantum systems pre- and post-measurement, and extend our investigations to the case of open systems implementing the measurements.

Authors: Xiao-Kan Guo, Qing-yu Cai

The back reactions of Hawking radiation allow nontrivial correlations between consecutive Hawking quanta, which gives a possible way of resolving the paradox of black hole information loss known as the hidden-messenger method. In a recent work by Ma et al. [arXiv:1711.10704], this method is enhanced by a general derivation using small deviations of the states of Hawking quanta off canonical typicality. In this paper, we use this typicality argument to study the effects of back reactions on quantum geometries described by spin network states, and discuss the viability of entropy conservation in loop quantum gravity. We find that such back reactions lead to small area deformations of quantum geometries, including those of quantum black holes. This shows that the hidden-messenger method is still viable in loop quantum gravity, which is a first step towards resolving the paradox of black hole information loss in quantum gravity.

Why we perceive the passage of time is one of the biggest mysteries of physics. Now we could have found its source – in our most potent theory of reality
Roy, Sudipto (2018) Time Variation of the Matter Content of the Expanding Universe in the Framework of Brans-Dicke Theory. [Preprint]
Gao, Shan (2018) Does protective measurement imply the reality of the wave function? [Preprint]
Liu, Chuang (2018) Infinite Idealization and Contextual Realism. Synthese. ISSN 1573-0964
Physicists from Einstein to Hawking tried and failed to unite gravity and quantum theory. Now we have hints of a better – but not so beautiful – answer  

Author(s): Eric G. Cavalcanti

A new analysis puts quantum nonlocality and contextuality—key resources for quantum computing—on equal theoretical footing as violations of classical causality.

[Phys. Rev. X 8, 021018] Published Fri Apr 13, 2018

Author(s): Q. Duprey and A. Matzkin

We discuss the preceding Comment [D. Sokolovski, preceding Comment, Phys. Rev. A 97, 046102 (2018)] and conclude that the arguments given there against the relevance of null weak values as representing the absence of a system property are not compelling. We give an example in which the transition ma...

[Phys. Rev. A 97, 046103] Published Fri Apr 13, 2018

Author(s): D. Sokolovski

In a recent paper [Phys. Rev. A 95, 032110 (2017)], Duprey and Matzkin investigated the meaning of vanishing weak values and their role in the retrodiction of the past of a preselected and postselected quantum system in the presence of interference. Here we argue that any proposition regarding the w...

[Phys. Rev. A 97, 046102] Published Fri Apr 13, 2018

Authors: Tien D. Kieu

The Principle of Unattainability rules out the attainment of absolute zero temperature by any finite physical means, no matter how idealised they could be. Nevertheless, we clarify that the Third Law of Thermodynamics, as defined by Nernst's heat theorem statement, is distinct from the Principle of Unattainability in the sense that the Third Law is mathematically equivalent only to the unattainability of absolute zero temperature by {\em quasi-static adiabatic} processes. This thus leaves open the possibility of attainability of absolute zero, without violating the Third Law, by non-adiabatic means. Such a means may be provided in principle and in particular by projective measurements in quantum mechanics. This connection also establishes some intimate relationship between the postulate of projective measurement and the Principle of Unattainability.

Author(s): Hyukjoon Kwon, Hyunseok Jeong, David Jennings, Benjamin Yadin, and M. S. Kim

In thermodynamics, quantum coherences—superpositions between energy eigenstates—behave in distinctly nonclassical ways. Here we describe how thermodynamic coherence splits into two kinds—“internal” coherence that admits an energetic value in terms of thermodynamic work, and “external” coherence that...

[Phys. Rev. Lett. 120, 150602] Published Thu Apr 12, 2018

Manchak, JB (2018) Space and Time. [Preprint]

Author(s): Philippe Faist and Renato Renner

A new theoretical analysis derives a precise fundamental lower limit to the work cost for processing information in any type of system, thereby cornering a new microscopic formulation of thermodynamics and shedding light on how far the second law can be applied.

[Phys. Rev. X 8, 021011] Published Tue Apr 10, 2018

Dewar, Neil (2018) Algebraic structuralism. [Preprint]

Author(s): Maximilian Schlosshauer

This paper presents a proof-of-principle scheme for the protective measurement of a single photon. In this scheme, the photon is looped arbitrarily many times through an optical stage that implements a weak measurement of a polarization observable followed by a strong measurement protecting the stat...

[Phys. Rev. A 97, 042104] Published Mon Apr 09, 2018

We address what we consider to be the main points of disagreement by showing that (a) scientific plausibility (or lack thereof) is a weak argument in the face of empirical data, (b) the statistical methods we used were sound according to at least one of several possible statistical positions, and (c) the potential physical mechanisms underlying precognition could include quantum biological phenomena. We close with a discussion of what we believe is an unfortunate but currently dominant tendency to focus on reducing Type-I statistical errors without balancing that approach by also paying attention to the potential for Type-II errors.

Decoding Reality



on 2018-4-08 12:00am GMT
Author: Vlatko Vedral
ISBN: 9780198815433
Binding: Paperback
Publication Date: 08 April 2018
Price: $12.95

Authors: T. N. Palmer

Hardy's axiomatic approach to the quantum theory of discrete Hilbert spaces reveals that just one principle distinguishes it from classical probability theory: there should be continuous (and hence infinitesimal) reversible transformations between any pair of pure states, the single word 'continuous' giving rise to quantum theory. This raises the question: can one formulate a finite theory of qubit physics (FTQP), necessarily different from quantum theory, which can replicate the tested predictions of the quantum theory of qubits to experimental accuracy? Here we show that an FTQP based on complex Hilbert vectors with rational squared amplitudes and rational phase angles is possible, provided the metric of state space, $g_p$, is based on $p$-adic rather than Euclidean distance. A key number theorem describing an incompatibility between rational angles and rational cosines accounts for quantum complementarity in this FTQP. Dynamical evolution is described by a deterministic mapping on the set of $p$-adic integers, and the measurement problem is trivially solved in terms of a nonlinear clustering of states in state space. Based on $g_p$, causal deterministic analyses of quantum interferometry, GHZ, the sequential Stern-Gerlach experiment, Leggett-Garg and the Bell theorem are described. The close relationship between fractals and $p$-adic integers suggests the existence of a primal fractal-like 'invariant set' geometry $I_U$ in cosmological state space, from which space-time and the laws of physics in space-time are emergent.
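The $p$-adic distance underlying $g_p$ is easy to make concrete: $d_p(x,y) = p^{-v_p(x-y)}$, where $v_p$ counts how many times $p$ divides $x-y$, so points are "close" when their difference is divisible by a high power of $p$. It satisfies the strong triangle (ultrametric) inequality, which is what makes it so unlike Euclidean distance. A small illustrative sketch on the integers (the paper's state-space metric is more elaborate):

```python
def v_p(n, p):
    """p-adic valuation: the largest k with p**k dividing n (n != 0)."""
    k = 0
    while n % p == 0:
        n //= p
        k += 1
    return k

def d_p(x, y, p):
    """p-adic distance d_p(x, y) = p**(-v_p(x - y)) on the integers."""
    if x == y:
        return 0.0
    return p ** -v_p(x - y, p)

# Points differing by a high power of p are p-adically close:
print(d_p(2, 2 + 3**5, 3))   # 3**-5: very close
print(d_p(2, 3, 3))          # difference 1: distance 1, far apart

# Strong triangle inequality d(x,z) <= max(d(x,y), d(y,z)):
for x, y, z in [(0, 9, 27), (1, 10, 100), (5, 8, 17)]:
    assert d_p(x, z, 3) <= max(d_p(x, y, 3), d_p(y, z, 3)) + 1e-15
```

Under this metric, numbers that look far apart on the real line (2 and 245 = 2 + 3^5) are near neighbours, which is the kind of geometric rearrangement the abstract exploits.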

Authors: Shiva Meucci

The need for revolution in modern physics is a well-known and often broached subject; however, the precision and success of current models narrow the possible changes to such a degree that there appears to be no major change possible. We provide herein the first step toward a possible solution to this paradox via a reinterpretation of the conceptual-theoretical framework that preserves the modern state of the art and its tools in unaltered form. This redivision of concepts and redistribution of the data can revolutionize expectations of new experimental outcomes. Such major change within finely tuned constraints is made possible by the fact that numerous mathematically equivalent theories were direct precursors to, and contemporaneous with, the modern interpretations. In this first of a series of papers, a historical investigation of the conceptual lineage of modern theory reveals points of exact overlap in physical theories which, while now considered cross-discipline, originally split from a common source and can be reintegrated as a single science again. This revival of an older associative hierarchy, combined with modern insights, can open new avenues for investigation. This reintegration of cross-disciplinary theories and tools is defined as the Neoclassical Interpretation.

Wuthrich, Christian (2018) The emergence of space and time. [Preprint]
We thought only fools messed with the cast-iron laws of thermodynamics – but quantum trickery is rewriting the rulebook, says physicist Vlatko Vedral


We review connections between the metric of spacetime and the quantum fluctuations of fields. We start with the finding that the spacetime metric can be expressed entirely in terms of the 2-point correlator of the fluctuations of quantum fields. We then discuss the open question whether the knowledge of only the spectra of the quantum fluctuations of fields also suffices to determine the spacetime metric. This question is of interest because spectra are geometric invariants and their quantization would, therefore, have the benefit of not requiring the modding out of diffeomorphisms. Further, we discuss the fact that spacetime at the Planck scale need not necessarily be either discrete or continuous. Instead, results from information theory show that spacetime may be simultaneously discrete and continuous in the same way that information can. Finally, we review the recent finding that a covariant natural ultraviolet cutoff at the Planck scale implies a signature in the cosmic microwave background (CMB) that may become observable.


The Horizon Quantum Mechanics is an approach that allows one to analyse the gravitational radius of spherically symmetric systems and compute the probability that a given quantum state is a black hole. We first review the (global) formalism and show how it reproduces a gravitationally inspired GUP relation. This result leads to unacceptably large fluctuations in the horizon size of astrophysical black holes if one insists on describing them as (smeared) central singularities. On the other hand, if they are extended systems, like in the corpuscular models, no such issue arises and one can in fact extend the formalism to include asymptotic mass and angular momentum with the harmonic model of rotating corpuscular black holes. The Horizon Quantum Mechanics then shows that, in simple configurations, the appearance of the inner horizon is suppressed and extremal (macroscopic) geometries seem disfavoured.

Pitts, J. Brian (2017) Underconsideration in Space-time and Particle Physics. [Preprint]

From meat to mind: the root of consciousness

From meat to mind: the root of consciousness, Published online: 04 April 2018; doi:10.1038/d41586-018-03920-z

Douwe Draaisma enjoys Michael Gazzaniga’s exploration of the biological basis of consciousness.


I show why old and new claims on the role of counterfactual reasoning for the EPR argument and Bell’s theorem are unjustified: once the logical relation between locality and counterfactual reasoning is clarified, the use of the latter does no harm and the nonlocality result can well follow from the EPR premises. To show why, after emphasizing the role of incompleteness arguments that Einstein developed before the EPR paper, I critically review more recent claims that equate the use of counterfactual reasoning with the assumption of a strong form of realism and argue that such claims are untenable.


The hypothesis (Sparenberg et al. in EPJ Web Conf 58:01016 [1]) that the particular linear tracks appearing in the measurement of a spherically-emitting radioactive source in a cloud chamber are determined by the (random) positions of atoms or molecules inside the chamber is further explored in the framework of a recently established one-dimensional model (Carlone et al. Comm Comput Phys 18:247 [2]). In this model, meshes of localized spins 1/2 play the role of the cloud-chamber atoms and the spherical wave is replaced by a linear superposition of two wave packets moving from the origin to the left and to the right, evolving deterministically according to the Schrödinger equation. We first revisit these results using a time-dependent approach, where the wave packets impinge on a symmetric two-sided detector. We discuss the evolution of the wave function in the configuration space and stress the interest of a non-symmetric detector in a quantum-measurement perspective. Next we use a time-independent approach to study the scattering of a plane wave on a single-sided detector. Preliminary results are obtained, analytically for the single-spin case and numerically for up to 8 spins. They show that the spin-excitation probabilities are sometimes very sensitive to the parameters of the model, which corroborates the idea that the measurement result could be determined by the atom positions. The possible origin of decoherence and entropy increase in future models is finally discussed.

van Dongen, Jeroen (2017) The Epistemic Virtues of the Virtuous Theorist: On Albert Einstein and His Autobiography. Epistemic Virtues in the Sciences and the Humanities. Edited by Jeroen van Dongen and Herman Paul (Boston Studies in the Philosophy and History of Science, Vol. 321). pp. 63-77.
Walter, Scott A. (2018) Figures of light in the early history of relativity (1905-1914). [Preprint]
Oldofredi, Andrea (2018) No-Go Theorems and the Foundations of Quantum Physics. [Preprint]
Oldofredi, Andrea (2018) Particle Creation and Annihilation: Two Bohmian Approaches. [Preprint]

Authors: Eugenio Bianchi, Hal M. Haggard

Due to quantum fluctuations, a non-rotating black hole should be the average over an ensemble of black hole geometries with angular momentum. This observation invites the question: Is the average of timelike singularities really spacelike? We use the Bekenstein-Hawking entropy formula to introduce a microcanonical ensemble for spin fluctuations and argue that the onset of quantum gravity is always spacelike. We also hint at the possibility of an observational test.

Authors: Song Ming Du, Yanbei Chen

It has been speculated that quantum gravity corrections may lead to modifications of space-time geometry near black hole horizons. Such structures may reflect gravitational waves, causing {\it echoes} that follow the main gravitational waves from binary black hole coalescence. We show that such echoes, if they exist, will give rise to a stochastic gravitational-wave background, which is very substantial if the near-horizon structure has near-unity reflectivity for gravitational waves, and readily detectable by Advanced LIGO. In case the reflectivity is much less than unity, the background will mainly arise from the first echo, with a level proportional to the power reflectivity of the near-horizon structure, but robust against uncertainties in the location of the structure, as long as it is very close to the horizon. The sensitivity of third-generation detectors allows the detection of a background that corresponds to a power reflectivity $\sim 10^{-3}$, if the uncertainties in the binary black-hole merger rate can be removed. We note that the echoes do alter the $f^{2/3}$ power law of the background spectra at low frequencies, which is rather robust against the uncertainties.

The Quantum Shorts competition invited stories incorporating the laws of quantum mechanics


Volume 4, Issue 2, pages 204-209

Peter J. Lewis

Peter Lewis studied physics at Oxford University and philosophy at the University of California, Irvine. He has taught philosophy at Texas Tech University and the University of Miami, and he is currently Professor of Philosophy at Dartmouth College. He has been a visiting scholar at Hong Kong University, the University of Sydney, and Durham University. His work concentrates on the foundations of quantum mechanics, but he has also published on scientific realism and on the epistemology of self-locating belief.

This is a review of Shan Gao’s book The Meaning of the Wave Function: In Search of the Ontology of Quantum Mechanics (Cambridge University Press, 2017).

Volume 4, Issue 1, pages 128-141

Ruth E. Kastner

Ruth E. Kastner earned her M.S. in Physics and Ph.D. in Philosophy (History and Philosophy of Science) at the University of Maryland, College Park (1999). She has taught a variety of philosophy and physics courses throughout the Baltimore-Washington corridor, and is currently a member of the Foundations of Physics group at UMCP. She is also an Affiliate of the physics department at the SUNY Albany campus. She specializes in time-symmetry and the Transactional Interpretation (TI) of quantum mechanics, and in particular has extended the original TI of John Cramer to the relativistic domain. Her interests and publications include topics in thermodynamics and statistical mechanics, quantum ontology, counterfactuals, spacetime emergence, and free will. She is the author of two books: The Transactional Interpretation of Quantum Mechanics: The Reality of Possibility (Cambridge, 2012) and Understanding Our Unseen Reality: Solving Quantum Riddles (Imperial College Press, 2015). She is also an Editor of the collected volume Quantum Structural Studies (World Scientific, 2016).

I attempt to clear up some misunderstandings in a recent paper by Marchildon regarding the Relativistic Transactional Interpretation (RTI), showing that the negative conclusions therein regarding the transactional model are unfounded.


Volume 4, Issue 2, pages 173-198

Mohammed Sanduk

Mohammed Sanduk is an Iraqi-born British physicist. He was educated at the University of Baghdad and the University of Manchester. Before his undergraduate study, he published a book in particle physics entitled “Mesons”. Sanduk has worked in industry and academia, and his last post in Iraq was head of the Laser and Opto-electronics Engineering department at Nahrain University in Baghdad. Owing to his interest in the philosophy of science, he was also a member of the academic staff of Pontifical Babel College for Philosophy. Sanduk works in the department of chemical and process engineering at the University of Surrey. He is interested in the transport of charged particles, magnetohydrodynamics, and renewable energy technology. In addition, Sanduk is interested in the foundations of quantum mechanics and the philosophy of science and technology.

Based on de Broglie’s wave hypothesis and the covariant ether, the Three Wave Hypothesis (TWH) was proposed and developed in the last century. In 2007, the author found that the TWH may be attributed to a kinematical classical system of two perpendicular rolling circles. In 2012, the author showed that the position vector of a point in a model of two rolling circles in a plane can be transformed into a complex vector under a proposed effect of partial observation. In the present project, this concept of transformation is developed into a lab-observation concept. Under this transformation of the lab observer, it is found that the velocity equation of the motion of the point is transformed into an equation analogous to the relativistic quantum mechanics equation (the Dirac equation). Many other analogies have been found, and are listed in a comparison table. The analogy tries to explain entanglement within the scope of the transformation. These analogies may suggest that both quantum mechanics and special relativity are emergent, unified, and of the same origin. The similarities suggest analogies and pose questions of interpretation for the standard quantum theory, without any possible causal claims.


Volume 4, Issue 2, pages 158-172

R. E. Kastner, Stuart Kauffman and Michael Epperson


It is argued that quantum theory is best understood as requiring an ontological dualism of res extensa and res potentia, where the latter is understood per Heisenberg’s original proposal, and the former is roughly equivalent to Descartes’ ‘extended substance.’ However, this is not a dualism of mutually exclusive substances in the classical Cartesian sense, and therefore does not inherit the infamous ‘mind-body’ problem. Rather, res potentia and res extensa are understood as mutually implicative ontological extants that serve to explain the key conceptual challenges of quantum theory; in particular, nonlocality, entanglement, null measurements, and wave function collapse. It is shown that a natural account of these quantum perplexities emerges, along with a need to reassess our usual ontological commitments involving the nature of space and time.



I outline some of my work and results (some dating back to 1998, some more recent) on my matter-gravity entanglement hypothesis, according to which the entropy of a closed quantum gravitational system is equal to the system’s matter-gravity entanglement entropy. The main arguments presented are: (1) that this hypothesis is capable of resolving what I call the second-law puzzle, i.e. the puzzle as to how the entropy increase of a closed system can be reconciled with the assumption of unitary time-evolution; (2) that the black hole information loss puzzle may be regarded as a special case of this second law puzzle and that therefore the same resolution applies to it; (3) that the black hole thermal atmosphere puzzle (which I recall) can be resolved by adopting a radically different-from-usual description of quantum black hole equilibrium states, according to which they are total pure states, entangled between matter and gravity in such a way that the partial states of matter and gravity are each approximately thermal equilibrium states (at the Hawking temperature); (4) that the Susskind–Horowitz–Polchinski string-theoretic understanding of black hole entropy as the logarithm of the degeneracy of a long string (which is the weak string coupling limit of a black hole) cannot be quite correct but should be replaced by a modified understanding according to which it is the entanglement entropy between a long string and its stringy atmosphere, when in a total pure equilibrium state in a suitable box, which (in line with (3)) goes over, at strong-coupling, to a black hole in equilibrium with its thermal atmosphere. The modified understanding in (4) is based on a general result, which I also describe, which concerns the likely state of a quantum system when it is weakly coupled to an energy-bath and the total state is a random pure state with a given energy. This result generalizes Goldstein et al.’s ‘canonical typicality’ result to systems which are not necessarily small.

Castellani, Elena and De Haro, Sebastian (2018) Duality, Fundamentality, and Emergence. [Preprint]

Entanglement of purification through holographic duality

Entanglement of purification through holographic duality, Published online: 26 March 2018; doi:10.1038/s41567-018-0075-2

A quantity that connects quantum information and gravity in the light of gauge/gravity correspondence is pointed out, leading to interesting properties of the entanglement of purification predicted in the holographic theories.
Human Brain Mapping, EarlyView.

Authors: R. E. Kastner, Stuart Kauffman, Michael Epperson

It is argued that quantum theory is best understood as requiring an ontological duality of res extensa and res potentia, where the latter is understood per Heisenberg's original proposal, and the former is roughly equivalent to Descartes' 'extended substance.' However, this is not a dualism of mutually exclusive substances in the classical Cartesian sense, and therefore does not inherit the infamous 'mind-body' problem. Rather, res potentia and res extensa are proposed as mutually implicative ontological extants that serve to explain the key conceptual challenges of quantum theory; in particular, nonlocality, entanglement, null measurements, and wave function collapse. It is shown that a natural account of these quantum perplexities emerges, along with a need to reassess our usual ontological commitments involving the nature of space and time.


We use the path integral form of quantum electrodynamics (QED) to show that a causal classical limit to QED can be derived by functionally integrating over the photon coordinates, starting from an initial photon vacuum and ending in a final coherent radiation state driven by the anticipated classical charged particle trajectories. The resulting charged particle transition amplitude depends only on particle coordinates. When the \( {\hbar} \, \to \,0 \) limit is taken, only those particle paths that are not constrained by the final radiation state are varied. These results demonstrate that the collapse from an infinity of charged particle paths, a path integral description, to causally interacting classical trajectories, a stationary-action description, is critically dependent on including final coherent state radiation and maintaining the distinction between particle paths that are free to vary and those trajectories that can be monitored by the final state radiation.

Hugh Everett, creator of this radical idea during a drunken debate more than 60 years ago, died before he could see his theory gain widespread popularity  


Author(s): Esteban Castro-Ruiz, Flaminia Giacomini, and Časlav Brukner

A new theoretical framework describes the dynamics of causal structures in quantum mechanics and finds that a scenario where the order of events is definite cannot transform into one where the order of events is not well defined, and vice versa, if the dynamics is continuous and reversible.

[Phys. Rev. X 8, 011047] Published Wed Mar 21, 2018

American Journal of Physics, Volume 86, Issue 4, Page 280-283, April 2018.


Causal sets (or causets) are a particular class of partially ordered sets, proposed as basic models of discrete space-time, especially in the field of quantum gravity. In this context, we show the existence of temporal foliations for any causal set or, more generally, for any causal space. Moreover, we show that (order-preserving) automorphisms of a large class of infinite causal sets fall into two classes: (1) automorphisms of spacelike hypersurfaces in some given foliation (i.e., spacelike automorphisms), or (2) translations in time. More generally, we show that for any automorphism \(\Phi\) of a generic causal set \({\mathcal {C}}\), there exists a partition of \({\mathcal {C}}\) into finitely many subcausets, on each of which (1) or (2) above holds. These subcausets can be assumed connected if, in addition, there are enough distinct orbits under \(\Phi\).
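As a hedged illustration of the foliation claim (the element names, relations, and layering rule below are a toy construction of ours, not taken from the paper), a finite causal set can be grouped into temporal layers by ranking each element by the length of its longest past-directed chain; each resulting layer is an antichain, i.e. a "spacelike hypersurface" of the causet:

```python
# Toy sketch: a finite causal set as a set of precedence pairs, foliated
# into temporal layers. All names and relations are made up for illustration.

def foliate(elements, precedes):
    """Group elements of a finite causal set into temporal layers.

    `precedes` is a set of (a, b) pairs meaning a < b (causal precedence,
    assumed transitive and acyclic). Each returned layer is an antichain:
    no two of its members are causally related.
    """
    rank = {}  # rank = length of the longest chain strictly below the element

    def get_rank(x):
        if x not in rank:
            below = [a for (a, b) in precedes if b == x]
            rank[x] = 1 + max((get_rank(a) for a in below), default=-1)
        return rank[x]

    for x in elements:
        get_rank(x)
    layers = {}
    for x in elements:
        layers.setdefault(rank[x], []).append(x)
    return [sorted(layers[r]) for r in sorted(layers)]

# A six-element causal set, given as a transitively closed relation:
elements = {"p", "q", "r", "s", "t", "u"}
cover = {("p", "r"), ("q", "r"), ("q", "s"), ("r", "t"), ("s", "t"), ("t", "u"),
         # transitive closure of the covering relations above:
         ("p", "t"), ("q", "t"), ("p", "u"), ("q", "u"), ("r", "u"), ("s", "u")}

print(foliate(elements, cover))  # → [['p', 'q'], ['r', 's'], ['t'], ['u']]
```

This is only the finite, combinatorial shadow of the result; the paper's theorems concern infinite causal sets and causal spaces.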


The relational interpretation of quantum mechanics proposes to solve the measurement problem and reconcile the completeness and locality of quantum mechanics by postulating relativity to the observer for events and facts, instead of an absolute “view from nowhere”. The aim of this paper is to clarify this interpretation, and in particular one of its central claims, concerning the possibility for an observer to have knowledge about another observer’s events. I consider three possible readings of this claim (deflationist, relationist and relativist), and develop the most promising one, relativism, to show how it fares when confronted with the traditional interpretative problems of quantum mechanics. Although it provides answers to some problems, I claim that there is currently no adapted locality criterion to evaluate whether the resulting interpretation is local or not.

Author(s): John Sous and Edward Grant

We argue that the quenched ultracold plasma presents an experimental platform for studying the quantum many-body physics of disordered systems in the long-time and finite energy-density limits. We consider an experiment that quenches a plasma of nitric oxide to an ultracold system of Rydberg molecul...

[Phys. Rev. Lett. 120, 110601] Published Wed Mar 14, 2018


There are quantum solutions for computational problems that make use of interference at some stage in the algorithm. These stages can be mapped into the physical setting of a single particle travelling through a many-armed interferometer. There has been recent foundational interest in theories beyond quantum theory. Here, we present a generalized formulation of computation in the context of a many-armed interferometer, and explore how theories can differ from quantum theory and still perform distributed calculations in this set-up. We shall see that quaternionic quantum theory proves a suitable candidate, whereas box-world does not. We also find that a classical hidden variable model first presented by Spekkens (Phys Rev A 75(3): 032110, 2007) can be used for this type of computation, owing to the epistemic restriction placed on the hidden variable.

Publication date: Available online 7 March 2018
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Bryan W. Roberts
How should we characterise the observable aspects of quantum theory? This paper argues that philosophers and physicists should jettison a standard dogma: that observables must be represented by self-adjoint or Hermitian operators. Four classes of non-standard observables are identified: normal operators, symmetric operators, real-spectrum operators, and none of these. The philosophical and physical implications of each are explored.

Publication date: 26 April 2018
Source:Physics Letters A, Volume 382, Issue 16
Author(s): Debarshi Das, Shiladitya Mal, Dipankar Home
Generalized quantum measurements with two outcomes are fully characterized by two real parameters, dubbed the sharpness parameter and the biasedness parameter, which can be linked with different aspects of the experimental setup. It is known that the sharpness parameter characterizes the precision of the measurements, and that decreasing it reduces the possibility of probing quantum features such as the quantum mechanical (QM) violation of local realism (LR) or macrorealism (MR). Here we investigate the effect of biasedness together with that of sharpness and find a trade-off between the two parameters in the context of probing QM violations of LR and MR. Interestingly, we also find that this trade-off is more robust in the latter case.
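As a sketch, one common parametrization of a biased unsharp two-outcome qubit measurement (an assumption on our part; the paper's conventions may differ) takes effects \(E_\pm = \tfrac{1}{2}[(1 \pm \gamma)\mathbb{1} \pm \lambda\,\sigma_z]\), with sharpness \(\lambda\) and biasedness \(\gamma\); these form a valid POVM precisely when \(|\lambda| + |\gamma| \le 1\), which is the kind of constraint that underlies a sharpness-bias trade-off:

```python
# Hedged sketch of a biased unsharp qubit measurement with effects
# E_pm = [(1 +/- gamma) I +/- lam * sigma_z] / 2 (a standard textbook
# parametrization, assumed here, not taken from the paper).
# With sigma_z diagonal, the effects are diagonal 2x2 matrices, so
# positivity reduces to non-negativity of the diagonal entries.

def effects(lam, gamma):
    """Return the two diagonal effects as (e00, e11) pairs."""
    e_plus = ((1 + gamma + lam) / 2, (1 + gamma - lam) / 2)
    e_minus = ((1 - gamma - lam) / 2, (1 - gamma + lam) / 2)
    return e_plus, e_minus

def is_valid_povm(lam, gamma, tol=1e-12):
    """Check positivity of both effects and completeness E+ + E- = I."""
    e_plus, e_minus = effects(lam, gamma)
    positive = all(x >= -tol for x in e_plus + e_minus)
    complete = all(abs(p + m - 1) < tol for p, m in zip(e_plus, e_minus))
    return positive and complete

print(is_valid_povm(0.8, 0.1))  # sharpness 0.8, bias 0.1: |lam|+|gamma| <= 1, valid
print(is_valid_povm(0.8, 0.3))  # |lam|+|gamma| > 1: an effect goes negative
```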

Author(s): J. Bengtsson, M. Nilsson Tengstrand, A. Wacker, P. Samuelsson, M. Ueda, H. Linke, and S. M. Reimann

We show that a quantum Szilard engine containing many bosons with attractive interactions enhances the conversion between information and work. Using an ab initio approach to the full quantum-mechanical many-body problem, we find that the average work output increases significantly for a larger numb...

[Phys. Rev. Lett. 120, 100601] Published Fri Mar 09, 2018


In the history of quantum physics several no-go theorems have been proved, and many of them have played a central role in the development of the theory, such as Bell’s or the Kochen–Specker theorem. A recent paper by F. Laudisa has raised reasonable doubts concerning the strategy followed in proving some of these results, since they rely on the standard framework of quantum mechanics, a theory that presents several ontological problems. The aim of this paper is twofold: on the one hand, I intend to reinforce Laudisa’s methodological point by critically discussing Malament’s theorem in the context of the philosophical foundation of quantum field theory; secondly, I rehabilitate Gisin’s theorem showing that Laudisa’s concerns do not apply to it.


This study attempts to spell out more explicitly than has been done previously the connection between two types of formal correspondence that arise in the study of quantum–classical relations: one the one hand, deformation quantization and the associated continuity between quantum and classical algebras of observables in the limit \(\hbar \rightarrow 0\) , and, on the other, a certain generalization of Ehrenfest’s Theorem and the result that expectation values of position and momentum evolve approximately classically for narrow wave packet states. While deformation quantization establishes a direct continuity between the abstract algebras of quantum and classical observables, the latter result makes in-eliminable reference to the quantum and classical state spaces on which these structures act—specifically, via restriction to narrow wave packet states. Here, we describe a certain geometrical re-formulation and extension of the result that expectation values evolve approximately classically for narrow wave packet states, which relies essentially on the postulates of deformation quantization, but describes a relationship between the actions of quantum and classical algebras and groups over their respective state spaces that is non-trivially distinct from deformation quantization. The goals of the discussion are partly pedagogical in that it aims to provide a clear, explicit synthesis of known results; however, the particular synthesis offered aspires to some novelty in its emphasis on a certain general type of mathematical and physical relationship between the state spaces of different models that represent the same physical system, and in the explicitness with which it details the above-mentioned connection between quantum and classical models.

Publication date: Available online 18 February 2018
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): James Read
I consider the interrelations between two decision-theoretic approaches to probability which have been developed in the context of Everettian quantum mechanics: that due to Deutsch and Wallace on the one hand, and that due to Greaves and Myrvold on the other. Having made precise these interrelations, I defend Everettian decision theory against recent objections raised by Dawid and Thébault. Finally, I discuss the import of these results from decision theory for the rationality of an Everettian agent's betting in accordance with the Born rule.


This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation that describes the theory in a succinct, self-contained, clear, unambiguous and, of course, correct manner. The bulk of the presentation is the so-called “microscopic theory”, applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen–Specker–Bell theorem and Gleason’s theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.

American Journal of Physics, Volume 86, Issue 3, Page 237-239, March 2018.
American Journal of Physics, Volume 86, Issue 3, Page 201-205, March 2018.


The usual representation of quantum algorithms, limited to the process of solving the problem, is physically incomplete. We complete it in three steps: (i) extending the representation to the process of setting the problem, (ii) relativizing the extended representation to the problem solver, from whom the problem setting must be concealed, and (iii) symmetrizing the relativized representation under time reversal to represent the reversibility of the underlying physical process. The third step projects the input state of the representation, in which the problem solver is completely ignorant of the setting and thus of the solution of the problem, onto one in which she knows half of the solution (half of the information specifying it, when the solution is an unstructured bit string). Completing the physical representation shows that the number of computation steps (oracle queries) required to solve any oracle problem in an optimal quantum way should be that of a classical algorithm endowed with advance knowledge of half of the solution.

Author(s): P. R. Dieguez and R. M. Angelo

Relations among the concepts of measurement, information, and physical reality are established with the quantification of the degree of reality of an observable for a given preparation. It is found that for pure states the entanglement with the apparatus precisely determines the amount by which the reality of the monitored observable increases.

[Phys. Rev. A 97, 022107] Published Fri Feb 16, 2018


It is generally argued that if the wave-function in the de Broglie–Bohm theory is a physical field, it must be a field in configuration space. Nevertheless, it is possible to interpret the wave-function as a multi-field in three-dimensional space. This approach has not yet received the attention it deserves. The aim of this paper is threefold: first, we show that the wave-function is naturally and straightforwardly construed as a multi-field; second, we show why this interpretation is superior to other interpretations discussed in the literature; third, we clarify common misconceptions.

Author(s): Robert B. Griffiths

While much of the technical analysis in the preceding Comment is correct, in the end it confirms the conclusion reached in my previous work [Phys. Rev. A 94, 032115 (2016)]: A consistent histories analysis provides no support for the claim of counterfactual quantum communication put forward by Salih...

[Phys. Rev. A 97, 026102] Published Fri Feb 09, 2018


A proposal is made for a fundamental theory, in which the history of the universe is constituted of diverse views of itself. Views are attributes of events, and the theory’s only beables; they comprise information about energy and momentum transferred to an event from its causal past. A dynamics is proposed for a universe constituted of views of events, which combines the energetic causal set dynamics with a potential energy based on a measure of the distinctiveness of the views, called the variety (Smolin in Found Phys 46(6):736–758, 2016). As in the real ensemble formulation of quantum mechanics (Barbour and Smolin in Variety, complexity and cosmology, arXiv: hep-th/9203041), quantum pure states are associated to ensembles of similar events; the quantum potential of Bohm then arises from the variety.


The formulation of quantum mechanics developed by Bohm, which can generate well-defined trajectories for the underlying particles in the theory, can equally well be applied to relativistic quantum field theories to generate dynamics for the underlying fields. However, it does not produce trajectories for the particles associated with these fields. Bell has shown that an extension of Bohm’s approach can be used to provide dynamics for the fermionic occupation numbers in a relativistic quantum field theory. In the present paper, Bell’s formulation is adopted and elaborated on, with a full account of all technical detail required to apply his approach to a bosonic quantum field theory on a lattice. This allows an explicit computation of (stochastic) trajectories for massive and massless particles in this theory. Particle creation and annihilation, and their impact on particle propagation, are also illustrated using this model.


We propose that observables in quantum theory are properly understood as representatives of symmetry-invariant quantities relating one system to another, the latter to be called a reference system. We provide a rigorous mathematical language to introduce and study quantum reference systems, showing that the orthodox “absolute” quantities are good representatives of observable relative quantities if the reference state is suitably localised. We use this relational formalism to critique the literature on the relationship between reference frames and superselection rules, settling a long-standing debate on the subject.

Author(s): Emily Adlam and Adrian Kent

Bob has a black box that emits a single pure state qudit which is, from his perspective, uniformly distributed. Alice wishes to give Bob evidence that she has knowledge about the emitted state while giving him little or no information about it. We show that zero-knowledge evidencing of such knowledg...

[Phys. Rev. Lett. 120, 050501] Published Tue Jan 30, 2018


In the standard formalism of quantum gravity, black holes appear to form statistical distributions of quantum states. Now, however, we can present a theory that yields pure quantum states. It shows how particles entering a black hole can generate firewalls, which, however, can be removed and replaced by the ‘footprints’ they produce in the outgoing particles. This procedure can preserve the quantum information stored inside and around the black hole. We then focus on a subtle but unavoidable modification of the topology of the Schwarzschild metric: antipodal identification of points on the horizon. If it is true that vacuum fluctuations include virtual black holes, then the structure of space-time is radically different from what is usually thought.

Author(s): Flavien Hirsch, Marco Túlio Quintino, and Nicolas Brunner

We discuss the connection between the incompatibility of quantum measurements, as captured by the notion of joint measurability, and the violation of Bell inequalities. Specifically, we explicitly present a given set of non-jointly-measurable positive-operator-value measures (POVMs) MA with the foll...

[Phys. Rev. A 97, 012129] Published Thu Jan 25, 2018

Author(s): V. S. Gomes and R. M. Angelo

Based on a recently proposed model of physical reality and an underlying criterion of nonlocality for contexts [A. L. O. Bilobran and R. M. Angelo, Europhys. Lett. 112, 40005 (2015)], we introduce a quantifier of realism-based nonlocality for bipartite quantum states, a concept that is profoundly di...

[Phys. Rev. A 97, 012123] Published Wed Jan 24, 2018

Publication date: Available online 20 January 2018
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Carina E.A. Prunkl, Christopher G. Timpson
Recently, Cabello et al. (2016) claim to have proven the existence of an empirically verifiable difference between two broad classes of quantum interpretations. On the basis of three seemingly uncontentious assumptions, (i) the possibility of randomly selected measurements, (ii) the finiteness of a quantum system's memory, and (iii) the validity of Landauer's principle, and further, by applying computational mechanics to quantum processes, the authors arrive at the conclusion that some quantum interpretations (including central realist interpretations) are associated with an excess heat cost and are thereby untenable—or at least that they can be distinguished empirically from their competitors by measuring the heat produced. Here, we provide an explicit counterexample to this claim and demonstrate that their surprising result can be traced back to a lack of distinction between system and external agent. By drawing the distinction carefully, we show that the resulting heat cost is fully accounted for in the external agent, thereby restoring the tenability of the quantum interpretations in question.

Author(s): Hyukjoon Kwon, Chae-Yeun Park, Kok Chuan Tan, Daekun Ahn, and Hyunseok Jeong

We investigate a measure of quantum coherence and its extension to quantify quantum macroscopicity. The coherence measure can also quantify the asymmetry of a quantum state with respect to a given group transformation. We then show that a weighted sum of asymmetry in each mode can be applied as a me...

[Phys. Rev. A 97, 012326] Published Tue Jan 23, 2018


In spite of being a well-articulated proposal, the theory of quantum histories (TQH), in its different versions, suffers from certain difficulties that have been pointed out in the literature. Nevertheless, two facets of the proposal have not been sufficiently stressed. On the one hand, it is a non-collapse formalism that should be technically appropriate to supply descriptions based on quantum properties at different times. On the other hand, it intends to provide an interpretation of quantum mechanics that solves the traditional puzzles of the theory. In this article we spell out the main criticisms of the TQH and classify them into two groups: theoretical and interpretive. Whereas the latter might be ignored if the TQH were considered as a quantum formalism with its minimum interpretation, the former seems to point toward technical difficulties that must be faced in a theoretically adequate proposal. Precisely with the purpose of solving these difficulties, we introduce a different perspective, called Formalism of Generalized Contexts or Formalism of Contextual Histories (FCH), which supplies a precise condition for consistently talking of quantum properties at different times without the theoretical shortcomings of the TQH.

Author(s): Miguel E. Rodriguez R.

Quantum mechanics in noncommutative space modifies the standard result of the Aharonov-Bohm effect for electrons and other recent quantum effects. Here we obtain the phase in noncommutative space for the Spavieri effect, a generalization of Aharonov-Bohm effect which involves a coherent superpositio...

[Phys. Rev. A 97, 012109] Published Fri Jan 12, 2018

Author(s): J. Tuziemski, P. Witas, and J. K. Korbicz

Broadly understood decoherence processes in quantum electrodynamics, induced by neglecting either the radiation [L. Landau, Z. Phys. 45, 430 (1927)] or the charged matter [N. Bohr and L. Rosenfeld, K. Danske Vidensk. Selsk, Math.-Fys. Medd. XII, 8 (1933)], have been studied from the dawn of the theo...

[Phys. Rev. A 97, 012110] Published Fri Jan 12, 2018

Author(s): Ivan Glasser, Nicola Pancotti, Moritz August, Ivan D. Rodriguez, and J. Ignacio Cirac

Two tools show great promise in approximating low-temperature, condensed-matter systems: Tensor-network states and artificial neural networks. A new analysis builds a bridge between these techniques, opening the way to a host of powerful approaches to understanding complex quantum systems.

[Phys. Rev. X 8, 011006] Published Thu Jan 11, 2018

Author(s): Bo-Bo Wei

Thermodynamics and information theory have been intimately related since the times of Maxwell and Boltzmann. Recently it was shown that the dissipated work in an arbitrary nonequilibrium process is related to the Rényi divergences between two states along the forward and reversed dynamics. Here we s...

[Phys. Rev. A 97, 012105] Published Tue Jan 09, 2018
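For orientation, the classical Rényi divergence is \(D_\alpha(p\Vert q) = (\alpha - 1)^{-1} \log \sum_i p_i^\alpha q_i^{1-\alpha}\), which reduces to the Kullback–Leibler divergence as \(\alpha \to 1\). The sketch below (with made-up distributions, and classical probability vectors rather than the quantum states the paper concerns) illustrates only this standard formula, not the paper's result:

```python
import math

# Classical Renyi divergence D_alpha(p || q); distributions are made up
# for illustration. The paper relates dissipated work to Renyi divergences
# between quantum states; here we only sketch the classical formula.

def renyi_divergence(p, q, alpha):
    """D_alpha(p||q) = log(sum_i p_i^alpha * q_i^(1-alpha)) / (alpha - 1)."""
    assert abs(alpha - 1) > 1e-9, "alpha = 1 is the KL limit, handled separately"
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

def kl_divergence(p, q):
    """Kullback-Leibler divergence, the alpha -> 1 limit of D_alpha."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]
q = [0.4, 0.4, 0.2]
# D_alpha approaches the KL divergence as alpha -> 1:
print(renyi_divergence(p, q, 1.0001), kl_divergence(p, q))
```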


General relativity cannot be formulated as a perturbatively renormalizable quantum field theory. An argument relying on the validity of the Bekenstein–Hawking entropy formula aims at dismissing gravity as non-renormalizable per se, against hopes (underlying programs such as Asymptotic Safety) that d-dimensional GR could turn out to have a non-perturbatively renormalizable d-dimensional quantum field theoretic formulation. In this note we discuss various forms of highly problematic semi-classical extrapolations assumed by both sides of the debate concerning what we call The Entropy Argument, and show that a large class of dimensional reduction scenarios leads to the blow-up of Bekenstein–Hawking entropy.
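For reference, the Bekenstein–Hawking formula at issue assigns a black hole an entropy proportional to its horizon area \(A\):

```latex
S_{\mathrm{BH}} = \frac{k_B A}{4 \ell_P^2} = \frac{k_B c^3 A}{4 G \hbar}
```

Here \(\ell_P = \sqrt{G\hbar/c^3}\) is the Planck length; the semi-classical extrapolations discussed in the paper concern the regime of validity of this relation.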


In this paper I argue that physics is, always was, and probably always will be voiceless with respect to tense and passage, and that, therefore, if, as I believe, tense and passage are the essence of time, physics’ contribution to our understanding of time can only be limited. The argument, in a nutshell, is that if "physics has no possibility of expression for the Now", to quote Einstein, then it cannot add anything to the study of tense and passage, and specifically, cannot add anything to the debate between deniers and affirmers of the existence or reality of tense and passage. Since relativity theory did not equip physics with a new language with which to speak of tense and passage, I draw the further conclusion that relativity theory has not generated the revolution to our conception of time that is attributed to it. In the last section I discuss the motivations behind the continued but misguided attempts to integrate tense into a relativistic setting, and assess the manners in which relativity theory has nevertheless enhanced, albeit indirectly, our understanding of tense and passage.


Quantum bit commitment is insecure in the standard non-relativistic quantum cryptographic framework, essentially because Alice can exploit quantum steering to defer making her commitment. Two assumptions in this framework are that: (a) Alice knows the ensembles of evidence E corresponding to either commitment; and (b) system E is quantum rather than classical. Here, we show how relaxing assumption (a) or (b) can render her malicious steering operation indeterminable or nonexistent, respectively. Finally, we present a secure protocol that relaxes both assumptions in a quantum teleportation setting. Without appeal to an ontological framework, we argue that the protocol’s security entails the reality of the quantum state, provided retrocausality is excluded.


Initially motivated by their relevance to the foundations of quantum mechanics, and more recently by their applications in different contexts of quantum information science, violations of Bell inequalities have been extensively studied in recent years. In particular, an important effort has been made to quantify such Bell violations. Probabilistic techniques have been heavily used in this context with two different purposes: first, to quantify how common the phenomenon of Bell violations is; and second, to find large Bell violations in order to better understand the possibilities and limitations of this phenomenon. However, the strong mathematical content of these results has discouraged some potentially interested readers. The aim of the present work is to review some of the recent results in this direction, focusing on the main ideas and removing most of the technical details, so as to make this line of study accessible to a wide audience.

Publication date: Available online 5 January 2018
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Simon Friederich
The paper has three main aims: first, to make the asymptotic safety-based approach to quantum gravity better known to the community of researchers in the history and philosophy of modern physics by outlining its motivation, core tenets, and achievements so far; second, to preliminarily elucidate the finding that, according to the asymptotic safety scenario, space-time has fractal dimension 2 at short length scales; and, third, to provide the basis for a methodological appraisal of the asymptotic safety-based approach to quantum gravity in the light of the Kuhnian criteria of theory choice.