Weekly Papers on Quantum Foundations (13)

Grinbaum, Alexei (2017) The Effectiveness of Mathematics in Physics of the Unknown. [Preprint]
Feintzeig, Benjamin H. (2017) On the Choice of Algebra for Quantization. [Preprint]

Authors: Stefano Gogioso

We present a uniform framework for the treatment of a large class of toy models of quantum theory. Specifically, we will be interested in theories of wavefunctions valued in commutative involutive semirings, and which give rise to some semiring-based notion of classical non-determinism via the Born rule. The models obtained with our construction possess many of the familiar structures used in Categorical Quantum Mechanics. We also provide a bestiary of increasingly exotic examples: some well known, such as real quantum theory and relational quantum theory; some less known, such as hyperbolic quantum theory, p-adic quantum theory and “parity quantum theory”; and some entirely new, such as “finite-field quantum theory” and “tropical quantum theory”. As a further bonus, the measurement scenarios arising within these theories can be studied using the sheaf-theoretic framework for non-locality and contextuality.
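For readers who want a feel for such semiring-valued models, here is a minimal illustrative sketch (ours, not the paper's categorical construction): with the Boolean semiring, whose involution is trivial, the Born rule returns a possibility value for each outcome, recovering the possibilistic non-determinism of relational quantum theory.

```python
# A minimal illustrative sketch (ours, not the paper's categorical construction):
# a "wavefunction" assigns each outcome an amplitude in a commutative involutive
# semiring, and the Born rule returns conj(amplitude) * amplitude. With the
# Boolean semiring (OR, AND, trivial involution) this reproduces the
# possibilistic non-determinism of relational quantum theory.

class BooleanSemiring:
    zero, one = False, True
    @staticmethod
    def add(a, b): return a or b      # semiring addition (OR)
    @staticmethod
    def mul(a, b): return a and b     # semiring multiplication (AND)
    @staticmethod
    def conj(a): return a             # trivial involution

def born_weights(psi, S):
    """Semiring-valued Born rule: outcome -> conj(amplitude) * amplitude."""
    return {x: S.mul(S.conj(amp), amp) for x, amp in psi.items()}

# A toy state over three outcomes, valued in the Boolean semiring:
psi = {"x0": True, "x1": False, "x2": True}
print(born_weights(psi, BooleanSemiring))
# {'x0': True, 'x1': False, 'x2': True} -- outcome x1 is impossible, the others possible
```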

Authors: Ward Struyve

Loop quantum gravity is believed to eliminate singularities such as the big bang and big crunch singularities. In order to ground this belief in theoretical analysis, notorious problems such as the problem of time and the problem of the actual meaning of singularities must be addressed and eventually overcome. In this paper, we address the problem of singularities in the context of the Bohmian formulation of loop quantum cosmology (which describes symmetry-reduced models of quantum gravity using the quantization techniques of loop quantum gravity). This formulation solves the aforementioned conceptual problems. For example, the notion of a singularity is clear in this case, since there is an actual metric in addition to the wave function: there is a singularity whenever this actual metric is singular. It is shown that in the loop quantum cosmology of a homogeneous and isotropic Friedmann-Lemaître-Robertson-Walker space-time with arbitrary constant spatial curvature and possibly a cosmological constant, coupled to a massless homogeneous scalar field, a big bang or big crunch singularity is never obtained. This result holds without assuming any boundary conditions. It should be contrasted with the fact that in the Bohmian formulation of the Wheeler-DeWitt theory singularities may exist (depending on the wave function and the initial conditions for the metric and scalar field).

Authors: Tom Banks

I propose that much recent history can be explained by hypothesizing that sometime during the last quarter of 2016, the history of the world underwent a macroscopic quantum tunneling event, creating, according to the Many Worlds Interpretation, a new branch of the multiverse in which my consciousness and that of my readers is now trapped. The failure of much political polling is then understood by assuming that the particular branch we are on had very low amplitude in the quantum wave function of the multiverse. In this view, one must take a different attitude towards alternative facts than that proposed by the mainstream media. We know that quantum tunneling can change the low energy laws of physics in the different branches of the wave function. Alternative facts may simply be the reflection of the media’s ignorance of the state of the world after a quantum transition of this magnitude.

Authors: H. D. Zeh

Time-asymmetric spacetime structures, in particular those representing black holes and the expansion of the universe, are intimately related to other arrows of time, such as the second law and the retardation of radiation. The nature of the quantum arrow, often attributed to a collapse of the wave function, is essential, in particular, for understanding the much discussed “black hole information loss paradox”. However, this paradox assumes a new form and can possibly be avoided in a consistent causal treatment that may be able to avoid horizons and singularities. The master arrow that would combine all arrows of time does not have to be identified with a direction of the formal time parameter that serves to formulate the dynamics as a succession of global states (a trajectory in configuration or Hilbert space). It may even change direction with respect to a fundamental physical clock such as the cosmic expansion parameter if this was formally extended either into a future contraction era or to negative “pre-big-bang” values.

Authors: Yuan Yuan, Zhibo Hou, Yuan-Yuan Zhao, Han-Sen Zhong, Guo-Yong Xiang, Chuan-Feng Li, Guang-Can Guo

Wave-particle duality is a typical example of Bohr’s principle of complementarity and plays a significant role in quantum mechanics. Previous studies used visibility to quantify the wave property and path information to quantify the particle property. However, coherence is the core of wave interference phenomena, and using it to characterize the wave property strengthens the understanding of wave-particle duality. A recent theoretical work [Phys. Rev. Lett. 116, 160406 (2016)] found two relations between the particle property and the wave property quantified by coherence in different measures. Here, we demonstrate wave-particle duality quantitatively, for the first time, based on two coherence measures. The path information is obtained from the discrimination of detector states encoded in the polarization of photons corresponding to each path, and from the mutual information between the detector states and the outcome of the measurement performed on them. We obtain the wave property, quantified by coherence in the l1 measure and the relative entropy measure, using tomography of the photon state encoded in the paths. Our work deepens the understanding of coherence and provides a new perspective on wave-particle duality.
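For reference, the two coherence measures mentioned are standard: the l1-norm of coherence, $C_{l_1}(\rho)=\sum_{i\neq j}|\rho_{ij}|$, and the relative entropy of coherence, $C_{r}(\rho)=S(\Delta(\rho))-S(\rho)$, where $\Delta$ dephases in the reference (path) basis. A minimal numerical sketch (ours, not the paper's experimental reconstruction):

```python
import numpy as np

# Minimal numerical sketch (ours, not the paper's experimental reconstruction)
# of the two coherence measures for a single-qubit state in the path basis.

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def l1_coherence(rho):
    # Sum of the absolute values of the off-diagonal elements.
    return float(np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho))))

def relative_entropy_coherence(rho):
    # S(Delta(rho)) - S(rho), with Delta the dephasing map in the path basis.
    dephased = np.diag(np.diag(rho))
    return von_neumann_entropy(dephased) - von_neumann_entropy(rho)

# Equal superposition of two paths: maximal coherence in both measures.
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
print(l1_coherence(plus), relative_entropy_coherence(plus))   # 1.0 1.0
```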

Authors: John T. Brooker

This paper presents an alternative quantum theory, the Theory of Discrete Extension, which avoids many of the conceptual problems of standard quantum mechanics. It is a deterministic, dynamic collapse theory with a well-defined primitive ontology.

In place of the dual, classical concepts of wave and particle, the unitary, non-classical concept of a discretely extended object emerges directly from the theory’s dynamic equations as a primitive ontology. Because this ontology is unitary, the theory avoids the dilemmas of wave-particle dualism and complementarity.

Furthermore, the theory’s dynamic equations generate correct, quasi-discrete values of action increments and energy levels without recourse to the operator formalism and eigenvalue postulate of standard quantum mechanics. Quantization of the harmonic oscillator provides a simple illustration.

The theory provides insight into the nature of a number of quantum effects such as the zero-point energy of the harmonic oscillator. It also makes a number of predictions that distinguish it from standard quantum mechanics and from Bohmian mechanics.

Authors: Sujoy K. Modak, Daniel Sudarsky

We give a general overview of a novel approach, recently developed by us, to address the black hole information paradox. This alternative viewpoint is based on theories involving modifications of standard quantum theory, known as “spontaneous dynamical state reduction” or “wave-function collapse models”, which were historically developed to overcome the notorious foundational problem of quantum mechanics known as the “measurement problem”. We show that these proposals, when appropriately adapted and refined for this context, provide a self-consistent picture in which the loss of information in the evaporation of black holes is no longer paradoxical.

Authors: Christos Efthymiopoulos, George Contopoulos, Athanasios C. Tzemos

We discuss the main mechanisms generating chaotic behavior of the quantum trajectories in the de Broglie – Bohm picture of quantum mechanics, in systems of two and three degrees of freedom. In the 2D case, chaos is generated via multiple scatterings of the trajectories with one or more `nodal point – X-point complexes’. In the 3D case, these complexes form foliations along `nodal lines’ accompanied by `X-lines’. We also identify cases of integrable or partially integrable quantum trajectories. The role of chaos is important in interpreting the dynamical origin of the `quantum relaxation’ effect, i.e. the dynamical emergence of Born’s rule for the quantum probabilities, which has been proposed as an extension of the Bohmian picture of quantum mechanics. In particular, the local scaling laws characterizing the chaotic scattering phenomena near X-points, or X-lines, are related to the global rate at which the quantum relaxation is observed to proceed. Also, the degree of chaos determines the rate at which nearly-coherent initial wavepacket states lose their spatial coherence in the course of time.
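As an illustration of the guidance law underlying such studies (a sketch with toy parameters of our own, not the exact systems of the paper), the following integrates a single de Broglie-Bohm trajectory for a superposition of anisotropic harmonic-oscillator eigenstates, using $\mathbf{v}=\mathrm{Im}(\nabla\psi/\psi)$ with $\hbar=m=1$:

```python
import numpy as np

# Minimal sketch (our own toy parameters, not the systems of the paper) of
# de Broglie-Bohm trajectory integration for a 2D superposition
# psi = Psi_00 + a*Psi_10 + b*Psi_11 of anisotropic harmonic-oscillator
# eigenstates, with hbar = m = 1. Guidance equation: dr/dt = Im(grad(psi)/psi).

wx, wy = 1.0, np.sqrt(2.0)    # incommensurate oscillator frequencies
a, b = 1.2, 1.2               # superposition coefficients (illustrative)

def phi(n, x, w):
    """1D oscillator eigenfunction for n = 0 or 1 (normalization omitted)."""
    g = np.exp(-0.5 * w * x * x)
    return g if n == 0 else x * g

def psi(x, y, t):
    E00 = 0.5 * (wx + wy)
    E10 = 1.5 * wx + 0.5 * wy
    E11 = 1.5 * (wx + wy)
    return (phi(0, x, wx) * phi(0, y, wy) * np.exp(-1j * E00 * t)
            + a * phi(1, x, wx) * phi(0, y, wy) * np.exp(-1j * E10 * t)
            + b * phi(1, x, wx) * phi(1, y, wy) * np.exp(-1j * E11 * t))

def velocity(x, y, t, h=1e-5):
    p = psi(x, y, t)
    vx = np.imag((psi(x + h, y, t) - psi(x - h, y, t)) / (2 * h) / p)
    vy = np.imag((psi(x, y + h, t) - psi(x, y - h, t)) / (2 * h) / p)
    return vx, vy

# Plain Euler integration of one trajectory (illustration only; near nodal
# points a higher-order adaptive integrator is needed in practice).
x, y, dt = 0.3, 0.3, 5e-3
for step in range(50000):
    vx, vy = velocity(x, y, step * dt)
    x, y = x + vx * dt, y + vy * dt
print(x, y)
```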

Authors: Johann Marton, S. Bartalucci, A. Bassi, M. Bazzi, S. Bertolucci, C. Berucci, M. Bragadireanu, M. Cargnelli, A. Clozza, Catalina Curceanu, L. De Paolis, S. Di Matteo, S. Donadi, J.-P. Egger, C. Guaraldo, M. Iliescu, M. Laubenstein, E. Milotti, Andreas Pichler, D. Pietreanu, K. Piscicchia, A. Scordo, H. Shi, D. Sirghi, F. Sirghi, L. Sperandio, O. Vazquez-Doce, E. Widmann, J. Zmeskal

We are experimentally investigating possible violations of the predictions of standard quantum mechanics at the Gran Sasso underground laboratory in Italy. We test with high precision the Pauli Exclusion Principle (PEP) and the collapse of the wave function (collapse models). We present our method of searching for possible small violations of the PEP for electrons through the search for anomalous X-ray transitions in copper atoms. These transitions would be produced by fresh electrons (brought into a copper bar by a circulating current) that have a small probability of undergoing a Pauli-forbidden transition to the 1s level, which is already occupied by two electrons. We describe the VIP2 (VIolation of PEP) experiment, currently taking data at the Gran Sasso underground laboratory, and present its new setup. The goal of VIP2 is to test the PEP for electrons with unprecedented accuracy, down to a limit on the probability of a PEP violation at the level of 10$^{-31}$. We show preliminary experimental results and discuss the implications of a possible violation.

Authors: Stefan Ataman

Entangled states are notoriously non-separable, their sub-ensembles being only statistical mixtures yielding no coherences and no quantum interference phenomena. The interesting features of entangled states can be revealed only by coincidence counts over the (typically) two sub-ensembles of the system. In this paper we show that this feature extends to properties thought to be local, for example the transmissivity coefficient of a beam splitter. We discuss a well-known experimental setup and propose modifications, so that delayed-choice can be added and this new feature of entanglement tested.

Authors: B. L. van der Waerden (Groningen, Holland), Guglielmo Pasa

“Let us call the novel quantities which, in addition to the vectors and tensors, have appeared in the quantum mechanics of the spinning electron, and which in the case of the Lorentz group transform quite differently from tensors, spinors for short. Is there no spinor analysis that every physicist can learn, like tensor analysis, with the aid of which, first, all possible spinors can be formed and, second, all invariant equations in which spinors occur?” So Mr Ehrenfest asked me, and the answer will be given below.
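For orientation (not part of the translated text), the correspondence at the heart of the spinor analysis developed in the paper can be summarized in modern notation: a four-vector is packaged into a Hermitian $2\times 2$ matrix on which $\mathrm{SL}(2,\mathbb{C})$ acts,

\[ X = x^{\mu}\sigma_{\mu} = \begin{pmatrix} x^{0}+x^{3} & x^{1}-\mathrm{i}x^{2} \\ x^{1}+\mathrm{i}x^{2} & x^{0}-x^{3} \end{pmatrix}, \qquad X \mapsto A X A^{\dagger}, \quad A \in \mathrm{SL}(2,\mathbb{C}), \]

which preserves $\det X = (x^{0})^{2}-(x^{1})^{2}-(x^{2})^{2}-(x^{3})^{2}$ and hence implements a Lorentz transformation, while a spinor $\xi$ transforms simply as $\xi \mapsto A\xi$.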

Authors: Ramon Torres

Quantum gravitational effects in black hole spacetimes with a cosmological constant $\Lambda$ are considered. The effective quantum spacetimes for the black holes are constructed by taking into account the renormalization group improvement of classical solutions obtained in the framework of Unimodular Gravity (a theory which is identical to General Relativity at a classical level). This allows us to avoid the usual divergences associated with the presence of a running $\Lambda$. The horizons and causal structure of the improved black holes are discussed taking into account the current observational bounds for the cosmological constant. It is shown that the resulting effective quantum black hole spacetimes are always devoid of singularities.

Redhead, Michael (2017) The Relativistic Einstein-Podolsky-Rosen Argument. [Preprint]

Author(s): M. Bilardello, A. Trombettoni, and A. Bassi

We investigate how ultracold atoms in double-well potentials can be used to study and put bounds on models describing wave-function collapse. We refer in particular to the continuous spontaneous localization (CSL) model, which is the most well studied among dynamical reduction models. It modifies th…
[Phys. Rev. A 95, 032134] Published Wed Mar 29, 2017
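For context, the ensemble-averaged (master-equation) form of the CSL dynamics referred to above is, schematically and up to convention-dependent constants, a double-commutator modification of the von Neumann equation in terms of a smeared mass-density operator $\hat{M}(\mathbf{x})$:

\[ \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[\hat{H},\rho] \;-\; \frac{\lambda}{2 m_0^{2}} \int d^{3}x\, \big[\hat{M}(\mathbf{x}),[\hat{M}(\mathbf{x}),\rho]\big], \]

with $m_0$ a reference mass and the mass density smeared over the correlation length $r_C$; the collapse rate $\lambda$ and $r_C$ are the two phenomenological parameters that experiments of this kind aim to bound.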

Author(s): Eduardo O. Dias and Fernando Parisio

In quantum theory we refer to the probability of finding a particle between positions x and x+dx at the instant t, although we have no capacity of predicting exactly when the detection occurs. In this work, we first present an extended nonrelativistic quantum formalism where space and time play equi…
[Phys. Rev. A 95, 032133] Published Wed Mar 29, 2017

Publication date: Available online 28 March 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Jean-Philippe Martinez
The Hartree-Fock method, one of the first applications of the new quantum mechanics to the many-body problem, was elaborated by Douglas Rayner Hartree in 1928 and Vladimir Fock in 1930. The challenge of its tedious computations was discussed early on, and it is well known that the application of the method benefited greatly from the development of computers from the mid-to-late 1950s. However, the years from 1930 to 1950 were by no means years of stagnation: the method was the object of several considerations related to its mathematical formulation, possible extensions, and conceptual understanding. Thus, with a focus on the respective attitudes of Hartree and Fock, in particular towards the concept of quantum exchange, the present work puts forward some mathematical and conceptual clarifications which played an important role in a better understanding of the many-body problem in quantum mechanics.
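For reference (standard formulas, not drawn from the paper), the exchange concept at issue enters through the Fock operator: in atomic units the Hartree-Fock equations read

\[ \hat{F}\,\varphi_i = \varepsilon_i\,\varphi_i, \qquad \hat{F} = \hat{h} + \sum_j \big(\hat{J}_j - \hat{K}_j\big), \]

with the Coulomb and exchange operators

\[ \hat{J}_j\,\varphi_i(\mathbf{r}) = \left[\int \frac{|\varphi_j(\mathbf{r}')|^{2}}{|\mathbf{r}-\mathbf{r}'|}\, d^{3}r'\right]\varphi_i(\mathbf{r}), \qquad \hat{K}_j\,\varphi_i(\mathbf{r}) = \left[\int \frac{\varphi_j^{*}(\mathbf{r}')\,\varphi_i(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\, d^{3}r'\right]\varphi_j(\mathbf{r}). \]

Hartree's original self-consistent field omits the non-local exchange term $\hat{K}_j$, which is what Fock's 1930 treatment of antisymmetry adds.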

Authors: Sean M. Carroll, Aidan Chatwin-Davies

In a wide class of cosmological models, a positive cosmological constant drives cosmological evolution toward an asymptotically de Sitter phase. Here we connect this behavior to the increase of entropy over time, based on the idea that de Sitter space is a maximum-entropy state. We prove a cosmic no-hair theorem for Robertson-Walker and Bianchi I spacetimes by assuming that the generalized entropy of a Q-screen (“quantum” holographic screen), in the sense of the cosmological version of the Generalized Second Law conjectured by Bousso and Engelhardt, increases up to a finite maximum value, which we show coincides with the de Sitter horizon entropy. We do not use the Einstein field equations in our proof, nor do we assume the existence of a positive cosmological constant. As such, asymptotic relaxation to a de Sitter phase can, in a precise sense, be thought of as cosmological equilibration.
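For orientation (standard formulas, not the paper's derivation), the finite maximum value referred to is the de Sitter horizon entropy: with $c=k_B=1$, a positive cosmological constant $\Lambda$ has horizon radius $\ell_{\mathrm{dS}}=\sqrt{3/\Lambda}$, so

\[ S_{\mathrm{dS}} = \frac{A}{4G\hbar} = \frac{4\pi \ell_{\mathrm{dS}}^{2}}{4G\hbar} = \frac{3\pi}{G\hbar\Lambda}. \]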

Authors: Imanol Albarran, Mariam Bouhmadi-López, Che-Yu Chen, Pisin Chen

Cosmology is by far one of the most exciting subjects to study, even more so with the current bulk of observations at hand. These observations might indicate different kinds of doomsdays, if dark energy follows certain patterns. Two of these doomsdays are the Little Rip (LR) and the Little Sibling of the Big Rip (LSBR). In this work, aside from proving the unavoidability of the LR and LSBR in the Eddington-inspired Born-Infeld (EiBI) scenario, we carry out a quantum analysis of the EiBI theory with a matter field which, from a classical point of view, would inevitably lead to a universe that ends with either the LR or the LSBR. Based on a modified Wheeler-DeWitt equation, we demonstrate that such fatal endings seem to be avoidable.

Authors: Netta Engelhardt, Sebastian Fischetti

In a full theory of quantum gravity, local physics is expected to be approximate rather than innate. It is therefore important to understand how approximate locality emerges in the semiclassical limit. Here we show that any notion of locality emergent from a holographic theory of quantum gravity is “all or nothing”: local data is not obtained gradually from subregions of the boundary, but is rather obtained all at once when enough of the boundary is accessed. Our assumptions are mild and thus this feature is quite general; for concreteness, we show how this phenomenon manifests in the special case of AdS/CFT.

Authors: Philip D. Mannheim

In applications of Einstein gravity one replaces the quantum-mechanical energy-momentum tensor of sources such as the degenerate electrons in a white dwarf or the black-body photons in the microwave background by c-number matrix elements. And not only that, one ignores the zero-point fluctuations in these sources by only retaining the normal-ordered parts of those matrix elements. There is no apparent justification for this procedure, and we show that it is precisely this procedure that leads to the cosmological constant problem. We suggest that solving the problem requires that gravity be treated just as quantum-mechanically as the sources to which it couples, and show that one can then solve the cosmological constant problem if one replaces Einstein gravity by the fully quantum-mechanically consistent conformal gravity theory.
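The size of the problem can be stated in one line (a standard estimate, not taken from the paper): keeping the zero-point contribution of a massless field up to a momentum cutoff $k_{\mathrm{max}}$ gives, with $\hbar=c=1$,

\[ \rho_{\mathrm{vac}} \simeq \int^{k_{\mathrm{max}}} \frac{d^{3}k}{(2\pi)^{3}}\,\frac{k}{2} = \frac{k_{\mathrm{max}}^{4}}{16\pi^{2}}, \]

which for a Planck-scale cutoff exceeds the observed dark-energy density, of order $(10^{-3}\,\mathrm{eV})^{4}$, by roughly 120 orders of magnitude.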

Authors: Valentina Baccetti, Robert B. Mann, Daniel R. Terno

We study the collapse of evaporating, spherically symmetric thin dust shells and dust balls, assuming that quantum effects are encapsulated in a spherically symmetric metric that satisfies mild regularity conditions. The evaporation may accelerate the collapse, but for a generic metric the Schwarzschild radius is not crossed. Instead, the shell (or the layer in the ball of dust) always remains at a certain sub-Planckian distance from it.

Authors: Jingbo Wang

Black holes are extraordinarily massive objects which can be described classically by general relativity, and topological insulators are new phases of matter that could be used to build a topological quantum computer. They seem to be different objects, but in this paper we claim that a black hole can be considered as a kind of topological insulator. For the BTZ black hole in three-dimensional $AdS_3$ spacetime we give two pieces of evidence to support this claim: the first comes from the black hole “membrane paradigm”, which says that the horizon of a black hole behaves like an electrical conductor, while the vacuum can be considered as an insulator. The second comes from the fact that the horizon of the BTZ black hole can support two chiral massless scalar fields with opposite chirality. These are two key properties of a 2D topological insulator. For higher-dimensional black holes the first piece of evidence is still valid, so we conjecture that they can also be considered as higher-dimensional topological insulators. This conjecture will have far-reaching influence on our understanding of quantum black holes and the nature of gravity.
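For reference (the standard form of the solution, quoted for orientation), the BTZ geometry underlying the argument is

\[ ds^{2} = -N^{2}\,dt^{2} + N^{-2}\,dr^{2} + r^{2}\big(d\phi + N^{\phi}\,dt\big)^{2}, \qquad N^{2} = -M + \frac{r^{2}}{\ell^{2}} + \frac{J^{2}}{4r^{2}}, \quad N^{\phi} = -\frac{J}{2r^{2}}, \]

with $\Lambda=-1/\ell^{2}$; the outer horizon, on which the two chiral excitations mentioned above are supported, sits at the larger root of $N^{2}=0$.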

Authors: Mairi Sakellariadou (King’s College London)

I will briefly discuss three cosmological models built upon three distinct quantum gravity proposals. I will first highlight the cosmological role of a vector field in the framework of a string/brane cosmological model. I will then present the resolution of the big bang singularity and the occurrence of an early era of accelerated expansion of a geometric origin, in the framework of group field theory condensate cosmology. I will then summarise results from an extended gravitational model based on non-commutative spectral geometry, a model that offers a purely geometric explanation for the standard model of particle physics.

Authors: Clive Emary

Ambiguous measurements do not reveal complete information about the system under test. Their quantum-mechanical counterparts are semi-weak (or, in the limit, weak) measurements, and here we discuss their role in tests of the Leggett-Garg inequalities. We show that, whilst ambiguous measurements allow one to forgo the usual non-invasive-measurability assumption, to derive an LGI that may be violated we are forced to introduce another assumption that equates the invasive influence of ambiguous and unambiguous detectors. We then derive signalling conditions that should be fulfilled for the plausibility of the Leggett-Garg test and propose an experiment on a three-level system, with a direct quantum-optics realisation, that satisfies all signalling constraints and violates a Leggett-Garg inequality.
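For context, the simplest Leggett-Garg inequality for three measurement times is $K = C_{21} + C_{32} - C_{31} \le 1$. The sketch below is the textbook two-level scenario with projective measurements (not the three-level ambiguous-measurement protocol of the paper) and reproduces the familiar quantum maximum of 3/2:

```python
import numpy as np

# The textbook two-level Leggett-Garg scenario (not the paper's three-level,
# ambiguous-measurement protocol): a qubit precessing at frequency Omega,
# measured projectively at three equally spaced times, has two-time
# correlators C_ij = cos(Omega * (t_j - t_i)).

theta = np.linspace(0.0, np.pi, 1000)        # theta = Omega * tau
K = 2 * np.cos(theta) - np.cos(2 * theta)    # K = C21 + C32 - C31
print(K.max())                               # ~1.5, above the classical bound of 1
print(theta[np.argmax(K)])                   # maximal violation near theta = pi/3
```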

Authors: Arkady Bolotin

In this paper, the question of whether truth values can be assigned to propositions about the properties of a state of a physical system before measurement is discussed. To answer this question, the notion of a propositionally noncontextual theory is introduced: a theory that provides a map linking each element of a bounded lattice to a truth value so as to explain the outcomes of experimental propositions associated with the state of the system. The paper demonstrates that no model based on a propositionally noncontextual theory can be consistent with the occurrence of a non-vanishing “two-path” quantum interference term and the quantum collapse postulate.

Battle between quantum and thermodynamic laws heats up

Nature 543, 7647 (2017). http://www.nature.com/doifinder/10.1038/543597a

Author: Davide Castelvecchi

Physicists try to rebuild the laws of heat and energy for processes at a quantum scale.

Authors: Alejandro Perez

This is a review of results on black hole physics in the framework of loop quantum gravity. The key feature underlying these results is the discreteness of geometric quantities at the Planck scale predicted by this approach to quantum gravity. Quantum discreteness follows directly from the canonical quantization prescription when applied to the action of general relativity that is suitable for the coupling of gravity with gauge fields and especially with fermions. Planckian discreteness and causal considerations provide the basic structure for understanding the thermal properties of black holes close to equilibrium. Discreteness also provides a fresh look at issues that are, for the moment, more speculative, such as those concerning the fate of information in black hole evaporation. The hypothesis of discreteness also leads to interesting phenomenology with possible observational consequences. The theory of loop quantum gravity is a developing program. This review reports its achievements and open questions in a pedagogical manner, with an emphasis on the quantum aspects of black hole physics.
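The Planck-scale discreteness in question is exemplified by the loop-quantum-gravity area spectrum (a standard result quoted for orientation):

\[ A = 8\pi\gamma\,\ell_{P}^{2} \sum_i \sqrt{j_i(j_i+1)}, \qquad j_i \in \tfrac{1}{2}\mathbb{N}, \]

where $\gamma$ is the Barbero-Immirzi parameter, $\ell_{P}^{2}=G\hbar/c^{3}$, and the sum runs over the spin-network edges puncturing the surface.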

Authors: H. Nikolic

Most physicists do not have patience for reading long and obscure interpretation arguments and disputes. Hence, to attract attention of a wider physics community, in this paper various old and new aspects of quantum interpretations are explained in a concise and simple (almost trivial) form. About the “Copenhagen” interpretation, we note that there are several different versions of it and explain how to make sense of “local non-reality” interpretation. About the many-world interpretation, we explain that it is neither local nor non-local, that it cannot explain the Born rule, that it suffers from the preferred basis problem, and that quantum suicide cannot be used to test it. About the Bohmian interpretation, we explain that it is analogous to dark matter, use it to explain that there is no big difference between non-local correlation and non-local causation, and use some condensed-matter ideas to outline how non-relativistic Bohmian theory could be a theory of everything. We also explain how different interpretations can be used to demystify the delayed choice experiment, to resolve the problem of time in quantum gravity, and to provide alternatives to quantum non-locality. Finally, we explain why life is compatible with the 2nd law.

Authors: B. J. Carr

There appears to be a duality between elementary particles, which span the mass range below the Planck scale, and black holes, which span the mass range above it. In particular, the Black Hole Uncertainty Principle Correspondence posits a smooth transition between the Compton and Schwarzschild scales as a function of mass. This suggests that all black holes are in some sense quantum, that elementary particles can be interpreted as sub-Planckian black holes, and that there is a subtle connection between quantum and classical physics.
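The two scales behind the correspondence are (standard values, quoted for orientation)

\[ \lambda_{C} = \frac{\hbar}{Mc} \quad (\text{decreasing with } M), \qquad r_{S} = \frac{2GM}{c^{2}} \quad (\text{increasing with } M), \]

which become comparable at the Planck mass $M_{\mathrm{Pl}}=\sqrt{\hbar c/G}\approx 2\times10^{-8}\,\mathrm{kg}$, where both are of order the Planck length $\ell_{P}=\sqrt{G\hbar/c^{3}}\approx 1.6\times10^{-35}\,\mathrm{m}$.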

Authors: Lev Vaidman

The recent controversy regarding the meaning and usefulness of weak values is reviewed. It is argued that, in spite of recent statistical arguments by Ferrie and Combes, experiments with anomalous weak values provide a useful amplification technique for precision measurements of small effects in many realistic situations. The statistical nature of weak values has been questioned. Although measuring a weak value requires an ensemble, it is argued that the weak value, similarly to an eigenvalue, is a property of a single pre- and post-selected quantum system.
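For reference, the weak value of an observable $\hat{A}$ for a system pre-selected in $|\psi\rangle$ and post-selected in $|\phi\rangle$ is

\[ A_{w} = \frac{\langle\phi|\hat{A}|\psi\rangle}{\langle\phi|\psi\rangle}, \]

which can lie far outside the eigenvalue range (“anomalous”) when $\langle\phi|\psi\rangle$ is small; this is the effect exploited for amplification.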

Authors: S. Datta (Department of Physics, The Icfai Foundation for Higher Education; Department of Physics, The University of Texas at Arlington), J. M. Rejcek, J. L. Fry (Department of Physics, The University of Texas at Arlington)

In the Feynman-Kac [1] path integral approach the eigenvalues of a quantum system can be computed using the Wiener measure, which is based on Brownian particle motion. In our previous work [2-3] on such systems we observed that the Wiener process converges slowly numerically for dimensions greater than two, because almost all trajectories escape to infinity [4]. One can speed up the convergence by using a Generalized Feynman-Kac (GFK) method [5], in which the new measure associated with the trial function is stationary, so that the convergence rate becomes much faster. This amounts to an example of importance sampling, and in the present work we apply it to the Feynman-Kac (FK) path integrals for the ground and first few excited state energies of He. We calculate the path integrals using space averaging rather than the time averaging used in the past. The best previous variational calculations report precisions of Hartrees, whereas in most cases our path integral results for the ground and first excited states of He are lower than those results by about Hartrees or more.
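To make the starting point concrete, here is a minimal sketch of a plain Feynman-Kac estimate (no importance sampling, no GFK measure, and the 1D harmonic oscillator rather than helium), using $E_0 \approx -\tfrac{1}{T}\ln \mathbb{E}\big[e^{-\int_0^T V(B_s)\,ds}\big]$ over Brownian paths $B_s$:

```python
import numpy as np

# Minimal sketch of a plain Feynman-Kac estimate (no importance sampling, no
# GFK measure, and the 1D harmonic oscillator instead of helium):
# H = -(1/2) d^2/dx^2 + (1/2) x^2 with hbar = m = 1, exact E0 = 0.5, and
# E0 ~ -(1/T) log E[ exp(-int_0^T V(B_s) ds) ] over Brownian paths B from 0.

rng = np.random.default_rng(0)

def fk_ground_energy(T=8.0, n_steps=800, n_paths=20000):
    dt = T / n_steps
    x = np.zeros(n_paths)            # all Brownian paths start at the origin
    action = np.zeros(n_paths)       # accumulated integral of V along each path
    for _ in range(n_steps):
        x = x + np.sqrt(dt) * rng.standard_normal(n_paths)
        action += 0.5 * x**2 * dt    # V(x) = x^2 / 2
    return -np.log(np.mean(np.exp(-action))) / T

# Approaches the exact E0 = 0.5 as T grows; at these settings the finite-T
# estimate comes out slightly below (around 0.46).
print(fk_ground_energy())
```

The slow convergence mentioned in the abstract shows up here as the growing variance of the exponential weight for long times and higher dimensions, which is precisely what the importance-sampled GFK measure is designed to tame.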

Authors: A. D. Alhaidari

Within the recent reformulation of quantum mechanics in which a potential function is not required, we show how to reconstruct the potential so that a correspondence with the standard formulation can be established. However, the correspondence places severe restrictions on the kinematics of such problems.

Dawid, Richard and Hartmann, Stephan (2017) The No Miracles Argument without the Base Rate Fallacy. [Preprint]
Heesen, Remco (2017) When Journal Editors Play Favorites. Philosophical Studies. ISSN 0031-8116

Abstract

Bell’s theorem has fascinated physicists and philosophers since his 1964 paper, which was written in response to the 1935 paper of Einstein, Podolsky, and Rosen. Bell’s theorem and its many extensions have led to the claim that quantum mechanics, and by inference nature herself, are nonlocal in the sense that a measurement on a system by an observer at one location has an immediate effect on a distant entangled system (one with which the original system has previously interacted). Einstein was repulsed by such “spooky action at a distance” and was led to question whether quantum mechanics could provide a complete description of physical reality. In this paper I argue that quantum mechanics does not require spooky action at a distance of any kind, and yet that it is entirely reasonable to question the assumption that quantum mechanics can provide a complete description of physical reality. The magic of entangled quantum states has little to do with entanglement and everything to do with superposition, a property of all quantum systems and a foundational tenet of quantum mechanics.
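As a quantitative anchor for the discussion (standard textbook CHSH values, not the paper's argument), the singlet-state correlations $E(a,b)=-\cos(a-b)$ reach $|S|=2\sqrt{2}$ at the optimal settings, beyond the local bound of 2:

```python
import numpy as np

# Standard CHSH arithmetic, quoted as a quantitative anchor (textbook values,
# not the paper's argument): singlet-state correlations E(a,b) = -cos(a - b)
# at the optimal analyzer settings give |S| = 2*sqrt(2) > 2.

def E(a, b):
    return -np.cos(a - b)             # quantum prediction for the singlet state

a, a2 = 0.0, np.pi / 2                # Alice's two settings
b, b2 = np.pi / 4, 3 * np.pi / 4      # Bob's two settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), 2 * np.sqrt(2))         # both ~2.828, above the local bound of 2
```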
