This is a list of this week’s papers on quantum foundations published in various journals or uploaded to preprint servers such as arxiv.org and PhilSci Archive.

# de Sitter-invariant special relativity and the dark energy problem. (arXiv:1704.02120v2 [gr-qc] UPDATED)

## gr-qc updates on arXiv.org

Authors: A. Araujo, D. F. Lopez, J. G. Pereira The replacement of the Poincaré-invariant Einstein special relativity by a de Sitter-invariant special relativity produces concomitant changes in all relativistic theories, including general relativity. A crucial change in the latter is that both the background de Sitter curvature and the gravitational dynamical curvature turn out to be included in a single curvature tensor. This means that the cosmological term no longer appears explicitly in the Einstein equation, and is consequently not restricted to be constant. In this paper, the Newtonian limit of such a theory is obtained, and the ensuing Newtonian Friedmann equations are shown to provide a good account of the dark energy content of the present-day universe.

# Gravitational memory for uniformly accelerated observers. (arXiv:1703.10619v2 [hep-th] UPDATED)

## gr-qc updates on arXiv.org

Authors: Sanved Kolekar, Jorma Louko Recently, Hawking, Perry and Strominger described a physical process that implants supertranslational hair on a Schwarzschild black hole by an infalling matter shock wave without spherical symmetry. Using the BMS-type symmetries of the Rindler horizon, we present an analogous process that implants supertranslational hair on a Rindler horizon by a matter shock wave without planar symmetry, and we investigate the corresponding memory effect on the Rindler family of uniformly linearly accelerated observers. We assume each observer to remain linearly uniformly accelerated through the wave, in the sense of the curved spacetime generalisation of the Letaw-Frenet equations. Starting with a family of observers who follow the orbits of a single boost Killing vector before the wave, we find that after the wave has passed, each observer still follows the orbit of a boost Killing vector but this boost differs from trajectory to trajectory, and the trajectory-dependence carries a memory of the planar inhomogeneity of the wave. We anticipate this classical memory phenomenon to have a counterpart in Rindler space quantum field theory.

# Quantum test of the equivalence principle for atoms in superpositions of internal energy eigenstates. (arXiv:1704.02296v2 [physics.atom-ph] CROSS LISTED)

## quant-ph updates on arXiv.org

Authors: G. Rosi, G. D’Amico, L. Cacciapuoti, F. Sorrentino, M. Prevedelli, M. Zych, C. Brukner, G. M. Tino The Einstein Equivalence Principle (EEP) has a central role in the understanding of gravity and space-time. In its weak form, or Weak Equivalence Principle (WEP), it directly implies equivalence between inertial and gravitational mass. Verifying this principle in a regime where the relevant properties of the test body must be described by quantum theory has profound implications. Here we report on a novel WEP test for atoms. A Bragg atom interferometer in a gravity gradiometer configuration compares the free fall of rubidium atoms prepared in two hyperfine states and in their coherent superposition. The use of the superposition state allows testing genuine quantum aspects of the EEP with no classical analogue, which have remained completely unexplored so far. In addition, we measure the Eötvös ratio of atoms in two hyperfine levels with a relative uncertainty at the low $10^{-9}$ level, improving previous results by almost two orders of magnitude.
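To make the quoted figure of merit concrete, here is a minimal sketch of the Eötvös ratio itself, $\eta = 2(g_1 - g_2)/(g_1 + g_2)$, which compares the free-fall accelerations of two test bodies and vanishes if the WEP holds. This is an illustration with hypothetical numbers, not the authors' data analysis.

```python
# Illustrative sketch (not the authors' pipeline): the Eotvos ratio
# eta = 2*(g1 - g2)/(g1 + g2); WEP predicts eta = 0 exactly.

def eotvos_ratio(g1: float, g2: float) -> float:
    """Differential free-fall acceleration, normalized to the mean."""
    return 2.0 * (g1 - g2) / (g1 + g2)

g = 9.8012345                 # hypothetical local gravity (m/s^2)
delta = 1e-9 * g              # hypothetical differential at the 10^-9 level
eta = eotvos_ratio(g + delta / 2, g - delta / 2)
print(f"eta = {eta:.2e}")     # a violation at the quoted sensitivity scale
```

A measurement at the low $10^{-9}$ level means differential accelerations of this size or smaller are resolvable.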

# Occam’s Vorpal Quantum Razor: Memory Reduction When Simulating Continuous-Time Stochastic Processes With Quantum Devices. (arXiv:1704.04231v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Thomas J. Elliott, Mile Gu Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.

Author(s): Adam Bednorz Local realism in recent experiments is excluded on condition of freedom or randomness of choice combined with no signaling between observers by implementations of simple quantum models. Both no signaling and the underlying quantum model can be directly checked by analysis of experimental data. For p… [Phys. Rev. A 95, 042118] Published Fri Apr 14, 2017

# Ten reasons why a thermalized system cannot be described by a many-particle wave function

## ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Publication date: Available online 14 April 2017

**Source:** Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics Author(s): Barbara Drossel It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict for instance by appealing to decoherence or eigenstate thermalization are riddled with problems. This paper considers theoretical physics of thermalized systems as it is done in practice and shows that all approaches to thermalized systems presuppose in some form limits to linear superposition and deterministic time evolution. These considerations include, among others, the classical limit, extensivity, the concepts of entropy and equilibrium, and symmetry breaking in phase transitions and quantum measurement. As a conclusion, the paper suggests that the irreversibility and stochasticity of statistical mechanics should be taken as a real property of nature. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wave length, which behave quantum mechanically below this scale but classically sufficiently far beyond this scale. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.

# The Klein Paradox: A New Treatment. (arXiv:1704.04108v1 [physics.gen-ph])

## hep-th updates on arXiv.org

Authors: Egon Truebenbacher The Dirac equation requires a treatment of the step potential that differs fundamentally from the traditional treatment, because the Dirac plane waves, besides momentum and spin, are characterized by a quantum number with the physical meaning of sign of charge. Since the Hermitian operator corresponding to this quantum number does not commute with the step potential, the time displacement parameter used in the ansatz of the stationary state does not have the physical meaning of energy. Therefore there are no paradoxical values of the energy. The new solution of the Dirac equation with a step potential is obtained. This solution, again, allows for phenomena of the Klein paradox type, but in addition it contains a positron amplitude localized at the threshold point of the step potential.

# On the Emergence of the Coulomb Forces in Quantum Electrodynamics. (arXiv:1704.04048v1 [hep-th])

## hep-th updates on arXiv.org

Authors: Jan Naudts A simple transformation of field variables eliminates Coulomb forces from the theory of quantum electrodynamics. This suggests that Coulomb forces may be an emergent phenomenon rather than being fundamental. This possibility is investigated in the context of reducible quantum electrodynamics. It is shown that states exist which bind free photon and free electron fields. The binding energy peaks in the long-wavelength limit. This makes it plausible that Coulomb forces result from the interaction of the electron/positron field with long-wavelength transversely polarized photons.

# Past of a quantum particle: Common sense prevails. (arXiv:1704.03722v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Berthold-Georg Englert, Kelvin Horia, Jibo Dai, Yink Loong Len, Hui Khoon Ng We analyze Vaidman’s three-path interferometer with weak path marking [Phys. Rev. A 87, 052104 (2013)] and find that common sense yields correct statements about the particle’s path through the interferometer. This disagrees with the original claim that the particles have discontinuous trajectories at odds with common sense. In our analysis, “the particle’s path” has operational meaning as acquired by a path-discriminating measurement. For a quantum-mechanical experimental demonstration of the case, one should perform a single-photon version of the experiment by Danan et al. [Phys. Rev. Lett. 111, 240402 (2013)] with unambiguous path discrimination. We present a detailed proposal for such an experiment.

# Sensing spontaneous collapse and decoherence with interfering Bose-Einstein condensates. (arXiv:1704.03608v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Björn Schrinski, Klaus Hornberger, Stefan Nimmrichter We study how matter-wave interferometry with Bose-Einstein condensates is affected by hypothetical collapse models and by environmental decoherence processes. Motivated by recent atom fountain experiments with macroscopic arm separations, we focus on the observable signatures of first-order and higher-order coherence for different two-mode superposition states, and on their scaling with particle number. This can be used not only to assess the impact of environmental decoherence on many-body coherence, but also to quantify the extent to which macrorealistic collapse models are ruled out by such experiments. We find that interference fringes of phase-coherently split condensates are most strongly affected by decoherence, whereas the quantum signatures of independent interfering condensates are more immune against macrorealistic collapse. A many-body enhanced decoherence effect beyond the level of a single atom can be probed if higher-order correlations are resolved in the interferogram.

# GUP, Einstein static universe and cosmological constant problem. (arXiv:1704.03040v1 [gr-qc])

## gr-qc updates on arXiv.org

Authors: F. Darabi, K. Atazadeh We study the Generalized Uncertainty Principle (GUP) in the framework of Einstein static universe (ESU). It is shown that the deformation parameter corresponding to the Snyder non-commutative space can induce an energy density subject to GUP which obeys the holographic principle (HP) and plays the role of a cosmological constant. Using the holographic feature of GUP energy density, we introduce new holographic based IR and UV cut-offs. Moreover, we propose a solution to the cosmological constant problem. This solution is based on the result that the Einstein equations just couple to the tiny holographic based surface energy density (cosmological constant) induced by the deformation parameter, rather than the large quantum gravitational based volume energy density (vacuum energy) having contributions of order $M_P^4$.

# Group theoretical derivation of the minimal coupling principle

## Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences current issue

The group theoretical methods worked out by Bargmann, Mackey and Wigner, which *deductively* establish the Quantum Theory of a free particle for which Galileian transformations form a symmetry group, are extended to the case of an interacting particle. In doing so, the obstacles caused by loss of symmetry are overcome. In this approach, specific forms of the wave equation of an interacting particle, including the equation derived from the minimal coupling principle, are implied by particular first-order invariance properties that characterize the interaction with respect to specific subgroups of Galileian transformations; moreover, the possibility of yet unknown forms of the wave equation is left open.

# Generalised ontological models based on processes. (arXiv:1704.03090v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Tung Ten Yong In this paper, we explore realist models of quantum theory that do not fit into the standard definition of ontological models. The models here go beyond the standard definition of ontological models in the sense that quantum states do not correspond to distributions over the ontic state space and a system prepared in a quantum state is not in an ontic state. Instead, a system in a quantum state is always in a process, i.e. moving around in the ontic state space. Also, quantum measurement outcomes are not direct measurements of the ontic state, but depend probabilistically on the entire path the system takes during the measurement process. Consequently, we explain how, in our model, quantum states can be classified as neither ontic nor epistemic in the sense of knowledge about an underlying reality. In our model, quantum probabilities describe our (objective) knowledge about measurement outcomes. We also look at two hybrid models where either the preparation or the measurement does follow the definitions in standard ontological models. Lastly, we propose a form of generalised ontological model that reduces to the standard PBR model when the underlying process reduces to a point in ontic space.

# Driven quantum dynamics: will it blend?. (arXiv:1704.03041v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Leonardo Banchi, Daniel Burgarth, Michael J. Kastoryano Randomness is an essential tool in many disciplines of modern sciences, such as cryptography, black hole physics, random matrix theory and Monte Carlo sampling. In quantum systems, random operations can be obtained via random circuits thanks to so-called q-designs, and play a central role in the fast scrambling conjecture for black holes. Here we consider a more physically motivated way of generating random evolutions by exploiting the many-body dynamics of a quantum system driven with stochastic external pulses. We combine techniques from quantum control, open quantum systems and exactly solvable models (via the Bethe-Ansatz) to generate Haar-uniform random operations in driven many-body systems. We show that any fully controllable system converges to a unitary q-design in the long-time limit. Moreover, we study the convergence time of a driven spin chain by mapping its random evolution into a semigroup with an integrable Liouvillean and finding its gap. Remarkably, we find via Bethe-Ansatz techniques that the gap is independent of q. We use mean-field techniques to argue that this property may be typical for other controllable systems, although we explicitly construct counter-examples via symmetry breaking arguments to show that this is not always the case. Our findings open up new physical methods to transform classical randomness into quantum randomness, via a combination of quantum many-body dynamics and random driving.
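As background for the Haar-uniform random operations the paper targets, the standard classical recipe for sampling a Haar-random unitary is QR decomposition of a complex Gaussian matrix with a phase correction (Mezzadri's construction). This sketch illustrates that baseline, not the authors' driven many-body scheme.

```python
# Minimal sketch: sample a Haar-uniform random unitary classically via
# QR decomposition of a Ginibre (complex Gaussian) matrix. This is the
# benchmark distribution that q-designs approximate; it is NOT the
# paper's driven-dynamics construction.
import numpy as np

def haar_unitary(n: int, rng: np.random.Generator) -> np.ndarray:
    z = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    # Fix the phases of R's diagonal so the distribution is exactly Haar,
    # not merely unitary (the raw QR output is biased).
    d = np.diagonal(r)
    return q * (d / np.abs(d))

rng = np.random.default_rng(0)
U = haar_unitary(4, rng)
print(np.allclose(U.conj().T @ U, np.eye(4)))  # unitarity check
```

A unitary $q$-design reproduces the moments of this distribution up to order $q$; the paper's result is that stochastically driven, fully controllable systems converge to such designs, with a convergence gap independent of $q$.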

# Reply to Comment [arXiv:1610.07734] by L. Vaidman on “Particle path through a nested Mach-Zehnder interferometer” [R. B. Griffiths, Phys. Rev. A 94 (2016) 032115]. (arXiv:1704.03010v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Robert B. Griffiths The correctness of the consistent histories analysis of weakly interacting probes, related to the path of a particle, is maintained against the criticisms in the Comment, and against the alternative approach described there, which receives no support from standard (textbook) quantum mechanics.

# An algorithm for quantum gravity phenomenology. (arXiv:1704.02404v1 [gr-qc])

## gr-qc updates on arXiv.org

Authors: Yuri Bonder Quantum gravity phenomenology is the strategy towards quantum gravity where the priority is to make contact with experiments. Here I describe what I consider to be the best procedure to do quantum gravity phenomenology. The key step is to have a generic parametrization which allows one to perform self-consistency checks and to deal with many different experiments. As an example I describe the role that the Standard Model Extension has played when looking for Lorentz violation.

# Probing nonclassicality under spontaneous decay. (arXiv:1704.02710v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Md. Manirul Ali, Po-Wen Chen We investigate the nonclassicality of an open quantum system using the Leggett-Garg inequality (LGI), which tests the correlations of a single system measured at different times. Violation of the LGI implies nonclassical behavior of the open system. We investigate the violation of the Leggett-Garg inequality for a two-level system (qubit) spontaneously decaying under a general non-Markovian dissipative environment. Our results are exact, as we have calculated the two-time correlation functions exactly for a wide range of system-environment parameters beyond the Born-Markov regime.
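For orientation, the simplest closed-system version of the three-time LGI can be written down in a few lines. For a qubit precessing under $H = (\omega/2)\,\sigma_x$ with dichotomic $\sigma_z$ measurements at equal spacings $\tau$, the two-time correlator is $C(\tau) = \cos\omega\tau$, so $K = C_{12} + C_{23} - C_{13} = 2\cos\omega\tau - \cos 2\omega\tau$. Macrorealism demands $K \le 1$; quantum mechanics reaches $3/2$. This is the textbook ideal case, not the paper's non-Markovian decay model.

```python
# Sketch of the closed-system (no decay) Leggett-Garg parameter for a
# precessing qubit; the paper treats the harder open-system case where
# the correlators must be computed exactly for a dissipative environment.
import numpy as np

def lg_parameter(omega_tau: float) -> float:
    """K = 2*cos(w*tau) - cos(2*w*tau) for equally spaced measurements."""
    return 2 * np.cos(omega_tau) - np.cos(2 * omega_tau)

tau = np.pi / 3                # spacing that maximizes the quantum violation
print(lg_parameter(tau))       # 1.5 > 1: macrorealism violated
```

Spontaneous decay and non-Markovian memory modify $C(\tau)$ and hence the size and duration of the violation, which is what the paper quantifies.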

# Entropic Dynamics: Mechanics without Mechanism. (arXiv:1704.02663v1 [quant-ph])

## quant-ph updates on arXiv.org

Authors: Ariel Caticha Entropic Dynamics is a framework in which dynamical laws, such as those that arise in physics, are derived as an application of entropic methods of inference. No underlying action principle is postulated. Instead, the dynamics is driven by entropy subject to constraints reflecting the information that is relevant to the problem at hand. In this work I review the derivation of quantum theory, but the fact that Entropic Dynamics is based on inference methods of universal applicability suggests that it may be possible to adapt these methods to fields other than physics.

# The reality of the wavefunction: old arguments and new

## PhilSci-Archive

Brown, Harvey R. (2017) The reality of the wavefunction: old arguments and new. [Preprint]

# Market Crashes as Critical Phenomena? Explanation, Idealization, and Universality in Econophysics

## PhilSci-Archive

Jhun, Jennifer and Palacios, Patricia and Weatherall, James Owen (2017) Market Crashes as Critical Phenomena? Explanation, Idealization, and Universality in Econophysics. [Preprint]

# Wittgenstein’s Elimination of Identity for Quantifier-Free Logic

## PhilSci-Archive

Lampert, Timm and Säbel, Markus (2016) Wittgenstein’s Elimination of Identity for Quantifier-Free Logic. [Preprint]

# The Case of the Disappearing (and Re-Appearing) Particle

*Scientific Reports* 7, Article number: 531 (2017); doi:10.1038/s41598-017-00274-w

## Abstract

A novel prediction is derived from the Two-State-Vector Formalism (TSVF) for a particle superposed over three boxes. Under appropriate pre- and post-selections, and with tunneling enabled between two of the boxes, it is possible to derive not only one, but three predictions for three different times within the intermediate interval. These predictions are moreover contradictory. The particle (when looked for using a projective measurement) seems to disappear from the first box, where it would have been previously found with certainty, appearing instead within the third box, to which no tunneling is possible, and later re-appearing within the second. It turns out that local measurement (i.e. opening one of the boxes) fails to indicate the particle’s presence, but subtler measurements performed on the two boxes together reveal the particle’s nonlocal modular momentum spatially separated from its mass. Another advantage of this setting is that, unlike other predictions of the TSVF that rely on weak and/or counterfactual measurements, the present one uses actual projective measurements. This outcome is then corroborated by adding weak measurements and the Aharonov-Bohm effect. The results strengthen the recently suggested time-symmetric Heisenberg ontology based on nonlocal deterministic operators. They can also be tested using the newly developed quantum router.
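The baseline for such predictions is the ABL (Aharonov-Bergmann-Lebowitz) rule applied to the standard three-box paradox: with pre-selection $|\psi\rangle = (|1\rangle + |2\rangle + |3\rangle)/\sqrt{3}$ and post-selection $|\phi\rangle = (|1\rangle + |2\rangle - |3\rangle)/\sqrt{3}$, an intermediate projective opening of box 1 (or, separately, box 2) finds the particle with certainty. This sketch reproduces that textbook case; the paper's tunneling-enabled scenario builds on it.

```python
# Minimal sketch of the ABL rule for the standard three-box paradox
# (the textbook setting behind the TSVF predictions discussed above;
# the paper's specific tunneling scenario is more elaborate).
import numpy as np

psi = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)   # pre-selected state
phi = np.array([1.0, 1.0, -1.0]) / np.sqrt(3)  # post-selected state

def abl_prob(i: int) -> float:
    """ABL probability of finding the particle in box i (0-indexed)."""
    P = np.zeros((3, 3)); P[i, i] = 1.0         # projector onto box i
    yes = abs(phi @ P @ psi) ** 2
    no = abs(phi @ (np.eye(3) - P) @ psi) ** 2
    return yes / (yes + no)

print([round(abl_prob(i), 3) for i in range(3)])  # → [1.0, 1.0, 0.2]
```

Opening box 1 alone gives probability 1, as does opening box 2 alone; the contradictory time-dependent "disappearance" in the paper arises when tunneling is added between boxes.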

*Entropy* **2017**, *19*(3), 111; doi:10.3390/e19030111

Article

# The Two-Time Interpretation and Macroscopic Time-Reversibility

1. School of Physics and Astronomy, Tel Aviv University, Tel Aviv 6997801, Israel
2. Schmid College of Science, Chapman University, Orange, CA 92866, USA
3. H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL, UK

Published: 12 March 2017

## Abstract

The two-state vector formalism motivates a time-symmetric interpretation of quantum mechanics that entails a resolution of the measurement problem. We revisit a post-selection-assisted collapse model previously suggested by us, claiming that unlike the thermodynamic arrow of time, it can lead to reversible dynamics at the macroscopic level. In addition, the proposed scheme enables us to characterize the classical-quantum boundary. We discuss the limitations of this approach and its broad implications for other areas of physics.

Please comment with your real name using good manners.