Latest Papers on Quantum Foundations - Updated Daily by IJQF

Author(s): C. Jess Riedel

When the wave function of a large quantum system unitarily evolves away from a low-entropy initial state, there is strong circumstantial evidence it develops “branches”: a decomposition into orthogonal components that is indistinguishable from the corresponding incoherent mixture with feasible obser…


[Phys. Rev. Lett. 118, 120402] Published Fri Mar 24, 2017

Authors: Siddarth Koduru Joshi, Jacques Pienaar, Timothy C. Ralph, Luigi Cacciapuoti, Will McCutcheon, John Rarity, Dirk Giggenbach, Vadim Makarov, Ivette Fuentes, Thomas Scheidl, Erik Beckert, Mohamed Bourennane, David Edward Bruschi, Adan Cabello, Jose Capmany, José A. Carrasco, Alberto Carrasco-Casado, Eleni Diamanti, Miloslav Dušek, Dominique Elser, Angelo Gulinatti, Robert H. Hadfield, Thomas Jennewein, Rainer Kaltenbaek, Michael A. Krainak, Hoi-Kwong Lo, Christoph Marquardt, Paolo Mataloni, Gerard Milburn, Momtchil Peev, Andreas Poppe, Valerio Pruneri, Renato Renner, Christophe Salomon, Johannes Skaar, Nikolaos Solomos, Mario Stipčević, Juan P. Torres, Morio Toyoshima, Paolo Villoresi, Ian Walmsley, Gregor Weihs, Harald Weinfurter, Anton Zeilinger, Marek Żukowski, Rupert Ursin

Models of quantum systems on curved space-times lack sufficient experimental verification. Some speculative theories suggest that quantum properties, such as entanglement, may exhibit entirely different behavior from that of purely classical systems. By measuring this effect, or its absence, we can test the hypotheses behind several such models. For instance, as predicted by Ralph and coworkers [T C Ralph, G J Milburn, and T Downes, Phys. Rev. A, 79(2):22121, 2009; T C Ralph and J Pienaar, New Journal of Physics, 16(8):85008, 2014], a bipartite entangled system could decohere if each particle traversed a different gravitational field gradient. We propose to study this effect in a ground-to-space uplink scenario. We extend the above theoretical predictions of Ralph and coworkers and discuss the scientific consequences of detecting, or failing to detect, the predicted gravitational decoherence. We present a detailed mission design of the European Space Agency's (ESA) Space QUEST (Space - Quantum Entanglement Space Test) mission, and study the feasibility of the mission scheme.

Authors: Supratik Sarkar, A. Bhattacharyay

Starting from a non-local, non-relativistic BEC, we present an analogue gravity model, accurate up to $\mathcal{O}(\xi^{2})$, in the presence of the quantum potential term, for a canonical acoustic black hole in $(3+1)$-d spacetime, and we derive the series solution of the free, minimally coupled KG equation for the large-length-scale massive scalar modes. We systematically show that the presence of the quantum potential term is the root cause of a UV-IR coupling between short-wavelength "primary" modes, which are supposedly Hawking-radiated through the sonic event horizon, and large-wavelength "secondary" modes. In laboratory analogue-gravity experiments on Hawking radiation, this UV-IR coupling is inevitable, and one cannot get rid of these large-wavelength excitations, which would grow over space by gaining energy from the short-wavelength Hawking-radiated modes. We identify the characteristic feature in the growth rate(s) that would distinguish these primary and secondary modes.

Authors: Xavier Calmet, Jacob Dunningham

We revisit the properties of qubits under Lorentz transformations and, by considering Lorentz-invariant quantum states in the Heisenberg formulation, clarify some misleading notation that has appeared in the literature on relativistic quantum information theory. We then use this formulation to consider the transformation properties of qubits and density matrices under space-time and gauge transformations. Finally, we use our results to understand the behaviour of entanglement between different partitions of quantum systems. Our approach not only clarifies the notation, but provides a more intuitive and simple way of gaining insight into the behaviour of relativistic qubits. In particular, it allows us to greatly generalize the results in the current literature as well as substantially simplify the calculations that are needed.

Authors: Christopher A. Fuchs, Michael C. Hoang, Blake C. Stacey

Recent years have seen significant advances in the study of symmetric informationally complete (SIC) quantum measurements, also known as maximal sets of complex equiangular lines. Previously, the published record contained solutions up to dimension 67, and was with high confidence complete up through dimension 50. Computer calculations have now furnished solutions in all dimensions up to 151, and in several cases beyond that, as large as dimension 323. These new solutions exhibit an additional type of symmetry beyond the basic definition of a SIC, and so verify a conjecture of Zauner in many new cases. The solutions in dimensions 68 through 121 were obtained by Andrew Scott, and his catalogue of distinct solutions is, with high confidence, complete up to dimension 90. Additional results in dimensions 122 through 151 were calculated by the authors using Scott's code. We recap the history of the problem, outline how the numerical searches were done, and pose some conjectures on how the search technique could be improved. In order to facilitate communication across disciplinary boundaries, we also present a comprehensive bibliography of SIC research.
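
As a concrete illustration of the defining SIC property, the following minimal numerical check uses the well-known qubit SIC formed by four tetrahedral Bloch vectors; this is a standard textbook example, not one of the paper's new solutions. For a SIC in dimension $d$, every pair of the $d^2$ states has squared overlap $1/(d+1)$.

```python
# Minimal check of the SIC condition |<psi_i|psi_j>|^2 = 1/(d+1) in d = 2,
# using the four "tetrahedron" states on the Bloch sphere.
import itertools
import numpy as np

def bloch_state(b):
    """Pure qubit state with Bloch vector b, as the eigenvalue-1 eigenvector
    of the density matrix (I + b . sigma)/2."""
    bx, by, bz = b
    rho = 0.5 * (np.eye(2)
                 + bx * np.array([[0, 1], [1, 0]])
                 + by * np.array([[0, -1j], [1j, 0]])
                 + bz * np.array([[1, 0], [0, -1]]))
    vals, vecs = np.linalg.eigh(rho)
    return vecs[:, np.argmax(vals)]

s = 1 / np.sqrt(3)
tetrahedron = [(s, s, s), (s, -s, -s), (-s, s, -s), (-s, -s, s)]
states = [bloch_state(b) for b in tetrahedron]

d = 2
for i, j in itertools.combinations(range(4), 2):
    overlap = abs(np.vdot(states[i], states[j])) ** 2
    assert np.isclose(overlap, 1 / (d + 1)), (i, j, overlap)
print("All pairwise overlaps equal 1/(d+1) = 1/3: qubit SIC verified.")
```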

Authors: Emmanuele Battista, Angelo Tartaglia, Giampiero Esposito, David Lucchesi, Matteo Luca Ruggiero, Pavol Valko, Simone Dell' Agnello, Luciano Di Fiore, Jules Simo, Aniello Grado

We examine quantum corrections of time delay arising in the gravitational field of a spinning oblate source. Low-energy quantum effects occurring in Kerr geometry are derived within a framework where general relativity is treated fully as an effective field theory. Within this framework, gravitational radiative modifications of the Kerr metric are derived from the energy-momentum tensor of the source, which at lowest order in the fields is modelled as a point mass. Therefore, in order to describe a quantum-corrected version of time delay in the case in which the source body has a finite extension, we introduce a hybrid scheme where quantum fluctuations affect only the monopole term occurring in the multipole expansion of the Newtonian potential. The predicted quantum deviation from the corresponding classical value turns out to be too small to be detected in the near future, showing that new models should be examined in order to test low-energy quantum gravity within the solar system.

Authors: Keerthan Subramanian, Nirmal K. Viswanathan

The doubts regarding the completeness of quantum mechanics raised by EPR were resolved by Aspect et al. by measuring correlations in entangled photons, which clearly demonstrated violation of Bell's inequality. This article is an attempt to re-examine the incompatibility of hidden-variable theories with reality by demonstrating violation of Bell's inequality in non-entangled systems whose two degrees of freedom, the spin and orbital angular momentum, are maximally correlated.

Authors: Alexandre Gondran (MAIAA), Michel Gondran (AEIS)

Playing the game of heads or tails in zero gravity demonstrates that there exists a contextual "measurement" in classical mechanics. When the coin is flipped, its orientation is a continuous variable. However, the "measurement" that occurs when the coin is caught by clapping two hands together gives a discrete value (heads or tails) that depends on the context (orientation of the hands). It is then shown that there is a strong analogy with the spin measurement of the Stern-Gerlach experiment, and in particular with Stern and Gerlach's sequential measurements. Finally, we clarify the analogy by recalling how the de Broglie-Bohm interpretation simply explains the spin "measurement".

Authors: Abhay Ashtekar, Jorge Pullin

This is the introductory Chapter in the monograph Loop Quantum Gravity: The First 30 Years, edited by the authors, that was just published in the series "100 Years of General Relativity". The 8 invited Chapters that follow provide fresh perspectives on the current status of the field from some of the younger and most active leaders who are currently shaping its development. The purpose of this Chapter is to provide a global overview by bridging the material covered in subsequent Chapters. The goal and scope of the monograph are described in the Preface, which can be read by following the Front Matter link at the website listed below.

Authors: Bernard Carr, Florian Kuhnel, Marit Sandstad

The possibility that the dark matter comprises primordial black holes (PBHs) is considered, with particular emphasis on the currently allowed mass windows at $10^{16}$ - $10^{17}\,$g, $10^{20}$ - $10^{24}\,$g and $1$ - $10^{3}\,M_{\odot}$. The Planck mass relics of smaller evaporating PBHs are also considered. All relevant constraints (lensing, dynamical, large-scale structure and accretion) are reviewed and various effects necessary for a precise calculation of the PBH abundance (non-Gaussianity, non-sphericity, critical collapse and merging) are accounted for. It is difficult to put all the dark matter in PBHs if their mass function is monochromatic but this is still possible if the mass function is extended, as expected in many scenarios. A novel procedure for confronting observational constraints with an extended PBH mass spectrum is therefore introduced. This applies for arbitrary constraints and a wide range of PBH formation models, and allows us to identify which model-independent conclusions can be drawn from constraints over all mass ranges. We focus particularly on PBHs generated by inflation, pointing out which effects in the formation process influence the mapping from the inflationary power spectrum to the PBH mass function. We then apply our scheme to two specific inflationary models in which PBHs provide the dark matter. The possibility that the dark matter is in intermediate-mass PBHs of $1$ - $10^{3}\,M_{\odot}$ is of special interest in view of the recent detection of black-hole mergers by LIGO. The possibility of Planck relics is also intriguing but virtually untestable.
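
The integral criterion at the heart of the paper's procedure is easy to sketch: an extended mass function $\psi(M)$, normalised to the total PBH dark-matter fraction, is compatible with a constraint $f_{\max}(M)$ derived for monochromatic spectra only if $\int \mathrm{d}M\,\psi(M)/f_{\max}(M) \le 1$. In the hedged sketch below, the lognormal shape and the toy constraint curve are illustrative assumptions, not the paper's data.

```python
# Hedged sketch: confronting an extended (lognormal) PBH mass function with
# a monochromatic constraint curve via the criterion
#   integral of psi(M)/f_max(M) dM <= 1.
import numpy as np

def lognormal_psi(M, f_pbh, M_c, sigma):
    """Lognormal mass function, normalised so its integral over M is f_pbh."""
    return (f_pbh / (np.sqrt(2 * np.pi) * sigma * M)
            * np.exp(-np.log(M / M_c) ** 2 / (2 * sigma ** 2)))

def allowed(f_pbh, M_c, sigma, f_max, M_grid):
    """True if the extended mass function respects the constraint curve."""
    psi = lognormal_psi(M_grid, f_pbh, M_c, sigma)
    return np.trapz(psi / f_max(M_grid), M_grid) <= 1.0

# Illustrative (made-up) constraint dipping to f_max = 0.1 near 1e22 g.
f_max = lambda M: 0.1 + 0.9 * np.abs(np.log10(M) - 22) / 4
M_grid = np.logspace(20, 24, 2000)  # grams
print(allowed(f_pbh=0.1, M_c=1e22, sigma=0.5, f_max=f_max, M_grid=M_grid))
```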

Authors: Fay Dowker, Stav Zalel

The causal set approach to the problem of quantum gravity is based on the hypothesis that spacetime is fundamentally discrete. Spacetime discreteness opens the door to novel types of dynamical law for cosmology and the Classical Sequential Growth (CSG) models of Rideout and Sorkin form an interesting class of such laws. It has been shown that a renormalisation of the dynamical parameters of a CSG model occurs whenever the universe undergoes a Big Crunch-Big Bang bounce. In this paper we propose a way to model the creation of a new universe after the singularity of a black hole. We show that renormalisation of dynamical parameters occurs in a CSG model after such a creation event. We speculate that this could realise aspects of Smolin's Cosmological Natural Selection proposal.

Authors: Andrzej Okolow

Nowadays, projective quantum states can be constructed for a number of field theories including Loop Quantum Gravity. However, these states are kinematic in the sense that their construction does not take into account the dynamics of the theories. In particular, the construction neglects constraints on phase spaces. Here we present projective quantum states for a "toy theory" called degenerate Plebanski gravity, which satisfy a constraint of this theory.

Authors: A. Vinante, R. Mezzena, P. Falferi, M. Carlesso, A. Bassi

Spontaneous collapse models predict that a weak force noise acts on any mechanical system, as a consequence of the collapse of the wave function. Significant upper limits on the collapse rate have been recently inferred from precision mechanical experiments, such as ultracold cantilevers and the space mission LISA Pathfinder. Here, we report new results from an experiment based on a high Q cantilever cooled to millikelvin temperature, potentially able to improve by one order of magnitude the current bounds on the continuous spontaneous localization (CSL) model. High accuracy measurements of the cantilever thermal fluctuations reveal a nonthermal force noise of unknown origin. This excess noise is compatible with the CSL heating predicted by Adler. Several physical mechanisms able to explain the observed noise have been ruled out.

Authors: C. Curceanu, S. Bartalucci, A. Bassi, M. Bazzi, S. Bertolucci, C. Berucci, A.M. Bragadireanu, M. Cargnelli, A. Clozza, L. De Paolis, S. Di Matteo, S. Donadi, J-P. Egger, C. Guaraldo, M. Iliescu, M. Laubenstein, J. Marton, E. Milotti, A. Pichler, D. Pietreanu, K. Piscicchia, A. Scordo, H. Shi, D. Sirghi, F. Sirghi, L. Sperandio, O. Vazquez Doce, J. Zmeskal

By performing X-ray measurements in the "cosmic silence" of the underground laboratory of Gran Sasso, LNGS-INFN, we test a basic principle of quantum mechanics: the Pauli Exclusion Principle (PEP), for electrons. We present the results achieved by the VIP experiment and the ongoing VIP2 measurement, which aims to gain two orders of magnitude improvement in testing PEP. We also use a similar experimental technique to search for radiation (X and gamma) predicted by continuous spontaneous localization models, which aim to solve the "measurement problem".

Authors: Samuel Colin, Thomas Durt, Ralph Willox

Since de Broglie's pilot-wave theory was revived by David Bohm in the 1950s, the overwhelming majority of researchers involved in the field have focused on what is nowadays called de Broglie-Bohm dynamics, and de Broglie's original double solution program was gradually forgotten. As a result, several of the key concepts in the theory are still rather vague and ill-understood. In the light of the progress achieved over the course of the 90 years that have passed since de Broglie's presentation of his ideas at the Solvay conference of 1927, we reconsider in the present paper the status of the double solution program. More than a somewhat dusty archaeological piece of the history of science, we believe it should be considered as a legitimate attempt to reconcile quantum theory with realism.

Authors: Diederik Aerts, Jonito Aerts Arguelles, Lester Beltran, Lyneth Beltran, Isaac Distrito, Massimiliano Sassoli de Bianchi, Sandro Sozzo, Tomas Veloz

We elaborate a quantum model for corpora of written documents, like the pages forming the World Wide Web. To that end, we are guided by how physicists constructed quantum theory for microscopic entities, which unlike classical objects cannot be fully represented in our spatial theater. We suggest that a similar construction needs to be carried out by linguists and computational scientists, to capture the full meaning content of collections of documental entities. More precisely, we show how to associate a quantum-like 'entity of meaning' to a 'language entity formed by printed documents', considering the latter as the collection of traces that are left by the former, in specific results of search actions that we describe as measurements. In other words, we offer a perspective where a collection of documents, like the Web, is described as the space of manifestation of a more complex entity - the QWeb - which is the object of our modeling, drawing its inspiration from previous studies on operational-realistic approaches to quantum physics and quantum modeling of human cognition and decision-making. We emphasize that a consistent QWeb model needs to account for the observed correlations between words appearing in printed documents, e.g., co-occurrences, as the latter would depend on the 'meaning connections' existing between the concepts that are associated with these words. In that respect, we show that both 'context and interference (quantum) effects' are required to explain the probabilities calculated by counting the relative number of documents containing certain words and co-occurrences of words.

Authors: V.I. Yukalov, D. Sornette

We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
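
For the simplest case of two prospects, the utility-attraction decomposition described above can be sketched in a few lines. The softmax form of the utility factor is an assumption of this sketch, and the attraction magnitude of 0.25 is the non-informative "quarter law" default quoted in the quantum decision theory literature; neither is a substitute for the paper's fitted analysis.

```python
# Hedged two-prospect sketch of p_n = f_n + q_n: softmax utility factors
# plus attraction factors of magnitude 0.25 that sum to zero.
import numpy as np

def qdt_two_prospects(u1, u2, more_attractive=0):
    f = np.exp([u1, u2])
    f = f / f.sum()                  # utility factors, sum to 1
    q = np.full(2, -0.25)
    q[more_attractive] = 0.25        # 'quarter law' attraction, sum(q) = 0
    p = np.clip(f + q, 0.0, 1.0)     # keep each probability in [0, 1]
    return p / p.sum()               # renormalise in case clipping acted

# Equal utilities: the more attractive prospect is chosen 75% of the time.
print(qdt_two_prospects(1.0, 1.0, more_attractive=0))  # -> [0.75, 0.25]
```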

Authors: Oliver Passon, Johannes Grebe-Ellis

Planck's law for black-body radiation marks the origin of quantum theory and is discussed in all introductory (or advanced) courses on this subject. However, the question whether Planck really implied quantisation is debated among historians of physics. We present a simplified account of this debate which also sheds light on the issue of indistinguishability and Einstein's light quantum hypothesis. We suggest that the teaching of quantum mechanics could benefit from including this material beyond the question of historical accuracy.

Krause, Decio (2017) Do `classical' space and time confer identity to quantum particles? [Preprint]

Authors: Steven B. Giddings

Quantum modifications to black holes on scales comparable to the horizon size, or even more radical physics, are apparently needed to reconcile the existence of black holes with the principles of quantum mechanics. This piece gives an overview of some possible observational tests for such departures from a classical description of black holes, via gravitational wave detection and very long baseline interferometry. (Invited comment for Nature Astronomy.)

Authors: Durmus Demir

It is shown that gravity can be incorporated into the Standard Model (SM) in a way solving the hierarchy problem. For this, the SM effective action in flat spacetime is adapted to curved spacetime via not only the general covariance but also the gauge invariance. For the latter, gauge field hard masses, induced by loops at the UV scale $\Lambda$, are dispelled by construing $\Lambda$ as the constant value assigned to curvature. This gives way to an unprecedented mechanism for incorporating gravity into the SM in that the hierarchy problem is solved by transmutation of the Higgs boson $\Lambda^2$--mass into the Higgs-curvature coupling, and the cosmological constant problem is alleviated by metamorphosis of the vacuum $\Lambda^4$--energy into the Einstein-Hilbert term. Gravity emerges correctly if the SM is accompanied by a secluded dark sector sourcing non-interacting dark matter, dark energy and dark radiation. Physics beyond the SM, containing Higgs-phobic scalars that resolve the strong CP problem, flavor problem, baryogenesis and inflation, respects the hierarchy. Majorana neutrinos are naturally incorporated if $\Lambda$ lies at the see-saw scale. This mechanism, in general, leaves no compelling reason to anticipate new particles at the LHC or higher-energy colliders.

Authors: Otto C. W. Kong (Nat'l Central U, Taiwan)

The physical world is quantum. However, our description of quantum physics still relies heavily on concepts from classical physics, in some cases with `quantized' interpretations. The most important example is that of spacetime. We examine the picture of physical space as described by simple, so-called non-relativistic, quantum mechanics instead of assuming the Newtonian model. The key perspective is that of (relativity) symmetry representation, and the idea that physical space is to be identified as the configuration space for a free particle. Parallel to the case of the phase space, we have a model of the quantum physical space which reduces to the Newtonian classical model in the classical limit. The latter is to be obtained as a contraction limit of the relativity symmetry.

Authors: Liane Gabora, Kirsty Kitto

This paper proposes that cognitive humor can be modeled using the mathematical framework of quantum theory. We begin with brief overviews of both research on humor and the generalized quantum framework. We show how the bisociation of incongruous frames or word meanings in jokes can be modeled as a linear superposition of a set of basis states, or possible interpretations, in a complex Hilbert space. The choice of possible interpretations depends on the context provided by the set-up vs. the punchline of a joke. We apply the approach to a verbal pun, and consider how it might be extended to frame blending. An initial study that made use of the Law of Total Probability, involving 85 participant responses to 35 jokes (as well as variants), suggests that the Quantum Theory of Humor (QTH) proposed here provides a viable new approach to modeling humor.
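
The superposition-plus-context idea can be made concrete in a two-dimensional toy model: the two incongruous interpretations are basis states, the set-up prepares a superposition, and the punchline acts as a measurement in a rotated, context-dependent basis. The numbers below are illustrative and are not the paper's fitted values.

```python
# Toy model of a pun: one state, context-dependent measurement statistics.
import numpy as np

A = np.array([1.0, 0.0])          # interpretation 1 (e.g. the literal reading)
B = np.array([0.0, 1.0])          # interpretation 2 (the incongruous reading)
psi = (A + B) / np.sqrt(2)        # state prepared by the joke's set-up

def funny_probability(theta):
    """Probability of the 'funny' response when the punchline selects the
    basis state cos(theta)|A> + sin(theta)|B>."""
    context = np.cos(theta) * A + np.sin(theta) * B
    return float(np.dot(context, psi)) ** 2

# The same state yields different response statistics in different contexts.
for theta in (0.0, np.pi / 4, np.pi / 2):
    print(f"theta = {theta:.2f}: p(funny) = {funny_probability(theta):.3f}")
```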

Authors: Florian Fröwis, Peter C. Strassmann, Alexey Tiranov, Corentin Gut, Jonathan Lavoie, Nicolas Brunner, Félix Bussières, Mikael Afzelius, Nicolas Gisin

Quantum theory predicts that entanglement can persist even in macroscopic physical systems, although difficulties in demonstrating it experimentally remain. Recently, significant progress has been achieved and genuine entanglement between up to 2900 atoms was reported. Here we demonstrate 16 million genuinely entangled atoms in a solid-state quantum memory prepared by the heralded absorption of a single photon. We develop an entanglement witness for quantifying the number of genuinely entangled particles based on the collective effect of directed emission combined with the nonclassical nature of the emitted light. The method is applicable to a wide range of physical systems and is effective even in situations with significant losses. Our results clarify the role of multipartite entanglement in ensemble-based quantum memories as a necessary prerequisite to achieve a high single-photon process fidelity crucial for future quantum networks. On a more fundamental level, our results reveal the robustness of certain classes of multipartite entangled states, contrary to, e.g., Schrödinger-cat states, and show that the depth of entanglement can be experimentally certified at unprecedented scales.

Evans, Peter W. and Shrapnel, Sally (2017) The Two Sides of Interventionist Causation. [Preprint]
Holik, Federico and Fortin, Sebastian and Bosyk, Gustavo and Plastino, Angelo (2017) On the interpretation of probabilities in generalized probabilistic models. [Preprint]
Fortin, Sebastian and Lombardi, Olimpia and Martínez, Juan Camilo (2017) The relationship between chemistry and physics from the perspective of Bohmian mechanics. [Preprint]
Dawid, Richard (2017) Scientific Realism and High Energy Physics. [Preprint]

Authors: Sabine Hossenfelder, Tobias Zingg

Analogue gravity is based on a mathematical identity between quantum field theory in curved space-time and the propagation of perturbations in certain condensed matter systems. But not every curved space-time can be simulated in such a way, because one not only needs a condensed matter system that generates the desired metric tensor; that system also has to obey its own equations of motion. Specifying the metric tensor that one wishes to realize usually overdetermines the underlying condensed matter system, so that its equations of motion are in general not fulfilled, in which case the desired metric does not have an analogue.

Here, we show that the class of metrics that have an analogue is bigger than what a first cursory consideration might suggest. This is due to the analogue metric only being defined up to a choice of parametrization of the perturbation in the underlying condensed matter system. In this way, the class of analogue gravity models can be vastly expanded. In particular, we demonstrate how this freedom of choice can be used to insert an intermediary conformal factor. Then, as a corollary, we find that any metric conformal to a Painlev\'e--Gullstrand type line element can, potentially, result as an analogue of a perturbation propagating in a non-viscous, barotropic fluid.
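
For reference, the standard acoustic line element for sound in a non-viscous, barotropic, irrotational fluid with density $\rho$, flow velocity $\mathbf{v}$ and sound speed $c_{s}$ (the Unruh-Visser construction that the corollary refers to) already carries an overall conformal factor of exactly the kind exploited here:

```latex
% Acoustic metric: conformal to a Painlevé--Gullstrand type line element,
% with conformal factor rho/c_s in front.
\begin{equation}
  \mathrm{d}s^{2}
  = \frac{\rho}{c_{s}}
    \Bigl[ -\bigl(c_{s}^{2} - v^{2}\bigr)\,\mathrm{d}t^{2}
           - 2\,\mathbf{v}\cdot\mathrm{d}\mathbf{x}\,\mathrm{d}t
           + \mathrm{d}\mathbf{x}\cdot\mathrm{d}\mathbf{x} \Bigr]
\end{equation}
```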

Abstract

Englert et al. (Zeitschrift für Naturforschung, 47a, 1175–1186, 1992) claim that, in certain circumstances, the Bohmian trajectory of a test particle does not match the reports of which-path detectors, concluding that the Bohmian trajectories are not real, but "surrealistic." However, Hiley and Callaghan (Physica Scripta, 74, 336–348, 2006) argue that, if Bohm's interpretation is correctly applied, no such mismatch is ever sanctioned. Unfortunately, the debate was never settled, since nobody showed where the source of disagreement resided. In this paper, I reassess the debate over such "surrealistic" trajectories and derive both a necessary and a sufficient condition for there to be a mismatch between the Bohmian trajectories and the reports of which-path detectors. I conclude that the mismatch is possible as a matter of principle, but can be ruled out in practice. I explore in depth the philosophical consequences of such a mismatch, arguing that it does not render realism about the Bohmian trajectories untenable. In addition, I show that the opposing conclusion of Hiley and Callaghan is due to the fact that they assume a set of trajectories that are incompatible with the postulates of Bohmian mechanics.

Norton, John D. (2017) How to Build an Infinite Lottery Machine. [Preprint]
Szabo, Laszlo E. (2017) Meaning, Truth, and Physics. In G. Hofer-Szabó, L. Wronski (eds.) Making it Formally Explicit.

Author(s): Ying Li, Andrew M. Steane, Daniel Bedingham, and G. Andrew D. Briggs

Continuous spontaneous localization (CSL) is a model that captures the effects of a class of extensions to quantum theory which are expected to result from quantum gravity and is such that wave-function collapse is a physical process. The rate of such a process could be very much lower than the uppe…


[Phys. Rev. A 95, 032112] Published Mon Mar 13, 2017

Author(s): Q. Duprey and A. Matzkin

Nondestructive weak measurements (WMs) made on a quantum particle are useful in order to extract information as the particle evolves from a prepared state to a finally detected state. The physical meaning of this information has been open to debate, particularly in view of the apparent discontinuous…


[Phys. Rev. A 95, 032110] Published Mon Mar 13, 2017

Authors: T. P. Singh

We highlight three conflicts between quantum theory and classical general relativity, which make it implausible that a quantum theory of gravity can be arrived at by quantising classical gravity. These conflicts are: quantum nonlocality and space-time structure; the problem of time in quantum theory; and the quantum measurement problem. We explain how these three aspects bear on each other, and how they point towards an underlying noncommutative geometry of space-time.

Authors: Patrick P. Hofer, Jonatan Bohr Brask, Martí Perarnau-Llobet, Nicolas Brunner

We propose the use of a quantum thermal machine for low-temperature thermometry. A hot thermal reservoir coupled to the machine allows for simultaneously cooling the sample while determining its temperature without knowing the model-dependent coupling constants. In its simplest form, the proposed scheme works for all thermal machines which perform at Otto efficiency and can reach Carnot efficiency. We consider a circuit QED implementation which allows for precise thermometry down to $\sim$15 mK with realistic parameters. Based on the quantum Fisher information, this is close to the optimal achievable performance.
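
The Cramér-Rao logic behind such precision claims can be sketched for the simplest probe, a two-level system thermalised at temperature $T$: the Fisher information of a population measurement bounds the achievable uncertainty via $\delta T \ge 1/\sqrt{\nu F(T)}$ for $\nu$ repetitions. This generic example (units with $k_B = \hbar = 1$, illustrative numbers) is not the paper's circuit QED analysis.

```python
# Fisher information about T carried by the thermal population of a
# two-level probe with energy gap 'gap': p = 1/(1 + exp(gap/T)).
import numpy as np

def fisher_information(gap, T):
    p = 1.0 / (1.0 + np.exp(gap / T))
    dp_dT = (gap / T**2) * p * (1.0 - p)   # derivative of p w.r.t. T
    return dp_dT**2 / (p * (1.0 - p))

T = 0.015                                  # illustrative temperature
gaps = np.linspace(0.001, 0.2, 2000)
F = fisher_information(gaps, T)
print(f"optimal gap/T ~ {gaps[np.argmax(F)] / T:.2f}")        # ~2.40
print(f"single-shot bound: delta_T >= {1 / np.sqrt(F.max()):.2e}")
```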

Authors: George Japaridze, Dipendra Pokhrel, Xiao-Qian Wang

PT-symmetric quantum mechanics, the extension of conventional quantum mechanics to non-Hermitian Hamiltonians invariant under the combined parity (P) and time-reversal (T) symmetry, has been successfully applied to a variety of fields such as solid state physics, mathematical physics, optics, and quantum field theory. Recently, the extension of PT-symmetric theory to entangled quantum systems was challenged in that the PT formulation within the conventional Hilbert space violates the no-signaling principle. Here, we revisit the derivation of the no-signaling principle in the framework of the PT inner-product prescription. Our results preserve the no-signaling principle for a two-qubit system, reaffirm the invariance of the entanglement, and reproduce the Clauser-Horne-Shimony-Holt (CHSH) inequality. We conclude that PT-symmetric quantum mechanics satisfies the requirements for a fundamental theory and provides a consistent description of quantum systems.

Abstract

I consider a quantum system that possesses key features of quantum shape dynamics and show that the evolution of wave-packets will become increasingly classical at late times and tend to evolve more and more like an expanding classical system. At early times, however, semiclassical effects become large and lead to an exponential mismatch of the apparent scale as compared to the expected classical evolution of the scale degree of freedom. This quantum inflation of an emergent and effectively classical system occurs naturally in the quantum shape dynamics description of the system, while it is unclear whether and how it might arise in a constrained Hamiltonian quantization.

Cuffaro, Michael E. (2016) Reconsidering No-Go Theorems from a Practical Perspective. [Preprint]

Authors: Erhard Scholz

Weyl's original scale geometry of 1918 ("purely infinitesimal geometry") was withdrawn by its author from physical theorizing in the early 1920s. It had a comeback in the last third of the 20th century in different contexts: scalar tensor theories of gravity, foundations of gravity, foundations of quantum mechanics, elementary particle physics, and cosmology. It seems that Weyl geometry continues to offer an open research potential for the foundations of physics even after the turn to the new millennium.

Abstract

Wood and Spekkens ([2015]) argue that any causal model explaining the EPRB correlations and satisfying the no-signalling constraint must also violate the assumption that the model faithfully reproduces the statistical dependences and independences, a so-called fine-tuning of the causal parameters. This includes, in particular, retrocausal explanations of the EPRB correlations. I consider this analysis with a view to enumerating the possible responses an advocate of retrocausal explanations might propose. I focus on the response of Näger ([2016]), who argues that the central ideas of causal explanations can be saved if one accepts the possibility of a stable fine-tuning of the causal parameters. I argue that, in light of this view, a violation of faithfulness does not necessarily rule out retrocausal explanations of the EPRB correlations. However, when we consider a plausible retrocausal picture in some detail, it becomes clear that the causal modelling framework is not a natural arena for representing such an account of retrocausality.

Authors: Q. Duprey, S. Kanjilal, U. Sinha, D. Home, A. Matzkin

The Quantum Cheshire Cat [New J. Phys. 15, 113015, 2013] (QCC) is an effect defined within the Weak Measurements framework by which a property of a quantum particle appears to be spatially separated from its position. The status of this effect has, however, remained unclear, as claims of experimental observation of the QCC have been disputed by strong criticism of both the experimental and the theoretical aspects of the effect. In this paper we clarify in what precise sense the QCC can be regarded as an unambiguous consequence of the standard quantum mechanical formalism applied to describe quantum pointers weakly coupled to a system. In light of this clarification, the criticisms raised against the QCC effect are rebutted. We further point out that the limitations of the experiments performed to date imply that a loophole-free experimental demonstration of the QCC has not yet been achieved.

Authors: Zichang He, Wen Jiang

Categorization is necessary for many decision-making tasks. However, the categorization process may interfere with the decision-making result, and the law of total probability can be violated in some situations. To predict the interference effect of categorization, models based on quantum probability have been proposed. In this paper, a new quantum dynamic belief (QDB) model is proposed. Since a precise decision may not be made during the process, the concept of uncertainty is introduced in our model to simulate the real human thinking process. The interference effect of categorization can then be predicted by handling the uncertain information. The proposed model is applied to a categorization decision-making experiment to explain the interference effect of categorization. Compared with other models, our model is relatively more succinct, and the results show its correctness and effectiveness.
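
The interference mechanism that such quantum models exploit can be shown in a few lines: if the decision passes through a superposition of category amplitudes, the action probability acquires an interference term that the classical law of total probability lacks. The amplitudes below are illustrative and do not reproduce the QDB model itself.

```python
# Classical law of total probability vs. quantum amplitude addition.
import numpy as np

a_cat = np.array([np.sqrt(0.6), np.sqrt(0.4)])    # |.|^2 = p(A), p(B)
a_act = np.array([np.sqrt(0.7),                   # |.|^2 = p(act|A)
                  np.sqrt(0.3) * np.exp(2.0j)])   # p(act|B), relative phase

# Classical: p(act) = p(A) p(act|A) + p(B) p(act|B).
p_classical = sum(abs(a_cat[i])**2 * abs(a_act[i])**2 for i in range(2))

# Quantum: add amplitudes first, then square.
p_quantum = abs(np.sum(a_cat * a_act))**2

print(f"classical: {p_classical:.3f}   quantum: {p_quantum:.3f}")
print(f"interference term: {p_quantum - p_classical:+.3f}")
```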

Author(s): Michael Zwolak and Wojciech H. Zurek

The objective, classical world emerges from the underlying quantum substrate via the proliferation of redundant copies of selected information into the environment, which acts as a communication channel, transmitting that information to observers. These copies are independently accessible, allowing …


[Phys. Rev. A 95, 030101(R)] Published Wed Mar 08, 2017

Publication date: 18 April 2017
Source: Physics Letters A, Volume 381, Issue 15
Author(s): Mariami Gachechiladze, Otfried Gühne
In a paper by Popescu and Rohrlich [1], a proof was presented showing that any pure entangled multiparticle quantum state violates some Bell inequality. We point out a gap in this proof, but we also give a construction to close this gap. It turns out that, with some extra effort, all the results from the aforementioned publication can be proven. Our construction shows how two-particle entanglement can be generated via performing local projections on a multiparticle state.
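
A minimal numerical illustration of the projection idea (the standard GHZ example, not the paper's general construction): projecting the third qubit of a three-qubit GHZ state onto $|+\rangle$ leaves the first two qubits in a maximally entangled Bell state.

```python
# Local projection on a GHZ state generates two-particle entanglement.
import numpy as np

ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)            # (|000> + |111>)/sqrt(2)
ghz = ghz.reshape(2, 2, 2)

plus = np.array([1.0, 1.0]) / np.sqrt(2)    # |+> on qubit 3
post = np.tensordot(ghz, plus, axes=([2], [0]))   # <+|_3 GHZ, unnormalised
post = post / np.linalg.norm(post)

bell = np.zeros((2, 2))
bell[0, 0] = bell[1, 1] = 1 / np.sqrt(2)    # (|00> + |11>)/sqrt(2)
print(np.allclose(post, bell))              # True
```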

Authors: William G. Unruh, Robert M. Wald

The complete gravitational collapse of a body in general relativity will result in the formation of a black hole. Although the black hole is classically stable, quantum particle creation processes will result in the emission of Hawking radiation to infinity and corresponding mass loss of the black hole, eventually resulting in the complete evaporation of the black hole. Semiclassical arguments strongly suggest that, in the process of black hole formation and evaporation, a pure quantum state will evolve to a mixed state, i.e., there will be "information loss." There has been considerable controversy over this issue for more than 40 years. In this review, we present the arguments in favor of information loss, and analyze some of the counter-arguments and alternative possibilities.

Authors: Job Feldbrugge, Jean-Luc Lehners, Neil Turok

We argue that the Lorentzian path integral is a better starting point for quantum cosmology than the Euclidean version. In particular, we revisit the mini-superspace calculation of the Feynman path integral for quantum gravity with a positive cosmological constant. Instead of rotating to Euclidean time, we deform the contour of integration over metrics into the complex plane, exploiting Picard-Lefschetz theory to transform the path integral from a conditionally convergent integral into an absolutely convergent one. We show that this procedure unambiguously determines which semiclassical saddle point solutions are relevant to the quantum mechanical amplitude. Imposing "no-boundary" initial conditions, i.e., restricting attention to regular, complex metrics with no initial boundary, we find that the dominant saddle contributes a semiclassical exponential factor which is precisely the {\it inverse} of the famous Hartle-Hawking result.
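
The contour-deformation step can be illustrated with the simplest conditionally convergent oscillatory integral, the Fresnel integral: on its steepest-descent (Lefschetz) thimble $x = e^{i\pi/4}u$ the integrand decays like a Gaussian and the integral becomes absolutely convergent. This is a textbook warm-up, not the mini-superspace computation itself.

```latex
\begin{equation}
  \int_{-\infty}^{\infty} e^{\,i x^{2}}\,\mathrm{d}x
  \;=\; e^{i\pi/4}\int_{-\infty}^{\infty} e^{-u^{2}}\,\mathrm{d}u
  \;=\; \sqrt{\pi}\,e^{i\pi/4}.
\end{equation}
```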

Authors: Alain Connes

We give a survey of our joint ongoing work with Ali Chamseddine, Slava Mukhanov and Walter van Suijlekom. We show how a problem purely motivated by "how geometry emerges from the quantum formalism" gives rise to a slightly noncommutative structure and a spectral model of gravity coupled with matter which fits with experimental knowledge. This text will appear as a contribution to the volume: "Foundations of Mathematics and Physics one century after Hilbert". Editor: Joseph Kouneiher. Collection Mathematical Physics, Springer 2017

Abstract

Quantum violation of Bell inequalities is now used in many quantum information applications and it is important to analyze it both quantitatively and conceptually. In the present paper, we analyze violation of multipartite Bell inequalities via the local probability model—the LqHV (local quasi hidden variable) model (Loubenets in J Math Phys 53:022201, 2012), incorporating the LHV model only as a particular case and correctly reproducing the probabilistic description of every quantum correlation scenario, more generally, every nonsignaling scenario. The LqHV probability framework allows us to construct nonsignaling analogs of Bell inequalities and to specify parameters quantifying violation of Bell inequalities—Bell’s nonlocality—in a general nonsignaling case. For quantum correlation scenarios on an N-qudit state, we evaluate these nonlocality parameters analytically in terms of dilation characteristics of an N-qudit state and also, numerically—in d and N. In view of our rigorous mathematical description of Bell’s nonlocality in a general nonsignaling case via the local probability model, we argue that violation of Bell inequalities in a quantum case is not due to violation of the Einstein–Podolsky–Rosen (EPR) locality conjectured by Bell but due to the improper HV modelling of “quantum realism”.
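
For orientation, the familiar quantitative benchmark for bipartite violation is easily reproduced: evaluating the CHSH combination on the singlet state at the standard optimal angles gives the Tsirelson value $2\sqrt{2}$, above the LHV bound of 2. This textbook calculation is included as context and is not the LqHV analysis itself.

```python
# CHSH on the singlet: E(a, b) = -cos(a - b) for measurement angles a, b.
import numpy as np

def E(a, b):
    return -np.cos(a - b)

a, ap = 0.0, np.pi / 2            # Alice's two settings
b, bp = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

S = E(a, b) + E(ap, b) + E(ap, bp) - E(a, bp)
print(abs(S), 2 * np.sqrt(2))     # 2.828... = Tsirelson value, > 2
```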

Hoehn, Philipp A (2017) Quantum theory from rules on information acquisition. In: UNSPECIFIED.
Christian, Joy (2017) Refutation of Richard Gill's Argument Against my Disproof of Bell's Theorem. [Preprint]

Author(s): Jonathan Maltz and Leonard Susskind

de Sitter space is shown to arise as a resonance in transition amplitudes in quantum gravity.


[Phys. Rev. Lett. 118, 101602] Published Tue Mar 07, 2017

Authors: D. Sokolovski

The Salecker-Wigner-Peres (SWP) clock is often used to determine the duration a quantum particle is supposed to spend in a specified region of space $\Omega$. By construction, the result is a real positive number, and the method seems to avoid the difficulty of introducing complex time parameters, which arises in the Feynman path approach. However, it tells very little about what is being learnt about the particle's motion. We investigate this matter further, and show that the SWP clock, like any other Larmor clock, correlates the rotation of its angular momentum with the durations $\tau$ that Feynman paths spend in $\Omega$, thereby destroying interference between different durations. An inaccurate, weakly coupled clock leaves the interference almost intact, and the need to resolve the resulting "which way?" problem is the main difficulty at the centre of the "tunnelling time" controversy. In the absence of a probability distribution for the values of $\tau$, the SWP results are expressed in terms of moduli of the "complex times", given by the weighted sums of the corresponding probability amplitudes. It is shown that over-interpretation of these results, by treating the SWP times as physical time intervals, leads to paradoxes and should be avoided. We analyse various settings of the SWP clock, different calibration procedures, and the relation between the SWP results and the quantum dwell time. Our general analysis is applied to the cases of stationary tunnelling and tunnel ionisation.

Authors: Ralf Blattmann, Klaus Mølmer

We study the fluctuations of the work performed on a driven quantum system, defined as the difference between subsequent measurements of energy eigenvalues. These work fluctuations are governed by statistical theorems with similar expressions in classical and quantum physics. In this article we show that we can distinguish quantum and classical work fluctuations, as the latter can be described by a macrorealistic theory and hence obey Leggett-Garg inequalities. We show that these inequalities are violated by quantum processes in a driven two-level system and in a harmonic oscillator subject to a squeezing transformation.
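
For the simplest case, the Leggett-Garg test reduces to a one-line computation: with dichotomic $\sigma_z$ measurements at three equally spaced times on a two-level system under Rabi driving at frequency $\omega$, the two-time correlators are $C_{ij} = \cos[\omega(t_j - t_i)]$ and the combination $K = C_{12} + C_{23} - C_{13}$ exceeds the macrorealistic bound of 1. This standard example is a hedged sketch, not the paper's work-fluctuation protocol.

```python
# Leggett-Garg combination K = 2 cos(w tau) - cos(2 w tau) for equally
# spaced sigma_z measurements on a Rabi-driven two-level system.
import numpy as np

omega = 1.0
tau = np.linspace(0.0, np.pi, 500)     # spacing between measurements
K = 2 * np.cos(omega * tau) - np.cos(2 * omega * tau)

print(f"max K = {K.max():.3f} at omega*tau = {tau[np.argmax(K)]:.3f}")
# -> max K = 1.5 at omega*tau = pi/3, violating the bound K <= 1.
```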

Authors: Ian T. Durham

According to quantum theory, randomness is a fundamental property of the universe yet classical physics is mostly deterministic. In this article I show that it is possible for deterministic systems to arise from random ones and discuss the implications of this for the concept of free will.

Tappenden, Paul (2017) Objective Probability and the Mind-Body Relation. [Preprint]
Myrvold, Wayne C. and Albert, David Z. and Callender, Craig and Ismael, Jenann (2016) Book Symposium: David Albert, After Physics. UNSPECIFIED.

Authors: Ämin Baumeler, Fabio Costa, Timothy C. Ralph, Stefan Wolf, Magdalena Zych

General relativity predicts the existence of closed time-like curves, along which a material object could travel back in time and interact with its past self. The natural question is whether this possibility leads to inconsistencies: Could the object interact in such a way to prevent its own time travel? If this is the case, self-consistency should forbid certain initial conditions from ever happening, a possibility at odds with the local nature of dynamical laws. Here we consider the most general deterministic dynamics connecting classical degrees of freedom defined on a set of bounded space-time regions, requiring that it is compatible with arbitrary operations performed in the local regions. We find that any such dynamics can be realised through reversible interactions. We further find that consistency with local operations is compatible with non-trivial time travel: Three parties can interact in such a way to be all both in the future and in the past of each other, while being free to perform arbitrary local operations. We briefly discuss the quantum extension of the formalism.

Authors: Qingdi Wang, Zhen Zhu, William G. Unruh

We investigate the gravitational property of the quantum vacuum by treating its large energy density predicted by quantum field theory seriously and assuming that it does gravitate to obey the equivalence principle of general relativity. We find that the quantum vacuum would gravitate differently from what people previously thought. The consequence of this difference is an accelerating universe with a small Hubble expansion rate $H\propto \Lambda e^{-\beta\sqrt{G}\Lambda}\to 0$ instead of the previous prediction $H=\sqrt{8\pi G\rho^{vac}/3}\propto\sqrt{G}\Lambda^2\to\infty$ which was unbounded, as the high energy cutoff $\Lambda$ is taken to infinity. It gives the observed slow rate of the accelerating expansion as $\Lambda$ is taken to be some large value of the order of Planck energy or higher. This result suggests that there is no necessity to introduce the cosmological constant, which is required to be fine tuned to an accuracy of $10^{-120}$, or other forms of dark energy, which are required to have peculiar negative pressure, to explain the observed accelerating expansion of the universe.

Pitts, J. Brian (2017) Equivalent Theories Redefine Hamiltonian Observables to Exhibit Change in General Relativity. Classical and Quantum Gravity, 34. 055008. ISSN 1361-6382, Print ISSN 0264-9381
Cuffaro, Michael E. (2014) On the Significance of the Gottesman-Knill Theorem. [Preprint]
Publication date: Available online 24 February 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Alexander S. Blum


Gyenis, Zalán and Rédei, Miklós (2017) Common cause completability of non-classical probability spaces. [Preprint]

Authors: H. D. Zeh

Time-asymmetric spacetime structures, in particular those representing black holes and the expansion of the universe, are intimately related to other arrows of time, such as the second law and the retardation of radiation. The nature of the quantum arrow, often attributed to a collapse of the wave function, is essential, in particular, for understanding the much discussed "black hole information loss paradox". However, this paradox assumes a new form and can possibly be avoided in a consistent causal treatment that may be able to avoid horizons and singularities. The master arrow that would combine all arrows of time does not have to be identified with a direction of the formal time parameter that serves to formulate the dynamics as a succession of global states (a trajectory in configuration or Hilbert space). It may even change direction with respect to a fundamental physical clock such as the cosmic expansion parameter if this was formally extended either into a future contraction era or to negative "pre-big-bang" values.

Authors: Andrei G Lebed

We have recently shown that both the passive and active gravitational masses of a composite body are not equivalent to its energy, due to some quantum effects. We have also suggested idealized and more realistic experiments to detect the above-mentioned inequivalence for a passive gravitational mass. The suggested idealized effect is as follows. A spacecraft moves protons of a macroscopic ensemble of hydrogen atoms with constant velocity in the Earth's gravitational field. Due to non-homogeneous squeezing of space by the field, the electron ground-state wave function experiences a time-dependent perturbation in each hydrogen atom. This perturbation results in the appearance of a finite probability for an electron to be excited to higher energy levels and to emit a photon. The experimental task is to detect such photons from the ensemble of the atoms. More realistic variants of such an experiment can be realized in solid crystals and nuclei, as first mentioned by us. In his recent Comment on our paper, Crowell has argued that the effect suggested by us contradicts the existing experiments and, in particular, astronomic data. We show here that this conclusion is incorrect and is based on the so-called "free fall" experiments, where our effect does not have to be observed.

Authors: Hans C. Ohanian

The collapse of a spatial probability distribution is triggered by a measurement at a given spacetime point. It is customarily assumed that this collapse occurs along an equal-time hypersurface, say, t = 0. However, such a naïve instantaneous collapse process is inconsistent with relativity, because the equal-time hypersurfaces of different inertial reference frames are different. The attempts at implementation of instantaneous collapse in several different reference frames then lead to violations of probability conservation and violations of the scalar character of the probability contained in given volume elements. This problem affects not only the Copenhagen interpretation of quantum mechanics, but also other interpretations in which it is still necessary to specify what changes in probabilities occur when and where in a manner consistent with relativistic spacetime geometry. In the 1980s Schlieder and Hellwig and Kraus proposed that collapse of the probability distribution along the past light cone of the measurement point avoids these difficulties and leads to a Lorentz-invariant collapse scenario. Their proposal received little attention and some negative criticisms. In this paper I argue that the proposed past-light cone collapse is not only reasonable, but is compelled by Lorentz invariance of probability conservation, and is equally valid for the spatial probability distributions in quantum mechanics and for those in a game of chance, for instance, the probability distribution for a game with playing cards scattered over some spatial region. I examine the objections that have been made to the past-light-cone collapse scenario and show that these objections are not valid. Finally, I propose two possible interferometer experiments that can serve as direct tests of past-light-cone collapse, one with an atom interferometer, and the other with a light interferometer.

Authors: Hervé Zwirn

It is well known that Wheeler proposed several delayed-choice experiments in order to show the impossibility of speaking of the way a quantum system behaves before being detected. In a double-slit experiment, when do photons decide to travel by one way or by two ways? Delayed-choice experiments seem to indicate that, strangely, it is possible to change the decision of the photons until the very last moment before they are detected. This led Wheeler to his famous sentence: "No elementary quantum phenomenon is a phenomenon until it is a registered phenomenon, brought to a close by an irreversible act of amplification." Nevertheless, some authors wrote that backward-in-time effects were needed to explain these results. I will show that in delayed-choice experiments involving only one particle, a simple explanation is possible without invoking any backward-in-time effect. Delayed-choice experiments involving entangled particles, such as the so-called quantum eraser, can also be explained without invoking any backward-in-time effect, but I will argue that these experiments cannot be accounted for so simply, because they raise the whole problem of knowing what a measurement and a collapse are. A previously presented interpretation, Convivial Solipsism, is a natural framework for giving a simple explanation of these delayed-choice experiments with entangled particles. In this paper, I show how Convivial Solipsism helps clarify the puzzling questions raised by the collapse of the wave function of entangled systems.

Authors: Jean Bricmont

The goal of this paper is to explain how the views of Albert Einstein, John Bell and others, about nonlocality and the conceptual issues raised by quantum mechanics, have been rather systematically misunderstood by the majority of physicists.

Authors: R. Tsekov

The quantum Liouville equation, which describes the phase-space dynamics of a quantum system, is analyzed from a statistical point of view as a particular example of the Kramers-Moyal expansion. An imaginary stochastic process is proposed as the origin of quantum mechanics. Quantum mechanics is extended to the relativistic case by generalizing the Wigner-Moyal equation, and an expression is derived for the relativistic mass in the Wigner quantum phase-space presentation. Diffusion with an imaginary diffusion coefficient is also discussed.
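
For reference, the one-dimensional Wigner-Moyal form of the quantum Liouville equation underlying this analysis is the standard expansion below: the $s = 0$ term reproduces the classical Liouville equation, and the higher odd derivatives of the potential $V$ supply the quantum, Kramers-Moyal-like corrections in powers of $\hbar^{2}$.

```latex
\begin{equation}
  \frac{\partial W}{\partial t}
  = -\frac{p}{m}\frac{\partial W}{\partial x}
    + \sum_{s=0}^{\infty}
      \frac{(-1)^{s}}{(2s+1)!}\left(\frac{\hbar}{2}\right)^{2s}
      \frac{\partial^{2s+1} V}{\partial x^{2s+1}}\,
      \frac{\partial^{2s+1} W}{\partial p^{2s+1}}
\end{equation}
```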

Authors: Tom Campbell, Houman Owhadi, Joe Sauvageau, David Watkinson

Can the hypothesis that reality is a simulation be tested? We investigate this question based on the assumption that if the system performing the simulation is finite (i.e. has limited resources), then to achieve low computational complexity, such a system would, as in a video game, render content (reality) only at the moment that information becomes available for observation by a player and not at the moment of detection by a machine (that would be part of the simulation and whose detection would also be part of the internal computation performed by the Virtual Reality server before rendering content to the player). Guided by this principle we describe conceptual wave/particle duality experiments aimed at testing the simulation hypothesis.

Abstract

A generalized Schrödinger equation containing correction terms to the classical kinetic energy has been derived in the complex vector space by considering an extended particle structure in stochastic electrodynamics with spin. The correction terms are obtained by considering the internal complex structure of the particle, which is a consequence of the stochastic average of particle oscillations in the zero-point field. Hence, the generalized Schrödinger equation may be called the stochastic Schrödinger equation. It is found that the second-order correction terms are similar to the corresponding relativistic corrections. When higher-order correction terms are neglected, the stochastic Schrödinger equation reduces to the normal Schrödinger equation. It is found that the Schrödinger equation contains an internal structure in disguise, which can be revealed in the form of an internal kinetic energy. The internal kinetic energy is found to be equal to the quantum potential obtained in the Madelung fluid theory or Bohm's statistical theory. In the rest frame of the particle, the stochastic Schrödinger equation reduces to a Dirac-type equation, and its Lorentz boost gives the Dirac equation. Finally, the relativistic Klein–Gordon equation is derived by squaring the stochastic Schrödinger equation. The theory elucidates a logical understanding of the classical approach to quantum mechanical foundations.

Lazarovici, Dustin (2016) Against Fields. [Preprint]

Authors: Michael Pretko

Recent work has established the existence of stable quantum phases of matter described by "higher spin" (symmetric tensor) gauge fields, which naturally couple to particles of restricted mobility, such as fractons. We focus on the minimal model, consisting of fractons coupled to an emergent graviton (massless spin-2 excitation), which we show serves as a toy model for emergent gravity. We begin to bridge the gap in understanding between the fields of fractons and gravity. First, we reformulate the fracton phenomenon in terms of the emergent center of mass quantum number, and we discuss how an emergent gravitational attraction arises from the principles of locality and conservation of center of mass. We show that, while an isolated fracton is immobile, fractons are endowed with finite inertia by the presence of a large-scale distribution of other fractons to serve as a bath for exchanging center of mass, in a concrete manifestation of Mach's principle. We treat the two-body problem, discussing how the fractonic and Newtonian limits arise under different conditions. We then recast the motion in terms of an appropriate geodesic principle. Our formalism provides suggestive hints that matter plays a fundamental role, not only in perturbing, but in creating the background space in which it propagates.

Authors: David Blanco

In recent decades, it has been understood that a wide variety of phenomena in quantum field theory (QFT) can be characterised using quantum information measures, such as the entanglement entropy of a state and the relative entropy between quantum states in the same Hilbert space. In this thesis, we use these and other tools from quantum information theory to study several interesting problems in quantum field theory. The topics analysed range from the study of the Aharonov-Bohm effect in QFT using entanglement entropy, to a check of the consistency of the Ryu-Takayanagi formula (proposed in the context of the AdS/CFT duality) using properties of relative entropy. We show that relative entropy can also be used to obtain interesting new quantum energy inequalities that constrain the spatial distribution of negative energy density.

Authors: Sumanta Chakraborty, Kinjalk Lochan

Black holes, initially regarded as intriguing geometric constructions of nature, have over time repeatedly produced surprises and challenges. Once described as merely exotic solutions of General Relativity, they have in modern times come to test our confidence in much of what we thought we knew about nature, and in the process have earned a fearsome reputation in some corners of theoretical physics. The most serious charge against black holes is that they swallow information, never to release it, and subsequently erase it. This runs squarely against principles held sacred in every other branch of fundamental science. This realization has shaken foundational concepts, in both quantum theory and gravity, that we had always taken for granted. Attempts to exonerate black holes of this charge have brought us into conflict with concepts held dear in quantum theory. The arena of the black hole's tussle with quantum theory has steadily grown, from the advent of Hawking radiation some four decades ago into the domain of quantum information theory in modern times, most recently crystallized in the form of the firewall puzzle. Do black holes really signal something sinister, or do they merely expose our comfort in ignoring the fundamental issues with which our modern theories seem to be plagued? In this review, we focus on issues pertaining to black hole evaporation, the development of the information loss paradox, its recent formulation, the leading debates, and promising directions in the community.

Authors: Luigi Seveso, Valerio Peri, Matteo G.A. Paris

Can quantum-mechanical particles propagating on a fixed spacetime background be approximated as test bodies satisfying the weak equivalence principle? We ultimately answer the question in the negative, but find that, when universality of free fall is assessed locally, a nontrivial agreement between quantum mechanics and the weak equivalence principle exists. Implications for mass sensing by quantum probes are discussed in some detail.

Authors: Rodger I. Thompson

The observed constraints on the variability of the proton to electron mass ratio $\mu$ and the fine structure constant $\alpha$ are used to establish constraints on the variability of the Quantum Chromodynamic Scale and a combination of the Higgs Vacuum Expectation Value and the Yukawa couplings. Further model dependent assumptions provide constraints on the Higgs VEV and the Yukawa couplings separately. A primary conclusion is that limits on the variability of dimensionless fundamental constants such as $\mu$ and $\alpha$ provide important constraints on the parameter space of new physics and cosmologies.
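Schematically, under the standard assumptions that $m_p$ scales with the QCD scale and $m_e = y_e v/\sqrt{2}$, a fractional variation of $\mu = m_p/m_e$ decomposes as

\[
\frac{\Delta\mu}{\mu} \;\simeq\; \frac{\Delta\Lambda_{\rm QCD}}{\Lambda_{\rm QCD}} \;-\; \frac{\Delta y_e}{y_e} \;-\; \frac{\Delta v}{v},
\]

which is why bounds on $\Delta\mu/\mu$ translate into joint constraints on the QCD scale, the Yukawa couplings, and the Higgs vacuum expectation value.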

Authors: Thibaut Josset

In quantum statistical mechanics, equilibrium states have been shown to be the typical states for a system that is entangled with its environment, suggesting a possible identification between thermodynamic and von Neumann entropies. In this paper, we investigate how the relaxation toward equilibrium is made possible through interactions that do not lead to significant exchange of energy, and argue for the validity of the second law of thermodynamics at the microscopic scale.

Authors: James M. Feagin, John S. Briggs

The precise connection between quantum wave functions and the underlying classical trajectories is often presented rather vaguely by practitioners of quantum mechanics. Here we demonstrate, with simple examples, that the imaging theorem (IT) based on the semiclassical propagator provides a precise connection. Wave functions are preserved out to macroscopic distances, but the variables, position and momentum, of these functions describe classical trajectories. We show that the IT, based on an overtly time-dependent picture, provides an alternative to standard scattering theory for comparing experimental results with theory.
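In its simplest free-particle form (an illustration of the scaling, not the paper's general statement), the imaging theorem says that at long times the position-space wave function is carried by the momentum-space one, with the arguments tied by the classical relation $\mathbf{p} = m\mathbf{r}/t$:

\[
\psi(\mathbf{r},t) \;\sim\; \left(\frac{m}{i\hbar t}\right)^{3/2} e^{\,i m r^{2}/2\hbar t}\;\tilde{\psi}\!\left(\frac{m\mathbf{r}}{t}\right),
\]

where $\tilde{\psi}$ is the initial momentum-space wave function.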

Abstract

Realists wanting to capture the facts of quantum entanglement in a metaphysical interpretation find themselves faced with several options: to grant some species of fundamental nonseparability, adopt holism, or (more radically) to view localized spacetime systems as ultimately reducible to a higher-dimensional entity, the quantum state or wave function. Those adopting the latter approach and hoping to view the macroscopic world as grounded in the quantum wave function face the macro-object problem. The challenge is to articulate the metaphysical relation obtaining between three-dimensional macro-objects and the wave function so that the latter may be seen in some sense as constituting the former. This paper distinguishes several strategies for doing so and defends one based on a notion of partial instantiation.

Dürr, Patrick and Ehmann, Alexander (2017) Probabilities in deBroglie-Bohm Theory: Towards a Stochastic Alternative. [Preprint]

Authors: Lorenzo Iorio

In the framework of the emergent gravity scenario by Verlinde, it was recently observed by Liu and Prokopec that, among other things, an anomalous pericenter precession would affect the orbital motion of a test particle orbiting an isolated central body. Here, it is shown that, if it were real, its expected magnitude for the inner planets of the Solar System would be at the same level as the present-day accuracy in constraining any possible deviations from their standard perihelion precessions, as inferred from long data records spanning roughly the last century. The most favorable situation for testing the Verlinde-type precession seems to occur for Mars. Indeed, according to recent versions of the EPM and INPOP planetary ephemerides, non-standard perihelion precessions, of whatever physical origin, larger than some $\approx 0.02-0.11$ milliarcseconds per century are not admissible, while the putative precession predicted by Liu and Prokopec amounts to $0.09$ milliarcseconds per century. Other potentially interesting astronomical and astrophysical scenarios, e.g., the Earth's LAGEOS II artificial satellite, the double pulsar system PSR J0737-3039A/B, and the S-stars orbiting the Supermassive Black Hole in Sgr A$^\ast$, are instead not viable because of the excessive smallness of the predicted effects for them.
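As a quick arithmetic reading of the quoted numbers (an illustration, not from the paper):

    # Predicted Verlinde-type perihelion precession for Mars (Liu & Prokopec)
    # versus the range of admissible anomalous precessions from the EPM/INPOP
    # ephemerides, all in milliarcseconds per century.
    predicted = 0.09
    bound_tight, bound_loose = 0.02, 0.11

    print(f"exceeds the tightest bound ({bound_tight}):", predicted > bound_tight)
    print(f"exceeds the loosest bound  ({bound_loose}):", predicted > bound_loose)
    # The prediction is already in tension with the tightest bound but not yet
    # excluded by the loosest one -- hence Mars is "most favorable" for a test.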

Authors: Ning Bao, ChunJun Cao, Sean M. Carroll, Liam McAllister

We consider cosmological evolution from the perspective of quantum information. We present a quantum circuit model for the expansion of a comoving region of space, in which initially-unentangled ancilla qubits become entangled as expansion proceeds. We apply this model to the comoving region that now coincides with our Hubble volume, taking the number of entangled degrees of freedom in this region to be proportional to the de Sitter entropy. The quantum circuit model is applicable for at most 140 $e$-folds of inflationary and post-inflationary expansion: we argue that no geometric description was possible before the time $t_1$ when our comoving region was one Planck length across, and contained one pair of entangled degrees of freedom. This approach could provide a framework for modeling the initial state of inflationary perturbations.
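A rough back-of-the-envelope check consistent with the quoted figure, assuming the comoving region grows from one Planck length $\ell_p$ to a de Sitter horizon of order $10^{61}\,\ell_p$:

\[
N \;\approx\; \ln\frac{R_{\rm dS}}{\ell_p} \;\approx\; \ln 10^{61} \;\approx\; 140 \ \text{$e$-folds}.
\]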

Authors: Gabriele Carcassi, Christine A. Aidala, David J. Baker, Lydia Bieri

The aim of this work is to show that particle mechanics, both classical and quantum, Hamiltonian and Lagrangian, can be derived from a few simple physical assumptions. Assuming deterministic and reversible time evolution will give us a dynamical system whose set of states forms a topological space and whose law of evolution is a self-homeomorphism. Assuming the system is infinitesimally reducible---specifying the state and the dynamics of the whole system is equivalent to giving the state and the dynamics of its infinitesimal parts---will give us a classical Hamiltonian system. Assuming the system is irreducible---specifying the state and the dynamics of the whole system tells us nothing about the state and the dynamics of its substructure---will give us a quantum Hamiltonian system. Assuming kinematic equivalence, that studying trajectories is equivalent to studying state evolution, will give us Lagrangian mechanics and limit the form of the Hamiltonian/Lagrangian to the one with scalar and vector potential forces.

Peacock, Kent A. (2014) Would Superluminal Influences Violate the Principle of Relativity? [Published Article or Volume]
Vervoort, Louis (2014) The Manipulability Account of Causation Applied to Typical Physical Systems. [Published Article or Volume]

Author(s): Mario Krenn, Armin Hochrainer, Mayukh Lahiri, and Anton Zeilinger

Quantum entanglement is one of the most prominent features of quantum mechanics and forms the basis of quantum information technologies. Here we present a novel method for the creation of quantum entanglement in multipartite and high-dimensional systems. The two ingredients are (i) superposition of …


[Phys. Rev. Lett. 118, 080401] Published Thu Feb 23, 2017

Author(s): Keren Li, Guofei Long, Hemant Katiyar, Tao Xin, Guanru Feng, Dawei Lu, and Raymond Laflamme

Superposition, arguably the most fundamental property of quantum mechanics, lies at the heart of quantum information science. However, how to create a superposition of any two unknown pure states remains a daunting challenge. Recently, it was proved that such a quantum protocol does not exist i…


[Phys. Rev. A 95, 022334] Published Thu Feb 23, 2017

Publication date: Available online 16 February 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Dennis Dieks
According to what has become a standard history of quantum mechanics, in 1932 von Neumann persuaded the physics community that hidden variables are impossible as a matter of principle, after which leading proponents of the Copenhagen interpretation put the situation to good use by arguing that the completeness of quantum mechanics was undeniable. This state of affairs lasted, so the story continues, until Bell in 1966 exposed von Neumann's proof as obviously wrong. The realization that von Neumann's proof was fallacious then rehabilitated hidden variables and made serious foundational research possible again. It is often added in recent accounts that von Neumann's error had been spotted almost immediately by Grete Hermann, but that her discovery was of no effect due to the dominant Copenhagen Zeitgeist. We shall attempt to tell a story that is more historically accurate and less ideologically charged. Most importantly, von Neumann never claimed to have shown the impossibility of hidden variables tout court, but argued that hidden-variable theories must possess a structure that deviates fundamentally from that of quantum mechanics. Both Hermann and Bell appear to have missed this point; moreover, both raised unjustified technical objections to the proof. Von Neumann's argument was basically that hidden-variables schemes must violate the “quantum principle” that physical quantities are to be represented by operators in a Hilbert space. As a consequence, hidden-variables schemes, though possible in principle, necessarily exhibit a certain kind of contextuality. As we shall illustrate, early reactions to Bohm's theory are in agreement with this account. Leading physicists pointed out that Bohm's theory has the strange feature that pre-existing particle properties do not generally reveal themselves in measurements, in accordance with von Neumann's result. They did not conclude that the “impossible was done” and that von Neumann had been shown wrong.

Publication date: Available online 15 February 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): O.J.E. Maroney
A quantum pre- and post-selection paradox involves making measurements at two separate times on a quantum system, and making inferences about the state of the system at an intermediate time, conditional upon the observed outcomes. The inferences lead to predictions about the results of measurements performed at the intermediate time, which have been well confirmed experimentally, but which nevertheless seem paradoxical when inferences about different intermediate measurements are combined. The three box paradox is the paradigm example of such an effect, where a ball is placed in one of three boxes and is shuffled between the boxes in between two measurements of its location. By conditionalising on the outcomes of those measurements, it is inferred that between the two measurements the ball would have been found with certainty in Box 1 and with certainty in Box 2, if either box had been opened on its own. Despite experimental confirmation of the predictions, and much discussion, it has remained unclear what exactly is supposed to be paradoxical, or what specifically is supposed to be quantum, about these effects. In this paper I identify precisely the conditions under which the quantum three box paradox occurs, and show that these conditions are the same as arise in the derivation of the Leggett–Garg Inequality, which is supposed to demonstrate the incompatibility of quantum theory with macroscopic realism. I will argue that, as in Leggett–Garg Inequality violations, the source of the effect actually lies in the disturbance introduced by the intermediate measurement, and that the quantum nature of the effect is that no classical model of measurement disturbance can reproduce the paradox.
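The certainties quoted above follow from the Aharonov–Bergmann–Lebowitz (ABL) rule applied to the standard three-box states; a minimal numerical check (an illustration, not from the paper):

    import numpy as np

    # Three-box paradox via the ABL rule, in the basis {|box1>, |box2>, |box3>}.
    # Pre-selected state |psi> and post-selected state |phi> (standard choice).
    psi = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
    phi = np.array([1.0, 1.0, -1.0]) / np.sqrt(3)

    def abl_prob(i):
        """Probability that the ball is found in box i at the intermediate
        time, conditional on both selections, if only box i is opened."""
        P = np.zeros((3, 3))
        P[i, i] = 1.0                                  # projector onto box i
        found = abs(phi.conj() @ P @ psi) ** 2
        not_found = abs(phi.conj() @ (np.eye(3) - P) @ psi) ** 2
        return found / (found + not_found)

    for i in range(3):
        print(f"P(found in box {i + 1}) = {abl_prob(i):.2f}")
    # Boxes 1 and 2 each give probability 1.00 -- the paradox described above.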

Author(s): L. Czekaj, M. Horodecki, P. Horodecki, and R. Horodecki

To explain the conceptual gap between classical, quantum, and other hypothetical descriptions of the world, several principles have been proposed. So far, none of these principles has explicitly included the uncertainty relation. Here we introduce an information content principle (ICP) which rep…


[Phys. Rev. A 95, 022119] Published Fri Feb 17, 2017

Abstract

A two-boundary quantum mechanics without a time-ordered causal structure is advocated as a consistent theory. The apparent causal structure of usual "near future" macroscopic phenomena is attributed to a cosmological asymmetry and to rules governing the transition from microscopic to macroscopic observations. Our interest is a heuristic understanding of the resulting macroscopic physics.

I argue that our judgements regarding the locally causal models that are compatible with a given constraint implicitly depend, in part, on the context of inquiry. It follows from this that certain quantum no-go theorems, which are particularly striking in the traditional foundational context, have no force when the context switches to a discussion of the physical systems we are capable of building with the aim of classically reproducing quantum statistics. I close with a general discussion of the possible implications of this for our understanding of the limits of classical description, and for our understanding of the fundamental aim of physical investigation. Contents: 1 Introduction; 2 No-Go Results (2.1 The CHSH inequality; 2.2 The GHZ equality); 3 Classically Simulating Quantum Statistics (3.1 GHZ statistics; 3.2 Singlet statistics); 4 What Is a Classical Computer Simulation?; 5 Comparing the All-or-Nothing GHZ with Statistical (In)equalities; 6 General Discussion; 7 Conclusion.
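As a concrete reference point for the CHSH discussion, here is a minimal sketch (my own illustration, assuming the standard singlet correlation function $E(a,b) = -\cos(a-b)$) of the quantum violation of the local bound:

    import numpy as np

    # CHSH quantity S = E(a0,b0) - E(a0,b1) + E(a1,b0) + E(a1,b1) for the
    # singlet state at the standard optimal measurement angles.
    def E(a, b):
        return -np.cos(a - b)  # singlet correlation function

    a0, a1 = 0.0, np.pi / 2
    b0, b1 = np.pi / 4, 3 * np.pi / 4

    S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
    print(f"|S| = {abs(S):.4f}  (local bound 2, Tsirelson bound {2 * np.sqrt(2):.4f})")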

Author(s): Eli Pollak

A quantum mechanical transition path time probability distribution is formulated and its properties are studied using a parabolic barrier potential model. The average transit time is well defined and readily calculated. It is smaller than the analogous classical mechanical average transit time, vani…


[Phys. Rev. Lett. 118, 070401] Published Wed Feb 15, 2017

Abstract

Svensson (Found Phys 45: 1645, 2015) argued that the concept of the weak value of an observable of a pre- and post-selected quantum system cannot be applied when the expectation value of the observable in the initial state vanishes. Svensson’s argument is analyzed and shown to be inconsistent using several examples.
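For context, the weak value of an observable $A$ for pre-selected $|\psi\rangle$ and post-selected $|\phi\rangle$ is $A_w = \langle\phi|A|\psi\rangle/\langle\phi|\psi\rangle$; the following minimal qubit sketch (an illustration, not from either paper) shows it remaining perfectly well defined when the initial expectation value vanishes:

    import numpy as np

    # Weak value A_w = <phi|A|psi> / <phi|psi> for a qubit, with an initial
    # state in which the expectation value of A vanishes.
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)         # |+>, so <sz> = 0
    phi = np.array([np.cos(0.1), np.sin(0.1)], dtype=complex)  # post-selection

    expectation = np.real(psi.conj() @ sz @ psi)
    weak_value = complex((phi.conj() @ sz @ psi) / (phi.conj() @ psi))
    print(f"<psi|sz|psi> = {expectation:.3f}")   # 0.000
    print(f"weak value   = {weak_value:.3f}")    # finite and well defined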

Author(s): Martí Perarnau-Llobet, Elisa Bäumer, Karen V. Hovhannisyan, Marcus Huber, and Antonio Acin

The operator of work in quantum thermodynamics is incompatible with quantum mechanics, which is why the correspondence principle has to be critically examined whenever work is involved.


[Phys. Rev. Lett. 118, 070601] Published Tue Feb 14, 2017