Latest Papers on Quantum Foundations - Updated Daily by IJQF

Authors: Juan Cayuso, Néstor Ortiz, Luis Lehner

The question of which gravitational theory could supersede General Relativity has been central to theoretical physics for decades. Many disparate alternatives have been proposed, motivated by cosmology, quantum gravity and phenomenological considerations, and have been subjected to tests derived from cosmological, solar-system and pulsar observations, typically restricted to linearized regimes. Gravitational waves from compact binaries provide new opportunities to probe these theories in the strongly gravitating/highly dynamical regime. To this end, however, a reliable understanding of the dynamics in such a regime is required. Unfortunately, most of these theories fail to define well-posed initial value problems, which at face value prevents them from meeting this challenge. In this work, we introduce a consistent program able to remedy this situation. The program is inspired by the approach to "fixing" viscous relativistic hydrodynamics introduced by Israel and Stewart in the late 1970s. We illustrate how to implement this approach to control undesirable effects of higher-order derivatives in gravity theories and argue that the modified system still captures the true dynamics of the putative underlying theories in 3+1 dimensions. We sketch the implementation of this idea in a couple of effective theories of gravity, one in the context of non-commutative geometry and one in the context of Chern-Simons modified General Relativity.
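For orientation, the Israel-Stewart-style "fixing" mentioned above can be sketched schematically as follows (a generic illustration in my own notation, not the authors' specific equations): a field equation containing problematic higher-derivative terms, written symbolically as $G_{ab} = \epsilon\, H_{ab}[g,\partial g,\partial^2 g,\dots]$, is replaced by a system in which the higher-derivative source is promoted to an independent field $\Pi_{ab}$ that relaxes toward its target value on a timescale $\tau$:

$$ G_{ab} = \epsilon\, \Pi_{ab}, \qquad \tau\, \partial_t \Pi_{ab} = -\left(\Pi_{ab} - H_{ab}\right). $$

In the limit $\tau \to 0$ one recovers $\Pi_{ab} \to H_{ab}$ and hence the original dynamics, while for finite $\tau$ the evolution avoids the problematic higher time derivatives, in close analogy with how Israel and Stewart replace the algebraic Navier-Stokes relation for the viscous stress by a relaxation equation.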

Authors: Andrew Strominger

We argue that four-dimensional black hole evaporation inevitably produces an infinite number of soft particles in addition to the thermally distributed `hard' Hawking quanta, and moreover that the soft and hard particles are highly correlated. This raises the possibility that quantum purity is restored by correlations between the hard and soft radiation, while inclusive measurements which omit the soft radiation observe the thermal Hawking spectrum. In theories whose only stable particle is the graviton, conservation laws are used to argue that such correlations are in principle sufficient for the soft gravitons to purify the hard thermal ones.

Authors: Matthias Lienert, Roderich Tumulka

Suppose that particle detectors are placed along a Cauchy surface $\Sigma$ in Minkowski space-time, and consider a quantum theory with fixed or variable number of particles (i.e., using Fock space or a subspace thereof). It is straightforward to guess what Born's rule should look like for this setting: The probability distribution of the detected configuration on $\Sigma$ has density $|\psi_\Sigma|^2$, where $\psi_\Sigma$ is a suitable wave function on $\Sigma$, and the operation $|\cdot|^2$ is suitably interpreted. We call this statement the "curved Born rule." Since in any one Lorentz frame, the appropriate measurement postulates referring to constant-$t$ hyperplanes should determine the probabilities of the outcomes of any conceivable experiment, they should also imply the curved Born rule. This is what we are concerned with here: deriving Born's rule for $\Sigma$ from Born's rule in one Lorentz frame (along with a collapse rule). We describe two ways of defining an idealized detection process, and prove for one of them that the probability distribution coincides with $|\psi_\Sigma|^2$. For this result, we need two hypotheses on the time evolution: that there is no interaction faster than light, and that there is no creation of particles from the Fock vacuum. The wave function $\psi_\Sigma$ can be obtained from the Tomonaga--Schwinger equation, or from a multi-time wave function by inserting configurations on $\Sigma$. Thus, our result establishes in particular how multi-time wave functions are related to detection probabilities.

Abstract

In this paper, I will argue that metaphysicians ought to utilize quantum theories of gravity (QG) as incubators for a future metaphysics. I will argue why this ought to be done and will present case studies from the history of science where physical theories have challenged both the dogmatic and the speculative metaphysician. I present two theories of QG and demonstrate the challenge they pose to certain aspects of our current metaphysics; in particular, how they challenge our understanding of the abstract–concrete distinction. I demonstrate how five different accounts of the distinction each fail to hold under the received interpretations of loop quantum gravity and string theory. The central goal of this paper is to encourage metaphysicians to look to physical theories, especially those involving cosmology such as string theory and loop quantum gravity, when doing metaphysics.

Sanders, Ko (2015) What can (mathematical) categories tell us about space-time?

Publication date: Available online 22 June 2017
Source: Physics Reports
Author(s): Luigi Delle Site, Matej Praprotnik
Typical experimental setups for molecular systems must deal with a certain coupling to the external environment, that is, the system is open and exchanges mass, momentum, and energy with its surroundings. Instead, standard molecular simulations are mostly performed using periodic boundary conditions with a constant number of molecules. In this review, we summarize major developments of open simulation methodologies, which, contrary to standard techniques, open up the boundaries of a molecular system and allow for exchange of energy and matter with the environment, in and out of equilibrium. In particular, we construct the review around the open simulation approaches based on the Adaptive Resolution Scheme (AdResS), which seamlessly couples different levels of resolution in molecular simulations. Ideas and theoretical concepts used in its development lie at the crossroads of different fields and disciplines and open many different directions for future developments in molecular simulation. We examine progress related to theoretical as well as novel modeling approaches bridging length scales from the quantum to the continuum description and report on their application in various molecular systems. The outlook of the review is dedicated to the perspective of how to further incorporate rigorous theoretical approaches, such as the Bergmann-Lebowitz and Emch-Sewell models, into molecular simulation algorithms and stimulate further development of open simulation methods and their applications.

Menon, Tushar and Moller-Nielsen, Thomas and Read, James (2017) Regarding `Regarding the `Hole Argument''. [Preprint]
Hoehn, Philipp (2017) Reflections on the information paradigm in quantum and gravitational physics. [Preprint]

Huw Price has proposed an argument suggesting that a time symmetric ontology for quantum theory must necessarily be retrocausal, i.e., it must involve influences that travel backwards in time. One of Price's assumptions is that the quantum state is a state of reality. However, one of the reasons for exploring retrocausality is that it offers the potential for evading the consequences of no-go theorems, including recent proofs of the reality of the quantum state. Here, we show that this assumption can be replaced by a different assumption, called $\lambda$-mediation, that plausibly holds independently of the status of the quantum state. We also reformulate the other assumptions behind the argument to place them in a more general framework and pin down the notion of time symmetry involved more precisely. We show that our assumptions imply a timelike analogue of Bell's local causality criterion and, in doing so, give a new interpretation of timelike violations of Bell inequalities. Namely, they show the impossibility of a (non-retrocausal) time symmetric ontology.

Bacelar Valente, Mario (2017) Conventionality in Einstein's practical geometry. THEORIA. An International Journal for Theory, History and Foundations of Science, 32 (2). pp. 177-190. ISSN 2171-679X

Authors: Ichiro Oda

An important hurdle to be faced by any model proposing a resolution to the cosmological constant problem is Weinberg's venerable no-go theorem. This theorem states that no local field equations including classical gravity can have a flat Minkowski solution for generic values of the parameters; in other words, the no-go theorem forbids the existence of any solution to the cosmological constant problem within local field theories without fine-tuning. Though the original Weinberg theorem is valid only in classical gravity, in this article we prove that the theorem holds even in quantum gravity. Our proof is very general since it makes use of the BRST invariance emerging after gauge-fixing of general coordinate invariance and does not depend on the details of quantum gravity.

Authors: Luis E. Ibanez, Victor Martin-Lozano, Irene Valenzuela

It is known that there are AdS vacua obtained from compactifying the SM to 2 or 3 dimensions. The existence of such vacua depends on the value of neutrino masses through the Casimir effect. Using the Weak Gravity Conjecture, Ooguri and Vafa have recently argued that such vacua are incompatible with the SM embedding into a consistent theory of quantum gravity. We study the limits obtained for both the cosmological constant $\Lambda_4$ and neutrino masses from the absence of such dangerous 3D and 2D SM AdS vacua. One interesting implication is that $\Lambda_4$ is bounded to be larger than a scale of order $m_\nu^4$, as observed experimentally. Interestingly, this is the first argument implying a non-vanishing $\Lambda_4$ based only on particle physics, with no cosmological input. Conversely, the observed $\Lambda_4$ implies strong constraints on neutrino masses in the SM and also for some BSM extensions including extra Weyl or Dirac spinors, gravitinos and axions. We find that experimental neutrino masses are incompatible with Majorana neutrinos unless new physics in the form of at least one Weyl fermion or a multi-axion set is present. Candidates for such particles are sterile neutrinos, axinos or very light gravitinos. Dirac neutrino masses with the lightest of them lighter than $4.1\times 10^{-3}$ eV are still viable, with no need for extra BSM particles. We also critically discuss the issue of the stability of these 3D SM vacua, which is required for these bounds to hold.

Authors: Ya Xiao, Yaron Kedem, Jin-Shi Xu, Chuan-Feng Li, Guang-Can Guo

Interpretations of quantum mechanics (QM), or proposals for underlying theories, that attempt to present a definite realist picture, such as Bohmian mechanics, require strong non-local effects. Naively, these effects would violate causality and contradict special relativity. However, if the theory agrees with QM, the violation cannot be observed directly. Here, we demonstrate such an effect experimentally: we steer the velocity and trajectory of a Bohmian particle using a remote measurement. We use a pair of photons and entangle the spatial transverse position of one with the polarization of the other. The first photon is sent to a double-slit-like apparatus, where its trajectory is measured using the technique of weak measurements. The other photon is projected onto a linear polarization state. The choice of polarization state, and the result, steer the first photon in the most intuitive sense of the word. The effect is indeed shown to be dramatic, while being easy to visualize. We discuss its strength and the conditions under which it occurs.

Authors: Angelo Bassi, André Großardt, Hendrik Ulbricht

We discuss effects of loss of coherence in low energy quantum systems caused by or related to gravitation, referred to as gravitational decoherence. These effects, resulting for instance from random metric fluctuations, promise to be accessible by relatively inexpensive table-top experiments, well before the scales where true quantum gravity effects become important. Therefore, they can provide a first experimental view of gravity in the quantum regime. We survey models of decoherence induced both by classical and quantum gravitational fluctuations; it will be manifest that a clear understanding of gravitational decoherence is still lacking. Next we review models where quantum theory is modified under the assumption that gravity causes the collapse of the wave function when systems are large enough. These models challenge the quantum-gravity interplay, and can be tested experimentally. In the last part we look at the state of the art of experimental research. We review efforts aiming at ever more accurate measurements of gravity (G and g) and ideas for measuring conventional and unconventional gravity effects on nonrelativistic quantum systems.

Authors: Jonathan Olson, Yudong Cao, Jonathan Romero, Peter Johnson, Pierre-Luc Dallaire-Demers, Nicolas Sawaya, Prineha Narang, Ian Kivlichan, Michael Wasielewski, Alán Aspuru-Guzik

The NSF Workshop in Quantum Information and Computation for Chemistry assembled experts from directly quantum-oriented fields such as algorithms, chemistry, machine learning, optics, simulation, and metrology, as well as experts in related fields such as condensed matter physics, biochemistry, physical chemistry, inorganic and organic chemistry, and spectroscopy. The goal of the workshop was to summarize recent progress in research at the interface of quantum information science and chemistry as well as to discuss the promising research challenges and opportunities in the field. Furthermore, the workshop aimed to identify target areas where cross-fertilization among these fields would result in the largest payoff for developments in theory, algorithms, and experimental techniques. The ideas can be broadly categorized into two distinct areas of research that interact and are not cleanly separated. The first area is quantum information for chemistry, or how quantum information tools, both experimental and theoretical, can aid our understanding of a wide range of problems pertaining to chemistry. The second area is chemistry for quantum information, which covers the several ways in which research in the chemical sciences can aid progress in quantum information science and technology. The results of the workshop are summarized in this report.

Authors: Dharam Vir Ahluwalia

A broad-brush impressionistic view of physics from the vantage point of an observer living on a nearby dark planet, Zimpok, is presented, so as to argue that the observed and the observer are reflected in quantum gravity through a universal mass shared by neurones and a unification scale of high-energy physics.

Authors: Pierre-Henri Chavanis

Using Nottale's theory of scale relativity, we derive a generalized Schr\"odinger equation applying to dark matter halos. This equation involves a logarithmic nonlinearity associated with an effective temperature and a source of dissipation. Fundamentally, this wave equation arises from the nondifferentiability of the trajectories of the dark matter particles whose origin may be due to ordinary quantum mechanics, classical ergodic (or almost ergodic) chaos, or to the fractal nature of spacetime at the cosmic scale. The generalized Schr\"odinger equation involves a coefficient ${\cal D}$, possibly different from $\hbar/2m$, whose value for dark matter halos is ${\cal D}=1.02\times 10^{23}\, {\rm m^2/s}$. We suggest that the cold dark matter crisis may be solved by the fractal (nondifferentiable) structure of spacetime at the cosmic scale, or by the chaotic motion of the particles on a very long timescale, instead of ordinary quantum mechanics. The equilibrium states of the generalized Schr\"odinger equation correspond to configurations with a core-halo structure. The quantumlike potential generates a solitonic core that solves the cusp problem of the classical cold dark matter model. The logarithmic nonlinearity accounts for the presence of an isothermal halo that leads to flat rotation curves. The damping term ensures that the system relaxes towards an equilibrium state. This property is guaranteed by an $H$-theorem satisfied by a Boltzmann-like free energy functional. In our approach, the temperature and the friction arise from a single formalism. They correspond to the real and imaginary parts of the complex friction coefficient present in the scale covariant equation of dynamics that is at the basis of Nottale's theory of scale relativity.
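As a back-of-envelope illustration (my own arithmetic, not a claim from the paper): if the coefficient ${\cal D}$ quoted above did take the ordinary quantum form $\hbar/2m$, the value ${\cal D}=1.02\times 10^{23}\,{\rm m^2/s}$ would correspond to an effective particle mass

$$ m = \frac{\hbar}{2{\cal D}} \approx \frac{1.05\times 10^{-34}\,{\rm J\,s}}{2.04\times 10^{23}\,{\rm m^2/s}} \approx 5\times 10^{-58}\,{\rm kg} \approx 3\times 10^{-22}\,{\rm eV}/c^2, $$

which indicates the ultralight mass scale that the quoted coefficient would implicitly set.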

Authors: Furkan Semih Dündar, Metin Arik

In this paper, we investigate the role of accelerated observers in observing Unruh radiation in Bohmian field theory on a shape dynamics background. Since the metric and the metric momentum are real quantities, the integral kernel used to invert the Lichnerowicz-York equation for first-order deviations due to the existence of matter terms turns out to be real. This fact makes the interaction Hamiltonian real. On the other hand, since the ground state wave functional for rectilinear observers is also real, the jump rate of the Bohmian field theory turns out to vanish. Hence we conclude that the Unruh effect in Bohmian field theory on a shape dynamics background is non-existent. It is also found that the non-existence of Unruh radiation is independent of the compact nature of space in shape dynamics. Therefore, observation of Unruh radiation may provide a test of quantum field theory against Bohmian field theory on a shape dynamics background.

Abstract

In an accompanying paper, Gomes (arXiv:1504.02818, 2015), we have put forward an interpretation of quantum mechanics based on a non-relativistic, Lagrangian 3+1 formalism of a closed Universe M, existing on the timeless configuration space \(\mathcal {Q}\) of some field over M. However, not much was said there about the role of locality, which was not assumed. This paper is an attempt to fill that gap. Locality in full can only emerge dynamically, and is not postulated. This new understanding of locality is based solely on the properties of extremal paths in configuration space. I do not demand locality from the start, as is usually done, but show conditions under which certain systems exhibit it spontaneously. In this way we recover semi-classical local behavior when regions dynamically decouple from each other, a notion more appropriate for extension into quantum mechanics. The dynamics of a sub-region O within the closed manifold M is independent of its complement, \(M-O\), if the projection of extremal curves on \(\mathcal {Q}\) onto the space of extremal curves intrinsic to O is a surjective map. This roughly corresponds to \(e^{i\hat{H}t}\circ \mathsf {pr}_{\mathrm{O}}= \mathsf {pr}_{\mathrm{O}}\circ e^{i\hat{H}t}\), where \(\mathsf {pr}_{\mathrm{O}}:\mathcal {Q}\rightarrow \mathcal {Q}_O^{\partial O}\) is a linear projection. This criterion for locality can be made approximate—an impossible feat had it been already postulated—and it can be applied to theories which do not have hyperbolic equations of motion and/or have no fixed causal structure. When two regions are mutually independent according to the criterion proposed here, the semi-classical path integral kernel factorizes, showing cluster decomposition, which is the ultimate aim of a definition of locality.

Author(s): Stephen L. Adler

Assuming the standard axioms for quaternionic quantum theory and a spatially localized scattering interaction, the S matrix in quaternionic quantum theory is complex valued, not quaternionic. Using the standard connections between the S matrix, the forward scattering amplitude for electromagnetic wa…


[Phys. Rev. A 95, 060101(R)] Published Mon Jun 19, 2017

Authors: Wen-ge Wang

A method is proposed for the formulation of quantum electrodynamics which is completely quantum mechanical and does not make use of the gauge symmetry of the corresponding classical fields. In this formulation, photon states are introduced based on a geometric property of the quantum state space for one electron and one positron, and an interaction Hamiltonian appears as an operator which maps, in a most natural way, the quantum state space for the electron and positron to the quantum state space for the photon. This method could be useful in the search for a description of the three interactions (electromagnetic, weak, and strong) that is more unified than the one given in the standard model.

Authors: Jerome Martin, Vincent Vennin

We present a general and systematic study of how a Bell experiment on the cosmic microwave background could be carried out. We introduce different classes of pseudo-spin operators and show that, if the system is placed in a two-mode squeezed state as inflation predicts, they all lead to a violation of the Bell inequality. However, we also discuss the obstacles that one faces in order to realize this program in practice and show that they are probably insurmountable. We suggest alternative methods that would reveal the quantum origin of cosmological structures without relying on Bell experiments.
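For reference, the Bell-type test contemplated here is built from pairs of (pseudo-)spin observables on the two modes; a minimal statement of the CHSH form of the inequality (a generic textbook statement, not the authors' specific pseudo-spin construction) is

$$ S = \langle A_1 B_1\rangle + \langle A_1 B_2\rangle + \langle A_2 B_1\rangle - \langle A_2 B_2\rangle, \qquad |S| \le 2 \ \text{(local hidden variables)}, \qquad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}, $$

so any measured $|S|>2$ signals nonclassical correlations of the kind the authors attribute to the two-mode squeezed inflationary state.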

Authors: Catarina Moreira, Andreas Wichert

The application of principles of Quantum Mechanics in areas outside of physics has been getting increasing attention in the scientific community, in an emergent discipline called Quantum Cognition. These principles have been applied to explain paradoxical situations that cannot be easily explained through classical theory. In quantum probability, events are characterised by a superposition state, which is represented by a state vector in an $N$-dimensional vector space. The probability of an event is given by the squared magnitude of the projection of this superposition state onto the desired subspace. This geometric approach is very useful for explaining paradoxical findings that involve order effects, but do we really need quantum principles for models that only involve projections?

This work has two main goals. First, it is still not clear in the literature whether a quantum projection model has any advantage over a classical projection. We compare both models and conclude that the quantum projection model achieves the same results as its classical counterpart, because quantum interference effects play no role in the computation of the probabilities. Second, we propose an alternative relativistic interpretation of the rotation parameters that are involved in both the classical and quantum models. In the end, instead of interpreting these parameters as a similarity measure between questions, we propose that they emerge due to the lack of knowledge concerning a personal basis state and due to uncertainties about the state of the world and the context of the questions.
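As an illustration of the projection rule discussed above, the following minimal numerical sketch (my own toy example, not the authors' model) computes sequential "question" probabilities as squared norms of successive projections and shows that the order of two incompatible projections changes the result:

```python
import numpy as np

def projector(v):
    """Rank-1 projector onto the normalized vector v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

# Initial "belief" state: a unit vector in a deliberately tiny 2-dimensional space.
psi = np.array([1.0, 0.0])

# Two questions, represented by projections onto non-orthogonal directions;
# theta plays the role of a rotation parameter like those mentioned in the abstract.
theta = np.pi / 5
a = np.array([np.cos(theta), np.sin(theta)])
b = np.array([np.cos(2 * theta), np.sin(2 * theta)])
P_a, P_b = projector(a), projector(b)

# Sequential probabilities of answering "yes" to both questions, in each order:
# the squared magnitude of the successive projections of the state.
p_ab = np.linalg.norm(P_b @ P_a @ psi) ** 2   # A asked first, then B
p_ba = np.linalg.norm(P_a @ P_b @ psi) ** 2   # B asked first, then A

print(f"P(A then B) = {p_ab:.4f}")
print(f"P(B then A) = {p_ba:.4f}")  # different value: an order effect
```

The same arithmetic can be carried out with purely real rotations, which is essentially the observation that a quantum projection model and its classical counterpart coincide when no interference terms enter.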

Volume 3, Issue 3, pages 78-99

Tom Campbell, Houman Owhadi, Joe Sauvageau, and David Watkinson

Tom Campbell, born in the USA in 1944, earned his BS degree (Cum Laude) in 1966 with majors in both Mathematics and Physics. While an undergraduate, Tom became the president of his fraternity and Chief Justice of the college's student court. Tom was awarded a master's degree in Physics from Purdue University in 1968, after which PhD work commenced at the University of Virginia with a specialization in experimental nuclear physics. Campbell was an analyst with Army technical intelligence for a decade before moving into the research and development of technology supporting defensive missile systems. He also worked as a consultant for NASA within the Ares I program, assessing and solving problems of system risk and survivability to ensure crew survivability and mission success. Campbell published a trilogy, My Big TOE (MBT), in 2003 that offered a complete cosmology based on the simulation hypothesis, including a theory of consciousness and a derivation of both relativity and Quantum Mechanics from one overarching set of principles. Furthermore, Campbell's theory eliminates any nonlocal "weirdness", replacing it with a completely rational and logical causal process as found in all other subsets of science. MBT has been successful at solving many outstanding fundamental paradoxes within physics in particular, science in general, and within several other major fields of study including: philosophy (cosmology, epistemology, ontology), psychology (mind models), mathematics (cellular automata and evolution as process fractals), medicine (mind-body connection), biology (math & other anomalies), and theology (source). In October 2016, Campbell presented in Los Angeles, CA (MBT-LA) a set of quantum experiments that would support his theory if they worked as predicted (available on DVD upon request or on YouTube). Fortunately, these experiments are relatively inexpensive and not particularly difficult to perform.

Houman Owhadi is Professor of Applied & Computational Mathematics and Control and Dynamical Systems in the Computing and Mathematical Sciences Department at the California Institute of Technology. His work lies at the interface between applied mathematics, probability and statistics. At the center of his work are fundamental problems such as the optimal quantification of uncertainties in presence of limited information, statistical inference/game theoretic approaches to numerical approximation and algorithm design, multiscale analysis with non-separated scales, and the geometric integration of structured stochastic systems.

Joe Sauvageau received an M.A. and Ph.D. from Stony Brook University in New York in Applied Physics in 1987. He is currently serving as a Senior Systems Manager at the NASA Jet Propulsion Laboratory in Pasadena, CA, in the Astronomy, Physics and Space Technology Office. His career experience has spanned scientific pursuits as a government scientist at NIST studying superconducting quantum devices; industrial physics and engineering development in the semiconductor, optoelectronics and photonics industries; and leading the optical engineering design, development and deployment of next-generation optical instruments, including visible and infrared imaging sensors for space and airborne applications. He was the recipient of the Rotary National Award for Space Achievement (RNASA) Stellar Award in 2013, the Aviation Week Technical Program Excellence Award, and an IEEE Outstanding Engineer of the Year Award in 2012, associated with the design and development of a multispectral sensor payload currently in geosynchronous orbit. He has also served in various senior management positions ranging from start-ups through Fortune 500 companies, and he has held positions on several Technical Advisory Boards. His publication portfolio includes four patents and a multitude of articles and technical reports in journals.

In 1964, David Watkinson went to the University of North Carolina on a Navy scholarship to major in physics and mathematics. In his junior year, he visited Dr. J. B. Rhine's parapsychology laboratory at nearby Duke University and became so interested in that field of study that he graduated with a degree in psychology. Having developed programming and 2D/3D animation skills, Watkinson was recruited to work on feature films, which led to a long career in Hollywood and some time as a Visiting Assistant Professor in the UCLA Graduate School of Film. During that time, Watkinson became interested in virtual reality technology while writing about it for Videography Magazine. Watkinson was finally able to combine his interests in virtual reality simulations, parapsychology and physics when he formed a group to study physicist Tom Campbell's TOE, which unifies all three fields. In pursuing that study, Watkinson visited physicist Marlan Scully and his team at Texas A&M to discuss Scully's delayed-choice quantum eraser. During the visit, Watkinson became aware of an important variation of the double-slit experiment that had never been performed. That led Watkinson to get involved with Campbell and the other authors in an effort to promote interest in performing multiple experiments to test the simulation hypothesis.

Can the theory that reality is a simulation be tested? We investigate this question based on the assumption that if the system performing the simulation is finite (i.e. has limited resources), then to achieve low computational complexity, such a system would, as in a video game, render content (reality) only at the moment that information becomes available for observation by a player and not at the moment of detection by a machine (that would be part of the simulation and whose detection would also be part of the internal computation performed by the Virtual Reality server before rendering content to the player). Guided by this principle we describe conceptual wave/particle duality experiments aimed at testing the simulation theory.


Volume 3, Issue 3, pages 65-77

Andreas Schlatter

Born in Zurich, Switzerland, Andreas Schlatter was educated at the Swiss Federal Institute of Technology in Zurich, where he studied mathematics. He received his PhD in 1994 with work on partial differential equations. He subsequently held a research position at Princeton University, where he did further work, mainly on the Yang-Mills heat equation. In 1997 Andreas joined the asset management industry and pursued a distinguished career over twenty years, which brought him into the Executive Committee of one of the world's largest asset management firms. Today Andreas does consulting work and holds a number of independent board seats. Andreas has continued to do research and publish throughout his professional life, mainly in the area of Quantum Foundations and Relativity but also in Finance.

The notion of duration is a fundamental feature of intuitive time. Under the assumption that the duration of a measurement is defined to produce a result with certainty, we define a universal observer who observes the position of a physical system. We investigate what conclusions for the structure of space-time the universal observer can draw. It turns out that the perspective of this observer is compatible with the Minkowski structure of space-time.


Publication date: Available online 16 June 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Gabriel Catren
We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program—that we call the Klein-Weyl program—for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a “structure-endowed entity” equipped with a “group of automorphisms”. First, we analyze what Weyl calls the “problem of relativity” in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are “indices characterizing representations of groups” (Weyl, 1931a, p. xxi) in terms of a correspondence between the ontological categories of identity and determinateness.

Publication date: Available online 16 June 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Flavio Del Santo
I present a reconstruction of Karl Popper's involvement in the community of physicists concerned with the foundations of quantum mechanics in the 1980s. At that time Popper made active contributions to research in physics, the most significant of which is a new version of the EPR thought experiment, intended to test different interpretations of quantum mechanics. The genesis of this experiment is reconstructed in detail, and an unpublished letter by Popper is reproduced in the present paper to show that he had formulated his thought experiment two years before its first publication in 1982. The debate stimulated by the proposed experiment, as well as Popper's role in the physics community throughout the 1980s, is analysed in detail by means of personal correspondence and publications.

Authors: Yuri Bonder, Gabriel Leon

Modified gravity theories are supposed to incorporate low-energy quantum-gravity effects and, at the same time, they could shed light on the dark matter and dark energy problems. Here we study a particular modification of general relativity in which local Lorentz invariance is spontaneously broken and whose physical effects, despite a decade-long effort, were unknown. We show that, during inflation, this modification produces anisotropies that would generate measurable effects on the Cosmic Microwave Background. Then, by using empirical constraints on the B-mode polarization spectrum, we estimate that the absolute value of the `coefficient' components has to be smaller than $10^{-43}$. This is a remarkably strong limit; in fact, it is 29 orders of magnitude better than the best constraints on similar coefficients. Thus, we propose that inflation could stringently test other modified gravity theories.

Authors: L. Castellani, R. Catenacci, P. A. Grassi

We reformulate Super Quantum Mechanics in the context of integral forms. This framework allows one to interpolate between different actions for the same theory, connected by different choices of Picture Changing Operators (PCO). In this way we retrieve component and superspace actions, and prove their equivalence. The PCO are closed integral forms, and can be interpreted as super Poincar\'e duals of bosonic submanifolds embedded into a supermanifold. We use them to construct Lagrangians that are top integral forms, and therefore can be integrated on the whole supermanifold. The $D=1, ~N=1$ and the $D=1,~ N=2$ cases are studied, in a flat and in a curved supermanifold. In this formalism we also consider coupling with gauge fields, the Hilbert space of quantum states, and observables.

Authors: Dushyant Kumar

We introduce a framework for non-linear time evolution in quantum mechanics as a natural non-linear generalization of the Schrodinger equation. Within our framework, we derive simple toy models of dynamical geometry on finite graphs. Along similar lines we also propose a model of non-linear quantum field theory on spaces with state-dependent geometry.

Authors: Roumen Tsekov, Eyal Heifetz, Eliahu Cohen

We regard the non-relativistic Schrodinger equation as an ensemble mean representation of the stochastic motion of a single particle in a vacuum, subject to an undefined stochastic quantum force. The local mean of the quantum force is found to be proportional to the third spatial derivative of the probability density function, while its associated pressure is proportional to the second spatial derivative. The latter arises from the single-particle diluted gas pressure, and this observation allows us to interpret the quantum Bohm potential as the energy required to put a particle in a bath of fluctuating vacuum at constant entropy and volume. The stochastic force expectation value is zero and is uncorrelated with the particle location, and thus does not perform work on average. Nonetheless, it is anti-correlated with volume, and this anti-correlation leads to an uncertainty relation. We analyze the dynamic Gaussian solution to the Schrodinger equation as a simple example for exploring the mean properties of this quantum force. We conclude with a few possible interpretations as to the origins of quantum stochasticity.
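For orientation, the Bohm quantum potential referred to above has the standard form (written here for a single particle of mass $m$ with probability density $\rho=|\psi|^2$; a textbook expression, not the authors' derivation):

$$ Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2\sqrt{\rho}}{\sqrt{\rho}}, $$

and its negative gradient plays the role of the quantum force whose statistics the abstract characterizes in terms of spatial derivatives of $\rho$.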

Publication date: Available online 12 June 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Norman Sieroka
This paper aims at closing a gap in recent Weyl research by investigating the role played by Leibniz for the development and consolidation of Weyl's notion of theoretical (symbolic) construction. For Weyl, just as for Leibniz, mathematics was not simply an accompanying tool when doing physics—for him it meant the ability to engage in well-guided speculations about a general framework of reality and experience. The paper first introduces some of the background of Weyl's notion of theoretical construction and then discusses particular Leibnizian inheritances in Weyl's ‘Philosophie der Mathematik und Naturwissenschaft’, such as the general appreciation of the principles of sufficient reason and of continuity. Afterwards the paper focuses on three themes: first, Leibniz's primary quality phenomenalism, which according to Weyl marked the decisive step in realizing that physical qualities are never apprehended directly; second, the conceptual relation between continuity and freedom; and third, Leibniz's notion of ‘expression’, which allows for a certain type of (surrogative) reasoning by structural analogy and which gave rise to Weyl's optimism regarding the scope of theoretical construction.

Probe sends entangled photons — which could underpin quantum-based data encryption — over unprecedented distance.

Nature News doi: 10.1038/nature.2017.22142

Author(s): P. M. Harrington, J. T. Monroe, and K. W. Murch

The Zeno and anti-Zeno effects are features of measurement-driven quantum evolution where frequent measurement inhibits or accelerates the decay of a quantum state. Either type of evolution can emerge depending on the system-environment interaction and measurement method. In this experiment, we use …


[Phys. Rev. Lett. 118, 240401] Published Wed Jun 14, 2017

Crowther, Karen and Linnemann, Niels (2017) Renormalizability, fundamentality and a final theory: The role of UV-completion in the search for quantum gravity. The British Journal for the Philosophy of Science.

Abstract

In quantum statistical mechanics, equilibrium states have been shown to be the typical states for a system that is entangled with its environment, suggesting a possible identification between thermodynamic and von Neumann entropies. In this paper, we investigate how the relaxation toward equilibrium is made possible through interactions that do not lead to significant exchange of energy, and argue for the validity of the second law of thermodynamics at the microscopic scale.

Authors: J. Acacio de Barros, Gary Oas

In this paper we examine some proposals to disprove the hypothesis that the interaction between mind and matter causes the collapse of the wave function, showing that such proposals are fundamentally flawed. We then describe a general experimental setup retaining the key features of the ones examined, and show that even this more general case is inadequate to disprove the mind-matter collapse hypothesis. Finally, we use the setup provided to argue that, under some reasonable assumptions about consciousness, this hypothesis is unfalsifiable.

Authors: Sisi Zhou, Mengzhen Zhang, John Preskill, Liang Jiang

We study the fundamental limits on precision for parameter estimation using a quantum probe subject to Markovian noise. The best possible scaling of precision $\delta$ with the total probing time $t$ is the Heisenberg limit (HL) $\delta \propto 1/t$, which can be achieved by a noiseless probe, but noise can reduce the precision to the standard quantum limit (SQL) $\delta \propto 1 /\sqrt{t}$. We find a condition on the probe's Markovian noise such that SQL scaling cannot be surpassed if the condition is violated, but if the condition is satisfied then HL scaling can be achieved by using quantum error correction to protect the probe from damage, assuming that noiseless ancilla qubits are available, and fast, accurate quantum processing can be performed. When the HL is achievable, the optimal quantum code can be found by solving a semidefinite program. If the condition is violated but the noise channel acting on the probe is close to one that satisfies the condition, then approximate HL scaling can be realized for an extended period before eventually crossing over to SQL scaling.

Authors: Roderick Sutherland

It has become increasingly apparent that a number of perplexing issues associated with the interpretation of quantum mechanics are more easily resolved once the notion of retrocausality is introduced. The aim here is to list and discuss various examples where a clear explanation has become available via this approach. In so doing, the intention is to highlight that this direction of research deserves more attention than it presently receives.

Publication date: 5 August 2017
Source: Physics Letters A, Volume 381, Issue 29
Author(s): Alexandra Bakman, Hagar Veksler, Shmuel Fishman
The effect of quantum collapse and revival is a fascinating interference phenomenon. In this paper the phenomenon is studied analytically and numerically for a simple system, a slightly anharmonic oscillator. The initial wave-function corresponds to a displaced ground state of a harmonic oscillator. Possible experimental realizations for cold atoms are discussed in detail.

Franklin, Alexander (2017) On the Renormalisation Group Explanation of Universality. [Preprint]

Abstract

In a recent paper, Cumpa (Am Philos Q 51(4): 319–324, 2014) argues that a scientific turn in metaphysics requires the acceptance of a materialist criterion of fundamentality, according to which the most fundamental metaphysical category is the one that provides us with a reconciliation of the ordinary world and the physical universe. He concludes that the dominant category of substance cannot be the most fundamental category, for it does not satisfy this criterion of fundamentality. The most fundamental category is instead the category of fact. Although convincing, the defense of factualism over substantialism offered by Cumpa takes into account the case of classical physics without considering the physical universe of quantum mechanics. My aim in this paper is to offer a completion to Cumpa’s factualist approach. To achieve my aim, I show that substances cannot provide a satisfactory account of the relationship between the ordinary world and the physical universe even in the case of quantum mechanics, whereas a factualist approach does.

Authors: Valentina Baccetti, Robert B. Mann, Daniel R. Terno

Event horizons are the defining feature of classical black holes. They are the key ingredient of the information loss paradox which, like paradoxes in quantum foundations, is built on a combination of predictions of quantum theory and counterfactual classical features: neither horizon formation nor its crossing by a test body is observable. Furthermore, horizons are unnecessary for the production of Hawking-like radiation. We demonstrate that when this radiation is taken into account it prevents horizon crossing/formation in a large class of models. We conjecture that horizon avoidance is a general feature of collapse. The non-existence of event horizons dispels the paradox, but opens up important questions about the thermodynamic properties of the resulting objects and correlations between different degrees of freedom.

Authors: Jesús Mateos, Carlos Sabín

In this work we propose a recipe for the quantum simulation of traversable wormhole spacetimes in a Bose-Einstein condensate, both in $1+1D$ and $3+1D$. While in the former case it is enough to modulate the speed of sound along the condensate, in the latter case we need to choose particular coordinates, namely generalized Gullstrand-Painlev\'e coordinates. For weakly interacting condensates, in both cases we present the spatial dependence of the external magnetic field which is needed for the simulation, and we analyze under which conditions the simulation is possible with the experimental state of the art.

Abstract

In this paper we study a classical and theoretical system which consists of an elastic medium carrying transverse waves and one point-like inclusion of high elastic medium density, called a concretion. We compute the equation of motion for the concretion as well as the wave equation of this system. From then on, we consider the case where the concretion is no longer the wave source. The concretion then obeys a general and covariant guidance formula, which leads in the low-velocity approximation to an equivalent de Broglie-Bohm guidance formula. The concretion moves as if an equivalent quantum potential existed. A strictly equivalent free Schrödinger equation is retrieved, as well as the quantum stationary states in a linear or spherical cavity. We compute the energy (and momentum) of the concretion, naturally defined from the energy (and momentum) density of the vibrating elastic medium. Provided one condition on the amplitude of oscillation is fulfilled, it strikingly appears that the energy and momentum of the concretion are not only written in the same form as in quantum mechanics, but also encapsulate equivalent relativistic formulas.

Shrapnel, Sally (2017) Discovering Quantum Causal Models (final). [Preprint]

Authors: Dmitry V. Zhdanov (1), Denys I. Bondar (2), Tamar Seideman (1) ((1) Northwestern University, Evanston, IL, USA, (2) Princeton University, Princeton, NJ, USA)

A proof is presented of the long-standing conjecture that Markovian quantum master equations are at odds with quantum thermodynamics under the conventional assumptions of fluctuation-dissipation theorems (which imply translation-invariant dissipation). Specifically, except for certain identified systems, persistent system-bath correlations of at least one kind, spatial or temporal, are obligatory for thermalization. A practical methodology to mitigate this constraint in modeling atomic and molecular dynamics with friction is proposed. A quantum optical scheme for its laboratory assessment is outlined.
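For context, the Markovian quantum master equations at issue are of the standard Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) form (a generic textbook statement, not the specific equations analyzed in the paper):

$$ \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho] + \sum_k \gamma_k \left( L_k \rho L_k^\dagger - \tfrac{1}{2}\{L_k^\dagger L_k, \rho\} \right), $$

where the $L_k$ are jump operators and the $\gamma_k \ge 0$ are rates; the conjecture concerns whether evolutions of this memoryless form can be reconciled with thermodynamic constraints under the stated assumptions.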

Authors: Orlando Tapia

The paper focuses on aspects of the measurement problem, introducing quantum states (q-states) for measured and measuring systems. The link between non-interacting and interacting quantum systems is examined first. For two independent partite systems, the logical sum A+B stands for non-interacting q-systems, while the direct product space AxB gathers interacting states. However, the latter should support physical q-states with base states that do not separately belong to either A or B; these correspond to bridge states, namely entangled states that can perform as links (bridges) between the A+B and AxB domains. Bridge states in laboratory space open possibilities to describe transport in quantized amounts of energy and angular momentum. These link bases sustain entanglements of different kinds. Interactions bring in quantized electromagnetic (em) fields. Matter-sustained q-states entangled with em-sustained q-states open bridges to transport information between matter and radiation.

Baron, Samuel and Evans, Peter W. (2017) What's So Spatial About Time Anyway? [Preprint]
Barrett, Thomas William (2017) Equivalent and Inequivalent Formulations of Classical Mechanics. [Preprint]

Author(s): Katja Ried, Jean-Philippe W. MacLean, Robert W. Spekkens, and Kevin J. Resch

The landscape of causal relations that can hold among a set of systems in quantum theory is richer than in classical physics. In particular, a pair of time-ordered systems can be related as cause and effect or as the effects of a common cause, and each of these causal mechanisms can be coherent or n…


[Phys. Rev. A 95, 062102] Published Fri Jun 02, 2017

Publication date: 25 July 2017
Source: Physics Letters A, Volume 381, Issue 28
Author(s): Brian R. La Cour
An experiment has recently been performed to demonstrate quantum nonlocality by establishing contextuality in one of a pair of photons encoding four qubits; however, low detection efficiencies and use of the fair-sampling hypothesis leave these results open to possible criticism due to the detection loophole. In this Letter, a physically motivated local hidden-variable model is considered as a possible mechanism for explaining the experimentally observed results. The model, though not intrinsically contextual, acquires this quality upon post-selection of coincident detections.

Nature Physics 13, 618 (2017). doi:10.1038/nphys4158

Author: Stephan Schlamminger

Stephan Schlamminger looks at the origins of the Planck constant and its current role in redefining the kilogram.

Nature Physics 13, 533 (2017). doi:10.1038/nphys4159

Author: Jean-Philippe Karr

Improved-accuracy measurements of the ground-state hyperfine splitting in highly charged bismuth ions reveal a surprising discrepancy with the predictions of quantum electrodynamics.

Khudairi, Hasen (2016) Entanglement, Modality, and Indeterminacy. [Preprint]

Author(s): Matteo Caiaffa, Andrea Smirne, and Angelo Bassi

Stochastic unravelings represent a useful tool to describe the dynamics of open quantum systems, and standard methods, such as quantum state diffusion (QSD), call for the complete positivity of the open-system dynamics. Here, we present a generalization of QSD, which also applies to positive, but no…


[Phys. Rev. A 95, 062101] Published Thu Jun 01, 2017

Abstract

H. G. Wells’ Time Traveller inhabits uniform Newtonian time. Where relativistic/quantum travelers into the past follow spacetime curvatures, past-bound Wellsians must reverse their direction of travel relative to absolute time. William Grey and Robin Le Poidevin claim reversing Wellsians must overlap with themselves or fade away piecemeal like the Cheshire Cat. Self-overlap is physically impossible, but ‘Cheshire Cat’ fades destroy Wellsians’ causal continuity and breed bizarre fusions of traveler-stages with opposed time-directions. However, Wellsians who rotate in higher-dimensional space can reverse temporal direction without self-overlap, Cheshire Cats or mereological monstrosities. Alas, hyper-rotation in Newtonian space poses dynamic and biological problems (e.g. gravitational/electrostatic singularities and catastrophic blood-loss). Controllable and survivable Wellsian travel needs topologically-variable spaces. Newtonian space, not Newtonian time, is Wellsians’ real enemy.

Chen, Eddy Keming (2017) An Intrinsic Theory of Quantum Mechanics: Progress in Field's Nominalistic Program, Part I. [Preprint]

Despite certain quantum concepts, such as superposition states, entanglement, ‘spooky action at a distance’ and tunnelling through insulating walls, being somewhat counterintuitive, they are no doubt extremely useful constructs in theoretical and experimental physics. More uncertain, however, is whether or not these concepts are fundamental to biology and living processes. Of course, at the fundamental level all things are quantum, because all things are built from the quantized states and rules that govern atoms. But when does the quantum mechanical toolkit become the best tool for the job? This review looks at four areas of ‘quantum effects in biology’. These are biosystems that are very diverse in detail but possess some commonality. They are all (i) effects in biology: rates of a signal (or information) that can be calculated from a form of the ‘golden rule’ and (ii) protein–pigment (or ligand) complex systems. It is shown, beginning with the rate equation, that all these systems may contain some degree of quantum effect, and where experimental evidence is available, it is explored to determine how the quantum analysis aids understanding of the process.
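For reference, the ‘golden rule’ rate invoked above is usually Fermi's golden rule (the generic textbook form, not the review's specific rate expressions):

$$ \Gamma_{i\to f} = \frac{2\pi}{\hbar}\,\left|\langle f|V|i\rangle\right|^2 \rho(E_f), $$

where $V$ is the coupling between the initial and final states and $\rho(E_f)$ is the density of final states; roughly speaking, the review asks when such perturbative rates need quantum (coherence) corrections in protein–pigment complexes.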

Suárez, Mauricio (2017) PHYSICAL CHANCE. [Preprint]

Author(s): Martí Perarnau-Llobet and Theodorus Maria Nieuwenhuizen

The possibility of performing simultaneous measurements in quantum mechanics is investigated in the context of the Curie-Weiss model for a projective measurement. Concretely, we consider a spin-1/2 system simultaneously interacting with two magnets, which act as measuring apparatuses of two different…


[Phys. Rev. A 95, 052129] Published Tue May 30, 2017

Authors: Juven C. Wang

In this thesis, we explore the aspects of symmetry, topology and anomalies in quantum matter with entanglement from both condensed matter and high energy theory viewpoints. The focus of our research is on the gapped many-body quantum systems including symmetry-protected topological states and topologically ordered states. Chapter 1. Introduction. Chapter 2. Geometric phase, wavefunction overlap, spacetime path integral and topological invariants. Chapter 3. Aspects of Symmetry. Chapter 4. Aspects of Topology. Chapter 5. Aspects of Anomalies. Chapter 6. Quantum Statistics and Spacetime Surgery. Chapter 7. Conclusion: Finale and A New View of Emergence-Reductionism. (Thesis supervisor: Prof. Xiao-Gang Wen)

Authors: Jonas Maziero

The existence of incompatible observables constitutes one of the most prominent characteristics of quantum mechanics (QM) and can be revealed and formalized through uncertainty relations. The Heisenberg-Robertson-Schr\"odinger uncertainty relation (HRSUR) was proved at the dawn of the quantum formalism and is ever-present in the teaching of and research on QM. Notwithstanding, the HRSUR possesses the so-called triviality problem: it yields no information about the possible incompatibility between two observables if the system was prepared in a state which is an eigenvector of one of them. After about 85 years of existence of the HRSUR, this problem was solved recently by Lorenzo Maccone and Arun K. Pati. In this article, we start with a brief discussion of general aspects of the uncertainty principle in QM and recapitulate the proof of the HRSUR. Afterwards we present in simple terms the proof of the Maccone-Pati uncertainty relation, which can be obtained basically via the application of the parallelogram law and the Cauchy-Schwarz inequality.
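As a reminder, the relations being discussed have the following standard forms (textbook statements in my notation, not reproduced from the article). The Robertson-Schr\"odinger relation for observables $A$, $B$ in a state $|\psi\rangle$ reads

$$ \Delta A^2\,\Delta B^2 \;\ge\; \left|\tfrac{1}{2}\langle\{A,B\}\rangle - \langle A\rangle\langle B\rangle\right|^2 + \left|\tfrac{1}{2i}\langle[A,B]\rangle\right|^2, $$

whose right-hand side vanishes when $|\psi\rangle$ is an eigenvector of $A$ or $B$ (the triviality problem). One of the Maccone-Pati relations instead bounds the sum of variances,

$$ \Delta A^2 + \Delta B^2 \;\ge\; \pm i\,\langle[A,B]\rangle + \left|\langle\psi|A \pm iB|\psi^{\perp}\rangle\right|^2, $$

for any state $|\psi^{\perp}\rangle$ orthogonal to $|\psi\rangle$, with the sign chosen so that the first term is non-negative; this right-hand side can remain nonzero even when $|\psi\rangle$ is an eigenstate of one of the observables.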

Authors: Partha Ghose

The measurement problem in quantum mechanics originates in the inability of the Schr\"odinger equation to predict definite outcomes of measurements. This is due to the lack of objectivity of the eigenstates of the measuring apparatus. Such objectivity can be achieved if a unified realist conceptual framework can be formulated in terms of wave functions and operators acting on them for both the quantum and classical domains. Such a framework was proposed previously, together with an equation for the wave function [13, 14] that smoothly interpolates between the quantum and classical limits. The arguments leading to the equation are clarified in this paper, and the theory is developed further. The measurement problem in quantum mechanics is then briefly reviewed and re-examined from the point of view of this theory, and it is shown how the classical limit of the wave function of the measuring apparatus leads to a natural solution of the problem of definite measurement outcomes without the need for either collapse or pragmatic thermodynamic arguments. This is consistent with Bohr's emphasis on the primacy of classical concepts and classical measuring devices. Possible tests of the theory using low-dimensional systems such as quantum dots are indicated.

Authors: Lee Smolin

Some reflections are presented on the state of the search for a quantum theory of gravity. I discuss diverse regimes of possible quantum gravitational phenomena, some well explored, some novel.

Authors: Cesar A. Aguillón, Albert Much, Marcos Rosenbaum, J. David Vergara

We investigate a quantum geometric space in the context of what could be considered an emerging effective theory from Quantum Gravity. Specifically, we consider a two-parameter class of twisted Poincar\'e algebras, from which Lie-algebraic noncommutativities of the translations are derived, as well as associative star-products, deformed Riemannian geometries, Lie-algebraic twisted Minkowski spaces and quantum effects that arise as noncommutativities. Starting from a universal differential algebra of forms based on the above-mentioned Lie-algebraic noncommutativities of the translations, we construct the noncommutative differential forms and the Inner and Outer derivations, which are the noncommutative equivalents of the vector fields in the case of commutative differential geometry. Having established the essentials of this formalism, we construct a bimodule, required to be central under the action of the Inner derivations in order to have well-defined contractions, and from which the algebraic dependence of its coefficients is derived. This then defines the noncommutative equivalent of the geometrical line element in commutative differential geometry. We stress, however, that even though the components of the twisted metric are by construction symmetric in their algebra valuation, this is not so for their inverse, and thus to construct it we make use of Gel'fand's theory of quasi-determinants, which is conceptually straightforward but computationally becomes quite complicated beyond an algebra of 3 generators. The consequences of the noncommutativity of the Lie-algebra twisted geometry are further discussed.

Authors: Muxin Han

In this paper we explain how 4-dimensional general relativity and, in particular, the Einstein equation emerge from the spinfoam amplitude in loop quantum gravity. We propose a new limit which couples both the semiclassical limit and the continuum limit of spinfoam amplitudes. The continuum Einstein equation emerges in this limit. Solutions of the Einstein equation can be approached by dominant configurations in spinfoam amplitudes. A running scale is naturally associated to the sequence of refined triangulations. The continuum limit corresponds to the infrared limit of the running scale. An important ingredient in the derivation is a regularization for the sum over spins, which is necessary for the semiclassical continuum limit. We also explain in this paper the role played by the so-called flatness in the spinfoam formulation, and how to take advantage of it.

Author(s): Diego A. Alcala, Joseph A. Glick, and Lincoln D. Carr

Tunneling of a quasibound state is a nonsmooth process in the entangled many-body case. Using time-evolving block decimation, we show that repulsive (attractive) interactions speed up (slow down) tunneling. While the escape time scales exponentially with small interactions, the maximization time of …


[Phys. Rev. Lett. 118, 210403] Published Thu May 25, 2017

Author(s): Philip Pearle and Anthony Rizzi

We give a complete quantum analysis of the Aharonov-Bohm (AB) magnetic phase shift involving three entities: the electron, the charges constituting the solenoid current, and the vector potential. The usual calculation supposes that the solenoid's vector potential may be well approximated as classica…


[Phys. Rev. A 95, 052124] Published Thu May 25, 2017

Author(s): Philip Pearle and Anthony Rizzi

Following semiclassical arguments by Vaidman [Phys. Rev. A 86, 040101(R) (2012)], we show that the phase shifts arising in the Aharonov-Bohm (AB) magnetic or electric effects can be treated as due to the electric force of a classical electron, respectively acting on quantized solenoid particles or q…


[Phys. Rev. A 95, 052123] Published Thu May 25, 2017

Abstract

Two approaches to understanding the idealizations that arise in the Aharonov–Bohm (AB) effect are presented. It is argued that a common topological approach, which takes the non-simply connected electron configuration space to be an essential element in the explanation and understanding of the effect, is flawed. An alternative approach is outlined. Consequently, it is shown that the existence and uniqueness of self-adjoint extensions of symmetric operators in quantum mechanics have important implications for philosophical issues. Also, the alleged indispensable explanatory role of said idealizations is examined via a minimal model explanatory scheme. Last, the idealizations involved in the AB effect are placed in a wider philosophical context via a short survey of part of the literature on infinite and essential idealizations.
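
For reference, the phase at issue is the standard Aharonov-Bohm phase acquired by an electron transported around an (idealized, impenetrable, infinitely long) solenoid,
$$\Delta\varphi \;=\; \frac{q}{\hbar}\oint_C \mathbf{A}\cdot d\boldsymbol{\ell} \;=\; \frac{q\,\Phi}{\hbar},$$
where $\Phi$ is the enclosed magnetic flux; it is the excision of the solenoid region that makes the electron configuration space non-simply connected, which is the idealization the topological approach relies on.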

Determining the state of a quantum system is a resource-consuming procedure. For this reason, whenever one is interested only in some particular property of a state, it would be desirable to design a measurement set-up that reveals this property with as little effort as possible. Here, we investigate whether, in order to successfully complete a given task of this kind, one needs an informationally complete measurement, or whether something less demanding would suffice. The first alternative means that, in order to complete the task, one needs a measurement which fully determines the state. We formulate the task as a membership problem related to a partitioning of the quantum state space and, in doing so, connect it to the geometry of the state space. For a general membership problem, we prove various sufficient criteria that force informational completeness, and we explicitly treat several physically relevant examples. For the specific cases that do not require informational completeness, we also determine bounds on the minimal number of measurement outcomes needed to ensure success in the task.
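
For scale, note that full state determination for a $d$-level system requires fixing $d^2-1$ real parameters, so an informationally complete measurement needs at least $d^2$ outcomes (four for a qubit); this standard counting, stated here only as background, is what makes measurements tailored to a single membership question potentially much cheaper.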

Abstract

It is shown that quantum mechanics is a plausible statistical description of an ontology described by classical electrodynamics. The reason that no contradiction arises with various no-go theorems regarding the compatibility of QM with a classical ontology can be traced to the fact that classical electrodynamics of interacting particles has never been given a consistent definition. Once this is done, our conjecture follows rather naturally, including a purely classical explanation of photon-related phenomena. Our analysis rests entirely on the block-universe view entailed by relativity theory.

Author(s): T. H. Hansson, M. Hermanns, S. H. Simon, and S. F. Viefers

The quantum Hall effects are by now recognized as prime examples of the importance of topological considerations in condensed-matter physics. The fractional quantum Hall effect in particular has proven to display a large number of topologically ordered states that have been classified and understood in terms of hierarchical schemes. This review explains the current understanding of such classifications, with particular emphasis on conformal-field-theory approaches.


[Rev. Mod. Phys. 89, 025005] Published Tue May 23, 2017
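
A touchstone for the hierarchical and conformal-field-theory constructions surveyed in this review is the Laughlin wave function at filling $\nu = 1/m$, quoted here as standard background rather than as material specific to the review:
$$\Psi_m(z_1,\dots,z_N) \;=\; \prod_{i<j}(z_i - z_j)^{m}\,\exp\!\Bigl(-\tfrac{1}{4\ell_B^{2}}\sum_i |z_i|^{2}\Bigr),$$
with $z_j = x_j + i y_j$ the planar coordinates and $\ell_B$ the magnetic length; hierarchy schemes and CFT correlators build the more general topologically ordered states on forms of this type.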

Author(s): Charis Anastopoulos and Ntina Savvidou

Attempts to find a quantum-to-classical correspondence in a classically forbidden region lead to nonphysical paths involving, for example, complex time or spatial coordinates. Here, we identify genuine quasiclassical paths for tunneling in terms of probabilistic correlations in sequential time-of-a…


[Phys. Rev. A 95, 052120] Published Tue May 23, 2017

Abstract

Based on a gambling formulation of quantum mechanics, we derive a Gleason-type theorem that holds for any dimension n of a quantum system, and in particular for \(n=2\) . The theorem states that the only logically consistent probability assignments are exactly the ones that are definable as the trace of the product of a projector and a density matrix operator. In addition, we detail the reason why dispersion-free probabilities are actually not valid, or rational, probabilities for quantum mechanics, and hence should be excluded from consideration.
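
Concretely, the admissible assignments are those of the Born form
$$p(\Pi) \;=\; \mathrm{Tr}(\rho\,\Pi),$$
with $\Pi$ a projector and $\rho$ a density operator ($\rho \ge 0$, $\mathrm{Tr}\,\rho = 1$); dispersion-free assignments, which would give every projector probability 0 or 1, cannot be written in this form, and the gambling argument rules them out even for $n=2$, where the original Gleason theorem does not apply.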

Author(s): Zongping Gong, Sho Higashikawa, and Masahito Ueda

The quantum Zeno effect is predicted to give rise to Hall-effect-like behavior for wavepackets in an ultracold condensate.


[Phys. Rev. Lett. 118, 200401] Published Thu May 18, 2017

Author(s): T. Ollikainen, K. Tiurev, A. Blinova, W. Lee, D. S. Hall, and M. Möttönen

Magnetic monopoles have been sought for decades but never definitively observed. Recent experiments have created different analogs of monopoles in Bose-Einstein condensates, including quantum-mechanical and Dirac monopoles. Now a new experiment in this system shows how a quantum-mechanical monopole can evolve into a Dirac monopole.


[Phys. Rev. X 7, 021023] Published Wed May 17, 2017

Author(s): Kiran E. Khosla and Natacha Altamirano

The notion of time is given a different footing in quantum mechanics and general relativity, treated as a parameter in the former and as an observer-dependent property in the latter. From an operational point of view time is simply the correlation between a system and a clock, where an idealized …


[Phys. Rev. A 95, 052116] Published Wed May 17, 2017

Author(s): Felix Huber, Otfried Gühne, and Jens Siewert

Pure multiparticle quantum states are called absolutely maximally entangled if all reduced states obtained by tracing out at least half of the particles are maximally mixed. We provide a method to characterize these states for a general multiparticle system. With that, we prove that a seven-qubit st…


[Phys. Rev. Lett. 118, 200502] Published Wed May 17, 2017
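
For concreteness (a standard definition restated for orientation, not taken from the truncated abstract): a pure state $|\psi\rangle$ of $n$ parties with local dimension $d$ is absolutely maximally entangled if every reduction to a subset $S$ of at most $\lfloor n/2 \rfloor$ parties is maximally mixed,
$$\rho_S \;=\; \mathrm{Tr}_{\bar{S}}\,|\psi\rangle\langle\psi| \;=\; \frac{\mathbb{1}}{d^{|S|}} \quad \text{for all } |S| \le \lfloor n/2 \rfloor.$$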

Abstract

The appearance of negative terms in quasiprobability representations of quantum theory is known to be inevitable, and, due to its equivalence with the onset of contextuality, of central interest in quantum computation and information. Until recently, however, nothing has been known about how much negativity is necessary in a quasiprobability representation. Zhu (Phys Rev Lett 117 (12):120404, 2016) proved that the upper and lower bounds with respect to one type of negativity measure are saturated by quasiprobability representations which are in one-to-one correspondence with the elusive symmetric informationally complete quantum measurements (SICs). We define a family of negativity measures which includes Zhu’s as a special case and consider another member of the family which we call “sum negativity.” We prove a sufficient condition for local maxima in sum negativity and find exact global maxima in dimensions 3 and 4. Notably, we find that Zhu’s result on the SICs does not generally extend to sum negativity, although the analogous result does hold in dimension 4. Finally, the Hoggar lines in dimension 8 make an appearance in a conjecture on sum negativity.
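
As orientation (the family of measures defined in the paper is more general, and "sum negativity" need not coincide with this expression), a commonly used negativity measure for a quasiprobability representation $q_\rho(\lambda)$ with $\sum_\lambda q_\rho(\lambda) = 1$ is
$$\mathcal{N}(\rho) \;=\; \sum_\lambda \bigl|q_\rho(\lambda)\bigr| - 1 \;=\; 2\sum_{\lambda:\,q_\rho(\lambda)<0} \bigl|q_\rho(\lambda)\bigr|,$$
which vanishes exactly when the representation assigns no negative weight to $\rho$.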

Quantum Weirdness

Author: William J. Mullin
ISBN: 9780198795131
Binding: Hardcover
Publication Date: 16 May 2017
Price: $39.95

Author(s): Fernando Pastawski and John Preskill

Deep theoretical links may exist between how space encodes information and the error-correcting codes being developed for quantum computers. A new analysis explores these connections further and offers insights not just into error-correcting codes but also into how we interpret ideas about spacetime.


[Phys. Rev. X 7, 021022] Published Mon May 15, 2017

Nature Physics. doi:10.1038/nphys4118

Authors: Yiqiu Ma, Haixing Miao, Belinda Heyun Pang, Matthew Evans, Chunnong Zhao, Jan Harms, Roman Schnabel & Yanbei Chen

Nature Physics. doi:10.1038/nphys4152

Author: Raffaele Flaminio

The Einstein–Podolsky–Rosen type of quantum entanglement can be used to improve the sensitivity of laser interferometer gravitational-wave detectors beyond the quantum limit.

Publication date: Available online 9 May 2017
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Silvia De Bianchi
This paper focuses on Hermann Weyl’s two-component theory and frames it within the early development of different theories of spinors and the history of the discovery of parity violation in weak interactions. In order to show the implications of Weyl’s theory, the paper discusses the case study of Ettore Majorana’s symmetric theory of electron and positron (1937), as well as its role in inspiring Case’s formulation of parity violation for massive neutrinos in 1957. In doing so, this paper clarifies the relevance of Weyl’s and Majorana’s theories for the foundations of neutrino physics and emphasizes which conceptual aspects of Weyl’s approach led to Lee’s and Yang’s works on neutrino physics and to the solution of the theta-tau puzzle in 1957. This contribution thus sheds light on the alleged “re-discovery” of Weyl’s and Majorana’s theories in 1957, by showing that this did not happen all of a sudden. On the contrary, the scientific community was well versed in applying these theories in the 1950s on the ground of previous studies that involved important actors in both Europe and the United States.

Author(s): N. Nikitin and K. Toms

In this paper we propose a time-independent equality and a time-dependent inequality, suitable for an experimental test of the hypothesis of realism. The derivation of these relations is based on the concept of conditional probability and on Bayes' theorem in the framework of Kolmogorov's axiomatics o…


[Phys. Rev. A 95, 052103] Published Fri May 05, 2017

Author(s): Wenchao Ma, Bin Chen, Ying Liu, Mengqi Wang, Xiangyu Ye, Fei Kong, Fazhan Shi, Shao-Ming Fei, and Jiangfeng Du

The uncertainty principle is considered to be one of the most striking features in quantum mechanics. In the textbook literature, uncertainty relations usually refer to the preparation uncertainty which imposes a limitation on the spread of measurement outcomes for a pair of noncommuting observables…


[Phys. Rev. Lett. 118, 180402] Published Thu May 04, 2017

Author(s): Alexey A. Gorlach, Maxim A. Gorlach, Andrei V. Lavrinenko, and Andrey Novitsky

Theory shows that the quantum-mechanical wave of a beam of particles can exert a pulling force on a small particle, just as other waves do.


[Phys. Rev. Lett. 118, 180401] Published Thu May 04, 2017

Abstract

We discuss the problems of quantum theory (QT) that complicate its merging with general relativity (GR). QT is treated as a general theory of micro-phenomena—a bunch of models. Quantum mechanics (QM) and quantum field theory (QFT) are the most widely known (but, e.g., Bohmian mechanics is also a part of QT). The basic problems of QM and QFT are considered in interrelation. For QM, we stress its nonrelativistic character and the presence of spooky action at a distance. For QFT, we highlight the old problem of infinities. And this is the main point of the paper: it is meaningless to try to unify QFT, which suffers so heavily from infinities, with GR. We also highlight difficulties of the QFT treatment of entanglement. We compare the QFT- and QM-based measurement theories by presenting both theoretical and experimental viewpoints. Then we discuss two basic mathematical constraints of both QM and QFT, namely, the use of real (and, hence, complex) numbers and the Hilbert state space. We briefly present non-archimedean and non-hilbertian approaches to QT and their consequences. Finally, we claim that, in spite of the Bell theorem, it is still possible to treat quantum phenomena on the basis of a classical-like causal theory. We present a random field model generating the QM and QFT formalisms. This emergence viewpoint can serve as the basis for the unification of a novel QT (which may be totally different from the presently powerful QM and QFT) and GR. (It may happen that the latter would also be radically modified.)

Nature Physics 13, 518 (2017). doi:10.1038/nphys4126

Author: Alberto Moscatelli

Alberto Moscatelli surveys a series of experiments on the electron g-factor that marked the departure from the Dirac equation and contributed to the development of quantum electrodynamics.

Nature Physics 13, 419 (2017). doi:10.1038/nphys4134

Author: Yun Li

Publication date: Available online 29 April 2017
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Iulian D. Toader


Abstract

Quantum trajectory-based descriptions of interference between two coherent stationary waves in a double-slit experiment are presented, as given by the de Broglie–Bohm (dBB) and modified de Broglie–Bohm (MdBB) formulations of quantum mechanics. In the dBB trajectory representation, interference between two spreading wave packets can also be shown as resulting from the motion of particles. But a trajectory explanation for interference between stationary states is so far not available in this scheme. We show that both the dBB and MdBB trajectories are capable of producing the interference pattern for stationary as well as wave packet states. However, the dBB representation is found to provide the ‘which-way’ information that helps to identify the hole through which the particle emanates. On the other hand, the MdBB representation does not provide any which-way information while giving a satisfactory explanation of the interference phenomenon in tune with de Broglie’s wave–particle duality. By counting the trajectories reaching the screen, we have numerically evaluated the intensity distribution of the fringes and found very good agreement with the standard results.
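
For reference, the dBB trajectories discussed here follow the standard single-particle guidance equation
$$\frac{d\mathbf{x}}{dt} \;=\; \frac{\hbar}{m}\,\mathrm{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\bigg|_{\mathbf{x}=\mathbf{x}(t)} \;=\; \frac{\nabla S}{m}, \qquad \psi = R\,e^{iS/\hbar},$$
so that counting trajectory arrivals on the screen reproduces the $|\psi|^2$ intensity when the initial positions are $|\psi|^2$-distributed; the MdBB scheme instead propagates complex trajectories generated by the analogous complex velocity field, a point we note only schematically here.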

Publication date: Available online 20 April 2017
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Jeffrey A. Barrett
Hugh Everett III presented pure wave mechanics, sometimes referred to as the many-worlds interpretation, as a solution to the quantum measurement problem. While pure wave mechanics is an objectively deterministic physical theory with no probabilities, Everett sought to show how the theory might be understood as making the standard quantum statistical predictions as appearances to observers who were themselves described by the theory. We will consider his argument and how it depends on a particular notion of branch typicality. We will also consider responses to Everett and the relationship between typicality and probability. The suggestion will be that pure wave mechanics requires a number of significant auxiliary assumptions in order to make anything like the standard quantum predictions.

Author(s): Miloslav Znojil, Iveta Semorádová, František Růžička, Hafida Moulla, and Ilhem Leghrib

During recent developments in quantum theory it has been clarified that observable quantities (such as energy and position) may be represented by operators Λ (with real spectra) that are manifestly non-Hermitian in a preselected friendly Hilbert space H(F). The consistency of these models is known t…


[Phys. Rev. A 95, 042122] Published Tue Apr 18, 2017
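
A compact way to phrase the consistency requirement alluded to above, stated in the standard quasi-Hermitian language rather than in the paper's own conventions, is that the manifestly non-Hermitian $\Lambda$ satisfies
$$\Lambda^{\dagger}\,\Theta \;=\; \Theta\,\Lambda$$
for some positive-definite metric operator $\Theta$ on H(F); $\Lambda$ is then Hermitian with respect to the modified inner product $\langle\phi|\Theta|\psi\rangle$, which is what guarantees the reality of its spectrum.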

Author(s): Robin Harper, Robert J. Chapman, Christopher Ferrie, Christopher Granade, Richard Kueng, Daniel Naoumenko, Steven T. Flammia, and Alberto Peruzzo

We propose a framework for the systematic and quantitative generalization of Bell's theorem using causal networks. We first consider the multiobjective optimization problem of matching observed data while minimizing the causal effect of nonlocal variables and prove an inequality for the optimal regi…


[Phys. Rev. A 95, 042120] Published Mon Apr 17, 2017
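
For context, the standard bipartite instance that such causal-network generalizations start from is the CHSH inequality, quoted here as background rather than as the paper's optimal region:
$$\bigl|\langle A_0 B_0\rangle + \langle A_0 B_1\rangle + \langle A_1 B_0\rangle - \langle A_1 B_1\rangle\bigr| \;\le\; 2,$$
which holds for all local hidden-variable models and is violated by quantum correlations up to $2\sqrt{2}$.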