Weekly Papers on Quantum Foundations (39)

Authors: Selman Ipek, Ariel Caticha

In the Entropic Dynamics (ED) framework quantum theory is derived as an application of entropic methods of inference. The physics is introduced through appropriate choices of variables and of constraints that codify the relevant physical information. In previous work, a manifestly covariant ED of quantum scalar fields in a fixed background spacetime was developed. Manifest relativistic covariance was achieved by imposing constraints in the form of Poisson brackets and of initial conditions to be satisfied by a set of local Hamiltonian generators. Our approach succeeded in extending to the quantum domain the classical framework that originated with Dirac and was later developed by Teitelboim and Kuchař. In the present work the ED of quantum fields is extended further by allowing the geometry of spacetime to fully partake in the dynamics. The result is a first-principles ED model that in one limit reproduces quantum mechanics and in another limit reproduces classical general relativity. Our model shares some formal features with the so-called semi-classical approach to gravity.
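
For context, the covariance conditions alluded to here are the Poisson-bracket relations obeyed by the local generators of normal and tangential hypersurface deformations, $H(x)$ and $H_a(x)$. In the classical Dirac–Teitelboim form (quoted as standard background, up to sign and argument conventions, not as the paper's specific construction) they read

$$\{H(x),H(x')\}=\big(g^{ab}(x)H_b(x)+g^{ab}(x')H_b(x')\big)\,\partial_a\delta(x,x'),$$
$$\{H_a(x),H(x')\}=H(x)\,\partial_a\delta(x,x'),$$
$$\{H_a(x),H_b(x')\}=H_b(x)\,\partial_a\delta(x,x')+H_a(x')\,\partial_b\delta(x,x').$$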

12:34 PM | quant-ph updates on arXiv.org

Authors: Valentina Gualtieri, Claudia Benedetti, Matteo G. A. Paris

We introduce a fidelity-based measure of nonclassicality Q to quantify the differences between the dynamics of classical and quantum continuous-time walks over a graph. We provide universal, graph-independent, analytic expressions of Q, showing that at short times the nonclassicality of quantum walks is due to the appearance of coherence, whereas for long times it depends only on the size of the graph, with quantumness contributing only partially to the overall nonclassicality. At intermediate times, nonclassicality instead depends on the graph topology through its algebraic connectivity.
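
A minimal numerical sketch of the kind of comparison involved is given below. It is not the authors' definition of Q: the graph, evolution time, initial node, and the proxy $1-F$ between the quantum and classical states are choices made here purely for illustration. Both walks are generated by the graph Laplacian $L$ (quantum, $e^{-iLt}$; classical, $e^{-Lt}$).

```python
# Illustrative sketch, not the paper's code: compare a continuous-time quantum
# walk with its classical counterpart on a small graph and use 1 - fidelity as
# a stand-in for a fidelity-based nonclassicality quantifier.
import numpy as np
from scipy.linalg import expm

def cycle_laplacian(n):
    """Graph Laplacian L = D - A of the n-node cycle."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

n, t = 8, 1.5
L = cycle_laplacian(n)

psi0 = np.zeros(n, dtype=complex); psi0[0] = 1.0   # walker starts on node 0
psi_t = expm(-1j * L * t) @ psi0                   # quantum walk: |psi(t)> = e^{-iLt}|psi(0)>

p0 = np.zeros(n); p0[0] = 1.0
p_t = expm(-L * t) @ p0                            # classical walk: p(t) = e^{-Lt} p(0)
rho_c = np.diag(p_t)                               # classical state as a diagonal density matrix

# For a pure quantum state, the Uhlmann fidelity reduces to F = <psi|rho_c|psi>.
F = np.real(psi_t.conj() @ rho_c @ psi_t)
print("proxy nonclassicality 1 - F =", 1 - F)
```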

12:34 PM | quant-ph updates on arXiv.org

Authors: Johannes Bausch, Toby S. Cubitt, James D. Watson

The phase diagram of a material is of central importance to describe the properties and behaviour of a condensed matter system. We prove that the general task of determining the quantum phase diagram of a many-body Hamiltonian is uncomputable, by explicitly constructing a one-parameter family of Hamiltonians for which this is the case. This work builds on recent results from Cubitt et al. and Bausch et al. proving undecidability of the spectral gap problem. However, in all previous constructions, the Hamiltonian was necessarily a discontinuous function of its parameters, making it difficult to derive rigorous implications for phase diagrams or related condensed matter questions. Our main technical contribution is to prove undecidability of the spectral gap for a continuous, single-parameter family of translationally invariant, nearest-neighbour spin-lattice Hamiltonians on a 2D square lattice: $H(\varphi)$ where $\varphi\in \mathbb R$. As well as implying uncomputability of phase diagrams, our result also proves that undecidability can hold for a set of positive measure of a Hamiltonian’s parameter space, whereas previous results only implied undecidability on a zero-measure set.

Authors: Kelvin Onggadinata, Matthew J. Lake, Tomasz Paterek

The Schrödinger-Newton equation is a proposed model to explain the localisation of macroscopic particles by suppressing quantum dispersion with the particle’s own gravitational attraction. On cosmic scales, however, dark energy acts repulsively, as witnessed by the accelerating rate of universal expansion. Here, we introduce the effects of dark energy, in the form of the cosmological constant $\Lambda$ that drives the late-time acceleration of the universe, into the Schrödinger-Newton approach. We then ask in which regime dark energy dominates both canonical quantum diffusion and gravitational self-attraction. It turns out that this happens for sufficiently delocalised objects of arbitrary mass and that there exists a minimal delocalisation width of about $67$ meters. While extremely macroscopic from a quantum perspective, this value is in principle accessible to laboratories on Earth. Hence, we analyse numerically how the dynamics of an initially spherical Gaussian wave packet is modified in the presence of $\Lambda > 0$. A notable feature is the gravitational collapse of part of the wave packet, in the core region close to the centre of mass, accompanied by the accelerated expansion of the more distant shell surrounding it. The order of magnitude of the distance separating collapse from expansion matches analytical estimates of the classical turnaround radius for a spherically symmetric body in the presence of dark energy. However, the time required to observe these modifications is astronomical. They can potentially be measured only in physical systems simulating a high effective cosmological constant, or, possibly, via their effects on the inflationary universe.
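
For orientation, a commonly quoted form of the Schrödinger-Newton equation with the weak-field (de Sitter) contribution of the cosmological constant added is

$$i\hbar\,\partial_t\psi(\mathbf r,t)=\left[-\frac{\hbar^2}{2m}\nabla^2-Gm^2\!\int\!\frac{|\psi(\mathbf r',t)|^2}{|\mathbf r-\mathbf r'|}\,d^3r'-\frac{\Lambda c^2 m}{6}\,r^2\right]\psi(\mathbf r,t),$$

where the last term is the repulsive inverted-oscillator potential obtained in the Newtonian limit for $\Lambda>0$. This is a sketch of the standard form; the precise conventions used in the paper may differ.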

Authors: Rubén Arjona, Savvas Nesseris

Machine learning algorithms have revolutionized the way we interpret data in astronomy, particle physics, biology and even economics, since they can remove biases due to a priori chosen models. Here we apply a specific machine learning method, genetic algorithms (GA), to cosmological data that describe the background expansion of the Universe: the Pantheon Type Ia supernovae and the Hubble expansion history $H(z)$ datasets. We obtain model-independent and non-parametric reconstructions of the luminosity distance $d_L(z)$ and Hubble parameter $H(z)$ without assuming any dark energy model or a flat Universe. We then estimate the deceleration parameter $q(z)$, a measure of the acceleration of the Universe, make a $\sim4.5\sigma$ model-independent detection of the accelerated expansion, and place constraints on the transition redshift of the acceleration phase $(z_{\textrm{tr}}=0.662\pm0.027)$. We also confirm a recently reported mild tension between the SnIa/quasar data and the cosmological constant $\Lambda$CDM model at high redshifts $(z\gtrsim1.5)$ and, finally, we show that the GA can be used in complementary null tests of $\Lambda$CDM via reconstructions of the Hubble parameter and the luminosity distance.
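
As a rough sketch of the post-reconstruction step (not the authors' GA pipeline), the deceleration parameter follows from any smooth $H(z)$ via $q(z)=(1+z)\,H'(z)/H(z)-1$, and the transition redshift is the root of $q(z)=0$. In the snippet below a fiducial flat $\Lambda$CDM expansion history with assumed $H_0$ and $\Omega_m$ merely stands in for a GA-reconstructed $H(z)$.

```python
# Illustrative sketch: deceleration parameter and transition redshift from H(z).
import numpy as np

H0, Om = 70.0, 0.3                                 # assumed fiducial values
z = np.linspace(0.0, 2.0, 2001)
H = H0 * np.sqrt(Om * (1 + z)**3 + 1 - Om)         # placeholder for the reconstructed H(z)

dHdz = np.gradient(H, z)                           # numerical derivative H'(z)
q = (1 + z) * dHdz / H - 1                         # deceleration parameter q(z)

z_tr = z[np.argmin(np.abs(q))]                     # crude estimate of where q(z) crosses zero
print(f"q(0) = {q[0]:.3f}, transition redshift ~ {z_tr:.3f}")
```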

Friday, October 4, 2019, 8:00 AM | Latest Results for Foundations of Physics

Abstract

The Hole Argument is primarily about the meaning of general covariance in general relativity. As such it raises many deep issues about identity in mathematics and physics, the ontology of space–time, and how scientific representation works. This paper is about the application of a new foundational programme in mathematics, namely homotopy type theory (HoTT), to the Hole Argument. It is argued that the framework of HoTT provides a natural resolution of the Hole Argument. The role of the Univalence Axiom in the treatment of the Hole Argument in HoTT is clarified.

Friday, October 4, 2019, 8:00 AM | Latest Results for Foundations of Physics

Abstract

The usual representation of quantum algorithms is limited to the process of solving the problem. We extend it to the process of setting the problem. Bob, the problem setter, selects a problem-setting by the initial measurement. Alice, the problem solver, unitarily computes the corresponding solution and reads it by the final measurement. This simple extension creates a new perspective from which to see the quantum algorithm. First, it highlights the relevance of time-symmetric quantum mechanics to quantum computation: the problem-setting and problem solution, in their quantum version, constitute pre- and post-selection, hence the process as a whole is bound to be affected by both boundary conditions. Second, it forces us to enter into relational quantum mechanics. There must be a representation of the quantum algorithm with respect to Bob, and another one with respect to Alice, from whom the outcome of the initial measurement, specifying the setting and thus the solution of the problem, must be concealed. Time-symmetrizing the quantum algorithm to take into account both boundary conditions leaves the representation to Bob unaltered. It shows that the representation to Alice is a sum over histories in each of which she remains shielded from the information coming to her from the initial measurement, not from that coming to her backwards in time from the final measurement. In retrospect, all is as if she knew in advance, before performing her problem-solving action, half of the information that specifies the solution of the problem she will read in the future and could use this information to reach the solution with fewer computation steps (oracle queries). This elucidates the quantum computational speedup in all the quantum algorithms examined.

Friday, October 4, 2019, 3:28 AM | Philsci-Archive: No conditions. Results ordered -Date Deposited.
Gomes, Henrique (2019) Holism as the significance of gauge symmetries. [Preprint]
Friday, October 4, 2019, 3:21 AM | Philsci-Archive: No conditions. Results ordered -Date Deposited.
Dougherty, John (2019) Large gauge transformations and the strong CP problem. Studies in History and Philosophy of Modern Physics. ISSN 1355-2198
Thursday, October 3, 2019, 6:00 PM | Alejandro Pozas-Kerstjens, Rafael Rabelo, Łukasz Rudnicki, Rafael Chaves, Daniel Cavalcanti, Miguel Navascués, and Antonio Acín | PRL: General Physics: Statistical and Quantum Mechanics, Quantum Information, etc.

Author(s): Alejandro Pozas-Kerstjens, Rafael Rabelo, Łukasz Rudnicki, Rafael Chaves, Daniel Cavalcanti, Miguel Navascués, and Antonio Acín

We present a method that allows the study of classical and quantum correlations in networks with causally independent parties, such as the scenario underlying entanglement swapping. By imposing relaxations of factorization constraints in a form compatible with semidefinite programming, it enables the…

[Phys. Rev. Lett. 123, 140503] Published Thu Oct 03, 2019

Tuesday, October 1, 2019, 8:00 AM | Swanson N. | The British Journal for the Philosophy of Science Advance Access
Abstract

Candidates for fundamental physical laws rarely, if ever, employ higher than second time derivatives. Easwaran ([2014]) sketches an enticing story that purports to explain away this puzzling fact and thereby provides indirect evidence for a particular set of metaphysical theses used in the explanation. I object to both the scope and coherence of Easwaran’s account, before going on to defend an alternative, more metaphysically deflationary explanation: in interacting Lagrangian field theories, it is either impossible or very hard to incorporate higher than second time derivatives without rendering the vacuum state unstable. The so-called Ostrogradski instability represents a powerful constraint on the construction of new field theories and supplies a novel, largely overlooked example of non-causal explanation in physics.
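
For readers unfamiliar with the constraint being invoked, the core of the Ostrogradski argument (in its standard textbook form, not specific to this paper) runs as follows. For a Lagrangian $L(q,\dot q,\ddot q)$ that is non-degenerate in $\ddot q$, one takes canonical variables

$$Q_1=q,\quad Q_2=\dot q,\quad P_1=\frac{\partial L}{\partial\dot q}-\frac{d}{dt}\frac{\partial L}{\partial\ddot q},\quad P_2=\frac{\partial L}{\partial\ddot q},$$

solves $\ddot q=a(Q_1,Q_2,P_2)$, and obtains the Hamiltonian

$$H=P_1Q_2+P_2\,a(Q_1,Q_2,P_2)-L\big(Q_1,Q_2,a(Q_1,Q_2,P_2)\big),$$

which is linear in $P_1$ and therefore unbounded below; in an interacting field theory this generically destabilizes the vacuum.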

The Information Geometry of Space-time

The method of maximum entropy is used to model curved physical space in terms of points defined with a finite resolution. Such a blurred space is automatically endowed with a metric given by information geometry. The corresponding space-time is such that the geometry of any embedded spacelike surface is given by its information geometry. The dynamics of blurred space, its geometrodynamics, is constructed by requiring that as space undergoes the deformations associated with evolution in local time, it sweeps a four-dimensional space-time. This reproduces Einstein’s equations for vacuum gravity. We conclude with brief comments on some of the peculiar properties of blurred space: There is a minimum length and blurred points have a finite volume. There is a relativistic “blur dilation”. The volume of space is a measure of its entropy.
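
The information metric referred to here is, up to the paper's specific choices, the standard Fisher–Rao metric on the family of blur distributions $p(x'|x)$ labelling each point $x$:

$$g_{ab}(x)=\int d^3x'\,p(x'|x)\,\frac{\partial\log p(x'|x)}{\partial x^a}\,\frac{\partial\log p(x'|x)}{\partial x^b},$$

which for an isotropic Gaussian blur of width $\sigma$ reduces to $g_{ab}=\delta_{ab}/\sigma^2$. This is quoted only as background on the construction, not as the paper's final metric.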

The Entropic Dynamics approach to Quantum Mechanics

Entropic Dynamics (ED) is a framework in which Quantum Mechanics is derived as an application of entropic methods of inference. In ED the dynamics of the probability distribution is driven by entropy subject to constraints that are codified into a quantity later identified as the phase of the wave function. The central challenge is to specify how those constraints are themselves updated. In this paper we review and extend the ED framework in several directions. A new version of ED is introduced in which particles follow smooth differentiable Brownian trajectories (as opposed to non-differentiable Brownian paths). To construct the ED we make use of the fact that the space of probabilities and phases has a natural symplectic structure (i.e., it is a phase space with Hamiltonian flows and Poisson brackets). Then, using an argument based on information geometry, a metric structure is introduced. It is shown that the ED that preserves the symplectic and metric structures — which is a Hamilton-Killing flow in phase space — is the linear Schrödinger equation. These developments allow us to discuss why wave functions are complex and the connections between the superposition principle, the single-valuedness of wave functions, and the quantization of electric charges. Finally, it is observed that Hilbert spaces are not necessary ingredients in this construction. They are a clever but merely optional trick that turns out to be convenient for practical calculations.
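
As background for the Hamilton-Killing-flow statement, a standard way to see the linear Schrödinger equation emerge from a Hamiltonian flow on the space of probabilities $\rho$ and phases $\Phi$ is the Madelung-type functional below; the coefficient of the Fisher term and the other conventions are ours, and the paper's construction may differ in detail:

$$\tilde H[\rho,\Phi]=\int d^3x\left[\rho\,\frac{|\nabla\Phi|^2}{2m}+\rho\,V+\frac{\hbar^2}{8m}\,\frac{|\nabla\rho|^2}{\rho}\right],\qquad \frac{\partial\rho}{\partial t}=\frac{\delta\tilde H}{\delta\Phi},\quad \frac{\partial\Phi}{\partial t}=-\frac{\delta\tilde H}{\delta\rho},$$

and with $\Psi=\sqrt{\rho}\,e^{i\Phi/\hbar}$ these two equations combine into $i\hbar\,\partial_t\Psi=\big[-\tfrac{\hbar^2}{2m}\nabla^2+V\big]\Psi$.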
