Weekly Papers on Quantum Foundations (51)

Publication date: Available online 30 December 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Karen Crowther
Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG is illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, nor can it serve usefully in the development and justification of the new theory.
The quantum computing era is upon us. Google is building up to a breakthrough on par with the launch of Sputnik or the splitting of the atom.
Koberinski, Adam (2017) Problems with the cosmological constant problem. In: UNSPECIFIED.

Authors: Juliusz Doboszewski, Niels Siegbert Linnemann

General relativity cannot be formulated as a perturbatively renormalizable quantum field theory. An argument relying on the validity of the Bekenstein-Hawking entropy formula aims at dismissing gravity as non-renormalizable per se, against hopes (underlying programs such as Asymptotic Safety) that d-dimensional GR could turn out to have a non-perturbatively renormalizable d-dimensional quantum field theoretic formulation. In this note we discuss various forms of highly problematic semi-classical extrapolations assumed by both sides of the debate concerning what we call The Entropy Argument, and show that a large class of dimensional reduction scenarios leads to the blow-up of Bekenstein-Hawking entropy.
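For reference, the semiclassical relation on which the Entropy Argument turns (not restated in the abstract) is the standard Bekenstein-Hawking formula, which assigns a black hole an entropy proportional to its horizon area $A$:

$$ S_{\rm BH} = \frac{k_B\, c^3 A}{4 G \hbar}. $$

The debate described above concerns how far this semiclassical expression can be extrapolated to constrain the microscopic degrees of freedom of a quantum theory of gravity.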

Authors: Clement Berthiere, Debajyoti Sarkar, Sergey N. Solodukhin

The presence of a horizon is the principal marker for black holes as they appear in the classical theory of gravity. In General Relativity (GR), horizons have several defining properties. First, there exists a static spherically symmetric solution to the vacuum Einstein equations which possesses a horizon, defined as a null surface on which the time-like Killing vector becomes null. Second, in GR, a co-dimension two sphere of minimal area is necessarily a horizon. On the quantum level, the classical gravitational action is supplemented by the quantum effective action obtained by integrating out the quantum fields propagating on a classical background. In this note we consider the case when the quantum fields are conformal and perform a certain non-perturbative analysis of the semiclassical equations obtained by varying the complete gravitational action. We show that, for these equations, neither of the above properties holds. More precisely, we prove that i) a static spherically symmetric metric that would describe a horizon with a finite Hawking temperature is, generically, not a solution; ii) a minimal 2-sphere is not a horizon but a tiny throat of a wormhole. We find certain bounds on the norm of the Killing vector at the throat and show that it is, while non-zero, an exponentially small function of the Bekenstein-Hawking (BH) entropy of the classical black hole. We also find that the possible temperature of the semiclassical geometry is exponentially small for large black holes. These findings suggest that a black hole in the classical theory can be viewed as a certain (singular) limit of the semiclassical wormhole geometry. We discuss the possible implications of our results.
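As a point of comparison (not part of the abstract), the classical Schwarzschild horizon of a black hole of mass $M$ carries the standard Hawking temperature

$$ T_H = \frac{\hbar c^3}{8 \pi G M k_B}, $$

which falls off only as $1/M$. The exponential suppression of the temperature found for the semiclassical wormhole geometry is therefore a qualitatively different behaviour from the classical case.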

Authors: Antti Veilahti

According to the Butterfield–Isham proposal, to understand quantum gravity we must revise the way we view the universe of mathematics. However, this paper demonstrates that the current elaborations of this programme neglect quantum interactions. The paper then introduces the Faddeev–Mickelsson anomaly, which obstructs the renormalization of Yang–Mills theory, suggesting that theorising about many-particle systems requires a many-topos view of mathematics itself: higher theory. As our main contribution, the topos-theoretic framework is used to conceptualise the fact that there are, in principle, three different quantisation problems, the differences between which have been ignored not just by topos physicists but by most philosophers of science. We further argue that if higher theory proves to be necessary for understanding quantum gravity, its implications for philosophy will be foundational: higher theory challenges the propositional concept of truth and thus the very meaning of theorising in science.

Authors: John Skilling, Kevin H. Knuth

Quantification starts with sum and product rules that express combination and partition. These rules rest on elementary symmetries that have wide applicability, which explains why arithmetical adding up and splitting into proportions are ubiquitous. Specifically, measure theory formalises addition, and probability theory formalises inference in terms of proportions. Quantum theory rests on the same simple symmetries, but is formalised in two dimensions, not just one, in order to track an object through its binary interactions with other objects. The symmetries still require sum and product rules (here known as the Feynman rules), but they apply to complex numbers instead of real scalars, with observable probabilities being modulus-squared (known as the Born rule). The standard quantum formalism follows. There is no mystery or weirdness, just ordinary probabilistic inference.
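Schematically, the two rules referred to can be written as follows (this is only a compact restatement of the standard Feynman sum and product rules and the Born rule, not the authors' derivation): for a process from $A$ to $C$ proceeding through intermediate alternatives $B_i$, the complex amplitudes obey

$$ \psi_{A \to C} = \sum_i \psi_{A \to B_i}\, \psi_{B_i \to C}, \qquad P_{A \to C} = |\psi_{A \to C}|^2, $$

where the product rule governs sequential steps, the sum rule governs indistinguishable alternatives, and the modulus-squared gives the observable probability.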

Authors: Nuno Costa Dias, Maurice A. de Gosson, Joao Nuno Prata

We propose a refinement of the Robertson-Schrödinger uncertainty principle (RSUP) using Wigner distributions. This new principle is stronger than the RSUP. In particular, and unlike the RSUP, which can be saturated by many phase space functions, the refined RSUP can be saturated only by pure Gaussian Wigner functions. Moreover, the new principle is technically as simple as the standard RSUP. In addition, it makes a direct connection with modern harmonic analysis, since it involves the Wigner transform and its symplectic Fourier transform, which is the radar ambiguity function. As a by-product of the refined RSUP, we derive inequalities involving the entropy and the covariance matrix of Wigner distributions. These inequalities refine the Shannon and Hirschman inequalities for the Wigner distribution of a mixed quantum state $\rho$. We prove sharp estimates which depend critically on the purity of $\rho$ and which are saturated in the Gaussian case.
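For orientation, the standard RSUP that is being refined reads, for one degree of freedom,

$$ \sigma_x^2\, \sigma_p^2 - \sigma_{xp}^2 \;\geq\; \frac{\hbar^2}{4}, $$

where $\sigma_{xp}$ is the position-momentum covariance. The refinement reported in the abstract adds terms built from the Wigner transform and its symplectic Fourier transform, so that only pure Gaussian Wigner functions can saturate the bound.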

Authors: Q. Duprey, A. Matzkin

We discuss the preceding Comment and conclude that the arguments given there against the relevance of null weak values as representing the absence of a system property are not compelling. We give an example in which the transition matrix elements that make the projector weak values vanish are the same ones that suppress detector clicks in strong measurements. Whether or not weak values are taken to account for the past of a quantum system depends on general interpretational commitments regarding the quantum formalism itself rather than on peculiarities of the weak measurement framework.
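For context (this definition is standard and not restated in the abstract), the weak value of an observable $\hat{A}$ for a system pre-selected in $|\psi\rangle$ and post-selected in $|\phi\rangle$ is

$$ A_w = \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle}, $$

so a null weak value of a projector corresponds to a vanishing transition matrix element, which is the quantity the authors link to suppressed detector clicks in strong measurements.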

Authors: Keren Li, Youning Li, Muxin Han, Sirui Lu, Jie Zhou, Dong Ruan, Guilu Long, Yidun Wan, Dawei Lu, Bei Zeng, Raymond Laflamme

We experimentally simulate spin networks, a fundamental description of quantum spacetime at the Planck scale. We achieve this by simulating quantum tetrahedra and their interactions; the tensor product of these quantum tetrahedra comprises spin networks. In this initial attempt to study quantum spacetime by quantum information processing, we simulate, on a four-qubit nuclear magnetic resonance quantum simulator, the basic module of the interactions of quantum spacetime, comprising five quantum tetrahedra. By measuring the geometric properties of the corresponding quantum tetrahedra and simulating their interactions, our experiment realises the basic module that represents the Feynman diagram vertex in the spin-network formulation of quantum spacetime.

To reconcile quantum mechanics and gravity, a new theory flips the usual script. Space-time and gravity may emerge from quantum effects, not the other way round.
