Latest Papers on Quantum Foundations - Updated Daily by IJQF

How the mind can make sense of quantum physics in more ways than one 

-- Read more on ScientificAmerican.com



Author(s): P. R. Dieguez and R. M. Angelo

Relations among the concepts of measurement, information, and physical reality are established with the quantification of the degree of reality of an observable for a given preparation. It is found that for pure states the entanglement with the apparatus precisely determines the amount by which the reality of the monitored observable increases.


[Phys. Rev. A 97, 022107] Published Fri Feb 16, 2018

Moulavi Ardakani, Reza (2017) Time Reversal Invariance in Quantum Mechanics.
McQueen, Kelvin J. and Vaidman, Lev (2018) In defence of the self-location uncertainty account of probability in the many-worlds interpretation. [Preprint]

Authors: Ram Brustein, A.J.M. Medved, Yoav Zigdon

We show that the state of the Hawking radiation emitted from a large Schwarzschild black hole (BH) deviates significantly from a classical state, in spite of its apparent thermal nature. For this state, the occupation numbers of single modes of massless asymptotic fields, such as photons, gravitons and possibly neutrinos, are small and, as a result, their relative fluctuations are large. The occupation numbers of massive fields are much smaller and suppressed beyond even the expected Boltzmann suppression. It follows that this type of thermal state cannot be viewed as classical or even semiclassical. We substantiate this claim by showing that, in a state with low occupation numbers, physical observables have large quantum fluctuations and, as such, cannot be faithfully described by a mean-field or by a WKB-like semiclassical state. Since the evolution of the BH is unitary, our results imply that the state of the BH interior must also be non-classical when described in terms of the asymptotic fields. We show that such a non-classical interior cannot be described in terms of a semiclassical geometry, even though the average curvature is sub-Planckian.
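The link between small occupation numbers and large relative fluctuations can be illustrated with textbook Bose-Einstein statistics (a sketch independent of the paper's black-hole calculation): a thermal mode with mean occupation n̄ has variance n̄(n̄ + 1), so the relative fluctuation is √(1 + 1/n̄), which blows up as n̄ → 0.

```python
import math

def occupation(x):
    """Mean occupation of a thermal Bose mode, x = (hbar * omega) / (k_B * T)."""
    return 1.0 / math.expm1(x)

def relative_fluctuation(n_bar):
    """sqrt(<dn^2>) / <n> for a thermal mode, using <dn^2> = n_bar * (n_bar + 1)."""
    return math.sqrt(n_bar * (n_bar + 1.0)) / n_bar

# Low-frequency (soft) modes are highly occupied; hard modes are not.
for x in (0.1, 1.0, 5.0):
    n = occupation(x)
    print(f"x = {x:4.1f}  n_bar = {n:8.4f}  dn/n_bar = {relative_fluctuation(n):7.3f}")
```

For x = 5 the mode holds far less than one quantum on average and its relative fluctuation exceeds 10, which is the sense in which such a state cannot be treated as a classical mean field.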

Authors: Vyshnav Mohan

In this paper, we will show that gravity can emerge from an effective field theory, obtained by tracing out the fermionic system from an interacting quantum field theory, when we impose the condition that the field equations must be Cauchy predictable. The source of the gravitational field can be identified with the quantum interactions that existed in the interacting QFT. This relation is very similar to the ER=EPR conjecture and strongly relies on the fact that the emergence of a classical theory depends on the underlying quantum processes and interactions. We consider two concrete examples for reaching the result: one where initially there was no gravity, and another where gravity was present. The latter case results in first-order corrections to Einstein's equations and immediately reproduces well-known results such as effective event horizons and gravitational birefringence.

Authors: R. Friedberg, P. C. Hohenberg

This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called 'microscopic theory' (MIQM), applicable to any closed system $S$ of arbitrary size $N$, using concepts referring to $S$ alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.
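For reference, the step from Gleason's theorem to the Born rule invoked above can be stated compactly in its standard textbook form (the notation here is the usual one, not necessarily the paper's):

```latex
% Gleason's theorem: for \dim\mathcal{H} \ge 3, every countably additive
% probability measure \mu on the projections P of \mathcal{H} has the form
\mu(P) = \mathrm{Tr}(\rho P), \qquad \rho \ge 0, \quad \mathrm{Tr}\,\rho = 1 .
% Specialising to a pure state \rho = |\psi\rangle\langle\psi| gives the Born rule:
\mu(P) = \langle \psi | P | \psi \rangle .
```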

Authors: Eugenio Bianchi, Marios Christodoulou, Fabio D'Ambrosio, Hal M. Haggard, Carlo Rovelli

Quantum tunneling of a black hole into a white hole provides a model for the full life cycle of a black hole. The white hole acts as a long-lived remnant, solving the black-hole information paradox. The remnant solution of the paradox has long been viewed with suspicion, mostly because remnants seemed to be such exotic objects. We point out that (i) established physics includes objects with precisely the required properties for remnants: white holes with small masses but large finite interiors; (ii) non-perturbative quantum gravity indicates that a black hole tunnels precisely into such a white hole, at the end of its evaporation. We address the objections to the existence of white-hole remnants, discuss their stability, and show how the notions of entropy relevant in this context allow them to evade several no-go arguments. A black hole's formation, evaporation, tunneling to a white hole, and final slow decay form a unitary process that does not violate any known physics.

Authors: Flavio Del Santo, Chiara Cardelli

Commonly accepted views on the foundations of science, based either on bottom-up construction or on top-down reduction of fundamental entities, are here rejected. We show how the current scientific methodology entails a certain kind of search for foundations of science, which is here regarded as facing insurmountable limitations. At the same time, this methodology allows one to surpass bounds classically accepted as fundamental, yet often based on mere "philosophical prejudices". Practical examples are provided from quantum mechanics and biophysics.

The idea that the future can influence the past may finally explain the inherent randomness of quantum theory and bring it in line with Einstein's space-time
Dewar, Neil and Weatherall, James Owen (2017) On Gravitational Energy in Newtonian Theories. [Preprint]

Abstract

It is generally argued that if the wave-function in the de Broglie–Bohm theory is a physical field, it must be a field in configuration space. Nevertheless, it is possible to interpret the wave-function as a multi-field in three-dimensional space. This approach has not yet received the attention it really deserves. The aim of this paper is threefold: first, we show that the wave-function is naturally and straightforwardly construed as a multi-field; second, we show why this interpretation is superior to other interpretations discussed in the literature; third, we clarify common misconceptions.

Maxwell, Nicholas (2018) Misunderstanding Understanding Scientific Progress. [Preprint]

Authors: Rodolfo Gambini, Jorge Pullin

In the 1960s, Mandelstam proposed a new approach to gauge theories and gravity based on loops. The program for gauge theories was completed for Yang--Mills theories by Gambini and Trias in the 1980s. Gauge theories could be understood as representations of a certain group: the group of loops. The same formalism could not be implemented at that time for the gravitational case. Here we would like to propose an extension to the case of gravity. The resulting theory is described in terms of loops and open paths and can provide the underpinning for a new quantum representation for gravity distinct from the one used in loop quantum gravity or string theory. In it, space-time points are emergent entities that would only have quasi-classical status. The formulation may be given entirely in terms of Dirac observables that form a complete set of gauge invariant functions that completely define the Riemannian geometry of the spacetime. At the quantum level this formulation will lead to a reduced phase space quantization free of any constraints.

Authors: Denis Bashkirov

We suggest an interpretation of the Einstein Equations of General Relativity at large scales in which the Cosmological Constant is exactly zero in the limit of zero spacetime variations of the fundamental constants. We argue that in a quasiclassical Universe such variation should be tiny, which leads to a tiny value for the Dark Energy. Next, we suggest that there are two sources of the Dark Energy. The first is the variation in Newton's constant $G_N$. It is a form of Dark Energy in that it has negative pressure, but it differs from the Cosmological Constant by a negative contribution to the energy. The second is the contribution of (causal) nonlocalities to the Dark Energy.

This comes together with a particular view of Quantum Mechanics and the wavefunction collapse, in particular. The collapse is neither dynamical nor subjective.

Many of those involved in the race to unleash the power of quantum computing predict it will happen soon. Here's why, says Graeme Malcolm
Research could lead to more robust quantum-information systems

Author(s): Robert B. Griffiths

While much of the technical analysis in the preceding Comment is correct, in the end it confirms the conclusion reached in my previous work [Phys. Rev. A 94, 032115 (2016)]: A consistent histories analysis provides no support for the claim of counterfactual quantum communication put forward by Salih...


[Phys. Rev. A 97, 026102] Published Fri Feb 09, 2018

Castellani, Elena (2018) Scientific Methodology: A View from Early String Theory. [Preprint]

Abstract

A proposal is made for a fundamental theory, in which the history of the universe is constituted of diverse views of itself. Views are attributes of events, and the theory’s only be-ables; they comprise information about energy and momentum transferred to an event from its causal past. A dynamics is proposed for a universe constituted of views of events, which combines the energetic causal set dynamics with a potential energy based on a measure of the distinctiveness of the views, called the variety (Smolin in Found Phys 46(6):736–758, 2016). As in the real ensemble formulation of quantum mechanics (Barbour and Smolin in Variety, complexity and cosmology, arXiv: hep-th/9203041), quantum pure states are associated to ensembles of similar events; the quantum potential of Bohm then arises from the variety.

ROVELLI, Carlo (2018) Space and Time in Loop Quantum Gravity. [Preprint]
Cuffaro, Michael E. (2018) Causality and Complementarity in Kant, Hermann, and Bohr. [Preprint]
Johns, Richard (2018) Epistemic Theories of Objective Chance. [Preprint]
Oldofredi, Andrea (2018) Stochasticity and Bell-type Quantum Field Theory. [Preprint]
Farr, Matt (2015) Review of Tim Maudlin, "Philosophy of Physics: Space and Time". Philosophy in Review, Vol 35 (4).
Vickers, Peter (2018) Disarming the Ultimate Historical Challenge to Scientific Realism. The British Journal for the Philosophy of Science.

Abstract

Various claims regarding intertheoretic reduction, weak and strong notions of emergence, and explanatory fictions have been made in the context of first-order thermodynamic phase transitions. By appealing to John Norton’s recent distinction between approximation and idealization, I argue that the case study of anyons and fractional statistics, which has received little attention in the philosophy of science literature, is more hospitable to such claims. In doing so, I also identify three novel roles that explanatory fictions fulfill in science. Furthermore, I scrutinize the claim that anyons, as they are ostensibly manifested in the fractional quantum Hall effect, are emergent entities and urge caution. Consequently, it is suggested that a particular notion of strong emergence signals the need for the development of novel physical–mathematical research programs.

Abstract

The formulation of quantum mechanics developed by Bohm, which can generate well-defined trajectories for the underlying particles in the theory, can equally well be applied to relativistic quantum field theories to generate dynamics for the underlying fields. However, it does not produce trajectories for the particles associated with these fields. Bell has shown that an extension of Bohm’s approach can be used to provide dynamics for the fermionic occupation numbers in a relativistic quantum field theory. In the present paper, Bell’s formulation is adopted and elaborated on, with a full account of all technical detail required to apply his approach to a bosonic quantum field theory on a lattice. This allows an explicit computation of (stochastic) trajectories for massive and massless particles in this theory. Also particle creation and annihilation, and their impact on particle propagation, is illustrated using this model.
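Bell's extension mentioned above assigns a stochastic jump process to a discrete configuration. A toy sketch of that prescription (in the Bell/Vink form for a generic discrete system, not the paper's lattice field model) defines an antisymmetric probability current from the wave function and Hamiltonian, and jump rates from its positive part:

```python
import numpy as np

# Toy 3-state system standing in for a discrete configuration space.
H = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.5]])                       # any real symmetric Hamiltonian
psi = np.array([1.0, 1.0j, 0.5], dtype=complex)
psi /= np.linalg.norm(psi)

def current(psi, H):
    """Probability current J[m, n] = 2 Im(psi_m^* H_mn psi_n); antisymmetric."""
    return 2.0 * np.imag(np.conj(psi)[:, None] * H * psi[None, :])

J = current(psi, H)

# Continuity check: under dpsi/dt = -i H psi, d|psi_n|^2/dt = sum_m J[n, m].
drho = 2.0 * np.real(np.conj(psi) * (-1j * H @ psi))
assert np.allclose(drho, J.sum(axis=1))

def jump_rates(J, psi, n):
    """Bell/Vink rate to jump from configuration n to m: max(J[m, n], 0) / |psi_n|^2."""
    rates = np.maximum(J[:, n], 0.0) / np.abs(psi[n]) ** 2
    rates[n] = 0.0
    return rates

print(jump_rates(J, psi, 0))                          # stochastic rates out of state 0
```

The continuity identity is what guarantees that an ensemble of such trajectories, started in the $|\psi|^2$ distribution, stays in it.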

Abstract

We propose that observables in quantum theory are properly understood as representatives of symmetry-invariant quantities relating one system to another, the latter to be called a reference system. We provide a rigorous mathematical language to introduce and study quantum reference systems, showing that the orthodox “absolute” quantities are good representatives of observable relative quantities if the reference state is suitably localised. We use this relational formalism to critique the literature on the relationship between reference frames and superselection rules, settling a long-standing debate on the subject.

Norton, John D. (2017) How to Build an Infinite Lottery Machine. [Preprint]

Authors: Juan Maldacena, Alexey Milekhin

The D0 brane, or BFSS, matrix model is a quantum mechanical theory with an interesting gravity dual. We consider a variant of this model where we treat the $SU(N)$ symmetry as a global symmetry, rather than as a gauge symmetry. This variant contains new non-singlet states. We consider the impact of these new states on its gravity dual. We argue that the gravity dual is essentially the same as the one for the original matrix model. The non-singlet states have higher energy at strong coupling and are therefore dynamically suppressed.

Authors: Bruno Cocciaro, Sandro Faetti, Leone Fronzoni

Superluminal communications have been proposed to solve the Einstein, Podolsky and Rosen (EPR) paradox. So far, no evidence for these superluminal communications has been obtained and only lower bounds for the superluminal velocities have been established. In this paper we describe an improved experiment that increases by about two orders of magnitude the maximum detectable superluminal velocities. The locality, the freedom-of-choice and the detection loopholes are not addressed here. No evidence for superluminal communications has been found and a new higher lower bound for their velocities has been established.

Authors: John H. Selby, Carlo Maria Scandolo, Bob Coecke

We present a reconstruction of finite-dimensional quantum theory where all of the postulates are stated entirely in diagrammatic terms, making them intuitive. Equivalently, they are stated in category-theoretic terms, making them mathematically appealing. Again equivalently, they are stated in process-theoretic terms, establishing the fact that the conceptual bare-bones of quantum theory concerns the manner in which systems and processes compose.

Aside from the diagrammatic form, the key novel aspect of this reconstruction is the introduction of a new postulate, symmetric purification. Unlike the ordinary purification postulate, symmetric purification applies equally well to classical theory and to quantum theory. We therefore first reconstruct the full process-theoretic description of quantum theory, consisting of composite classical-quantum systems and their interactions, before restricting ourselves to just the 'fully quantum' systems in a final step.

We propose two novel alternative manners of doing so, 'no-leaking' (roughly, that information gain causes disturbance) and 'purity of cups' (roughly, the existence of entangled states). Interestingly, these turn out to be equivalent in any process theory with cups and caps. Additionally, we show how the standard purification postulate can then be seen as an immediate consequence of the symmetric purification postulate and purity of cups.

Other tangential results concern the specific frameworks of generalised probabilistic theories (GPTs) and process theories (a.k.a. CQM). Firstly, we provide a diagrammatic presentation of GPTs, which can henceforth be subsumed under process theories. Secondly, we have characterised the necessary additional axioms for a process theory to correspond to the Hilbert space model, and in particular shown that a 'sharp dagger' is indeed the right choice of dagger structure.

Authors: Christian de Ronde, César Massri

In this paper we attempt to consider quantum superpositions from the perspective of the logos categorical approach presented in [26]. We will argue that our approach allows us not only to better visualize the structural features of quantum superpositions providing an anschaulich content to all terms, but also to restore --through the intensive valuation of graphs and the notion of immanent power-- an objective representation of what QM is really talking about. In particular, we will discuss how superpositions relate to some of the main features of the theory of quanta, namely, contextuality, paraconsistency, probability and measurement.

Authors: Werner A Hofer

I revisit the reply of Bohr to Einstein. Bohr's implication that there are no causes in atomic scale systems is, as a closer analysis reveals, not in line with the Copenhagen interpretation since it would contain a statement about reality. What Bohr should have written is that there are no causes in mathematics, which is universally acknowledged. The law of causality requires physical effects to be due to physical causes. For this reason any theoretical model which replaces physical causes by mathematical objects is creationism, that is, it creates physical objects out of mathematical elements. I show that this is the case for most of quantum mechanics.

Authors: Joseph Samuel

The double slit experiment is iconic and widely used in classrooms to demonstrate the fundamental mystery of quantum physics. The puzzling feature is that the probability of an electron arriving at the detector when both slits are open is not the sum of the probabilities when the slits are open separately. The superposition principle of quantum mechanics tells us to add amplitudes rather than probabilities and this results in interference. This experiment defies our classical intuition that the probabilities of exclusive events add. In understanding the emergence of the classical world from the quantum one, there have been suggestions by Feynman, Diosi and Penrose that gravity is responsible for suppressing interference. This idea has been pursued in many different forms ever since, predominantly within Newtonian approaches to gravity. In this paper, we propose and theoretically analyse two 'gedanken' or thought experiments which lend strong support to the idea that gravity is responsible for decoherence. The first makes the point that thermal radiation can suppress interference. The second shows that in an accelerating frame, Unruh radiation plays the same role. Invoking the Einstein equivalence principle to relate acceleration to gravity, we support the view that gravity is responsible for decoherence.
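The "add amplitudes rather than probabilities" rule can be made concrete in a few lines (illustrative amplitude values, not the paper's setup): with both slits open the probability is $|a_1 + a_2|^2 = |a_1|^2 + |a_2|^2 + 2\,\mathrm{Re}(a_1^* a_2)$, and the cross term is the interference that decoherence suppresses.

```python
import cmath

# Amplitudes reaching one detector point from each slit (illustrative values);
# the relative phase 1.2 rad stands in for the path-length difference.
a1 = cmath.exp(1j * 0.0) / 2
a2 = cmath.exp(1j * 1.2) / 2

p1 = abs(a1) ** 2                    # slit 1 open alone
p2 = abs(a2) ** 2                    # slit 2 open alone
p_both = abs(a1 + a2) ** 2           # both open: add amplitudes, then square

interference = 2 * (a1.conjugate() * a2).real
print(p_both, p1 + p2, interference)  # p_both differs from p1 + p2 by the cross term
```

Decoherence (here attributed to gravity) drives the cross term to zero, restoring the classical sum p1 + p2.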

Authors: Ehtibar N. Dzhafarov, Janne V. Kujala

The Contextuality-by-Default theory is illustrated on contextuality analysis of the idealized double-slit experiment. The system of contextually labeled random variables describing this experiment forms a cyclic system of rank 4, formally the same as the system describing the EPR/Bohm paradigm (with signaling). Unlike the EPR/Bohm system, however, the double-slit experiment is always noncontextual, i.e., the context-dependence in it is entirely attributable to direct influences of contexts (closed-open arrangements of the slits) upon the random variables involved. The analysis presented is entirely within the framework of abstract classical probability theory (with multiple domain probability spaces). The only physical constraint used in the analysis is that a particle cannot reach a detector through a closed slit. The noncontextuality of the double-slit system does not generalize to systems describing experiments with more than two slits: an example shows that a triple-slit system may very well be contextual.

Authors: Jason Pollack, Ashmeet Singh

Field theories place one or more degrees of freedom at every point in space. Hilbert spaces describing quantum field theories, or their finite-dimensional discretizations on lattices, therefore have large amounts of structure: they are isomorphic to the tensor product of a smaller Hilbert space for each lattice site or point in space. Local field theories respecting this structure have interactions which preferentially couple nearby points. The emergence of classicality through decoherence relies on this framework of tensor-product decomposition and local interactions. We explore the emergence of such lattice structure from Hilbert-space considerations alone. We point out that the vast majority of finite-dimensional Hilbert spaces cannot be isomorphic to the tensor product of Hilbert-space subfactors that describes a lattice theory. A generic Hilbert space can only be split into a direct sum corresponding to a basis of state vectors spanning the Hilbert space; we consider setups in which the direct sum is naturally decomposed into two pieces. We define a notion of direct-sum locality which characterizes states and decompositions compatible with Hamiltonian time evolution. We illustrate these notions for a toy model that is the finite-dimensional discretization of the quantum-mechanical double-well potential. We discuss their relevance in cosmology and field theory, especially for theories which describe a landscape of vacua with different spacetime geometries.
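The counting behind "the vast majority of finite-dimensional Hilbert spaces cannot be isomorphic to a lattice tensor product" is elementary: a lattice of $n \ge 2$ sites with local dimension $d$ has total dimension $d^n$, and most integers are not nontrivial perfect powers. A small sketch (equal local dimension assumed, which is the usual lattice setup):

```python
def is_lattice_dimension(dim, min_sites=2):
    """True if dim = d**n for some local dimension d >= 2 and n >= min_sites sites."""
    for n in range(min_sites, dim.bit_length() + 1):
        d = round(dim ** (1.0 / n))
        for cand in (d - 1, d, d + 1):        # guard against float rounding
            if cand >= 2 and cand ** n == dim:
                return True
    return False

N = 10_000
lattice_dims = [k for k in range(2, N + 1) if is_lattice_dimension(k)]
print(len(lattice_dims), "of", N - 1, "dimensions admit an equal-d lattice factorization")
```

Only on the order of a hundred dimensions up to 10,000 qualify, so a generic finite-dimensional Hilbert space indeed admits no such factorization, which motivates the paper's direct-sum alternative.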

Physics Today, Volume 71, Issue 2, Page 70-71, February 2018.
ROVELLI, Carlo (2017) "Space is blue and birds fly through it". [Preprint]

Author(s): Emily Adlam and Adrian Kent

Bob has a black box that emits a single pure state qudit which is, from his perspective, uniformly distributed. Alice wishes to give Bob evidence that she has knowledge about the emitted state while giving him little or no information about it. We show that zero-knowledge evidencing of such knowledg...


[Phys. Rev. Lett. 120, 050501] Published Tue Jan 30, 2018

Abstract

In the standard formalism of quantum gravity, black holes appear to form statistical distributions of quantum states. Now, however, we can present a theory that yields pure quantum states. It shows how particles entering a black hole can generate firewalls, which however can be removed, replacing them by the ‘footprints’ they produce in the out-going particles. This procedure can preserve the quantum information stored inside and around the black hole. We then focus on a subtle but unavoidable modification of the topology of the Schwarzschild metric: antipodal identification of points on the horizon. If it is true that vacuum fluctuations include virtual black holes, then the structure of space-time is radically different from what is usually thought.

De Haro, Sebastian (2018) The Heuristic Function of Duality. [Preprint]
Higashi, Katsuaki (2018) A no-go result on common cause approaches via Hardy's paradox. [Preprint]
Villacrés, Juan (2018) Ontological Motivation in Obtaining Certain Quantum Equations: A Case for Panexperientialism. [Preprint]
Healey, Richard (2018) Pragmatist Quantum Realism. [Preprint]
Acuña, Pablo (2017) Inertial Trajectories in de Broglie-Bohm Theory: an unexpected problem. International Studies in the Philosophy of Science, 30. pp. 201-230.
Romero, Gustavo E. (2014) Philosophical Issues about Black Holes. Advances in Black Holes Research. pp. 27-58. ISBN 978-1-63463-168-6

Authors: Sean M. Carroll, Ashmeet Singh

To the best of our current understanding, quantum mechanics is part of the most fundamental picture of the universe. It is natural to ask how pure and minimal this fundamental quantum description can be. The simplest quantum ontology is that of the Everett or Many-Worlds interpretation, based on a vector in Hilbert space and a Hamiltonian. Typically one also relies on some classical structure, such as space and local configuration variables within it, which then gets promoted to an algebra of preferred observables. We argue that even such an algebra is unnecessary, and the most basic description of the world is given by the spectrum of the Hamiltonian (a list of energy eigenvalues) and the components of some particular vector in Hilbert space. Everything else - including space and fields propagating on it - is emergent from these minimal elements.

Authors: Jungjai Lee, Hyun Seok Yang

We suggest that dark energy and dark matter may be a cosmic ouroboros of quantum gravity due to the coherent vacuum structure of spacetime. We apply the emergent gravity to a large $N$ matrix model by considering the vacuum in the noncommutative (NC) Coulomb branch satisfying the Heisenberg algebra. We observe that UV fluctuations in the NC Coulomb branch are always paired with IR fluctuations and these UV/IR fluctuations can be extended to macroscopic scales. We show that space-like fluctuations give rise to the repulsive gravitational force while time-like fluctuations generate the attractive gravitational force. Considering that the fluctuations are random in nature and that we are living in (3+1)-dimensional spacetime, the ratio of the repulsive and attractive components will end up as $\frac{3}{4}: \frac{1}{4}=75:25$, and this ratio curiously coincides with the dark composition of our current Universe. If one includes ordinary matter, which acts as an attractive force, the emergent gravity may explain the dark sector of our Universe more precisely.
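The 75:25 figure quoted above is pure direction counting: of the 3+1 spacetime directions a random fluctuation can point along, three are space-like (repulsive, in the paper's identification) and one is time-like (attractive). A trivial sketch of that arithmetic:

```python
spacetime_dims = 3 + 1              # (3+1)-dimensional spacetime
spacelike = 3                       # directions whose fluctuations act repulsively
timelike = 1                        # direction whose fluctuations act attractively

repulsive = spacelike / spacetime_dims
attractive = timelike / spacetime_dims
print(f"{repulsive:.0%} : {attractive:.0%}")   # the ratio compared to the dark sector
```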

Authors: Andrew D. Bond, Daniel F. Litim

All known examples of 4d quantum field theories with asymptotic freedom or asymptotic safety at weak coupling involve non-abelian gauge interactions. We demonstrate that this is not a coincidence: no weakly coupled fixed points, ultraviolet or otherwise, can be reliably generated in theories lacking gauge interactions. Implications for conformal field theory and phase transitions are indicated.
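A standard one-loop illustration of why gauge interactions are needed (a textbook result, not the paper's general proof): for a single-component scalar with quartic coupling $\lambda$, the beta function is positive whenever the coupling is nonzero, so no weakly coupled fixed point away from the origin can arise.

```latex
% One-loop beta function of single-component \phi^4 theory:
\beta_\lambda \;=\; \mu \frac{d\lambda}{d\mu} \;=\; \frac{3\lambda^2}{16\pi^2} \;>\; 0
\quad (\lambda \neq 0),
% so \beta_\lambda vanishes only at the free theory \lambda = 0; only non-abelian
% gauge couplings contribute the negative terms needed for asymptotic freedom.
```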

Authors: Robert Alicki, Ronnie Kosloff

Quantum Thermodynamics is a continuous dialogue between two independent theories: Thermodynamics and Quantum Mechanics. Whenever the two theories have addressed the same phenomena, new insight has emerged. We follow the dialogue from equilibrium Quantum Thermodynamics and the notion of entropy and the entropy inequalities which are the basis of the II-law. Dynamical considerations lead to the non-equilibrium thermodynamics of quantum open systems. The central part played by completely positive maps is discussed, leading to the Gorini-Kossakowski-Lindblad-Sudarshan (GKLS) equation. We address the connection to thermodynamics through the system-bath weak-coupling limit (WCL), leading to dynamical versions of the I-law. The dialogue has developed through the analysis of quantum engines and refrigerators. Reciprocating and continuous engines are discussed. The autonomous quantum absorption refrigerator is employed to illustrate the III-law. Finally, we describe some open questions and perspectives.
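For reference, the GKLS equation mentioned above has the standard form (usual textbook notation, not necessarily the review's):

```latex
\frac{d\rho}{dt} \;=\; -\frac{i}{\hbar}\,[H,\rho]
\;+\; \sum_k \gamma_k \left( L_k \rho L_k^\dagger
\;-\; \tfrac{1}{2}\left\{ L_k^\dagger L_k ,\, \rho \right\} \right),
\qquad \gamma_k \ge 0 ,
% with H the system Hamiltonian and L_k the jump (Lindblad) operators; the
% constraint \gamma_k \ge 0 is what makes the generated map completely positive.
```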

To truly understand data, we need to rethink what we mean by "measurement"

-- Read more on ScientificAmerican.com



Author(s): Flavien Hirsch, Marco Túlio Quintino, and Nicolas Brunner

We discuss the connection between the incompatibility of quantum measurements, as captured by the notion of joint measurability, and the violation of Bell inequalities. Specifically, we explicitly present a given set of non-jointly-measurable positive-operator-valued measures (POVMs) MA with the foll...


[Phys. Rev. A 97, 012129] Published Thu Jan 25, 2018

Authors: Daniel Kabat, Gilad Lifschytz

Perturbative bulk reconstruction in AdS/CFT starts by representing a free bulk field $\phi^{(0)}$ as a smeared operator in the CFT. A series of $1/N$ corrections must be added to $\phi^{(0)}$ to represent an interacting bulk field $\phi$. These corrections have been determined in the literature from several points of view. Here we develop a new perspective. We show that correlation functions involving $\phi^{(0)}$ suffer from ambiguities due to analytic continuation. As a result $\phi^{(0)}$ fails to be a well-defined linear operator in the CFT. This means bulk reconstruction can be understood as a procedure for building up well-defined operators in the CFT which singles out the interacting field $\phi$. We further propose that the difficulty with defining $\phi^{(0)}$ as a linear operator can be re-interpreted as a breakdown of associativity. Presumably $\phi^{(0)}$ can only be corrected to become an associative operator in perturbation theory. This suggests that quantum mechanics in the bulk is only valid in perturbation theory around a semiclassical bulk geometry.

Authors: Stephen L. Adler

We calculate the rate of heating through phonon excitation implied by the noise postulated in mass-proportional-coupled collapse models, for a general noise power spectrum. For white noise with reduction rate $\lambda$, the phonon heating rate reduces to the standard formula, but for non-white noise with power spectrum $\lambda(\omega)$, the rate $\lambda$ is replaced by $\lambda_{\rm eff}=\frac{2}{3 \pi^{3/2}} \int d^3w e^{-\vec w^2} \vec w^2 \lambda(\omega_L(\vec w/r_c))$, with $\omega_L(\vec q)$ the longitudinal acoustic phonon frequency as a function of wave number $\vec q$, and with $r_c$ the noise correlation length. Hence if the noise power spectrum is cut off below $\omega_L(|\vec q| \sim r_c^{-1})$, the heating rate is sharply reduced.
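A quick numerical sanity check of the normalization of the quoted formula (my own check, not from the paper): for white noise, $\lambda(\omega) = \lambda_0$ is constant, and since $\int d^3w\, e^{-\vec w^2} \vec w^2 = \tfrac{3}{2}\pi^{3/2}$, the prefactor $\tfrac{2}{3\pi^{3/2}}$ makes $\lambda_{\rm eff}$ reduce to $\lambda_0$, consistent with the statement that the standard formula is recovered.

```python
import math
import numpy as np

# Radial quadrature of I = \int d^3w e^{-w^2} w^2 = 4*pi * \int_0^inf w^4 e^{-w^2} dw.
w = np.linspace(0.0, 12.0, 200_001)
dw = w[1] - w[0]
integrand = 4.0 * math.pi * w ** 4 * np.exp(-(w ** 2))
I = np.sum(integrand) * dw

lam0 = 1.0                                        # white noise: lambda(omega) = lam0
lam_eff = (2.0 / (3.0 * math.pi ** 1.5)) * I * lam0
print(lam_eff)                                    # reduces to lam0 for white noise
```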

Author(s): V. S. Gomes and R. M. Angelo

Based on a recently proposed model of physical reality and an underlying criterion of nonlocality for contexts [A. L. O. Bilobran and R. M. Angelo, Europhys. Lett. 112, 40005 (2015)], we introduce a quantifier of realism-based nonlocality for bipartite quantum states, a concept that is profoundly di...


[Phys. Rev. A 97, 012123] Published Wed Jan 24, 2018

Publication date: Available online 20 January 2018
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Carina E.A. Prunkl, Christopher G. Timpson
Recently, Cabello et al. (2016) claim to have proven the existence of an empirically verifiable difference between two broad classes of quantum interpretations. On the basis of three seemingly uncontentious assumptions, (i) the possibility of randomly selected measurements, (ii) the finiteness of a quantum system's memory, and (iii) the validity of Landauer's principle, and further, by applying computational mechanics to quantum processes, the authors arrive at the conclusion that some quantum interpretations (including central realist interpretations) are associated with an excess heat cost and are thereby untenable—or at least—that they can be distinguished empirically from their competitors by measuring the heat produced. Here, we provide an explicit counterexample to this claim and demonstrate that their surprising result can be traced back to a lack of distinction between system and external agent. By drawing the distinction carefully, we show that the resulting heat cost is fully accounted for in the external agent, thereby restoring the tenability of the quantum interpretations in question.
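Assumption (iii), Landauer's principle, bounds the heat dissipated per erased bit by $k_B T \ln 2$; the "excess heat cost" at issue is measured against this scale. A quick number (standard physics, independent of the paper's argument):

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0                   # room temperature, K

landauer_limit = k_B * T * math.log(2)   # minimum heat per erased bit
print(f"{landauer_limit:.3e} J per bit at {T:.0f} K")
```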

Acuña, Pablo (2016) Minkowski Spacetime and Lorentz Invariance: the cart and the horse or two sides of a single coin. Studies in History and Philosophy of Modern Physics, 55. pp. 1-12. ISSN 1355-2198

Author(s): Hyukjoon Kwon, Chae-Yeun Park, Kok Chuan Tan, Daekun Ahn, and Hyunseok Jeong

We investigate a measure of quantum coherence and its extension to quantify quantum macroscopicity. The coherence measure can also quantify the asymmetry of a quantum state with respect to a given group transformation. We then show that a weighted sum of asymmetry in each mode can be applied as a me...


[Phys. Rev. A 97, 012326] Published Tue Jan 23, 2018

Abstract
I present an argument against a relational theory of spacetime that regards spacetime as a ‘structural quality of the field’. The argument takes the form of a trilemma. To make the argument, I focus on relativistic worlds in which there exist just two fields, an electromagnetic field and a gravitational field. Then there are three options: either spacetime is a structural quality of each field separately, both fields together, or one field but not the other. I argue that the first option founders on a problem of geometric coordination and that the second and third options collapse into substantivalism. In particular, on the third option it becomes clear that the relationalist’s path to Leibniz equivalence is no simpler or more straightforward than the substantivalist’s.
Halvorson, Hans (2018) To be a realist about quantum theory. [Preprint]
Gryb, Sean and Thebault, Karim (2018) Bouncing Unitary Cosmology II. Mini-Superspace Phenomenology. [Preprint]
Gryb, Sean and Thebault, Karim (2018) Bouncing Unitary Cosmology I. Mini-Superspace General Solution. [Preprint]
Gryb, Sean and Thebault, Karim P Y (2018) Superpositions of the cosmological constant allow for singularity resolution and unitary evolution in quantum cosmology. [Preprint]
Christian, Joy (2017) Quantum Correlations are Weaved by the Spinors of the Euclidean Primitives. [Preprint]
Hubert, Mario and Romano, Davide (2017) The Wave-Function as a Multi-Field. [Preprint]

Authors: Juan Villacrés

In this work I argue for the existence of an ontological state in which no entity is more basic than any other. This is used to provide conceptual justification for a method that is applied to obtain the Schrödinger equation, the Klein-Gordon equation, and the Klein-Gordon equation for a particle in an electromagnetic field. Additionally, it is argued that the existence of such a state is incompatible with indirect realism; the discussion suggests that a panexperientialist view is a straightforward means of embracing it.

Authors: Tonghua Liu, Jieci Wang, Jiliang Jing, Heng Fan

We study, via different quantifiers, the dynamics of steering between two correlated Unruh-DeWitt detectors when one of them locally interacts with an external scalar field. We find that the quantum steering, whether measured by the entropic steering inequality or by the Cavalcanti-Jones-Wiseman-Reid inequality, is fragile under the influence of Unruh thermal noise. The quantum steering is found to be always asymmetric, and the asymmetry is extremely sensitive to the initial state parameter. In addition, the steering-type quantum correlations experience "sudden death" for some accelerations, which is quite different from the behavior of other quantum correlations in the same system. It is worth noting that the bound of the tight quantum steering exhibits a transformation point with increasing acceleration. We also find that robustness of quantum steerability under Unruh thermal noise can be achieved by choosing the smallest energy gap in the detectors.

Authors: Hisham Ghassib

Relativity was Einstein's main research program and scientific project. It was an open-ended program that developed throughout Einstein's scientific career, giving rise to special relativity, general relativity, and unified field theory. In this paper, we aim to uncover the methodological logic of the Einsteinian program, which animated the whole program and its development, as revealed in SR, GR, and unified field theory. We aver that the same methodological logic animated all these theories as Einstein's work progressed. Each of these theories contributed towards constructing Einstein's ambitious program. This is not a paper in the history of relativity; rather, it utilizes our knowledge of this history to uncover the methodological logic of the relativity program and its development. This logic is latent in the historical narrative but is not identical to it. We hope to show that the Einsteinian relativity project is still relevant today as a theoretical scheme, despite its failures and despite quantum mechanics.

Manchak, JB and Weatherall, James Owen (2018) (Information) Paradox Regained? A Brief Comment on Maudlin on Black Hole Information Loss. [Preprint]
Quantum processor named after Alaska’s Tangled Lakes
American Journal of Physics, Volume 86, Issue 2, Page 159-160, February 2018.
Conference: 30 Apr 2018 - 4 May 2018, Bangalore, India.
Okon, Elias and Sebastián, Miguel Ángel (2018) A Consciousness-Based Quantum Objective Collapse Model. [Preprint]
Kuby, Daniel (2018) Carnap, Feyerabend and the pragmatic theory of observation. [Preprint]

Recently, increased computational power and data availability, as well as algorithmic advances, have led machine learning (ML) techniques to impressive results in regression, classification, data generation and reinforcement learning tasks. Despite these successes, the proximity to the physical limits of chip fabrication alongside the increasing size of datasets is motivating a growing number of researchers to explore the possibility of harnessing the power of quantum computation to speed up classical ML algorithms. Here we review the literature in quantum ML and discuss perspectives for a mixed readership of classical ML and quantum computation experts. Particular emphasis will be placed on clarifying the limitations of quantum algorithms, how they compare with their best classical counterparts and why quantum resources are expected to provide advantages for learning problems. Learning in the presence of noise and certain computationally hard problems in ML are identified as promising directions for the field. Practical questions, such as how to upload classical data into quantum form, will also be addressed.

Abstract

In spite of being a well-articulated proposal, the theory of quantum histories (TQH), in its different versions, suffers from certain difficulties that have been pointed out in the literature. Nevertheless, two facets of the proposal have not been sufficiently stressed. On the one hand, it is a non-collapse formalism that should be technically appropriate to supply descriptions based on quantum properties at different times. On the other hand, it intends to provide an interpretation of quantum mechanics that solves the traditional puzzles of the theory. In this article we spell out the main criticisms of TQH and classify them into two groups: theoretical and interpretive. Whereas the latter might be ignored if TQH were considered a quantum formalism with its minimal interpretation, the former seems to point toward technical difficulties that must be faced by any theoretically adequate proposal. Precisely with the purpose of solving these difficulties, we introduce a different perspective, called the Formalism of Generalized Contexts or Formalism of Contextual Histories (FCH), which supplies a precise condition for consistently talking of quantum properties at different times without the theoretical shortcomings of TQH.

Author(s): J. Tuziemski, P. Witas, and J. K. Korbicz

Broadly understood decoherence processes in quantum electrodynamics, induced by neglecting either the radiation [L. Landau, Z. Phys. 45, 430 (1927)] or the charged matter [N. Bohr and L. Rosenfeld, K. Danske Vidensk. Selsk, Math.-Fys. Medd. XII, 8 (1933)], have been studied from the dawn of the theo...


[Phys. Rev. A 97, 012110] Published Fri Jan 12, 2018

Author(s): Miguel E. Rodriguez R.

Quantum mechanics in noncommutative space modifies the standard result of the Aharonov-Bohm effect for electrons and other recent quantum effects. Here we obtain the phase in noncommutative space for the Spavieri effect, a generalization of Aharonov-Bohm effect which involves a coherent superpositio...


[Phys. Rev. A 97, 012109] Published Fri Jan 12, 2018

Author(s): Ivan Glasser, Nicola Pancotti, Moritz August, Ivan D. Rodriguez, and J. Ignacio Cirac

Two tools show great promise in approximating low-temperature, condensed-matter systems: Tensor-network states and artificial neural networks. A new analysis builds a bridge between these techniques, opening the way to a host of powerful approaches to understanding complex quantum systems.


[Phys. Rev. X 8, 011006] Published Thu Jan 11, 2018

Abstract

In his seminal work, McTaggart (Mind 17(68):457–484, 1908; The nature of existence, Cambridge University Press, Cambridge, 1927) dismissed the possibility of understanding the B-relations (earlier than, simultaneity, and later than) as irreducibly temporal relations, and with it the B-theory of time, which assumes the reality of irreducible B-relations. Instead, he thought they were mere constructions from irreducible A-determinations (pastness, presentness, and futurity) and timeless ordering relations (his C-relations). Since then, however, philosophers have almost universally dismissed his dismissal of irreducible B-relations. This paper argues that McTaggart was correct to dismiss the possibility of B-relations, and that would-be B-theorists should instead be C-theorists, accepting the C-theory's concomitant commitment to the unreality of time. I do this by first elaborating the C-theory, noting that B-relations appear indiscernible from C-relations on close examination. This establishes an onus on B-theorists to distinguish B-relations from C-relations by elaborating the distinctively temporal character of the former. I then present a problem for the possibility of accommodating temporal character in B-relations. Following this, I question whence derives our sense of the temporal character that purportedly resides in the irreducible B-relations. Finally, I extend the challenge against irreducible B-relations to a series of irreducible abstract temporal relations, so-called Ersatz-B-relations, modelled on them.

Author(s): Bo-Bo Wei

Thermodynamics and information theory have been intimately related since the times of Maxwell and Boltzmann. Recently it was shown that the dissipated work in an arbitrary nonequilibrium process is related to the Rényi divergences between two states along the forward and reversed dynamics. Here we s...


[Phys. Rev. A 97, 012105] Published Tue Jan 09, 2018

Abstract

General relativity cannot be formulated as a perturbatively renormalizable quantum field theory. An argument relying on the validity of the Bekenstein–Hawking entropy formula aims at dismissing gravity as non-renormalizable per se, against hopes (underlying programs such as Asymptotic Safety) that d-dimensional GR could turn out to have a non-perturbatively renormalizable d-dimensional quantum field theoretic formulation. In this note we discuss various forms of highly problematic semi-classical extrapolations assumed by both sides of the debate concerning what we call The Entropy Argument, and show that a large class of dimensional reduction scenarios leads to the blow-up of Bekenstein–Hawking entropy.

Abstract

Quantum bit commitment is insecure in the standard non-relativistic quantum cryptographic framework, essentially because Alice can exploit quantum steering to defer making her commitment. Two assumptions in this framework are that: (a) Alice knows the ensembles of evidence E corresponding to either commitment; and (b) system E is quantum rather than classical. Here, we show how relaxing assumption (a) or (b) can render her malicious steering operation indeterminable or inexistent, respectively. Finally, we present a secure protocol that relaxes both assumptions in a quantum teleportation setting. Without appeal to an ontological framework, we argue that the protocol’s security entails the reality of the quantum state, provided retrocausality is excluded.

Abstract

In this paper I argue that physics is, always was, and probably always will be voiceless with respect to tense and passage, and that, therefore, if, as I believe, tense and passage are the essence of time, physics’ contribution to our understanding of time can only be limited. The argument, in a nutshell, is that if "physics has no possibility of expression for the Now", to quote Einstein, then it cannot add anything to the study of tense and passage, and specifically, cannot add anything to the debate between deniers and affirmers of the existence or reality of tense and passage. Since relativity theory did not equip physics with a new language with which to speak of tense and passage, I draw the further conclusion that relativity theory has not generated the revolution to our conception of time that is attributed to it. In the last section I discuss the motivations behind the continued but misguided attempts to integrate tense into a relativistic setting, and assess the manners in which relativity theory has nevertheless enhanced, albeit indirectly, our understanding of tense and passage.

Abstract

Initially motivated by their relevance in foundations of quantum mechanics and more recently by their applications in different contexts of quantum information science, violations of Bell inequalities have been extensively studied in recent years. In particular, an important effort has been made to quantify such Bell violations. Probabilistic techniques have been heavily used in this context with two different purposes: first, to quantify how common the phenomenon of Bell violations is; and second, to find large Bell violations in order to better understand the possibilities and limitations of this phenomenon. However, the strong mathematical content of these results has discouraged some of the potentially interested readers. The aim of the present work is to review some of the recent results in this direction by focusing on the main ideas and removing most of the technical details, in order to make this line of study accessible to a wide audience.
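As a standard illustration of the kind of quantification the review surveys (a textbook example, not one taken from the paper): the CHSH inequality bounds local-realist correlations by 2, while quantum mechanics reaches the Tsirelson bound 2√2. Both bounds can be computed directly:

```python
import itertools
import numpy as np

# Observables: +/-1-valued spin measurements along angle theta in the X-Z plane
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def obs(theta):
    return np.cos(theta) * Z + np.sin(theta) * X

# Local deterministic strategies: CHSH = a0(b0+b1) + a1(b0-b1) <= 2
classical = max(a0 * (b0 + b1) + a1 * (b0 - b1)
                for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4))
print(classical)  # 2

# Quantum: largest eigenvalue of the CHSH operator at the optimal settings
A0, A1 = obs(0.0), obs(np.pi / 2)
B0, B1 = obs(np.pi / 4), obs(-np.pi / 4)
B = np.kron(A0, B0 + B1) + np.kron(A1, B0 - B1)
print(np.max(np.linalg.eigvalsh(B)))  # 2*sqrt(2) ~ 2.8284 (Tsirelson bound)
```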

Publication date: Available online 5 January 2018
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Simon Friederich
The paper has three main aims: first, to make the asymptotic safety-based approach to quantum gravity better known to the community of researchers in the history and philosophy of modern physics by outlining its motivation, core tenets, and achievements so far; second, to preliminarily elucidate the finding that, according to the asymptotic safety scenario, space-time has fractal dimension 2 at short length scales; and, third, to provide the basis for a methodological appraisal of the asymptotic safety-based approach to quantum gravity in the light of the Kuhnian criteria of theory choice.

Publication date: Available online 30 December 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Karen Crowther
Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG are illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented, and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, nor can it serve usefully in the development and justification of the new theory.

Volume 4, Issue 1, pages 147-157

Jean Bricmont

I was born on 12 April 1952 in Belgium and obtained my PhD in 1977 at the University of Louvain. I worked at Rutgers and Princeton universities and was a professor of theoretical physics at the University of Louvain, but I am now retired. I have worked on statistical mechanics, the renormalization group, and nonlinear partial differential equations. I am also interested in making sense of quantum mechanics; see http://www.springer.com/gp/book/9783319258874.

This is a review of Travis Norsen's book Foundations of Quantum Mechanics: An Exploration of the Physical Meaning of Quantum Theory (Springer, 2017).


Volume 4, Issue 1, pages 142-146

Louis Marchildon

Louis Marchildon is Professor of Physics (Emeritus) at Université du Québec à Trois-Rivières (UQTR). He obtained his B.Sc. and M.Sc. from UQTR, and his Ph.D. from Yale University in 1978. After postdoctoral work at Institut des hautes études scientifiques (France), he returned to UQTR where, in addition to research in relativity, he collaborated with a group investigating dielectric properties of materials. His book Quantum Mechanics: From Basic Principles to Numerical Methods and Applications was published by Springer in 2002. He served as President of the Canadian Association of Physicists in 2007-2008. He has now been working on quantum foundations for more than 15 years, and is also interested in science popularization.

Kastner (this issue) and Kastner and Cramer (arXiv:1711.04501) argue that the Relativistic Transactional Interpretation (RTI) of quantum mechanics provides a clear definition of absorbers and a solution to the measurement problem. I briefly examine how RTI stands with respect to unitarity in quantum mechanics. I then argue that a specific proposal to locate the origin of nonunitarity is flawed, at least in its present form.


Volume 4, Issue 1, pages 128-141

Ruth E. Kastner

Ruth E. Kastner earned her M.S. in Physics and Ph.D. in Philosophy (History and Philosophy of Science) at the University of Maryland, College Park (1999). She has taught a variety of philosophy and physics courses throughout the Baltimore-Washington corridor, and currently is a member of the Foundations of Physics group at UMCP. She is also an Affiliate of the physics department at the SUNY Albany campus. She specializes in time-symmetry and the Transactional Interpretation (TI) of quantum mechanics, and in particular has extended the original TI of John Cramer to the relativistic domain. Her interests and publications include topics in thermodynamics and statistical mechanics, quantum ontology, counterfactuals, spacetime emergence, and free will. She is the author of two books: The Transactional Interpretation of Quantum Mechanics: The Reality of Possibility (Cambridge, 2012) and Understanding Our Unseen Reality: Solving Quantum Riddles (Imperial College Press, 2015). She is also an Editor of the collected volume Quantum Structural Studies (World Scientific, 2016).

In view of a resurgence of concern about the measurement problem, it is pointed out that the Relativistic Transactional Interpretation (RTI) remedies issues previously considered as drawbacks or refutations of the original Transactional Interpretation (TI). Specifically, once one takes into account relativistic processes that are not representable at the non-relativistic level (such as particle creation and annihilation, and virtual propagation), absorption is quantitatively defined in unambiguous physical terms. In addition, specifics of the relativistic transactional model demonstrate that the Maudlin ‘contingent absorber’ challenge to the original TI cannot even be mounted: basic features of established relativistic field theories (in particular, the asymmetry between field sources and the bosonic fields, and the fact that slow-moving bound states, such as atoms, are not offer waves) dictate that the ‘slow-moving offer wave’ required for the challenge scenario cannot exist. It is concluded that issues previously considered obstacles for the Transactional Interpretation are no longer legitimately viewed as such, and that reconsideration of the model is warranted in connection with solving the measurement problem.

Volume 4, Issue 1, pages 117-127

Andreas Schlatter

Born in Zurich, Switzerland, Andreas Schlatter was educated at the Swiss Federal Institute of Technology in Zurich, where he studied mathematics. He got his PhD in 1994 with work in partial differential equations. He subsequently held a research position at Princeton University, where he did further work mainly on the Yang-Mills heat equation. In 1997 Andreas joined the asset management industry and pursued a distinguished career over twenty years, which brought him into the Executive Committee of one of the world's large asset management firms. Today Andreas does consulting work and holds a number of independent board seats. Andreas has continued to do research and publish throughout his professional life, mainly in the areas of Quantum Foundations and Relativity, but also in Finance.

There are so-called MOND corrections to the general relativistic laws of gravity, able to explain phenomena such as the rotation curves of large spiral galaxies or gravitational lensing by certain galaxy clusters. We show that these corrections can be derived in the framework of synchronizing thermal clocks. We develop a general formula which reproduces the deep-MOND correction at large scales and defines the boundary acceleration beyond which corrections are necessary.
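The deep-MOND regime the abstract refers to replaces the Newtonian acceleration a_N = GM/r² by a = sqrt(a_N·a0) below the acceleration scale a0, which yields asymptotically flat rotation curves with v = (G·M·a0)^(1/4). A sketch of that phenomenology with illustrative parameters (the mass and radii are assumptions, not values from the paper):

```python
import numpy as np

G = 6.674e-11   # m^3 kg^-1 s^-2
a0 = 1.2e-10    # m s^-2, the canonical MOND acceleration scale
M = 1.0e41      # kg, illustrative galaxy mass (~5e10 solar masses)

def v_newton(r):
    """Circular velocity from Newtonian gravity: v^2/r = G M / r^2."""
    return np.sqrt(G * M / r)

def v_deep_mond(r):
    """Circular velocity in the deep-MOND regime, where a = sqrt(a_N * a0):
    v^2/r = sqrt(G M a0) / r, i.e. v = (G M a0)**0.25, independent of r."""
    return (G * M * a0) ** 0.25 * np.ones_like(r)

r = np.array([1e20, 3e20, 1e21])   # radii ~3-30 kpc, in meters
print(v_newton(r) / 1e3)      # km/s, falling off as r^{-1/2}
print(v_deep_mond(r) / 1e3)   # km/s, flat (~168 km/s for these parameters)
```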


Volume 4, Issue 1, pages 1-116

Per Östborn

Born in Lund, Sweden, Per Östborn was educated at Lund University. He got his PhD at the Division of Mathematical Physics in 2003. The subject of the dissertation was phase transitions toward synchrony in large lattices of limit cycle oscillators. Such phase transitions are examples of phase transitions in non-equilibrium systems. More recently he has held a cross-disciplinary research position at the Department of Archaeology and Ancient History at Lund University. He has developed and used network-based methods to analyze the diffusion of innovations in antiquity. Per works outside academia as well, mostly with environmental issues relating to transport. Interest in the philosophical foundations was the reason why he started to study physics, but this is his first publication in this field.

We derive the Hilbert space formalism of quantum mechanics from epistemic principles. A key assumption is that a physical theory that relies on entities or distinctions that are unknowable in principle gives rise to wrong predictions. An epistemic formalism is developed, where concepts like individual and collective knowledge are used, and knowledge may be actual or potential. The physical state S corresponds to the collective potential knowledge. The state S is a subset of a state space S = {Z}, such that S always contains several elements Z, which correspond to unattainable states of complete potential knowledge of the world. The evolution of S cannot be determined in terms of the individual evolution of the elements Z, unlike the evolution of an ensemble in classical phase space. The evolution of S is described in terms of sequential time n belonging to N, which is updated according to n -> n+1 each time potential knowledge changes. In certain experimental contexts C, there is knowledge at the start of the experiment at time n that a given series of properties P, P',... will be observed within a given time frame, meaning that a series of values p, p',... of these properties will become known. At time n, it is just known that these values belong to predefined, finite sets {p},{p'},... In such a context C, it is possible to define a complex Hilbert space HC on top of S, in which the elements are contextual state vectors Sc. Born's rule to calculate the probabilities to find the values p,p',... is derived as the only generally applicable such rule. Also, we can associate a self-adjoint operator P with eigenvalues {p} to each property P observed within C. These operators obey [P, P'] = 0 if and only if the precise values of P and P' are simultaneously knowable.
The existence of properties whose precise values are not simultaneously knowable follows from the hypothesis that collective potential knowledge is always incomplete, corresponding to the above-mentioned statement that S always contains several elements Z.
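The closing claims, that each property corresponds to a self-adjoint operator and that [P, P'] = 0 exactly when the values of P and P' are jointly knowable, can be illustrated in the standard Hilbert-space formalism the paper aims to derive (a generic numpy sketch, not the paper's construction; all names and value sets are illustrative):

```python
import numpy as np

# A shared orthonormal eigenbasis (random orthogonal matrix via QR)
rng = np.random.default_rng(0)
basis, _ = np.linalg.qr(rng.normal(size=(3, 3)))

# Two properties P and P' as self-adjoint operators, diagonal in the SAME
# basis, with value sets {1,2,3} and {5,7} respectively
P  = basis @ np.diag([1.0, 2.0, 3.0]) @ basis.T
Pp = basis @ np.diag([5.0, 5.0, 7.0]) @ basis.T

# Sharing an eigenbasis <=> [P, P'] = 0: the values are jointly knowable
print(np.allclose(P @ Pp, Pp @ P))  # True

# Born's rule: prob(value p) = |<e_p|psi>|^2 in the eigenbasis of P
psi = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # a normalized contextual state
probs = np.abs(basis.T @ psi) ** 2
print(probs.sum())  # ~1.0 (probabilities over the value set {1,2,3})
```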


Publication date: Available online 21 December 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): R. Hermens, O.J.E. Maroney
Macroscopic realism is the thesis that macroscopically observable properties must always have definite values. The idea was introduced by Leggett and Garg (1985), who wished to show a conflict with the predictions of quantum theory by using it to derive an inequality that quantum theory violates. However, Leggett and Garg's analysis required not just the assumption of macroscopic realism per se, but also that the observable properties could be measured non-invasively. In recent years there has been increasing interest in experimental tests of the violation of the Leggett-Garg inequality, but it has remained a matter of controversy whether this second assumption is a reasonable requirement for a macroscopic realist view of quantum theory. In a recent critical assessment Maroney and Timpson (2014) identified three different categories of macroscopic realism, and argued that only the simplest category could be ruled out by Leggett-Garg inequality violations. Allen, Maroney, and Gogioso (2016) then showed that the second of these approaches was also incompatible with quantum theory in Hilbert spaces of dimension 4 or higher. However, we show that the distinction introduced by Maroney and Timpson between the second and third approaches is not noise tolerant, so unfortunately Allen's result, as given, is not directly empirically testable. In this paper we replace Maroney and Timpson's three categories with a parameterization of macroscopic realist models, which can be related to experimental observations in a noise-tolerant way, and recover the original definitions in the noise-free limit. We show how this parameterization can be used to experimentally rule out classes of macroscopic realism in Hilbert spaces of dimension 3 or higher, without any use of the non-invasive measurability assumption. Even for relatively low-precision experiments, this will rule out the original category of macroscopic realism, the one tested by the Leggett-Garg inequality, while as the precision of the experiments increases, all cases of the second category and many cases of the third will become experimentally ruled out.