Weekly Papers on Quantum Foundations (8)

Authors: Björn Schrinski, Stefan Nimmrichter, Benjamin A. Stickler, Klaus Hornberger

We establish an objective scheme to determine the macroscopicity of quantum mechanical superposition tests, which is based on the Bayesian hypothesis falsification of macrorealistic modifications of quantum theory. The measure uses the raw data gathered in an experiment, taking into account all measurement uncertainties, and can be used to directly assess any conceivable quantum test. We determine the resulting macroscopicity for three recent tests of quantum physics: double-well interference of Bose-Einstein condensates, Leggett-Garg tests with atomic random walks, and entanglement generation and read-out of nanomechanical oscillators.
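
As a rough sketch of the kind of Bayesian model comparison described here (not the authors' analysis; the Gaussian likelihoods, the mock visibility data, and the coherence-time parameter `tau` below are placeholder assumptions), one can compare standard quantum theory against a macrorealistic modification directly on raw measurement records via a Bayes factor:

```python
# Minimal sketch, not the authors' code: Bayes-factor comparison of standard
# quantum theory vs. a hypothetical macrorealistic modification, using mock
# interference-visibility data.  The likelihoods and the parameter `tau` (a
# stand-in coherence-time scale of the modification) are illustrative only.
import numpy as np
from scipy import integrate, stats

rng = np.random.default_rng(1)
data = rng.normal(loc=0.9, scale=0.05, size=50)   # mock visibility readings

def likelihood_quantum(v):
    # Standard quantum prediction: visibility 0.9 with Gaussian readout noise.
    return stats.norm.pdf(v, loc=0.9, scale=0.05)

def likelihood_modified(v, tau):
    # Modification suppresses the visibility by exp(-t/tau), with t = 1 (arb. units).
    return stats.norm.pdf(v, loc=0.9 * np.exp(-1.0 / tau), scale=0.05)

def evidence_modified(data, tau_grid=np.logspace(-1, 3, 400)):
    # Marginal likelihood of the modification under a log-uniform prior on tau.
    prior = 1.0 / (tau_grid * np.log(tau_grid[-1] / tau_grid[0]))
    like = np.array([np.prod(likelihood_modified(data, t)) for t in tau_grid])
    return integrate.trapezoid(like * prior, tau_grid)

evidence_quantum = np.prod(likelihood_quantum(data))
bayes_factor = evidence_quantum / evidence_modified(data)
print(f"Bayes factor (quantum vs. modified): {bayes_factor:.3g}")
```

Roughly, a macroscopicity measure of this Bayesian kind would be read off from how strongly such data disfavor the modification, with all measurement uncertainties entering through the likelihoods.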

Authors: Arnold Neumaier

This paper presents the measurement problem from the point of view of the thermal interpretation of quantum physics introduced in Part II. The measurement of a Hermitian quantity $A$ is regarded as giving an uncertain value approximating the q-expectation $\langle A\rangle$ rather than (as tradition wanted to have it) as an exact revelation of an eigenvalue of $A$. Single observations of microscopic systems are (except under special circumstances) very uncertain measurements only. The thermal interpretation

* treats detection events as a statistical measurement of particle beam intensity;

* claims that the particle concept is only asymptotically valid, under conditions where particles are essentially free;

* claims that the unmodeled environment influences the results enough to cause all randomness in quantum physics;

* implies that part of Born’s rule holds exactly: Whenever a quantity $A$ is measured exactly, its value is an eigenvalue of $A$;

* allows one to derive Born’s rule for scattering and in the limit of ideal measurements;

* has no explicit collapse — the latter emerges approximately in non-isolated subsystems;

* gives a valid interpretation of systems modeled by a quantum-classical dynamics;

* explains the peculiar features of the Copenhagen interpretation (lacking realism between measurements) and the minimal statistical interpretation (lacking realism for the single case) where these interpretations apply — in the microscopic domain.

The thermal interpretation is an interpretation of quantum physics that is in principle refutable by theoretical arguments: arguments giving a negative answer to the open issues collected at the end of the paper would refute it, since there is plenty of experimental evidence for each of the points mentioned there.
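
As a toy numerical illustration of the q-expectation and its uncertainty (not taken from the paper; the state and observable below are arbitrary choices), the following sketch computes $\langle A\rangle = \mathrm{Tr}(\rho A)$ and $\sigma_A$ for a single qubit, the quantities that, on the thermal interpretation, a single detection event measures only very uncertainly:

```python
# Toy illustration, not from the paper: the q-expectation <A> = Tr(rho A) and
# its uncertainty sigma_A, which the thermal interpretation regards as what a
# measurement approximately reveals.  State and observable are arbitrary.
import numpy as np

A = np.array([[1, 0], [0, -1]], dtype=complex)               # observable (sigma_z)
psi = np.array([np.sqrt(0.8), np.sqrt(0.2)], dtype=complex)  # example pure state
rho = np.outer(psi, psi.conj())                              # density matrix

q_expectation = np.real(np.trace(rho @ A))                   # <A> = Tr(rho A)
second_moment = np.real(np.trace(rho @ A @ A))               # <A^2>
uncertainty = np.sqrt(second_moment - q_expectation**2)      # sigma_A

print(f"<A> = {q_expectation:.3f}, sigma_A = {uncertainty:.3f}")  # 0.600, 0.800
# A single detection yields +1 or -1 (an eigenvalue of A); on the thermal
# interpretation this is a very uncertain measurement of <A> = 0.6.
```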

Authors: Arnold Neumaier

This paper presents the thermal interpretation of quantum physics. The insight from Part I of this series that Born’s rule has its limitations — hence cannot be the foundation of quantum physics — opens the way for an alternative interpretation — the thermal interpretation of quantum physics. It gives new foundations that connect quantum physics (including quantum mechanics, statistical mechanics, quantum field theory and their applications) to experiment. The thermal interpretation resolves the problems of the foundations of quantum physics revealed in the critique from Part I of this series. It improves the traditional foundations in several respects:

* The thermal interpretation reflects the actual practice of quantum physics, especially regarding its macroscopic implications.

* The thermal interpretation gives a fair account of the interpretational differences between quantum mechanics and quantum field theory.

* The thermal interpretation gives a natural, realistic meaning to the standard formalism of quantum mechanics and quantum field theory in a single world, without introducing additional hidden variables.

* The thermal interpretation is independent of the measurement problem. The latter becomes a precise problem in statistical mechanics rather than a fuzzy and problematic notion in the foundations. Details will be discussed in Part III.

Authors: Arnold Neumaier

This paper gives a thorough critique of the foundations of quantum physics in its mainstream interpretation (i.e., treating pure states as primitives, without reference to hidden variables, and without modifications of the quantum laws). This is achieved by cleanly separating a concise version of the (universally accepted) formal core of quantum physics from the (controversial) interpretation issues. The latter are primarily related to measurement, but also to questions of existence and of the meaning of basic concepts like ‘state’ and ‘particle’. The requirements for good foundations of quantum physics are discussed. Main results:

* Born’s rule cannot be valid universally, and must be considered as a scientific law with a restricted domain of validity.

* If the state of every composite quantum system contains all information that can be known about this system, it cannot be a pure state in general.
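
A standard textbook calculation (not taken from the paper) illustrates the reasoning behind the second result: a system entangled with its environment, here one qubit of a Bell pair, is completely described by its reduced density matrix, which is mixed rather than pure.

```python
# Standard illustration, not from the paper: the reduced state of one qubit of
# a Bell pair is mixed, so a system entangled with its environment cannot be
# assigned a pure state that carries all knowable information about it.
import numpy as np

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
rho_full = np.outer(bell, bell.conj()).reshape(2, 2, 2, 2)  # indices: a, b, a', b'

rho_A = np.trace(rho_full, axis1=1, axis2=3)                # partial trace over qubit B
purity = np.real(np.trace(rho_A @ rho_A))                   # Tr(rho_A^2)

print(rho_A)                                  # 0.5 * identity
print(f"purity = {purity:.2f}")               # 0.5 < 1: mixed, not pure
```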

Publication date: Available online 26 February 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Joshua Rosaler, Robert Harlander

Abstract

The Higgs naturalness principle served as the basis for the thus far unsuccessful prediction that signatures of physics beyond the Standard Model (SM) would be discovered at the LHC. One influential formulation of the principle, which prohibits fine tuning of bare SM parameters, rests on the assumption that a particular set of values for these parameters constitutes the “fundamental parameters” of the theory and serves to mathematically define the theory. On the other hand, an old argument by Wetterich suggests that fine tuning of bare parameters merely reflects an arbitrary, inconvenient choice of expansion parameters, and that the choice of parameters in an EFT is therefore arbitrary. We argue that these two interpretations of Higgs fine tuning reflect distinct ways of formulating and interpreting effective field theories (EFTs) within the Wilsonian framework: the first takes an EFT to be defined by a single set of physical, fundamental bare parameters, while the second takes a Wilsonian EFT to be defined instead by a whole Wilsonian renormalization group (RG) trajectory, associated with a one-parameter class of physically equivalent parametrizations. From this latter perspective, no single parametrization constitutes the physically correct, fundamental parametrization of the theory, and the delicate cancellation between bare Higgs mass and quantum corrections appears as an eliminable artifact of the arbitrary, unphysical reference scale with respect to which the physical amplitudes of the theory are parametrized. While the notion of fundamental parameters is well motivated in the context of condensed matter field theory, we explain why it may be superfluous in the context of high energy physics.
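
As a back-of-the-envelope illustration of the cancellation at issue (not from the paper; the one-loop coefficient and cutoff values below are placeholders), the following sketch shows how a bare Higgs mass parameter must be tuned against a quadratic cutoff correction, and how its required value tracks the arbitrary reference scale:

```python
# Schematic sketch, not from the paper: Higgs fine tuning in a Wilsonian EFT,
# m_phys^2 = m_bare^2(Lambda) + c * Lambda^2.  The coefficient c and the cutoff
# values are placeholders chosen only to show the size of the cancellation and
# the dependence of the bare value on the arbitrary scale Lambda.
import math

M_PHYS = 125.0                          # physical Higgs mass, GeV
C = 1.0 / (16.0 * math.pi**2)           # illustrative one-loop coefficient

def bare_mass_squared(cutoff):
    """Bare mass^2 (GeV^2) required at the given cutoff to reproduce M_PHYS."""
    return M_PHYS**2 - C * cutoff**2

for cutoff in (1.0e4, 1.0e16, 1.2e19):  # ~10 TeV, GUT-like, Planck-like (GeV)
    m2_bare = bare_mass_squared(cutoff)
    tuning = C * cutoff**2 / M_PHYS**2  # degree of cancellation required
    print(f"Lambda = {cutoff:.1e} GeV: m_bare^2 = {m2_bare:.6e} GeV^2, "
          f"cancellation to ~1 part in {tuning:.1e}")
# The bare value changes with the arbitrary reference scale Lambda; whether this
# signals a physical fine-tuning problem or a parametrization artifact is the
# interpretive question contrasted in the abstract.
```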

Eva, Benjamin and Hartmann, Stephan (2019) On the Origins of Old Evidence. [Preprint]

Author(s): Berthold-Georg Englert, Kelvin Horia, Jibo Dai, Yink Loong Len, and Hui Khoon Ng

We stand by our findings in Phys. Rev. A 96, 022126 (2017). In addition to refuting the invalid objections raised by Peleg and Vaidman, we report a retrocausation problem inherent in Vaidman’s definition of the past of a quantum particle.

[Phys. Rev. A 99, 026104] Published Wed Feb 27, 2019

Author(s): Uri Peleg and Lev Vaidman

The recent criticism of Vaidman’s proposal for the analysis of the past of a particle in the nested interferometer is refuted. It is shown that the definition of the past of the particle adopted by Englert et al. [B. G. Englert et al., Phys. Rev. A 96, 022126 (2017)] is applicable only to a tiny fra…

[Phys. Rev. A 99, 026103] Published Wed Feb 27, 2019

Author(s): Aonan Zhang, Huichao Xu, Jie Xie, Han Zhang, Brian J. Smith, M. S. Kim, and Lijian Zhang

Contextuality is considered as an intrinsic signature of nonclassicality and a crucial resource for achieving unique advantages of quantum information processing. However, recently, there have been debates on whether classical fields may also demonstrate contextuality. Here, we experimentally config…

[Phys. Rev. Lett. 122, 080401] Published Tue Feb 26, 2019

Dieks, Dennis (2017) Mechanisms, Explanation and Understanding in Physics. [Preprint]
Dieks, Dennis and Lubberdink, Andrea (2019) Identical quantum particles as distinguishable objects. [Preprint]

Author(s): Andrea Addazi, Antonino Marcianò, and Nicolás Yunes

A theoretical study quantifies future challenges for probing the horizon structure of merging black holes using gravitational waves as a tool to study quantum gravity.


[Phys. Rev. Lett. 122, 081301] Published Mon Feb 25, 2019

We elaborate on the idea of the fake particle and study its physical consequences. When a theory contains fakeons, the true classical limit is determined by the quantization and a subsequent process of ‘classicization’. One of the major predictions due to the fake particles is the violation of microcausality, which survives the classical limit. This fact gives hope to detect the violation experimentally. A fakeon of spin two, together with a scalar field, is able to make quantum gravity renormalizable while preserving unitarity. We claim that the theory of quantum gravity emerging from this construction is the right one. By means of the classicization, we work out the corrections to the field equations of general relativity. We show that the finalized equations have, in simple terms, the form …
Darby, George and Pickup, Martin (2019) Modelling Deep Indeterminacy. [Preprint]
Kuby, Daniel (2018) Feyerabend’s Reevaluation of Scientific Practice: Quantum Mechanics, Realism and Niels Bohr. [Preprint]

Authors: Arkady Bolotin

As per Einstein’s design, particles are introduced into the double-slit experiment through a small hole in a plate which can either move up and down (so that its momentum can be measured) or be stopped (so that its position can be measured). Suppose one measures the position of the plate, and this act verifies the statement that the interference pattern is observed in the experiment. However, if it is possible to think about the outcome one would have obtained had one measured the plate’s momentum instead of its position, then it is possible to consider, alongside the aforesaid statement, another statement: that each particle passes through one or the other slit of the double-slit screen. Hence, the proposition affirming the wave-like behavior and the proposition affirming the particle-like behavior might be true together, which would imply that Bohr’s complementarity principle is incorrect. This paper presents an analysis of Einstein’s design, and of ways to refute it, based on an approach that uses exclusively assignments of truth values to experimental propositions.
