Weekly Papers on Quantum Foundations (41)

Constructing and constraining wave functions for identical quantum particles

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2016-10-08 3:10am GMT

Publication date: Available online 7 October 2016
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Charles T. Sebens
I address the problem of explaining why wave functions for identical particles must be either symmetric or antisymmetric (the symmetry dichotomy) within two interpretations of quantum mechanics which include particles following definite trajectories in addition to, or in lieu of, the wave function: Bohmian mechanics and Newtonian quantum mechanics (a.k.a. many interacting worlds). In both cases I argue that, if the interpretation is formulated properly, the symmetry dichotomy can be derived and need not be postulated.
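
For orientation, the "symmetry dichotomy" the paper aims to derive is the standard textbook restriction on wave functions for identical particles (stated here for two particles as background, not as Sebens' derivation):

\[
\psi(x_1, x_2) = +\,\psi(x_2, x_1) \ \text{(bosons)}, \qquad \psi(x_1, x_2) = -\,\psi(x_2, x_1) \ \text{(fermions)},
\]

with the analogous requirement for N particles under arbitrary permutations. The paper's claim is that, once Bohmian mechanics or the many-interacting-worlds theory is formulated properly, this restriction follows from the theory rather than having to be postulated.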

Four principles for quantum gravity. (arXiv:1610.01968v1 [gr-qc])

hep-th updates on arXiv.org

on 2016-10-07 12:49pm GMT

Authors: Lee Smolin

Four principles are proposed to underlie the quantum theory of gravity. We show that these suffice to recover the Einstein equations. We also suggest that MOND results from a modification of the classical equivalence principle, due to quantum gravity effects.

Explaining the Born rule in the intuitionistic interpretation of quantum mechanics. (arXiv:1610.01847v1 [quant-ph])

quant-ph updates on arXiv.org

on 2016-10-07 12:48pm GMT

Authors: Arkady Bolotin

This paper presents a novel explanation of the origin of quantum probabilities and the Born rule based on the intuitionistic interpretation of quantum mechanics, in which propositions obey constructive (intuitionistic) logic. Because propositions are objects of belief that bear logical (truth) values, they provide a point of entry for a probabilistic concept in quantum theory. Additionally, the use of constructive logic makes it possible (through a replacement of the concept of truth with the concept of constructive provability) to abandon the law of excluded middle in the intuitionistic interpretation, so that the interpretation does not fall victim to Schrödinger's cat and similar paradoxes.

The Angular Momentum Dilemma and Born–Jordan Quantization

Latest Results for Foundations of Physics

on 2016-10-07 12:00am GMT

Abstract

The rigorous equivalence of the Schrödinger and Heisenberg pictures requires that one use Born–Jordan quantization in place of Weyl quantization. We confirm this by showing that the much discussed “angular momentum dilemma” disappears if one uses Born–Jordan quantization. We argue that the latter is the only physically correct quantization procedure. We also briefly discuss a possible redefinition of phase space quantum mechanics, in which the usual Wigner distribution is replaced with a new quasi-distribution associated with Born–Jordan quantization, one which has already proven successful in time-frequency analysis.
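
For background, the two quantization rules named in the abstract differ in how they order products of position and momentum. On monomials q^r p^s the standard formulas are (a textbook sketch, not quoted from the paper):

\[
\mathrm{Op}_{\mathrm{Weyl}}(q^r p^s) = \frac{1}{2^s} \sum_{k=0}^{s} \binom{s}{k}\, \hat{p}^{\,k}\, \hat{q}^{\,r}\, \hat{p}^{\,s-k},
\qquad
\mathrm{Op}_{\mathrm{BJ}}(q^r p^s) = \frac{1}{s+1} \sum_{k=0}^{s} \hat{p}^{\,k}\, \hat{q}^{\,r}\, \hat{p}^{\,s-k}.
\]

The two prescriptions coincide whenever r ≤ 1 or s ≤ 1 (in particular for the usual kinetic and potential terms); they first differ on monomials such as q²p², which is where the ordering ambiguities behind the angular momentum dilemma arise.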

How particles can emerge in a relativistic version of Bohmian Quantum Field Theory. (arXiv:1605.06562v5 [quant-ph] UPDATED)

quant-ph updates on arXiv.org

on 2016-10-06 2:15am GMT

Authors: T. Mark Harder

It is shown how bosonic material particles can emerge from a covariant formulation of de Broglie-Bohm theory. The formulation is based on the work of Nikolic. Material particles are continuous fields, formed as the eigenvalue of the Schrödinger field operator, evaluated along a Bohmian trajectory.

The Hardy’s nonlocality argument. (arXiv:1610.01152v1 [quant-ph])

quant-ph updates on arXiv.org

on 2016-10-06 2:15am GMT

Authors: Sujit K. Choudhary, Pankaj Agrawal

Certain predictions of quantum theory are not compatible with the notion of local realism. This is the content of Bell’s famous theorem of 1964. Bell proved it with the help of an inequality, now famously known as Bell’s inequality. Proofs of Bell’s theorem that do not use Bell’s inequality are known as `nonlocality without inequality (NLWI)’ proofs. We review one such proof, namely Hardy’s proof, which, owing to its simplicity and generality, has been considered the best version of Bell’s theorem.
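
For context, the logical core of Hardy’s argument can be summarized by four joint-probability conditions (the standard presentation, given here as background rather than quoted from the paper). For dichotomic observables A_1, A_2 on one wing and B_1, B_2 on the other, quantum theory admits states and settings with

\[
P(A_1{=}1, B_1{=}1) > 0, \quad
P(A_1{=}1, B_2{=}0) = 0, \quad
P(A_2{=}0, B_1{=}1) = 0, \quad
P(A_2{=}1, B_2{=}1) = 0 .
\]

In a local-realistic model, any run with A_1 = 1 and B_1 = 1 (which the first condition says must sometimes occur) would, by the second and third conditions, also have B_2 = 1 and A_2 = 1, contradicting the fourth condition; the incompatibility thus appears without invoking any inequality.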

How does Quantum Uncertainty Emerge from Deterministic Bohmian Mechanics? (arXiv:1610.01138v1 [quant-ph])

quant-ph updates on arXiv.org

on 2016-10-05 3:28am GMT

Authors: Albert Solé, Xavier Oriols, Damiano Marian, Nino Zanghì

Bohmian mechanics is a theory that provides a consistent explanation of quantum phenomena in terms of point particles whose motion is guided by the wave function. In this theory, the state of a system of particles is defined by the actual positions of the particles and the wave function of the system; and the state of the system evolves deterministically. Thus, the Bohmian state can be compared with the state in classical mechanics, which is given by the positions and momenta of all the particles, and which also evolves deterministically. However, while in classical mechanics it is usually taken for granted and considered unproblematic that the state is, at least in principle, measurable, this is not the case in Bohmian mechanics. Due to the linearity of the quantum dynamical laws, one essential component of the Bohmian state, the wave function, is not directly measurable. Moreover, it turns out that the measurement of the other component of the state -the positions of the particles- must be mediated by the wave function; a fact that in turn implies that the positions of the particles, though measurable, are constrained by absolute uncertainty. This is the key to understanding how Bohmian mechanics, despite being deterministic, can account for all quantum predictions, including quantum randomness and uncertainty.
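
As a schematic reminder of the standard de Broglie-Bohm equations of motion (background, not a quotation from the paper), the state (Q_1, ..., Q_N; ψ) evolves with ψ obeying the Schrödinger equation and the actual positions obeying the guidance equation

\[
\frac{dQ_k}{dt} = \frac{\hbar}{m_k}\, \mathrm{Im}\!\left[\frac{\nabla_k \psi}{\psi}\right]\!(Q_1, \ldots, Q_N, t),
\]

so the particle velocities are determined by the wave function evaluated at the actual configuration. This is the sense in which any measurement of positions is "mediated by the wave function", as the abstract puts it.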

Science and the special composition question

Latest Results for Synthese

on 2016-10-05 12:00am GMT

Abstract

Mereological nihilism is the thesis that composition never occurs. Some philosophers have thought that science gives us compelling evidence against nihilism. In this article I respond to this concern. An initial challenge for nihilism stems from the fact that composition is such a ubiquitous feature of scientific theories. In response I motivate a restricted form of scientific anti-realism with respect to those components of scientific theories which make reference to composition. A second scientifically based worry for nihilism is that certain specific scientific phenomena (quantum entanglement, natural selection) might require ineliminable quantification over composite objects. I address these concerns, and argue that there seem to be nihilist-friendly construals of the scientific phenomena in question.

Quantum dynamics of simultaneously measured non-commuting observables

Nature Physical Sciences Research

on 2016-10-05 12:00am GMT

In quantum mechanics, measurements cause wavefunction collapse that yields precise outcomes, whereas for non-commuting observables such as position and momentum Heisenberg’s uncertainty principle limits the intrinsic precision of a state. Although theoretical work has demonstrated that it should be possible to perform simultaneous non-commuting measurements and has revealed the limits on measurement outcomes, only recently has the dynamics of the quantum state been discussed. To realize this unexplored regime, we simultaneously apply two continuous quantum non-demolition probes of non-commuting observables to a superconducting qubit. We implement multiple readout channels by coupling the qubit to multiple modes of a cavity. To control the measurement observables, we implement a ‘single quadrature’ measurement by driving the qubit and applying cavity sidebands with a relative phase that sets the observable. Here, we use this approach to show that the uncertainty principle governs the dynamics of the wavefunction by enforcing a lower bound on the measurement-induced disturbance. Consequently, as we transition from measuring identical to measuring non-commuting observables, the dynamics make a smooth transition from standard wavefunction collapse to localized persistent diffusion and then to isotropic persistent diffusion. Although the evolution of the state differs markedly from that of a conventional measurement, information about both non-commuting observables is extracted by keeping track of the time ordering of the measurement record, enabling quantum state tomography without alternating measurements. Our work creates novel capabilities for quantum control, including rapid state purification, adaptive measurement, measurement-based state steering and continuous quantum error correction. As physical systems often interact continuously with their environment via non-commuting degrees of freedom, our work offers a way to study how notions of contemporary quantum foundations arise in such settings.

Nature doi: 10.1038/nature19762
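
One way to picture the phrase "a relative phase that sets the observable" (an illustrative parametrization assumed here, not taken verbatim from the paper) is that a sideband phase φ selects a qubit observable in the equatorial plane,

\[
\sigma_\varphi = \cos\varphi\, \sigma_x + \sin\varphi\, \sigma_y,
\qquad
[\sigma_{\varphi_1}, \sigma_{\varphi_2}] = 2i \sin(\varphi_2 - \varphi_1)\, \sigma_z ,
\]

so two probes with equal phases measure the same observable (ordinary collapse), while a phase offset of π/2 gives maximally non-commuting observables, matching the reported crossover from standard collapse to persistent diffusion.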

Theoretical physics: The emperor’s new physics

Nature – Issue – nature.com science feeds

on 2016-10-05 12:00am GMT

Theoretical physics: The emperor’s new physics

Nature 538, 7623 (2016). doi:10.1038/538036a

Author: Richard Dawid

Richard Dawid examines a critique of quantum mechanics, string theory and inflationary cosmology.

arXiv:1609.08148 [quant-ph]

p-adic Distance, Finite Precision and Emergent Superdeterminism: A Number-Theoretic Consistent-Histories Approach to Local Quantum Realism

T. N. Palmer

Although the notion of superdeterminism can, in principle, account for the violation of the Bell inequalities, this potential explanation has been roundly rejected by the quantum foundations community. The arguments for rejection, one of the most substantive coming from Bell himself, are critically reviewed. In particular, analysis of Bell’s argument reveals an implicit unwarranted assumption: that the Euclidean metric is the appropriate yardstick for measuring distances in state space. Bell’s argument is largely negated if this yardstick is instead based on the alternative p-adic metric. Such a metric, common in number theory, arises naturally when describing chaotic systems which evolve precisely on self-similar invariant sets in their state space. A locally-causal realistic model of quantum entanglement is developed, based on the premise that the laws of physics ultimately derive from an invariant-set geometry in the state space of a deterministic quasi-cyclic mono-universe. Based on this, the notion of a complex Hilbert vector is reinterpreted in terms of an uncertain selection from a finite sample space of states, leading to a novel form of `consistent histories’ based on number-theoretic properties of the transcendental cosine function. This leads to novel realistic interpretations of position/momentum non-commutativity, EPR, the Bell Theorem and the Tsirelson bound. In this inherently holistic theory – neither conspiratorial, retrocausal, fine tuned nor nonlocal – superdeterminism is not invoked by fiat but is emergent from these `consistent histories’ number-theoretic constraints. Invariant set theory provides new perspectives on many of the contemporary problems at the interface of quantum and gravitational physics, and, if correct, may signal the end of particle physics beyond the Standard Model.
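
For readers unfamiliar with the number-theoretic ingredient, the p-adic distance invoked here is the standard one (stated for orientation; the paper's use of it on invariant sets is more specific):

\[
|x|_p = p^{-v_p(x)}, \qquad d_p(x, y) = |x - y|_p, \qquad d_p(x, z) \le \max\bigl(d_p(x, y),\, d_p(y, z)\bigr),
\]

where v_p(x) is the exponent of the highest power of p dividing x. Two numbers are p-adically close when their difference is divisible by a high power of p, so p-adic closeness can diverge sharply from Euclidean closeness (for example |p^n|_p = p^{-n} → 0 while p^n grows without bound), which is the sense in which swapping the Euclidean yardstick for a p-adic one can change the verdict on Bell's argument.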

 

One Response

  1. jacksarfatti:

    “Mereological nihilism is the thesis that composition never occurs. […]” (quoting the Synthese abstract above)

    I do not understand what this is about. Can anyone explain it? 😉
