Weekly Papers on Quantum Foundations (40)

This is a list of this week’s papers on quantum foundations published in various journals or uploaded to preprint servers such as arxiv.org and the PhilSci Archive.

Contrary Inferences in Consistent Histories and a Set Selection Criterion

Latest Results for Foundations of Physics

on 2014-10-04 12:00am GMT

Abstract

The best developed formulation of closed-system quantum theory that handles multiple-time statements is the consistent (or decoherent) histories approach. The most important weakness of the approach is that it gives rise to many different consistent sets, and it has been argued that a complete interpretation should be accompanied by a natural mechanism leading to a (possibly) unique preferred consistent set. The existence of multiple consistent sets becomes more problematic because it allows the existence of contrary inferences [1]. We analyse the conceptual difficulties that arise from the existence of multiple consistent sets and provide a suggestion for a natural set selection criterion. This criterion does not lead to a unique physical consistent set; however, it evades the existence of consistent sets with contrary inferences. The criterion is based on the concept of preclusion and the requirement that probability-one propositions and their inferences should be non-contextual. The allowed consistent sets turn out to be compatible with coevents, which are the ontology of an alternative, histories-based formulation [2–4].

Time in fundamental physics

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2014-10-03 8:57pm GMT

Publication date: Available online 3 October 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Abhay Ashtekar
The first three sections of this paper contain a broad-brush summary of the profound changes in the notion of time in fundamental physics that were brought about by three revolutions: the foundations of mechanics distilled by Newton in his Principia, the discovery of special relativity by Einstein and its reformulation by Minkowski, and, finally, the fusion of geometry and gravity in Einstein's general relativity. The fourth section discusses two aspects of yet another deep revision that waits in the wings as we attempt to unify general relativity with quantum physics.

Decay Law of Relativistic Particles: Quantum Theory Meets Special Relativity. (arXiv:1408.6564v1 [hep-ph] CROSS LISTED)

hep-th updates on arXiv.org

on 2014-10-03 2:54am GMT

Late-time properties of moving relativistic particles are studied. Within the proper relativistic treatment of the problem, we find the decay curves of such particles and show that late-time deviations of their survival probability from the exponential decay law (that is, the transition-time region between the exponential and non-exponential forms of the survival amplitude) occur much earlier than follows from the classical standard approach, which boils down to replacing time $t$ by $t/\gamma_{L}$ (where $\gamma_{L}$ is the relativistic Lorentz factor) in the formula for the survival probability. The consequence is that fluctuations of the corresponding decay curves can appear much earlier, and many more unstable particles have a chance to survive up to these times or later. It is also shown that fluctuations of the instantaneous energy of moving unstable particles have a similar form to the fluctuations in the particle rest frame, but they are seen by the observer in his rest system much earlier than one could expect from replacing $t$ by $t/\gamma_{L}$ in the corresponding expressions for this energy, and the amplitude of these fluctuations can be even larger than follows from the standard approach. All these effects seem to be important when interpreting accelerator experiments with high-energy unstable particles (possible connections of these effects with the GSI anomaly are analyzed) and some results of astrophysical observations.
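The "classical standard approach" that the abstract critiques can be sketched as follows; this is only an illustration of the naive time-dilation rescaling, not the paper's proper relativistic treatment:

```python
import math

# Naive treatment: an exponential rest-frame decay law P(t) = exp(-Gamma * t),
# with the lab-frame time rescaled as t -> t / gamma_L for a moving particle.
# The paper argues that deviations from this picture set in much earlier.
def survival_naive(t, decay_rate, gamma_L=1.0):
    """Survival probability under the naive time-dilation rescaling."""
    return math.exp(-decay_rate * t / gamma_L)

# A particle moving with gamma_L = 10 naively appears to live ten times longer:
p_rest = survival_naive(5.0, 1.0)        # rest frame
p_lab = survival_naive(5.0, 1.0, 10.0)   # lab frame, naive rescaling
```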

No ‘anomalous’ weak values in a classical theory. (arXiv:1410.0570v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-10-03 2:54am GMT

The authors of a recent paper [Phys. Rev. Lett. 113, 120404 (2014)] suggest that “weak values are not inherently quantum but rather a purely statistical feature of pre- and postselection with disturbance”. We argue that this claim is erroneous, since such values require averaging with distributions which change sign. This type of averaging arises naturally in quantum mechanics, but may not occur in classical statistics.

Comment on ‘How the result of a single coin toss can turn out to be 100 heads’. (arXiv:1410.0381v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-10-03 2:54am GMT

The “anomalous” values of C. Ferrie and J. Combes in Phys. Rev. Lett. 113, 120404 (2014) say nothing about quantum – or even classical – physics. They are not analogues of the weak values that emerge when we describe the quantum world via an initial state evolving forwards in time and a final state evolving backwards in time, and couple this world weakly to realistic measuring devices.

[In Depth] Breakthrough lost in coin toss?

Science: Current Issue

on 2014-10-03 12:00am GMT

A controversial quantum measurement that seems to bend the rules may not be very quantum after all. Author: Adrian Cho

Quantum mechanics without state vectors

PRA: Fundamental concepts

on 2014-10-02 2:00pm GMT

Author(s): Steven Weinberg

A proposal is formulated to give up the description of physical states in terms of ensembles of state vectors and to rely only on density matrices instead. This opens up a variety of new ways for density matrices to transform under various symmetries, different from the unitary symmetries of ordinary quantum mechanics.

[Phys. Rev. A 90, 042102] Published Thu Oct 02, 2014

Generalized Uncertainty Principle: Approaches and Applications. (arXiv:1410.0206v1 [gr-qc])

gr-qc updates on arXiv.org

on 2014-10-02 7:37am GMT

We review some highlights from string theory, black hole physics, and doubly special relativity, together with thought experiments suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement a minimal length scale and/or a maximum momentum in different physical systems are analysed and compared. These models entered the literature as the Generalized Uncertainty Principle (GUP), assuming a modified dispersion relation, and they allow for a wide range of applications in estimating, for example, inflationary parameters, Lorentz invariance violation, black hole thermodynamics, the Salecker-Wigner inequalities, the entropic nature of gravitational laws, the Friedmann equations, minimal time measurement, and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty; a second predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall, and the law of reciprocal action, and on the kinetic energy of a composite system. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. The concern about compatibility with the equivalence principles, the universality of the gravitational redshift, free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.
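For orientation, a widely used quadratic GUP form (a standard textbook example, not necessarily one of the specific higher-order approaches compared in this review) modifies the Heisenberg relation as

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right),
\qquad \Delta x_{\min} = \hbar\sqrt{\beta},
```

where $\beta$ is the GUP parameter; minimizing the right-hand side over $\Delta p$ (at $\Delta p = 1/\sqrt{\beta}$) yields the minimal length uncertainty $\Delta x_{\min}$.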

Appearing Out of Nowhere: The Emergence of Spacetime in Quantum Gravity. (arXiv:1410.0345v1 [physics.hist-ph])

physics.hist-ph updates on arXiv.org

on 2014-10-02 7:37am GMT

Quantum gravity is understood as a theory that, in some sense, unifies general relativity (GR) and quantum theory, and is supposed to replace GR at extremely small distances (high-energies). It may be that quantum gravity represents the breakdown of spacetime geometry described by GR. The relationship between quantum gravity and spacetime has been deemed “emergence”, and the aim of this thesis is to investigate and explicate this relation. After finding traditional philosophical accounts of emergence to be inappropriate, I develop a new conception of emergence by considering physical case studies including condensed matter physics, hydrodynamics, critical phenomena and quantum field theory understood as effective field theory.

This new conception of emergence is independent of reduction and derivation. Instead, a low-energy theory is understood as emergent from a high-energy theory if it is novel and autonomous compared to the high-energy theory, and the low-energy physics is dependent (in a particular, minimal sense) on the high-energy physics (this dependence is revealed by the techniques of effective field theory and the renormalisation group). These ideas are important in exploring the relationship between quantum gravity and GR, where GR is understood as an effective, low-energy theory of quantum gravity. Without experimental data or a theory of quantum gravity, we rely on principles and techniques from other areas of physics to guide the way. As well as considering the idea of emergence appropriate to treating GR as an effective field theory, I investigate the emergence of spacetime (and other aspects of GR) in several concrete approaches to quantum gravity, including examples of the condensed matter approaches, the “discrete approaches” (causal set theory, causal dynamical triangulations, quantum causal histories and quantum graphity) and loop quantum gravity.

A Robust Mathematical Model for Clauser-Horne Experiments, With Implications for Rigorous Statistical Analysis. (arXiv:1312.2999v2 [quant-ph] UPDATED)

quant-ph updates on arXiv.org

on 2014-10-02 7:37am GMT

Recent experiments have reached detection efficiencies sufficient to close the detection loophole, testing the Clauser-Horne (CH) version of Bell’s inequality. For a similar future experiment to be completely loophole-free, it will be important to have discrete experimental trials with randomized measurement settings for each trial, and the statistical analysis should not overlook the possibility of a local state varying over time with possible dependence on earlier trials (the “memory loophole”). In this paper, a mathematical model for such a CH experiment is presented, and a method for statistical analysis that is robust to memory effects is introduced. Additionally, a new method for calculating exact p-values for martingale-based statistics is described; previously, only non-sharp upper bounds derived from the Azuma-Hoeffding inequality have been available for such statistics. This improvement decreases the required number of experimental trials to demonstrate non-locality. The statistical techniques are applied to the data of recent experiments and found to perform well.
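The non-sharp bound that the abstract says was previously the only option is the Azuma-Hoeffding inequality; a minimal sketch of that classical bound (not the paper's exact p-value method, and with an assumed increment bound c):

```python
import math

# Azuma-Hoeffding tail bound for a martingale S_n with increments
# bounded by |X_i| <= c:  P(S_n >= t) <= exp(-t^2 / (2 n c^2)).
# This gives only an upper bound on the p-value; the paper's exact
# method yields sharper values and hence fewer required trials.
def azuma_pvalue_bound(t, n, c=1.0):
    return math.exp(-t * t / (2.0 * n * c * c))
```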

Algebras without Involution and Quantum Field Theories. (arXiv:1203.2705v4 [math-ph] UPDATED)

quant-ph updates on arXiv.org

on 2014-10-02 7:37am GMT

Explicit realizations of quantum field theory (QFT) are admitted by a revision to the Wightman axioms for the vacuum expectation values (VEV) of fields. The technical development of QFT is expanded beyond positive functionals on *-algebras while the physically motivated properties (Poincaré covariance, positive energy, microcausality, and a Hilbert space realization of states) are preserved.

Testing the limits of quantum mechanical superpositions. (arXiv:1410.0270v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-10-02 7:37am GMT

Quantum physics has intrigued scientists and philosophers alike because it challenges our notions of reality and locality, concepts that we have grown to rely on in our macroscopic world. It is an intriguing open question whether the linearity of quantum mechanics extends into the macroscopic domain. Scientific progress over the last decades inspires hope that this debate may be decided by table-top experiments.

The role of quantum recurrence in superconductivity, carbon nanotubes and related gauge symmetry breaking. (arXiv:1307.5062v2 [physics.gen-ph] UPDATED)

quant-ph updates on arXiv.org

on 2014-10-01 12:47am GMT

Pure quantum phenomena are characterized by intrinsic recurrences in space and time. We use such an intrinsic periodicity as a quantization condition to derive the essential phenomenology of superconductivity. The resulting description is based on fundamental quantum dynamics and geometrical considerations, rather than on microscopical characteristics of the superconducting materials. This allows for the interpretation of the related gauge symmetry breaking by means of the competition between quantum recurrence and thermal noise. We also test the validity of this approach to describe the case of carbon nanotubes.

A comment on “How the result of a single coin toss can turn out to be 100 heads”. (arXiv:1409.8555v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-10-01 12:47am GMT

In [1] the authors claim that “weak values are not inherently quantum but rather a purely statistical feature”. I argue that their model reproduces only a few elements of weak measurements but fails to reproduce the other intrinsically quantum features.

Insolubility from No-Signalling

Latest Results for International Journal of Theoretical Physics

on 2014-10-01 12:00am GMT

Abstract

This paper improves on the result in my earlier paper (Bacciagaluppi, Eur. J. Philos. Sci. 3: 87–100, 2013), showing that within the framework of the unitary Schrödinger equation it is impossible to reproduce the phenomenological description of quantum mechanical measurements (in particular the collapse of the state of the measured system) by assuming a suitable mixed initial state of the apparatus. The result follows directly from the no-signalling theorem applied to the entangled state of measured system and ancilla. As opposed to many other ‘insolubility theorems’ for the measurement problem of quantum mechanics, it focuses on the impossibility of reproducing the phenomenological collapse of the state of the measured system.

Review on the quantization of gravity. (arXiv:1409.7977v1 [gr-qc])

gr-qc updates on arXiv.org

on 2014-9-30 1:09pm GMT

This is a review article on quantum gravity. In section 1, the Penrose singularity theorem is proven. In section 2, the covariant quantization approach to gravity is reviewed. In section 3, an article by Hawking is reviewed which shows the gravitational path integral at one-loop level to be dominated by contributions from some kind of virtual gravitational instantons. In section 4, the canonical, non-perturbative quantization approach is reviewed. In section 5, arguments from Hawking are mentioned which show the gravitational path integral to be an approximate solution of the Wheeler-DeWitt equation. In section 6, the black hole entropy is derived in various ways: section 6.1 uses the gravitational path integral for this calculation, and section 6.2 shows how the black hole entropy can be derived from canonical quantum gravity. In section 7.1, arguments from Dvali and Gomez, who claim that gravity can be quantized in a way which would be in some sense self-complete, are critically assessed. In section 7.2, a model from Dvali and Gomez for the description of quantum mechanical black holes is critically assessed and compared with the standard quantization methods of gravity.

A proposal for the experimental detection of CSL induced random walk. (arXiv:1409.8204v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-9-30 9:09am GMT

Continuous Spontaneous Localization (CSL) is one possible explanation for dynamically induced collapse of the wave-function during a quantum measurement. The collapse is mediated by a stochastic non-linear modification of the Schrödinger equation. A consequence of the CSL mechanism is an extremely tiny violation of energy-momentum conservation, which can, in principle, be detected in the laboratory via the random diffusion of a particle induced by the stochastic collapse mechanism. In a paper in 2003, Collett and Pearle investigated the translational CSL diffusion of a sphere, and the rotational CSL diffusion of a disc, and showed that this effect dominates over the ambient environmental noise at low temperatures and extremely low pressures (about a ten-thousandth of a pico-Torr). In the present paper, we revisit their analysis and argue that this stringent condition on pressure can be relaxed, and that the CSL effect can be seen at a pressure of about a pico-Torr. A similar analysis is provided for diffusion produced by gravity-induced decoherence, where the effect is typically much weaker than CSL. We also discuss the CSL-induced random displacement of a quantum oscillator. Lastly, we propose possible experimental set-ups, arguing that CSL diffusion is indeed measurable with current technology.

Randomness Requirement on CHSH Bell Test in the Multiple Run Scenario. (arXiv:1409.7875v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-9-30 9:09am GMT

The Clauser-Horne-Shimony-Holt inequality test is widely used as a means of invalidating local deterministic theories and as a tool for device-independent quantum cryptographic tasks. There exists a randomness (free-will) loophole in the test, which is widely believed to be impossible to close perfectly: certain random inputs are required for the test. Following a randomness quantification method used in the literature, we investigate the randomness required in the test under various assumptions. By comparing the results, one can conclude that the key to making the test result reliable is to rule out correlations between multiple runs.
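As a reminder of what the test measures, here is a minimal sketch of the CHSH combination with singlet-state correlations, using standard textbook angle choices (not specific to this paper's analysis):

```python
import math

# CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
# Local deterministic theories require |S| <= 2; quantum mechanics
# reaches |S| = 2*sqrt(2) for the singlet state, where E(a,b) = -cos(a-b).
def E(a, b):
    return -math.cos(a - b)

a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, -math.pi / 4
S = E(a, b) + E(a, b_prime) + E(a_prime, b) - E(a_prime, b_prime)
# |S| = 2*sqrt(2) ~ 2.83, exceeding the local bound of 2.
```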

The Born rule in Bohmian mechanics. (arXiv:1409.7891v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-9-30 9:09am GMT

When quantum mechanics is formulated as an indeterministic theory, the Born rule is an independent postulate. Here we argue that consistency of the deterministic formulation of quantum mechanics developed by Bohm requires that the Born rule be derivable, without any statistical assumptions. We solve a simple example where the creation of an ensemble of identical quantum states, together with position measurements on those states, are described by the deterministic laws of Bohmian mechanics. The derived measurement results agree with the Born rule, which is thus a consequence of deterministic evolution.
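For reference, the Born rule assigns the position-measurement density rho(x) = |psi(x)|^2; a minimal sketch for a Gaussian packet (a generic illustration, not the paper's Bohmian derivation):

```python
import math

# Born rule: the probability density for finding the particle at x is
# rho(x) = |psi(x)|^2, illustrated with a normalized Gaussian packet.
# In Bohmian mechanics this density is to be derived, not postulated.
def psi(x, sigma=1.0):
    norm = (1.0 / (math.pi * sigma ** 2)) ** 0.25
    return norm * math.exp(-x * x / (2.0 * sigma ** 2))

def born_density(x, sigma=1.0):
    return psi(x, sigma) ** 2

# The density integrates to 1 (checked numerically on a fine grid).
total = sum(born_density(-10.0 + 0.001 * i) * 0.001 for i in range(20001))
```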

Bell Inequalities for Continuously Emitting Sources. (arXiv:1409.7732v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-9-30 9:09am GMT

A common experimental strategy for demonstrating non-classical correlations is to show violation of a Bell inequality by measuring a continuously emitted stream of entangled photon pairs. The measurements involve the detection of photons by two spatially separated parties. The detection times are recorded and compared to quantify the violation. The violation critically depends on determining which detections are coincident. Because the recorded detection times have “jitter”, coincidences cannot be inferred perfectly. In the presence of settings-dependent timing errors, this can allow a local-realistic system to show apparent violation, the so-called “coincidence loophole”. Here we introduce a family of Bell inequalities based on signed, directed distances between the parties’ sequences of recorded timetags. Given that the timetags are recorded for synchronized, fixed observation periods and that the settings choices are random and independent of the source, violation of these inequalities unambiguously shows non-classical correlations violating local realism. Distance-based Bell inequalities are generally useful for two-party configurations where the effective size of the measurement outcome space is large or infinite. We show how to systematically modify the underlying Bell functions to improve the signal-to-noise ratio and to quantify the significance of the violation.

Assessing the Montevideo interpretation of quantum mechanics

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2014-9-29 8:37pm GMT

Publication date: Available online 24 May 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Jeremy Butterfield
This paper gives a philosophical assessment of the Montevideo interpretation of quantum theory, advocated by Gambini, Pullin and co-authors. This interpretation has the merit of linking its proposal about how to solve the measurement problem to the search for quantum gravity: namely by suggesting that quantum gravity makes for fundamental limitations on the accuracy of clocks, which imply a type of decoherence that ‘collapses the wave-packet’. I begin (Section 2) by sketching the topics of decoherence, and quantum clocks, on which the interpretation depends. Then I expound the interpretation, from a philosopher's perspective (Sections 3–5). Finally, in Section 6, I argue that the interpretation, at least as developed so far, is best seen as a form of the Everett interpretation: namely with an effective or approximate branching, that is induced by environmental decoherence of the familiar kind, and by the Montevideans’ ‘temporal decoherence’.

Entanglement and disentanglement in relativistic quantum mechanics

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2014-9-29 8:37pm GMT

Publication date: Available online 16 September 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Jeffrey A. Barrett
A satisfactory formulation of relativistic quantum mechanics requires that one be able to represent the entangled states of spacelike separated systems and describe how such states evolve. This paper presents two stories that one must be able to tell coherently in order to understand relativistic entangled systems. These stories help to illustrate why one's understanding of entanglement in relativistic quantum mechanics must ultimately depend on the details of one's strategy for addressing the quantum measurement problem.

Identity versus determinism: Émile Meyerson's neo-Kantian interpretation of the quantum theory

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2014-9-29 8:37pm GMT

Publication date: August 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 47
Author(s): M. Anthony Mills
Despite the praise his writing garnered during his lifetime, e.g., from readers such as Einstein and de Broglie, Émile Meyerson has been largely forgotten. The rich tradition of French épistémologie has recently been taken up in some Anglo-American scholarship, but Meyerson—who popularized the term épistémologie through his historical method of analyzing science, and criticized positivism long before Quine and Kuhn—remains overlooked. If Meyerson is remembered at all, it is as a historian of classical science. This paper attempts to rectify both states of affairs by explicating one of Meyerson's last and untranslated works, Réel et déterminisme dans la théorie quantique, an opuscule on quantum physics. I provide an overview of Meyerson's philosophy, his critique of Max Planck's interpretation of quantum physics, and then outline and evaluate Meyerson's neo-Kantian alternative. I then compare and contrast this interpretation with Cassirer's neo-Kantian program. Finally I show that, while Meyerson believes the revolutionary new physics requires “profoundly” modifying our conception of reality, ultimately, he thinks, it secures the legitimacy of his thesis: that science seeks explanations in the form of what he calls “identification.” I hope my research will enable a more general and systematic engagement with Meyerson's work, especially with a view to assessing its viability as a philosophical method today.

Measurements according to Consistent Histories

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2014-9-29 8:37pm GMT

Publication date: November 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part A
Author(s): Elias Okon, Daniel Sudarsky
We critically evaluate the treatment of the notion of measurement in the Consistent Histories approach to quantum mechanics. We find such a treatment unsatisfactory because it relies, often implicitly, on elements external to those provided by the formalism. In particular, we note that, in order for the formalism to be informative when dealing with measurement scenarios, one needs to assume that the appropriate choice of framework is such that apparatuses are always in states of well defined pointer positions after measurements. The problem is that there is nothing in the formalism to justify this assumption. We conclude that the Consistent Histories approach, contrary to what is claimed by its proponents, fails to provide a truly satisfactory resolution for the measurement problem in quantum theory.

Foundations of quantum gravity: The role of principles grounded in empirical reality

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2014-9-29 8:37pm GMT

Publication date: May 2014
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 46, Part B
Author(s): Marc Holman
When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data – either directly or indirectly – and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the “gauge principle” are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where – actual or potential – empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. 
It is argued that if history is to be of any guidance, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful however in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained – when appropriate – how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.

Attosecond science and the tunneling time problem

ScienceDirect Publication: Physics Reports

on 2014-9-29 4:54pm GMT

Publication date: Available online 28 September 2014
Source: Physics Reports
Author(s): Alexandra S. Landsman, Ursula Keller
The question of how long it takes a particle to tunnel through a potential barrier has been a subject of intense theoretical debate for the last 80 years. In this decade of attosecond science, the answer to this question not only promises to deepen our understanding of fundamental quantum mechanics, but also has significant practical implications for how we interpret attosecond electron dynamics that underlie important phenomena in physics, chemistry and biology. Here we attempt to address this problem in the context of recent experimental measurements which use state-of-the-art ultrafast laser technology to resolve electron dynamics on the attosecond time-scale. This review therefore brings the theory of tunnelling time to the arena of ultrafast science, opening the door to improved resolution of, and cross-fertilization between, significant practical and fundamental questions in both fields.

Quantum Random Number Generation on a Mobile Phone

Recent Articles in Phys. Rev. X

on 2014-9-29 2:00pm GMT

Author(s): Bruno Sanguinetti, Anthony Martin, Hugo Zbinden, and Nicolas Gisin

Generating random numbers is critical to securing both communications and data. New results reveal how consumer hardware such as mobile phones can generate random numbers with a quantum origin.

[Phys. Rev. X 4, 031056] Published Mon Sep 29, 2014

Trans-Planckian fluctuations and the stability of quantum mechanics. (arXiv:1409.7467v1 [hep-th])

gr-qc updates on arXiv.org

on 2014-9-29 1:18pm GMT

We present arguments suggesting that deviations from the Born probability rule could be generated for trans-Planckian field modes during inflation. Such deviations are theoretically possible in the de Broglie-Bohm pilot-wave formulation of quantum mechanics, according to which the Born rule describes a state of statistical equilibrium. We suggest that a stable equilibrium state can exist only in restricted conditions: on a classical background spacetime that is globally hyperbolic or in a mild quantum-gravity regime in which there is an effective Schrödinger equation with a well-defined time parameter. These arguments suggest that quantum equilibrium will be unstable at the Planck scale. We construct a model in which quantum nonequilibrium is generated by a time-dependent regulator for pilot-wave dynamics, where the regulator is introduced to eliminate phase singularities. Applying our model to trans-Planckian modes that exit the Planck radius, we calculate the corrected primordial power spectrum and show that it displays a power excess (above a critical wavenumber). We briefly consider how our proposals could be tested by measurements of the cosmic microwave background.

Pragmatic Interpretation of Quantum Logic. (arXiv:1409.0194v2 [quant-ph] UPDATED)

quant-ph updates on arXiv.org

on 2014-9-29 1:18pm GMT

Scholars have long wondered whether the language of quantum mechanics introduces a quantum notion of truth which is formalized by quantum logic (QL) and is incompatible with the classical (Tarskian) notion. We show that QL can be interpreted as a pragmatic language of assertive formulas which formalize statements about physical systems that are empirically justified or unjustified in the framework of quantum mechanics. According to this interpretation, QL formalizes properties of the metalinguistic notion of empirical justification within quantum mechanics rather than properties of a quantum notion of truth. This conclusion agrees with a general integrationist perspective that interprets nonstandard logics as theories of metalinguistic notions different from truth, thus avoiding incompatibility with classical notions and preserving the globality of logic. Along the way, some elucidations of the standard notion of quantum truth are also obtained.

Key words: pragmatics, quantum logic, quantum mechanics, justifiability, global pluralism.

Emergent “Quantum” Theory in Complex Adaptive Systems. (arXiv:1409.7588v1 [quant-ph])

quant-ph updates on arXiv.org

on 2014-9-29 1:18pm GMT

Motivated by the question of stability, in this letter we argue that an effective “quantum” theory can emerge in complex adaptive systems. In the concrete example of stochastic Lotka-Volterra dynamics, the relevant effective “Planck constant” associated with such emergent “quantum” theory has the dimensions of the square of the unit of time. Such an emergent quantum-like theory has inherently non-classical stability as well as coherent properties that are not, in principle, endangered by thermal fluctuations and therefore might be of crucial importance in complex adaptive systems.

Waiting for the quantum bus: The flow of negative probability

ScienceDirect Publication: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

on 2014-9-28 8:36pm GMT

Publication date: November 2014
Source:Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 48, Part A
Author(s): A.J. Bracken , G.F. Melloy
It is 45 years since the discovery of the peculiar quantum effect known as ‘probability backflow’, and it is 20 years since the greatest possible size of the effect was characterized. Recently an experiment has been proposed to observe it directly, for the first time, by manipulating ultra-cold atoms. Here a non-technical description is given of the effect and its interpretation in terms of the flow of negative probability.
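Probability backflow can be seen in a minimal example: a superposition of two plane waves, both with strictly positive momentum, whose probability current is nonetheless negative at some points. The momenta and amplitude below are illustrative choices (units ħ = m = 1), not values from the paper:

```python
import numpy as np

# Two plane waves with positive momenta p = 1 and q = 3; the relative
# amplitude a = 2/3 minimizes the current and is chosen for illustration.
p, q, a = 1.0, 3.0, 2.0 / 3.0

x = np.linspace(0.0, 2 * np.pi, 10001)
psi = np.exp(1j * p * x) + a * np.exp(1j * q * x)
dpsi = 1j * p * np.exp(1j * p * x) + 1j * a * q * np.exp(1j * q * x)

# Probability current j = Im(psi* dpsi/dx) in units with hbar/m = 1.
j = np.imag(np.conj(psi) * dpsi)

print(f"min j = {j.min():.3f}")  # negative, despite only positive momenta
```

Analytically, j(x) = p + a²q + a(p + q)cos((q − p)x), whose minimum here is 7/3 − 8/3 = −1/3: probability flows "backwards" even though every momentum component points forward.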

A Generalized Quantum Theory

Latest Results for Foundations of Physics

on 2014-9-28 12:00am GMT

Abstract

In quantum mechanics, the self-adjoint Hilbert space operators play a triple role as observables, generators of the dynamical groups and statistical operators defining the mixed states. One might expect that this is typical of Hilbert space quantum mechanics, but it is not. The same triple role occurs for the elements of a certain ordered Banach space in a much more general theory based upon quantum logics and a conditional probability calculus (which is a quantum logical model of the Lüders-von Neumann measurement process). It is shown how positive groups, automorphism groups, Lie algebras and statistical operators emerge from one major postulate: the non-existence of third-order interference [third-order interference and its impossibility in quantum mechanics were discovered by Sorkin (Mod Phys Lett A 9:3119–3127, 1994)]. This again underlines the power of the combination of the conditional probability calculus with the postulate that there is no third-order interference. In two earlier papers, its impact on contextuality and nonlocality was already revealed.
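The postulate the abstract builds on can be checked numerically: Sorkin's third-order interference term I3 vanishes identically for Born-rule probabilities, while the ordinary second-order term I2 does not. A minimal sketch (the three complex amplitudes are arbitrary stand-ins for three mutually exclusive paths or histories):

```python
import numpy as np

rng = np.random.default_rng(0)

def born_prob(*amps):
    """Born-rule probability of a union of mutually exclusive
    alternatives: squared modulus of the summed amplitude."""
    return abs(sum(amps)) ** 2

# Three random complex amplitudes standing in for three slits/histories.
a, b, c = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# Sorkin's third-order interference term I3: zero under the Born rule.
I3 = (born_prob(a, b, c)
      - born_prob(a, b) - born_prob(a, c) - born_prob(b, c)
      + born_prob(a) + born_prob(b) + born_prob(c))

# Second-order term I2 for comparison: generically nonzero.
I2 = born_prob(a, b) - born_prob(a) - born_prob(b)

print(f"I3 = {I3:.2e}")  # vanishes (up to rounding) for any amplitudes
print(f"I2 = {I2:.2e}")  # ordinary two-path interference, nonzero
```

Expanding the squared moduli shows every cross term in I3 cancels pairwise, which is exactly why "no third-order interference" can serve as a characterizing postulate.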
