Weekly Papers on Quantum Foundations (39)

Authors: Richard J. Szabo

We examine certain nonassociative deformations of quantum mechanics and gravity in three dimensions related to the dynamics of electrons in uniform distributions of magnetic charge. We describe a quantitative framework for nonassociative quantum mechanics in this setting, which exhibits new effects compared to ordinary quantum mechanics with sourceless magnetic fields, and the extent to which these theoretical consequences may be experimentally testable. We relate this theory to noncommutative Jordanian quantum mechanics, and show that its underlying algebra can be obtained as a contraction of the alternative algebra of octonions. The uncontracted octonion algebra conjecturally describes a nonassociative deformation of three-dimensional quantum gravity induced by magnetic monopoles, which we propose is realised by a non-geometric Kaluza-Klein monopole background in M-theory.

Authors: Kevin Costello, Edward Witten, Masahito Yamazaki

Several years ago, it was proposed that the usual solutions of the Yang-Baxter equation associated to Lie groups can be deduced in a systematic way from four-dimensional gauge theory. In the present paper, we extend this picture, fill in many details, and present the arguments in a concrete and down-to-earth way. Many interesting effects, including the leading nontrivial contributions to the $R$-matrix, the operator product expansion of line operators, the framing anomaly, and the quantum deformation that leads from $\mathfrak{g}[[z]]$ to the Yangian, are computed explicitly via Feynman diagrams. We explain how rational, trigonometric, and elliptic solutions of the Yang-Baxter equation arise in this framework, along with a generalization that is known as the dynamical Yang-Baxter equation.
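For context, the Yang-Baxter equation referred to above can be written, in standard notation not taken from the abstract, as an identity for a spectral-parameter-dependent $R$-matrix acting on a triple tensor product:

```latex
% Quantum Yang-Baxter equation with spectral parameters z_i, acting on
% V \otimes V \otimes V; subscripts indicate the two tensor factors on
% which each R-matrix acts nontrivially.
R_{12}(z_1, z_2)\, R_{13}(z_1, z_3)\, R_{23}(z_2, z_3)
  = R_{23}(z_2, z_3)\, R_{13}(z_1, z_3)\, R_{12}(z_1, z_2)
```

The rational, trigonometric, and elliptic families mentioned in the abstract correspond, roughly, to $R$ depending on the difference $z_1 - z_2$, on its exponential, or on elliptic functions of it.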

Authors: Gilles Brassard, Paul Raymond-Robichaud

We carry out a thought experiment in an imaginary world. Our world is both local and realistic, yet it violates a Bell inequality by more than quantum theory does. This serves to debunk, in the simplest possible manner, the myth that equates local realism with local hidden variables. Along the way, we reinterpret the celebrated 1935 argument of Einstein, Podolsky and Rosen, and come to the conclusion that they were right to question the completeness of quantum theory, provided one believes in a local-realistic universe. Throughout our journey, we strive to explain our views from first principles, without expecting mathematical sophistication or specialized prior knowledge from the reader.

Authors: Augusto Cesar Lobo

We address the relation between two apparently distinct problems: The quest for a deeper understanding of the nature of consciousness and the search for time and space as emergent structures in the quantum mechanical world. We also advance a toy-model proposal for the emergence of time from a timeless unus mundus quantum-like space by using Aharonov's two state formalism of quantum mechanics. We further speculate on these issues within a quantum cognitive perspective with particular interest in two recent papers on this emerging field of science. One (Aerts et al) entails (as we argue) a panpsychist top-down approach to the problem of consciousness. The second paper (Blutner et al) proposes a quantum cognitive model for Jung's psychological type structure. We discuss these concepts and their relation with our main thesis, that time is a measure of individuality. One of our central motivations is to provide arguments that allow the mainstream physicist to take seriously a panpsychist worldview, a position that has been openly advanced by many modern philosophers.

Authors: M. Lahiri, A. Hochrainer, R. Lapkiewicz, G. B. Lemos, A. Zeilinger

Interference of two beams produced at separate biphoton sources was first observed more than two decades ago. The phenomenon, often called "induced coherence without induced emission", has recently gained attention after its applications to imaging, spectroscopy, and measuring biphoton correlations were discovered. The sources used in the corresponding experiments are nonlinear crystals pumped by laser light. The use of a laser pump makes the occurrence of induced (stimulated) emission unavoidable, and the effect of stimulated emission can be observed in the joint detection rate of the two beams. This fact raises the question of whether stimulated emission also plays a role in inducing the coherence. Here we investigate a case in which the crystals are pumped with a single-photon Fock state. We find that coherence is induced even though the possibility of stimulated emission is now fully ruled out. Furthermore, the joint detection rate of the two beams becomes ideally zero and no longer changes with the pump power. We illustrate our results by numerical simulations and by comparisons with experimental findings. Our results rule out any classical or semi-classical explanation of the phenomenon and also imply that similar experiments can be performed with fermions, for which stimulated emission is strictly forbidden.

Authors: James Q. Quach, Maciej Lewenstein

We show that under the weak measurement scheme, the double-slit experiment can produce an interference pattern even when one of the slits is completely blocked. The initial and final states are corpuscular, whilst the intermediate states are wave-like, in that they exhibit an interference pattern. Remarkably, the interference pattern is measured to be vertically polarised, whilst simultaneously the individual photons are measured to be horizontally polarised. We call this the \textit{phantom slit} effect. The phantom slit is the dual of the quantum Cheshire cat.

Authors: D. Valente, F. Brito, R. Ferreira, T. Werlang

The work performed by a classical electromagnetic field on a quantum dipole is well known in quantum optics. In that case, the absorbed power depends linearly on the time derivative of the average dipole moment. The following problem, however, still lacks an answer: can the most elementary electromagnetic pulse, consisting of a single-photon state, perform work on a quantum dipole? As a matter of fact, the average quantum dipole moment exactly vanishes in such a scenario. In this paper, we present a method that answers this question in the affirmative, by combining techniques from the fields of quantum machines and open quantum systems. Quantum work here is defined as the unitary contribution to the energy variation of the quantum dipole. We show that this quantum work corresponds to the energy spent by the photon pulse to dynamically Stark shift the dipole. The non-unitary contribution to the dipole energy is defined here as a generalized quantum heat. We show that this generalized quantum heat is the energy corresponding to out-of-equilibrium photon absorption and emission. Finally, we reveal connections between the quantum work and the generalized quantum heat transferred by a single photon and those by a low-intensity coherent field.

Authors: Shu-Han Jiang, Zhen-Peng Xu, Hong-Yi Su, Arun Kumar Pati, Jing-Ling Chen

Hardy's paradox is an important all-versus-nothing proof of Bell's nonlocality. Hardy's original proof for two particles has been considered as "the simplest form of Bell's theorem" and "one of the strangest and most beautiful gems yet to be found in the extraordinary soil of quantum mechanics". Experimentally, a number of experiments have been carried out to confirm the paradox in two-particle systems. Theoretically, Hardy's paradox has been generalized from two qubits to arbitrary $n$-qubit systems by Cereceda, who found that for the $n$-qubit GHZ state the maximal success probability is $[1+\cos\frac{\pi}{n-1}]/2^{n}$. Here we present the most general framework for the $n$-particle Hardy's paradox, which includes Hardy's original one and Cereceda's extension as special cases. Remarkably, for any $n\ge 3$, there are always general Hardy's paradoxes (with success probability $1/2^{n-1}$) that are stronger than the previous ones. An experimental proposal to observe the stronger paradox in the three-qubit system has also been presented. Furthermore, from the general Hardy's paradox we have constructed the most general Hardy's inequalities, which can detect Bell's nonlocality for more quantum states.
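As a quick numerical illustration of the claim above (function names are ours, not from the abstract), the two success probabilities can be compared directly:

```python
import math

def cereceda_probability(n):
    # Cereceda's maximal success probability for the n-qubit GHZ state:
    # [1 + cos(pi/(n-1))] / 2**n
    return (1 + math.cos(math.pi / (n - 1))) / 2 ** n

def general_probability(n):
    # Success probability of the stronger general Hardy paradox: 1 / 2**(n-1)
    return 1 / 2 ** (n - 1)

# For every n >= 3 the general paradox has a larger success probability,
# since cos(pi/(n-1)) < 1 implies 1 + cos(pi/(n-1)) < 2.
for n in range(3, 8):
    print(n, cereceda_probability(n), general_probability(n))
```

For $n=3$, Cereceda's value is $[1+\cos(\pi/2)]/2^3 = 1/8$, while the general paradox reaches $1/2^2 = 1/4$, consistent with the abstract's claim that the general paradoxes are stronger.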

Authors: Simone Sturniolo

While historically many quantum mechanical simulations of molecular dynamics have relied on the Born-Oppenheimer approximation to separate electronic and nuclear behavior, recently much interest has arisen in quantum effects in nuclear dynamics as well, especially for protons. Due to the computational difficulty of solving the Schr\"odinger equation in full, though, these effects are often treated with approximate, quasi-classical methods.

In this paper we present an extension to the Many Interacting Worlds approach to quantum mechanics, developed using a kernel method to rebuild the probability density. This approach, in contrast to the approximation presented in the original paper, can be naturally extended to n-dimensional systems, making it a viable method for approximating both ground states and the quantum evolution of physical systems. The behavior of the algorithm is studied for different potentials and numbers of dimensions and compared both to the original approach and to exact Schr\"odinger equation solutions whenever possible.

The idea that quantum computers can do things that regular ones cannot has not been proven. But Google thinks it knows a problem only a quantum computer can solve.
Publication date: Available online 28 September 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Edward MacKinnon
The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.

Physicist Gil Lonzarich has sparked a revolution in the study of phase transitions driven by quantum fluctuations

-- Read more on ScientificAmerican.com


Author(s): J. K. Korbicz, E. A. Aguilar, P. Ćwikliński, and P. Horodecki

Measurement is of central interest in quantum mechanics as it provides the link between the quantum world and the world of everyday experience. One of the features of everyday experience is its robust, objective character, contrasting the delicate nature of quantum systems. Here we analyze in a comp...
[Phys. Rev. A 96, 032124] Published Thu Sep 28, 2017

Authors: Bruce Levinson

Humanity's efforts to transmute lead into gold have impelled civilizations. Our efforts to transmute human experience into objective laws have enjoyed similar success. Through thinkers such as Oliver Wendell Holmes, William James, Felix S. Cohen, Carol E. Cleland, Russell K. Standish and Christopher A. Fuchs we can see that a source of the difficulty in understanding phenomena via objective laws is that the law can best be understood as a quantum system, not a classical one. Law resembles a quantum system because maximal legal information is not complete and cannot be completed.

Barrett, Thomas William (2017) What Do Symmetries Tell Us About Structure? [Preprint]

Author(s): Xiang Zhan, Lei Xiao, Zhihao Bian, Kunkun Wang, Xingze Qiu, Barry C. Sanders, Wei Yi, and Peng Xue

We report the experimental detection of bulk topological invariants in nonunitary discrete-time quantum walks with single photons. The nonunitarity of the quantum dynamics is enforced by periodically performing partial measurements on the polarization of the walker photon, which effectively introduc...
[Phys. Rev. Lett. 119, 130501] Published Wed Sep 27, 2017

Author(s): M. Beau, J. Kiukas, I. L. Egusquiza, and A. del Campo

A system prepared in an unstable quantum state generally decays following an exponential law, as environmental decoherence is expected to prevent the decay products from recombining to reconstruct the initial state. Here we show the existence of deviations from exponential decay in open quantum syst...
[Phys. Rev. Lett. 119, 130401] Published Wed Sep 27, 2017

Authors: Danko Georgiev, Eliahu Cohen

Feynman's sum-over-histories formulation of quantum mechanics has been considered a useful calculational tool in which virtual Feynman histories entering into a quantum superposition cannot be individually measured. Here we show that sequential weak values inferred by weak measurements allow direct experimental probing of individual virtual Feynman histories thereby revealing the exact nature of quantum interference of superposed histories. In view of the existing controversy over the meaning and interpretation of weak values, our analysis demonstrates that sequential weak values of quantum histories (multi-time projection operators) are not arbitrary, but reflect true physical properties of the quantum physical system under study. If weak values are interpreted for a complete set of orthogonal quantum histories, the total sum of weak values is unity and the analysis agrees with the standard quantum mechanical picture.
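For readers unfamiliar with the term, the weak value of an observable for a system pre-selected in state $|\psi\rangle$ and post-selected in state $|\phi\rangle$ is standardly defined as follows (textbook notation, not taken from the abstract); the unit-sum property mentioned above then follows from completeness:

```latex
% Weak value of an observable A between pre-selection |psi>
% and post-selection |phi>:
A_w = \frac{\langle \phi | \hat{A} | \psi \rangle}{\langle \phi | \psi \rangle}

% For a complete set of orthogonal projectors (quantum histories)
% satisfying \sum_i \Pi_i = \mathbb{1}, linearity gives the unit sum:
\sum_i (\Pi_i)_w
  = \frac{\langle \phi | \sum_i \Pi_i | \psi \rangle}{\langle \phi | \psi \rangle}
  = 1
```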

Authors: Yakir Aharonov, Eliahu Cohen, Avishy Carmi, Avshalom C. Elitzur

Some predictions regarding pre- and post-selected particles are far-reaching, thereby requiring validation with standard quantum measurements in addition to the customary weak measurements used so far, as well as other advanced techniques. Following earlier papers, we continue this research program with two thought experiments. An excited atom traverses a Mach-Zehnder interferometer (MZI) under a special combination of pre- and post-selection. In the first experiment, photons emitted by the superposed atom, after being hit by two laser beams, are individually counted. Despite the interaction having definitely taken place, as revealed by the atom reaching its ground state, the numbers of photons emitted from each arm of the MZI are predicted, at the ensemble level, to be different from those expected with standard stimulated emission of a pre-selected-only atom. In the second experiment, the atom spontaneously emits a photon while still in the MZI. This photon later serves as a strong measurement of the atom's energy upon hitting a photographic plate. The experiment is repeated to enable an interference effect of the emitted photons. Surprisingly, the latter gives the appearance that the photons have been emitted by the atom from a position much farther from the two MZI arms L and R, as if in a "phantom arm" R'. Nevertheless, their time of arrival is similar to that of photons coming from L and R. These experiments also emphasize the key role of negative weak values of atoms under pre- and post-selection. The novel verification methods resemble weak measurements in some respects, yet result from unambiguous atomic transitions verified by the detected photons.

Ultrafast creation of large Schrödinger cat states of an atom

Nature Communications, Published online: 26 September 2017; doi:10.1038/s41467-017-00682-6

Generation of mesoscopic quantum superpositions requires both reliable coherent control and isolation from the environment. Here, the authors succeed in creating a variety of cat states of a single trapped atom, mapping spin superpositions into spatial superpositions using ultrafast laser pulses.

Schiemer, Georg and Wigglesworth, John (2017) The Structuralist Thesis Reconsidered. The British Journal for the Philosophy of Science.
Techniques could lead to better quantum-information networks

Authors: Gianluca Calcagni

Obtaining signatures of quantum gravity is one of the topical lines of research in modern theoretical physics and cosmology. This short review approaches the challenge from a novel perspective. Instead of separating quantum-gravity effects of a specific model between UV and IR regimes, we consider a general feature, possibly common to many frameworks, in which all scales are affected and spacetime geometry is characterized by a complex critical exponent. This leaves a log-oscillating modulation pattern in the cosmic microwave background spectrum and gives a unique opportunity, illustrated with the example of a multi-fractional theory, to test quantum gravities at cosmological scales.

Jabs, Arthur (2014) An interpretation of the formalism of quantum mechanics in terms of realism. arXiv, The British Journal for the Philosophy of Science, 43. pp. 405-421.
North, Jill (2017) A New Approach to the Relational-Substantival Debate. [Preprint]
Jabs, Arthur (2017) A conjecture concerning determinism, reduction, and measurement in quantum mechanics. arXiv. pp. 1-21.
Jabs, Arthur (2017) Quantum mechanics in terms of realism. arXiv. pp. 1-100.
Publication date: August 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 59
Author(s): Sebastian de Haro
In this paper I develop a framework for relating dualities and emergence: two notions that are close to each other but also exclude one another. I adopt the conception of duality as ‘isomorphism’, from the physics literature, cashing it out in terms of three conditions. These three conditions prompt two conceptually different ways in which a duality can be modified to make room for emergence; and I argue that this exhausts the possibilities for combining dualities and emergence (via coarse-graining). I apply this framework to gauge/gravity dualities, considering in detail three examples: AdS/CFT, Verlinde's scheme, and black holes. My main point about gauge/gravity dualities is that the theories involved, qua theories of gravity, must be background-independent. I distinguish two senses of background-independence: (i) minimalistic and (ii) extended. I argue that the former is sufficiently strong to allow for a consistent theory of quantum gravity; and that AdS/CFT is background-independent on this account; while Verlinde's scheme best fits the extended sense of background-independence. I argue that this extended sense should be applied with some caution: on pain of throwing the baby (general relativity) out with the bath-water (extended background-independence). Nevertheless, it is an interesting and potentially fruitful heuristic principle for quantum gravity theory construction. It suggests some directions for possible generalisations of gauge/gravity dualities. The interpretation of dualities is discussed; and the so-called ‘internal’ vs. ‘external’ viewpoints are articulated in terms of: (i) epistemic and metaphysical commitments; (ii) parts vs. wholes. I then analyse the emergence of gravity in gauge/gravity dualities in terms of the two available conceptualisations of emergence; and I show how emergence in AdS/CFT and in Verlinde's scenario differ from each other.
Finally, I give a novel derivation of the Bekenstein–Hawking black hole entropy formula based on Verlinde's scheme; the derivation sheds light on several aspects of Verlinde's scheme and how it compares to Bekenstein's original calculation.

Publication date: August 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 59
Author(s): Joseph Polchinski
Duality, the equivalence between seemingly distinct quantum systems, is a curious property that has been known for at least three quarters of a century. In the past two decades it has played a central role in mapping out the structure of theoretical physics. I discuss the unexpected connections that have been revealed among quantum field theories and string theories. Written for a special issue of Studies in History and Philosophy of Modern Physics.

Publication date: August 2017
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 59
Author(s): Doreen Fraser
The application of analytic continuation in quantum field theory (QFT) is juxtaposed to T-duality and mirror symmetry in string theory. Analytic continuation—a mathematical transformation that takes the time variable $t$ to negative imaginary time $-it$—was initially used as a mathematical technique for solving perturbative Feynman diagrams, and was subsequently the basis for the Euclidean approaches within mainstream QFT (e.g., Wilsonian renormalization group methods, lattice gauge theories) and the Euclidean field theory program for rigorously constructing non-perturbative models of interacting QFTs. A crucial difference between theories related by duality transformations and those related by analytic continuation is that the former are judged to be physically equivalent while the latter are regarded as physically inequivalent. There are other similarities between the two cases that make comparing and contrasting them a useful exercise for clarifying the type of argument that is needed to support the conclusion that dual theories are physically equivalent. In particular, T-duality and analytic continuation in QFT share the criterion for predictive equivalence that two theories agree on the complete set of expectation values and the mass spectra and the criterion for formal equivalence that there is a “translation manual” between the physically significant algebras of observables and sets of states in the two theories. The analytic continuation case study illustrates how predictive and formal equivalence are compatible with physical inequivalence, but not in the manner of standard underdetermination cases. Arguments for the physical equivalence of dual theories must cite considerations beyond predictive and formal equivalence. The analytic continuation case study is an instance of the strategy of developing a physical theory by extending the formal or mathematical equivalence with another physical theory as far as possible.
That this strategy has resulted in developments in pure mathematics as well as theoretical physics is another feature that this case study has in common with dualities in string theory.
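For reference, the analytic continuation described above is commonly implemented as a Wick rotation; schematically (standard textbook notation, not taken from the article):

```latex
% Wick rotation: real time t -> -i*tau maps the oscillatory
% Lorentzian weight in the path integral to a damped Euclidean one.
t \to -i\tau, \qquad
e^{\, i S[\phi]/\hbar} \to e^{- S_E[\phi]/\hbar},
\qquad S_E[\phi] = -\, i\, S[\phi]\big|_{t = -i\tau}
```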
