Weekly Papers on Quantum Foundations (21)

Authors: Juven C. Wang

In this thesis, we explore aspects of symmetry, topology, and anomalies in quantum matter with entanglement, from both condensed matter and high energy theory viewpoints. The focus of our research is on gapped many-body quantum systems, including symmetry-protected topological states and topologically ordered states. Chapter 1. Introduction. Chapter 2. Geometric phase, wavefunction overlap, spacetime path integral and topological invariants. Chapter 3. Aspects of Symmetry. Chapter 4. Aspects of Topology. Chapter 5. Aspects of Anomalies. Chapter 6. Quantum Statistics and Spacetime Surgery. Chapter 7. Conclusion: Finale and A New View of Emergence-Reductionism. (Thesis supervisor: Prof. Xiao-Gang Wen)

Authors: Partha Ghose

The measurement problem in quantum mechanics originates in the inability of the Schrödinger equation to predict definite outcomes of measurements. This is due to the lack of objectivity of the eigenstates of the measuring apparatus. Such objectivity can be achieved if a unified realist conceptual framework can be formulated in terms of wave functions and operators acting on them for both the quantum and classical domains. Such a framework was proposed, together with an equation for the wave function (13, 14) that smoothly interpolates between the quantum and classical limits. The arguments leading to the equation are clarified in this paper, and the theory is developed further. The measurement problem in quantum mechanics is then briefly reviewed and re-examined from the point of view of this theory, and it is shown how the classical limit of the wave function of the measuring apparatus leads to a natural solution of the problem of definite measurement outcomes without the need for either collapse or pragmatic thermodynamic arguments. This is consistent with Bohr’s emphasis on the primacy of classical concepts and classical measuring devices. Possible tests of the theory using low-dimensional systems such as quantum dots are indicated.

Authors: Jonas Maziero

The existence of incompatible observables constitutes one of the most prominent characteristics of quantum mechanics (QM) and can be revealed and formalized through uncertainty relations. The Heisenberg-Robertson-Schrödinger uncertainty relation (HRSUR) was proved at the dawn of the quantum formalism and is ever-present in the teaching of and research on QM. Notwithstanding, the HRSUR possesses the so-called triviality problem. That is to say, the HRSUR yields no information about the possible incompatibility between two observables if the system was prepared in a state which is an eigenvector of one of them. After about 85 years of the HRSUR’s existence, this problem was recently solved by Lorenzo Maccone and Arun K. Pati. In this article, we begin with a brief discussion of general aspects of the uncertainty principle in QM and a recapitulation of the proof of the HRSUR. Afterwards we present in simple terms the proof of the Maccone-Pati uncertainty relation, which can be obtained essentially by applying the parallelogram law and the Cauchy-Schwarz inequality.
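For orientation, the stronger relation summarised above can be stated compactly. The following is a sketch from memory of the published Maccone-Pati result, not a quotation from the article:

```latex
% Maccone-Pati uncertainty relation for observables A, B and state |psi>.
% |psi_perp> is any state orthogonal to |psi>; the sign is chosen so that
% the commutator term is non-negative.
\Delta A^{2} + \Delta B^{2} \;\ge\;
  \pm\, i\,\langle\psi|[A,B]|\psi\rangle
  \;+\; \bigl|\langle\psi^{\perp}|\,(A \pm iB)\,|\psi\rangle\bigr|^{2}
```

Unlike the HRSUR, whose right-hand side vanishes when the state is an eigenvector of one of the observables, this bound can remain nontrivial in that case for suitable choices of the orthogonal state, which is what resolves the triviality problem.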

Authors: Lee Smolin

Some reflections are presented on the state of the search for a quantum theory of gravity. I discuss diverse regimes of possible quantum gravitational phenomena, some well explored, some novel.

Authors: Cesar A. Aguillón, Albert Much, Marcos Rosenbaum, J. David Vergara

We investigate a quantum geometric space in the context of what could be considered an emerging effective theory from Quantum Gravity. Specifically we consider a two-parameter class of twisted Poincaré algebras, from which Lie-algebraic noncommutativities of the translations are derived as well as associative star-products, deformed Riemannian geometries, Lie-algebraic twisted Minkowski spaces and quantum effects that arise as noncommutativities. Starting from a universal differential algebra of forms based on the above-mentioned Lie-algebraic noncommutativities of the translations, we construct the noncommutative differential forms and Inner and Outer derivations, which are the noncommutative equivalents of the vector fields in the case of commutative differential geometry. Having established the essentials of this formalism, we construct a bimodule, required to be central under the action of the Inner derivations in order to have well-defined contractions, and from which the algebraic dependence of its coefficients is derived. This then defines the noncommutative equivalent of the geometrical line-element in commutative differential geometry. We stress, however, that even though the components of the twisted metric are by construction symmetric in their algebra valuation, this is not so for their inverse, and thus to construct it we made use of Gelfand’s theory of quasi-determinants, which is conceptually straightforward but computationally becomes quite complicated beyond an algebra of 3 generators. The consequences of the noncommutativity of the Lie-algebra twisted geometry are further discussed.

Authors: Muxin Han

In this paper we explain how 4-dimensional general relativity, and in particular the Einstein equation, emerges from the spinfoam amplitude in loop quantum gravity. We propose a new limit which couples both the semiclassical limit and continuum limit of spinfoam amplitudes. The continuum Einstein equation emerges in this limit. Solutions of the Einstein equation can be approached by dominant configurations in spinfoam amplitudes. A running scale is naturally associated to the sequence of refined triangulations. The continuum limit corresponds to the infrared limit of the running scale. An important ingredient in the derivation is a regularization for the sum over spins, which is necessary for the semiclassical continuum limit. We also explain in this paper the role played by the so-called flatness in the spinfoam formulation, and how to take advantage of it.

Redhead, Michael (2017) The Relativistic Einstein-Podolsky-Rosen Argument. [Preprint]
Parker, Matthew W. (2012) More Trouble for Regular Probabilities. [Preprint]

Author(s): Diego A. Alcala, Joseph A. Glick, and Lincoln D. Carr

Tunneling of a quasibound state is a nonsmooth process in the entangled many-body case. Using time-evolving block decimation, we show that repulsive (attractive) interactions speed up (slow down) tunneling. While the escape time scales exponentially with small interactions, the maximization time of …
[Phys. Rev. Lett. 118, 210403] Published Thu May 25, 2017

Author(s): Philip Pearle and Anthony Rizzi

Following semiclassical arguments by Vaidman [Phys. Rev. A 86, 040101(R) (2012)], we show that the phase shifts arising in the Aharonov-Bohm (AB) magnetic or electric effects can be treated as due to the electric force of a classical electron, respectively acting on quantized solenoid particles or q…
[Phys. Rev. A 95, 052123] Published Thu May 25, 2017

Author(s): Philip Pearle and Anthony Rizzi

We give a complete quantum analysis of the Aharonov-Bohm (AB) magnetic phase shift involving three entities: the electron, the charges constituting the solenoid current, and the vector potential. The usual calculation supposes that the solenoid’s vector potential may be well approximated as classica…
[Phys. Rev. A 95, 052124] Published Thu May 25, 2017

Authors: D. Sokolovski, E. Akhmatkaya

We explain the properties and clarify the meaning of quantum weak values using only the basic notions of elementary quantum mechanics.

Abstract

Two approaches to understanding the idealizations that arise in the Aharonov–Bohm (AB) effect are presented. It is argued that a common topological approach, which takes the non-simply connected electron configuration space to be an essential element in the explanation and understanding of the effect, is flawed. An alternative approach is outlined. In the process, it is shown that the existence and uniqueness of self-adjoint extensions of symmetric operators in quantum mechanics have important implications for philosophical issues. Also, the alleged indispensable explanatory role of said idealizations is examined via a minimal model explanatory scheme. Lastly, the idealizations involved in the AB effect are placed in a wider philosophical context via a short survey of part of the literature on infinite and essential idealizations.

Authors: Federico Laudisa

I purport to show why old and new claims on the role of counterfactual reasoning for the EPR argument and the Bell theorem are unjustified: once the logical relation between locality and counterfactual reasoning is clarified, the use of the latter does no harm and the nonlocality result can well follow from the EPR premises. To show why, I critically review (i) incompleteness arguments that Einstein developed before the EPR paper, and (ii) more recent claims that equate the use of counterfactual reasoning with the assumption of a strong form of realism.

Authors: Gijs Leegwater

Recently, Roger Colbeck and Renato Renner (C&R) have claimed that ‘[n]o extension of quantum theory can have improved predictive power’. If correct, this is a spectacular impossibility theorem for hidden variable theories, which is more general than the theorems of Bell and Leggett. C&R’s claim essentially means that in any hidden variable theory that is compatible with quantum-mechanical predictions, probabilities of measurement outcomes are independent of these hidden variables. On closer inspection, however, the generality and validity of the claim can be contested. First, it is based on an assumption called ‘Freedom of Choice’. As the name suggests, this assumption involves the independence of an experimenter’s choice of measurement settings. But in the way C&R define this assumption, a no-signalling condition is surreptitiously presupposed, making the assumption less innocent than it sounds. When using this definition, any hidden variable theory violating Parameter Independence, such as Bohmian Mechanics, is immediately shown to be incompatible with quantum-mechanical predictions. Also, the argument of C&R is hard to follow and their mathematical derivation contains several gaps, some of which cannot be closed in the way they suggest. We shall show that these gaps can be filled. The issue with the ‘Freedom of Choice’ assumption can be circumvented by explicitly assuming Parameter Independence. This makes the result less general, but better founded. We then obtain an impossibility theorem for hidden variable theories satisfying Parameter Independence only. So, while quantum mechanics itself satisfies Parameter Independence, if a variable is added that changes the outcome probabilities, however slightly, Parameter Independence must be violated.

Authors: Jonathan G. Richens, John H. Selby, Sabri W. Al-Safi

One of the most striking features of quantum theory is the existence of entangled states, responsible for Einstein’s so-called “spooky action at a distance”. These states emerge from the mathematical formalism of quantum theory, but to date we do not have a clear idea of the physical principles that give rise to entanglement. Why does nature have entangled states? Would any theory superseding classical theory have entangled states, or is quantum theory special? One important feature of quantum theory is that it has a classical limit, recovering classical theory through the process of decoherence. We show that any theory with a classical limit must contain entangled states, thus establishing entanglement as an inevitable feature of any theory superseding classical theory.

Authors: Samson Abramsky, Rui Soares Barbosa, Shane Mansfield

We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e. tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotone with respect to the “free” operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
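The linear-programming computation mentioned in the abstract can be illustrated concretely in the simplest (2,2,2) Bell scenario. The sketch below is an assumption-laden reconstruction, not the authors’ code: the noncontextual fraction is taken to be the largest total weight of a subnormalised mixture of deterministic global assignments fitting under the empirical model, and the contextual fraction is one minus that weight. It requires numpy and scipy.

```python
# Sketch: contextual fraction of a (2,2,2) empirical model via an LP,
# assuming the standard formulation (maximise the weight of a
# noncontextual part fitting under the model). Needs numpy and scipy.
import numpy as np
from scipy.optimize import linprog


def contextual_fraction(e):
    """e[x, y, a, b] = p(a, b | x, y) over the four CHSH contexts."""
    # Deterministic global assignments s = (a0, a1, b0, b1): 16 in total.
    # M[(x,y,a,b), s] = 1 iff assignment s produces outcome (a, b) in
    # context (x, y).
    M = np.zeros((16, 16))
    for col, (a0, a1, b0, b1) in enumerate(np.ndindex(2, 2, 2, 2)):
        for x in range(2):
            for y in range(2):
                a, b = (a0, a1)[x], (b0, b1)[y]
                M[((x * 2 + y) * 2 + a) * 2 + b, col] = 1.0
    # Maximise 1.x  subject to  M x <= e (componentwise),  x >= 0.
    res = linprog(-np.ones(16), A_ub=M, b_ub=e.reshape(16),
                  bounds=[(0, None)] * 16, method="highs")
    return 1.0 + res.fun  # = 1 - (noncontextual fraction)


# PR box: maximally contextual, contextual fraction 1.
pr = np.zeros((2, 2, 2, 2))
for x, y, a, b in np.ndindex(2, 2, 2, 2):
    pr[x, y, a, b] = 0.5 if (a ^ b) == (x & y) else 0.0

# Tsirelson-bound quantum correlations: contextual fraction sqrt(2) - 1,
# matching the maximal normalised CHSH violation (2*sqrt(2) - 2) / 2.
q = np.zeros((2, 2, 2, 2))
for x, y, a, b in np.ndindex(2, 2, 2, 2):
    q[x, y, a, b] = (1 + (-1) ** (a + b + x * y) / np.sqrt(2)) / 4

print(round(contextual_fraction(pr), 6))  # 1.0
print(round(contextual_fraction(q), 6))   # ~0.414214
```

The dual of this LP yields the witnessing Bell-type inequality referred to in the abstract, which is why the contextual fraction relates so directly to normalised Bell-inequality violations.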

Authors: Sebastian Deffner, Steve Campbell

One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this Topical Review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields — including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this Topical Review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.
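The two milestone bounds named in the abstract have simple closed forms, quoted here for orientation as the standard textbook statements (with τ the minimal time for a system with time-independent Hamiltonian H to evolve between orthogonal states):

```latex
% Mandelstam-Tamm: bound set by the energy variance
\tau \;\ge\; \frac{\pi\hbar}{2\,\Delta E},
\qquad \Delta E = \sqrt{\langle H^{2}\rangle - \langle H\rangle^{2}}

% Margolus-Levitin: bound set by the mean energy above the ground state
\tau \;\ge\; \frac{\pi\hbar}{2\,\langle H\rangle},
\qquad \text{(ground-state energy set to zero)}
```

The unified quantum speed limit is the larger of the two bounds, and which one dominates depends on the state under consideration.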

Determining the state of a quantum system is a resource-consuming procedure. For this reason, whenever one is interested only in some particular property of a state, it would be desirable to design a measurement set-up that reveals this property with as little effort as possible. Here, we investigate whether, in order to successfully complete a given task of this kind, one needs an informationally complete measurement, or if something less demanding would suffice. The first alternative means that in order to complete the task, one needs a measurement which fully determines the state. We formulate the task as a membership problem related to a partitioning of the quantum state space and, in doing so, connect it to the geometry of the state space. For a general membership problem, we prove various sufficient criteria that force informational completeness, and we explicitly treat several physically relevant examples. For the specific cases that do not require informational completeness, we also determine bounds on the minimal number of measurement outcomes needed to ensure success in the task.

Abstract

It is shown that quantum mechanics is a plausible statistical description of an ontology described by classical electrodynamics. The reason that no contradiction arises with various no-go theorems regarding the compatibility of QM with a classical ontology can be traced to the fact that classical electrodynamics of interacting particles has never been given a consistent definition. Once this is done, our conjecture follows rather naturally, including a purely classical explanation of photon-related phenomena. Our analysis rests entirely on the block-universe view entailed by relativity theory.

Manchak, John Byron (2017) Malament-Hogarth Machines. [Preprint]

Author(s): T. H. Hansson, M. Hermanns, S. H. Simon, and S. F. Viefers

The quantum Hall effects by now are recognized as prime examples of the importance of topological considerations in condensed-matter physics. The fractional quantum Hall effect in particular has proven to display a large number of topologically ordered states that have been classified and understood in terms of hierarchical schemes. This review explains the current understanding of such classifications, with particular emphasis on conformal-field-theory approaches.


[Rev. Mod. Phys. 89, 025005] Published Tue May 23, 2017

Author(s): Charis Anastopoulos and Ntina Savvidou

Attempts to find a quantum-to-classical correspondence in a classically forbidden region lead to nonphysical paths involving, for example, complex time or spatial coordinates. Here, we identify genuine quasiclassical paths for tunneling in terms of probabilistic correlations in sequential time-of-a…
[Phys. Rev. A 95, 052120] Published Tue May 23, 2017

Authors: Tejinder P. Singh

We have recently proposed a new action principle for combining the Einstein equations and the Dirac equation for a point mass. We used a length scale $L_{CS}$, dubbed the Compton-Schwarzschild length, of which the Compton wavelength and Schwarzschild radius are small mass and large mass approximations, respectively. Here we write down the field equations which follow from this action. We argue that the large mass limit yields the Einstein equations, provided we assume wave function collapse and localisation for large masses. The small mass limit yields the Dirac equation. We explain why the Kerr-Newman black hole has the same gyromagnetic ratio as the Dirac electron, both being twice the classical value. The small mass limit also provides compelling reasons for introducing torsion, which is sourced by the spin density of the Dirac field. There is thus a symmetry between torsion and gravity: torsion couples to quantum objects through Planck’s constant $\hbar$ (but not $G$) and is important in the microscopic limit, whereas gravity couples to classical matter, as usual, through Newton’s gravitational constant $G$ (but not $\hbar$) and is important in the macroscopic limit. We construct the Einstein-Cartan-Dirac equations which include the length $L_{CS}$. We find a potentially significant change in the coupling constant of the torsion-driven cubic nonlinear self-interaction term in the Dirac-Hehl-Datta equation. We speculate on the possibility that gravity is not a fundamental interaction, but emerges as a consequence of wave function collapse, and that the gravitational constant may be expressible in terms of Planck’s constant and the parameters of dynamical collapse models.

Authors: Nirmalya Kajuri

In the canonical approach to quantization of gravity, one often uses relational clock variables and an interpretation in terms of conditional probabilities to overcome the problem of time. In this essay we show that these suffer from serious conceptual issues.

Authors: M. Skotiniotis, W. Dür, P. Sekatski

We consider fundamental limits on the detectable size of macroscopic quantum superpositions. We argue that a full quantum mechanical treatment of system plus measurement device is required, and that a (classical) reference frame for phase or direction needs to be established to certify the quantum state. When taking the size of such a classical reference frame into account, we show that to reliably distinguish a quantum superposition state from an incoherent mixture requires a measurement device that is quadratically bigger than the superposition state. Whereas for moderate system sizes such as generated in previous experiments this is not a stringent restriction, for macroscopic superpositions of the size of a cat the required effort quickly becomes intractable, requiring measurement devices of the size of the Earth. We illustrate our results using macroscopic superposition states of photons, spins, and position. Finally, we also show how this limitation can be circumvented by dealing with superpositions in relative degrees of freedom.

Authors: K. Wiesner

An imperative aspect of modern science is that scientific institutions act for the benefit of a common scientific enterprise, rather than for the personal gain of individuals within them. This implies that science should not perpetuate existing or historical unequal social orders. Some scientific terminology, though, gives a very different impression. I will give two examples of terminology invented recently for the field of quantum information which use language associated with subordination, slavery, and racial segregation: ‘ancilla qubit’ and ‘quantum supremacy’.

Authors: Karen Crowther, Niels Linnemann

Principles are central to physical reasoning, particularly in the search for a theory of quantum gravity (QG), where novel empirical data is lacking. One principle widely adopted in the search for QG is UV completion: the idea that a theory should (formally) hold up to all possible high energies. We argue, contra standard scientific practice, that UV completion is poorly motivated as a guiding principle in theory construction, and cannot be used as a criterion of theory justification in the search for QG. To this end, we explore the reasons for expecting, or desiring, a UV-complete theory, and analyse how UV completion is used, and how it should be used, in various specific approaches to QG.

Volume 188, 2017

Open Access

Foundations of Quantum Theory

From Classical Concepts to Operator Algebras

ISBN: 978-3-319-51776-6 (Print) 978-3-319-51777-3 (Online)
