Latest Papers on Quantum Foundations - Updated Daily by IJQF

Abstract

My central thesis is that presentism is incompatible with all of the main theories of persistence: endurance, exdurance (stage theory) and perdurance.

Authors: B. J. Dalton

The categorization of quantum states of composite systems as either separable or entangled, or alternatively as Bell local or Bell non-local according to local hidden variable theory, is reviewed in Sections 1 and 2, focusing on simple bipartite systems. The significance of states demonstrating Bell non-locality for settling the long-standing controversy between the Copenhagen interpretation of the quantum measurement process, involving the collapse of the wave-function, and the alternative interpretation based on pre-existing hidden variables is emphasized. Although experiments demonstrating violations of Bell locality in microscopic systems have now been carried out (see Section 3), there is current interest in finding Bell non-locality in quantum systems on a macroscopic scale, since this is a regime where a classical hidden variable theory might still apply. Progress towards finding macroscopic quantum states that violate Bell inequalities is reviewed in Section 4.

A new test for Bell non-locality that applies when the measured sub-system quantities are spin components with large outcomes is described and applied to four-mode systems of identical massive bosons in Bose-Einstein condensates.

Authors: Bhavya Bhatt, Manish, Raj Patil, Ruchira Mishra, Shlok Nahar, Tejinder P. Singh

We recall that in order to obtain the classical limit of quantum mechanics one needs to take the $\hbar\rightarrow 0$ limit. In addition, one also needs an explanation for the absence of macroscopic quantum superposition of position states. One possible explanation for the latter is the Ghirardi-Rimini-Weber (GRW) model of spontaneous localisation. Here we describe how spontaneous localisation modifies the path integral formulation of density matrix evolution in quantum mechanics. (Such a formulation has been derived earlier by Pearle and Soucek; we provide two new derivations of their result). We then show how the von Neumann equation and the Liouville equation for the density matrix arise in the quantum and classical limit, respectively, from the GRW path integral. Thus we provide a rigorous demonstration of the quantum to classical transition.
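For reference, the two limiting equations mentioned here have their standard textbook forms (notation ours, not the paper's): the von Neumann equation in the quantum limit, and the Liouville equation, with the Poisson bracket $\{\cdot,\cdot\}$, in the classical limit:

$$ i\hbar\,\partial_t \rho = [H,\rho], \qquad \partial_t \rho_{\mathrm{cl}} = \{H, \rho_{\mathrm{cl}}\}, $$

the second arising from the first via the usual correspondence $[H,\rho]/i\hbar \to \{H,\rho\}$ as $\hbar \rightarrow 0$.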

Authors: Diederik Aerts, Massimiliano Sassoli de Bianchi, Sandro Sozzo, Tomas Veloz

We show that the Brussels operational-realistic approach to quantum physics and quantum cognition offers a fundamental strategy for modeling the meaning associated with collections of documental entities. To do so, we take the World Wide Web as a paradigmatic example and emphasize the importance of distinguishing the Web, made of printed documents, from a more abstract meaning entity, which we call the Quantum Web, or QWeb, where the former is considered to be the collection of traces that can be left by the latter, in specific measurements, similarly to how a non-spatial quantum entity, like an electron, can leave localized traces of impact on a detection screen. The double-slit experiment is extensively used to illustrate the rationale of the modeling, which is guided by how physicists constructed quantum theory to describe the behavior of the microscopic entities. We also emphasize that the superposition principle and the associated interference effects are not sufficient to model all experimental probabilistic data, like those obtained by counting the relative number of documents containing certain words and co-occurrences of words. For this, additional effects, like context effects, must also be taken into consideration.

Authors: Andrei Khrennikov

We emphasize the role of the precise-correlations loophole in attempts to connect CHSH-type inequalities with the EPR argument. The possibility of experimentally testing hidden-variable theories by using such inequalities is questioned. The role of the original Bell inequality is highlighted. An interpretation of the CHSH inequality in the spirit of Bohr, as a new test of incompatibility, is presented. The positions of Bohr, Einstein, Podolsky, Rosen, Bell, Clauser, Horne, Shimony, Holt, De Broglie, Hertz, and Boltzmann on the interrelation of theory and experiment are discussed.
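For orientation, the two inequalities contrasted here have the standard forms below, with $E(a,b)$ the correlation of outcomes for settings $a$ and $b$; the original Bell inequality additionally assumes perfect anticorrelation for equal settings, which is the "precise correlations" assumption at issue:

$$ |E(a,b) + E(a,b') + E(a',b) - E(a',b')| \le 2 \quad \text{(CHSH)}, $$

$$ |E(a,b) - E(a,c)| \le 1 + E(b,c) \quad \text{(original Bell inequality)}. $$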

To reach quantum supremacy, a quantum computer has to do a task no ordinary computer can. Google has made that harder with an algorithm that beefs up regular PCs.

Quantum 2, 81 (2018).

https://doi.org/10.22331/q-2018-08-13-81

We provide a fine-grained definition for a monogamous measure of entanglement that does not invoke any particular monogamy relation. Our definition is given in terms of an equality, as opposed to an inequality, that we call the "disentangling condition". We relate our definition to the more traditional one by showing that it generates standard monogamy relations. We then show that all quantum Markov states satisfy the disentangling condition for any entanglement monotone. In addition, we demonstrate that entanglement monotones that are given in terms of a convex roof extension are monogamous if they are monogamous on pure states, and show that for any quantum state that satisfies the disentangling condition, its entanglement of formation equals the entanglement of assistance. We characterize all bipartite mixed states with this property, and use it to show that the G-concurrence is monogamous. In the case of two qubits, we show that the equality between entanglement of formation and assistance holds if and only if the state is a rank-2 bipartite state that can be expressed as the marginal of a pure 3-qubit state in the W class.
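As context, a traditional monogamy relation is an inequality of the form below, of which the Coffman-Kundu-Wootters relation for the squared concurrence is the best-known instance (the paper's "disentangling condition" is an equality specific to the paper and is not reproduced here):

$$ E\!\left(\rho_{A|BC}\right) \ge E\!\left(\rho_{AB}\right) + E\!\left(\rho_{AC}\right), \qquad C^2_{A|BC} \ge C^2_{AB} + C^2_{AC}. $$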


Author(s): Ulf Leonhardt, Itay Griniasty, Sander Wildeman, Emmanuel Fort, and Mathias Fink

In the Unruh effect an observer with constant acceleration perceives the quantum vacuum as thermal radiation. The Unruh effect has been believed to be a pure quantum phenomenon, but here we show theoretically how the effect arises from the correlation of noise, regardless of whether this noise is qu...


[Phys. Rev. A 98, 022118] Published Mon Aug 13, 2018

Quantum formulation of the Einstein equivalence principle

Quantum formulation of the Einstein equivalence principle, Published online: 13 August 2018; doi:10.1038/s41567-018-0197-6

The physical conditions that support a geometric interpretation of spacetime, such as the equivalence between rest and inertial mass, are shown not to be necessarily valid in the quantum regime, and a quantum formulation is provided.

In this paper I address the question of whether the incompleteness theorems imply that “the mind cannot be mechanized,” where this is understood in the specific sense that “the mathematical outputs of the idealized human mind do not coincide with the mathematical outputs of any idealized finite machine.” Gödel argued that his incompleteness theorems implied a weaker, disjunctive conclusion to the effect that either “the mind cannot be mechanized” or “mathematical truth outstrips the idealized human mind.” Others, most notably, Lucas and Penrose, have claimed more—they have claimed that the incompleteness theorems actually imply the first disjunct. I will show that by sharpening the fundamental concepts involved and articulating the background assumptions governing them, one can prove Gödel’s disjunction, one can show (by invoking results of Reinhardt and Carlson) that the arguments of Lucas and Penrose fail, and one can see what likely led proponents of the first disjunct astray.

Read, James and Le Bihan, Baptiste (2018) Duality and ontology. [Preprint]

Publication date: Available online 1 August 2018

Source: Physics Reports

Author(s): Li Li, Michael J.W. Hall, Howard M. Wiseman

Abstract

Markovian approximation is a widely-employed idea in descriptions of the dynamics of open quantum systems (OQSs). Although it is usually claimed to be a concept inspired by classical Markovianity, the term quantum Markovianity is used inconsistently and often unrigorously in the literature. In this report we compare the descriptions of classical stochastic processes and quantum stochastic processes (as arising in OQSs), and show that there are inherent differences that lead to the non-trivial problem of characterizing quantum non-Markovianity. Rather than proposing a single definition of quantum Markovianity, we study a host of Markov-related concepts in the quantum regime. Some of these concepts have long been used in quantum theory, such as quantum white noise, factorization approximation, divisibility, and GKS–Lindblad master equation. Others are first proposed in this report, including those we call past–future independence, no (quantum) information backflow, and composability. All of these concepts are defined under a unified framework, which allows us to rigorously build hierarchy relations among them. With various examples, we argue that the current most often used definitions of quantum Markovianity in the literature do not fully capture the memoryless property of OQSs. In fact, quantum non-Markovianity is highly context-dependent. The results in this report, summarized as a hierarchy figure, bring clarity to the nature of quantum non-Markovianity.
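Of the concepts listed, the GKS–Lindblad master equation is the one with a canonical closed form; in standard notation, with jump operators $L_k$ and non-negative rates $\gamma_k$:

$$ \dot{\rho} = -\frac{i}{\hbar}\,[H,\rho] + \sum_k \gamma_k\!\left( L_k \rho L_k^\dagger - \tfrac{1}{2}\left\{ L_k^\dagger L_k,\, \rho \right\} \right). $$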

Saunders, Simon (2018) The Gibbs Paradox. Entropy, 20 (8). p. 552.

Two slits and one hell of a quantum conundrum

Two slits and one hell of a quantum conundrum, Published online: 07 August 2018; doi:10.1038/d41586-018-05892-6

Philip Ball lauds a study of a famous experiment and the insights it offers into a thoroughly maddening theory.

Quantum 2, 79 (2018).

https://doi.org/10.22331/q-2018-08-06-79

Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications, but the 100-qubit quantum computer will not change the world right away - we should regard it as a significant step toward the more powerful quantum technologies of the future. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing.


Curiel, Erik (2018) What Is a Black Hole? [Preprint]
Hartenstein, Vera and Hubert, Mario (2018) When Fields Are Not Degrees of Freedom. [Preprint]
McCoy, C.D. (2018) Stability in Cosmology, from Einstein to Inflation. [Preprint]
Read, James and Teh, Nicholas (2018) The Teleparallel Equivalent of Newton-Cartan Gravity. [Preprint]

Abstract

The existence of spacetime singularities is one of the biggest problems of present-day physics. According to Penrose, each physical singularity should be covered by a “cosmic censor” which prevents any external observer from perceiving its existence. However, classical models describing gravitational collapse usually result in strong curvature singularities, which can also remain “naked” for a finite amount of advanced time. This proceedings contribution studies the modifications induced by asymptotically safe gravity on the gravitational collapse of generic Vaidya spacetimes. It will be shown that, for any possible choice of the mass function, quantum gravity makes the internal singularity gravitationally weak, thus allowing a continuous extension of the spacetime beyond the singularity.

Colombo, Matteo and Elkin, Lee and Hartmann, Stephan (2018) Being Realist about Bayes, and the Predictive Processing Theory of Mind. [Preprint]

Publication date: Available online 27 July 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Armond Duwell

Abstract

In this paper I examine the extent to which recent work in information-theoretic foundations of quantum mechanics can be thought to facilitate understanding, either of quantum phenomena or quantum theory. To do so I utilize the modal view of understanding phenomena. I extend this view to develop an analysis of understanding of theories. The extended modal view of understanding provides a unified view of recent work in information-theoretic foundations of quantum mechanics and explains how it facilitates understanding.

Publication date: Available online 24 July 2018

Source: Physics Letters A

Author(s): Bruno G. da Costa, Ignacio S. Gomez

Abstract

We discuss the Bohmian mechanics using a deformed Schrödinger equation for position-dependent mass systems, in the context of a q-algebra inspired by the nonextensive statistical mechanics. We obtain the Bohmian quantum formalism by means of a deformed version of the Fisher information functional, from which a deformed Cramér–Rao bound is derived. Lagrangian and Hamiltonian formulations, inherited by the q-algebra, are also developed. Then, we illustrate the results with a particle confined in an infinite square potential well. The preservation of the deformed Cramér–Rao bound for eigenstates shows the role played by the q-algebraic structure.
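For orientation, the undeformed ($q \to 1$) objects that the deformed versions generalize are the classical Fisher information of the position distribution $p(x)$ and the associated Cramér–Rao bound (the paper's deformed expressions are specific to its q-algebra and are not reproduced here):

$$ F[p] = \int dx\, \frac{\left(\partial_x p(x)\right)^2}{p(x)}, \qquad \mathrm{Var}(\hat{x}) \ge \frac{1}{F}. $$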

Abstract

Ever since the early days of quantum mechanics it has been suggested that consciousness could be linked to the collapse of the wave function. However, no detailed account of such an interplay is usually provided. In this paper we present an objective collapse model (a variation of the Continuous Spontaneous Localization model) where the collapse operator depends on integrated information, which has been argued to measure consciousness. By doing so, we construct an empirically adequate scheme in which superpositions of conscious states are dynamically suppressed. Unlike other proposals in which “consciousness causes the collapse of the wave function,” our model is fully consistent with a materialistic view of the world and does not require the postulation of entities suspected of lying outside the quantum realm.

Feintzeig, Benjamin (2018) The Status of Scaling Limits as Approximations in Quantum Theories. [Preprint]
Gilead, Amihud (2018) Classical Physics and the Actualization of Quantum Pure Possibilities. [Preprint]

When Sherlock Holmes enters the quantum realm

When Sherlock Holmes enters the quantum realm, Published online: 26 July 2018; doi:10.1038/d41586-018-05786-7

Algorithms that pit fraudster against detective find a home in quantum computers.

The theoretical physicist John Wheeler once used the phrase “great smoky dragon” to describe a particle of light going from a source to a photon counter. “The mouth of the dragon is sharp, where it bites the counter. The tail of the dragon is sharp, where the photon starts,” Wheeler wrote. The photon, in other words, has definite reality at the beginning and end. But its state in the middle — the dragon’s body — is nebulous. “What the dragon does or looks like in between we have no right to speak.”

Wheeler was espousing the view that elementary quantum phenomena are not real until observed, a philosophical position called anti-realism. He even designed an experiment to show that if you hold on to realism — in which quantum objects such as photons always have definite, intrinsic properties, a position that encapsulates a more classical view of reality — then you are forced to concede that the future can influence the past. Given the absurdity of backward time travel, Wheeler’s experiment became an argument for anti-realism at the level of the quantum.

But in May, Rafael Chaves and colleagues at the International Institute of Physics in Natal, Brazil, found a loophole. They showed that Wheeler’s experiment, given certain assumptions, can be explained using a classical model that attributes to a photon an intrinsic nature. They gave the dragon a well-defined body, but one that is hidden from the mathematical formalism of standard quantum mechanics.

Chaves’s team then proposed a twist to Wheeler’s experiment to test the loophole. With unusual alacrity, three teams raced to do the modified experiment. Their results, reported in early June, have shown that a class of classical models that advocate realism cannot make sense of the results. Quantum mechanics may be weird, but it’s still, oddly, the simplest explanation around.

Dragon Trap

Wheeler devised his experiment in 1983 to highlight one of the dominant conceptual conundrums in quantum mechanics: wave-particle duality. Quantum objects seem to act either like particles or waves, but never both at the same time. This feature of quantum mechanics seems to imply that objects have no inherent reality until observed. “Physicists have had to grapple with wave-particle duality as an essential, strange feature of quantum theory for a century,” said David Kaiser, a physicist and historian of science at the Massachusetts Institute of Technology. “The idea pre-dates other quintessentially strange features of quantum theory, such as Heisenberg’s uncertainty principle and Schrödinger’s cat.”

The phenomenon is underscored by a special case of the famous double-slit experiment called the Mach-Zehnder interferometer.

In the experiment, a single photon is fired at a half-silvered mirror, or beam splitter. The photon is either reflected or transmitted with equal probability — and thus can take one of two paths. In this case, the photon will take either path 1 or path 2, and then go on to hit either detector D1 or D2 with equal probability. The photon acts like an indivisible whole, showing us its particle-like nature.

But there’s a twist. At the point where path 1 and path 2 cross, one can add a second beam splitter, which changes things. In this setup, quantum mechanics says that the photon seems to take both paths at once, as a wave would. The two waves come back together at the second beam splitter. The experiment can be set up so that the waves combine constructively — peak to peak, trough to trough — only when they move toward D1. The path toward D2, by contrast, represents destructive interference. In such a setup, the photon will always be found at D1 and never at D2. Here, the photon displays its wavelike nature.
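A minimal numerical sketch of these two configurations, assuming the common convention in which reflection at a 50/50 splitter contributes a factor of $i$ (which output port ends up "bright" depends on that labeling choice):

```python
import numpy as np

# 50/50 beam splitter; in this convention reflection picks up a factor of i
BS = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)

photon = np.array([1, 0])  # photon enters through one input port

# One beam splitter only: particle-like statistics
print(np.abs(BS @ photon) ** 2)       # -> [0.5 0.5], each detector fires half the time

# Second beam splitter in place: the two paths interfere
print(np.abs(BS @ BS @ photon) ** 2)  # -> [0. 1.], one detector fires every time
```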

Wheeler’s genius lay in asking: what if we delay the choice of whether to add the second beam splitter? Let’s assume the photon enters the interferometer without the second beam splitter in place. It should act like a particle. One can, however, add the second beam splitter at the very last nanosecond. Both theory and experiment show that the photon, which until then was presumably acting like a particle and would have gone to either D1 or D2, now acts like a wave and goes only to D1. To do so, it had to seemingly be in both paths simultaneously, not one path or the other. In the classical way of thinking, it’s as if the photon went back in time and changed its character from particle to wave.

One way to avoid such retro-causality is to deny the photon any intrinsic reality and argue that the photon becomes real only upon measurement. That way, there is nothing to undo.

Such anti-realism, which is often associated with the Copenhagen interpretation of quantum mechanics, took a theoretical knock with Chaves’s work, at least in the context of this experiment. His team wanted to explain counterintuitive aspects of quantum mechanics using a new set of ideas called causal modeling, which has grown in popularity in the past decade, advocated by computer scientist Judea Pearl and others. Causal modeling involves establishing cause-and-effect relationships between various elements of an experiment. Often when studying correlated events — call them A and B — if one cannot conclusively say that A causes B, or that B causes A, there exists a possibility that a previously unsuspected or “hidden” third event, C, causes both. In such cases, causal modeling can help uncover C.

Chaves and his colleagues Gabriela Lemos and Jacques Pienaar focused on Wheeler’s delayed choice experiment, fully expecting to fail at finding a model with a hidden process that both grants a photon intrinsic reality and also explains its behavior without having to invoke retro-causality. They thought they would prove that the delayed-choice experiment is “super counterintuitive, in the sense that there is no causal model that is able to explain it,” Chaves said.

But they were in for a surprise. The task proved relatively easy. They began by assuming that the photon, immediately after it has crossed the first beam splitter, has an intrinsic state denoted by a “hidden variable.” A hidden variable, in this context, is something that’s absent from standard quantum mechanics but that influences the photon’s behavior in some way. The experimenter then chooses to add or remove the second beam splitter. Causal modeling, which prohibits backward time travel, ensures that the experimenter’s choice cannot influence the past intrinsic state of the photon.

Given the hidden variable, which implies realism, the team then showed that it’s possible to write down rules that use the variable’s value and the presence or absence of the second beam splitter to guide the photon to D1 or D2 in a manner that mimics the predictions of quantum mechanics. Here was a classical, causal, realistic explanation. They had found a new loophole.

This surprised some physicists, said Tim Byrnes, a theoretical quantum physicist at New York University, Shanghai. “What people didn’t really appreciate is that this kind of experiment is susceptible to a classical version that perfectly mimics the experimental results,” Byrnes said. “You could construct a hidden variable theory that didn’t involve quantum mechanics.”

“This was the step zero,” Chaves said. The next step was to figure out how to modify Wheeler’s experiment in such a way that it could distinguish between this classical hidden variable theory and quantum mechanics.

In their modified thought experiment, the full Mach-Zehnder interferometer is intact; the second beam splitter is always present. Instead, two “phase shifts” — one near the beginning of the experiment, one toward the end — serve the role of experimental dials that the researcher can adjust at will.

The net effect of the two phase shifts is to change the relative lengths of the paths. This changes the interference pattern, and with it, the presumed “wavelike” or “particle-like” behavior of the photon. For example, the value of the first phase shift could be such that the photon acts like a particle inside the interferometer, but the second phase shift could force it to act like a wave. The researchers require that the second phase shift is set after the first.
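Extending the same sketch, one can check that the detector statistics depend only on the sum of the two phases, which is why the two dials amount to changing the relative path length (placing both shifts on the same arm is an illustrative assumption):

```python
import numpy as np

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)

def detector_probs(phi1, phi2):
    """Detector probabilities with both beam splitters always in place and
    phase shifts phi1 (early) and phi2 (late) applied to the same arm."""
    shift = lambda phi: np.diag([np.exp(1j * phi), 1.0])
    out = BS @ shift(phi2) @ shift(phi1) @ BS @ np.array([1, 0])
    return np.abs(out) ** 2

# The statistics depend only on phi1 + phi2 (the relative path length):
for phi1, phi2 in [(0.0, 0.0), (np.pi / 2, 0.0), (0.0, np.pi / 2), (np.pi, 0.0)]:
    print(phi1, phi2, detector_probs(phi1, phi2))
```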

With this setup in place, Chaves’s team came up with a way to distinguish between a classical causal model and quantum mechanics. Say the first phase shift can take one of three values, and the second one of two values. That makes six possible experimental settings in total. They calculated what they expected to see for each of these six settings. Here, the predictions of a classical hidden variable model and standard quantum mechanics differ. They then constructed a formula. The formula takes as its input probabilities calculated from the number of times that photons land on particular detectors (based on the setting of the two phase shifts). If the formula equals zero, the classical causal model can explain the statistics. But if the equation spits out a number greater than zero, then, subject to some constraints on the hidden variable, there’s no classical explanation for the experiment’s outcome.

Chaves teamed up with Fabio Sciarrino, a quantum physicist at the University of Rome La Sapienza, and his colleagues to test the inequality. Simultaneously, two teams in China — one led by Jian-Wei Pan, an experimental physicist at the University of Science and Technology of China (USTC) in Hefei, China, and another by Guang-Can Guo, also at USTC — carried out the experiment.

Each team implemented the scheme slightly differently. Guo’s group stuck to the basics, using an actual Mach-Zehnder interferometer. “It is the one that I would say is actually the closest to Wheeler’s original proposal,” said Howard Wiseman, a theoretical physicist at Griffith University in Brisbane, Australia, who was not part of any team.

But all three showed that the formula is greater than zero with irrefutable statistical significance. They ruled out the classical causal models of the kind that can explain Wheeler’s delayed-choice experiment. The loophole has been closed. “Our experiment has salvaged Wheeler’s famous thought experiment,” Pan said.

Hidden Variables That Remain

Kaiser is impressed by Chaves’s “elegant” theoretical work and the experiments that ensued. “The fact that each of the recent experiments has found clear violations of the new inequality … provides compelling evidence that ‘classical’ models of such systems really do not capture how the world works, even as quantum-mechanical predictions match the latest results beautifully,” he said.

The formula comes with certain assumptions. The biggest one is that the classical hidden variable used in the causal model can take one of two values, encoded in one bit of information. Chaves thinks this is reasonable, since the quantum system — the photon — can also only encode one bit of information. (It either goes in one arm of the interferometer or the other.) “It’s very natural to say that the hidden variable model should also have dimension two,” Chaves said.

But a hidden variable with additional information-carrying capacity can restore the classical causal model’s ability to explain the statistics observed in the modified delayed-choice experiment.

In addition, the most popular hidden variable theory remains unaffected by these experiments. The de Broglie-Bohm theory, a deterministic and realistic alternative to standard quantum mechanics, is perfectly capable of explaining the delayed-choice experiment. In this theory, particles always have positions (which are the hidden variables), and hence have objective reality, but they are guided by a wave. So reality is both wave and particle. The wave goes through both paths, the particle through one or the other. The presence or absence of the second beam splitter affects the wave, which then guides the particle to the detectors — with exactly the same results as standard quantum mechanics.
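In the simplest one-particle case, the "guiding" relation mentioned here is the de Broglie-Bohm guidance equation, which fixes the velocity of the particle at its actual (hidden) position $\mathbf{Q}(t)$ in terms of the wave function:

$$ \frac{d\mathbf{Q}}{dt} = \frac{\hbar}{m}\,\operatorname{Im}\frac{\nabla\psi(\mathbf{x},t)}{\psi(\mathbf{x},t)}\bigg|_{\mathbf{x}=\mathbf{Q}(t)}. $$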

For Wiseman, the debate over Copenhagen versus de Broglie-Bohm in the context of the delayed-choice experiment is far from settled. “So in Copenhagen, there is no strange inversion of time precisely because we have no right to say anything about the photon’s past,” he wrote in an email. “In de Broglie-Bohm there is a reality independent of our knowledge, but there is no problem as there is no inversion — there is a unique causal (forward in time) description of everything.”

Kaiser, even as he lauds the efforts so far, wants to take things further. In current experiments, the choice of whether or not to add the second phase shift or the second beam splitter in the classic delayed-choice experiment was being made by a quantum random-number generator. But what’s being tested in these experiments is quantum mechanics itself, so there’s a whiff of circularity. “It would be helpful to check whether the experimental results remain consistent, even under complementary experimental designs that relied on entirely different sources of randomness,” Kaiser said.

To this end, Kaiser and his colleagues have built such a source of randomness using photons coming from distant quasars, some from more than halfway across the universe. The photons were collected with a one-meter telescope at the Table Mountain Observatory in California. If a photon had a wavelength less than a certain threshold value, the random number generator spit out a 0, otherwise a 1. In principle, this bit can be used to randomly choose the experimental settings. If the results continue to support Wheeler’s original argument, then “it gives us yet another reason to say that wave-particle duality is not going to be explained away by some classical physics explanation,” Kaiser said. “The range of conceptual alternatives to quantum mechanics has again been shrunk, been pushed back into a corner. That’s really what we are after.”
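The bit-assignment rule described here is simple enough to sketch directly; the cutoff wavelength below is a placeholder, since the article does not give the experiment's actual threshold:

```python
def quasar_bit(wavelength_nm: float, cutoff_nm: float = 700.0) -> int:
    """Turn a detected quasar photon into a random bit by its wavelength.

    cutoff_nm is a hypothetical placeholder threshold, not the value
    used in the actual experiment.
    """
    return 0 if wavelength_nm < cutoff_nm else 1
```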

For now, the dragon’s body, which for a brief few weeks had come into focus, has gone back to being smoky and indistinct.




Quantum optics without photons

Quantum optics without photons, Published online: 25 July 2018; doi:10.1038/d41586-018-05738-1

Atoms can exhibit wave-like behaviour to form matter waves. Such waves have been used to model the basic processes that underpin how light interacts with matter, providing an experimental platform for future research.
Humans need much more information to study a problem backwards in time than forwards, but a quantum computer can ignore the flow of time altogether.

The dark side of neutrons

The dark side of neutrons, Published online: 24 July 2018; doi:10.1038/s41567-018-0261-2

The agent responsible for the accelerated expansion of the Universe is completely unknown. Delicate interference measurements of the quantum transitions of very slow neutrons bouncing on a flat table have constrained an interesting theoretical possibility.
de Ronde, Christian and Massri, Cesar (2018) The Logos Categorical Approach to Quantum Mechanics: III. Relational Potential Coding and Quantum Entanglement Beyond Collapses, Pure States and Particle Metaphysics. [Preprint]
Hoefer, Carl (2018) "Undermined" Undermined. [Preprint]
Eva, Benjamin and Stern, Reuben and Hartmann, Stephan (2018) The Similarity of Causal Structure. [Preprint]
Vinding, Mikkel C. (2018) Investigating Causal Effects of Mental Events in Cognitive Neuroscience. [Preprint]

Authors: Koji Azuma, Sathyawageeswar Subramanian

About 45 years ago, Bekenstein proposed that black holes should have entropy proportional to their areas in order to make black-hole physics compatible with the second law of thermodynamics. Hawking strengthened this argument by showing that black holes emit thermal radiation, as succinctly encapsulated in the phrase "a 'black hole' is not completely black". However, the heuristic picture of the microscopic process for this Hawking radiation, creation of pairs of positive- and negative-energy particles, leads to inconsistency among the first law for black holes, Bekenstein's argument, and the conservation law for the entropy. Parikh and Wilczek partially improved this consistency by treating Hawking radiation as tunnelling of particles in a dynamical geometry, but at the expense of the pure thermality of the radiation in Hawking's original proposal. Here we present an equation alternative to Bekenstein's, from a viewpoint of quantum information, rather than thermodynamics. Our alternative argues that the area of a black hole is proportional to the coherent information, which is 'minus' conditional entropy, defined only in the quantum regime, from the outside of the black hole to positive-energy particles inside the black hole. Our equation hints that negative-energy particles inside a black hole behave as if they have 'negative' entropy, and provides complete consistency without changing Hawking's original proposal. These ideas suggest that black holes store purely quantum information, rather than classical information.
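For reference, the two sides of this contrast have standard forms: the Bekenstein–Hawking entropy, proportional to the horizon area $A$, and the coherent information, which is minus a conditional von Neumann entropy (the authors' alternative equation is their own result and is not reproduced here):

$$ S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 \hbar G}, \qquad I_c(A\rangle B) = S(\rho_B) - S(\rho_{AB}) = -S(A|B). $$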

Krause, Décio (2018) Quantum Mechanics, Ontology, and Non-Reflexive Logics. [Preprint]
Manchak, JB (2018) General Relativity as a Collection of Collections of Models. [Preprint]

Abstract

While the relation between visualization and scientific understanding has been a topic of long-standing discussion, recent developments in physics have pushed the boundaries of this debate to new and still unexplored realms. For it is claimed that, in certain theories of quantum gravity, spacetime ‘disappears’, and this suggests that one may have sensible physical theories in which spacetime is completely absent. This makes the philosophical question of whether such theories are intelligible even more pressing. And if such theories are intelligible, the question then is how they manage to be so. In this paper, we adapt the contextual theory of scientific understanding, developed by one of us, to fit the novel challenges posed by physical theories without spacetime. We construe understanding as a matter of skill rather than just knowledge. The appeal is thus to understanding, rather than explanation, because we will be concerned with the tools that scientists have at their disposal for understanding these theories. Our central thesis is that such physical theories can provide scientific understanding, and that such understanding does not require spacetimes of any sort. Our argument consists of four consecutive steps: (a) We argue, from the general theory of scientific understanding, that although visualization is an oft-used tool for understanding, it is not a necessary condition for it; (b) we criticise certain metaphysical preconceptions which can stand in the way of recognising how intelligibility without spacetime can be had; (c) we catalogue tools for rendering theories without a spacetime intelligible; and (d) we give examples of cases in which understanding is attained without a spacetime, and explain what kind of understanding these examples provide.

Manchak, JB (2018) On Feyerabend, General Relativity, and 'Unreasonable' Universes. [Preprint]
François, Jordan (2018) Artificial vs Substantial Gauge Symmetries: a Criterion and an Application to the Electroweak Model. [Preprint]
Frigg, Roman and Werndl, Charlotte (2018) Can Somebody Please Say What Gibbsian Statistical Mechanics Says? The British Journal for the Philosophy of Science.
Fortin, Sebastian and Jaimes Arriaga, Jesús Alberto (2018) About the nature of the wave function and its dimensionality: the case of quantum chemistry. [Preprint]
Oriti, Daniele (2018) Levels of spacetime emergence in quantum gravity. [Preprint]
O'Malley, Maureen A. and Parke, Emily C. (2018) Microbes, mathematics, and models. [Preprint]

Abstract

We review some ideas about the quantum physics of black hole information storage and processing in terms of a general phenomenon of quantum criticality.

Publication date: Available online 9 July 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Leah Henderson

Abstract

Jeff Bub has developed an information-theoretic interpretation of quantum mechanics on the basis of the programme to reaxiomatise the theory in terms of information-theoretic principles. According to the most recent version of the interpretation, reaxiomatisation can dissolve some of the demands for explanation traditionally associated with the task of providing an interpretation for the theory. The key idea is that the real lesson we should take away from quantum mechanics is that the ‘structure of information’ is not what we thought it was. In particular a feature of the new structure is intrinsic randomness of measurement, which allegedly dissolves a significant part of the measurement problem. I argue that it is difficult to find an appropriate argument to support the claim that measurement is intrinsically random in the relevant sense.

Abstract

In the asymptotic safety paradigm, a quantum field theory reaches a regime with quantum scale invariance in the ultraviolet, which is described by an interacting fixed point of the Renormalization Group. Compelling hints for the viability of asymptotic safety in quantum gravity exist, mainly obtained from applications of the functional Renormalization Group. The impact of asymptotically safe quantum fluctuations of gravity at and beyond the Planck scale could at the same time induce an ultraviolet completion for the Standard Model of particle physics with high predictive power.

Publication date: February 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 61

Author(s): Silvia De Bianchi, Gabriel Catren

Abstract

This Special Issue Hermann Weyl and the Philosophy of the ‘New Physics’ has two main objectives: first, to shed fresh light on the relevance of Weyl's work for modern physics and, second, to evaluate the importance of Weyl's work and ideas for contemporary philosophy of physics. Regarding the first objective, this Special Issue emphasizes aspects of Weyl's work (e.g. his work on spinors in n dimensions) whose importance has recently been emerging in research fields across both mathematical and experimental physics, as well as in the history and philosophy of physics. Regarding the second objective, this Special Issue addresses the relevance of Weyl's ideas regarding important open problems in the philosophy of physics, such as the problem of characterizing scientific objectivity and the problem of providing a satisfactory interpretation of fundamental symmetries in gauge theories and quantum mechanics. In this Introduction, we sketch the state of the art in Weyl studies and we summarize the content of the contributions to the present volume.

Publication date: Available online 20 January 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Carina E.A. Prunkl, Christopher G. Timpson

Abstract

Recently, Cabello et al. (2016) claim to have proven the existence of an empirically verifiable difference between two broad classes of quantum interpretations. On the basis of three seemingly uncontentious assumptions, (i) the possibility of randomly selected measurements, (ii) the finiteness of a quantum system's memory, and (iii) the validity of Landauer's principle, and further, by applying computational mechanics to quantum processes, the authors arrive at the conclusion that some quantum interpretations (including central realist interpretations) are associated with an excess heat cost and are thereby untenable—or at least—that they can be distinguished empirically from their competitors by measuring the heat produced. Here, we provide an explicit counterexample to this claim and demonstrate that their surprising result can be traced back to a lack of distinction between system and external agent. By drawing the distinction carefully, we show that the resulting heat cost is fully accounted for in the external agent, thereby restoring the tenability of the quantum interpretations in question.

Publication date: February 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 61

Author(s): Iulian D. Toader

Publication date: Available online 5 January 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Simon Friederich

Abstract

The paper has three main aims: first, to make the asymptotic safety-based approach to quantum gravity better known to the community of researchers in the history and philosophy of modern physics by outlining its motivation, core tenets, and achievements so far; second, to preliminarily elucidate the finding that, according to the asymptotic safety scenario, space-time has fractal dimension 2 at short length scales; and, third, to provide the basis for a methodological appraisal of the asymptotic safety-based approach to quantum gravity in the light of the Kuhnian criteria of theory choice.

Publication date: February 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 61

Author(s): Gabriel Catren

Abstract

We distinguish two orientations in Weyl's analysis of the fundamental role played by the notion of symmetry in physics, namely an orientation inspired by Klein's Erlangen program and a phenomenological-transcendental orientation. By privileging the former to the detriment of the latter, we sketch a group(oid)-theoretical program—that we call the Klein-Weyl program—for the interpretation of both gauge theories and quantum mechanics in a single conceptual framework. This program is based on Weyl's notion of a “structure-endowed entity” equipped with a “group of automorphisms”. First, we analyze what Weyl calls the “problem of relativity” in the frameworks provided by special relativity, general relativity, and Yang-Mills theories. We argue that both general relativity and Yang-Mills theories can be understood in terms of a localization of Klein's Erlangen program: while the latter describes the group-theoretical automorphisms of a single structure (such as homogeneous geometries), local gauge symmetries and the corresponding gauge fields (Ehresmann connections) can be naturally understood in terms of the groupoid-theoretical isomorphisms in a family of identical structures. Second, we argue that quantum mechanics can be understood in terms of a linearization of Klein's Erlangen program. This stance leads us to an interpretation of the fact that quantum numbers are “indices characterizing representations of groups” ((Weyl, 1931a), p.xxi) in terms of a correspondence between the ontological categories of identity and determinateness.

Publication date: Available online 1 November 2017

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Dennis Lehmkuhl

Abstract

In this paper I describe the genesis of Einstein's early work on the problem of motion in general relativity (GR): the question of whether the motion of matter subject to gravity can be derived directly from the Einstein field equations. In addressing this question, Einstein himself always preferred the vacuum approach to the problem: the attempt to derive geodesic motion of matter from the vacuum Einstein equations. The paper first investigates why Einstein was so skeptical of the energy-momentum tensor and its role in GR. Drawing on hitherto unknown correspondence between Einstein and George Yuri Rainich, I then show step by step how his work on the vacuum approach came about, and how his quest for a unified field theory informed his interpretation of GR. I show that Einstein saw GR as a hybrid theory from very early on: fundamental and correct as far as gravity was concerned but phenomenological and effective in how it accounted for matter. As a result, Einstein saw energy-momentum tensors and singularities in GR as placeholders for a theory of matter not yet delivered. The reason he preferred singularities was that he hoped that their mathematical treatment would give a hint as to the sought after theory of matter, a theory that would do justice to quantum features of matter.

Publication date: Available online 18 February 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): James Read

Abstract

I consider the interrelations between two decision-theoretic approaches to probability which have been developed in the context of Everettian quantum mechanics: that due to Deutsch and Wallace on the one hand, and that due to Greaves and Myrvold on the other. Having made precise these interrelations, I defend Everettian decision theory against recent objections raised by Dawid and Thébault. Finally, I discuss the import of these results from decision theory for the rationality of an Everettian agent's betting in accordance with the Born rule.

Publication date: Available online 13 October 2017

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Christina Conroy

Abstract

There is an interpretation of Everettian Quantum Mechanics [EQM] known as the Relative Facts Interpretation [RFI], which is a single-world interpretation of EQM and generally takes all facts about objects to be relational. In this paper I argue that, from the perspective of the RFI, the best theory of modality for EQM is actualism rather than some version of modal realism, as has been suggested by Alastair Wilson. To argue this I draw a parallel between actualism as it was developed by Alvin Plantinga and actualism as it can be developed in the context of the RFI of EQM. The contention is not that Plantinga-style actualism is the only way one can be an actualist with respect to EQM, but rather that showing how one can be an actualist in at least one way demonstrates that there are options for a modal metaphysics from the context of EQM.

Publication date: Available online 17 October 2017

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Vincent Ardourel, Alexandre Guay

Abstract

The transference theory reduces causation to the transmission (or regular manifestation) of physical conserved quantities, like energy or momenta. Although this theory aims to apply to all fields of physics, we claim that it fails to account for a quantum electrodynamic effect, viz. the Aharonov-Bohm effect. After arguing that the Aharonov-Bohm effect is a genuine counter-example to the transference theory, we offer a new physicalist approach to causation, ontic and modal, in which this effect is embedded.

Publication date: Available online 21 December 2017

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): R. Hermens, O.J.E. Maroney

Abstract

Macroscopic realism is the thesis that macroscopically observable properties must always have definite values. The idea was introduced by Leggett and Garg (1985), who wished to show a conflict with the predictions of quantum theory by using it to derive an inequality that quantum theory violates. However, Leggett and Garg's analysis required not just the assumption of macroscopic realism per se, but also that the observable properties could be measured non-invasively. In recent years there has been increasing interest in experimental tests of the violation of the Leggett-Garg inequality, but it has remained a matter of controversy whether this second assumption is a reasonable requirement for a macroscopic realist view of quantum theory. In a recent critical assessment Maroney and Timpson (2014) identified three different categories of macroscopic realism, and argued that only the simplest category could be ruled out by Leggett-Garg inequality violations. Allen, Maroney, and Gogioso (2016) then showed that the second of these approaches was also incompatible with quantum theory in Hilbert spaces of dimension 4 or higher. However, we show that the distinction introduced by Maroney and Timpson between the second and third approaches is not noise-tolerant, so unfortunately Allen's result, as given, is not directly empirically testable. In this paper we replace Maroney and Timpson's three categories with a parameterization of macroscopic realist models, which can be related to experimental observations in a noise-tolerant way and which recovers the original definitions in the noise-free limit. We show how this parameterization can be used to experimentally rule out classes of macroscopic realism in Hilbert spaces of dimension 3 or higher, without any use of the non-invasive measurability assumption. Even for relatively low-precision experiments, this will rule out the original category of macroscopic realism tested by the Leggett-Garg inequality, and as the precision of the experiments increases, all cases of the second category and many cases of the third will be ruled out as well.
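For reference, the simplest Leggett-Garg inequality concerns a dichotomic quantity $Q(t_i) = \pm 1$ measured pairwise at three times, with temporal correlators $C_{ij} = \langle Q(t_i)\,Q(t_j)\rangle$:

$$ C_{12} + C_{23} - C_{13} \le 1. $$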

Publication date: Available online 30 December 2017

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Karen Crowther

Abstract

Relationships between current theories, and relationships between current theories and the sought theory of quantum gravity (QG), play an essential role in motivating the need for QG, aiding the search for QG, and defining what would count as QG. Correspondence is the broad class of inter-theory relationships intended to demonstrate the necessary compatibility of two theories whose domains of validity overlap, in the overlap regions. The variety of roles that correspondence plays in the search for QG are illustrated, using examples from specific QG approaches. Reduction is argued to be a special case of correspondence, and to form part of the definition of QG. Finally, the appropriate account of emergence in the context of QG is presented, and compared to conceptions of emergence in the broader philosophy literature. It is argued that, while emergence is likely to hold between QG and general relativity, emergence is not part of the definition of QG, and nor can it serve usefully in the development and justification of the new theory.

Publication date: Available online 18 June 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): David Wallace

Abstract

I give a fairly systematic and thorough presentation of the case for regarding black holes as thermodynamic systems in the fullest sense, aimed at readers with some familiarity with thermodynamics, quantum mechanics and general relativity but not presuming advanced knowledge of quantum gravity. I pay particular attention to (i) the availability in classical black hole thermodynamics of a well-defined notion of adiabatic intervention; (ii) the power of the membrane paradigm to make black hole thermodynamics precise and to extend it to local-equilibrium contexts; (iii) the central role of Hawking radiation in permitting black holes to be in thermal contact with one another; (iv) the wide range of routes by which Hawking radiation can be derived and its back-reaction on the black hole calculated; (v) the interpretation of Hawking radiation close to the black hole as a gravitationally bound thermal atmosphere. In an appendix I discuss recent criticisms of black hole thermodynamics by Dougherty and Callender. This paper confines its attention to the thermodynamics of black holes; a sequel will consider their statistical mechanics.

Publication date: May 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 62

Author(s): Flavio Del Santo

Abstract

I present a reconstruction of Karl Popper's involvement in the community of physicists concerned with the foundations of quantum mechanics in the 1980s. At that time Popper made active contributions to research in physics, the most significant of which is a new version of the EPR thought experiment, claimed to be capable of testing different interpretations of quantum mechanics. The genesis of the experiment is reconstructed in detail, and an unpublished letter by Popper is reproduced in the present paper to show that he had formulated his thought experiment two years before its first publication in 1982. The debate stimulated by the proposed experiment, as well as Popper's role in the physics community throughout the 1980s, is analysed in detail by means of personal correspondence and publications.

Publication date: May 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 62

Author(s): Boris Kožnjak

Abstract

In this paper, I analyze the historical context, the scientific and philosophical content, and the implications of the thus far largely neglected Ninth Symposium of the Colston Research Society, held in Bristol at the beginning of April 1957 as the first major international event after World War II to gather eminent physicists and philosophers to discuss the foundational questions of quantum mechanics. I do so with respect to the early reception of the causal quantum theory program mapped out and defended by David Bohm during the five years preceding the Symposium. As will be demonstrated, contrary to the almost unanimously negative and even hostile reception of Bohm's ideas on hidden variables in the early 1950s, in the close aftermath of the 1957 Colston Research Symposium Bohm's ideas received a more open-minded and ideologically relaxed critical rehabilitation, in which the Symposium itself played a vital and essential part.

Publication date: Available online 7 March 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Bryan W. Roberts

Abstract

How should we characterise the observable aspects of quantum theory? This paper argues that philosophers and physicists should jettison a standard dogma: that observables must be represented by self-adjoint or Hermitian operators. Four classes of non-standard observables are identified: normal operators, symmetric operators, real-spectrum operators, and none of these. The philosophical and physical implications of each are explored.

Publication date: Available online 15 June 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Vincent Lam, Christian Wüthrich

Abstract

Theories of quantum gravity generically presuppose or predict that the reality underlying the relativistic spacetimes they describe is significantly non-spatiotemporal. On pain of empirical incoherence, approaches to quantum gravity must establish how relativistic spacetime emerges from their non-spatiotemporal structures. We argue that in order to secure this emergence, it is sufficient to establish that only those features of relativistic spacetimes that are functionally relevant in producing empirical evidence need be recovered. In order to complete this task, an account must be given of how the more fundamental structures instantiate these functional roles. We illustrate the general idea in the context of causal set theory and loop quantum gravity, two prominent approaches to quantum gravity.

Publication date: May 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 62

Author(s): Philipp Berghofer

Abstract

Ontic structural realism refers to the novel, exciting, and widely discussed basic idea that the structure of physical reality is genuinely relational. In its radical form, the doctrine claims that there are, in fact, no objects but only structure, i.e., relations. More moderate approaches state that objects have only relational but no intrinsic properties. In its most moderate and most tenable form, ontic structural realism assumes that at the most fundamental level of physical reality there are only relational properties. This means that the most fundamental objects only possess relational but no non-reducible intrinsic properties. The present paper will argue that our currently best physics refutes even this most moderate form of ontic structural realism. More precisely, I will claim that 1) according to quantum field theory, the most fundamental objects of matter are quantum fields and not particles, and show that 2) according to the Standard Model, quantum fields have intrinsic non-relational properties.

Publication date: May 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 62

Author(s): Sebastian Fortin, Olimpia Lombardi, Juan Camilo Martínez González

Publication date: Available online 21 April 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Niels S. Linnemann, Manus R. Visser

Abstract

A possible way out of the conundrum of quantum gravity is the proposal that general relativity (GR) emerges from an underlying microscopic description. Despite recent interest in the emergent gravity program within the physics as well as the philosophy community, an assessment of the general motivation for this idea is lacking at the moment. We intend to fill this gap in the literature by discussing the main arguments in favour of the hypothesis that the metric field and its dynamics are emergent. First, we distinguish between microstructure inspired by GR, such as through quantization or discretization, and microstructure that is not directly motivated by GR, such as strings, quantum bits or condensed matter fields. The emergent gravity approach can then be defined as the view that the metric field and its dynamics are derivable from the latter type of microstructure. Subsequently, we assess to what extent the following properties of (semi-classical) GR are suggestive of underlying microstructure: (1) the metric's universal coupling to matter fields, (2) perturbative non-renormalizability, (3) black hole thermodynamics, and (4) the holographic principle. In the conclusion we formalize the general structure of the plausibility arguments put forward.

Publication date: Available online 27 April 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Jeffrey Bub

Abstract

In a recent result, Frauchiger & Renner argue that if quantum theory accurately describes complex systems like observers who perform measurements, then “we are forced to give up the view that there is one single reality.” Following a review of the Frauchiger-Renner argument, I argue that quantum mechanics should be understood probabilistically, as a new sort of non-Boolean probability theory, rather than representationally, as a theory about the elementary constituents of the physical world and how these elements evolve dynamically over time. I show that this way of understanding quantum mechanics is not in conflict with a consistent “single-world” interpretation of the theory.

Publication date: May 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 62

Author(s): Tomasz Bigaj

Abstract

One of the key philosophical questions regarding quantum field theory is whether it should be given a particle or field interpretation. The particle interpretation of QFT is commonly viewed as being undermined by the well-known no-go results, such as the Malament, Reeh-Schlieder and Hegerfeldt theorems. These theorems all focus on the localizability problem within the relativistic framework. In this paper I would like to go back to the basics and ask the simple-minded question of how the notion of quanta appears in the standard procedure of field quantization, starting with the elementary case of a finite number of harmonic oscillators, and proceeding to the more realistic scenario of continuous fields with infinitely many degrees of freedom. I will try to argue that the way the standard formalism introduces the talk of field quanta does not justify treating them as particle-like objects with well-defined properties.
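
For readers who want the textbook construction the abstract alludes to, the standard route from a single oscillator to field quanta can be sketched as follows (reference material, not the paper's own argument):

```latex
% One harmonic oscillator: ladder operators and the number basis
\hat H = \hbar\omega\left(\hat a^\dagger \hat a + \tfrac{1}{2}\right), \qquad
[\hat a, \hat a^\dagger] = 1, \qquad
\hat a^\dagger \lvert n \rangle = \sqrt{n+1}\,\lvert n+1 \rangle .

% Free scalar field: one oscillator per mode k; "quanta" are the excitations
% counted by the number operator \hat n_{\mathbf k} = \hat a_{\mathbf k}^\dagger \hat a_{\mathbf k}
\hat\phi(\mathbf x) = \int \frac{d^3 k}{(2\pi)^3}\,\frac{1}{\sqrt{2\omega_k}}
\left( \hat a_{\mathbf k}\, e^{i \mathbf k \cdot \mathbf x}
     + \hat a_{\mathbf k}^\dagger\, e^{-i \mathbf k \cdot \mathbf x} \right),
\qquad \omega_k = \sqrt{\mathbf k^2 + m^2} .
```

The paper's question is whether these occupation numbers licence talk of particle-like objects; the formalism itself only delivers mode excitations.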

Publication date: May 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 62

Author(s): Vladislav Terekhovich

Abstract

Despite the importance of the variational principles of physics, there have been relatively few attempts to consider them within a realistic framework. In addition to the old teleological question, this paper continues the recent discussion regarding the modal involvement of the principle of least action and its relations with the Humean view of the laws of nature. The reality of possible paths in the principle of least action is examined from the perspectives of the contemporary metaphysics of modality and Leibniz's concept of essences or possibles striving for existence. I elaborate a modal interpretation of the principle of least action that replaces a classical representation of a system's motion along a single history in the actual modality by simultaneous motions along an infinite set of all possible histories in the possible modality. This model is based on an intuition that deep ontological connections exist between the possible paths in the principle of least action and possible quantum histories in the Feynman path integral. I interpret the action as a physical measure of the essence of every possible history. Therefore only the one actual history has the highest degree of essence and minimal action. To address the issue of necessity, I assume that the principle of least action has a general physical necessity and lies between the laws of motion with a limited physical necessity and certain laws with a metaphysical necessity.
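
The two formulas whose kinship the abstract invokes are standard and worth displaying side by side:

```latex
% Classical: the actual history extremizes the action
S[q] = \int_{t_1}^{t_2} L(q, \dot q, t)\, dt , \qquad \delta S = 0 ;

% Quantum: every possible history contributes, weighted by its action
K(q_f, t_f ; q_i, t_i) = \int \mathcal{D}q(t)\; e^{\, i S[q] / \hbar} .
```

On the author's reading, the first formula singles out the history of extremal action, while the second treats all histories as contributing, which is what motivates interpreting the action as a measure of each history's "essence".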

Publication date: May 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, Volume 62

Author(s): Edward MacKinnon

Abstract

The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.

Publication date: Available online 20 June 2018

Source: Physics Letters A

Author(s): E. Gozzi

Abstract

In this paper we put forward some simple rules which can be used to pass from the quantum Moyal evolution operator to the classical one of Liouville without taking the limit ħ → 0. These rules involve averaging over some auxiliary variables.
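
For orientation, the limit that the paper's rules are designed to avoid is the standard one, in which the Moyal bracket governing the Wigner function reduces to the Poisson bracket:

```latex
% Moyal bracket on phase-space functions f(q,p), g(q,p)
\{f, g\}_M = \frac{2}{\hbar}\, f \sin\!\left( \frac{\hbar}{2}
\left( \overleftarrow{\partial}_q \overrightarrow{\partial}_p
     - \overleftarrow{\partial}_p \overrightarrow{\partial}_q \right) \right) g
= \{f, g\}_{\mathrm{PB}} + \mathcal{O}(\hbar^2) ,

% so that as \hbar \to 0 the Moyal evolution of the Wigner function W
% reduces to the classical Liouville equation:
\partial_t W = \{H, W\}_M \;\longrightarrow\; \partial_t W = \{H, W\}_{\mathrm{PB}} .
```

The paper's point is that the same passage can be achieved by averaging over auxiliary variables instead of expanding in ħ.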

Publication date: 25 August 2018

Source: Physics Letters A, Volume 382, Issue 33

Author(s): Anna M. Nobili, Alberto Anselmi

Abstract

Tests of the Weak Equivalence Principle (WEP) probe the foundations of physics. Ever since Galileo in the early 1600s, WEP tests have attracted some of the best experimentalists of any time. Progress has come in bursts, each stimulated by the introduction of a new technique: the torsion balance, signal modulation by Earth rotation, the rotating torsion balance. Tests for various materials in the field of the Earth and the Sun have found no violation to the level of about 1 part in 10^13. A different technique, Lunar Laser Ranging (LLR), has reached comparable precision. Today, both laboratory tests and LLR have reached a point where improving by a factor of 10 is extremely hard. The promise of another quantum leap in precision rests on experiments performed in low Earth orbit. The Microscope satellite, launched in April 2016 and currently taking data, aims to test WEP in the field of Earth to 10^-15, a 100-fold improvement made possible by a driving signal in orbit almost 500 times stronger than for torsion balances on the ground. The ‘Galileo Galilei’ (GG) experiment, by combining the advantages of space with those of the rotating torsion balance, aims at a WEP test 100 times more precise than Microscope, to 10^-17. A quantitative comparison of the key issues in the two experiments is presented, along with recent experimental measurements relevant for GG. Early results from Microscope, reported at a conference in March 2017, show measurement performance close to expectations and confirm the key role of rotation, with the advantage (unique to space) of rotating the whole spacecraft. Any non-null result from Microscope would be a major discovery and call for urgent confirmation; with 100 times better precision GG could settle the matter and provide a deeper probe of the foundations of physics.
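
The precision figures quoted above are bounds on the Eötvös parameter, the standard figure of merit for WEP tests:

```latex
% Eötvös parameter for test bodies A and B with free-fall accelerations a_A, a_B
\eta(A, B) = 2\, \frac{\lvert a_A - a_B \rvert}{a_A + a_B} .
% Torsion-balance and LLR bounds: \eta \lesssim 10^{-13};
% Microscope target: \eta \sim 10^{-15}; GG target: \eta \sim 10^{-17}.
```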

Publication date: Available online 26 June 2018

Source: Physics Reports

Author(s): Liang Huang, Hong-Ya Xu, Celso Grebogi, Ying-Cheng Lai

Abstract

Quantum chaos generally refers to the study of quantum manifestations or fingerprints of nonlinear dynamical and chaotic behaviors in the corresponding classical system, an interdisciplinary field that has been active for about four decades. In closed chaotic Hamiltonian systems, for example, the basic phenomena studied include energy level-spacing statistics and quantum scarring. In open Hamiltonian systems, quantum chaotic scattering has been investigated extensively. Previous work was almost exclusively concerned with nonrelativistic quantum systems described by the Schrödinger equation. Recent years have witnessed a rapid growth of interest in Dirac materials such as graphene, topological insulators, molybdenum disulfide and topological Dirac semimetals. A common feature of these materials is that their physics is described by the Dirac equation in relativistic quantum mechanics, generating phenomena that do not usually emerge in conventional semiconductor materials. This has important consequences. In particular, at the level of basic science, a new field has emerged: Relativistic Quantum Chaos (RQC), which aims to uncover, understand, and exploit relativistic quantum manifestations of classical nonlinear dynamical behaviors including chaos. Practically, Dirac materials have the potential to revolutionize solid-state electronic and spintronic devices, and have led to novel device concepts such as valleytronics. Exploiting manifestations of nonlinear dynamics and chaos in the relativistic quantum regime can have significant applications.

The aim of this article is to give a comprehensive review of the basic results obtained so far in the emergent field of RQC. Phenomena to be discussed in depth include energy level-spacing statistics in graphene or Dirac fermion systems that exhibit various nonlinear dynamical behaviors in the classical limit, relativistic quantum scars (unusually high concentrations of relativistic quantum spinors about classical periodic orbits), peculiar features of relativistic quantum chaotic scattering and quantum transport, manifestations of the Klein paradox and its effects on devices based on graphene or other 2D Dirac materials, chaos-based modulation of conductance fluctuations in relativistic quantum dots, regularization of relativistic quantum tunneling by chaos, superpersistent currents in chaotic Dirac rings subject to a magnetic flux, and exploitation of relativistic quantum whispering gallery modes for applications in quantum information science. Computational methods for solving the Dirac equation in various situations will be introduced and physical theories developed so far in RQC will be described. Potential device applications will be discussed.

Author(s): Jason F. Ralph, Marko Toroš, Simon Maskell, Kurt Jacobs, Muddassar Rashid, Ashley J. Setter, and Hendrik Ulbricht

We discuss a general method of model selection from experimentally recorded time-trace data. This method can be used to distinguish between quantum and classical dynamical models. It can be used in postselection as well as for real-time analysis, and offers an alternative to statistical tests based ...


[Phys. Rev. A 98, 010102(R)] Published Fri Jul 06, 2018

Volume 4, Issue 3, pages 210-222

R. E. Kastner [Show Biography] and John G. Cramer [Show Biography]

Ruth E. Kastner earned her M.S. in Physics and Ph.D. in Philosophy (History and Philosophy of Science) at the University of Maryland, College Park (1999). She has taught a variety of philosophy and physics courses throughout the Baltimore-Washington corridor, and is currently a member of the Foundations of Physics group at UMCP. She is also an Affiliate of the physics department at the SUNY Albany campus. She specializes in time-symmetry and the Transactional Interpretation (TI) of quantum mechanics, and in particular has extended the original TI of John Cramer to the relativistic domain. Her interests and publications include topics in thermodynamics and statistical mechanics, quantum ontology, counterfactuals, spacetime emergence, and free will. She is the author of two books: The Transactional Interpretation of Quantum Mechanics: The Reality of Possibility (Cambridge, 2012) and Understanding Our Unseen Reality: Solving Quantum Riddles (Imperial College Press, 2015). She is also an Editor of the collected volume Quantum Structural Studies (World Scientific, 2016).

John G. Cramer is Professor Emeritus in Physics at the University of Washington (UW) in Seattle, where he has had five decades of experience in teaching undergraduate and graduate level physics. John was born in Houston, Texas on October 24, 1934, and was educated in the Houston Public Schools (Poe, Lanier, Lamar) and at Rice University, where he received a BA (1957), MA (1959), and Ph.D. (1961) in Experimental Nuclear Physics. He began his professional physics career as a Postdoc and then Assistant Professor at Indiana University, Bloomington, Indiana (1961–1964) before joining the Physics Faculty of the University of Washington. He has done cutting-edge research in experimental and theoretical nuclear and ultra-relativistic heavy ion physics, including active participation in Experiments NA35 and NA49 at CERN, Geneva, Switzerland, and the STAR Experiment at RHIC, Brookhaven National Laboratory, Long Island, NY. He has also worked in the foundations of quantum mechanics (QM) and is the originator of the transactional interpretation of quantum mechanics. He is co-author of around 300 publications in nuclear and ultra-relativistic heavy ion physics published in peer-reviewed physics journals, as well as over 141 publications in conference proceedings, and has written several chapters for multiauthor books about physics. His recent book on the transactional interpretation, The Quantum Handshake – Entanglement, Nonlocality and Transactions, was published by Springer in 2016.

The Transactional Interpretation offers a solution to the measurement problem by identifying specific physical conditions precipitating the non-unitary ‘measurement transition’ of von Neumann. Specifically, the transition occurs as a result of absorber response (a process lacking in the standard approach to the theory). The purpose of this Letter is to make clear that, despite recent claims to the contrary, the concepts of ‘absorber’ and ‘absorber response’, as well as the process of absorption, are physically and quantitatively well-defined in the transactional picture. In addition, the Born Rule is explicitly derived for radiative processes.


Abstract

I sketch a line of thought about consciousness and physics that gives some motivation for the hypothesis that conscious observers deviate—perhaps only very subtly and slightly—from quantum dynamics. Although it is hard to know just how much credence to give this line of thought, it does add motivation for a stronger and more comprehensive programme of quantum experiments involving quantum observers.

This is a Leap — a popular science article on quantum research written by scientists and reviewed by teenagers — published in Quantum Views.

By Eric Copenhaver (Department of Physics, University of California, Berkeley, California 94720, USA).

Published: 2018-07-04, volume 2, page 6
DOI: https://doi.org/10.22331/qv-2018-07-04-6
Citation: Quantum Views 2, 6 (2018)

Rocks and waves

I used to go camping with my family. We always chose to camp close to the water. I didn’t like reading when I was young, so I always got bored when my parents sat down to read books by the tent. Instead, I would go to the water’s edge and throw rocks into the water. I tried to skip flat rocks on the water for as many bounces as I could. I also liked to drop two rocks beside each other and watch the waves overlap.

Have you ever dropped two rocks onto a pond at the same time? When they hit, each rock sends out waves, ripples in the water that interfere with each other into a pretty pattern as they criss-cross. But which rock hit the water first? After all, your two rocks probably don’t weigh exactly the same amount. You can even try the experiment now. You don’t even have to drop rocks into water. You could take two different coins (like a 5c and a 50c) and try really hard to see which one hits the ground first. Can you tell which hit first? Even if you think you know, consider all the tiny things that could influence your careful measurement. For example, are you sure that you dropped the two coins at the exact same time?


Over 400 years ago, Galileo thought he knew just what would happen in your experiment. He even disagreed with Aristotle, one of the most famous thinkers of all time. He was so confident that he marched up to the top of the famous Leaning Tower of Pisa to prove himself right. With a crowd below, he dropped two balls of different sizes from the same height at the same time. What do you think happened? To the crowd’s amazement, both balls hit at the same time. This would come to be called Einstein’s Equivalence Principle: gravity will pull the same on any two objects regardless of their material or size. That’s even true for a bowling ball and a feather, as long as you go into a vacuum chamber to remove the air resistance that normally holds the feather back. In a vacuum, a bowling ball and a feather truly do hit at the same time, which is hard to even picture. Check out Dr. Brian Cox’s video of that experiment to see for yourself.

But do they hit at exactly the same time? We physicists have reasons to think that the two objects might hit the ground at very slightly different times. One reason could be that there might be a force in the universe that affects one object more than the other, a force we have yet to observe. By looking closely for differences in how gravity pulls on each object, we perform a test of the Equivalence Principle. The differences we might observe in Equivalence Principle tests are really small, though. For example, if you were to drop two objects from your outstretched hands, you would need to be able to tell which one hit first to within a distance smaller than a single atom! And atoms are super tiny particles. They are so small that there are about 100,000 of them across the width of one of the hairs on your head. (To imagine how big that number is, if you lined up 100,000 people shoulder-to-shoulder, they would stretch all the way from Sydney to Blue Mountains National Park. That’s how many atoms are across a single hair!) If we want to measure which object hits first that well, we need to be more precise than just dropping two rocks and watching them by eye. We need some tricky methods.
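
The numbers in this paragraph survive a quick back-of-the-envelope check. Here is a small Python sketch; the hair width, atomic diameter, and shoulder width are assumed round figures, so the outputs are order-of-magnitude estimates only:

```python
# Rough check of the "atoms across a hair" claim, using assumed round figures.
hair_width_m = 50e-6       # a human hair is roughly 50 micrometres wide
atom_diameter_m = 0.5e-9   # a typical atom is roughly half a nanometre across
shoulder_width_m = 0.5     # roughly half a metre of line per person

atoms_across_hair = hair_width_m / atom_diameter_m
print(f"Atoms across one hair: about {atoms_across_hair:,.0f}")  # ~100,000

# The shoulder-to-shoulder analogy: how long is a line of 100,000 people?
line_length_km = 100_000 * shoulder_width_m / 1000
print(f"Line of 100,000 people: about {line_length_km:.0f} km")  # ~50 km
```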


Soccer and superpositions

We have just the trick in atomic physics: something called a “superposition”. An atom is such a tiny particle that it follows the unusual rules of quantum physics. One interesting property of quantum particles is that they don’t have to be in just one position at a time. That is very different from the rules that you and I must follow. We are always in one position at any one time. You can either be in a state of sitting in the kitchen or in a state of playing soccer outside, but of course you can’t be in both positions at once. A quantum particle, on the other hand, can be in an unfamiliar state called a superposition. In a superposition, it’s like one atom is in two different places at one time; it feels effects from both of those places at the same time.

Let’s try to imagine what it would be like for you to be in a superposition. Let’s say that you are a messy eater and if you were sitting in the kitchen at 9:00 in the morning eating breakfast, you’d get milk on your lip and a brown chocolate smear on your cheek. If you were instead playing soccer at 9:00, let’s say you would get a green grass stain on both knees. In either case, you agree to be on the doorstep at 9:01. If you can be in only one position at once, you would arrive at the doorstep at 9:01 either with two stained knees or with milk on your lip and a chocolate smear on your cheek. If you could be in a superposition, on the other hand, you might arrive at the doorstep at 9:01 with just milk on your lip and a grass stain on only one of your knees. It’s as if you were partially in both positions. It’s as if you were in the kitchen and playing soccer at the very same time. It’s as if you were in a superposition. Unfortunately, you can’t really be in two places at once. You and I can’t be in a superposition because we are made of such a huge number of quantum particles, like a billion billion billion atoms! All of those billion billion billion atoms would have to be in the very same superposition at the same time, which is very, very unlikely.

Note:  There are some details of superpositions this analogy does not fully capture. In my opinion, this example still gets to the heart of what is unfamiliar and interesting about superpositions. I admit, however, that many physicists have very strong and very different opinions on fair ways to talk about superpositions.


Probing gravity with atoms

In my lab and labs like mine, we actually do put an atom into a superposition [1,2]. We do this by hitting the atom with a carefully tuned laser. The laser puts the atom into a superposition of two states: (1) ignoring the laser and (2) feeling the laser and getting kicked upward by it. At this point in time, the atom is traveling two paths, one kicked by the laser and one that was not kicked. The atom’s paths are bent by gravity, just like a soccer ball you kick upward. If there were no gravity, the soccer ball would continue on straight forever. Since gravity bends the ball back down to Earth, we could measure how strong gravity is by measuring with a ruler how high the ball is after some time. The laser we use to kick the atom also acts like a ruler. We use the laser as a ruler to find out how gravity pulled on the atom while it flew.

So why don’t we just use soccer balls and rulers to measure gravity? That’s because an atom actually behaves like a wave, and we can use the way waves interfere to make really precise measurements. Just like the overlapping waves sent out by the two rocks on the surface of the pond, the atom’s waves from its two separate paths interfere when they overlap. This interference pattern only develops if the atom flew on two paths at once. The laser, which acts like a ruler, prints a pattern onto the atom’s waves. That pattern printed onto the atom’s waves is the signature of gravity. The device that measures this is called an atom interferometer. In the end, an atom interferometer measures that interference pattern by taking a picture of the atom to see where it ended up.
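
For readers who want the formula behind the “laser ruler”: in the standard Mach-Zehnder atom-interferometer geometry, the gravity-induced phase between the two paths is Δφ = k_eff g T², with k_eff the effective wavevector of the laser kicks, g the local gravitational acceleration, and T the time between pulses. The sketch below uses illustrative round numbers, not parameters from any particular experiment:

```python
import numpy as np

# Illustrative Mach-Zehnder atom-interferometer parameters (assumed, not measured)
wavelength_m = 780e-9                    # near the rubidium D2 line
k_eff = 2 * (2 * np.pi / wavelength_m)   # two-photon kick: twice the photon wavevector
g = 9.81                                 # local gravitational acceleration, m/s^2
T = 0.1                                  # time between laser pulses, s

# Phase difference accumulated between the two atomic paths
delta_phi = k_eff * g * T**2
print(f"Gravity-induced phase: {delta_phi:.3e} rad")

# The interference fringe read out at one output port of the interferometer
p_port = 0.5 * (1 - np.cos(delta_phi))
print(f"Detection probability at one port: {p_port:.3f}")
```

Comparing the phase (and hence g) measured with two different atomic species is what turns this device into an Equivalence Principle test.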


Atom interferometers can perform a precision version of Galileo’s drop test. They can precisely measure how gravity acts on two separate objects. In this case, the two objects are two different kinds of atoms. Both a team in Wuhan, China, and another at Stanford, in California, use different isotopes of a rare element called rubidium. Isotopes are different kinds of the same atomic element that weigh different amounts. While the Wuhan team currently holds the world record for precision, they have yet to see signs of a new force. The Stanford team has plans to do 10,000 times better.

With the tricks of quantum physics and the precision of an atom interferometer, we may begin to see the effects of new forces in the universe by carefully studying how gravity pulls on different objects. We are testing Einstein’s Equivalence Principle. So next time you drop two objects from the same height, I invite you to look very closely to see if one hits the ground first. If one does hit first, consider whether it’s because you had a measurement error or because you observed a new force in the universe. Next time you drop two objects, maybe while on a camping trip with boring parents, like I was, stop to appreciate the deep mysteries of the universe that might be hiding behind such a simple experiment.


Reviewers summary

The paper we were given was ‘Superpositions test if gravity pulls the same on two objects’. The paper talked about Einstein’s Equivalence Principle, which says that two objects dropped at the exact same time, even when their weights are different, will land at exactly the same time (if there is no air resistance). They are presenting the idea that this may be incorrect; they are saying that there may be a force that we are yet to observe that affects objects differently. To test this theory they are using the concept of a superposition to try and prove that Einstein’s Equivalence Principle is incorrect. They do this by using a laser that is concentrated on an atom. This atom is in a superposition, and it is like the atom is in two places at the same time. It is both feeling the effects of the laser and ignoring it, which means that the atom is travelling two paths, one affected by the laser, and one not. The concept that the paper was examining is relevant to society and the scientific community because if they are able to prove their theory, it will completely change the way that the world looks at this scientific concept.

The paper was interesting to read, as it explained a complex concept in quantum physics in terms that were more or less understandable for people of our age group. We also appreciated the use of simple pictures; however, we would have liked it to be a bit clearer and easier to understand depending on how knowledgeable the audience is, particularly the concept of a superposition. Otherwise, we very much enjoyed the paper. We rate the paper 4.5/5.

Reviewed by

Vihaan Jain, Dana Preston, Emma Schafer, Kendra Ead, and Alexander Mills
Grade 8 (ages 13–14)
Cherrybrook Technology High School, Sydney, Australia
The reviewers consented to publication of their names as stated

Author commentary

I am so appreciative of the chance to get feedback directly from an underserved age group. Centering the peer review process on the intended audience is an ingenious and symbiotic strategy. In the first round of reviews, I was delighted to see how well the students picked out the important points of my article, and glad to have gotten tips on where the article's clarity lapsed. I think an important way to improve science communication is to reach an agreement between experts on what's accurate and the audience on what's understandable. In an ideal world, I'd add a step to the review process where an expert can critique the accuracy of content in the article.

Eric Copenhaver started college in Akron, Ohio, majoring in jazz guitar and aspiring to be a rockstar. After switching to Philosophy and taking Physics for a general science requirement, he dove headlong into Physics research. Now a Physics PhD Candidate at UC Berkeley, Eric works with Prof. Holger Müller on the first precision cold atom interferometer to use lithium, a species with low mass and "simple" electronic structure.


References

[1] L. Zhou, S. Long, B. Tang, X. Chen, F. Gao, W. Peng, W. Duan, J. Zhong, Z. Xiong, J. Wang, Y. Zhang, and M. Zhan, Test of Equivalence Principle at 10^-8 Level by a Dual-Species Double-Diffraction Raman Atom Interferometer, Phys. Rev. Lett. 115, 013004 (2015).
https://doi.org/10.1103/PhysRevLett.115.013004

[2] C. Overstreet, P. Asenbaum, T. Kovachy, R. Notermans, J. Hogan, and M. Kasevich, Effective Inertial Frame in an Atom Interferometric Test of the Equivalence Principle, Phys. Rev. Lett. 120, 183604 (2018).
https://doi.org/10.1103/PhysRevLett.120.183604

This View is published in Quantum Views under the Creative Commons Attribution 4.0 International (CC BY 4.0) license. Copyright remains with the original copyright holders such as the authors or their institutions.


Physics Today, Volume 71, Issue 7, Page 54, July 2018.

Abstract

Observables have a dual nature in both classical and quantum kinematics: they are at the same time quantities, allowing one to separate states by means of their numerical values, and generators of transformations, establishing relations between different states. In this work, we show how this twofold role of observables constitutes a key feature in the conceptual analysis of classical and quantum kinematics, shedding new light on the distinguishing feature of the quantum at the kinematical level. We first take a look at the algebraic description of both classical and quantum observables in terms of Jordan–Lie algebras and show how the two algebraic structures are the precise mathematical manifestation of the twofold role of observables. Then, we turn to the geometric reformulation of quantum kinematics in terms of Kähler manifolds. A key achievement of this reformulation is to show that the twofold role of observables is the constitutive ingredient defining what an observable is. Moreover, it points to the fact that, from the restricted point of view of the transformational role of observables, classical and quantum kinematics behave in exactly the same way. Finally, we present Landsman’s general framework of Poisson spaces with transition probability, which highlights with unmatched clarity that the crucial difference between the two kinematics lies in the way the two roles of observables are related to each other.
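
For readers unfamiliar with the algebraic structure mentioned here, the quantum case makes the twofold role concrete: on self-adjoint operators, the Jordan product carries the quantitative role and the rescaled commutator the transformational one, and the two are tied together by the associator identity:

```latex
% Jordan product (quantities) and Lie bracket (generators of transformations)
a \circ b = \tfrac{1}{2}(ab + ba), \qquad
\{a, b\} = \tfrac{1}{i\hbar}(ab - ba) ,

% Associator identity linking the two roles; classically the right-hand side
% vanishes, since pointwise multiplication is associative
(a \circ b) \circ c - a \circ (b \circ c) = \frac{\hbar^2}{4}\, \{\{a, c\}, b\} .
```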

Abstract

Russellian monism—an influential doctrine proposed by Russell (The analysis of matter, Routledge, London, 1927/1992)—is roughly the view that physics can only ever tell us about the causal, dispositional, and structural properties of physical entities and not their categorical (or intrinsic) properties, whereas our qualia are constituted by those categorical properties. In this paper, I will discuss the relation between Russellian monism and a seminal paradox facing epiphenomenalism, the paradox of phenomenal judgment: if epiphenomenalism is true—qualia are causally inefficacious—then any judgment concerning qualia, including epiphenomenalism itself, cannot be caused by qualia. For many writers, including Hawthorne (Philos Perspect 15:361–378, 2001), Smart (J Conscious Stud 11(2):41–50, 2004), and Braddon-Mitchell and Jackson (The philosophy of mind and cognition, Blackwell, Malden, 2007), Russellian monism faces the same paradox as epiphenomenalism does. I will assess Chalmers’s (The conscious mind: in search of a fundamental theory. Oxford University Press, New York, 1996) and Seager’s (in: Beckermann A, McLaughlin BP (eds) The Oxford handbook of philosophy of mind. Oxford University Press, New York, 2009) defences of Russellian monism against the paradox, and will put forward a novel argument against those defences.

Publication date: Available online 23 June 2018
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Alexander Franklin, Eleanor Knox
Recent discussions of emergence in physics have focussed on the use of limiting relations, and often particularly on singular or asymptotic limits. We discuss a putative example of emergence that does not fit into this narrative: the case of phonons. These quasi-particles have some claim to be emergent, not least because the way in which they relate to the underlying crystal is almost precisely analogous to the way in which quantum particles relate to the underlying quantum field theory. We offer an account of emergence which encompasses phonons, and argue both that emergence may thus be found in cases where the use of limits is not required, and that it provides a way of understanding cases that do involve limits.

Publication date: Available online 22 June 2018
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Laura Felline
In his recent book Bananaworld: Quantum Mechanics for Primates, Jeff Bub revives and provides a mature version of his influential information-theoretic interpretation of Quantum Theory (QT). In this paper, I test Bub's conjecture that QT should be interpreted as a theory about information by examining whether his information-theoretic interpretation has the resources to explain (or explain away) quantum conundrums. The discussion of Bub's theses will also serve to investigate, more generally, whether other approaches succeed in defending the claim that QT is about quantum information. First of all, I argue that Bub's interpretation of QT as a principle theory fails to fully explain quantum non-locality. Secondly, I argue that a constructive interpretation, where the quantum state is interpreted ontically as information, also fails to provide a full explanation of quantum correlations. Finally, while epistemic interpretations might succeed in this respect, I argue that such a success comes at the price of rejecting some of the most basic scientific standards of physical theories.

Publication date: Available online 15 June 2018
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): Vincent Lam, Christian Wüthrich
Theories of quantum gravity generically presuppose or predict that the reality underlying relativistic spacetimes they are describing is significantly non-spatiotemporal. On pain of empirical incoherence, approaches to quantum gravity must establish how relativistic spacetime emerges from their non-spatiotemporal structures. We argue that in order to secure this emergence, it is sufficient to establish that only those features of relativistic spacetimes functionally relevant in producing empirical evidence must be recovered. In order to complete this task, an account must be given of how the more fundamental structures instantiate these functional roles. We illustrate the general idea in the context of causal set theory and loop quantum gravity, two prominent approaches to quantum gravity.

Publication date: Available online 18 June 2018
Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Author(s): David Wallace
I give a fairly systematic and thorough presentation of the case for regarding black holes as thermodynamic systems in the fullest sense, aimed at readers with some familiarity with thermodynamics, quantum mechanics and general relativity but not presuming advanced knowledge of quantum gravity. I pay particular attention to (i) the availability in classical black hole thermodynamics of a well-defined notion of adiabatic intervention; (ii) the power of the membrane paradigm to make black hole thermodynamics precise and to extend it to local-equilibrium contexts; (iii) the central role of Hawking radiation in permitting black holes to be in thermal contact with one another; (iv) the wide range of routes by which Hawking radiation can be derived and its back-reaction on the black hole calculated; (v) the interpretation of Hawking radiation close to the black hole as a gravitationally bound thermal atmosphere. In an appendix I discuss recent criticisms of black hole thermodynamics by Dougherty and Callender. This paper confines its attention to the thermodynamics of black holes; a sequel will consider their statistical mechanics.
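
For reference, the central relations of classical and semiclassical black hole thermodynamics that the paper defends are:

```latex
% First law of black hole mechanics (geometrized units, G = c = 1)
dM = \frac{\kappa}{8\pi}\, dA + \Omega\, dJ + \Phi\, dQ ,

% Hawking temperature and Bekenstein--Hawking entropy (Schwarzschild case)
T_H = \frac{\hbar c^3}{8 \pi G M k_B} , \qquad
S_{BH} = \frac{k_B c^3 A}{4 G \hbar} .
```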

Author(s): Emmanuel Lassalle, Caroline Champenois, Brian Stout, Vincent Debierre, and Thomas Durt

Frequent measurements can modify the decay of an unstable quantum state with respect to the free dynamics given by Fermi's golden rule. In a landmark article [A. G. Kofman and G. Kurizki, Nature (London) 405, 546 (2000)], Kofman and Kurizki concluded that in quantum decay processes, acceleration of ...


[Phys. Rev. A 97, 062122] Published Fri Jun 22, 2018

Abstract

In this article, we demonstrate a sense in which one-particle quantum mechanics (OPQM) and the classical electromagnetic four-potential arise from quantum field theory (QFT). In addition, the classical Maxwell equations are derived from the QFT scattering process, while both classical electromagnetic fields and potentials serve as mathematical tools to approximate the interactions among elementary particles described by QFT physics. Furthermore, a plausible interpretation of the Aharonov–Bohm (AB) effect is proposed within the QFT framework. We provide a quantum treatment of the source of electromagnetic potentials and argue that the underlying mechanism in the AB effect can be understood via interactions among electrons described by QFT, where the interactions are mediated by virtual photons.
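
The effect being reinterpreted is the standard Aharonov–Bohm phase: for a charge q transported around a closed loop C enclosing a confined magnetic flux Φ,

```latex
\Delta\varphi_{AB} = \frac{q}{\hbar} \oint_C \mathbf{A} \cdot d\boldsymbol{\ell}
= \frac{q \Phi}{\hbar} ,
\qquad \text{even though } \mathbf{B} = 0 \text{ everywhere along } C .
```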

American Journal of Physics, Volume 86, Issue 7, Page 510-517, July 2018.
It is currently believed that there is no experimental evidence of possible quantum features of gravity or of gravity-motivated modifications of quantum mechanics. Here we show that single-atom interference experiments achieving large spatial superpositions can rule out a framework where the Newtonian gravitational interaction is fundamentally classical in the information-theoretic sense: it cannot convey entanglement. Specifically, in this framework gravity acts pairwise between massive particles via classical channels, which effectively induce approximately Newtonian forces between the masses. The experiments indicate that if gravity does reduce to the pairwise Newtonian interaction between atoms at low energies, this interaction cannot arise from the exchange of just classical information, and in principle has the capacity to create entanglement. We clarify that, contrary to current belief, the classical-channel description of gravity differs from the model of Diosi and Penrose,...

Author(s): Eugenio Roldán, Johannes Kofler, and Carlos Navarrete-Benlloch

According to the world view of macrorealism, the properties of a given system exist prior to and independent of measurement, which is incompatible with quantum mechanics. Leggett and Garg put forward a practical criterion capable of identifying violations of macrorealism, and so far experiments perf...


[Phys. Rev. A 97, 062117] Published Mon Jun 18, 2018
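
The Leggett-Garg criterion mentioned in the entry above is, in its simplest three-time form, an inequality on two-time correlators C_ij = ⟨Q(t_i)Q(t_j)⟩ of a dichotomic observable Q = ±1:

```latex
K_3 = C_{12} + C_{23} - C_{13} \le 1
\qquad \text{(macrorealism + non-invasive measurability)} ,
```

whereas quantum mechanics allows K3 to reach 3/2 for a two-level system.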

Author(s): Yuan Yuan, Zhibo Hou, Kang-Da Wu, Guo-Yong Xiang, Chuan-Feng Li, and Guang-Can Guo

When a photon passes through an interferometer, quantum mechanics does not provide a clear answer as to its past. Quantum retrodiction is a quantitative theory, which endeavors to make statements about the past of a system based on present knowledge. Quantum retrodiction may be used to analyze the p...


[Phys. Rev. A 97, 062115] Published Fri Jun 15, 2018

Author(s): Dong-Ling Deng

Machine learning, the core of artificial intelligence and big data science, is one of today’s most rapidly growing interdisciplinary fields. Recently, machine learning tools and techniques have been adopted to tackle intricate quantum many-body problems. In this Letter, we introduce machine learning...


[Phys. Rev. Lett. 120, 240402] Published Thu Jun 14, 2018

Author(s): Raam Uzdin and Saar Rahav

The second law of thermodynamics can be described using the Clausius inequality, the main link between classical and quantum thermodynamics. A new thermodynamic framework addresses long-standing limitations of this inequality and reveals new bounds relevant to quantum technology experiments.


[Phys. Rev. X 8, 021064] Published Tue Jun 12, 2018
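
The Clausius inequality referred to above bounds, in its textbook form, the entropy change of a system by the reduced heat it absorbs:

```latex
\Delta S \;\ge\; \int \frac{\delta Q}{T} ,
```

with equality for reversible processes; the framework described in the entry is presented as addressing the limitations of this bound in quantum regimes.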

Abstract

Recently it was shown that certain fluid-mechanical ‘pilot-wave’ systems can strikingly mimic a range of quantum properties, including single-particle diffraction and interference, quantization of angular momentum, etc. How far does this analogy go? The ultimate test of (apparent) quantumness of such systems is a Bell test. Here the premises of the Bell inequality are re-investigated for particles accompanied by a pilot-wave, or more generally by a resonant ‘background’ field. We find that two of these premises, namely outcome independence and measurement independence, may not be generally valid when such a background is present. Under this assumption the Bell inequality is possibly (but not necessarily) violated. A class of hydrodynamic Bell experiments is proposed that could test this claim. Such a Bell test on fluid systems could provide a wealth of new insights on the different loopholes for Bell’s theorem. Finally, it is shown that certain properties of background-based theories can be illustrated in Ising spin-lattices.
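
The two premises the authors re-examine can be stated explicitly. In the standard derivation of the Bell inequality, a hidden-variable model is assumed to satisfy factorizability (outcome independence together with parameter independence) and measurement independence:

```latex
% Factorizability of joint outcome probabilities, given settings x, y and
% hidden variable \lambda
P(a, b \mid x, y, \lambda) = P(a \mid x, \lambda)\, P(b \mid y, \lambda) ,

% Measurement independence: the settings carry no information about \lambda
\rho(\lambda \mid x, y) = \rho(\lambda) .
```

On the authors' proposal, a resonant background field is a candidate mechanism for undermining both assumptions.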

Author(s): Jun Gao, Lu-Feng Qiao, Zhi-Qiang Jiao, Yue-Chi Ma, Cheng-Qiu Hu, Ruo-Jing Ren, Ai-Lin Yang, Hao Tang, Man-Hong Yung, and Xian-Min Jin

Quantum information technologies provide promising applications in communication and computation, while machine learning has become a powerful technique for extracting meaningful structures in “big data.” A crossover between quantum information and machine learning represents a new interdisciplinary...


[Phys. Rev. Lett. 120, 240501] Published Mon Jun 11, 2018

Author(s): Yu Guo, Xiao-Min Hu, Bi-Heng Liu, Yun-Feng Huang, Chuan-Feng Li, and Guang-Can Guo

Growing interest has been invested in exploring high-dimensional quantum systems, for their promising perspectives in certain quantum tasks. How to characterize a high-dimensional entanglement structure is one of the basic questions to take full advantage of it. However, it is not easy for us to cat...


[Phys. Rev. A 97, 062309] Published Thu Jun 07, 2018

Author(s): Luca Mancino, Marco Sbroscia, Emanuele Roccia, Ilaria Gianani, Valeria Cimini, Mauro Paternostro, and Marco Barbieri

The emergence of realistic properties is a key problem in understanding the quantum-to-classical transition. In this respect, measurements represent a way to interface quantum systems with the macroscopic world: these can be driven in the weak regime, where a reduced back-action can be imparted by c...


[Phys. Rev. A 97, 062108] Published Thu Jun 07, 2018

Abstract

In the following paper, the author tests the meaning of the transcendental approach in light of the changes implied by the idea of quantum gravity. He first describes Kant's basic methodological aim, namely the grounding of a meta-science of physics as the a priori corpus of physical knowledge. He then takes into account the problematic physical and philosophical relationship between the theory of relativity and quantum mechanics; in showing how the elementary ontological and epistemological assumptions about experience turn out to be changed within them, he also shows the further modifications that occurred in the development of loop quantum gravity. He focuses in particular on the difficult problem of the relationship between space and matter, in order to settle the decisive question of whether a transcendental approach can be maintained in the light of quantum gravity. He answers in the affirmative by recalling Cassirer's theory of the invariants of experience, although he also adds some problematic issues arising from the new physical context.