Latest Papers on Quantum Foundations - Updated Daily by IJQF

Authors: Aida Ahmadzadegan, Fatemeh Lalegani, Robert B. Mann

We present a new method by which, in principle, it is possible to "see in absolute darkness", i.e., without exchanging any real quanta through quantum fields. This is possible because objects modify the mode structure of the vacuum in their vicinity. The new method probes the mode structure of the vacuum through the Unruh effect, i.e., by recording the excitation rates of quantum systems that are accelerated.

Authors: Adrian Kent (Centre for Quantum Information and Foundations, DAMTP, University of Cambridge and Perimeter Institute for Theoretical Physics)

Several versions of quantum theory assume some form of localized collapse. If measurement outcomes are indeed defined by localized collapses, then a loophole-free demonstration of Bell non-locality needs to ensure space-like separated collapses associated with the measurements of the entangled systems. This collapse locality loophole remains largely untested, with one significant exception probing Diosi's and Penrose's gravitationally induced collapse hypotheses. I describe here techniques that allow much stronger experimental tests. These apply to all the well known types of collapse postulate, including gravitationally induced collapse, spontaneous localization models and Wigner's consciousness-induced collapse.

Authors: Marijn Waaijer, Jan van Neerven

We present an analysis of the Frauchiger--Renner Gedankenexperiment from the point of view of the relational interpretation of quantum mechanics. Our analysis indicates that the paradox obtained by Frauchiger and Renner arises from a combination of allowing self-measurement and reasoning about other agents' knowledge in the past without validation by surviving records. A by-product of our analysis is an interaction-free detection scheme for the existence of records from the past.

Authors: Enrico Celeghini

The paradigms introduced in the philosophy of science a century ago are shown to be considerably more satisfactory than the one introduced by Galileo. This is particularly evident in the physics based on Hilbert spaces and related mathematical structures, which we apply in this paper to quantum mechanics and to the theory of images. An exhaustive discussion, including an algebraic analysis of the operators acting on them, shows that Hilbert spaces -- which have fixed dimension -- must be generalized to rigged Hilbert spaces, which contain within them spaces of both continuous and discrete dimension. It is this property of rigged Hilbert spaces that allows a consistent formal description of the physics we are considering. The theories of quantum mechanics and of images are similar, and the fundamental difference between them comes from the definition of measurement, which lies outside the theory of the spaces: while in quantum mechanics the measurement is a probabilistic action, in images it is a classical functional.

KEYWORDS: Optics, Quantum Mechanics, Rigged Hilbert Spaces, Lie Algebras

Authors: Andrei Khrennikov

Filtering out the philosophical component, we can say that the EPR paper was directed against the straightforward interpretation of Heisenberg's uncertainty principle or, more generally, Bohr's complementarity principle. The latter expresses the contextuality of quantum measurements: the dependence of a measurement's output on the complete experimental arrangement. However, Bell restructured the EPR argument against complementarity to justify nonlocal theories with hidden variables of the Bohmian-mechanics type. This Bellian kind of nonlocality - {\it subquantum nonlocality} - was then lifted to the level of quantum theory - up to the terminology {\it "quantum nonlocality"}. The aim of this short note is to explain that Bell's test is simply a special {\it test of local incompatibility of quantum observables}, similar to interference experiments, e.g., the two-slit experiment.

Publication date: Available online 14 February 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): David Wallace

Abstract

I present in detail the case for regarding black hole thermodynamics as having a statistical-mechanical explanation in exact parallel with the statistical-mechanical explanation believed to underlie the thermodynamics of other systems. (Here I presume that black holes are indeed thermodynamic systems in the fullest sense; I review the evidence for that conclusion in the prequel to this paper.) I focus on three lines of argument: (i) zero-loop and one-loop calculations in quantum general relativity understood as a quantum field theory, using the path-integral formalism; (ii) calculations in string theory of the leading-order terms, higher-derivative corrections, and quantum corrections, in the black hole entropy formula for extremal and near-extremal black holes; (iii) recovery of the qualitative and (in some cases) quantitative structure of black hole statistical mechanics via the AdS/CFT correspondence. In each case I briefly review the content of, and arguments for, the form of quantum gravity being used (effective field theory; string theory; AdS/CFT) at a (relatively) introductory level: the paper is aimed at readers with some familiarity with thermodynamics, quantum mechanics and general relativity but does not presume advanced knowledge of quantum gravity. My conclusion is that the evidence for black hole statistical mechanics is as solid as we could reasonably expect it to be in the absence of a directly-empirically-verified theory of quantum gravity.

Authors: Chandramouli Chowdhury, Susmita Das, Surojit Dalui, Bibhas Ranjan Majhi

We revisit the conjecture that the quantum fluctuations observed from a Rindler frame are indistinguishable from a real thermal bath, for the case of a free massless scalar field. To clarify how robust this conjecture is and how far it remains admissible, in this paper we investigate the issue from the perspectives of two different non-inertial observers. A detailed analysis is performed to find the observable quantities as measured by two non-inertial observers (one Rindler, the other uniformly rotating) of the real thermal bath and the Rindler frame in Minkowski spacetime. More precisely, we compare the Thermal-Rindler with the Rindler-Rindler situation and the Thermal-rotating with the Rindler-rotating situation. In the first model we observe that although some of the observables are equivalent, not all components of the renormalised stress tensor are the same. In the latter model we again find that this equivalence is not fully guaranteed. We therefore argue that the indistinguishability between a real thermal bath and the Rindler frame may not be completely true.

Oldofredi, Andrea (2019) Some remarks on the mentalistic reformulation of the measurement problem. A reply to S. Gao. [Preprint]
Berkovitz, Joseph (2019) On de Finetti's instrumental philosophy of probability. [Preprint]
Boge, Florian J. (2018) The Best of Many Worlds, or, is Quantum Decoherence the Manifestation of a Disposition? [Preprint]

Everyone knows that quantum mechanics is an odd theory, but they don’t necessarily know why. The usual story is that it’s the quantum world itself that’s odd, with its superpositions, uncertainty and entanglement (the mysterious interdependence of observed particle states). All the theory does is reflect that innate peculiarity, right?

Not really. Quantum mechanics became a strange kind of theory not with Werner Heisenberg’s famous uncertainty principle in 1927, nor when Albert Einstein and two colleagues identified (and Erwin Schrödinger named) entanglement in 1935. It happened in 1926, thanks to a proposal from the German physicist Max Born. Born suggested that the right way to interpret the wavy nature of quantum particles was as waves of probability. The wave equation presented by Schrödinger the previous year, Born said, was basically a piece of mathematical machinery for calculating the chances of observing a particular outcome in an experiment.

In other words, Born’s rule connects quantum theory to experiment. It is what makes quantum mechanics a scientific theory at all, able to make predictions that can be tested. “The Born rule is the crucial link between the abstract mathematical objects of quantum theory and the world of experience,” said Lluís Masanes of University College London.

The problem is that Born’s rule was not really more than a smart guess — there was no fundamental reason that led Born to propose it. “It was an intuition without a precise justification,” said Adán Cabello, a quantum theorist at the University of Seville in Spain. “But it worked.” And yet for the past 90 years and more, no one has been able to explain why.

Without that knowledge, it remains hard to figure out what quantum mechanics is telling us about the nature of reality. “Understanding the Born rule is important as a way to understand the picture of the world implicit in quantum theory,” said Giulio Chiribella of the University of Hong Kong, an expert on quantum foundations.

Several researchers have attempted to derive the Born rule from more fundamental principles, but none of those derivations have been widely accepted. Now Masanes and his collaborators Thomas Galley of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, and Markus Müller of the Institute for Quantum Optics and Quantum Information in Vienna have proposed a new way to pull it out of deeper axioms about quantum theory, an approach that might explain how, more generally, quantum mechanics connects to experiment through the process of measurement.

“We derive all the properties of measurements in quantum theory: what the questions are, what the answers are, and what the probability of answers occurring are,” Masanes said.

It’s a bold claim. And given that the question of what measurement means in quantum mechanics has plagued the theory since the days of Einstein and Schrödinger, it seems unlikely that this will be the last word. But the approach of Masanes and colleagues is already winning praise. “I like it a lot,” Chiribella said.

The work “is a sort of ‘cleaning’ exercise,” Cabello said — a way of ridding quantum mechanics of redundant ingredients. “And that is absolutely an important task. These redundancies are a symptom that we don’t fully understand quantum theory.”

Where the Puzzle Is

Schrödinger wrote down his equation in 1925 as a formal description of the proposal by the French physicist Louis de Broglie the previous year that quantum particles such as electrons could behave like waves. The Schrödinger equation ascribes to a particle a wave function (denoted ψ) from which the particle’s future behavior can be predicted. The wave function is a purely mathematical expression, not directly related to anything observable.

The question, then, was how to connect it to properties that are observable. Schrödinger’s first inclination was to suppose that the amplitude of his wave function at some point in space — equivalent to the height of a water wave, say — corresponds to the density of the smeared-out quantum particle at that point.

But Born argued instead that the amplitude of the wave function is related to a probability — specifically, the probability that you will find the particle at that position if you detect it experimentally. In the lecture given for his 1954 Nobel Prize for this work, Born claimed that he had simply generalized from photons, the quantum “packets of light” that Einstein proposed in 1905. Einstein, Born said, had interpreted “the square of the optical wave amplitudes as probability density for the occurrence of photons. This concept could at once be carried over to the ψ-function.”

But this may have been a retrospective justification of a messier train of thought. For at first Born thought that it was simply the amplitude of ψ that gave this probability. He quickly decided that it was the square of the wave function, ψ² (or, strictly speaking, the square of its modulus, or absolute value). But it was not immediately obvious which of these was right.
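In modern notation (a standard gloss, not specific to any paper discussed here), the rule Born settled on reads

$$ p(x) = |\psi(x)|^2, \qquad \int |\psi(x)|^2\,dx = 1, $$

for the probability density of finding the particle at position $x$; more generally, for a state $|\psi\rangle$ and a measurement whose possible outcomes correspond to orthonormal states $|a_i\rangle$, the probability of outcome $a_i$ is $p(a_i) = |\langle a_i|\psi\rangle|^2$. The modulus matters because $\psi$ is complex-valued: squaring $\psi$ itself would not in general yield a real, non-negative number.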

“Born got quantum theory to work using wire and bubble gum,” said Mateus Araújo, a quantum theorist at the University of Cologne in Germany. “It’s ugly, we don’t really know why it works, but we know that if we take it out, the theory falls apart.”

Yet the arbitrariness of the Born rule is perhaps the least odd thing about it. In most physics equations, the variables refer to objective properties of the system they are describing: the mass or velocity of bodies in Newton’s laws of motion, for instance. But according to Born, the wave function is not like this. It’s not obvious whether it says anything about the quantum entity itself — such as where it is at any moment in time. Rather, it tells us what we might see if we choose to look. It points in the wrong direction: not down toward the system being studied, but up toward the observer’s experience of it.

“What makes quantum theory puzzling is not so much the Born rule as a way of computing probabilities,” Chiribella said, “but the fact that we cannot interpret the measurements as revealing some pre-existing properties of the system.”

What’s more, the mathematical machinery for unfolding these probabilities can only be written down if you stipulate how you’re looking. If you do different measurements, you might calculate different probabilities, even though you seem to be examining the same system in both cases.

That’s why Born’s prescription for turning wave functions into measurement outcomes contains all of the reputed paradoxical nature of quantum theory: the fact that observable properties of quantum objects emerge, in a probabilistic way, from the act of measurement itself. “Born’s probability postulate is where the puzzle really is,” Cabello said.

So if we could understand where the Born rule comes from, we might finally understand what the vexed concept of measurement really means in quantum theory.

The Argument

That’s what has largely motivated efforts to explain the Born rule — rather than simply to learn and accept it. One of the most celebrated attempts, presented by the American mathematician Andrew Gleason in 1957, shows that the rule follows from some of the other components of the standard mathematical structure of quantum mechanics: In other words, it’s a tighter package than it originally seemed. All the same, Gleason’s approach assumes some key aspects of the mathematical formalism needed to connect quantum states to specific measurement outcomes.

One very different approach to deriving the Born rule draws on the controversial many-worlds interpretation of quantum mechanics. Many-worlds is an attempt to solve the puzzle of quantum measurements by assuming that, instead of selecting just one of the multiple possible outcomes, an observation realizes all of them — in different universes that split off from our own. In the late 1990s, many-worlds advocate David Deutsch asserted that apparent quantum probabilities are precisely what a rational observer would need to use to make predictions in such a scenario — an argument that can be used to derive the Born rule. Meanwhile, Lev Vaidman of Tel Aviv University in Israel, and independently Sean Carroll and Charles Sebens of the California Institute of Technology, suggested that the Born rule is the only one that assigns correct probabilities in a many-worlds multiverse during the instant after a split has occurred but before any observers have registered the outcome of the measurement. In that instant the observers do not yet know which branch of the universe they are on — but Carroll and Sebens argued that “there is a uniquely rational way to apportion credence in such cases, which leads directly to the Born Rule.”

The many-worlds picture leads to its own problems, however — not least the issue of what “probability” can mean at all if every possible outcome is definitely realized. The many-worlds interpretation “requires a radical overhaul of many fundamental concepts and intuitions,” Galley said. What’s more, some say that there is no coherent way to connect an observer before a split to the same individual afterward, and so it is logically unclear what it means for an observer to apply the Born rule to make a prediction “before the event.” For such reasons, many-worlds derivations of the Born rule are not widely accepted.

Masanes and colleagues have now set out an argument that does not require Gleason’s assumptions, let alone many universes, to derive the Born rule. While the rule is typically presented as an add-on to the basic postulates of quantum mechanics, they show that the Born rule follows from those postulates themselves once you admit that measurements generate unique outcomes. That is, if you grant the existence of quantum states, along with the “classical” experience that just one of them is actually observed, you’ve no choice but to square the wave function to connect the two. “Our result shows that not only is the Born rule a good guess, but it is the only logically consistent guess,” Masanes said.

To reach that conclusion, we just need a few basic assumptions. The first is that quantum states are formulated in the usual way: as vectors, possessing both a size and a direction. It’s not that different from saying that each place on Earth can be represented as a point assigned a longitude, latitude and altitude.

The next assumption is also a completely standard one in quantum mechanics: So long as no measurement is made on a particle, it changes in time in a way that is said to be “unitary.” Crudely speaking, this means that the changes are smooth and wavelike, and they preserve information about the particle. This is exactly the behavior that the Schrödinger equation prescribes, and it is in fact unitarity that makes measurement such a headache — because measurement is a non-unitary process, often dubbed the “collapse” of the wave function. In a measurement, only one of several potential states is observed: Information is lost.
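As a toy numerical illustration of these first two assumptions — and of the Born rule they are meant to explain — here is a minimal sketch (our own, not code from the paper): a qubit state vector is evolved unitarily, which preserves its norm, and a measurement outcome is then sampled with Born-rule probabilities.

    import numpy as np

    # Assumption 1: a quantum state is a vector -- here a unit vector in C^2.
    psi = np.array([1.0, 0.0], dtype=complex)

    # Assumption 2: between measurements the state changes unitarily.
    theta = np.pi / 3
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]], dtype=complex)
    psi = U @ psi

    # Unitarity preserves the norm: no information is lost during evolution.
    assert np.isclose(np.linalg.norm(psi), 1.0)

    # Measurement: the Born rule turns amplitudes into outcome probabilities.
    probs = np.abs(psi) ** 2            # here [cos^2(theta), sin^2(theta)]
    outcome = np.random.choice([0, 1], p=probs)

    # The "collapse" is non-unitary: the state jumps to a basis vector, and
    # the pre-measurement amplitudes cannot be recovered from it.
    psi = np.eye(2, dtype=complex)[outcome]

The result of Masanes and colleagues is, in effect, that once the first two steps and the uniqueness of the sampled outcome are granted, the line probs = np.abs(psi) ** 2 is the only consistent choice.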

The researchers also assume that, for a system of several parts, how you group those parts should make no difference to a measurement outcome. “This assumption is so basic that it is in some sense a precondition of any reasoning about the world,” Galley said. Suppose you have three apples. “If I say, ‘There are two apples on the right and one on the left,’ and you say, ‘There are two apples on the left and one on the right,’ then these are both valid ways of describing the apples. The fact of where we place the dividing line of left and right is a subjective choice, and these two descriptions are equally correct.”
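In the standard formalism this grouping assumption corresponds to the associativity of the tensor product used to compose subsystems — our gloss on the apples analogy, not notation from the article:

$$ (\mathcal{H}_A \otimes \mathcal{H}_B) \otimes \mathcal{H}_C \cong \mathcal{H}_A \otimes (\mathcal{H}_B \otimes \mathcal{H}_C), $$

so no prediction may depend on where the dividing line between subsystems is drawn.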

The final assumption embraces measurement itself — but in the most minimal sense conceivable. Simply, a given measurement on a quantum system must produce a unique outcome. There’s no assumption about how that happens: how the quantum formalism must be used to predict the probabilities of the outcomes. Yet the researchers show that this process has to follow the Born rule if the postulate about uniqueness of measurement is to be satisfied. Any alternatives to the Born rule for deriving probabilities of observed outcomes from the wave function won’t satisfy the initial postulates.

The result goes further than this: It could also clear up what the measurement machinery of quantum mechanics is all about. In short, there’s a whole technical paraphernalia of requirements in that mechanism: mathematical functions called Hermitian operators that “operate on” the wave function to produce things called eigenvalues that correspond to measurement probabilities, and so on. But none of that is assumed from the outset by Masanes and colleagues. Rather, they find that, like the Born rule, all of these requirements are implicit in the basic assumptions and aren’t needed as extras.

“We just assume that there are questions, and when asked these return a single answer with some probability,” Galley said. “We then take the formalism of quantum theory and show that the only questions, answers and probabilities are the quantum ones.”

The work can’t answer the troublesome question of why measurement outcomes are unique; rather, it makes that uniqueness axiomatic, turning it into part of the very definition of a measurement. After all, Galley said, uniqueness “is required for us to be able to even begin to do science.”

However, what qualifies as a “minimal” assumption in quantum theory is rarely if ever straightforward. Araújo thinks that there may be more lurking in these assumptions than meets the eye. “They go far beyond assuming that a measurement exists and has a unique outcome,” he said. “Their most important assumption is that there is a fixed set of measurements whose probabilities are enough to completely determine a quantum state.” In other words, it’s not just a matter of saying measurements exist, but of saying that measurements — with corresponding probabilities of outcomes — are able to tell you everything you can know. That might sound reasonable, but it is not self-evidently true. In quantum theory, few things are.

So while Araújo calls the paper “great work,” he adds, “I don’t think it really explains the Born rule, though, any more than noticing that without water we die explains what water is.” And it leaves hanging another question: Why does the Born rule only specify probabilities, and not definite outcomes?

Law Without Law

The project pursued here is one that has become popular with several researchers exploring the foundations of quantum mechanics: to see whether this seemingly exotic but rather ad hoc theory can be derived from some simple assumptions that are easier to intuit. It’s a program called quantum reconstruction.

Cabello has pursued that aim too, and has suggested an explanation of the Born rule that is similar in spirit but different in detail. “I am obsessed with finding the simplest picture of the world that enforces quantum theory,” he said.

His approach starts with the challenging idea that there is in fact no underlying physical law that dictates measurement outcomes: Every outcome may take place so long as it does not violate a set of logical-consistency requirements that connect the outcome probabilities of different experiments. For example, let’s say that one experiment produces three possible outcomes (with particular probabilities), and a second independent experiment produces four possible outcomes. The two experiments together then have three times four, or 12, possible outcomes, which form a particular, mathematically defined set of combined possibilities.

Such a lawless reality sounds like an unlikely recipe for producing a quantitatively predictive theory like quantum mechanics. But in 1983 the American physicist John Wheeler proposed that statistical regularities in the physical world might emerge from such a situation, as they sometimes do from unplanned crowd behavior. “Everything is built higgledy-piggledy on the unpredictable outcomes of billions upon billions of elementary quantum phenomena,” Wheeler wrote. But there might be no fundamental law governing those phenomena — indeed, he argued, that was the only scenario in which we could hope to find a self-contained physical explanation, because otherwise we’re left with an infinite regression in which any fundamental equation governing behavior needs to be accounted for by some even more fundamental principle. “In contrast to the view that the universe is a machine governed by some magic equation, … the world is a self-synthesizing system,” Wheeler argued. He called this emergence of the lawlike behavior of physics “law without law.”

Cabello finds that, if measurement outcomes are constrained to obey the behaviors seen in quantum systems — where for example certain measurements can be correlated in ways that make them interdependent (entangled) — they must also be prescribed by the Born rule, even in the absence of any deeper law that dictates them.

“The Born rule turns out to be a logical constraint that should be satisfied by any reasonable theory we humans can construct for assigning probabilities when there is no law in the physical reality governing the outcomes,” Cabello said. The Born rule is then dictated merely by logic, not by any underlying physical law. “It has to be satisfied the same way as the rule that the probabilities must be between 0 and 1,” Cabello said. The Born rule itself, he said, is thus an example of Wheeler’s “law without law.”

But is it really that? Araújo thinks that Cabello’s approach doesn’t sufficiently explain the Born rule. Rather, it offers a rationale for which quantum correlations (such as those seen in entanglement) are allowed. And it doesn’t eliminate all possible laws governing them, but only those that are forbidden by the consistency principles. “Once you’ve determined which are the forbidden ones, everything that remains is allowed,” Araújo said. So it could be lawless down there in the quantum world — or there could be some other self-consistent but still law-bound principle behind what we see.

Any Possible Universe

Although the two studies pull out the Born rule from different origins, the results are not necessarily inconsistent, Cabello said: “We simply have different obsessions.” Masanes and colleagues are looking for the simplest set of axioms for constructing the operational procedures of quantum mechanics — and they find that, if measurement as we know it is possible at all, then the Born rule doesn’t need to be added in separately. There’s no specification of what kind of underlying physical reality gives rise to these axioms. But that underlying reality is exactly where Cabello starts from. “In my opinion, the really important task is figuring out which are the physical ingredients common to any universe in which quantum theory holds,” he said. And if he’s right, those ingredients lack any deep laws.

Evidently that remains to be seen: Neither of these papers will settle the matter. But what both studies have in common is that they aim to show how at least some of the recondite, highly mathematical and apparently rather arbitrary quantum formalism can be replaced with simple postulates about what the world is like. Instead of saying that “probabilities of measurement outcomes are equal to the modulus squared of the wave function,” or that “observables correspond to eigenvalues of Hermitian operators,” it’s enough to say that “measurements are unique” or that “no fundamental law governs outcomes.” It might not make quantum mechanics seem any less strange to us, but it could give us a better chance of understanding it.




Author(s): Yanwu Gu, Weijun Li, Michael Evans, and Berthold-Georg Englert

The data of four recent experiments—conducted in Delft, Vienna, Boulder, and Munich with the aim of refuting nonquantum hidden-variables alternatives to the quantum-mechanical description—are evaluated from a Bayesian perspective of what constitutes evidence in statistical data. We find that each of...


[Phys. Rev. A 99, 022112] Published Wed Feb 13, 2019

Quantum 'spookiness' explained

Quantum 'spookiness' explained, Published online: 13 February 2019; doi:10.1038/d41586-019-00312-9

Chen, Eddy Keming (2019) Time's Arrow in a Quantum Universe: On the Status of Statistical Mechanical Probabilities. [Preprint]

Author(s): Zhen-Peng Xu and Adán Cabello

Measurement incompatibility is the most basic resource that distinguishes quantum from classical physics. Contextuality is the critical resource behind the power of some models of quantum computation and is also a necessary ingredient for many applications in quantum information. A fundamental probl...


[Phys. Rev. A 99, 020103(R)] Published Tue Feb 12, 2019

Lazarovici, Dustin (2019) On the measurement process in Bohmian mechanics (with a reply to Gao). [Preprint]

Authors: Shahabeddin Mostafanazhad Aslmarand, Warner A. Miller, Paul M. Alsing, Verinder S. Rana

Quantum correlation is a key resource for quantum computation. It is recognized that a scalable measure of quantum correlation is important for the field of quantum information processing. We propose a new measure of correlation for a quantum network. Our measure is based on Wheeler's It-from-Bit framework instead of It-from-Qubit. Our measure is an emergent geometric representation of quantum entanglement that arises from a sequence of local measurements, i.e. in Wheeler's words from "irreversible acts of amplification." We can form a joint probability distribution from the binary outcomes of these repeated measurements over an ensemble of identical quantum states without internal inconsistencies. In this sense we have an emergent information geometry based within the space of measurements. We use a well-known information-geometry-based distance that applies to a string of measurement outcomes, and a novel generalization of these lengths to areas, volumes and higher-dimensional volumes. Using these areas and volumes we define a curvature measure. Our curvature measure is a monotonic function of correlation and is an indicator of the degree of correlation in the quantum network. This novel approach enables us to capture the degree of correlation of a quantum system and, with well-chosen measurements, could provide a scalable measure for identifying entanglement resources in higher-dimensional qudit networks.

Authors: Justine Tarrant, Geoff Beck, Sergio Colafrancesco

Planck stars form when a collapsing shell of matter within a black hole reaches the Planck density, roughly equivalent to the mass being compressed into a volume near the size of a proton, and rebounds outwards. These Planck stars have been considered as accounting for both fast radio bursts and short gamma-ray bursts, whilst offering a comparatively low-energy perspective on quantum gravity. The observation of such an event would require black hole masses much smaller than a solar mass, which could be provided by primordial black hole dark matter models. We discuss the low-energy isotropic background emissions produced by decaying primordial black holes at all epochs and derive constraints from the spectrum of the extragalactic background light. We find that, in order to avoid exceeding known extragalactic background light emissions, we must restrict the total energy emitted at low frequencies by a Planck star exploding in the present epoch to be less than $10^{13}$ erg, or restrict the primordial black hole population far below any existing limits. This casts doubt on whether exploding Planck stars could actually account for fast radio bursts, as is speculated in the literature.

Rovelli, Carlo (2019) Natural discrete differential calculus in physics. [Preprint]
Butterfield, Jeremy (2018) Lost in Math? [Preprint]
Butterfield, Jeremy and Marsh, Brendan (2019) Non-locality and quasiclassical reality in Kent's formulation of relativistic quantum theory. [Preprint]
Wuthrich, Christian (2017) Quantum gravity from general relativity. [Preprint]
Brungardt, John G. (2019) World Enough and Form: Why Cosmology Needs Hylomorphism. Synthese. pp. 1-33. ISSN 1573-0964
de Ronde, Christian and Massri, Cesar (2019) Against the Tyranny of 'Pure States' in Quantum Theory. [Preprint]
Gao, Shan (2019) A contradiction in Bohm's theory. [Preprint]
Byrd, Nick (2019) What We Can (And Can’t) Infer About Implicit Bias From Debiasing Experiments. [Preprint]

Author(s): Małgorzata Bartkiewicz, Andrzej Grudka, Ryszard Horodecki, Justyna Łodyga, and Jacek Wychowaniec

Among the many implications emerging from solutions of Einstein's general relativity equations are closed timelike curves (CTCs), which are trajectories through space-time that allow for time travel to the past without exceeding the speed of light. Two main quantum models of computation with the us...


[Phys. Rev. A 99, 022304] Published Tue Feb 05, 2019

Priyamvada Natarajan is a leader in the effort to map the universe’s invisible contents, which is to say, almost everything. Ninety-five percent of all stuff takes mysterious, nonluminous forms dubbed dark matter and dark energy, which betray their presence in the cosmos by attracting and repelling, respectively, the 5 percent of stuff that’s visible. Even that 5 percent is increasingly slipping out of sight, as stars and gas tumble into gargantuan black holes at the centers of galaxies.

Natarajan, a theoretical astrophysicist and professor at Yale University, creates maps of where dark matter is clumped and how dark energy stretches space. She also models the growth of those supermassive black holes and helped develop the leading theory of their formation in the early universe — a theory that will be tested by telescope observations in the near future.

In a recent interview in her office in New Haven, Natarajan said she’s drawn to maps because they encode what’s known, and what’s unknown, at a point in time. “These abysses you know nothing about — that’s also written out in a map; they tell you, ‘terra incognita,’” she said, pulling up JPEGs of old maps she likes to visit across campus at the Beinecke Rare Book & Manuscript Library.

She has been mapping and modeling modern cosmology’s terra incognita, especially dark matter and supermassive black holes, since her student days at Cambridge University in the 1990s. After playing a pioneering role in those pursuits, she landed a professorship at Yale in her late 20s and has been based there ever since. She is also a professor at the University of Copenhagen and an honorary one at the University of Delhi. A member of the Royal Astronomical Society, the American Physical Society and the Explorers Club, and a published poet, Natarajan has received many fellowships and awards, including several from her home country, India.

We talked for three hours in her Yale office, surrounded by reproductions of abstract art — Rothko, Matisse, Louise Bourgeois — which she calls her “major, major love.” An edited and condensed version of our conversation follows.

Your 2017 book, Mapping the Heavens, narrates the quest to map the cosmos from ancient to modern times. I gather that your own path began with a map.

Yes. I grew up in India. My parents are academics, so I grew up around books. I loved science and math. But I was also interested in history, writing, poetry, art. My parents provided me with advantages that made a big difference. My dad bought me a Commodore 64 before people in India knew what a personal computer was. My parents had also given me a telescope and microscope; I picked the telescope. I was part of an amateur astronomy club, and when I was 15, the director of the Nehru Planetarium, the astrophysicist Nirupama Raghavan, came to speak to us. I told her I had a computer and asked if I could help with her research. So she said, “What are you interested in?” The thing that I have always been crazy about is maps. Celestial, terrestrial, any kind of map.

So she said, “Why don’t you write a program to plot out the sky map that you see in the newspaper every month, produced by the astronomical society of India?” So I went home. It was a really hard problem; she told me later she didn’t think I would come back. I had to teach myself spherical geometry. I worked like crazy. Six weeks later, I figured it out; my program worked. So I went to see her. I showed her the sky maps I had made, and basically her jaw dropped. Then she said, “OK, this is very impressive, but what if you go to Boston to study, and you want to look at the stars and planets?” I said, “Oh, I sorted that out. The way I’ve written the program is you can put the latitude and longitude of anywhere on Earth.” At that point she had a conversion moment; she became incredibly supportive of me. I would do computations for her. So I got my first taste of research.

You received a scholarship to MIT. Did you know what you wanted to study?

I wanted to be a physicist, no question. But I wasn’t sure which area. I finished all the graduate courses in physics while I was still an undergraduate, and I did a math major as well. So I was ready to do research at that point. I started working with Alan Guth on thermodynamics of the early universe. I was thinking maybe particle physics. I didn’t want to do string theory because I wanted to make a connection with the real world. So I applied to graduate school in physics, but I was undecided, so I deferred.

You then started a Ph.D. in the philosophy of science, which you wanted to pursue in conjunction with physics. Why?

I wanted to be the kind of person who does cutting-edge science and also thinks deeply about the process of science. I was going to be both the insider and outsider. This is a theme that has haunted my life, always, in every which way — personal, emotional, psychological, scientific, intellectual. This feeling of being an outsider and occasionally feeling like an insider. The conflict.

Why do you think you feel that way?

Being female; being brown; being interested in physics; being highly intellectual in this particular way. And because of the cultural transitions that I had made. One of my professors, Evelyn Fox Keller, put it very nicely — that it took me a very long time to find my tribe. I didn’t feel lost, but I felt alone. I still feel it in a lot of ways. There are situations in which I really feel that I don’t belong. And re-entering the same setting at another time, I have felt completely at ease, and I feel like I belong. It’s very weird. I think it’s a deeply psychological thing. I left home very young. And there was this real need to want to be connected.

How would you describe your tribe?

People who have many serious interests that they intellectually engage in. People who are not solely careerist. My game — cosmology, dark matter, black holes — has a very particular competitive culture that I don’t fit into. But thankfully this is the thing that time does. If you stick with it, you do good work, then you don’t have to conform; you can eventually just be who you are. I always felt that I had a very special clarity of mind because I knew what I wanted out of what I was learning. I want a certain depth of understanding that comes with people who think mathematically.

What do you want to understand?

I am attracted to certain very particular kinds of abstraction. We all have our pet things that somehow we gravitate toward, and for me it’s always been these invisible entities: black holes, dark matter, these things that are almost at the limits of our knowledge. All physics breaks down when you reach the edge of a black hole. So it sort of seduces me. These are the things that really push us as scientists: How can we model them? How do we think about them? And as we know more: How can we refine our model? When you improve it, does it mean the thing you had before was wrong? How is a model related to reality? That was going to be the theme of my philosophy Ph.D.: How do you build knowledge?

You decided not to continue with philosophy once you began your Ph.D. in astrophysics with Martin Rees at Cambridge. What drew you to him?

What he did was model. He made models for black holes, gamma-ray bursts, galaxies. Anywhere there was an exciting observation that needed an explanation, you could just build a model. So that was the perfect intellectual match for me.

At that time the observational data of supermassive black holes was just starting to trickle in. So we were building storylines for how to grow a black hole and so on, and there weren’t too many fences. There was a little bit of observational guidance, but you could run a little wilder with the models.

Black holes maybe weren’t as conceptualized as they are now.

They were conceptualized and thought about, but their role was not seen as so central. Now we know that they seem to really be shaping galaxies. So they’ve moved to center stage.

How do you think about black holes? 

They’re crazy objects, no question; they’re bizarre. There are three ways to think about them, and you can choose. One way is that stars, when they exhaust their fuel, have a violent end, and they leave behind — like a dead nuclear reactor — these black holes. So black holes are compact inner parts of the stars that have gravitationally collapsed and have become unbelievably dense. There’s no analogue. It’s not lead; it’s nothing we can think of. Then these stellar remnants build up. Gas falls in. They become bigger.

Another way is to think about the fact that not even light can escape from a black hole. If you want to launch a rocket that escapes the gravitational grip of the Earth, you have to shoot it out at 11.2 kilometers per second. That’s about 33 times the speed of sound, so it’s pretty fast. Now imagine a rocket going out at the speed of light, 300,000 kilometers per second, and it still can’t escape, because the gravitational grip is so strong. That’s a black hole.
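For reference, the numbers in this picture follow from the standard escape-velocity formula (our addition, not the interview's):

$$ v_{\rm esc} = \sqrt{\frac{2GM}{R}} \approx 11.2\ {\rm km/s}\quad \text{for Earth}, $$

and setting $v_{\rm esc} = c$ gives the Schwarzschild radius $R_s = 2GM/c^2$ — about 3 km for one solar mass: compress a body inside that radius and not even light escapes.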

The third way is if you picture space-time as a sheet, then a black hole is a pinch in that sheet. An anomaly in the shape of space.

And we’ve gradually realized that supermassive black holes are vital to galaxies?

Yes. My thesis started the work of trying to work out that relationship. What had just been observed was that there’s a correlation between the mass of the stars in the inner part of a galaxy and the central black hole that it hosts. That suggests that somehow the formation of galaxies and black holes is intertwined. We were developing the early framework for how to tie together their growth. It was very exciting, because you could build simple models that could make really powerful predictions. Then, over time, these models got more and more sophisticated because you got more data that constrained you.

I worked on that, and also on gravitational lensing — the bending of light by dark matter. The Hubble Telescope gave us these amazing lensing images that you could analyze. You could indirectly map the presence of dark matter from the amount of stretching that you see in light from different galaxies. So the limits of our knowledge today, the terra incognita, is in that Hubble map. My original contribution was figuring out a mathematical mapping that would allow you to pick out the little clumps of dark matter. What we found is that the dark matter clumpiness in the universe matches extremely well with expectations for “cold” dark matter — the noninteracting kind.

At Yale, you started studying how supermassive black holes might have formed in galaxies’ centers in the first place. First explain what the puzzle is.

As we got better and better telescopes, we started finding these bright quasars powered by billion-solar-mass black holes when the universe was 10 percent of its current age. That was a huge mystery. The first stars were forming at around the same time, so you don’t have enough time for a stellar remnant, which is at most 100 times the mass of the sun, to grow to a billion times the mass of the sun.
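A rough version of the timing argument, with standard numbers (our sketch, not the interview's): a black hole accreting at the Eddington limit grows exponentially, $M(t) = M_0\,e^{t/\tau}$, with an e-folding (Salpeter) time $\tau \approx 50$ Myr for a radiative efficiency of about 10 percent. Growing from a $10^2\,M_\odot$ stellar remnant to $10^9\,M_\odot$ then takes

$$ t \approx \tau \ln\!\left(10^9/10^2\right) \approx 16\,\tau \approx 0.8\ {\rm Gyr} $$

of uninterrupted maximal accretion — comparable to the entire age of the universe at the epoch of the earliest known quasars.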

In 2005, 2006, I wrote the first two papers with a postdoc of mine on “massive seeds.” Our paper said you can bypass the formation of a star and form a very massive seed black hole — about 10,000 to one million times the mass of the sun — to explain these quasars. The simple idea is you’re in a bathtub and you pull the plug; you see the water going into a vortex. Something very similar happens in the early universe with gas disks. An instability occurs — the equivalent of pulling up the plug — and gas siphons very rapidly into the center. What we did was to integrate these “direct-collapse black hole seeds” into the larger cosmic picture, following a population of black holes — as they form, evolve, become quasars, turn off, shine — until today.

It’s paradoxical that we have a theory, the “standard model of cosmology,” that perfectly captures the evolution of the universe, even though 95 percent of its contents are unknown. How do you explain that to people?

Suppose you are on a beach, looking at the sand dunes. They dissipate, re-form, get swallowed by the ocean. The dynamics of how the sand dunes evolve is something that you understand very well because you understand the wind and the water. But you don’t actually know what a grain of sand is made of. That’s where we are at the moment. Astrophysics can tell you the granularity of dark matter, how dark matter clumps. That gives you some understanding. It rules out candidates. It’s like saying that the grains of sand are not rocks; they’re not pebbles; they’re very fine. But what the grain is made of is going to come from particle physics experiments.

You prefer the level of sand dunes rather than the fundamental level.

Yes. That said, a mismatch between theory and observation at the astrophysical level might point the way to some fundamental ingredient that is missing in your theory. The hierarchy of what’s fundamental, secondary, what is deeper — the layers get mixed up. There are fundamental questions, but the observational data has structure on many different levels, and you have to connect phenomena that occur on all those scales. Then it’s not about a hierarchical organizing of what’s the deepest problem, what’s less so, but rather the interconnections.

Are you continuing your insider-outsider philosophical project?

That training has made me a better scientist because I’m very aware of what I’m actually doing when I’m building a model. When I’m taking a complex physical phenomenon and paring it down, I’m aware of what I have labeled as extraneous and what I have labeled as essential, and why — whether it’s intuition, training, something about the mathematics. It’s a different level of engagement with the process. There is an extra enhancement in life for people who are extremely self-aware.

I’m also very interested in watching, as an insider, my own experiences, and as an outsider, how the group of scientists operates. How we’re supposed to be open-minded, but we’re not. And what totally gives me the kicks is understanding this process of how a radically new idea gets accepted.

The direct-collapse idea is now regarded as the leading theory of supermassive black hole formation in the early universe. What’s that growing consensus based on?

There are growing little bits of evidence in support for such a picture. For instance, in a computer simulation, we start with direct-collapse black holes, populate the early universe with them, and then propagate the growth of those galaxies until today, and match them with the quasars that are seen.

Hubble’s successor, the James Webb Space Telescope, now set to launch in 2021, will peer deep enough into space and back in time to glimpse galaxies forming in the early universe. Those observations will test the direct-collapse idea, right?

Yes. We realized something totally cool: Because the direct-collapse seed started out as an outsize object, the starlight from the surrounding stars will be dim compared to the light generated by gas falling into it. And lucky for us, James Webb is perfectly pitched to see this. Our claim is that if the telescope sees any quasar during the earliest epochs of the universe, it has got to be one of these direct-collapse black holes.

It must have been heartbreaking to see its launch pushed back yet again.

Delayed gratification! In a way, I’m enjoying this period of not knowing because there’s a possibility that these massive seeds don’t exist, and that would be heartbreaking too. I’ve told myself, “OK, Priya, if they don’t find them, then you’ve still got to count yourself lucky to be born at a time when, within one lifetime, you could have learned all about black holes, come up with an original idea, made a prediction, and someone took it seriously enough and tested it. OK, it was not right. But all of this unfolding within your lifetime — think how cool that is!” So I’m trying to tell myself that even that would be a supercool outcome. But finding them would be just so awesome.




Noise put to use

Noise put to use, Published online: 04 February 2019; doi:10.1038/s41567-019-0442-7

Through stochastic resonance, noise-driven fluctuations make an otherwise weak periodic signal accessible. Experiments have now reported quantum stochastic resonance, which arises from intrinsic quantum fluctuations rather than external noise.
Maxwell, Nicholas (2019) Aim-Oriented Empiricism and the Metaphysics of Science. [Preprint]

Authors: Bhavya Bhatt, Manish Ram Chander, Raj Patil, Ruchira Mishra, Shlok Nahar, Tejinder P. Singh

We recall that in order to obtain the classical limit of quantum mechanics one needs to take the $\hbar\rightarrow 0$ limit. In addition, one also needs an explanation for the absence of macroscopic quantum superposition of position states. One possible explanation for the latter is the Ghirardi-Rimini-Weber (GRW) model of spontaneous localisation. Here we describe how spontaneous localisation modifies the path integral formulation of density matrix evolution in quantum mechanics. (Such a formulation has been derived earlier by Pearle and Soucek; we provide two new derivations of their result). We then show how the von Neumann equation and the Liouville equation for the density matrix arise in the quantum and classical limit, respectively, from the GRW path integral. Thus we provide a rigorous demonstration of the quantum to classical transition.
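For orientation, the standard GRW parameter values (not specific to this paper) show why spontaneous localisation is negligible for single particles yet immediate for macroscopic bodies: each particle collapses at a rate $\lambda \approx 10^{-16}\,{\rm s}^{-1}$, i.e. about once per hundred million years, but an object of $N \approx 10^{23}$ constituents suffers collapses at the amplified rate

$$ N\lambda \approx 10^{23} \times 10^{-16}\,{\rm s^{-1}} = 10^{7}\,{\rm s^{-1}}, $$

suppressing macroscopic position superpositions within a fraction of a microsecond.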

Authors: Christian Borghesi

In this paper we suggest a macroscopic toy system in which a potential-like energy is generated by a non-uniform pulsation of the medium (i.e. pulsation of the transverse standing oscillations that the elastic medium of the system tends to support at each point). This system is inspired by walking-droplet experiments with submerged barriers. We first show that a Poincar\'e-Lorentz covariant formalization of the system leads to inconsistency and contradiction. The contradiction is resolved by using a generally covariant formulation and by assuming a relation between the metric associated with the elastic medium and the pulsation of the medium. (Calculations are performed in a Newtonian-like metric, constant in time). We find ($i$) an effective Schr\"odinger equation with external potential, ($ii$) an effective de Broglie-Bohm guidance formula and ($iii$) an energy of the `particle' which has a direct counterpart in general relativity as well as in quantum mechanics. We analyze the wave and the `particle' in an effective free fall and with a harmonic potential. This potential-like energy is an effective gravitational potential, rooted in the pulsation of the medium at each point. The latter, also conceivable as a natural clock, makes it easy to understand why proper time varies from place to place.

Authors: Veronika Baumann, Časlav Brukner

In a joint paper Jeff Bub and Itamar Pitowsky argued that the quantum state represents `the credence function of a rational agent [...] who is updating probabilities on the basis of events that occur'. In the famous thought experiment designed by Wigner, Wigner's friend performs a measurement in an isolated laboratory which in turn is measured by Wigner. Here we consider Wigner's friend as a rational agent and ask what her `credence function' is. We find experimental situations in which the friend can convince herself that updating the probabilities on the basis of events that happen solely inside her laboratory is not rational and that conditioning needs to be extended to the information that is available outside of her laboratory. Since the latter can be transmitted into her laboratory, we conclude that the friend is entitled to employ Wigner's perspective on quantum theory when making predictions about the measurements performed on the entire laboratory, in addition to her own perspective, when making predictions about the measurements performed inside the laboratory.

Author(s): Saronath Halder, Manik Banik, Sristy Agrawal, and Somshubhro Bandyopadhyay

Quantum nonlocality is usually associated with entangled states by their violations of Bell-type inequalities. However, even unentangled systems, whose parts may have been prepared separately, can show nonlocal properties. In particular, a set of product states is said to exhibit “quantum nonlocalit...


[Phys. Rev. Lett. 122, 040403] Published Fri Feb 01, 2019

Physics Today, Volume 72, Issue 2, Page 50-51, February 2019.
More than a century after the birth of quantum mechanics, physicists and philosophers are still debating what a “measurement” really means.

Abstract

In this paper, we describe four broad ‘meta-methods’ (as we shall call them) employed in scientific and philosophical research of qualia. These are the theory-centred meta-method, the property-centred meta-method, the argument-centred meta-method, and the event-centred meta-method. Broadly speaking, the theory-centred meta-method is interested in the role of qualia as some theoretical entities picked out by our folk psychological theories; the property-centred meta-method is interested in some metaphysical properties of qualia that we immediately observe through introspection (e.g., intrinsic, non-causal, ineffable); the argument-centred meta-method is interested in the role of qualia in some arguments for non-physicalism; the event-centred meta-method is interested in the role of qualia as some natural events whose nature is hidden and must be uncovered empirically. We will argue that the event-centred meta-method is the most promising route to a comprehensive scientific conception of qualia because of the flexibility of ontological and methodological assumptions it can provide. We also reveal the hidden influences of the different meta-methods and in doing so show why consideration of meta-methods has value for the study of consciousness.

Can quantum ideas explain chemistry’s greatest icon?

Can quantum ideas explain chemistry’s greatest icon?, Published online: 30 January 2019; doi:10.1038/d41586-019-00286-8

Simplistic assumptions about the periodic table lead us astray, warns Eric Scerri.

Abstract

The term ‘locality’ is used in different contexts with different meanings. There have been claims that relational quantum mechanics is local, but it is not clear then how it accounts for the effects that go under the usual name of quantum non-locality. The present article shows that the failure of ‘locality’ in the sense of Bell, once interpreted in the relational framework, reduces to the existence of a common cause in an indeterministic context. In particular, there is no need to appeal to a mysterious space-like influence to understand it.

Abstract

Gao (Synthese, 2017. https://doi.org/10.1007/s11229-017-1476-y) presents a new mentalistic reformulation of the well-known measurement problem affecting the standard formulation of quantum mechanics. According to this author, it is essentially a determinate-experience problem, namely a problem about the compatibility between the linearity of Schrödinger’s equation, the fundamental law of quantum theory, and the definite experiences perceived by conscious observers. In this essay I aim to clarify (i) that the well-known measurement problem is a mathematical consequence of quantum theory’s formalism, and (ii) that its mentalistic variant does not grasp the relevant causes which are responsible for this puzzling issue. The first part of this paper concludes by claiming that the “physical” formulation of the measurement problem cannot be reduced to its mentalistic version. In the second part of this work it will be shown that, contrary to the case of quantum mechanics, Bohmian mechanics and GRW theories provide clear explanations of the physical processes responsible for the definite localization of macroscopic objects and, consequently, for well-defined perceptions of measurement outcomes by conscious observers. More precisely, the macro-objectification of states of experimental devices is obtained exclusively in virtue of their clear ontologies and dynamical laws, without any intervention of human observers. Hence, it will be argued that in these theoretical frameworks the measurement problem and the determinate-experience problem are logically distinct issues.

Quantum Space


Physics

on 2019-1-22 12:00am GMT
Author: Jim Baggott
ISBN: 9780198809111
Binding: Hardcover
Publication Date: 22 January 2019
Price: $24.95

Author(s): Yakir Aharonov and Lev Vaidman

The possibility to communicate between spatially separated regions, without even a single photon passing between the two parties, is an amazing quantum phenomenon. The possibility of transmitting one value of a bit in such a way, the interaction-free measurement, has been known for quarter of a cent...


[Phys. Rev. A 99, 010103(R)] Published Fri Jan 18, 2019

American Journal of Physics, Volume 87, Issue 2, Page 95-101, February 2019.
We introduce a game to illustrate the principles of quantum mechanics using a qubit (or spin-first) approach, where students can experience and discover its puzzling features first-hand. Students play the roles of particles and scientists. The scientists unravel the underlying rules and properties by collecting and analysing data generated by observing particles that act according to the given rules. We show how this allows one to illustrate quantum states and their stochastic behavior under measurements, as well as quantum entanglement. In addition, we use this approach to illustrate and discuss decoherence and a modern application of quantum features, namely quantum cryptography. We have tested the game in the classroom and report on the results we obtained.
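
A minimal numerical sketch of the stochastic measurement behaviour the game conveys (an illustrative aid only, not the authors' classroom materials; the prepared state, seed, and trial count are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(seed=1)

    # Qubit prepared in |+> = (|0> + |1>)/sqrt(2); the Born rule gives p(0) = 1/2.
    state = np.array([1.0, 1.0]) / np.sqrt(2)
    p0 = abs(state[0]) ** 2

    # Each "particle" (student) answers 0 or 1 at random with the Born weights.
    outcomes = rng.random(1000) < p0
    print("fraction of '0' outcomes:", outcomes.mean())  # close to 0.5

    # After a measurement the state is updated (collapse), so repeating the
    # same measurement reproduces the first outcome with certainty.
    collapsed = np.array([1.0, 0.0]) if outcomes[0] else np.array([0.0, 1.0])
    p_repeat = abs(collapsed[0]) ** 2 if outcomes[0] else abs(collapsed[1]) ** 2
    print("probability of repeating the first outcome:", p_repeat)  # 1.0
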
Ambitious new theories dreamed up to explain reality have led us nowhere. Meet the hardcore physicists trying to think their way out of this black hole

Author(s): Davide Girolami

A key computational resource to prepare a quantum system in a target state, an important subroutine of quantum information protocols, is identified and quantified.


[Phys. Rev. Lett. 122, 010505] Published Fri Jan 11, 2019

Publication date: Available online 28 December 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Matthias Egg

Abstract

This paper critically assesses the proposal that scientific realists do not need to search for a solution of the measurement problem in quantum mechanics, but should instead dismiss the problem as ill-posed. James Ladyman and Don Ross have sought to support this proposal with arguments drawn from their naturalized metaphysics and from a Bohr-inspired approach to quantum mechanics. I show that the first class of arguments is unsuccessful, because formulating the measurement problem does not depend on the metaphysical commitments which are undermined by ontic structural realism, rainforest realism, or naturalism in general. The second class of arguments is problematic due to its refusal to provide an analysis of the term “measurement”. It turns out that the proposed dissolution of the measurement problem is in conflict not only with traditional forms of scientific realism but even with the rather minimal realism that Ladyman and Ross themselves defend. The paper concludes with a brief discussion of two related proposals: Healey's pragmatist approach and Bub's information-theoretic interpretation.

Volume 5, Issue 1, pages 11-15

Claus Kiefer [Show Biography]

Claus Kiefer studied physics and astronomy at the Universities of Heidelberg and Vienna. He earned his PhD from Heidelberg University in 1988 under the supervision of H.-Dieter Zeh. He has held positions at the Universities of Heidelberg, Zurich, and Freiburg, and has been a professor at the University of Cologne since 2001. His main interests are quantum gravity, cosmology, black holes, and the foundations of quantum theory. He has published several books, including the monograph “Quantum Gravity” (third edition: Oxford 2012). He has been a member of the Foundational Questions Institute, USA, since 2006.

Full Text Download (81k)

Volume 5, Issue 1, pages 1-10

Valia Allori [Show Biography]

Valia Allori studied physics and philosophy first in Italy, her home country, and then in the United States. She is currently Associate Professor in the Philosophy Department at Northern Illinois University, where she works in the foundations of physics, with special focus on quantum mechanics. Her main concern has always been to understand what the world is really like, and how we can use our best physical theory to answer such general metaphysical questions. In her physics doctoral dissertation from the University of Genova (Italy), she discussed the classical limit of quantum mechanics, to analyze the connections between the quantum and the classical theories. What does it mean that a theory, in a certain approximation, reduces to another? Is the classical explanation of macroscopic phenomena essentially different from the one provided by quantum mechanics? In her philosophy doctoral dissertation from Rutgers she turned to more general questions that involve the structure of fundamental physical theories, and the metaphysical status and epistemological role of the theoretical entities used in these theories. Do all fundamental physical theories have the very same structure, contrary to what one might think? If so, what does this tell us about the nature of explanation?

Full Text Download (539k)

Publication date: Available online 10 December 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Jeremy Steeger

Abstract

I defend an analog of probabilism that characterizes rationally coherent estimates for chances. Specifically, I demonstrate the following accuracy-dominance result for stochastic theories in the C*-algebraic framework: supposing an assignment of chance values is possible if and only if it is given by a pure state on a given algebra, your estimates for chances avoid accuracy-dominance if and only if they are given by a state on that algebra. When your estimates avoid accuracy-dominance (roughly: when you cannot guarantee that other estimates would be more accurate), I say that they are sufficiently coherent. In formal epistemology and quantum foundations, the notion of rational coherence that gets more attention requires that you never allow for a sure loss (or ‘Dutch book’) in a given sort of betting game; I call this notion full coherence. I characterize when these two notions of rational coherence align, and I show that there is a quantum state giving estimates that are sufficiently coherent, but not fully coherent.
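
Schematically (my gloss, using a quadratic, Brier-style inaccuracy measure; the paper's C*-algebraic definitions are more general): an estimate vector \(\vec{e}\) for the chances is accuracy-dominated when some rival \(\vec{e}\,'\) is at least as close to every admissible chance assignment \(\vec{v}\), and strictly closer to at least one,

\[ \sum_i \big(e'_i - v_i\big)^2 \;\le\; \sum_i \big(e_i - v_i\big)^2 \quad \text{for all admissible } \vec{v}. \]
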

Publication date: Available online 11 December 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Geoff Beck

Abstract

This work outlines the novel application of the empirical analysis of causation, presented by Kutach, to the study of information theory and its role in physics. The central thesis of this paper is that causation and information are identical functional tools for distinguishing controllable correlations, and that this leads to a consistent view, not only of information theory, but also of statistical physics and quantum information. This approach comes without the metaphysical baggage of declaring information a fundamental ingredient in physical reality, and it exorcises many of the otherwise puzzling problems that arise from this viewpoint, particularly obviating the problem of ‘excess baggage’ in quantum mechanics. This solution is achieved via a separation between the information-carrying causal correlations of a single qubit and the bulk of its state space.

Publication date: Available online 14 December 2018

Source: Physics Letters A

Author(s): Gregory S. Duane

Abstract

A classical origin for the Bohmian quantum potential, as that potential term arises in the quantum mechanical treatment of black holes and Einstein–Rosen (ER) bridges, can be based on 4th-order extensions of Einstein's equations. The required 4th-order extension of general relativity is given by adding quadratic curvature terms with coefficients that maintain a fixed ratio, as their magnitudes approach zero, with classical general relativity as a singular limit. If entangled particles are connected by a Planck-width ER bridge, as conjectured by Maldacena and Susskind, then a connection by a traversable Planck-scale wormhole, allowed in 4th-order gravity, describes such entanglement in the ontological interpretation. It is hypothesized that higher-derivative gravity can account for the nonlocal part of the quantum potential generally.


Author(s): Markus P. Müller

Recent research has hinted at the need for a family of thermodynamic second laws at the quantum scale, but a new analysis shows this isn’t always the case.


[Phys. Rev. X 8, 041051] Published Wed Dec 19, 2018

Abstract

We construct a local \(\psi\)-epistemic hidden-variable model of Bell correlations by a retrocausal adaptation of the originally superdeterministic model given by Brans. In our model, for a pair of particles the joint quantum state \(|\psi_e(t)\rangle\) as determined by preparation is epistemic. The model also assigns to the pair of particles a factorisable joint quantum state \(|\psi_o(t)\rangle\) which is different from the prepared quantum state \(|\psi_e(t)\rangle\) and has an ontic status. The ontic state of a single particle consists of two parts. First, a single-particle ontic quantum state \(\chi(\vec{x},t)|i\rangle\), where \(\chi(\vec{x},t)\) is a 3-space wavepacket and \(|i\rangle\) is a spin eigenstate of the future measurement setting. Second, a particle position in 3-space \(\vec{x}(t)\), which evolves via a de Broglie–Bohm type guidance equation with the 3-space wavepacket \(\chi(\vec{x},t)\) acting as a local pilot wave. The joint ontic quantum state \(|\psi_o(t)\rangle\) fixes the measurement outcomes deterministically, whereas the prepared quantum state \(|\psi_e(t)\rangle\) determines the distribution of the \(|\psi_o(t)\rangle\)’s over an ensemble. Both \(|\psi_o(t)\rangle\) and \(|\psi_e(t)\rangle\) evolve via the Schrödinger equation. Our model exactly reproduces the Bell correlations for any pair of measurement settings. We also consider ‘non-equilibrium’ extensions of the model with an arbitrary distribution of hidden variables. We show that, in non-equilibrium, the model generally violates no-signalling constraints while remaining local with respect to both ontology and interaction between particles. We argue that our model shares some structural similarities with the modal class of interpretations of quantum mechanics.
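
For orientation, the guidance law referred to here has the standard de Broglie–Bohm form (quoted as the textbook equation, not the paper's exact notation):

\[ \frac{d\vec{x}}{dt} \;=\; \frac{\hbar}{m}\,\operatorname{Im}\frac{\nabla \chi(\vec{x},t)}{\chi(\vec{x},t)}, \]

with the 3-space wavepacket \(\chi(\vec{x},t)\) acting as the local pilot wave.
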

Publication date: Available online 7 December 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Jonathan Bain

Abstract

Intrinsic topologically ordered (ITO) condensed matter systems are claimed to exhibit two types of non-locality. The first is associated with topological properties and the second is associated with a particular type of quantum entanglement. These characteristics are supposed to allow ITO systems to encode information in the form of quantum entangled states in a topologically non-local way that protects it against local errors. This essay first clarifies the sense in which these two notions of non-locality are distinct, and then considers the extent to which they are exhibited by ITO systems. I will argue that while the claim that ITO systems exhibit topological non-locality is unproblematic, the claim that they also exhibit quantum entanglement non-locality is less clear, and this is due in part to ambiguities associated with the notion of quantum entanglement. Moreover, any argument that claims some form of "long-range" entanglement is necessary to explain topological properties is incomplete if it fails to provide a convincing reason why mechanistic explanations should be favored over structural explanations of topological phenomena.

Publication date: Available online 30 November 2018

Source: Physics Letters A

Author(s): Atul Singh Arora, Kishor Bharti, Arvind

Abstract

We construct a non-contextual hidden variable model consistent with all the kinematic predictions of quantum mechanics (QM). The famous Bell–KS theorem shows that non-contextual models which satisfy a further reasonable restriction are inconsistent with QM. In our construction, we define a weaker variant of this restriction which captures its essence while still allowing a non-contextual description of QM. This is in contrast to the contextual hidden variable toy models, such as the one by Bell, and brings out an interesting alternate way of looking at QM. The results also relate to the Bohmian model, where it is harder to pin down such features.

Abstract

The PBR theorem gives insight into how quantum mechanics describes a physical system. This paper explores PBR’s general result and shows that it does not disallow the ensemble interpretation of quantum mechanics and maintains, as it must, the fundamentally statistical character of quantum mechanics. This is illustrated by drawing an analogy with an ideal gas. An ensemble interpretation of the Schrödinger cat experiment that does not violate the PBR conclusion is also given. The ramifications, limits, and weaknesses of the PBR assumptions, especially in light of lessons learned from Bell’s theorem, are elucidated. It is shown that, if valid, PBR’s conclusion specifies what types of ensemble interpretation are possible. The PBR conclusion would require a more direct correspondence between the quantum state (e.g., \(|\psi\rangle\)) and the reality it describes than might otherwise be expected. A simple terminology is introduced to clarify this greater correspondence.

Author(s): Igor Marinković, Andreas Wallucks, Ralf Riedinger, Sungkun Hong, Markus Aspelmeyer, and Simon Gröblacher

Researchers have experimentally demonstrated two cornerstones of quantum physics—entanglement and Bell inequality violations—with two macroscopic mechanical resonators.


[Phys. Rev. Lett. 121, 220404] Published Thu Nov 29, 2018

Quantum 2, 108 (2018).

https://doi.org/10.22331/q-2018-11-27-108

Thermodynamics is traditionally constrained to the study of macroscopic systems whose energy fluctuations are negligible compared to their average energy. Here, we push beyond this thermodynamic limit by developing a mathematical framework to rigorously address the problem of thermodynamic transformations of finite-size systems. More formally, we analyse state interconversion under thermal operations and between arbitrary energy-incoherent states. We find precise relations between the optimal rate at which interconversion can take place and the desired infidelity of the final state when the system size is sufficiently large. These so-called second-order asymptotics provide a bridge between the extreme cases of single-shot thermodynamics and the asymptotic limit of infinitely large systems. We illustrate the utility of our results with several examples. We first show how thermodynamic cycles are affected by irreversibility due to finite-size effects. We then provide a precise expression for the gap between the distillable work and work of formation that opens away from the thermodynamic limit. Finally, we explain how the performance of a heat engine gets affected when one of the heat baths it operates between is finite. We find that while perfect work cannot generally be extracted at Carnot efficiency, there are conditions under which these finite-size effects vanish. In deriving our results we also clarify relations between different notions of approximate majorisation.
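
The second-order asymptotics mentioned here have the generic small-deviation form (a schematic statement for orientation; consult the paper for the precise quantities and coefficients):

\[ R_n(\epsilon) \;\simeq\; R_\infty \;+\; \sqrt{\frac{V}{n}}\,\Phi^{-1}(\epsilon), \]

where \(R_n(\epsilon)\) is the optimal interconversion rate at infidelity \(\epsilon\) for system size \(n\), \(R_\infty\) the asymptotic rate, \(V\) a variance-like information quantity, and \(\Phi^{-1}\) the inverse of the standard Gaussian distribution function.
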

Publication date: Available online 23 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Kelvin J. McQueen, Lev Vaidman

Abstract

We defend the many-worlds interpretation of quantum mechanics (MWI) against the objection that it cannot explain why measurement outcomes are predicted by the Born probability rule. We understand quantum probabilities in terms of an observer's self-location probabilities. We formulate a probability postulate for the MWI: the probability of self-location in a world with a given set of outcomes is the absolute square of that world's amplitude. We provide a proof of this postulate, which assumes the quantum formalism and two principles concerning symmetry and locality. We also show how a structurally similar proof of the Born rule is available for collapse theories. We conclude by comparing our account to the recent account offered by Sebens and Carroll.
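
In symbols, the postulate is: if the universal state after a measurement is \(|\Psi\rangle = \sum_i \alpha_i\,|\mathrm{world}_i\rangle\) with orthonormal world states, then the probability of self-location in world \(i\) is

\[ P(i) = |\alpha_i|^2 . \]
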

Quantum 2, 107 (2018).

https://doi.org/10.22331/q-2018-11-19-107

We show that spin systems with infinite-range interactions can violate at thermal equilibrium a multipartite Bell inequality, up to a finite critical temperature $T_c$. Our framework can be applied to a wide class of spin systems and Bell inequalities, to study whether nonlocality occurs naturally in quantum many-body systems close to the ground state. Moreover, we also show that the low-energy spectrum of the Bell operator associated to such systems can be well approximated by the one of a quantum harmonic oscillator, and that spin-squeezed states are optimal in displaying Bell correlations for such Bell inequalities.

Publication date: Available online 15 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Benjamin Feintzeig, James Owen Weatherall

Abstract

We provide a novel perspective on “regularity” as a property of representations of the Weyl algebra. In Part I, we critiqued a proposal by Halvorson [2004, “Complementarity of representations in quantum mechanics”, Studies in History and Philosophy of Modern Physics 35 (1), pp. 45–56], who advocates for the use of the non-regular “position” and “momentum” representations of the Weyl algebra. Halvorson argues that the existence of these non-regular representations demonstrates that a quantum mechanical particle can have definite values for position or momentum, contrary to a widespread view. In this sequel, we propose a justification for focusing on regular representations, pace Halvorson, by drawing on algebraic methods.

Publication date: Available online 16 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Benjamin Feintzeig, J.B. Le Manchak, Sarita Rosenstock, James Owen Weatherall

Abstract

We provide a novel perspective on “regularity” as a property of representations of the Weyl algebra. We first critique a proposal by Halvorson [2004, “Complementarity of representations in quantum mechanics”, Studies in History and Philosophy of Modern Physics 35 (1), pp. 45–56], who argues that the non-regular “position” and “momentum” representations of the Weyl algebra demonstrate that a quantum mechanical particle can have definite values for position or momentum, contrary to a widespread view. We show that there are obstacles to such an interpretation of non-regular representations. In Part II, we propose a justification for focusing on regular representations, pace Halvorson, by drawing on algebraic methods.

Publication date: Available online 13 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Michael E. Cuffaro

Abstract

The principle of ‘information causality’ can be used to derive an upper bound—known as the ‘Tsirelson bound’—on the strength of quantum mechanical correlations, and has been conjectured to be a foundational principle of nature. To date, however, it has not been sufficiently motivated to play such a foundational role. The motivations that have so far been given are, as I argue, either unsatisfactorily vague or appeal to little if anything more than intuition. Thus in this paper I consider whether some way might be found to successfully motivate the principle. And I propose that a compelling way of so doing is to understand it as a generalisation of Einstein's principle of the mutually independent existence—the ‘being-thus’—of spatially distant things. In particular I first describe an argument, due to Demopoulos, to the effect that the so-called ‘no-signalling’ condition can be viewed as a generalisation of Einstein's principle that is appropriate for an irreducibly statistical theory such as quantum mechanics. I then argue that a compelling way to motivate information causality is to in turn consider it as a further generalisation of the Einsteinian principle that is appropriate for a theory of communication. I describe, however, some important conceptual obstacles that must yet be overcome if the project of establishing information causality as a foundational principle of nature is to succeed.
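
For reference, the two principles mentioned here take their standard forms (as stated in the literature, e.g. by Pawłowski et al., not quoted from this paper): if Bob receives \(m\) classical bits, information causality bounds his total accessible information about Alice's data bits \(a_k\), where \(\beta\) denotes Bob's complete resources (the message plus his share of any pre-shared correlations),

\[ \sum_{k} I(a_k : \beta) \;\le\; m, \]

and this principle entails the Tsirelson bound \(|S_{\mathrm{CHSH}}| \le 2\sqrt{2}\) on quantum correlations.
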

Publication date: Available online 13 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Baptiste Le Bihan, Niels Linnemann

Abstract

Important features of space and time are taken to be missing in quantum gravity, allegedly requiring an explanation of the emergence of spacetime from non-spatio-temporal theories. In this paper, we argue that the explanatory gap between general relativity and non-spatio-temporal quantum gravity theories might significantly be reduced with two moves. First, we point out that spacetime is already partially missing in the context of general relativity when understood from a dynamical perspective. Second, we argue that most approaches to quantum gravity already start with an in-built distinction between structures to which the asymmetry between space and time can be traced back.

Author(s): Zhikuan Zhao, Robert Pisarczyk, Jayne Thompson, Mile Gu, Vlatko Vedral, and Joseph F. Fitzsimons

The traditional formalism of nonrelativistic quantum theory allows the state of a quantum system to extend across space, but restricts it to a single instant in time, leading to a distinction between the theoretical treatments of spatial and temporal quantum correlations. Here we unify the geometrica...


[Phys. Rev. A 98, 052312] Published Mon Nov 12, 2018

Author(s): Axel Schild

The local conservation of a physical quantity whose distribution changes with time is mathematically described by the continuity equation. The corresponding time parameter, however, is defined with respect to an idealized classical clock. We consider what happens when this classical time is replaced...


[Phys. Rev. A 98, 052113] Published Mon Nov 12, 2018

Quantum 2, 104 (2018).

https://doi.org/10.22331/q-2018-11-06-104

Using the existing classification of all alternatives to the measurement postulates of quantum theory, we study the properties of bi-partite systems in these alternative theories. We prove that in all these theories the purification principle is violated, meaning that some mixed states are not the reduction of a pure state in a larger system. This allows us to derive the measurement postulates of quantum theory from the structure of pure states and reversible dynamics, and the requirement that the purification principle holds. The violation of the purification principle implies that there is some irreducible classicality in these theories, which appears like an important clue for the problem of deriving the Born rule within the many-worlds interpretation. We also prove that in all such modifications the task of state tomography with local measurements is impossible, and present a simple toy theory displaying all these exotic non-quantum phenomena. This toy model shows that, contrary to previous claims, it is possible to modify the Born rule without violating the no-signalling principle. Finally, we argue that the quantum measurement postulates are the most non-classical amongst all alternatives.
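
In symbols, the purification principle requires every mixed state to be the marginal of a pure state on a larger system:

\[ \rho_A \;=\; \operatorname{Tr}_B\, |\psi\rangle\langle\psi|_{AB} \quad \text{for some pure } |\psi\rangle_{AB}. \]
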

Author(s): Pavel Sekatski, Jean-Daniel Bancal, Sebastian Wagner, and Nicolas Sangouard

Bell’s theorem has been proposed to certify, in a device-independent and robust way, blocks either producing or measuring quantum states. In this Letter, we provide a method based on Bell’s theorem to certify coherent operations for the storage, processing, and transfer of quantum information. This ...


[Phys. Rev. Lett. 121, 180505] Published Fri Nov 02, 2018

Abstract

Understanding the emergence of a tangible 4-dimensional space-time from a quantum theory of gravity promises to be a tremendously difficult task. This article makes the case that this task may not have to be carried out. Space-time as we know it may be fundamental to begin with. I recall the common arguments against this possibility and review a class of recently discovered models bypassing the most serious objection. The generic solution of the measurement problem that is tied to semiclassical gravity, as well as the difficulty of the alternative, makes it a reasonable default option in the absence of decisive experimental evidence.

Abstract

Complexified Liénard–Wiechert potentials simplify the mathematics of Kerr–Newman particles. Here we constrain them by fiat to move along Bohmian trajectories to see if anything interesting occurs, as their equations of motion are not known. A covariant theory due to Stueckelberg is used. This paper deviates from the traditional Bohmian interpretation of quantum mechanics since the electromagnetic interactions of Kerr–Newman particles are dictated by general relativity. A Gaussian wave function is used to produce the Bohmian trajectories, which are found to be multi-valued. A generalized analytic continuation is introduced which leads to an infinite number of trajectories. These include the entire set of Bohmian trajectories. This leads to multiple retarded times which come into play in complex space-time. If one weights these trajectories by their natural Bohmian weighting factors, then it is found that the particles do not radiate, that they are extended, and that they can have a finite electrostatic self energy, thus avoiding the usual divergence of the charged point particle. This effort does not in any way criticize or downplay the traditional Bohmian interpretation which does not assume the standard electromagnetic coupling to charged particles, but it suggests that a hybridization of Kerr–Newman particle theory with Bohmian mechanics might lead to interesting new physics, and maybe even the possibility of emergent quantum mechanics.

Abstract

The significance of the de Broglie/Bohm hidden-particle position in the relativistic regime is addressed, seeking connection to the (orthodox) single-particle Newton–Wigner position. The effect of non-positive excursions of the ensemble density for extreme cases of positive-energy waves is easily computed using an integral of the equations of motion developed here for free spin-0 particles in 1 + 1 dimensions and is interpreted in terms of virtual-like pair creation and annihilation beneath the Compton wavelength. A Bohm-theoretic description of the acausal explosion of a specific Newton–Wigner-localized state is presented in detail. The presence of virtual pairs found is interpreted as the Bohm picture of the spatial extension beyond single point particles proposed in the 1960s as to why space-like hyperplane dependence of the Newton–Wigner wavefunctions may be needed to achieve Lorentz covariance. For spin-1/2 particles the convective current is speculatively utilized for achieving parity with the spin-0 theory. The spin-0 improper quantum potential is generalized to an improper stress tensor for spin-1/2 particles.

Author(s): Dmitry V. Zhdanov, Denys I. Bondar, and Tamar Seideman

A quantum analog of friction (understood as a completely positive, Markovian, translation-invariant, phenomenological model of dissipation) is known to be at odds with detailed balance in the thermodynamic limit. We show that this is not the case for quantum systems with internal (e.g., spin) states...


[Phys. Rev. A 98, 042133] Published Mon Oct 29, 2018

Publication date: Available online 19 October 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Trevor Teitel

Abstract

Background independence begins life as an informal property that a physical theory might have, often glossed as ‘doesn't posit a fixed spacetime background’. Interest in trying to offer a precise account of background independence has been sparked by the pronouncements of several theorists working on quantum gravity that background independence embodies in some sense an essential discovery of the General Theory of Relativity, and a feature we should strive to carry forward to future physical theories. This paper has two goals. The first is to investigate what a world must be like in order to be truly described by a background independent theory given extant accounts of background independence. The second is to argue that there are no non-empirical reasons to be more confident in theories that satisfy extant accounts of background independence than in theories that don't. The paper concludes by drawing a general moral about a way in which focussing primarily on mathematical formulations of our physical theories can adversely affect debates in the metaphysics of physics.

Abstract

Parallel lives (PL) is an ontological model of nature in which quantum mechanics and special relativity are unified in a single universe with a single space-time. Point-like objects called lives are the only fundamental objects in this space-time, and they propagate at or below c, and interact with one another only locally at point-like events in space-time, very much like classical point particles. Lives are not alive in any sense, nor do they possess consciousness or any agency to make decisions—they are simply point objects which encode memory at events in space-time. The only causes and effects in the universe occur when lives meet locally, and thus the causal structure of interaction events in space-time is Lorentz invariant. Each life traces a continuous world-line through space-time, and experiences its own relative world, fully defined by the outcomes of past events along its world-line (never superpositions), which are encoded in its external memory. A quantum field comprises a continuum of lives throughout space-time, and familiar physical systems like particles each comprise a sub-continuum of the lives of the field. Each life carries a hidden internal memory containing a local relative wavefunction, which is a local piece of a pure universal wavefunction, but it is the relative wavefunctions in the local memories throughout space-time which are physically real in PL, and not the universal wavefunction in configuration space. Furthermore, while the universal wavefunction tracks the average behavior of the lives of a system, it fails to track their individual dynamics and trajectories. There is always a preferred separable basis, and for an irreducible physical system, each orthogonal term in this basis is a different relative world—each containing some fraction of the lives of the system. The relative wavefunctions in the lives’ internal memories govern which lives of different systems can meet during future local interactions, and thereby enforce entanglement correlations—including Bell inequality violations. These, and many other details, are explored here, but several aspects of this framework are not yet fleshed out, and work is ongoing.

Author(s): Luca Mancino, Vasco Cavina, Antonella De Pasquale, Marco Sbroscia, Robert I. Booth, Emanuele Roccia, Ilaria Gianani, Vittorio Giovannetti, and Marco Barbieri

Theoretical bounds on irreversible entropy production in a thermalizing quantum system are supported by experiments simulating the thermalization of a qubit using a quantum photonic architecture.


[Phys. Rev. Lett. 121, 160602] Published Wed Oct 17, 2018

Abstract

In-principle restrictions on the amount of information that can be gathered about a system have been proposed as a foundational principle in several recent reconstructions of the formalism of quantum mechanics. However, it seems unclear precisely why one should be thus restricted. We investigate the notion of paradoxical self-reference as a possible origin of such epistemic horizons by means of a fixed-point theorem in Cartesian closed categories due to Lawvere that illuminates and unifies the different perspectives on self-reference.
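
Lawvere's result, in its usual formulation (stated here for orientation): in a Cartesian closed category, if there is a point-surjective morphism \(\phi : A \to Y^A\), then every endomorphism of \(Y\) has a fixed point,

\[ \phi : A \to Y^A \ \text{point-surjective} \;\Longrightarrow\; \forall f : Y \to Y,\ \exists y \ \text{with} \ f(y) = y. \]

Cantor's theorem and related self-referential results arise as contrapositive instances of this statement.
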

On Formalisms and Interpretations

-

Quantum

on 2018-10-15 2:47pm GMT

Quantum 2, 99 (2018).

https://doi.org/10.22331/q-2018-10-15-99

One of the reasons for the heated debates around the interpretations of quantum theory is a simple confusion between the notions of formalism $\textit{versus}$ interpretation. In this note, we make a clear distinction between them and show that there are actually two $\textit{inequivalent}$ quantum formalisms, namely the relative-state formalism and the standard formalism with the Born and measurement-update rules. We further propose a different probability rule for the relative-state formalism and discuss how Wigner's-friend-type experiments could show the inequivalence with the standard formalism. The feasibility in principle of such experiments, however, remains an open question.
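
For contrast, the standard formalism's two additional rules have the textbook form (recalled here for orientation; this is not the alternative probability rule proposed in the paper): for a projective measurement \(\{P_i\}\) on a state \(\rho\),

\[ p(i) = \operatorname{Tr}(P_i\,\rho), \qquad \rho \;\longmapsto\; \frac{P_i\,\rho\,P_i}{\operatorname{Tr}(P_i\,\rho)}. \]
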

Author(s): Astrid Eichhorn and Aaron Held

The hypothesized asymptotic safe behavior of gravity may be used to retrodict top and bottom quark masses by tracking the effect of quantum gravity fluctuations on matter fields.


[Phys. Rev. Lett. 121, 151302] Published Fri Oct 12, 2018

Author(s): Eliahu Cohen and Eli Pollak

Weak values have been shown to be helpful especially when considering them as the outcomes of weak measurements. In this paper we show that, in principle, the real and imaginary parts of the weak value of any operator may be elucidated from expectation values of suitably defined density, flux, and H...


[Phys. Rev. A 98, 042112] Published Tue Oct 09, 2018

Author(s): J. Nobakht, M. Carlesso, S. Donadi, M. Paternostro, and A. Bassi

The continuous spontaneous localization (CSL) model strives to describe the quantum-to-classical transition from the viewpoint of collapse models. However, its original formulation suffers from a fundamental inconsistency in that it is explicitly energy nonconserving. Fortunately, a dissipative exte...


[Phys. Rev. A 98, 042109] Published Mon Oct 08, 2018

Author(s): Jakub Rembieliński and Jacek Ciborowski

We introduce a variant of quantum and classical electrodynamics formulated on the grounds of a hypothesis of existence of a preferred frame of reference—a formalism complementary to that regarding the structure of the space of photonic states, presented by us recently [Phys. Rev. A 97, 062106 (2018)...


[Phys. Rev. A 98, 042107] Published Thu Oct 04, 2018

Abstract

In physics, one is often misled into thinking that the mathematical model of a system is part of, or is, that system itself. Think of expressions commonly used in physics like “point” particle, motion “on the line”, “smooth” observables, wave function, and even “going to infinity”, without forgetting perplexing phrases like “classical world” versus “quantum world”.... On the other hand, when a mathematical model becomes truly inoperative with regard to correct predictions, one is forced to replace it with a new one. This is precisely what happened with the emergence of quantum physics. Classical models were (progressively) superseded by quantum ones through quantization prescriptions. These procedures often appear as ad hoc recipes. In the present paper, well-defined quantizations, based on integral calculus and Weyl–Heisenberg symmetry, are described in simple terms through one of the most basic examples of mechanics. Starting from (quasi-)probability distribution(s) on the Euclidean plane, viewed as the phase space for the motion of a point particle on the line, i.e., its classical model, we show how to build corresponding quantum model(s) and associated probability (e.g. Husimi) or quasi-probability (e.g. Wigner) distributions. We highlight the regularizing rôle of such procedures with the familiar example of the motion of a particle with a variable mass and submitted to a step potential.
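
The simplest instance of such an integral quantization is the coherent-state (Weyl–Heisenberg) map, recalled here for orientation (the paper treats more general variants): a phase-space function \(f(z)\) is sent to the operator

\[ A_f \;=\; \int_{\mathbb{C}} f(z)\, |z\rangle\langle z|\, \frac{d^2 z}{\pi}, \]

where the canonical coherent states \(|z\rangle\) resolve the identity, \(\int_{\mathbb{C}} |z\rangle\langle z|\, d^2z/\pi = I\); the Husimi and Wigner distributions are then the corresponding (quasi-)probability portraits of a quantum state.
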

Abstract

The Horizon Quantum Mechanics is an approach that allows one to analyse the gravitational radius of spherically symmetric systems and compute the probability that a given quantum state is a black hole. We first review the (global) formalism and show how it reproduces a gravitationally inspired GUP relation. This result leads to unacceptably large fluctuations in the horizon size of astrophysical black holes if one insists on describing them as (smeared) central singularities. On the other hand, if they are extended systems, like in the corpuscular models, no such issue arises, and one can in fact extend the formalism to include asymptotic mass and angular momentum with the harmonic model of rotating corpuscular black holes. The Horizon Quantum Mechanics then shows that, in simple configurations, the appearance of the inner horizon is suppressed and extremal (macroscopic) geometries seem disfavoured.

Abstract

It is shown that the nonlocal anomalous effective actions corresponding to the quantum breaking of the conformal symmetry can lead to observable modifications of Einstein’s equations. The fact that Einstein’s general relativity is in perfect agreement with all observations including cosmological or recently observed gravitational waves imposes strong restrictions on the field content of possible extensions of Einstein’s theory: all viable theories should have vanishing conformal anomalies. It is shown that a complete cancellation of conformal anomalies in \(D=4\) for both the \(C^2\) invariant and the Euler (Gauss–Bonnet) invariant can only be achieved for N-extended supergravity multiplets with \(N \ge 5\) .

Volume 4, Issue 4, pages 235-246

A. I. Arbab [Show Biography]

Arbab Ibrahim studied physics at Khartoum University and high energy physics at the International Centre for Theoretical Physics (ICTP), Italy. He has taught physics at Khartoum University and Qassim University, and he is currently a Professor of Physics. He has been a visiting scholar at the University of Illinois, Urbana-Champaign, Towson University, and Sultan Qaboos University. His work concentrates on the formulation of quantum mechanics and electromagnetism using quaternions. He has publications in a wide range of areas of theoretical physics. He is an active reviewer for many international journals.

By expressing the Schrödinger wavefunction in the form ψ=Re^iS, where R and S are real functions, we have shown that the expectation value of S is conserved. The amplitude of the wave (R) is found to satisfy the Schrödinger equation, while the phase (S) is related to energy conservation. Besides the quantum potential that depends on R, we have obtained a phase potential that depends on the derivative of the phase S. The phase force is a dissipative force. The quantum potential may be attributed to the interaction between the two subfields S and R comprising the quantum particle. This results in splitting (creation/annihilation) of these subfields, each having a mass mc² with an internal frequency of 2mc²/h, satisfying the original wave equation and endowing the particle its quantum nature. The mass of one subfield reflects the interaction with the other subfield. If, in the Bohmian ansatz, R satisfies the Klein-Gordon equation, then S must satisfy the wave equation. Conversely, if R satisfies the wave equation, then S yields the Einstein relativistic energy-momentum equation.
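
For comparison, the standard polar-decomposition (Madelung/Bohm) computation with \(\psi = Re^{iS}\) yields the familiar R-dependent quantum potential

\[ Q \;=\; -\frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R}, \]

to which the phase potential introduced in this paper is the S-dependent counterpart.
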

Full Text Download (210k)

Volume 4, Issue 4, pages 247-267

Sebastian Fortin [Show Biography] and Olimpia Lombardi [Show Biography]

Olimpia Lombardi obtained her degree in Electronic Engineering and in Philosophy at the University of Buenos Aires, and her PhD in Philosophy at the same university. She is Principal Researcher at the National Scientific and Technical Research Council of Argentina. She is a member of the Académie Internationale de Philosophie des Sciences and of the Foundational Questions Institute. She is the director of the Group of Philosophy of Science at the University of Buenos Aires. Areas of interest: foundations of statistical mechanics, the problem of the arrow of time, interpretation of quantum mechanics, the nature of information, philosophy of chemistry.

Sebastian Fortin has a degree and a PhD in Physics at the University of Buenos Aires and a PhD in Epistemology and History of Science at the National University of Tres de Febrero, Argentina. He is Researcher at the National Scientific and Technical Research Council of Argentina and assistant professor at the Physics Department of the Faculty of Exact and Natural Sciences at the University of Buenos Aires. His field of interest is philosophy of physics, particularly foundations of quantum mechanics.

If decoherence is an irreversible process, its physical meaning might be clarified by comparing quantum and classical irreversibility. In this work we carry out this comparison, from which a unified view of the emergence of irreversibility arises, applicable both to the classical and to the quantum case. According to this unified view, in the two cases the irreversible macro-level arises from the reversible micro-level as a coarse description that can be understood in terms of the concept of projection. This position supplies an understanding of the phenomenon of decoherence different from that implicit in most presentations: the reduced state is not the quantum state of the open system, but a coarse state of the closed composite system; as a consequence, decoherence should be understood not as a phenomenon resulting from the interaction between an open system and its environment, but rather as a coarse evolution that emerges from disregarding certain degrees of freedom of the whole closed system.

Full Text Download (923k)

Volume 4, Issue 4, pages 223-234

Mohammed Sanduk [Show Biography]

Mohammed Sanduk is an Iraqi-born British physicist. He was educated at the University of Baghdad and the University of Manchester. Before his undergraduate studies, he published a book in particle physics entitled “Mesons”. Sanduk has worked in industry and academia, and his last post in Iraq was head of the Laser and Opto-electronics Engineering department at Nahrain University in Baghdad. Owing to his interest in the philosophy of science, he was a member of the academic staff of Pontifical Babel College for Philosophy. Sanduk is working with the department of chemical and process engineering at the University of Surrey. Sanduk is interested in the transport of charged particles, magnetohydrodynamics, and renewable energy technology. In addition, Sanduk is interested in the foundations of quantum mechanics and the philosophy of science & technology.

In a previous article, an approach was developed to form an analogy of the wave function and derive analogues of the mathematical forms of both the Dirac and Klein-Gordon equations. The analogies obtained were transformations from the classical real model forms to forms in complex space. The analogue of the Klein-Gordon equation was derived from the analogous Dirac equation, as in the case of quantum mechanics. In the present work, the forms of the Dirac and Klein-Gordon equations were derived as a direct transformation from the classical model. It was found that the Dirac equation form may be related to a complex velocity equation. Dirac's Hamiltonian and coefficients correspond to each other in these analogies. The Klein-Gordon equation form may be related to the complex acceleration equation. The complex acceleration equation can explain the generation of flat spacetime. Although this approach is classical, it may show a possibility of unifying relativistic quantum mechanics and special relativity in a single model, and it may throw light on the undetectable æther.

Full Text Download (576k)

Author(s): Ezad Shojaee, Christopher S. Jackson, Carlos A. Riofrío, Amir Kalev, and Ivan H. Deutsch

The spin-coherent-state positive-operator-valued-measure (POVM) is a fundamental measurement in quantum science, with applications including tomography, metrology, teleportation, benchmarking, and measurement of Husimi phase space probabilities. We prove that this POVM is achieved by collectively me...


[Phys. Rev. Lett. 121, 130404] Published Wed Sep 26, 2018

Author(s): Ding Jia (贾丁)

There has been a body of work deriving the complex Hilbert-space structure of quantum theory from axioms/principles/postulates to deepen our understanding of quantum theory and to reveal ways to go beyond it to resolve foundational issues. Recent progress in incorporating indefinite causal structure...


[Phys. Rev. A 98, 032112] Published Wed Sep 19, 2018

Publication date: Available online 24 August 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): R. Hermens

Abstract

Three recent arguments seek to show that the universal applicability of unitary quantum theory is inconsistent with the assumption that a well-conducted measurement always has a definite physical outcome. In this paper I restate and analyze these arguments. The import of the first two is diminished by their dependence on assumptions about the outcomes of counterfactual measurements. But the third argument establishes its intended conclusion. Even if every well-conducted quantum measurement we ever make will have a definite physical outcome, this argument should make us reconsider the objectivity of that outcome.

Abstract

We review the argument that latent image formation is a measurement in which the state vector collapses, requiring an enhanced noise parameter in objective reduction models. Tentative observation of a residual noise at this level, plus several experimental bounds, imply that the noise must be colored (i.e., non-white), and hence frame dependent and non-relativistic. Thus a relativistic objective reduction model, even if achievable in principle, would be incompatible with experiment; the best one can do is the non-relativistic CSL model. This negative conclusion has a positive aspect, in that the non-relativistic CSL reduction model evades the argument leading to the Conway–Kochen “Free Will Theorem”.

Author(s): Tatsuma Nishioka

In this review, the entanglement and Renyi entropies in quantum field theory are described from different points of view, including the perturbative approach and holographic dualities. The applications of these results to constraining renormalization group flows are presented and illustrated with a variety of examples.


[Rev. Mod. Phys. 90, 035007] Published Mon Sep 17, 2018

Author(s): Ricardo Ximenes, Fernando Parisio, and Eduardo O. Dias

The question of how long a particle takes to pass through a potential barrier is still a controversial topic in quantum mechanics. One of the main theoretical problems in obtaining estimates for measurable times is the fact that several previously defined time operators, which remained within the bo...


[Phys. Rev. A 98, 032105] Published Mon Sep 10, 2018

Abstract

For a simple set of observables we can express the Heisenberg uncertainty relations in terms of transition probabilities alone, proving them to be not only necessary but also sufficient for the given observables to admit a quantum model. Furthermore, distinguished characterizations of strictly complex and real quantum models, with some ancillary results, are presented and discussed.

Author(s): Nora Tischler, Farzad Ghafari, Travis J. Baker, Sergei Slussarenko, Raj B. Patel, Morgan M. Weston, Sabine Wollmann, Lynden K. Shalm, Varun B. Verma, Sae Woo Nam, H. Chau Nguyen, Howard M. Wiseman, and Geoff J. Pryde

A new photon source is used to realize one-way Einstein-Podolsky-Rosen steering free from restrictions on the type of allowed measurements and on assumptions about the quantum state.


[Phys. Rev. Lett. 121, 100401] Published Fri Sep 07, 2018

Quantum 2, 92 (2018).

https://doi.org/10.22331/q-2018-09-03-92

Bell-inequality violations establish that two systems share some quantum entanglement. We give a simple test to certify that two systems share an asymptotically large amount of entanglement, $n$ EPR states. The test is efficient: unlike earlier tests that play many games, in sequence or in parallel, our test requires only one or two CHSH games. One system is directed to play a CHSH game on a random specified qubit $i$, and the other is told to play games on qubits $\{i,j\}$, without knowing which index is $i$. The test is robust: a success probability within $\delta$ of optimal guarantees distance $O(n^{5/2} \sqrt{\delta})$ from $n$ EPR states. However, the test does not tolerate constant $\delta$; it breaks down for $\delta = \tilde\Omega (1/\sqrt{n})$. We give an adversarial strategy that succeeds within $\delta$ of the optimum probability using only $\tilde O(\delta^{-2})$ EPR states.
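
For context, the single-game optimum referred to here is the Tsirelson value: one EPR pair wins a CHSH game with probability \(\cos^2(\pi/8) \approx 0.854\). A short numerical check (illustrative only; the angles are the standard optimal CHSH settings, not part of the paper's certification protocol):

    import numpy as np

    # Pauli operators and the EPR state |Phi+> = (|00> + |11>)/sqrt(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])
    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

    def spin(theta):
        """Spin observable at angle theta in the X-Z plane."""
        return np.cos(theta) * Z + np.sin(theta) * X

    # Standard optimal CHSH measurement settings
    A = [spin(0.0), spin(np.pi / 2)]
    B = [spin(np.pi / 4), spin(-np.pi / 4)]

    # CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1)
    S = sum(sign * phi @ np.kron(A[x], B[y]) @ phi
            for (x, y, sign) in [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, -1)])
    print("CHSH value:", S)                 # 2*sqrt(2), the Tsirelson bound
    print("win probability:", 0.5 + S / 8)  # cos^2(pi/8), about 0.854
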

Author(s): K. Goswami, C. Giarmatzi, M. Kewming, F. Costa, C. Branciard, J. Romero, and A. G. White

A photonic quantum switch between a pair of operations is constructed such that the causal order of operations cannot be distinguished, even in principle.


[Phys. Rev. Lett. 121, 090503] Published Fri Aug 31, 2018

Quantum 2, 87 (2018).

https://doi.org/10.22331/q-2018-08-27-87

Ernst Specker considered a particular feature of quantum theory to be especially fundamental, namely that pairwise joint measurability of sharp measurements implies their global joint measurability ($\href{https://vimeo.com/52923835}{vimeo.com/52923835}$). To date, Specker's principle has seemed incapable of singling out quantum theory from the space of all general probabilistic theories. In particular, its well-known consequence for experimental statistics, the principle of consistent exclusivity, does not rule out the set of correlations known as almost quantum, which is strictly larger than the set of quantum correlations. Here we show that, contrary to popular belief, Specker's principle cannot be satisfied in any theory that yields almost quantum correlations.