Latest Papers on Quantum Foundations - Updated Daily by IJQF

Publication date: Available online 7 December 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Jonathan Bain

Abstract

Intrinsic topologically ordered (ITO) condensed matter systems are claimed to exhibit two types of non-locality. The first is associated with topological properties and the second is associated with a particular type of quantum entanglement. These characteristics are supposed to allow ITO systems to encode information in the form of quantum entangled states in a topologically non-local way that protects it against local errors. This essay first clarifies the sense in which these two notions of non-locality are distinct, and then considers the extent to which they are exhibited by ITO systems. I will argue that while the claim that ITO systems exhibit topological non-locality is unproblematic, the claim that they also exhibit quantum entanglement non-locality is less clear, and this is due in part to ambiguities associated with the notion of quantum entanglement. Moreover, any argument that claims some form of "long-range" entanglement is necessary to explain topological properties is incomplete if it fails to provide a convincing reason why mechanistic explanations should be favored over structural explanations of topological phenomena.

Publication date: Available online 30 November 2018

Source: Physics Letters A

Author(s): Atul Singh Arora, Kishor Bharti, Arvind

Abstract

We construct a non-contextual hidden variable model consistent with all the kinematic predictions of quantum mechanics (QM). The famous Bell–KS theorem shows that non-contextual models which satisfy a further reasonable restriction are inconsistent with QM. In our construction, we define a weaker variant of this restriction which captures its essence while still allowing a non-contextual description of QM. This is in contrast to the contextual hidden variable toy models, such as the one by Bell, and brings out an interesting alternate way of looking at QM. The results also relate to the Bohmian model, where it is harder to pin down such features.

Leonard Susskind, a pioneer of string theory, the holographic principle and other big physics ideas spanning the past half-century, has proposed a solution to an important puzzle about black holes. The problem is that even though these mysterious, invisible spheres appear to stay a constant size as viewed from the outside, their interiors keep growing in volume essentially forever. How is this possible?

In a series of recent papers and talks, the 78-year-old Stanford University professor and his collaborators conjecture that black holes grow in volume because they are steadily increasing in complexity — an idea that, while unproven, is fueling new thinking about the quantum nature of gravity inside black holes.

Black holes are spherical regions of such extreme gravity that not even light can escape. First discovered a century ago as shocking solutions to the equations of Albert Einstein’s general theory of relativity, they’ve since been detected throughout the universe. (They typically form from the inward gravitational collapse of dead stars.) Einstein’s theory equates the force of gravity with curves in space-time, the four-dimensional fabric of the universe, but gravity becomes so strong in black holes that the space-time fabric bends toward its breaking point — the infinitely dense “singularity” at the black hole’s center.

According to general relativity, the inward gravitational collapse never stops. Even though, from the outside, the black hole appears to stay a constant size, expanding slightly only when new things fall into it, its interior volume grows bigger and bigger all the time as space stretches toward the center point. For a simplified picture of this eternal growth, imagine a black hole as a funnel extending downward from a two-dimensional sheet representing the fabric of space-time. The funnel gets deeper and deeper, so that infalling things never quite reach the mysterious singularity at the bottom. In reality, a black hole is a funnel that stretches inward from all three spatial directions. A spherical boundary called the “event horizon” surrounds it, marking the point of no return.

Since at least the 1970s, physicists have recognized that black holes must really be quantum systems of some kind — just like everything else in the universe. What Einstein’s theory describes as warped space-time in the interior is presumably really a collective state of vast numbers of gravity particles called “gravitons,” described by the true quantum theory of gravity. In that case, all the known properties of a black hole should trace to properties of this quantum system.

Indeed, in 1972, the Israeli physicist Jacob Bekenstein figured out that the area of the spherical event horizon of a black hole corresponds to its “entropy.” This is the number of different possible microscopic arrangements of all the particles inside the black hole, or, as modern theorists would describe it, the black hole’s storage capacity for information.
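
For reference, Bekenstein's relation, as later sharpened by Hawking, expresses this entropy directly in terms of the horizon area \(A\) (the formula is standard, though not spelled out in the article):

\[ S_{BH} \;=\; \frac{k_B c^3 A}{4 G \hbar}, \]

so a black hole's information capacity scales with its surface area rather than its volume.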

Bekenstein’s insight led Stephen Hawking to realize two years later that black holes have temperatures, and that they therefore radiate heat. This radiation causes black holes to slowly evaporate away, giving rise to the much-discussed “black hole information paradox,” which asks what happens to information that falls into black holes. Quantum mechanics says the universe preserves all information about the past. But how does information about infalling stuff, which seems to slide forever toward the central singularity, also evaporate out?
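
The temperature Hawking derived is, in its standard form (again supplied here for orientation, not taken from the article),

\[ T_H \;=\; \frac{\hbar c^3}{8\pi G M k_B}, \]

which means smaller black holes are hotter, and a black hole shrinks ever faster as it radiates.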

The relationship between a black hole’s surface area and its information content has kept quantum gravity researchers busy for decades. But one might also ask: What does the growing volume of its interior correspond to, in quantum terms? “For whatever reason, nobody, including myself for a number of years, really thought very much about what that means,” said Susskind. “What is the thing which is growing? That should have been one of the leading puzzles of black hole physics.”

In recent years, with the rise of quantum computing, physicists have been gaining new insights about physical systems like black holes by studying their information-processing abilities — as if they were quantum computers. This angle led Susskind and his collaborators to identify a candidate for the evolving quantum property of black holes that underlies their growing volume. What’s changing, the theorists say, is the “complexity” of the black hole — roughly a measure of the number of computations that would be needed to recover the black hole’s initial quantum state, at the moment it formed. After its formation, as particles inside the black hole interact with one another, the information about their initial state becomes ever more scrambled. Consequently, their complexity continuously grows.

Using toy models that represent black holes as holograms, Susskind and his collaborators have shown that the complexity and volume of black holes both grow at the same rate, supporting the idea that the one might underlie the other. And, whereas Bekenstein calculated that black holes store the maximum possible amount of information given their surface area, Susskind’s findings suggest that they also grow in complexity at the fastest possible rate allowed by physical laws.
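
The "fastest possible rate" plausibly refers to the Lloyd-type bound used in the complexity-equals-volume and complexity-equals-action literature (the normalization below is the one conjectured by Brown, Roberts, Susskind, Swingle and Zhao, an assumption since the article does not state it): for a black hole of mass \(M\),

\[ \frac{dC}{dt} \;\le\; \frac{2E}{\pi\hbar} \;=\; \frac{2Mc^2}{\pi\hbar}. \]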

John Preskill, a theoretical physicist at the California Institute of Technology who also studies black holes using quantum information theory, finds Susskind’s idea very interesting. “That’s really cool that this notion of computational complexity, which is very much something that a computer scientist might think of and is not part of the usual physicist’s bag of tricks,” Preskill said, “could correspond to something which is very natural for someone who knows general relativity to think about,” namely the growth of black hole interiors.

Researchers are still puzzling over the implications of Susskind’s thesis. Aron Wall, a theorist at Stanford (soon moving to the University of Cambridge), said, “The proposal, while exciting, is still rather speculative and may not be correct.” One challenge is defining complexity in the context of black holes, Wall said, in order to clarify how the complexity of quantum interactions might give rise to spatial volume.

A potential lesson, according to Douglas Stanford, a black hole specialist at the Institute for Advanced Study in Princeton, New Jersey, “is that black holes have a type of internal clock that keeps time for a very long time. For an ordinary quantum system,” he said, “this is the complexity of the state. For a black hole, it is the size of the region behind the horizon.”

If complexity does underlie spatial volume in black holes, Susskind envisions consequences for our understanding of cosmology in general. “It’s not only black hole interiors that grow with time. The space of cosmology grows with time,” he said. “I think it’s a very, very interesting question whether the cosmological growth of space is connected to the growth of some kind of complexity. And whether the cosmic clock, the evolution of the universe, is connected with the evolution of complexity. There, I don’t know the answer.”

Weatherall, James Owen (2018) Why Not Categorical Equivalence? [Preprint]

That quantum mechanics is a successful theory is not in dispute. It makes astonishingly accurate predictions about the nature of the world at microscopic scales. What has been in dispute for nearly a century is just what it’s telling us about what exists, what is real. There are myriad interpretations that offer their own take on the question, each requiring us to buy into certain as-yet-unverified claims — hence assumptions — about the nature of reality.

Now, a new thought experiment is confronting these assumptions head-on and shaking the foundations of quantum physics. The experiment is decidedly strange. For example, it requires making measurements that can erase any memory of an event that was just observed. While this isn’t possible with humans, quantum computers could be used to carry out this weird experiment and potentially discriminate between the different interpretations of quantum physics.

“Every now and then you get a paper which gets everybody thinking and discussing, and this is one of those cases,” said Matthew Leifer, a quantum physicist at Chapman University in Orange, California. “[It] is a thought experiment which is going to be added to the canon of weird things we think about in quantum foundations.”

The experiment, designed by Daniela Frauchiger and Renato Renner, of the Swiss Federal Institute of Technology Zurich, involves a set of assumptions that on the face of it seem entirely reasonable. But the experiment leads to contradictions, suggesting that at least one of the assumptions is wrong. The choice of which assumption to give up has implications for our understanding of the quantum world and points to the possibility that quantum mechanics is not a universal theory, and so cannot be applied to complex systems such as humans.

Quantum physicists are notoriously divided when it comes to the correct interpretation of the equations that are used to describe quantum goings-on. But in the new thought experiment, no view of the quantum world comes through unscathed. Each one falls afoul of one or another assumption. Could something entirely new await us in our search for an uncontroversial description of reality?

Quantum theory works extremely well at the scale of photons, electrons, atoms, molecules, even macromolecules. But is it applicable to systems that are much, much larger than macromolecules? “We have not experimentally established the fact that quantum mechanics applies on larger scales, and larger means even something the size of a virus or a little cell,” Renner said. “In particular, we don’t know whether it extends to objects the size of humans, and even less whether it extends to objects the size of black holes.”

Despite this lack of empirical evidence, physicists think that quantum mechanics can be used to describe systems at all scales — meaning it’s universal. To test this assertion, Frauchiger and Renner came up with their thought experiment, which is an extension of something the physicist Eugene Wigner first dreamed up in the 1960s. The new experiment shows that, in a quantum world, two people can end up disagreeing about a seemingly irrefutable result, such as the outcome of a coin toss, suggesting something is amiss with the assumptions we make about quantum reality.

In standard quantum mechanics, a quantum system such as a subatomic particle is represented by a mathematical abstraction called the wave function. Physicists calculate how the particle’s wave function evolves with time.

But the wave function does not give us the exact value for any of the particle’s properties, such as its position. If we want to know where the particle is, the wave function’s value at any point in space and time only lets us calculate the probability of finding the particle at that point, should we choose to look. Before we look, the wave function is spread out, and it accords different probabilities for the particle being in different places. The particle is said to be in a quantum superposition of being in many places at once.
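
In symbols, this is the Born rule, implicit in the paragraph above: if \(\psi(x,t)\) is the particle's wave function, the probability density for finding the particle at position \(x\) at time \(t\) is

\[ P(x,t) \;=\; |\psi(x,t)|^2 . \]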

More generally, a quantum system can be in a superposition of states, where “state” can refer to other properties, such as the spin of a particle. Much of the Frauchiger-Renner thought experiment involves manipulating complex quantum objects — maybe even humans — that end up in superpositions of states.

The experiment has four agents: Alice, Alice’s friend, Bob, and Bob’s friend. Alice’s friend is inside a lab making measurements on a quantum system, and Alice is outside, monitoring both the lab and her friend. Bob’s friend is similarly inside another lab, and Bob is observing his friend and the lab, treating them both as one system.

Inside the first lab, Alice’s friend makes a measurement on what is effectively a coin toss designed to come up heads one-third of the time and tails two-thirds of the time. If the toss comes up heads, Alice’s friend prepares a particle with spin pointing down, but if the toss comes up tails, she prepares the particle in a superposition of equal parts spin UP and spin DOWN.

Alice’s friend sends the particle to Bob’s friend, who measures the spin of the particle. Based on the result, Bob’s friend can now make an assertion about what Alice’s friend saw in her coin toss. If he finds the particle spin to be UP, for example, he knows the coin came up tails.
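
This inference can be made concrete in a minimal two-level model (the notation here is an illustration, not Frauchiger and Renner's own). Writing \(|h\rangle, |t\rangle\) for the friend's record of heads or tails, the joint state of record and particle after the preparation is

\[ |\Psi\rangle \;=\; \sqrt{\tfrac{1}{3}}\,|h\rangle|{\downarrow}\rangle \;+\; \sqrt{\tfrac{2}{3}}\,|t\rangle\,\frac{|{\uparrow}\rangle + |{\downarrow}\rangle}{\sqrt{2}} \;=\; \frac{1}{\sqrt{3}}\Big(|h\rangle|{\downarrow}\rangle + |t\rangle|{\uparrow}\rangle + |t\rangle|{\downarrow}\rangle\Big). \]

Since \(|{\uparrow}\rangle\) appears only alongside \(|t\rangle\), a spin-UP outcome for Bob's friend implies tails.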

The experiment continues. Alice measures the state of her friend and her lab, treating all of it as one quantum system, and uses quantum theory to make predictions. Bob does the same with his friend and lab. Here comes the first assumption: An agent can analyze another system, even a complex one including other agents, using quantum mechanics. In other words, quantum theory is universal, and everything in the universe, including entire laboratories (and the scientists inside them), follows the rules of quantum mechanics.

This assumption allows Alice to treat her friend and the lab as one system and make a special type of measurement, which puts the entire lab, including its contents, into a superposition of states. This is not a simple measurement, and herein lies the thought experiment’s weirdness.

The process is best understood by considering a single photon that’s in a superposition of being polarized horizontally and vertically. Say you measure the polarization and find it to be vertically polarized. Now, if you keep checking to see if the photon is vertically polarized, you will always find that it is. But if you measure the vertically polarized photon to see if it is polarized in a different direction, say at a 45-degree angle to the vertical, you’ll find that there’s a 50 percent chance that it is, and a 50 percent chance that it isn’t. Now if you go back to measure what you thought was a vertically polarized photon, you’ll find there’s a chance that it’s no longer vertically polarized at all — rather, it’s become horizontally polarized. The 45-degree measurement has put the photon back into a superposition of being polarized horizontally and vertically.
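
In standard notation (supplied here for concreteness), with \(|H\rangle, |V\rangle\) the horizontal and vertical polarization states and \(|D\rangle = (|H\rangle + |V\rangle)/\sqrt{2}\), \(|A\rangle = (|H\rangle - |V\rangle)/\sqrt{2}\) the two 45-degree states, one has

\[ |V\rangle \;=\; \frac{|D\rangle - |A\rangle}{\sqrt{2}}, \qquad |\langle D|V\rangle|^2 = \tfrac{1}{2}, \qquad |\langle H|D\rangle|^2 = \tfrac{1}{2}, \]

so a 45-degree measurement on \(|V\rangle\) yields either diagonal outcome with probability 1/2, and a subsequent \(H/V\) check on the resulting \(|D\rangle\) finds horizontal polarization half the time.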

This is all very fine for a single particle, and such measurements have been amply verified in actual experiments. But in the thought experiment, Frauchiger and Renner want to do something similar with complex systems.

At this stage in the experiment, Alice’s friend has already seen the coin coming up either heads or tails. But Alice’s complex measurement puts the lab, friend included, into a superposition of having seen heads and tails. Given this weird state, it’s just as well that the experiment does not demand anything further of Alice’s friend.

Alice, however, is not done. Based on her complex measurement, which can come out as either YES or NO, she can infer the result of the measurement made by Bob’s friend. Say Alice got YES for an answer. She can deduce using quantum mechanics that Bob’s friend must have found the particle’s spin to be UP, and therefore that Alice’s friend got tails in her coin toss.

This assertion by Alice necessitates another assumption about her use of quantum theory. Not only does she reason about what she knows, but she reasons about how Bob’s friend used quantum theory to arrive at his conclusion about the result of the coin toss. Alice makes that conclusion her own. This assumption of consistency argues that the predictions made by different agents using quantum theory are not contradictory.

Meanwhile, Bob can make a similarly complex measurement on his friend and his lab, placing them in a quantum superposition. The answer can again be YES or NO. If Bob gets YES, the measurement is designed to let him conclude that Alice’s friend must have seen heads in her coin toss.

It’s clear that Alice and Bob can make measurements and compare their assertions about the result of the coin toss. But this involves another assumption: If an agent’s measurement says that the coin toss came up heads, then the opposite fact — that the coin toss came up tails — cannot be simultaneously true.

The setup is now ripe for a contradiction. When Alice gets a YES for her measurement, she infers that the coin toss came up tails, and when Bob gets a YES for his measurement, he infers the coin toss came up heads. Most of the time, Alice and Bob will get opposite answers. But Frauchiger and Renner showed that in 1/12 of the cases both Alice and Bob will get a YES in the same run of the experiment, causing them to disagree about whether Alice’s friend got heads or tails. “So, both of them are talking about the past event, and they are both sure what it was, but their statements are exactly opposite,” Renner said. “And that’s the contradiction. That shows something must be wrong.”
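
The 1/12 statistic can be reproduced in a stripped-down two-qubit model (a minimal sketch, not the full four-agent protocol; the measurement bases below are assumptions chosen to match the article's description):

```python
import numpy as np

# Basis vectors: the friend's record {heads, tails} and the particle spin {down, up}.
h, t = np.array([1.0, 0.0]), np.array([0.0, 1.0])
dn, up = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Joint state after the preparation: heads (prob 1/3) -> spin down;
# tails (prob 2/3) -> equal superposition of up and down.
psi = (np.kron(h, dn) + np.kron(t, up) + np.kron(t, dn)) / np.sqrt(3)

# "YES" outcomes for Alice's and Bob's complex measurements,
# modeled as projections onto rotated basis states.
yes_alice = (h - t) / np.sqrt(2)
yes_bob = (dn - up) / np.sqrt(2)

amplitude = np.kron(yes_alice, yes_bob) @ psi
print(abs(amplitude) ** 2)  # 0.0833... = 1/12
```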

This led Frauchiger and Renner to claim that one of the three assumptions that underpin the thought experiment must be incorrect.

“The science stops there. We just know one of the three is wrong, and we cannot really give a good argument which one is violated,” Renner said. “This is now a matter of interpretation and taste.”

Fortunately, there is a wealth of interpretations of quantum mechanics, and almost all of them have to do with what happens to the wave function upon measurement. Take a particle’s position. Before measurement, we can only talk in terms of the probabilities of, say, finding the particle somewhere. Upon measurement, the particle assumes a definite location. In the Copenhagen interpretation, measurement causes the wave function to collapse, and we cannot talk of properties, such as a particle’s position, before collapse. Some physicists view the Copenhagen interpretation as an argument that properties are not real until measured.

This form of “anti-realism” was anathema to Einstein, as it is to some quantum physicists today. And so is the notion of a measurement causing the collapse of the wave function, particularly because the Copenhagen interpretation is unclear about exactly what constitutes a measurement. Alternative interpretations or theories mainly try to either advance a realist view — that quantum systems have properties independent of observers and measurements — or avoid a measurement-induced collapse, or both.

For example, the many-worlds interpretation takes the evolution of the wave function at face value and denies that it ever collapses. If a quantum coin toss can be either heads or tails, then in the many-worlds scenario, both outcomes happen, each in a different world. Given this, the assumption that there is only one outcome for a measurement, and that if the coin toss is heads, it cannot simultaneously be tails, becomes untenable. In many-worlds, the result of the coin toss is both heads and tails, and thus the fact that Alice and Bob can sometimes get opposite answers is not a contradiction.

“I have to admit that if you had asked me two years ago, I’d have said [this] just shows that many-worlds is actually a good interpretation and you should give up” the requirement that measurements have only a single outcome, Renner said.

This is also the view of the theoretical physicist David Deutsch of the University of Oxford, who became aware of the Frauchiger-Renner paper when it first appeared on arxiv.org. In that version of the paper, the authors favored the many-worlds scenario. (The latest version of the paper, which was peer reviewed and published in Nature Communications in September, takes a more agnostic stance.) Deutsch thinks the thought experiment will continue to support many-worlds. “My take is likely to be that it kills wave-function-collapse or single-universe versions of quantum theory, but they were already stone dead,” he said. “I’m not sure what purpose it serves to attack them again with bigger weapons.”

Renner, however, has changed his mind. He thinks the assumption most likely to be invalid is the idea that quantum mechanics is universally applicable.

This assumption is violated, for example, by so-called spontaneous collapse theories that argue — as the name suggests — for a spontaneous and random collapse of the wave function, but one that is independent of measurement. These models ensure that small quantum systems, such as particles, can remain in a superposition of states almost forever, but as systems get more massive, it gets more and more likely that they will spontaneously collapse to a classical state. Measurements merely discover the state of the collapsed system.

In spontaneous collapse theories, quantum mechanics can no longer be applied to systems larger than some threshold mass. And while these models have yet to be empirically verified, they haven’t been ruled out either.

Nicolas Gisin of the University of Geneva favors spontaneous collapse theories as a way to resolve the contradiction in the Frauchiger-Renner experiment. “My way out of their conundrum is clearly by saying, ‘No, at some point the superposition principle no longer holds,’” he said.

If you want to hold on to the assumption that quantum theory is universally applicable, and that measurements have only a single outcome, then you’ve got to let go of the remaining assumption, that of consistency: The predictions made by different agents using quantum theory will not be contradictory.

Using a slightly altered version of the Frauchiger-Renner experiment, Leifer has shown that this final assumption, or a variant thereof, must go if Copenhagen-style theories hold true. In Leifer’s analysis, these theories share certain attributes, in that they are universally applicable, anti-realistic (meaning that quantum systems don’t have well-defined properties, such as position, before measurement) and complete (meaning that there is no hidden reality that the theory is failing to capture). Given these attributes, his work implies that there is no single outcome of a given measurement that’s objectively true for all observers. So if a detector clicked for Alice’s friend inside the lab, then it’s an objective fact for her, but not so for Alice, who is outside the lab modeling the entire lab using quantum theory. The results of measurements depend on the perspective of the observer.

“If you want to maintain the Copenhagen type of view, it seems the best move is towards this perspectival version,” Leifer said. He points out that certain interpretations, such as quantum Bayesianism, or QBism, have already adopted the stance that measurement outcomes are subjective to an observer.

Renner thinks that giving up this assumption entirely would destroy a theory’s ability to be effective as a means for agents to know about each other’s state of knowledge; such a theory could be dismissed as solipsistic. So any theory that moves toward facts being subjective has to re-establish some means of communicating knowledge that satisfies two opposing constraints. First, it has to be weak enough that it doesn’t provoke the paradox seen in the Frauchiger-Renner experiment. Yet it must also be strong enough to avoid charges of solipsism. No one has yet formulated such a theory to everyone’s satisfaction.

The Frauchiger-Renner experiment generates contradictions among a set of three seemingly sensible assumptions. The effort to explicate how various interpretations of quantum theory violate the assumptions has been “an extremely useful exercise,” said Rob Spekkens of the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

“This thought experiment is a great lens through which to examine the differences of opinions between different camps on the interpretation of quantum theory,” Spekkens said. “I don’t think it’s really eliminated options that people were endorsing prior to the work, but it has clarified precisely what the different interpretational camps need to believe to avoid this contradiction. It has served to clarify people’s position on some of these issues.”

Given that theoreticians cannot tell the interpretations apart, experimentalists are thinking about how to implement the thought experiment, in the hope of further illuminating the problem. But it will be a formidable task, because the experiment makes some weird demands. For example, when Alice makes a special measurement on her friend and her lab, it puts everything, the friend’s brain included, into a superposition of states.

Mathematically, this complicated measurement is the same as first reversing the time evolution of the system — such that the memory of the agent is erased and the quantum system (such as the particle the agent has measured) is brought back to its original state — and then performing a simpler measurement on just the particle, said Howard Wiseman of Griffith University in Brisbane, Australia. The measurement may be simple, but as Gisin points out rather diplomatically, “Reversing an agent, including the brain and the memory of that agent, is the delicate part.”

Nonetheless, Gisin is not averse to thinking that maybe, one day, the experiment could be done using complex quantum computers as the agents inside the labs (acting as Alice’s friend and Bob’s friend). In principle, the time evolution of a quantum computer can be reversed. One possibility is that such an experiment will replicate the predictions of standard quantum mechanics even as quantum computers get more and more complex. But it may not. “Another alternative is that at some point while we develop these quantum computers, we hit the boundary of the superposition principle and that actually quantum mechanics is not universal,” Gisin said.

Leifer, for his part, is holding out for something new. “I think the correct interpretation of quantum mechanics is none of the above,” he said.

He likens the current situation with quantum mechanics to the time before Einstein came up with his special theory of relativity. Experimentalists had found no sign of the “luminiferous ether” — the medium through which light waves were thought to propagate in a Newtonian universe. Einstein argued that there is no ether. Instead he showed that space and time are malleable. “Pre-Einstein I couldn’t have told you that it was the structure of space and time that was going to change,” Leifer said.

Quantum mechanics is in a similar situation now, he thinks. “It’s likely that we are making some implicit assumption about the way the world has to be that just isn’t true,” he said. “Once we change that, once we modify that assumption, everything would suddenly fall into place. That’s kind of the hope. Anybody who is skeptical of all interpretations of quantum mechanics must be thinking something like this. Can I tell you what’s a plausible candidate for such an assumption? Well, if I could, I would just be working on that theory.”

Jaksland, Rasmus (2018) Probing spacetime with a holographic relation between spacetime and entanglement. [Preprint]

Authors: E. Zambrini Cruzeiro, N. Gisin

We give the complete list of 175 facet Bell inequalities for the case where Alice and Bob each choose their measurements from a set of four binary outcome measurements. For each inequality we compute the maximum quantum violation for qubits, the resistance to noise, and the minimal detection efficiency required for closing the detection loophole with maximally entangled qubit states, in the case where both detectors have the same efficiency (symmetric case).
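
For orientation, the corresponding numbers for the simplest two-setting case (CHSH) are standard results, not part of this abstract: the maximal qubit violation is Tsirelson's \(2\sqrt{2}\), and with maximally entangled states the symmetric detection efficiency must satisfy

\[ \eta \;>\; \frac{2}{1+\sqrt{2}} \;\approx\; 0.828 \]

to close the detection loophole (Garg and Mermin); the four-setting inequalities listed in the paper generalize such thresholds.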

Authors: C. J. F. van de Ven, G. C. Groenenboom, R. Reuvers, N. P. Landsman

Spontaneous symmetry breaking (SSB) is mathematically tied to either the thermodynamic or the classical limit, but physically, some approximate form of SSB must occur before the limit. For a Schroedinger operator with double well potential in the classical limit, this may indeed be accomplished by the "flea" mechanism discovered in the 1980s by Jona-Lasinio et al. We adapt this mechanism to the Curie-Weiss model (as a paradigmatic mean-field quantum spin system), and also establish an unexpected relationship between this model (for finite N) and a discretized Schroedinger operator with double well potential.

Authors: Pegah Imannezhad, Ali Ahanj

Quantum cognition is an emerging field that makes use of quantum theory to model cognitive phenomena which cannot be explained by classical theories. Usually, in cognitive tests, subjects are asked to give a response to a question; in this paper, we instead simply observed the subjects' behaviour, without applying the question-and-answer method, in order to avoid inducing any mental preconceptions in the participants. Finally, we examined the experimental data on Hardy's non-locality argument (HNA), and we noticed the violation of HNA in human behaviour.

Authors: Isaac Torres, Júlio César Fabris, Oliver Fabio Piattella

We present here a quantum cosmological model with Bohm-de Broglie interpretation of the theory described by a combination of two terms of the Fab Four cosmological theory. The first term is the John Lagrangian and the second is a potential representing matter content to avoid classical trivial solutions. This model has two free functions that provide an adjustment mechanism known classically as self-tuning. The self-tuning is a way to address the cosmological constant problem by allowing a partial break of symmetry in the scalar field sector. The Fab Four is the most general set of self-tuning scalar-tensor gravitational theories in four dimensions. The minisuperspace Hamiltonian thus obtained from this combination of Fab Four terms has fractional powers in the momenta, leading to a problem in applying canonical quantization. We have solved this problem by generalizing the canonical quantization rule using the so-called conformable fractional derivative. We show that this analysis leads to both singular and bouncing (non-singular) solutions, depending on the initial conditions over the scale factor and the homogeneous scalar field, and also depending on the free functions mentioned. This provides an adjustment mechanism in analogy with the classical self-tuning of the Fab Four, but with another interpretation.
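
The "conformable fractional derivative" invoked here is presumably the operator introduced by Khalil et al. (2014); that attribution is an assumption based on standard usage of the term. For \(t > 0\) and \(\alpha \in (0,1]\),

\[ T_\alpha f(t) \;=\; \lim_{\epsilon \to 0} \frac{f\!\left(t + \epsilon\, t^{\,1-\alpha}\right) - f(t)}{\epsilon}, \]

which reduces to the ordinary derivative at \(\alpha = 1\) and is what allows the canonical quantization rule to be generalized to Hamiltonians with fractional powers of the momenta.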

Authors: Lisa Glaser, Sebastian Steinhaus

Computer simulations allow us to explore non-perturbative phenomena in physics. This has the potential to help us understand quantum gravity. Finding a theory of quantum gravity is a hard problem, but in the last decades many promising and intriguing approaches that utilize or might benefit from using numerical methods were developed. These approaches are based on very different ideas and assumptions, yet they face the common challenge to derive predictions and compare them to data. In March 2018 we held a workshop at the Nordic Institute for Theoretical Physics (NORDITA) in Stockholm gathering experts in many different approaches to quantum gravity for a workshop on "Quantum gravity on the computer". In this article we try to encapsulate some of the discussions held and talks given during this workshop and combine them with our own thoughts on why and how numerical approaches will play an important role in pushing quantum gravity forward. The last section of the article is a road map providing an outlook of the field and some intentions and goalposts that were debated in the closing session of the workshop. We hope that it will help to build a strong numerical community reaching beyond single approaches to combine our efforts in the search for quantum gravity.

Authors: Ram Brustein, A.J.M. Medved, K. Yagi

Black hole (BH) thermodynamics was established by Bekenstein and Hawking, who made abstract theoretical arguments about the second law of thermodynamics and quantum theory in curved spacetime respectively. Testing these ideas experimentally has, so far, been impractical because the putative flux of Hawking radiation from astrophysical BHs is too small to be distinguished from the rest of the hot environment. Here, it is proposed that the spectrum of emitted gravitational waves (GWs) after the merger of two BHs, in particular the spectrum of GW150914, can be used to infer a lower limit on the magnitude of the entropy of the post-merger BH. This lower bound is significant as it is the same order as the Bekenstein-Hawking entropy. To infer this limit, we first assume that the result of the merger is an ultracompact object with an external geometry which is Schwarzschild or Kerr, but with an outer surface which is capable of reflecting in-falling GWs rather than fully absorbing them. Because of the absence of deviations from the predictions of general relativity in detected GW signals, we then obtain a bound on the minimal redshift factor of GWs that emerge from the vicinity of the object's surface. The lack of deviations also means that the merger remnant essentially needs to have an absorbing surface, and thus it must effectively be a BH. Finally, a relationship between the minimal redshift factor and the BH entropy, which was first proposed by 't Hooft, is used to set a lower bound on the entropy of the post-merger BH.

Abstract

The PBR theorem gives insight into how quantum mechanics describes a physical system. This paper explores PBRs’ general result and shows that it does not disallow the ensemble interpretation of quantum mechanics and maintains, as it must, the fundamentally statistical character of quantum mechanics. This is illustrated by drawing an analogy with an ideal gas. An ensemble interpretation of the Schrödinger cat experiment that does not violate the PBR conclusion is also given. The ramifications, limits, and weaknesses of the PBR assumptions, especially in light of lessons learned from Bell’s theorem, are elucidated. It is shown that, if valid, PBRs’ conclusion specifies what type of ensemble interpretations are possible. The PBR conclusion would require a more direct correspondence between the quantum state (e.g., \(|\psi\rangle\)) and the reality it describes than might otherwise be expected. A simple terminology is introduced to clarify this greater correspondence.

Abstract

Monism is roughly the view that there is only one fundamental entity. One of the most powerful arguments in its favor comes from quantum mechanics. Extant discussions of quantum monism are framed independently of any interpretation of the quantum theory. In contrast, this paper argues that matters of interpretation play a crucial role when assessing the viability of monism in the quantum realm. I consider four different interpretations: modal interpretations, Bohmian mechanics, many worlds interpretations, and wavefunction realism. In particular, I extensively argue for the following claim: several interpretations of QM do not support monism under closer scrutiny, or do so only with further problematic assumptions, or even support different versions of it.

Author(s): Igor Marinković, Andreas Wallucks, Ralf Riedinger, Sungkun Hong, Markus Aspelmeyer, and Simon Gröblacher

Researchers have experimentally demonstrated two cornerstones of quantum physics—entanglement and Bell inequality violations—with two macroscopic mechanical resonators.


[Phys. Rev. Lett. 121, 220404] Published Thu Nov 29, 2018

Loop quantum gravity redux, ancient automatons, and the weirdness of tropical flora: Books in brief

Loop quantum gravity redux, ancient automatons, and the weirdness of tropical flora: Books in brief, Published online: 28 November 2018; doi:10.1038/d41586-018-07525-4

Barbara Kiser reviews five of the week’s best science picks.

Quantum 2, 108 (2018).

https://doi.org/10.22331/q-2018-11-27-108

Thermodynamics is traditionally constrained to the study of macroscopic systems whose energy fluctuations are negligible compared to their average energy. Here, we push beyond this thermodynamic limit by developing a mathematical framework to rigorously address the problem of thermodynamic transformations of finite-size systems. More formally, we analyse state interconversion under thermal operations and between arbitrary energy-incoherent states. We find precise relations between the optimal rate at which interconversion can take place and the desired infidelity of the final state when the system size is sufficiently large. These so-called second-order asymptotics provide a bridge between the extreme cases of single-shot thermodynamics and the asymptotic limit of infinitely large systems. We illustrate the utility of our results with several examples. We first show how thermodynamic cycles are affected by irreversibility due to finite-size effects. We then provide a precise expression for the gap between the distillable work and work of formation that opens away from the thermodynamic limit. Finally, we explain how the performance of a heat engine gets affected when one of the heat baths it operates between is finite. We find that while perfect work cannot generally be extracted at Carnot efficiency, there are conditions under which these finite-size effects vanish. In deriving our results we also clarify relations between different notions of approximate majorisation.

Publication date: Available online 23 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Kelvin J. McQueen, Lev Vaidman

Abstract

We defend the many-worlds interpretation of quantum mechanics (MWI) against the objection that it cannot explain why measurement outcomes are predicted by the Born probability rule. We understand quantum probabilities in terms of an observer's self-location probabilities. We formulate a probability postulate for the MWI: the probability of self-location in a world with a given set of outcomes is the absolute square of that world's amplitude. We provide a proof of this postulate, which assumes the quantum formalism and two principles concerning symmetry and locality. We also show how a structurally similar proof of the Born rule is available for collapse theories. We conclude by comparing our account to the recent account offered by Sebens and Carroll.

Crowther, Karen (2017) Inter-theory Relations in Quantum Gravity: Correspondence, Reduction, and Emergence. [Preprint]
Marsh, Brendan. Depictions of Quantum Reality in Kent's Interpretation of Quantum Theory. UNSPECIFIED.

Abstract

Historically, the hypothesis that our world is a computer simulation has struck many as just another improbable-but-possible “skeptical hypothesis” about the nature of reality. Recently, however, the simulation hypothesis has received significant attention from philosophers, physicists, and the popular press. This is due to the discovery of an epistemic dependency: If we believe that our civilization will one day run many simulations concerning its ancestry, then we should believe that we are probably in an ancestor simulation right now. This essay examines a troubling but underexplored feature of the ancestor-simulation hypothesis: the termination risk posed by both ancestor-simulation technology and experimental probes into whether our world is an ancestor simulation. This essay evaluates the termination risk by using extrapolations from current computing practices and simulation technology. The conclusions, while provisional, have great implications for debates concerning the fundamental nature of reality and the safety of contemporary physics.

Gao, Shan (2018) Unitary quantum theories are incompatible with special relativity. [Preprint]
Cuffaro, Michael E. (2018) Universality, Invariance, and the Foundations of Computational Complexity in the light of the Quantum Computer. [Preprint]
Cuffaro, Michael E. (2018) Information Causality, the Tsirelson Bound, and the 'Being-Thus' of Things. [Preprint]
American Journal of Physics, Volume 86, Issue 12, Page 957-959, December 2018.
American Journal of Physics, Volume 86, Issue 12, Page 953-955, December 2018.

Quantum 2, 107 (2018).

https://doi.org/10.22331/q-2018-11-19-107

We show that spin systems with infinite-range interactions can violate at thermal equilibrium a multipartite Bell inequality, up to a finite critical temperature $T_c$. Our framework can be applied to a wide class of spin systems and Bell inequalities, to study whether nonlocality occurs naturally in quantum many-body systems close to the ground state. Moreover, we also show that the low-energy spectrum of the Bell operator associated to such systems can be well approximated by the one of a quantum harmonic oscillator, and that spin-squeezed states are optimal in displaying Bell correlations for such Bell inequalities.

In an influential paper, Eppley and Hannah argued that gravity must necessarily be quantized, by proposing a thought experiment involving classical gravitational waves interacting with quantum matter. They argue the interaction must either violate the uncertainty principle or allow superluminal signalling. The feasibility of implementing their experiment in our universe has been challenged by Mattingly, and other limitations of the argument have been noted by Huggett and Callender and by Albers et al. However, these critiques do not directly refute the claim that coupling quantum theories with a Copenhagen collapse postulate to unentanglable classical gravitational degrees of freedom leads to contradiction. I note here that if the gravitational field interacts with matter via the local quantum state, the Eppley–Hannah argument evidently fails. This seems a plausibly natural feature of a hybrid theory, whereas the alternative considered by Eppley–Hannah is evidently inconsistent...
We present a stochastic framework for emergent quantum gravity coupled to matter. The Hamiltonian constraint in diffeomorphism-invariant theories demands the identification of a clock relative to which dynamics may be defined, and other degrees of freedom can play the role of rulers. However, a global system of clock and rulers is generally not available. We provide evidence that stochasticity associated with critical points of clock and ruler fields can be related to the emergence of both a probabilistic description consistent with ordinary quantum theory, and gravitation described by general relativity at long distances. We propose a procedure for embedding any Lorentz-invariant field theory, including the Standard Model and its Lorentz-invariant extensions, in this framework.
Williams, Porter (2018) Renormalization Group Methods. [Preprint]

Publication date: Available online 16 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Benjamin Feintzeig, J.B. Le Manchak, Sarita Rosenstock, James Owen Weatherall

Abstract

We provide a novel perspective on “regularity” as a property of representations of the Weyl algebra. We first critique a proposal by Halvorson [2004, “Complementarity of representations in quantum mechanics”, Studies in History and Philosophy of Modern Physics 35(1), pp. 45–56], who argues that the non-regular “position” and “momentum” representations of the Weyl algebra demonstrate that a quantum mechanical particle can have definite values for position or momentum, contrary to a widespread view. We show that there are obstacles to such an interpretation of non-regular representations. In Part II, we propose a justification for focusing on regular representations, pace Halvorson, by drawing on algebraic methods.

Publication date: Available online 15 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Benjamin Feintzeig, James Owen Weatherall

Abstract

We provide a novel perspective on “regularity” as a property of representations of the Weyl algebra. In Part I, we critiqued a proposal by Halvorson [2004, “Complementarity of representations in quantum mechanics”, Studies in History and Philosophy of Modern Physics 35(1), pp. 45–56], who advocates for the use of the non-regular “position” and “momentum” representations of the Weyl algebra. Halvorson argues that the existence of these non-regular representations demonstrates that a quantum mechanical particle can have definite values for position or momentum, contrary to a widespread view. In this sequel, we propose a justification for focusing on regular representations, pace Halvorson, by drawing on algebraic methods.

Authors: J.D. Franson, R.A. Brewster

The Schrodinger and Heisenberg pictures are equivalent formulations of quantum mechanics in the sense that they give the same expectation value for any operator. We consider a sequence of two or more unitary transformations and show that the Heisenberg operator produced after the first transformation cannot be viewed as the input to the second transformation. The experimental consequences of this are illustrated by several examples in quantum optics.

Authors: David J. Fernandez C

Along the years, supersymmetric quantum mechanics (SUSY QM) has been used for studying solvable quantum potentials. It is the simplest method to build Hamiltonians with prescribed spectra in the spectral design. The key is to pair two Hamiltonians through a finite order differential operator. Some related subjects can be simply analyzed, as the algebras ruling both Hamiltonians and the associated coherent states. The technique has been applied also to periodic potentials, where the spectra consist of allowed and forbidden energy bands. In addition, a link with non-linear second order differential equations, and the possibility of generating some solutions, can be explored. Recent applications concern the study of Dirac electrons in graphene placed either in electric or magnetic fields, and the analysis of optical systems whose relevant equations are the same as those of SUSY QM. These issues will be reviewed briefly in this paper, trying to identify the most important subjects explored currently in the literature.

Authors: D Delphenich

Although the wish to unify theories into something more fundamental is omnipresent and compelling, nonetheless, in a sense, theories must first be unifiable. The reasons for the success of the unification of electricity and magnetism into a theory of electromagnetism are contrasted with the reasons for the failure of the Einstein-Maxwell unification of gravitation and electromagnetism and the attempts of quantum gravity to unify Einstein's theory of gravity with quantum field theory. The difference between a unification of two theories, a concatenation of them, and the existence of a formal analogy between them is also discussed.

Publication date: Available online 13 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Michael E. Cuffaro

Abstract

The principle of ‘information causality’ can be used to derive an upper bound—known as the ‘Tsirelson bound’—on the strength of quantum mechanical correlations, and has been conjectured to be a foundational principle of nature. To date, however, it has not been sufficiently motivated to play such a foundational role. The motivations that have so far been given are, as I argue, either unsatisfactorily vague or appeal to little if anything more than intuition. Thus in this paper I consider whether some way might be found to successfully motivate the principle. And I propose that a compelling way of so doing is to understand it as a generalisation of Einstein's principle of the mutually independent existence—the ‘being-thus’—of spatially distant things. In particular I first describe an argument, due to Demopoulos, to the effect that the so-called ‘no-signalling’ condition can be viewed as a generalisation of Einstein's principle that is appropriate for an irreducibly statistical theory such as quantum mechanics. I then argue that a compelling way to motivate information causality is to in turn consider it as a further generalisation of the Einsteinian principle that is appropriate for a theory of communication. I describe, however, some important conceptual obstacles that must yet be overcome if the project of establishing information causality as a foundational principle of nature is to succeed.

Publication date: Available online 13 November 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Baptiste Le Bihan, Niels Linnemann

Abstract

Important features of space and time are taken to be missing in quantum gravity, allegedly requiring an explanation of the emergence of spacetime from non-spatio-temporal theories. In this paper, we argue that the explanatory gap between general relativity and non-spatio-temporal quantum gravity theories might significantly be reduced with two moves. First, we point out that spacetime is already partially missing in the context of general relativity when understood from a dynamical perspective. Second, we argue that most approaches to quantum gravity already start with an in-built distinction between structures to which the asymmetry between space and time can be traced back.

Barren plateaus in quantum neural network training landscapes

Barren plateaus in quantum neural network training landscapes, Published online: 16 November 2018; doi:10.1038/s41467-018-07090-4

Gradient-based hybrid quantum-classical algorithms are often initialised with random, unstructured guesses. Here, the authors show that this approach will fail in the long run, due to the exponentially-small probability of finding a large enough gradient along any direction.
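
A toy illustration of the underlying concentration-of-measure effect (a sketch of the phenomenon, not the authors' circuit experiment): for Haar-random states, the expectation of a single-qubit observable has variance of order \(2^{-n}\), so any gradient built from such expectations is exponentially small in the number of qubits.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_state(n_qubits):
    """Sample a Haar-random pure state on n_qubits qubits."""
    dim = 2 ** n_qubits
    z = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return z / np.linalg.norm(z)

for n in range(2, 11, 2):
    # <Z> on the first qubit: +|amp|^2 over the first half of the basis,
    # -|amp|^2 over the second half (qubit 0 taken as the leading bit).
    samples = []
    for _ in range(2000):
        p = np.abs(haar_state(n)) ** 2
        half = p.size // 2
        samples.append(p[:half].sum() - p[half:].sum())
    print(n, np.var(samples))  # variance shrinks roughly like 2**(-n)
```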

How does gravity work at the particle level? The question has stumped physicists since the two bedrock theories of general relativity (Albert Einstein’s equations envisioning gravity as curves in the geometry of space-time) and quantum mechanics (equations that describe particle interactions) revolutionized the discipline about a century ago.

One challenge to solving the problem lies in the relative weakness of gravity compared with the strong, weak and electromagnetic forces that govern the subatomic realm. Though gravity exerts an unmistakable influence on macroscopic objects like orbiting planets, leaping sharks and everything else we physically experience, it produces a negligible effect at the particle level, so physicists can’t test or study how it works at that scale.

Confounding matters, the two sets of equations don’t play well together. General relativity paints a continuous picture of space-time while in quantum mechanics everything is quantized in discrete chunks. Their incompatibility leads physicists to suspect that a more fundamental theory is needed to unify all four forces of nature and describe them at all scales.

One relatively recent approach to understanding quantum gravity makes use of a “holographic duality” from string theory called the AdS-CFT correspondence. Our latest In Theory video explains how this correspondence connects a lower dimensional particle theory to a higher dimensional space that includes gravity:

This holographic duality has become a powerful theoretical tool in the quest to understand quantum gravity and the inner workings of black holes and the Big Bang, where extreme gravity operates at tiny scales.

We hope you enjoyed this second episode from season two of Quanta’s In Theory video series. Season two opened in August with an animated explainer about a mysterious mathematical pattern that has been discovered in disparate settings — in the energy spectra of heavy atomic nuclei, a function related to the distribution of prime numbers, an independent bus system in Mexico, spectral measurements of the internet, Arctic ponds, human bones and the color-sensitive cone cells in chicken eyes. To learn more, watch episode one below:


The most ambitious experiments yet show that the quantum weirdness Einstein famously hated rules the roost – not just here, but across the entire universe

Author(s): Axel Schild

The local conservation of a physical quantity whose distribution changes with time is mathematically described by the continuity equation. The corresponding time parameter, however, is defined with respect to an idealized classical clock. We consider what happens when this classical time is replaced...


[Phys. Rev. A 98, 052113] Published Mon Nov 12, 2018

Author(s): Zhikuan Zhao, Robert Pisarczyk, Jayne Thompson, Mile Gu, Vlatko Vedral, and Joseph F. Fitzsimons

The traditional formalism of nonrelativistic quantum theory allows the state of a quantum system to extend across space, but restricts it to a single instant in time, leading to a distinction between theoretical treatments of spatial and temporal quantum correlations. Here we unify the geometrica...


[Phys. Rev. A 98, 052312] Published Mon Nov 12, 2018

Quantum fractals

Quantum fractals, Published online: 12 November 2018; doi:10.1038/s41567-018-0327-1

Electrons with fractional dimension have been observed in an artificial Sierpiński triangle, demonstrating their quantum fractal nature.

Design and characterization of electrons in a fractal geometry

Design and characterization of electrons in a fractal geometry, Published online: 12 November 2018; doi:10.1038/s41567-018-0328-0

Electrons are confined to an artificial Sierpiński triangle. Microscopy measurements show that their wavefunctions become self-similar and their quantum properties inherit a non-integer dimension between 1 and 2.

Authors: Jonathan Oppenheim

We present a consistent theory of classical gravity coupled to quantum field theory. The dynamics is linear in the density matrix, completely positive and trace-preserving, and reduces to Einstein's equations in the classical limit. The constraints of general relativity are imposed as a symmetry on the equations of motion. The assumption that gravity is classical necessarily modifies the dynamical laws of quantum mechanics -- the theory must be fundamentally stochastic involving finite sized and probabilistic jumps in space-time and in the quantum field. Nonetheless the quantum state of the system can remain pure conditioned on the classical degrees of freedom. The measurement postulate of quantum mechanics is not needed since the interaction of the quantum degrees of freedom with classical space-time necessarily causes collapse of the wave-function. More generally, we derive a form of classical-quantum dynamics using a non-commuting divergence which has as its limit deterministic classical Hamiltonian evolution, and which doesn't suffer from the pathologies of the semi-classical theory.

Authors: Rafael Sánchez, Janine Splettstoesser, Robert S. Whitney

Maxwell demons are creatures that are imagined to be able to reduce the entropy of a system without performing any work on it. Conventionally, such a Maxwell demon's intricate action consists in measuring individual particles and subsequently performing feedback. Here we show that much simpler setups can still act as demons: we demonstrate that it is sufficient to exploit a non-equilibrium distribution to seemingly break the second law of thermodynamics. We propose both an electronic and an optical implementation of this phenomenon, realizable with current technology.

Authors: Sreenath K. Manikandan, Andrew N. Jordan

We propose an analogy between the quantum physics of a black hole in its late stages of the evaporation process and a superfluid Bose Einstein Condensate (BEC), based on the Horowitz and Maldacena quantum final state projection model [JHEP 2004(02), 008]. The superfluid region is considered to be analogous to the interior of a black hole, and the normal fluid/superfluid interface is compared to the event horizon of a black hole. We theoretically investigate the possibility of recovering the wavefunction of particles incident on a superfluid BEC from the normal fluid, facilitated by the mode conversion processes occurring at the normal fluid/superfluid BEC interface. We also study how the correlations of an infalling mode with an external memory system can be preserved in the process, similar to Hayden and Preskill's "information mirror" model for a black hole [JHEP 2007(09), 120]. Based on these analogies, we conjecture that the quantum state of bosons entering a black hole in its final state is the superfluid quantum ground state of interacting bosons. Our analogy suggests that the wavefunction of bosons falling into a black hole can be recovered from the outgoing Hawking modes. In the particular case when a hole-like quasiparticle (a density dip) is incident on the superfluid BEC causing the superfluid to shrink in size, our model indicates that the evaporation is unitary.

Klevgard, Paul (2018) Wave-Particle Duality: A New Look from First Principles. [Preprint]
Crowther, Karen and Linnemann, Niels and Wuthrich, Christian (2018) What we cannot learn from analogue experiments. [Preprint]
Castellani, Elena and Dardashti, Radin (2018) Symmetry breaking. [Preprint]
Lazarovici, Dustin (2018) On Boltzmann vs. Gibbs and the Equilibrium in Statistical Mechanics. [Preprint]
Leegwater, Gijs (2018) When Greenberger, Horne and Zeilinger meet Wigner's Friend. [Preprint]
Meincke, Anne Sophie (2018) The Disappearance of Change: Towards a Process Account of Persistence. [Preprint]

Quenching our thirst for universality

Quenching our thirst for universality, Published online: 07 November 2018; doi:10.1038/d41586-018-07272-6

Understanding the dynamics of quantum systems far from equilibrium is one of the most pressing issues in physics. Three experiments based on ultracold atomic systems provide a major step forward.
Le Bihan, Baptiste (2018) Space Emergence in Contemporary Physics: Why We Do Not Need Fundamentality, Layers of Reality and Emergence. [Preprint]
Gao, Shan (2018) Unitary quantum theory is incompatible with special relativity. [Preprint]

Quantum 2, 104 (2018).

https://doi.org/10.22331/q-2018-11-06-104

Using the existing classification of all alternatives to the measurement postulates of quantum theory, we study the properties of bipartite systems in these alternative theories. We prove that in all these theories the purification principle is violated, meaning that some mixed states are not the reduction of a pure state in a larger system. This allows us to derive the measurement postulates of quantum theory from the structure of pure states and reversible dynamics, together with the requirement that the purification principle holds. The violation of the purification principle implies that there is some irreducible classicality in these theories, which appears to be an important clue for the problem of deriving the Born rule within the many-worlds interpretation. We also prove that in all such modifications the task of state tomography with local measurements is impossible, and we present a simple toy theory displaying all these exotic non-quantum phenomena. This toy model shows that, contrary to previous claims, it is possible to modify the Born rule without violating the no-signalling principle. Finally, we argue that the quantum measurement postulates are the most non-classical amongst all alternatives.
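For orientation, the purification principle says every mixed state is the marginal of a pure state on a larger system. In standard quantum theory it holds by explicit construction (a textbook fact, added here for context):

$$\rho_A = \sum_i p_i |i\rangle\langle i| \ \Longrightarrow\ |\psi\rangle_{AB} = \sum_i \sqrt{p_i}\,|i\rangle_A|i\rangle_B, \qquad \mathrm{Tr}_B\,|\psi\rangle\langle\psi|_{AB} = \rho_A.$$

The result above is that this construction fails in every alternative to the quantum measurement postulates.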

Crowther, Karen (2018) What is the point of reduction in science? [Preprint]

Author(s): Pavel Sekatski, Jean-Daniel Bancal, Sebastian Wagner, and Nicolas Sangouard

Bell’s theorem has been proposed to certify, in a device-independent and robust way, blocks either producing or measuring quantum states. In this Letter, we provide a method based on Bell’s theorem to certify coherent operations for the storage, processing, and transfer of quantum information. This ...


[Phys. Rev. Lett. 121, 180505] Published Fri Nov 02, 2018

Abstract

Understanding the emergence of a tangible 4-dimensional space-time from a quantum theory of gravity promises to be a tremendously difficult task. This article makes the case that this task may not have to be carried out: space-time as we know it may be fundamental to begin with. I recall the common arguments against this possibility and review a class of recently discovered models that bypass the most serious objection. The generic solution of the measurement problem that is tied to semiclassical gravity, as well as the difficulty of the alternative, makes it a reasonable default option in the absence of decisive experimental evidence.

Abstract

Complexified Liénard–Wiechert potentials simplify the mathematics of Kerr–Newman particles. Here we constrain them by fiat to move along Bohmian trajectories to see if anything interesting occurs, as their equations of motion are not known. A covariant theory due to Stueckelberg is used. This paper deviates from the traditional Bohmian interpretation of quantum mechanics since the electromagnetic interactions of Kerr–Newman particles are dictated by general relativity. A Gaussian wave function is used to produce the Bohmian trajectories, which are found to be multi-valued. A generalized analytic continuation is introduced which leads to an infinite number of trajectories. These include the entire set of Bohmian trajectories. This leads to multiple retarded times which come into play in complex space-time. If one weights these trajectories by their natural Bohmian weighting factors, then it is found that the particles do not radiate, that they are extended, and that they can have a finite electrostatic self energy, thus avoiding the usual divergence of the charged point particle. This effort does not in any way criticize or downplay the traditional Bohmian interpretation which does not assume the standard electromagnetic coupling to charged particles, but it suggests that a hybridization of Kerr–Newman particle theory with Bohmian mechanics might lead to interesting new physics, and maybe even the possibility of emergent quantum mechanics.

Abstract

The significance of the de Broglie/Bohm hidden-particle position in the relativistic regime is addressed, seeking connection to the (orthodox) single-particle Newton–Wigner position. The effect of non-positive excursions of the ensemble density for extreme cases of positive-energy waves is easily computed using an integral of the equations of motion developed here for free spin-0 particles in 1 + 1 dimensions and is interpreted in terms of virtual-like pair creation and annihilation beneath the Compton wavelength. A Bohm-theoretic description of the acausal explosion of a specific Newton–Wigner-localized state is presented in detail. The presence of virtual pairs found is interpreted as the Bohm picture of the spatial extension beyond single point particles proposed in the 1960s as to why space-like hyperplane dependence of the Newton–Wigner wavefunctions may be needed to achieve Lorentz covariance. For spin-1/2 particles the convective current is speculatively utilized for achieving parity with the spin-0 theory. The spin-0 improper quantum potential is generalized to an improper stress tensor for spin-1/2 particles.

Author(s): Dmitry V. Zhdanov, Denys I. Bondar, and Tamar Seideman

A quantum analog of friction (understood as a completely positive, Markovian, translation-invariant, phenomenological model of dissipation) is known to be at odds with detailed balance in the thermodynamic limit. We show that this is not the case for quantum systems with internal (e.g., spin) states...


[Phys. Rev. A 98, 042133] Published Mon Oct 29, 2018

Publication date: Available online 19 October 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Trevor Teitel

Abstract

Background independence begins life as an informal property that a physical theory might have, often glossed as ‘doesn't posit a fixed spacetime background’. Interest in trying to offer a precise account of background independence has been sparked by the pronouncements of several theorists working on quantum gravity that background independence embodies in some sense an essential discovery of the General Theory of Relativity, and a feature we should strive to carry forward to future physical theories. This paper has two goals. The first is to investigate what a world must be like in order to be truly described by a background independent theory given extant accounts of background independence. The second is to argue that there are no non-empirical reasons to be more confident in theories that satisfy extant accounts of background independence than in theories that don't. The paper concludes by drawing a general moral about a way in which focussing primarily on mathematical formulations of our physical theories can adversely affect debates in the metaphysics of physics.

Abstract

According to recent arguments for panpsychism, all (or most) physical properties are dispositional, dispositions require categorical grounds, and the only categorical properties we know are phenomenal properties. Therefore, phenomenal properties can be posited as the categorical grounds of all (or most) physical properties—in order to solve the mind–body problem and/or in order to avoid noumenalism about the grounds of the physical world. One challenge to this case comes from dispositionalism, which agrees that all physical properties are dispositional, but denies that dispositions require categorical grounds. In this paper, I propose that this challenge can be met by the claim that the only (fundamentally) dispositional properties we know are phenomenal properties, in particular, phenomenal properties associated with agency, intention and/or motivation. Versions of this claim have been common in the history of philosophy, and have also been supported by a number of contemporary dispositionalists (and other realists about causal powers). I will defend a new and updated version of it. Combined with other premises from the original case for panpsychism—which are not affected by the challenge from dispositionalism—it forms an argument that dispositionalism entails panpsychism.

Abstract

Parallel lives (PL) is an ontological model of nature in which quantum mechanics and special relativity are unified in a single universe with a single space-time. Point-like objects called lives are the only fundamental objects in this space-time, and they propagate at or below c, and interact with one another only locally at point-like events in space-time, very much like classical point particles. Lives are not alive in any sense, nor do they possess consciousness or any agency to make decisions—they are simply point objects which encode memory at events in space-time. The only causes and effects in the universe occur when lives meet locally, and thus the causal structure of interaction events in space-time is Lorentz invariant. Each life traces a continuous world-line through space-time, and experiences its own relative world, fully defined by the outcomes of past events along its world-line (never superpositions), which are encoded in its external memory. A quantum field comprises a continuum of lives throughout space-time, and familiar physical systems like particles each comprise a sub-continuum of the lives of the field. Each life carries a hidden internal memory containing a local relative wavefunction, which is a local piece of a pure universal wavefunction, but it is the relative wavefunctions in the local memories throughout space-time which are physically real in PL, and not the universal wavefunction in configuration space. Furthermore, while the universal wavefunction tracks the average behavior of the lives of a system, it fails to track their individual dynamics and trajectories. There is always a preferred separable basis, and for an irreducible physical system, each orthogonal term in this basis is a different relative world—each containing some fraction of the lives of the system. The relative wavefunctions in the lives’ internal memories govern which lives of different systems can meet during future local interactions, and thereby enforce entanglement correlations—including Bell inequality violations. These, and many other details, are explored here, but several aspects of this framework are not yet fleshed out, and work is ongoing.

Quantum tunneling of a black hole into a white hole provides a model for the full life cycle of a black hole. The white hole acts as a long-lived remnant, providing a possible resolution to the information paradox. The remnant solution of the paradox has long been viewed with suspicion, mostly because remnants seemed to be such exotic objects. We point out that (i) established physics includes objects with precisely the required properties for remnants: white holes with small masses but large finite interiors; (ii) non-perturbative quantum gravity indicates that a black hole tunnels precisely into such a white hole, at the end of its evaporation. We address the objections to the existence of white-hole remnants, discuss their stability, and show how the notions of entropy relevant in this context allow them to evade several no-go arguments. A black hole’s formation, evaporation, tunneling to a white hole, and final slow decay, form a unitary process that does not violate any kno...

Author(s): Luca Mancino, Vasco Cavina, Antonella De Pasquale, Marco Sbroscia, Robert I. Booth, Emanuele Roccia, Ilaria Gianani, Vittorio Giovannetti, and Marco Barbieri

Theoretical bounds on irreversible entropy production in a thermalizing quantum system are supported by experiments simulating the thermalization of a qubit using a quantum photonic architecture.


[Phys. Rev. Lett. 121, 160602] Published Wed Oct 17, 2018

Abstract

In-principle restrictions on the amount of information that can be gathered about a system have been proposed as a foundational principle in several recent reconstructions of the formalism of quantum mechanics. However, it seems unclear precisely why one should be thus restricted. We investigate the notion of paradoxical self-reference as a possible origin of such epistemic horizons by means of a fixed-point theorem in Cartesian closed categories due to Lawvere that illuminates and unifies the different perspectives on self-reference.
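The fixed-point theorem invoked here is standard, so it may help to state it (quoted from the usual formulation, not from the paper): in a cartesian closed category, if $f\colon A \to Y^A$ is point-surjective, then every endomorphism $t\colon Y \to Y$ has a fixed point. The proof is the diagonal argument in one line:

$$g(a) := t\big(f(a)(a)\big), \qquad g = f(a_0)\ \text{for some } a_0 \ \Longrightarrow\ f(a_0)(a_0) = g(a_0) = t\big(f(a_0)(a_0)\big),$$

so $f(a_0)(a_0)$ is the desired fixed point; Cantor's theorem and Gödelian self-reference are instances of the same schema.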

We study an extension of spacetime across Schwarzschild’s central singularity and the behavior of the geodesics crossing it. Locality implies that this extension is independent of the future fate of black holes. We argue that this extension could be the $\hbar \to 0$ limit of the effective quantum geometry inside a black hole, and show that the central region contains causal diamonds with area satisfying Bousso’s bound for an entropy that can be as large as Hawking’s radiation entropy. This result sheds light on the possibility that Hawking radiation is purified by information crossing the internal singularity and supports the black hole to white hole transition scenario.
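Bousso's bound, referenced here, limits the entropy $S$ passing through a light-sheet of a surface of area $A$; in Planck units ($G=\hbar=c=k_B=1$) it reads (standard statement, added for context):

$$S \le \frac{A}{4}.$$

The point of the abstract is that causal diamonds inside the extended region have enough area, by this measure, to accommodate an entropy as large as that of the Hawking radiation.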

On Formalisms and Interpretations


Quantum 2, 99 (2018).

https://doi.org/10.22331/q-2018-10-15-99

One of the reasons for the heated debates around the interpretations of quantum theory is a simple confusion between the notions of formalism versus interpretation. In this note, we make a clear distinction between them and show that there are actually two inequivalent quantum formalisms, namely the relative-state formalism and the standard formalism with the Born and measurement-update rules. We further propose a different probability rule for the relative-state formalism and discuss how Wigner's-friend-type experiments could show the inequivalence with the standard formalism. The feasibility in principle of such experiments, however, remains an open question.
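For concreteness, the Born and measurement-update rules that define the standard formalism take, in their projective textbook form (the note itself treats them more abstractly):

$$p(k) = \langle\psi|P_k|\psi\rangle, \qquad |\psi\rangle \ \mapsto\ \frac{P_k|\psi\rangle}{\sqrt{p(k)}},$$

where the $P_k$ are the orthogonal projectors associated with the measured observable; the relative-state formalism keeps the unitary dynamics but drops these two postulates.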

Abstract
In a quantum universe with a strong arrow of time, we postulate a low-entropy boundary condition (the past hypothesis) to account for the temporal asymmetry. In this paper, I show that the past hypothesis also contains enough information to simplify the quantum ontology and define a natural initial condition. First, I introduce density matrix realism, the thesis that the quantum state of the universe is objective and impure. This stands in sharp contrast to wave function realism, the thesis that the quantum state of the universe is objective and pure. Second, I suggest that the past hypothesis is sufficient to determine a natural density matrix, which is simple and unique. This is achieved by what I call the initial projection hypothesis: the initial density matrix of the universe is the (normalized) projection onto the past hypothesis subspace (in the Hilbert space). Third, because the initial quantum state is unique and simple, we have a strong case for the nomological thesis: the initial quantum state of the universe is on a par with laws of nature. This new package of ideas has several interesting implications, including implications for the harmony between statistical mechanics and quantum mechanics, the theoretical unity of the universe and its subsystems, and the alleged conflict between Humean supervenience and quantum entanglement.
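The initial projection hypothesis stated here can be written in one line: if $P_{PH}$ is the projector onto the past-hypothesis subspace of the universal Hilbert space, the initial density matrix is its normalization,

$$\rho(t_0) = \frac{P_{PH}}{\mathrm{Tr}\,P_{PH}},$$

which is unique and simple once the subspace is fixed.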

Author(s): Astrid Eichhorn and Aaron Held

The hypothesized asymptotic safe behavior of gravity may be used to retrodict top and bottom quark masses by tracking the effect of quantum gravity fluctuations on matter fields.


[Phys. Rev. Lett. 121, 151302] Published Fri Oct 12, 2018

Author(s): Eliahu Cohen and Eli Pollak

Weak values have been shown to be helpful especially when considering them as the outcomes of weak measurements. In this paper we show that, in principle, the real and imaginary parts of the weak value of any operator may be elucidated from expectation values of suitably defined density, flux, and H...


[Phys. Rev. A 98, 042112] Published Tue Oct 09, 2018
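For context, the weak value of an operator $A$, with pre-selected state $|\psi\rangle$ and post-selected state $|\phi\rangle$, is the generally complex quantity (standard definition, added here for orientation):

$$A_w = \frac{\langle\phi|A|\psi\rangle}{\langle\phi|\psi\rangle},$$

whose real and imaginary parts are what the scheme above proposes to read off from ordinary expectation values.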

Abstract

Fragmentalism was first introduced by Kit Fine in his ‘Tense and Reality’ (Modality and tense: philosophical papers, Oxford University Press, Oxford, pp 261–320, 2005). According to fragmentalism, reality is an inherently perspectival place that exhibits a fragmented structure. The current paper defends the fragmentalist interpretation of the special theory of relativity, which Fine briefly considers in his paper. The fragmentalist interpretation makes room for genuine facts regarding absolute simultaneity, duration and length. One might worry that positing such variant properties is a turn for the worse in terms of theoretical virtues because such properties are not involved in physical explanations and hence theoretically redundant. It will be argued that this is not right: if variant properties are indeed instantiated, they will also be involved in straightforward physical explanations and hence not explanatorily redundant. Hofweber and Lange, in their ‘Fine’s Fragmentalist Interpretation of Special Relativity’ (Noûs 51:871–883, 2017), object that the fragmentalist interpretation is in tension with the right explanation of the Lorentz transformations. It will be argued that their objection targets an inessential aspect of the fragmentalist framework and fails to raise any serious problem for the fragmentalist interpretation of special relativity.

Author(s): J. Nobakht, M. Carlesso, S. Donadi, M. Paternostro, and A. Bassi

The continuous spontaneous localization (CSL) model strives to describe the quantum-to-classical transition from the viewpoint of collapse models. However, its original formulation suffers from a fundamental inconsistency in that it is explicitly energy nonconserving. Fortunately, a dissipative exte...


[Phys. Rev. A 98, 042109] Published Mon Oct 08, 2018

Author(s): Jakub Rembieliński and Jacek Ciborowski

We introduce a variant of quantum and classical electrodynamics formulated on the grounds of a hypothesis of existence of a preferred frame of reference—a formalism complementary to that regarding the structure of the space of photonic states, presented by us recently [Phys. Rev. A 97, 062106 (2018)...


[Phys. Rev. A 98, 042107] Published Thu Oct 04, 2018

Abstract

In physics, one is often misled into thinking that the mathematical model of a system is part of, or is, that system itself. Think of expressions commonly used in physics like “point” particle, motion “on the line”, “smooth” observables, wave function, and even “going to infinity”, without forgetting perplexing phrases like “classical world” versus “quantum world”.... On the other hand, when a mathematical model becomes really inoperative with regard to correct predictions, one is forced to replace it with a new one. This is precisely what happened with the emergence of quantum physics. Classical models were (progressively) superseded by quantum ones through quantization prescriptions. These procedures often appear as ad hoc recipes. In the present paper, well-defined quantizations, based on integral calculus and Weyl–Heisenberg symmetry, are described in simple terms through one of the most basic examples of mechanics. Starting from (quasi-)probability distribution(s) on the Euclidean plane, viewed as the phase space for the motion of a point particle on the line, i.e., its classical model, we will show how to build corresponding quantum model(s) and associated probability (e.g. Husimi) or quasi-probability (e.g. Wigner) distributions. We highlight the regularizing rôle of such procedures with the familiar example of the motion of a particle with a variable mass and submitted to a step potential.
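A minimal instance of the integral quantization described above, assuming the standard coherent states $|z\rangle$ of the Weyl–Heisenberg group (the paper admits more general weights), maps a phase-space function $f$ to the operator

$$A_f = \int_{\mathbb{C}} f(z)\,|z\rangle\langle z|\,\frac{d^2z}{\pi},$$

and in the reverse direction the Husimi distribution of a state $\rho$ is $Q_\rho(z) = \langle z|\rho|z\rangle/\pi$; other choices of weight yield the Wigner quasi-probability instead.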

Abstract

The Horizon Quantum Mechanics is an approach that allows one to analyse the gravitational radius of spherically symmetric systems and compute the probability that a given quantum state is a black hole. We first review the (global) formalism and show how it reproduces a gravitationally inspired GUP relation. This result leads to unacceptably large fluctuations in the horizon size of astrophysical black holes if one insists on describing them as (smeared) central singularities. On the other hand, if they are extended systems, like in the corpuscular models, no such issue arises, and one can in fact extend the formalism to include asymptotic mass and angular momentum with the harmonic model of rotating corpuscular black holes. The Horizon Quantum Mechanics then shows that, in simple configurations, the appearance of the inner horizon is suppressed and extremal (macroscopic) geometries seem disfavoured.

Abstract

It is shown that the nonlocal anomalous effective actions corresponding to the quantum breaking of the conformal symmetry can lead to observable modifications of Einstein’s equations. The fact that Einstein’s general relativity is in perfect agreement with all observations including cosmological or recently observed gravitational waves imposes strong restrictions on the field content of possible extensions of Einstein’s theory: all viable theories should have vanishing conformal anomalies. It is shown that a complete cancellation of conformal anomalies in \(D=4\) for both the \(C^2\) invariant and the Euler (Gauss–Bonnet) invariant can only be achieved for N-extended supergravity multiplets with \(N \ge 5\) .

Volume 4, Issue 4, pages 235-246

A. I. Arbab [Show Biography]

Arbab Ibrahim studied physics at Khartoum University and high energy physics at the International Centre for Theoretical Physics (ICTP), Italy. He has taught physics at Khartoum University and Qassim University, and he is currently a Professor of Physics. He has been a visiting scholar at the University of Illinois, Urbana-Champaign, Towson University, and Sultan Qaboos University. His work concentrates on the formulation of quantum mechanics and electromagnetism using quaternions. He has published across a wide range of topics in theoretical physics. He is an active reviewer for many international journals.

By expressing the Schrödinger wavefunction in the form ψ = Re^{iS}, where R and S are real functions, we have shown that the expectation value of S is conserved. The amplitude of the wave (R) is found to satisfy the Schrödinger equation, while the phase (S) is related to energy conservation. Besides the quantum potential, which depends on R, we obtain a phase potential that depends on the derivative of the phase S. The phase force is a dissipative force. The quantum potential may be attributed to the interaction between the two subfields S and R comprising the quantum particle. This results in splitting (creation/annihilation) of these subfields, each having a mass mc² with an internal frequency of 2mc²/h, satisfying the original wave equation and endowing the particle with its quantum nature. The mass of one subfield reflects the interaction with the other subfield. If, in the Bohmian ansatz, R satisfies the Klein–Gordon equation, then S must satisfy the wave equation. Conversely, if R satisfies the wave equation, then S yields the Einstein relativistic energy–momentum equation.
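For comparison with the decomposition used above: in the common convention $\psi = Re^{iS/\hbar}$ (the paper's phase is dimensionless, so conventions may differ), substituting into the Schrödinger equation splits it into a continuity equation and a quantum Hamilton–Jacobi equation, with the quantum potential depending on $R$ alone:

$$\frac{\partial R^2}{\partial t} + \nabla\cdot\left(R^2\,\frac{\nabla S}{m}\right) = 0, \qquad \frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V + Q = 0, \qquad Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R}.$$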

Full Text Download (210k)

Volume 4, Issue 4, pages 247-267

Sebastian Fortin [Show Biography] and Olimpia Lombardi [Show Biography]

Olimpia Lombardi obtained her degree in Electronic Engineering and in Philosophy at the University of Buenos Aires, and her PhD in Philosophy at the same university. She is a Principal Researcher at the National Scientific and Technical Research Council of Argentina. She is a member of the Académie Internationale de Philosophie des Sciences and of the Foundational Questions Institute. She is the director of the Group of Philosophy of Science at the University of Buenos Aires. Areas of interest: foundations of statistical mechanics, the problem of the arrow of time, interpretation of quantum mechanics, the nature of information, philosophy of chemistry.

Sebastian Fortin holds a degree and a PhD in Physics from the University of Buenos Aires and a PhD in Epistemology and History of Science from the National University of Tres de Febrero, Argentina. He is a Researcher at the National Scientific and Technical Research Council of Argentina and an assistant professor in the Physics Department of the Faculty of Exact and Natural Sciences at the University of Buenos Aires. His field of interest is the philosophy of physics, particularly the foundations of quantum mechanics.

If decoherence is an irreversible process, its physical meaning might be clarified by comparing quantum and classical irreversibility. In this work we carry out this comparison, from which a unified view of the emergence of irreversibility arises, applicable both to the classical and to the quantum case. According to this unified view, in the two cases the irreversible macro-level arises from the reversible micro-level as a coarse description that can be understood in terms of the concept of projection. This position supplies an understanding of the phenomenon of decoherence different from that implicit in most presentations: the reduced state is not the quantum state of the open system, but a coarse state of the closed composite system; as a consequence, decoherence should be understood not as a phenomenon resulting from the interaction between an open system and its environment, but rather as a coarse evolution that emerges from disregarding certain degrees of freedom of the whole closed system.
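The projection reading of decoherence sketched here can be made concrete with the partial trace (a standard identity, added for context): the reduced state $\rho_A = \mathrm{Tr}_B\,\rho$ is the coarse description of the closed composite that preserves exactly the subsystem's expectation values,

$$\mathrm{Tr}\big(\rho\,(O_A \otimes I_B)\big) = \mathrm{Tr}\big(\rho_A\,O_A\big) \quad \text{for all observables } O_A,$$

so on the authors' view $\rho_A$ is a coarse state of the whole, not the quantum state of the open system.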

Full Text Download (923k)

Volume 4, Issue 4, pages 223-234

Mohammed Sanduk [Show Biography]

Mohammed Sanduk is an Iraqi-born British physicist. He was educated at the University of Baghdad and the University of Manchester. Before his undergraduate study, he published a book in particle physics entitled “Mesons”. Sanduk has worked in industry and academia; his last post in Iraq was head of the Laser and Opto-electronics Engineering department at Nahrain University in Baghdad. Owing to his interest in the philosophy of science, he was a member of the academic staff of Pontifical Babel College for Philosophy. Sanduk works in the department of chemical and process engineering at the University of Surrey. He is interested in the transport of charged particles, magnetohydrodynamics, and renewable energy technology. In addition, Sanduk is interested in the foundations of quantum mechanics and the philosophy of science and technology.

In an earlier article, an approach was developed to form an analogy of the wave function and to derive analogies for the mathematical forms of both the Dirac and Klein–Gordon equations. The analogies obtained were the transformations from the classical real model forms to the forms in complex space. The analogue of the Klein–Gordon equation was derived from the analogous Dirac equation, as in the case of quantum mechanics. In the present work, the forms of the Dirac and Klein–Gordon equations are derived as a direct transformation from the classical model. It was found that the Dirac equation form may be related to a complex velocity equation, with Dirac's Hamiltonian and coefficients having counterparts in this analogy. The Klein–Gordon equation form may be related to the complex acceleration equation, and the complex acceleration equation can explain the generation of flat spacetime. Although this approach is classical, it may show a possibility of unifying relativistic quantum mechanics and special relativity in a single model, and it may throw light on the undetectable æther.

Full Text Download (576k)

Author(s): Ezad Shojaee, Christopher S. Jackson, Carlos A. Riofrío, Amir Kalev, and Ivan H. Deutsch

The spin-coherent-state positive-operator-valued-measure (POVM) is a fundamental measurement in quantum science, with applications including tomography, metrology, teleportation, benchmarking, and measurement of Husimi phase space probabilities. We prove that this POVM is achieved by collectively me...


[Phys. Rev. Lett. 121, 130404] Published Wed Sep 26, 2018

Author(s): Ding Jia (贾丁)

There has been a body of work deriving the complex Hilbert-space structure of quantum theory from axioms/principles/postulates to deepen our understanding of quantum theory and to reveal ways to go beyond it to resolve foundational issues. Recent progress in incorporating indefinite causal structure...


[Phys. Rev. A 98, 032112] Published Wed Sep 19, 2018

Publication date: Available online 24 August 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): R. Hermens

Abstract

Three recent arguments seek to show that the universal applicability of unitary quantum theory is inconsistent with the assumption that a well-conducted measurement always has a definite physical outcome. In this paper I restate and analyze these arguments. The import of the first two is diminished by their dependence on assumptions about the outcomes of counterfactual measurements. But the third argument establishes its intended conclusion. Even if every well-conducted quantum measurement we ever make will have a definite physical outcome, this argument should make us reconsider the objectivity of that outcome.

Abstract

We review the argument that latent image formation is a measurement in which the state vector collapses, requiring an enhanced noise parameter in objective reduction models. Tentative observation of a residual noise at this level, plus several experimental bounds, imply that the noise must be colored (i.e., non-white), and hence frame dependent and non-relativistic. Thus a relativistic objective reduction model, even if achievable in principle, would be incompatible with experiment; the best one can do is the non-relativistic CSL model. This negative conclusion has a positive aspect, in that the non-relativistic CSL reduction model evades the argument leading to the Conway–Kochen “Free Will Theorem”.

Author(s): Tatsuma Nishioka

In this review the entanglement and Renyi entropies in quantum field theory are described from different points of view, including the perturbative approach and holographic dualities. The applications of these results to constraining renormalization group flows are presented effectively and illustrated with a variety of examples.


[Rev. Mod. Phys. 90, 035007] Published Mon Sep 17, 2018
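The central objects of the review above are, for a reduced state $\rho_A$, the Rényi entropies and their $n \to 1$ limit, the entanglement entropy:

$$S_n = \frac{1}{1-n}\,\ln \mathrm{Tr}\,\rho_A^{\,n}, \qquad \lim_{n\to 1} S_n = -\mathrm{Tr}\,\rho_A \ln \rho_A.$$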

Author(s): Ricardo Ximenes, Fernando Parisio, and Eduardo O. Dias

The question of how long a particle takes to pass through a potential barrier is still a controversial topic in quantum mechanics. One of the main theoretical problems in obtaining estimates for measurable times is the fact that several previously defined time operators, which remained within the bo...


[Phys. Rev. A 98, 032105] Published Mon Sep 10, 2018

Author(s): Marcin Nowakowski, Eliahu Cohen, and Pawel Horodecki

The two-state-vector formalism and the entangled histories formalism are attempts to better understand quantum correlations in time. Both formalisms share some similarities, but they are not identical, having subtle differences in their interpretation and manipulation of quantum temporal structures....


[Phys. Rev. A 98, 032312] Published Mon Sep 10, 2018

Abstract

For a simple set of observables, the Heisenberg uncertainty relations can be expressed in terms of transition probabilities alone, which proves them to be not only necessary but also sufficient for the given observables to admit a quantum model. Furthermore, distinguished characterizations of strictly complex and real quantum models, together with some ancillary results, are presented and discussed.
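The relations at issue are the Heisenberg–Robertson inequalities, whose usual operator form is (shown for orientation; the paper's contribution is to re-express them through transition probabilities $|\langle\phi|\psi\rangle|^2$ alone):

$$\Delta A\,\Delta B \ \ge\ \tfrac{1}{2}\,\big|\langle[A,B]\rangle\big|.$$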

Author(s): Nora Tischler, Farzad Ghafari, Travis J. Baker, Sergei Slussarenko, Raj B. Patel, Morgan M. Weston, Sabine Wollmann, Lynden K. Shalm, Varun B. Verma, Sae Woo Nam, H. Chau Nguyen, Howard M. Wiseman, and Geoff J. Pryde

A new photon source is used to realize one-way Einstein-Podolsky-Rosen steering free from restrictions on the type of allowed measurements and on assumptions about the quantum state.


[Phys. Rev. Lett. 121, 100401] Published Fri Sep 07, 2018

Quantum 2, 92 (2018).

https://doi.org/10.22331/q-2018-09-03-92

Bell-inequality violations establish that two systems share some quantum entanglement. We give a simple test to certify that two systems share an asymptotically large amount of entanglement, $n$ EPR states. The test is efficient: unlike earlier tests that play many games, in sequence or in parallel, our test requires only one or two CHSH games. One system is directed to play a CHSH game on a random specified qubit $i$, and the other is told to play games on qubits $\{i,j\}$, without knowing which index is $i$. The test is robust: a success probability within $\delta$ of optimal guarantees distance $O(n^{5/2} \sqrt{\delta})$ from $n$ EPR states. However, the test does not tolerate constant $\delta$; it breaks down for $\delta = \tilde\Omega (1/\sqrt{n})$. We give an adversarial strategy that succeeds within $\delta$ of the optimum probability using only $\tilde O(\delta^{-2})$ EPR states.
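As a sanity check on the benchmark in this abstract, the following sketch (Python with NumPy, written for this digest and not the authors' code) computes the optimal quantum winning probability of a single CHSH game played on one EPR pair, $\cos^2(\pi/8) \approx 0.854$:

import numpy as np

# Pauli matrices
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def projectors(theta):
    # Projectors of cos(theta) Z + sin(theta) X;
    # outcome bit 0 <-> eigenvalue +1, bit 1 <-> eigenvalue -1.
    w, v = np.linalg.eigh(np.cos(theta) * Z + np.sin(theta) * X)
    return {1: np.outer(v[:, 0], v[:, 0]), 0: np.outer(v[:, 1], v[:, 1])}

# EPR pair |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.array([1., 0., 0., 1.]) / np.sqrt(2)
rho = np.outer(phi, phi)

alice = {0: projectors(0.0), 1: projectors(np.pi / 2)}    # optimal settings
bob   = {0: projectors(np.pi / 4), 1: projectors(-np.pi / 4)}

# Average over uniform questions (x, y); win iff a XOR b = x AND y
win = sum(0.25 * np.trace(rho @ np.kron(alice[x][a], bob[y][b])).real
          for x in (0, 1) for y in (0, 1)
          for a in (0, 1) for b in (0, 1)
          if (a ^ b) == (x & y))

print(win, np.cos(np.pi / 8) ** 2)  # both print ~0.8536

The certification test above asks the players to approach this value in one or two such games on designated qubits.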

Author(s): K. Goswami, C. Giarmatzi, M. Kewming, F. Costa, C. Branciard, J. Romero, and A. G. White

A photonic quantum switch between a pair of operations is constructed such that the causal order of operations cannot be distinguished, even in principle.


[Phys. Rev. Lett. 121, 090503] Published Fri Aug 31, 2018
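The switch realized here acts on a control qubit and a target system as (the standard definition of the quantum switch in the literature on indefinite causal order):

$$|0\rangle_c \otimes |\psi\rangle \ \mapsto\ |0\rangle_c \otimes BA|\psi\rangle, \qquad |1\rangle_c \otimes |\psi\rangle \ \mapsto\ |1\rangle_c \otimes AB|\psi\rangle,$$

so preparing the control in $(|0\rangle + |1\rangle)/\sqrt{2}$ leaves the order in which $A$ and $B$ are applied indefinite.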

Quantum 2, 87 (2018).

https://doi.org/10.22331/q-2018-08-27-87

Ernst Specker considered a particular feature of quantum theory to be especially fundamental, namely that pairwise joint measurability of sharp measurements implies their global joint measurability ($\href{https://vimeo.com/52923835}{vimeo.com/52923835}$). To date, Specker's principle seemed incapable of singling out quantum theory from the space of all general probabilistic theories. In particular, its well-known consequence for experimental statistics, the principle of consistent exclusivity, does not rule out the set of correlations known as almost quantum, which is strictly larger than the set of quantum correlations. Here we show that, contrary to the popular belief, Specker's principle cannot be satisfied in any theory that yields almost quantum correlations.

Author(s): Matteo Carlesso, Andrea Vinante, and Angelo Bassi

Recently, nonthermal excess noise, compatible with the theoretical prediction provided by collapse models, was measured in a millikelvin nanomechanical cantilever experiment [A. Vinante et al., Phys. Rev. Lett. 119, 110401 (2017)]. We propose a feasible implementation of the cantilever experiment a...


[Phys. Rev. A 98, 022122] Published Fri Aug 17, 2018

Quantum 2, 81 (2018).

https://doi.org/10.22331/q-2018-08-13-81

We provide a fine-grained definition of a monogamous measure of entanglement that does not invoke any particular monogamy relation. Our definition is given in terms of an equality, as opposed to an inequality, that we call the "disentangling condition". We relate our definition to the more traditional one by showing that it generates standard monogamy relations. We then show that all quantum Markov states satisfy the disentangling condition for any entanglement monotone. In addition, we demonstrate that entanglement monotones that are given in terms of a convex roof extension are monogamous if they are monogamous on pure states, and show that for any quantum state that satisfies the disentangling condition, its entanglement of formation equals the entanglement of assistance. We characterize all bipartite mixed states with this property, and use it to show that the G-concurrence is monogamous. In the case of two qubits, we show that the equality between entanglement of formation and assistance holds if and only if the state is a rank-2 bipartite state that can be expressed as the marginal of a pure 3-qubit state in the W class.
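The "more traditional" definition mentioned here is an inequality of the form (with $E$ an entanglement measure; the squared concurrence satisfies this in the CKW relation):

$$E(A|BC) \ \ge\ E(A|B) + E(A|C),$$

whereas the disentangling condition replaces the inequality with an equality from which relations of this form can be recovered.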


Author(s): Ulf Leonhardt, Itay Griniasty, Sander Wildeman, Emmanuel Fort, and Mathias Fink

In the Unruh effect an observer with constant acceleration perceives the quantum vacuum as thermal radiation. The Unruh effect has been believed to be a pure quantum phenomenon, but here we show theoretically how the effect arises from the correlation of noise, regardless of whether this noise is qu...


[Phys. Rev. A 98, 022118] Published Mon Aug 13, 2018
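For scale, the Unruh temperature seen by an observer of proper acceleration $a$ is (standard result, quoted for context):

$$T_U = \frac{\hbar a}{2\pi c k_B},$$

and the claim above is that this thermal response can be traced to correlations of noise, whether quantum or classical.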