Latest Papers on Quantum Foundations - Updated Daily by IJQF

Authors: Nathaniel Craig, Isabel Garcia Garcia

Stringent Swampland conjectures aimed at effective theories containing massive abelian vectors have recently been proposed (arXiv:1808.09966), with striking phenomenological implications. In this article, we show how effective theories that parametrically violate the proposed conjectures can be UV-completed into theories that satisfy them. The UV-completion is accessible through both the Stückelberg and Higgs mechanisms, with all dimensionless parameters taking $\mathcal{O}(1)$ values from the UV perspective. These constructions feature an IR limit containing a light vector that is parametrically separated from any other massive states, and from any cut-off scale mandated by quantum gravity consistency requirements. Moreover, the cutoff-to-vector-mass ratio remains parametrically large even in the decoupling limit in which all other massive states (including any scalar excitations) become arbitrarily heavy. We discuss how apparently strong constraints imposed by the proposed conjectures on phenomenologically interesting models, including specific production mechanisms of dark photon dark matter, are thereby circumvented.

Authors: Martin Bojowald, Suddhasattwa Brahma, Umut Buyukcam, Jonathan Guglielmon, Martijn van Kuppeveld

Weak magnetic monopoles with a continuum of charges less than the minimum implied by Dirac's quantization condition may be possible in non-associative quantum mechanics. If a weakly magnetically charged proton in a hydrogen atom perturbs the standard energy spectrum only slightly, magnetic charges could have escaped detection. Testing this hypothesis requires entirely new methods to compute energy spectra in non-associative quantum mechanics. Such methods are presented here and used to place upper bounds on the magnetic charge of elementary particles.

Authors: S. Carlip, Ricardo A. Mosna, J. P. M. Pitelli

Quantum fluctuations of the vacuum stress-energy tensor are highly non-Gaussian, and can have unexpectedly large effects on spacetime geometry. In this paper, we study a two-dimensional dilaton gravity model coupled to a conformal field theory, in which the distribution of vacuum fluctuations is well understood. By analyzing geodesic deviation, we show that a pencil of massive particles propagating on this fuzzy spacetime eventually converges and collapses. The collapse time depends on the velocity of the congruence of particles, but for ultra-relativistic particles the collapse probability as a function of time converges to an exponential distribution, consistent with our earlier analysis of null geodesics [Phys. Rev. Lett. 107, 021303 (2011)]. We thus find further evidence for the influence of vacuum fluctuations on the small scale causal structure of spacetime.

Authors: Christopher J. Fewster, Rainer Verch

The measurement process is considered for quantum field theory on curved spacetimes. Measurements are carried out on one QFT, the "system", using another, the "probe" via a dynamical coupling of "system" and "probe" in a bounded spacetime region. The resulting "coupled theory" determines a scattering map on the uncoupled combination of the "system" and "probe" by reference to natural "in" and "out" spacetime regions. No specific interaction is assumed and all constructions are local and covariant.

Given any initial probe state in the "in" region, the scattering map determines a completely positive map from "probe" observables in the "out" region to "induced system observables", thus providing a measurement scheme for the latter. It is shown that the induced system observables may be localized in the causal hull of the interaction coupling region and are typically less sharp than the probe observable, but more sharp than the actual measurement on the coupled theory. Post-selected states conditioned on measurement outcomes are obtained using Davies-Lewis instruments. Composite measurements involving causally ordered coupling regions are also considered. Provided that the scattering map obeys a causal factorization property, the causally ordered composition of the individual instruments coincides with the composite instrument; in particular, the instruments may be combined in either order if the coupling regions are causally disjoint. This is the central consistency property of the proposed framework.

The general concepts and results are illustrated by an example in which both "system" and "probe" are quantized linear scalar fields, coupled by a quadratic interaction term with compact spacetime support. System observables induced by simple probe observables are calculated exactly, for sufficiently weak coupling, and compared with first order perturbation theory.

Authors: Min Zhuang, Jiahao Huang, Yongguan Ke, Chaohong Lee

Quantum adiabatic evolution, an important fundamental concept in physics, describes the dynamical evolution arbitrarily close to the instantaneous eigenstate of a slowly driven Hamiltonian. In most systems undergoing spontaneous symmetry-breaking transitions, the two lowest eigenstates change from non-degenerate to degenerate. Because the corresponding energy gap vanishes, the conventional gap condition for quantum adiabatic evolution becomes invalid. Here we explore the existence of quantum adiabatic evolution in spontaneous symmetry-breaking transitions and derive a symmetry-dependent adiabatic condition. Because the driven Hamiltonian preserves the symmetry throughout the process, transitions between eigenstates of different symmetry are forbidden. Therefore, even if the gap vanishes, symmetry-protected quantum adiabatic evolution may appear when the driven system varies according to the symmetry-dependent adiabatic condition. This study not only advances our understanding of quantum adiabatic evolution and spontaneous symmetry-breaking transitions, but also suggests applications ranging from quantum state engineering and topological Thouless pumping to quantum computing.

On Formalisms and Interpretations


Quantum 2, 99 (2018).

https://doi.org/10.22331/q-2018-10-15-99

One of the reasons for the heated debates around the interpretations of quantum theory is a simple confusion between the notions of formalism versus interpretation. In this note, we make a clear distinction between them and show that there are actually two inequivalent quantum formalisms, namely the relative-state formalism and the standard formalism with the Born and measurement-update rules. We further propose a different probability rule for the relative-state formalism and discuss how Wigner's-friend-type experiments could show the inequivalence with the standard formalism. The feasibility in principle of such experiments, however, remains an open question.

Chen, Eddy Keming (2018) Realism about the Wave Function. [Preprint]
Leifer, Matthew (2018) Against Fundamentalism. [Preprint]
Abstract
In a quantum universe with a strong arrow of time, we postulate a low-entropy boundary condition (the past hypothesis) to account for the temporal asymmetry. In this paper, I show that the past hypothesis also contains enough information to simplify the quantum ontology and define a natural initial condition. First, I introduce density matrix realism, the thesis that the quantum state of the universe is objective and impure. This stands in sharp contrast to wave function realism, the thesis that the quantum state of the universe is objective and pure. Second, I suggest that the past hypothesis is sufficient to determine a natural density matrix, which is simple and unique. This is achieved by what I call the initial projection hypothesis: the initial density matrix of the universe is the (normalized) projection onto the past hypothesis subspace (in the Hilbert space). Third, because the initial quantum state is unique and simple, we have a strong case for the nomological thesis: the initial quantum state of the universe is on a par with laws of nature. This new package of ideas has several interesting implications, bearing on the harmony between statistical mechanics and quantum mechanics, the theoretical unity of the universe and its subsystems, and the alleged conflict between Humean supervenience and quantum entanglement.
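For concreteness, the initial projection hypothesis admits a compact formalization (a minimal sketch consistent with the abstract's wording; the notation is assumed here):

\[
\hat{\rho}(t_0) \;=\; \frac{I_{\mathrm{PH}}}{\operatorname{tr} I_{\mathrm{PH}}},
\]

where $I_{\mathrm{PH}}$ is the projection onto the past hypothesis subspace of the universal Hilbert space; dividing by its trace makes $\hat{\rho}(t_0)$ a density matrix, and the simplicity and uniqueness of this expression are what drive the nomological thesis.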
Chen, Eddy Keming (2018) The Intrinsic Structure of Quantum Mechanics. [Preprint]

Author(s): Astrid Eichhorn and Aaron Held

The hypothesized asymptotic safe behavior of gravity may be used to retrodict top and bottom quark masses by tracking the effect of quantum gravity fluctuations on matter fields.


[Phys. Rev. Lett. 121, 151302] Published Fri Oct 12, 2018

Publisher Correction: Quantum mechanics: An inconsistent friend

Publisher Correction: Quantum mechanics: An inconsistent friend, Published online: 12 October 2018; doi:10.1038/s41567-018-0338-y


Author(s): Eliahu Cohen and Eli Pollak

Weak values have been shown to be helpful especially when considering them as the outcomes of weak measurements. In this paper we show that, in principle, the real and imaginary parts of the weak value of any operator may be elucidated from expectation values of suitably defined density, flux, and H...


[Phys. Rev. A 98, 042112] Published Tue Oct 09, 2018

Abstract

Fragmentalism was first introduced by Kit Fine in his ‘Tense and Reality’ (Modality and tense: philosophical papers, Oxford University Press, Oxford, pp 261–320, 2005). According to fragmentalism, reality is an inherently perspectival place that exhibits a fragmented structure. The current paper defends the fragmentalist interpretation of the special theory of relativity, which Fine briefly considers in his paper. The fragmentalist interpretation makes room for genuine facts regarding absolute simultaneity, duration and length. One might worry that positing such variant properties is a turn for the worse in terms of theoretical virtues because such properties are not involved in physical explanations and hence theoretically redundant. It will be argued that this is not right: if variant properties are indeed instantiated, they will also be involved in straightforward physical explanations and hence not explanatorily redundant. Hofweber and Lange, in their ‘Fine’s Fragmentalist Interpretation of Special Relativity’ (Noûs 51:871–883, 2017), object that the fragmentalist interpretation is in tension with the right explanation of the Lorentz transformations. It will be argued that their objection targets an inessential aspect of the fragmentalist framework and fails to raise any serious problem for the fragmentalist interpretation of special relativity.

In the spring of 2017, Urmila Mahadev found herself in what most graduate students would consider a pretty sweet position. She had just solved a major problem in quantum computation, the study of computers that derive their power from the strange laws of quantum physics. Combined with her earlier papers, Mahadev’s new result, on what is called blind computation, made it “clear she was a rising star,” said Scott Aaronson, a computer scientist at the University of Texas, Austin.

Mahadev, who was 28 at the time, was already in her seventh year of graduate school at the University of California, Berkeley — long past the stage when most students become impatient to graduate. Now, finally, she had the makings of a “very beautiful Ph.D. dissertation,” said Umesh Vazirani, her doctoral adviser at Berkeley.

But Mahadev did not graduate that year. She didn’t even consider graduating. She wasn’t finished.

For more than five years, she’d had a different research problem in her sights, one that Aaronson called “one of the most basic questions you can ask in quantum computation.” Namely: If you ask a quantum computer to perform a computation for you, how can you know whether it has really followed your instructions, or even done anything quantum at all?

This question may soon be far from academic. Before too many years have elapsed, researchers hope, quantum computers may be able to offer exponential speedups on a host of problems, from modeling the behavior around a black hole to simulating how a large protein folds up.

But once a quantum computer can perform computations a classical computer can’t, how will we know if it has done them correctly? If you distrust an ordinary computer, you can, in theory, scrutinize every step of its computations for yourself. But quantum systems are fundamentally resistant to this kind of checking. For one thing, their inner workings are incredibly complex: Writing down a description of the internal state of a computer with just a few hundred quantum bits (or “qubits”) would require a hard drive larger than the entire visible universe.
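The scale of that claim is easy to check with back-of-envelope arithmetic. The sketch below (Python; the 16-bytes-per-amplitude figure and the $10^{80}$-atom estimate are rough assumptions, not from the article) compares the naive storage cost of a 300-qubit state vector with the number of atoms in the visible universe.

```python
# Back-of-envelope check: an n-qubit pure state is described by 2**n complex
# amplitudes. Assume 16 bytes per amplitude (two 64-bit floats) and a common
# order-of-magnitude estimate of 10**80 atoms in the visible universe.
n = 300
amplitudes = 2 ** n                      # ~2.0e90 complex numbers
bytes_needed = 16 * amplitudes           # naive double-precision storage
atoms_in_universe = 10 ** 80

print(f"amplitudes:     ~{amplitudes:.2e}")
print(f"bytes needed:   ~{bytes_needed:.2e}")
# Even a hard drive storing one byte per atom of the universe falls short
# by more than ten orders of magnitude.
print(f"bytes per atom: ~{bytes_needed / atoms_in_universe:.2e}")
```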

And even if you somehow had enough space to write down this description, there would be no way to get at it. The inner state of a quantum computer is generally a superposition of many different non-quantum, “classical” states (like Schrödinger’s cat, which is simultaneously dead and alive). But as soon as you measure a quantum state, it collapses into just one of these classical states. Peer inside a 300-qubit quantum computer, and essentially all you will see is 300 classical bits — zeros and ones — smiling blandly up at you.

“A quantum computer is very powerful, but it’s also very secretive,” Vazirani said.

Given these constraints, computer scientists have long wondered whether it is possible for a quantum computer to provide any ironclad guarantee that it really has done what it claimed. “Is the interaction between the quantum and the classical worlds strong enough so that a dialogue is possible?” asked Dorit Aharonov, a computer scientist at the Hebrew University of Jerusalem.

During her second year of graduate school, Mahadev became captivated by this problem, for reasons even she doesn’t fully understand. In the years that followed, she tried one approach after another. “I’ve had a lot of moments where I think I’m doing things right, and then they break, either very quickly or after a year,” she said.

But she refused to give up. Mahadev displayed a level of sustained determination that Vazirani has never seen matched. “Urmila is just absolutely extraordinary in this sense,” he said.

Now, after eight years of graduate school, Mahadev has succeeded. She has come up with an interactive protocol by which users with no quantum powers of their own can nevertheless employ cryptography to put a harness on a quantum computer and drive it wherever they want, with the certainty that the quantum computer is following their orders. Mahadev’s approach, Vazirani said, gives the user “leverage that the computer just can’t shake off.”

For a graduate student to achieve such a result as a solo effort is “pretty astounding,” Aaronson said.

Mahadev, who is now a postdoctoral researcher at Berkeley, presented her protocol yesterday at the annual Symposium on Foundations of Computer Science, one of theoretical computer science’s biggest conferences, held this year in Paris. Her work has been awarded the meeting’s “best paper” and “best student paper” prizes, a rare honor for a theoretical computer scientist.

In a blog post, Thomas Vidick, a computer scientist at the California Institute of Technology who has collaborated with Mahadev in the past, called her result “one of the most outstanding ideas to have emerged at the interface of quantum computing and theoretical computer science in recent years.”

Quantum computation researchers are excited not just about what Mahadev’s protocol achieves, but also about the radically new approach she has brought to bear on the problem. Using classical cryptography in the quantum realm is a “truly novel idea,” Vidick wrote. “I expect many more results to continue building on these ideas.”

A Long Road

Raised in Los Angeles in a family of doctors, Mahadev attended the University of Southern California, where she wandered from one area of study to another, at first convinced only that she did not want to become a doctor herself. Then a class taught by the computer scientist Leonard Adleman, one of the creators of the famous RSA encryption algorithm, got her excited about theoretical computer science. She applied to graduate school at Berkeley, explaining in her application that she was interested in all aspects of theoretical computer science — except for quantum computation.

“It sounded like the most foreign thing, the thing I knew least about,” she said.

But once she was at Berkeley, Vazirani’s accessible explanations soon changed her mind. He introduced her to the question of finding a protocol for verifying a quantum computation, and the problem “really fired up her imagination,” Vazirani said.

“Protocols are like puzzles,” Mahadev explained. “To me, they seem easier to get into than other questions, because you can immediately start thinking of protocols yourself and then breaking them, and that lets you see how they work.” She chose the problem for her doctoral research, launching herself on what Vazirani called “a very long road.”

If a quantum computer can solve a problem that a classical computer cannot, that doesn’t automatically mean the solution will be hard to check. Take, for example, the problem of factoring large numbers, a task that a big quantum computer could solve efficiently, but which is thought to be beyond the reach of any classical computer. Even if a classical computer can’t factor a number, it can easily check whether a quantum computer’s factorization is correct — it just needs to multiply the factors together and see if they produce the right answer.
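This check-by-multiplication asymmetry is trivial to state in code. A minimal sketch (Python; the function name is hypothetical):

```python
from math import prod

def verify_factorization(n: int, factors: list[int]) -> bool:
    """Finding the factors of n may be classically infeasible, but checking
    a claimed factorization is a single multiplication."""
    return all(f > 1 for f in factors) and prod(factors) == n

print(verify_factorization(15, [3, 5]))   # True
print(verify_factorization(15, [2, 7]))   # False
```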

Yet computer scientists believe (and have recently taken a step toward proving) that many of the problems a quantum computer could solve do not have this feature. In other words, a classical computer not only cannot solve them, but cannot even recognize whether a proposed solution is correct. In light of this, around 2004, Daniel Gottesman — a physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Ontario — posed the question of whether it is possible to come up with any protocol by which a quantum computer can prove to a non-quantum observer that it really has done what it claimed.

Within four years, quantum computation researchers had achieved a partial answer. It is possible, two different teams showed, for a quantum computer to prove its computations, not to a purely classical verifier, but to a verifier who has access to a very small quantum computer of her own. Researchers later refined this approach to show that all the verifier needs is the capacity to measure a single qubit at a time.

And in 2012, a team of researchers including Vazirani showed that a completely classical verifier could check quantum computations if they were carried out by a pair of quantum computers that can’t communicate with each other. But that paper’s approach was tailored to this specific scenario, and the problem seemed to hit a dead end there, Gottesman said. “I think there were probably people who thought you couldn’t go further.”

It was around this time that Mahadev encountered the verification problem. At first, she tried to come up with an “unconditional” result, one that makes no assumptions about what a quantum computer can or cannot do. But after she had worked on the problem for a while with no progress, Vazirani proposed instead the possibility of using “post-quantum” cryptography — that is, cryptography that researchers believe is beyond the capability of even a quantum computer to break, although they don’t know for sure. (Methods such as the RSA algorithm that are used to encrypt things like online transactions are not post-quantum — a large quantum computer could break them, because their security depends on the hardness of factoring large numbers.)

In 2016, while working on a different problem, Mahadev and Vazirani made an advance that would later prove crucial. In collaboration with Paul Christiano, a computer scientist now at OpenAI, a company in San Francisco, they developed a way to use cryptography to get a quantum computer to build what we’ll call a “secret state” — one whose description is known to the classical verifier, but not to the quantum computer itself.

Their procedure relies on what’s called a “trapdoor” function — one that is easy to carry out, but hard to reverse unless you possess a secret cryptographic key. (The researchers didn’t know how to actually build a suitable trapdoor function yet — that would come later.) The function is also required to be “two-to-one,” meaning that every output corresponds to two different inputs. Think, for example, of the function that squares numbers — apart from the number 0, each output (such as 9) has two corresponding inputs (3 and −3).

Armed with such a function, you can get a quantum computer to create a secret state as follows: First, you ask the computer to build a superposition of all the possible inputs to the function (this might sound complicated for the computer to carry out, but it’s actually easy). Then, you tell the computer to apply the function to this giant superposition, creating a new state that is a superposition of all the possible outputs of the function. The input and output superpositions will be entangled, which means that a measurement on one of them will instantly affect the other.

Next, you ask the computer to measure the output state and tell you the result. This measurement collapses the output state down to just one of the possible outputs, and the input state instantly collapses to match it, since they are entangled — for instance, if you use the squaring function, then if the output is the 9 state, the input will collapse down to a superposition of the 3 and −3 states.

But remember that you’re using a trapdoor function. You have the trapdoor’s secret key, so you can easily figure out the two states that make up the input superposition. But the quantum computer cannot. And it can’t simply measure the input superposition to figure out what it is made of, because that measurement would collapse it further, leaving the computer with one of the two inputs but no way to figure out the other.
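A toy classical simulation can make the secret-state procedure concrete. The sketch below (Python) uses the article's own squaring example as a stand-in for a cryptographic trapdoor function; the tiny input set and all names are illustrative assumptions, and a dictionary of amplitudes stands in for a real quantum register.

```python
import random
from collections import defaultdict

def f(x):
    return x * x          # two-to-one stand-in for a trapdoor function

inputs = [-3, -2, -1, 1, 2, 3]            # omit 0 so f is exactly two-to-one

# Step 1: uniform superposition over all inputs ({basis state: amplitude}).
amp = 1 / len(inputs) ** 0.5
state = {x: amp for x in inputs}

# Step 2: apply f "in superposition"; the input and output registers are now
# entangled, with one branch per (input, output) pair.
joint = {(x, f(x)): a for x, a in state.items()}

# Step 3: measure the output register (Born rule over outputs).
weights = defaultdict(float)
for (x, y), a in joint.items():
    weights[y] += abs(a) ** 2
outcome = random.choices(list(weights), weights=list(weights.values()))[0]

# The input register collapses to the preimage pair {x, -x}: the "secret
# state". A verifier holding the trapdoor knows both branches; the computer
# could measure, but would learn only one branch and destroy the other.
collapsed = sorted(x for (x, y) in joint if y == outcome)
print(f"output measured: {outcome}; input register is now a superposition of {collapsed}")
```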

In 2017, Mahadev figured out how to build the trapdoor functions at the core of the secret-state method by using a type of cryptography called Learning With Errors (LWE). Using these trapdoor functions, she was able to create a quantum version of “blind” computation, by which cloud-computing users can mask their data so the cloud computer can’t read it, even while it is computing on it. And shortly after that, Mahadev, Vazirani and Christiano teamed up with Vidick and Zvika Brakerski (of the Weizmann Institute of Science in Israel) to refine these trapdoor functions still further, using the secret-state method to develop a foolproof way for a quantum computer to generate provably random numbers.

Mahadev could have graduated on the strength of these results, but she was determined to keep working until she had solved the verification problem. “I was never thinking of graduation, because my goal was never graduation,” she said.

Not knowing whether she would be able to solve it was stressful at times. But, she said, “I was spending time learning about things that I was interested in, so it couldn’t really be a waste of time.”

Set in Stone

Mahadev tried various ways of getting from the secret-state method to a verification protocol, but for a while she got nowhere. Then she had a thought: Researchers had already shown that a verifier can check a quantum computer if the verifier is capable of measuring quantum bits. A classical verifier lacks this capability, by definition. But what if the classical verifier could somehow force the quantum computer to perform the measurements itself and report them honestly?

The tricky part, Mahadev realized, would be to get the quantum computer to commit to which state it was going to measure before it knew which kind of measurement the verifier would ask for — otherwise, it would be easy for the computer to fool the verifier. That’s where the secret-state method comes into play: Mahadev’s protocol requires the quantum computer to first create a secret state and then entangle it with the state it is supposed to measure. Only then does the computer find out what kind of measurement to perform.

Since the computer doesn’t know the makeup of the secret state but the verifier does, Mahadev showed that it’s impossible for the quantum computer to cheat significantly without leaving unmistakable traces of its duplicity. Essentially, Vidick wrote, the qubits the computer is to measure have been “set in cryptographic stone.” Because of this, if the measurement results look like a correct proof, the verifier can feel confident that they really are.

“It is such a wonderful idea!” Vidick wrote. “It stuns me every time Urmila explains it.”

Mahadev’s verification protocol — along with the random-number generator and the blind encryption method — depends on the assumption that quantum computers cannot crack LWE. At present, LWE is widely regarded as a leading candidate for post-quantum cryptography, and it may soon be adopted by the National Institute of Standards and Technology as its new cryptographic standard, to replace the ones a quantum computer could break. That doesn’t guarantee that it really is secure against quantum computers, Gottesman cautioned. “But so far it’s solid,” he said. “No one has found evidence that it’s likely to be breakable.”
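For readers unfamiliar with LWE, a minimal sketch of the underlying problem may help (Python; toy parameters, a simplistic error distribution, and no connection to the specific trapdoor construction in Mahadev's paper):

```python
import random

q, n = 97, 8                                   # toy modulus and dimension
secret = [random.randrange(q) for _ in range(n)]

def lwe_sample():
    """One LWE sample (a, b) with b = <a, s> + e (mod q) for small noise e."""
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 0, 1])              # toy error distribution
    b = (sum(ai * si for ai, si in zip(a, secret)) + e) % q
    return a, b

# With e = 0, the secret would fall to simple Gaussian elimination; the small
# noise is what is believed to make recovering it hard -- even for a quantum
# computer -- once n and q are cryptographically sized.
print(lwe_sample())
```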

In any case, the protocol’s reliance on LWE gives Mahadev’s work a win-win flavor, Vidick wrote. The only way that a quantum computer could fool the protocol is if someone in the quantum computing world figured out how to break LWE, which would itself be a remarkable achievement.

Mahadev’s protocol is unlikely to be implemented in a real quantum computer in the immediate future. For the time being, the protocol requires too much computing power to be practical. But that could change in the coming years, as quantum computers get larger and researchers streamline the protocol.

Mahadev’s protocol probably won’t be feasible within, say, the next five years, but “it is not completely off in fantasyland either,” Aaronson said. “It is something you could start thinking about, if all goes well, at the next stage of the evolution of quantum computers.”

And given how quickly the field is now moving, that stage could arrive sooner rather than later. After all, just five years ago, Vidick said, researchers thought that it would be many years before a quantum computer could solve any problem that a classical computer cannot. “Now,” he said, “people think it’s going to happen in a year or two.”

As for Mahadev, solving her favorite problem has left her feeling a bit at sea. She wishes she could understand just what it was about that problem that made it right for her, she said. “I have to find a new question now, so it would be nice to know.”

But theoretical computer scientists see Mahadev’s unification of quantum computation and cryptography not so much as the end of a story, but as the initial exploration of what will hopefully prove a rich vein of ideas.

“My feeling is that there are going to be lots of follow-ups,” Aharonov said. “I’m looking forward to more results from Urmila.”




Author(s): J. Nobakht, M. Carlesso, S. Donadi, M. Paternostro, and A. Bassi

The continuous spontaneous localization (CSL) model strives to describe the quantum-to-classical transition from the viewpoint of collapse models. However, its original formulation suffers from a fundamental inconsistency in that it is explicitly energy nonconserving. Fortunately, a dissipative exte...


[Phys. Rev. A 98, 042109] Published Mon Oct 08, 2018

Abstract

In times of crisis, when current theories are revealed as inadequate to the task and new physics is thought to be required, physics turns to re-evaluate its principles and to seek new ones. This paper explores the various types, and roles, of principles that feature in the problem of quantum gravity as a current crisis in physics. I illustrate the diversity of the principles being appealed to, and show that principles serve in a variety of roles in all stages of the crisis, including in motivating the need for a new theory and in defining what this theory should be like. In particular, I consider: the generalised correspondence principle, UV-completion, background independence, and the holographic principle. I also explore how the current crisis fits with Friedman’s view on the roles of principles in revolutionary theory-change, finding that while many key aspects of this view are not represented in quantum gravity (at the current stage), the view could potentially offer a useful diagnostic, and prescriptive, strategy. This paper is intended to be relatively non-technical, and to bring some of the philosophical issues from the search for quantum gravity to a more general philosophical audience interested in the roles of principles in scientific theory-change.

Authors: A. S. Sanz

Since its inception, Bohmian mechanics has generally been regarded as a hidden-variable theory aimed at providing an objective description of quantum phenomena. To date, this rather narrow conception of Bohm's proposal has caused it more rejection than acceptance. Now, after 65 years of Bohmian mechanics, should such an interpretational aspect still be the prevailing appraisal? Why not favor a more pragmatic view, treating it as a legitimate picture of quantum mechanics, on an equal footing in all respects with any other, more conventional, quantum picture? These questions are used here to introduce a discussion of an alternative way to deal with Bohmian mechanics at present, enhancing its aspect as an efficient and useful picture or formulation with which to tackle, explore, describe and explain quantum phenomena where phase and correlation (entanglement) are key elements. This discussion is presented in two complementary blocks. The first block briefly revisits the historical context that gave rise to the appearance of Bohmian mechanics, and how this approach or analogous ones have been used in different physical contexts. This discussion is used to emphasize a more pragmatic view, to the detriment of the more conventional hidden-variable (ontological) approach that has been a leitmotif within quantum foundations. The second block focuses on some particular formal aspects of Bohmian mechanics supporting the view presented here, with special emphasis on the physical meaning of the local phase field and the associated velocity field encoded within the wave function. As an illustration, a simple model of Young's two-slit experiment is considered. The simplicity of this model makes it easy to understand how the information conveyed by the Bohmian formulation relates to other more conventional concepts in quantum mechanics. This sort of pedagogical application is also aimed at ...

Authors: Raphael Bousso

I share some memories and offer a personal perspective on Jacob Bekenstein's legacy, focussing on black hole entropy and the Bekenstein bound. I summarize a number of fascinating recent developments that grew out of Bekenstein's pioneering contributions, from the Ryu-Takayanagi proposal to the Quantum Null Energy Condition.

Authors: Gia Dvali

An explicit microscopic realization of the phenomenon of holography is provided by a class of simple quantum theories of a bosonic field inhabiting a d-dimensional space and experiencing a momentum dependent attractive interaction. An exact mode counting reveals a family of holographic states. In each, a set of gapless modes emerges with their number equal to the area of a (d-1)-dimensional sphere. These modes store an exponentially large number of patterns within a microscopic energy gap. The resulting micro-state entropy obeys the area-law reminiscent of a black hole entropy. We study the time-evolution of the stored patterns and observe the following phenomenon: Among the degenerate micro-states, the ones with heavier loaded memories survive longer than those that store emptier patterns. Thus, a state gets stabilized by the burden of its own memory. From time to time the information pattern gets off-loaded from one holographic state into another but cannot escape the system. During this process the pattern becomes highly entangled and scrambled. We suggest that this phenomenon is universal in systems with enhanced memory storage capacity, such as black holes or critical neural networks. This universality sheds an interesting light on the puzzle of why, despite the evaporation, a black hole is forced to maintain information internally for a very long time.

Authors: Stephon Alexander, Raúl Carballo-Rubio

A central aspect of the cosmological constant problem is to understand why vacuum energy does not gravitate. In order to account for this observation, while allowing for nontrivial dynamics of the quantum vacuum, we motivate a novel background independent theory of gravity. The theory is an extension of unimodular gravity that is described in geometric terms by means of a conformal (light-cone) structure and differential forms of degree one and two. We show that the subset of the classical field equations describing the dynamics of matter degrees of freedom and the conformal structure of spacetime are equivalent to that of unimodular gravity. The sector with vanishing matter fields and flat conformal structure is governed by the field equations of BF theory and contains topological invariants that are influenced by quantum vacuum fluctuations. Perturbative deviations from this sector lead to classical solutions that necessarily display relatively small values of the cosmological constant with respect to the would-be contribution of quantum vacuum fluctuations. This feature that goes beyond general relativity (and unimodular gravity) offers an interpretation of the smallness of the currently observed cosmological constant.

Authors: Julian Schmidt, Roderich Tumulka

While fundamental physically realistic Hamiltonians should be invariant under time reversal, time asymmetric Hamiltonians can occur as mathematical possibilities or effective Hamiltonians. Here, we study conditions under which non-relativistic Hamiltonians involving particle creation and annihilation, as they come up in quantum field theory (QFT), are time asymmetric. It turns out that the time reversal operator T can be more complicated than just complex conjugation, which raises the question of which criteria determine the correct action of time reversal. We use Bohmian trajectories for this purpose and show that time reversal symmetry can be broken when charges are permitted to be complex numbers, where `charge' means the coupling constant in a QFT that governs the strength with which a fermion emits and absorbs bosons. We pay particular attention to the technique for defining Hamiltonians with particle creation based on interior-boundary conditions, and we find them to generically be time asymmetric. Specifically, we show that time asymmetry for complex charges occurs whenever not all charges have equal or opposite phase. We further show that, in this case, the corresponding ground states can have non-zero probability currents, and we determine the effective potential between fermions of complex charge.

Fankhauser, Johannes (2017) Taming the Delayed Choice Quantum Eraser. [Preprint]
Fankhauser, Johannes (2017) Gravitational redshift, inertia, and the role of charge. [Preprint]

Author(s): Jakub Rembieliński and Jacek Ciborowski

We introduce a variant of quantum and classical electrodynamics formulated on the grounds of a hypothesis of existence of a preferred frame of reference—a formalism complementary to that regarding the structure of the space of photonic states, presented by us recently [Phys. Rev. A 97, 062106 (2018)...


[Phys. Rev. A 98, 042107] Published Thu Oct 04, 2018

Teitel, Trevor (2018) Background Independence: Lessons for Further Decades of Dispute. [Preprint]
Dawid, Richard (2018) Chronical Incompleteness, Final Theory Claims, and the Lack of Free Parameters in String Theory. [Preprint]
den Daas, Lia (2018) Spontaneous Symmetry Breaking and Quantum Measurement. UNSPECIFIED.

Unscrambling the physics of out-of-time-order correlators

Unscrambling the physics of out-of-time-order correlators, Published online: 03 October 2018; doi:10.1038/s41567-018-0295-5

Quantitative tools for measuring the propagation of information through quantum many-body systems, originally developed to study quantum chaos, have recently found many new applications from black holes to disordered spin systems.

Many-body localization and quantum thermalization

Many-body localization and quantum thermalization, Published online: 03 October 2018; doi:10.1038/s41567-018-0305-7

It is the common wisdom that time evolution of a many-body system leads to thermalization and washes away quantum correlations. But one class of systems, referred to as many-body localized, defies this expectation.

New horizons towards thermalization

New horizons towards thermalization, Published online: 03 October 2018; doi:10.1038/s41567-018-0326-2

Ideas from theorists in fields as disparate as quantum gravity, quantum information and many-body localization are finding common ground, as we explore in this month’s Focus issue on quantum thermalization.

Does gravity come from quantum information?

Does gravity come from quantum information?, Published online: 03 October 2018; doi:10.1038/s41567-018-0297-3

Recent developments have seen concepts originally developed in quantum information theory, such as entanglement and quantum error correction, come to play a fundamental role in understanding quantum gravity.

Abstract

In physics, one is often misled into thinking that the mathematical model of a system is part of, or is, that system itself. Think of expressions commonly used in physics like “point” particle, motion “on the line”, “smooth” observables, wave function, and even “going to infinity”, without forgetting perplexing phrases like “classical world” versus “quantum world”.... On the other hand, when a mathematical model becomes really inoperative with regard to correct predictions, one is forced to replace it with a new one. This is precisely what happened with the emergence of quantum physics. Classical models were (progressively) superseded by quantum ones through quantization prescriptions. These procedures often appear as ad hoc recipes. In the present paper, well-defined quantizations, based on integral calculus and Weyl–Heisenberg symmetry, are described in simple terms through one of the most basic examples of mechanics. Starting from (quasi-)probability distribution(s) on the Euclidean plane, viewed as the phase space for the motion of a point particle on the line, i.e., its classical model, we will show how to build corresponding quantum model(s) and associated probability (e.g. Husimi) or quasi-probability (e.g. Wigner) distributions. We highlight the regularizing rôle of such procedures with the familiar example of the motion of a particle with a variable mass and submitted to a step potential.
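As a concrete instance of the phase-space distributions mentioned above, the sketch below (Python with NumPy; the harmonic-oscillator ground state and unit conventions are illustrative assumptions, not the paper's variable-mass example) evaluates a Husimi distribution on a grid:

```python
import numpy as np

# Husimi distribution Q(alpha) = |<alpha|psi>|^2 / pi for the harmonic-
# oscillator ground state |0>, where <alpha|0> = exp(-|alpha|^2 / 2) and
# alpha = (q + i p)/sqrt(2)  (hbar = m = omega = 1 assumed).  The 1/(2 pi)
# absorbs the d^2alpha -> dq dp / 2 Jacobian so the density is normalized
# over the (q, p) plane.
q = np.linspace(-4, 4, 161)
p = np.linspace(-4, 4, 161)
Q_grid, P_grid = np.meshgrid(q, p)

husimi = np.exp(-(Q_grid**2 + P_grid**2) / 2) / (2 * np.pi)

# Unlike the Wigner quasi-probability, the Husimi distribution is a genuine
# (non-negative, normalized) probability density.
print(husimi.min() >= 0)                          # True
print(np.trapz(np.trapz(husimi, p, axis=0), q))   # ~1.0
```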

Swanson, Noel (2018) Can Quantum Thermodynamics Save Time? In: UNSPECIFIED.

Authors: Dmitry V. Zhdanov, Denys I. Bondar, Tamar Seideman

A quantum analog of friction (understood as a completely positive, Markovian, translation-invariant and phenomenological model of dissipation) is known to be at odds with detailed balance in the thermodynamic limit. We show that this is not the case for quantum systems with internal (e.g. spin) states non-adiabatically coupled to translational dynamics. For such systems, a quantum master equation is derived which phenomenologically accounts for the frictional effect of a uniform zero-temperature environment. A simple analytical example is provided. Conjectures regarding the finite-temperature case are also formulated. The results are important for efficient simulations of complex molecular dynamics and for quantum reservoir engineering applications.

Authors: Tejinder P. Singh

A brief non-technical account of the current status of collapse models.

Abstract

Dualism holds (roughly) that some mental events are fundamental and non-physical. I develop a prima facie plausible causal argument for dualism. The argument has several significant implications. First, it constitutes a new way of arguing for dualism. Second, it provides dualists with a parity response to causal arguments for physicalism. Third, it transforms the dialectical role of epiphenomenalism. Fourth, it refutes the view that causal considerations prima facie support physicalism but not dualism. After developing the causal argument for dualism and drawing out these implications, I subject the argument to a battery of objections. Some prompt revisions to the argument. Others reveal limitations in scope. It falls out of the discussion that the causal argument for dualism is best used against physicalism as a keystone in a divide and conquer strategy.

Abstract

The Horizon Quantum Mechanics is an approach that allows one to analyse the gravitational radius of spherically symmetric systems and compute the probability that a given quantum state is a black hole. We first review the (global) formalism and show how it reproduces a gravitationally inspired GUP relation. This result leads to unacceptably large fluctuations in the horizon size of astrophysical black holes if one insists on describing them as (smeared) central singularities. On the other hand, if they are extended systems, like in the corpuscular models, no such issue arises and one can in fact extend the formalism to include asymptotic mass and angular momentum with the harmonic model of rotating corpuscular black holes. The Horizon Quantum Mechanics then shows that, in simple configurations, the appearance of the inner horizon is suppressed and extremal (macroscopic) geometries seem disfavoured.

Abstract

It is shown that the nonlocal anomalous effective actions corresponding to the quantum breaking of the conformal symmetry can lead to observable modifications of Einstein’s equations. The fact that Einstein’s general relativity is in perfect agreement with all observations, including cosmological ones and the recently observed gravitational waves, imposes strong restrictions on the field content of possible extensions of Einstein’s theory: all viable theories should have vanishing conformal anomalies. It is shown that a complete cancellation of conformal anomalies in $D=4$ for both the $C^2$ invariant and the Euler (Gauss–Bonnet) invariant can only be achieved for N-extended supergravity multiplets with $N \ge 5$.

Chen, Eddy Keming (2018) Quantum Mechanics in a Time-Asymmetric Universe: On the Nature of the Initial Quantum State. [Preprint]

Volume 4, Issue 4, pages 235-246

A. I. Arbab [Show Biography]

Arbab Ibrahim studied physics at Khartoum University and high energy physics at the International Centre for Theoretical Physics (ICTP), Italy. He has taught physics at Khartoum University and Qassim University, and he is currently a Professor of Physics. He has been a visiting scholar at the University of Illinois at Urbana-Champaign, Towson University, and Sultan Qaboos University. His work concentrates on the formulation of quantum mechanics and electromagnetism using quaternions. He has publications in a wide range of areas of theoretical physics. He is an active reviewer for many international journals.

By expressing the Schrödinger wavefunction in the form ψ = Re^{iS}, where R and S are real functions, we have shown that the expectation value of S is conserved. The amplitude of the wave (R) is found to satisfy the Schrödinger equation, while the phase (S) is related to energy conservation. Besides the quantum potential that depends on R, we have obtained a phase potential that depends on the derivative of the phase S. The phase force is a dissipative force. The quantum potential may be attributed to the interaction between the two subfields S and R comprising the quantum particle. This results in splitting (creation/annihilation) of these subfields, each having a mass mc² with an internal frequency of 2mc²/h, satisfying the original wave equation and endowing the particle with its quantum nature. The mass of one subfield reflects the interaction with the other subfield. If, in the Bohmian ansatz, R satisfies the Klein-Gordon equation, then S must satisfy the wave equation. Conversely, if R satisfies the wave equation, then S yields the Einstein relativistic energy-momentum equation.
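For context, the standard polar-form (Madelung/Bohm) substitution, which the quantum potential mentioned above echoes, runs as follows (a minimal sketch with S taken dimensionless as in the abstract; the paper's phase potential is its own construction and is not reproduced here). Substituting ψ = R e^{iS} into the Schrödinger equation and separating real and imaginary parts gives

\[
\hbar\,\partial_t S + \frac{\hbar^2(\nabla S)^2}{2m} + V + Q = 0,
\qquad
Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R},
\]
\[
\partial_t R^2 + \frac{\hbar}{m}\,\nabla\cdot\!\left(R^2\,\nabla S\right) = 0,
\]

the first being the quantum Hamilton-Jacobi equation with the R-dependent quantum potential Q, the second the continuity equation tying the phase gradient to the probability flow.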

Full Text Download (210k)

Volume 4, Issue 4, pages 247-267

Sebastian Fortin [Show Biography] and Olimpia Lombardi [Show Biography]

Olimpia Lombardi obtained her degree in Electronic Engineering and in Philosophy at the University of Buenos Aires, and her PhD in Philosophy at the same university. She is Principal Researcher at the National Scientific and Technical Research Council of Argentina. She is a member of the Académie Internationale de Philosophie des Sciences and of the Foundational Questions Institute. She is the director of the Group of Philosophy of Science at the University of Buenos Aires. Areas of interest: foundations of statistical mechanics, the problem of the arrow of time, interpretation of quantum mechanics, the nature of information, and philosophy of chemistry.

Sebastian Fortin has a degree and a PhD in Physics at the University of Buenos Aires and a PhD in Epistemology and History of Science at the National University of Tres de Febrero, Argentina. He is Researcher at the National Scientific and Technical Research Council of Argentina and assistant professor at the Physics Department of the Faculty of Exact and Natural Sciences at the University of Buenos Aires. His field of interest is philosophy of physics, particularly foundations of quantum mechanics.

If decoherence is an irreversible process, its physical meaning might be clarified by comparing quantum and classical irreversibility. In this work we carry out this comparison, from which a unified view of the emergence of irreversibility arises, applicable both to the classical and to the quantum case. According to this unified view, in the two cases the irreversible macro-level arises from the reversible micro-level as a coarse description that can be understood in terms of the concept of projection. This position supplies an understanding of the phenomenon of decoherence different from that implicit in most presentations: the reduced state is not the quantum state of the open system, but a coarse state of the closed composite system; as a consequence, decoherence should be understood not as a phenomenon resulting from the interaction between an open system and its environment, but rather as a coarse evolution that emerges from disregarding certain degrees of freedom of the whole closed system.

Full Text Download (923k)

Petkov, Vesselin On Relativistic Mass. UNSPECIFIED.

Volume 4, Issue 4, pages 223-234

Mohammed Sanduk [Show Biography]

Mohammed Sanduk is an Iraqi-born British physicist. He was educated at the University of Baghdad and the University of Manchester. Before his undergraduate study, he published a book in particle physics entitled “Mesons”. Sanduk has worked in industry and academia; his last post in Iraq was head of the Laser and Optoelectronics Engineering department at Nahrain University in Baghdad. Owing to his interest in the philosophy of science, he was a member of the academic staff of Pontifical Babel College for Philosophy. Sanduk works in the department of chemical and process engineering at the University of Surrey. He is interested in the transport of charged particles, magnetohydrodynamics, and renewable energy technology. In addition, Sanduk is interested in the foundations of quantum mechanics and the philosophy of science and technology.

In the previous article, an approach was developed to form an analogy of the wave function and to derive analogies for the mathematical forms of both the Dirac and Klein-Gordon equations. The analogies obtained were the transformations from the classical real model forms to the forms in complex space. The analogue of the Klein-Gordon equation was derived from the analogous Dirac equation, as in the case of quantum mechanics. In the present work, the forms of the Dirac and Klein-Gordon equations are derived as a direct transformation from the classical model. It is found that the Dirac equation form may be related to a complex velocity equation. Dirac's Hamiltonian and coefficients correspond to each other in these analogies. The Klein-Gordon equation form may be related to the complex acceleration equation. The complex acceleration equation can explain the generation of the flat spacetime. Although this approach is classical, it may point to a possibility of unifying relativistic quantum mechanics and special relativity in a single model and throw light on the undetectable æther.

Full Text Download (576k)

Dardashti, Radin and Dawid, Richard and Gryb, Sean and Thebault, Karim P Y (2018) On the Empirical Consequences of the AdS/CFT Duality. [Preprint]

Authors: Suvrat Raju

A sharp version of the information paradox involves a seeming violation of the monogamy of entanglement during black hole evaporation. We construct an analogous paradox in empty anti-de Sitter space. In a local quantum field theory, Bell correlations between operators localized in mutually spacelike regions are monogamous. We show, through a controlled calculation, that this property can be violated by an order-1 factor in a theory of gravity. This example demonstrates that what appears to be a violation of the monogamy of entanglement may just be a subtle violation of locality in quantum gravity.

Authors: D. F. Ramírez Jiménez, N. G. Kelkar

Methods based on the use of Green's functions or the Jost functions, and the Fock-Krylov method, are apparently very different approaches to understanding the time evolution of unstable states. We show that the former two methods are equivalent up to some constants, and as an outcome we find an analytic expression for the energy density of states in the Fock-Krylov amplitude in terms of the coefficients introduced in the Green's function and Jost function methods. This model-independent density is further used to obtain an analytical expression for the survival amplitude and to study its behaviour at large times. Using these expressions, we investigate the origin of the oscillatory behaviour of the decay law in the region of the transition from the exponential to the non-exponential at large times. With the objective of understanding the failure of nuclear and particle physics experiments to observe the non-exponential decay law predicted by quantum mechanics for large times, we derive analytical formulae for the critical transition time, $t_c$, from the exponential to the inverse-power-law behaviour at large times. Evaluating $\tau_c = \Gamma t_c$ for some particle resonances and narrow nuclear states which have been tested experimentally to verify the exponential decay law, we conclude that the large-time power law in particle and nuclear decay is hard to find experimentally.
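The exponential-to-power-law transition described here is easy to reproduce numerically. A minimal sketch (Python with NumPy; the threshold-truncated Breit-Wigner density and all parameter values are illustrative assumptions, not the paper's model):

```python
import numpy as np

# Toy Fock-Krylov calculation: survival amplitude
#   A(t) = \int_0^inf rho(E) exp(-i E t) dE
# for a Breit-Wigner density truncated at the threshold E = 0.  The
# truncation is what produces the large-time power-law tail on top of the
# exponential decay law (hbar = 1 throughout).
E0, Gamma = 10.0, 1.0                    # resonance position and width
E = np.linspace(0.0, 400.0, 400001)      # energy grid above threshold
rho = (Gamma / (2 * np.pi)) / ((E - E0) ** 2 + Gamma ** 2 / 4)
rho /= np.trapz(rho, E)                  # normalize on the half line

for t in [1.0, 5.0, 20.0, 40.0]:
    A = np.trapz(rho * np.exp(-1j * E * t), E)
    P = abs(A) ** 2
    # For small Gamma*t the survival probability tracks exp(-Gamma*t); at
    # large t the threshold contribution (~ t**-2 for this density) wins.
    print(f"t = {t:5.1f}:  P(t) = {P:.3e},  exp(-Gamma t) = {np.exp(-Gamma*t):.3e}")
```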

Authors: Henry Wilkes

We introduce and explore Rafael Sorkin's evolving co-event scheme: a theoretical framework for determining completely which events do and do not happen in evolving quantum, or indeed classical, systems. The theory is observer-independent and constructed from discrete histories, making the framework a potential setting for discrete quantum cosmology and quantum gravity, as well as ordinary discrete quantum systems. The foundation of this theory is Quantum Measure Theory, which generalises (classical) measure theory to allow for quantum interference between alternative histories; and its co-event interpretation, which describes whether events can or can not occur, and in what combination, given a system and a quantum measure. In contrast to previous co-event schemes, the evolving co-event scheme is applied in stages, in the stochastic sense, without any dependence on later stages, making it manifestly compatible with an evolving block view. It is shown that the co-event realities produced by the basic evolving scheme do not depend on the inclusion or exclusion of zero measure histories in the history space, which follows non-trivially from the basic rules of the scheme. It is also shown that this evolving co-event scheme will reduce to producing classical realities when it is applied to classical systems.

Authors: Thomas Unden, Daniel Louzon, Michael Zwolak, Wojciech Zurek, Fedor Jelezko

The origin of classical reality in our quantum world is a long-standing mystery. Here, we examine a nitrogen vacancy center evolving naturally in the presence of its environment to study quantum Darwinism - the proliferation of information about preferred quantum states throughout the world via the environment. This redundantly imprinted information accounts for the perception of objective reality, as it is independently accessible by many without perturbing the system of interest. To observe the emergence of redundant information, we implement a novel dynamical decoupling scheme that enables the measurement/control of several nuclear spins (the environment E) interacting with a nitrogen vacancy (the system S). In addition to showing how to create entangled SE states relevant to quantum metrology, we demonstrate that under the decoherence of S, redundant information is imprinted onto E, giving rise to classical objectivity - a consensus of the nuclear spins about the state of S. This provides the first laboratory verification of the objective classical world emerging from the underlying quantum substrate.

Authors: Detlef Dürr, Sheldon Goldstein, Stefan Teufel, Roderich Tumulka, Nino Zanghì

Recently, there has been progress in developing interior-boundary conditions (IBCs) as a technique of avoiding the problem of ultraviolet divergence in non-relativistic quantum field theories while treating space as a continuum and electrons as point particles. An IBC can be expressed in the particle-position representation of a Fock vector $\psi$ as a condition on the values of $\psi$ on the set of collision configurations, and the corresponding Hamiltonian is defined on a domain of vectors satisfying this condition. We describe here how Bohmian mechanics can be extended to this type of Hamiltonian. In fact, part of the development of IBCs was inspired by the Bohmian picture. Particle creation and annihilation correspond to jumps in configuration space; the annihilation is deterministic and occurs when two particles (of the appropriate species) meet, whereas the creation is stochastic and occurs at a rate dictated by the demand for the equivariance of the $|\psi|^2$ distribution, time reversal symmetry, and the Markov property. The process is closely related to processes known as Bell-type quantum field theories.

Jaramillo, José Luis and Lam, Vincent (2018) Counterfactuals in the initial value formulation of general relativity. [Preprint]
Gillies, Donald (2018) Indeterministic Causality and Simpson's Paradox. In: UNSPECIFIED.
Morganti, Matteo (2018) From Ontic Structural Realism to Metaphysical Coherentism. [Preprint]

Author(s): Ezad Shojaee, Christopher S. Jackson, Carlos A. Riofrío, Amir Kalev, and Ivan H. Deutsch

The spin-coherent-state positive-operator-valued-measure (POVM) is a fundamental measurement in quantum science, with applications including tomography, metrology, teleportation, benchmarking, and measurement of Husimi phase space probabilities. We prove that this POVM is achieved by collectively me...


[Phys. Rev. Lett. 121, 130404] Published Wed Sep 26, 2018

Authors: T. P. Shestakova

It is generally accepted that the Copenhagen interpretation is inapplicable to quantum cosmology, by contrast with the many-worlds interpretation. I shall demonstrate that the two basic principles of the Copenhagen interpretation, the principle of integrity and the principle of complementarity, do make sense in quantum gravity, since we can judge quantum gravitational processes in the very early Universe by their vestiges in our macroscopic Universe. I shall present the extended phase space approach to quantum gravity and show that it can be interpreted in the spirit of Everett's 'relative states' formulation, while there is no contradiction between the 'relative states' formulation and the mentioned basic principles of the Copenhagen interpretation.

Authors: Mariam Bouhmadi-López, Manuel Kraemer, João Morais, Salvador Robles-Pérez

We study a toy model of a multiverse consisting of canonically quantized universes that interact with each other on the quantum level, based on a field-theoretical formulation of the Wheeler-DeWitt equation. This interaction leads to the appearance of a pre-inflationary phase in the evolution of the individual universes. We analyze scalar perturbations within the model and calculate the influence of the pre-inflationary phase on the power spectrum of these perturbations. The result is a suppression of power on large scales, which describes well the Planck 2018 data for the cosmic microwave background anisotropies and could thus indicate a possible solution to the observed quadrupole discrepancy.

Authors: C.P. Panos, Ch.C. Moustakidis

It is shown that the entropic force formula $F_e=-\lambda\,\partial S/\partial A$ leads to a Newtonian $r^{-2}$ dependence. Here we employ the universal property of the information entropy $S=a+b\ln N$, where $N$ is the number of particles of a quantum system and $A$ is the area containing it. This property was previously obtained for fermionic systems (atoms, atomic clusters, nuclei and infinite Fermi systems) and bosonic ones (correlated boson-atoms in a trap). A similar dependence of the entropic force has been derived very recently by Plastino et al. using a Bose gas entropy, inspired by Verlinde's conjecture that gravity is an emergent entropic force.
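
To see schematically how the $r^{-2}$ law can arise, suppose (our illustrative assumption, in the holographic spirit) that the particle number scales with the bounding area, $N \propto A = 4\pi r^2$. Then

$$F_e \;=\; -\lambda\,\frac{\partial S}{\partial A} \;=\; -\lambda\,\frac{b}{N}\,\frac{\partial N}{\partial A} \;=\; -\frac{\lambda b}{A} \;=\; -\frac{\lambda b}{4\pi r^2},$$

which is the Newtonian $1/r^2$ dependence; the paper's argument itself rests only on the universal logarithmic form of $S$.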

Authors: George Savvidy, Konstantin Savvidy

We demonstrate that the zeros of the Riemann zeta function define the positions and widths of the resonances of the quantised Artin dynamical system. The Artin dynamical system is defined on the fundamental region of the modular group on the Lobachevsky plane, which has finite volume and infinite extension in the vertical direction, corresponding to a cusp. In the classical regime, the geodesic flow in the fundamental region is one of the most chaotic dynamical systems: it has mixing of all orders, a Lebesgue spectrum and non-zero Kolmogorov entropy. In the quantum-mechanical regime, the system can be associated with a narrow, infinitely long waveguide stretched out along the vertical axis, with a cavity resonator attached at the bottom. This suggests a physical interpretation of the Maass automorphic wave function as an incoming plane wave of given energy that enters the resonator, bounces inside it, and scatters back to infinity. As the energy of the incoming wave approaches an eigenmode of the cavity, a pronounced resonance shows up in the scattering amplitude.
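
As a small numerical illustration of this correspondence, here is a sketch under standard assumptions about the modular-surface scattering amplitude, whose poles sit at $2s = \rho$ for each nontrivial zero $\rho = 1/2 + it$ of $\zeta$, with spectral parameter $E = s(1-s)$; the position/width reading of $E$ is our convention, not necessarily the paper's.

```python
from mpmath import zetazero, mpc

# Sketch, not the paper's own computation: locate the first few resonances
# from the nontrivial zeros of the Riemann zeta function, assuming the
# standard Eisenstein-series scattering theory on the modular surface.
for n in range(1, 6):
    t = zetazero(n).imag      # n-th nontrivial zero: rho = 1/2 + i*t
    s = mpc(0.25, t / 2)      # scattering pole at s = rho/2
    E = s * (1 - s)           # spectral parameter E = s(1 - s)
    print(f"zero {n}: t = {float(t):.4f}, "
          f"position Re E = {float(E.real):.4f}, width ~ Im E = {float(E.imag):.4f}")
```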

Authors: Carlo Maria Scandolo, Roberto Salazar, Jarosław K. Korbicz, Paweł Horodecki

We investigate the emergence of classicality and objectivity in arbitrary physical theories. First we provide an explicit example of a theory where there are no objective states. Then we characterize classical states of generic theories, and show how classical physics emerges through a decoherence process, which always exists in causal theories as long as there are classical states. We apply these results to the study of the emergence of objectivity, here recast as a multiplayer game. In particular, we prove that the so-called Spectrum Broadcast Structure characterizes all objective states in every causal theory, in the very same way as it does in quantum mechanics. This shows that the structure of objective states is valid across an extremely broad range of physical theories. Finally we show that, unlike objectivity, the emergence of local observers is not generic among physical theories, but it is only possible if a theory satisfies two axioms that rule out holistic behavior in composite systems.

Authors: Joshua Rosaler, Robert Harlander

In light of the increasingly strong violations of the Higgs naturalness principle revealed at the LHC, we examine the assumptions underlying one influential argument for naturalness in the sense that prohibits fine-tuning of bare Standard Model (SM) parameters. We highlight the dependence of this argument on the interpretation of these bare parameters as "fundamental parameters," by direct physical analogy with the interpretation of microscopic lattice parameters in condensed matter physics. We emphasize that while the notion of fundamental parameters is appropriate to some applications in condensed matter physics, it plays no essential role in the effective field theories (EFT's) of high-energy physics. We distinguish two ways of understanding high-energy EFT's within the Wilsonian framework, the first of which takes an EFT to be uniquely defined by a single set of physical, fundamental bare parameters, and the second of which dispenses entirely with the notion of fundamental parameters. In this latter view, an EFT is instead defined by a one-parameter class of physically equivalent parametrizations related by Wilsonian renormalization group flows. From this perspective, the delicate cancellation between bare Higgs mass and quantum corrections appears as an eliminable artifact of mathematical convention, rather than as a physical coincidence calling out for explanation by a deeper theory. Thus, we aim to clarify the distinction between two physical interpretations of Wilsonian EFT's and bare parameters in high energy physics, and to show in light of this distinction how one formulation of the naturalness requirement, based on the notion that bare parameters at an EFT's physical cutoff constitute "fundamental parameters," may be rooted in an excessively literal reading of the high-energy/condensed-matter analogy.

Authors: Jorge Pullin, Parampreet Singh

We summarize the talks presented at the QG3 session (loop quantum gravity: cosmology and black holes) of the 15th Marcel Grossmann Meeting, held in Rome, Italy, on July 1-7, 2018.

Crowther, Karen (2018) Defining a crisis: The roles of principles in the search for a theory of quantum gravity. [Preprint]
Anselmi, Damiano (2018) The correspondence principle in quantum field theory and quantum gravity. [Preprint]

Authors: Mohamed Hatifi, Ralph Willox, Samuel Colin, Thomas Durt

Recently, the properties of bouncing oil droplets, also known as "walkers", have attracted much attention because they are thought to offer a gateway to a better understanding of quantum behaviour. Indeed, they constitute a macroscopic realization of wave-particle duality, in the sense that their trajectories are guided by a self-generated surrounding wave. The aim of this paper is to develop a phenomenological theory for the behaviour of walkers in terms of de Broglie-Bohm and Nelson dynamics. We study in particular how modifications of the de Broglie pilot-wave theory, à la Nelson, affect the process of relaxation to quantum equilibrium, and we prove an H-theorem for the relaxation to quantum equilibrium under Nelson dynamics. We compare the onset of equilibrium in the Nelson and de Broglie-Bohm approaches, and we propose some simple experiments by which one can test the applicability of our theory to bouncing oil droplets.

Authors: John S. Briggs

An assessment is given of the extent to which pure unitary evolution, as distinct from environmental decohering interaction, can provide the transition necessary for an observer to interpret perceived quantum dynamics as classical. This has implications for the interpretation of quantum wavefunctions as characteristics of ensembles or of single particles, and for the related question of wavefunction collapse.

Crowther, Karen (2018) When do we stop digging? Conditions on a fundamental theory of physics. [Preprint]

Author(s): Ding Jia (贾丁)

There has been a body of work deriving the complex Hilbert-space structure of quantum theory from axioms/principles/postulates to deepen our understanding of quantum theory and to reveal ways to go beyond it to resolve foundational issues. Recent progress in incorporating indefinite causal structure...


[Phys. Rev. A 98, 032112] Published Wed Sep 19, 2018

Publication date: Available online 24 August 2018

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): R. Hermens

Authors: Yuri Bonder, Cristóbal Corral

It is well known that a theory with explicit Lorentz violation is not invariant under diffeomorphisms. On the other hand, for geometrical theories of gravity, there are alternative transformations, best defined within the first-order formalism, that can be regarded as a set of improved diffeomorphisms. These symmetries are known as local translations and, among other features, they are Lorentz covariant off shell. It is thus interesting to study whether theories with explicit Lorentz violation are invariant under local translations. In this work, an example of such a theory, the minimal gravity sector of the Standard Model Extension, is analyzed. Using a robust algorithm, it is shown that local translations are not a symmetry of the theory. It remains to be seen whether local translations are spontaneously broken under spontaneous Lorentz violation, which is regarded as a more natural alternative when spacetime is dynamical.

Authors: Marcel Reginatto, Michael J. W. Hall

We consider the coupling of quantum fields to classical gravity in the formalism of ensembles on configuration space, a model that allows a consistent formulation of interacting classical and quantum systems. Explicit calculations show that there are solutions for which two quantum fields are in an entangled state, even though their interaction occurs solely via a common classical gravitational field, and that such entangled solutions can evolve from initially unentangled ones. These results support the conclusion of a previous paper that an observed generation of entanglement would not provide a definitive test of the nonclassical nature of gravity.

Authors: Alexander M. Dalzell, Aram W. Harrow, Dax Enshan Koh, Rolando L. La Placa

Quantum computational supremacy arguments, which describe a way for a quantum computer to perform a task that cannot also be done by a classical computer, typically require some sort of computational assumption related to the limitations of classical computation. One common assumption is that the polynomial hierarchy (PH) does not collapse, a stronger version of the statement that P $\neq$ NP, which leads to the conclusion that any classical simulation of certain families of quantum circuits requires time scaling worse than any polynomial in the size of the circuits. However, the asymptotic nature of this conclusion prevents us from calculating exactly how many qubits these quantum circuits must have for their classical simulation to be intractable on modern classical supercomputers. We refine these quantum computational supremacy arguments and perform such a calculation by imposing fine-grained versions of the non-collapse assumption. Each version is parameterized by a constant $a$ and asserts that certain specific computational problems with input size $n$ require $2^{an}$ time steps to be solved by a non-deterministic algorithm. Then, we choose a specific value of $a$ for each version that we argue makes the assumption plausible, and based on these conjectures we conclude that Instantaneous Quantum Polynomial-Time (IQP) circuits with 180 qubits, Quantum Approximate Optimization Algorithm (QAOA) circuits with 360 qubits and boson sampling circuits (i.e. linear optical networks) with 90 photons are large enough for the task of producing samples from their output distributions up to constant multiplicative error to be intractable on current technology.
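
To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (our toy numbers, not the paper's detailed accounting): given a conjectured $2^{an}$ lower bound on classical simulation time, the intractability threshold is simply the smallest $n$ for which $2^{an}$ exceeds an assumed operations budget. The budget and the exponents $a$ below are illustrative placeholders, chosen so the arithmetic reproduces the circuit sizes quoted above.

```python
import math

# Toy estimate: smallest instance size n such that a conjectured 2^(a*n)
# simulation-time lower bound exceeds a supercomputer budget of ~2^90 ops.
# Both the budget and the exponents a are illustrative assumptions.
BUDGET_LOG2 = 90.0

def min_size(a: float) -> int:
    """Smallest n with 2^(a*n) > 2^BUDGET_LOG2."""
    return math.ceil(BUDGET_LOG2 / a)

for family, a in [("IQP circuits", 0.5),
                  ("QAOA circuits", 0.25),
                  ("boson sampling", 1.0)]:
    print(f"{family} (assumed a = {a}): n >= {min_size(a)}")
# -> n >= 180, 360, 90 under these assumptions
```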

Authors: David L. Bartley

The significance of the Bohm/de Broglie hidden-particle position in the relativistic regime is addressed, seeking a connection to the (orthodox) single-particle Newton-Wigner position. The effect of non-positive excursions of the ensemble density for extreme cases of positive-energy waves is easily computed using an integral of the equations of motion developed here for free spin-0 particles in 1+1 dimensions, and is interpreted in terms of virtual-like pair creation and annihilation beneath the Compton wavelength. A Bohm-theoretic description of the acausal explosion of a specific Newton-Wigner-localized state is presented in detail. The virtual pairs found here are interpreted as the Bohm-picture counterpart of the spatial extension beyond point particles proposed in the 1960s to explain why space-like hyperplane dependence of the Newton-Wigner wavefunctions may be needed to achieve Lorentz covariance. For spin-1/2 particles, the convective current is speculatively utilized to achieve parity with the spin-0 theory. The spin-0 improper quantum potential is generalized to an improper stress tensor for spin-1/2 particles.

Authors: Koji Yasuda

The measurement apparatus proposed in the paper "Energy-Time Uncertainty Relations in Quantum Measurements" [Found. Phys. 46, 1522-1550 (2016)] is examined, and a simple proof is presented that no such apparatus can exist.

Authors: Sina Salek, Daniel Ebler, Giulio Chiribella

Quantum mechanics allows for situations where the relative order between two processes is entangled with a quantum degree of freedom. Here we show that such entanglement can enhance the ability to transmit quantum information over noisy communication channels. We consider two completely dephasing channels, which in normal conditions are unable to transmit any quantum information. We show that, when the two channels are traversed in an indefinite order, a quantum bit sent through them has a 25% probability to reach the receiver without any error. For partially dephasing channels, a similar advantage takes place deterministically: the amount of quantum information that can travel through two channels in a superposition of orders can be larger than the amount of quantum information that can travel through each channel individually.
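
The abstract does not spell out the construction; as one concrete instance (our assumption: the two channels dephase completely in complementary bases, Z and X), a short numpy check reproduces the quoted 25% figure. With the control qubit of the quantum switch prepared in $|+\rangle$, finding it in $|-\rangle$ leaves the branch operator $[B_j, A_i]/2$ acting on the input:

```python
import numpy as np

# Minimal numerical check (our construction, not necessarily the paper's
# exact setup): two completely dephasing channels, assumed to dephase in
# complementary bases, traversed in a superposition of orders.
ket0 = np.array([[1.0], [0.0]]); ket1 = np.array([[0.0], [1.0]])
ketp = (ket0 + ket1) / np.sqrt(2); ketm = (ket0 - ket1) / np.sqrt(2)

A = [ket0 @ ket0.T, ket1 @ ket1.T]   # channel A: dephasing in the Z basis
B = [ketp @ ketp.T, ketm @ ketm.T]   # channel B: dephasing in the X basis

psi = np.array([[0.6], [0.8j]])      # arbitrary input qubit state

# Control prepared in |+>; the '-' outcome leaves [B_j, A_i]/2 on the input.
p_minus, rho = 0.0, np.zeros((2, 2), dtype=complex)
for a in A:
    for b in B:
        branch = (b @ a - a @ b) @ psi / 2
        p_minus += (branch.conj().T @ branch).real.item()
        rho += branch @ branch.conj().T

Y = np.array([[0, -1j], [1j, 0]])
target = Y @ psi                     # conditional output: Y|psi>
overlap = (target.conj().T @ (rho / p_minus) @ target).real.item()
print(f"P(control = '-') = {p_minus:.3f}")                       # 0.250
print(f"fidelity of conditional state with Y|psi>: {overlap:.3f}")  # 1.000
```

Conditioned on the '-' outcome, which occurs with probability 1/4, the output is $Y|\psi\rangle$, a fixed unitary rotation of the input, so the receiver recovers the message exactly by applying $Y$.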

Abstract

Three recent arguments seek to show that the universal applicability of unitary quantum theory is inconsistent with the assumption that a well-conducted measurement always has a definite physical outcome. In this paper I restate and analyze these arguments. The import of the first two is diminished by their dependence on assumptions about the outcomes of counterfactual measurements. But the third argument establishes its intended conclusion. Even if every well-conducted quantum measurement we ever make will have a definite physical outcome, this argument should make us reconsider the objectivity of that outcome.

Abstract

We review the argument that latent image formation is a measurement in which the state vector collapses, requiring an enhanced noise parameter in objective reduction models. Tentative observation of a residual noise at this level, plus several experimental bounds, imply that the noise must be colored (i.e., non-white), and hence frame dependent and non-relativistic. Thus a relativistic objective reduction model, even if achievable in principle, would be incompatible with experiment; the best one can do is the non-relativistic CSL model. This negative conclusion has a positive aspect, in that the non-relativistic CSL reduction model evades the argument leading to the Conway–Kochen “Free Will Theorem”.

Standard quantum theory explains the behaviour of microscopic things like electrons and atoms. It should also, in principle, apply to larger objects – but it might not
Azhar, Feraz and Loeb, Abraham (2018) Gauging Fine-Tuning. [Preprint]
de Ronde, Christian and Massri, Cesar (2018) A New Objective Definition of Quantum Entanglement as Potential Coding of Intensive and Effective Relations. [Preprint]

Quantum theory cannot consistently describe the use of itself

Quantum theory cannot consistently describe the use of itself, Published online: 18 September 2018; doi:10.1038/s41467-018-05739-8

Quantum mechanics is expected to provide a consistent description of reality, even when recursively describing systems contained in each other. Here, the authors develop a variant of Wigner's friend Gedankenexperiment where each of the current interpretations of QM fails to give a consistent description.

Reimagining of Schrödinger’s cat breaks quantum mechanics — and stumps physicists

Reimagining of Schrödinger’s cat breaks quantum mechanics — and stumps physicists, Published online: 18 September 2018; doi:10.1038/d41586-018-06749-8

In a multi-‘cat’ experiment, the textbook interpretation of quantum theory seems to lead to contradictory pictures of reality, physicists claim.

An inconsistent friend

An inconsistent friend, Published online: 18 September 2018; doi:10.1038/s41567-018-0293-7

Are there limits to the applicability of textbook quantum theory? Experiments haven’t found any yet, but a new theoretical analysis shows that treating your colleagues as quantum systems might be a step too far.
Roberts, Bryan W. (2018) Time Reversal. [Preprint]

Author(s): Tatsuma Nishioka

In this review, the entanglement and Renyi entropies in quantum field theory are described from different points of view, including the perturbative approach and holographic dualities. The applications of these results to constraining renormalization group flows are presented and illustrated with a variety of examples.


[Rev. Mod. Phys. 90, 035007] Published Mon Sep 17, 2018

Calosi, Claudio and Morganti, Matteo (2018) Interpreting Quantum Entanglement: Steps Towards Coherentist Quantum Mechanics. The British Journal for the Philosophy of Science.
French, Steven (2018) Between Factualism and Substantialism: Structuralism as a Third Way. [Preprint]
Abstract
We put forward a new, coherentist account of quantum entanglement, according to which entangled systems are characterized by symmetric relations of ontological dependence among the component particles. We compare this coherentist viewpoint with the two most popular alternatives currently on offer, structuralism and holism, and argue that it is essentially different from, and preferable to, both. In the course of this article, we point out how coherentism might be extended beyond the case of entanglement and further articulated.
Steeger, Jeremy and Teh, Nicholas (2018) Two Forms of Inconsistency in Quantum Foundations. [Preprint]

The leading hypothesis about the universe’s birth — that a quantum speck of space became energized and inflated in a split second, creating a baby cosmos — solves many puzzles and fits all observations to date. Yet this “cosmic inflation” hypothesis lacks definitive proof. Telltale ripples that should have formed in the inflating spatial fabric, known as primordial gravitational waves, haven’t been detected in the geometry of the universe by the world’s most sensitive telescopes. Their absence has fueled underdog theories of cosmogenesis in recent years. And yet cosmic inflation is wriggly. In many variants of the idea, the sought-after ripples would simply be too weak to observe.

“The question is whether one can test the entire scenario, not just specific models,” said Avi Loeb, an astrophysicist and cosmologist at Harvard University. “If there is no guillotine that can kill off some theories, then what’s the point?”

In a new paper that appeared on the physics preprint site, arxiv.org, on Sunday, Loeb and two Harvard colleagues, Xingang Chen and Zhong-Zhi Xianyu, suggested such a guillotine. The researchers predicted an oscillatory pattern in the distribution of matter throughout the cosmos that, if detected, could distinguish between inflation and alternative scenarios — particularly the hypothesis that the Big Bang was actually a bounce preceded by a long period of contraction.

The paper has yet to be peer-reviewed, but Will Kinney, an inflationary cosmologist at the University at Buffalo and a visiting professor at Stockholm University, said “the analysis seems correct to me.” He called the proposal “a very elegant idea.”

“If the signal is real and observable, it would be very interesting,” Sean Carroll of the California Institute of Technology said in an email.

Any potential hints about the Big Bang are worth looking for, but the main question, according to experts, is whether the putative oscillatory pattern will be strong enough to detect. It might not be a clear-cut guillotine as advertised.

If it does exist, the signal would appear in density variations across the universe. Imagine taking a giant ice cream scoop to the sky and counting how many galaxies wind up inside. Do this many times all over the cosmos, and you’ll find that the number of scooped-up galaxies will vary above or below some average. Now increase the size of your scoop. When scooping larger volumes of universe, you might find that the number of captured galaxies now varies more extremely than before. As you use progressively larger scoops, according to Chen, Loeb and Xianyu’s calculations, the amplitude of matter density variations should oscillate between more and less extreme as you move up the scales. “What we showed,” Loeb explained, is that from the form of these oscillations, “you can tell if the universe was expanding or contracting when the density perturbations were produced” — reflecting an inflationary or bounce cosmology, respectively.

Regardless of which theory of cosmogenesis is correct, cosmologists believe that the density variations observed throughout the cosmos today were almost certainly seeded by random ripples in quantum fields that existed long ago.

Because of quantum uncertainty, any quantum field that filled the primordial universe would have fluctuated with ripples of all different wavelengths. Periodically, waves of a certain wavelength would have constructively interfered, forming peaks — or equivalently, concentrations of particles. These concentrations later grew into the matter density variations seen on different scales in the cosmos today.

But what caused the peaks at a particular wavelength to get frozen into the universe when they did? According to the new paper, the timing depended on whether the peaks formed while the universe was exponentially expanding, as in inflation models, or while it was slowly contracting, as in bounce models.

If the universe contracted in the lead-up to a bounce, ripples in the quantum fields would have been squeezed. At some point the observable universe would have contracted to a size smaller than ripples of a certain wavelength, like a violin whose resonant cavity is too small to produce the sounds of a cello. When the too-large ripples disappeared, whatever peaks, or concentrations of particles, existed at that scale at that moment would have been “frozen” into the universe. As the observable universe shrank further, ripples at progressively smaller and smaller scales would have vanished, freezing in as density variations. Ripples of some sizes might have been constructively interfering at the critical moment, producing peak density variations on that scale, whereas slightly shorter ripples that disappeared a moment later might have frozen out of phase. These are the oscillations between high and low density variations that Chen, Loeb and Xianyu argue should theoretically show up as you change the size of your galaxy ice cream scoop.

These oscillations would also arise if instead the universe experienced a period of rapid inflation. In that case, as it grew bigger and bigger, it would have been able to fit quantum ripples with ever larger wavelengths. Density variations would have been imprinted on the universe at each scale at the moment that ripples of that size were able to form.

The authors argue that a qualitative difference between the forms of oscillations in the two scenarios will reveal which one occurred. In both cases, it was as if the quantum field put tick marks on a piece of tape as it rushed past — representing the expanding or contracting universe. If space were expanding exponentially, as in inflation, the tick marks imprinted on the universe by the field would have grown farther and farther apart. If the universe contracted, the tick marks should have become closer and closer together as a function of scale. Thus Chen, Loeb and Xianyu argue that the changing separation between the peaks in density variations as a function of scale should reveal the universe’s evolutionary history. “We can finally see whether the primordial universe was actually expanding or contracting, and whether it did it inflationarily fast or extremely slowly,” Chen said.

Exactly what the oscillatory signal might look like, and how strong it might be, depend on the unknown nature of the quantum fields that might have created it. Discovering such a signal would tell us about those primordial cosmic ingredients. As for whether the putative signal will show up at all in future galaxy surveys, “the good news,” according to Kinney, is that the signal is probably “much, much easier to detect” than other searched-for signals called “non-gaussianities”: triangles and other geometric arrangements of matter in the sky that would also verify and reveal details of inflation. The bad news, though, “is that the strength and the form of the signal depend on a lot of things you don’t know,” Kinney said, such as constants whose values might be zero, and it’s entirely possible that “there will be no detectable signal.”



The double-slit experiment is a classic demonstration that all particles of light and matter are also waves - and now it’s been done with antimatter particles

Author(s): Ricardo Ximenes, Fernando Parisio, and Eduardo O. Dias

The question of how long a particle takes to pass through a potential barrier is still a controversial topic in quantum mechanics. One of the main theoretical problems in obtaining estimates for measurable times is the fact that several previously defined time operators, which remained within the bo...


[Phys. Rev. A 98, 032105] Published Mon Sep 10, 2018

Author(s): Marcin Nowakowski, Eliahu Cohen, and Pawel Horodecki

The two-state-vector formalism and the entangled histories formalism are attempts to better understand quantum correlations in time. Both formalisms share some similarities, but they are not identical, having subtle differences in their interpretation and manipulation of quantum temporal structures....


[Phys. Rev. A 98, 032312] Published Mon Sep 10, 2018

Afriat, Alexander (2018) Is the world made of loops? [Preprint]
Ellerman, David (2018) Logical Entropy: Introduction to Classical and Quantum Logical Information Theory. Entropy, 20 (9). p. 679.

Abstract

For a simple set of observables, we express the Heisenberg uncertainty relations in terms of transition probabilities alone, and prove that they are not only necessary but also sufficient for the given observables to admit a quantum model. Furthermore, distinguished characterizations of strictly complex and real quantum models, together with some ancillary results, are presented and discussed.

Author(s): Nora Tischler, Farzad Ghafari, Travis J. Baker, Sergei Slussarenko, Raj B. Patel, Morgan M. Weston, Sabine Wollmann, Lynden K. Shalm, Varun B. Verma, Sae Woo Nam, H. Chau Nguyen, Howard M. Wiseman, and Geoff J. Pryde

A new photon source is used to realize one-way Einstein-Podolsky-Rosen steering free from restrictions on the type of allowed measurements and on assumptions about the quantum state.


[Phys. Rev. Lett. 121, 100401] Published Fri Sep 07, 2018

Quantum 2, 92 (2018).

https://doi.org/10.22331/q-2018-09-03-92

Bell-inequality violations establish that two systems share some quantum entanglement. We give a simple test to certify that two systems share an asymptotically large amount of entanglement, $n$ EPR states. The test is efficient: unlike earlier tests that play many games, in sequence or in parallel, our test requires only one or two CHSH games. One system is directed to play a CHSH game on a random specified qubit $i$, and the other is told to play games on qubits $\{i,j\}$, without knowing which index is $i$. The test is robust: a success probability within $\delta$ of optimal guarantees distance $O(n^{5/2} \sqrt{\delta})$ from $n$ EPR states. However, the test does not tolerate constant $\delta$; it breaks down for $\delta = \tilde\Omega (1/\sqrt{n})$. We give an adversarial strategy that succeeds within $\delta$ of the optimum probability using only $\tilde O(\delta^{-2})$ EPR states.
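
For reference, the CHSH game invoked here as a subroutine has optimal quantum success probability $\cos^2(\pi/8) \approx 0.854$ on a single EPR pair, as a few lines of numpy confirm (the standard textbook strategy, included only to fix conventions; not the certification test itself):

```python
import numpy as np

# Optimal single-game CHSH strategy on one EPR pair: Alice measures Z or X,
# Bob measures (Z+X)/sqrt(2) or (Z-X)/sqrt(2); players win iff a XOR b = x AND y.
Z = np.diag([1.0, -1.0]); X = np.array([[0.0, 1.0], [1.0, 0.0]])
epr = np.array([1, 0, 0, 1]) / np.sqrt(2)          # (|00> + |11>)/sqrt(2)

A = [Z, X]
B = [(Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)]

chsh = 0.0
for x in range(2):
    for y in range(2):
        corr = epr @ np.kron(A[x], B[y]) @ epr     # correlator E[a*b | x, y]
        chsh += (-1) ** (x * y) * corr

win = 0.5 + chsh / 8                               # success probability
print(f"CHSH value: {chsh:.4f}  (Tsirelson bound 2*sqrt(2) = {2*np.sqrt(2):.4f})")
print(f"win probability: {win:.4f}  (cos^2(pi/8) = {np.cos(np.pi/8)**2:.4f})")
```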

Abstract

This paper considers the extent to which the notion of truthmaking can play a substantive role in defining physicalism. While a truthmaking-based approach to physicalism is prima facie attractive, there is some reason to doubt that truthmaking can do much work when it comes to understanding physicalism, and perhaps austere metaphysical frameworks in general. First, despite promising to dispense with higher-level properties and states, truthmaking appears to make little progress on issues concerning higher-level items and how they are related to how things are physically. Second, it seems that truthmaking-based approaches to physicalism will have a difficult time addressing the status of truthmaking itself without, in effect, appealing to the resources of alternative ways of conceptualizing physicalism.

Author(s): K. Goswami, C. Giarmatzi, M. Kewming, F. Costa, C. Branciard, J. Romero, and A. G. White

A photonic quantum switch between a pair of operations is constructed such that the causal order of operations cannot be distinguished, even in principle.


[Phys. Rev. Lett. 121, 090503] Published Fri Aug 31, 2018

Quantum 2, 87 (2018).

https://doi.org/10.22331/q-2018-08-27-87

Ernst Specker considered a particular feature of quantum theory to be especially fundamental, namely that pairwise joint measurability of sharp measurements implies their global joint measurability ($\href{https://vimeo.com/52923835}{vimeo.com/52923835}$). To date, Specker's principle has seemed incapable of singling out quantum theory from the space of all general probabilistic theories. In particular, its well-known consequence for experimental statistics, the principle of consistent exclusivity, does not rule out the set of correlations known as almost quantum, which is strictly larger than the set of quantum correlations. Here we show that, contrary to popular belief, Specker's principle cannot be satisfied in any theory that yields almost quantum correlations.