Latest Papers on Quantum Foundations - Updated Daily by IJQF

Gao, Shan (2019) Quantum theory is incompatible with relativity: A new proof beyond Bell's theorem and a test of unitary quantum theories. [Preprint]

Authors: Antoine Suarez

Physicians define death as the "irreversible" breakdown of all brain functions, including those of the brain stem. By "irreversible" they mean damage that is beyond the human capacity to restore the patient's healthy state. Along the same lines, I propose to complete the definition of quantum physics in [1] by Principle D (Detection): "Detection outcomes (like death) are ordinarily irreversible and observer-independent". It is then argued that this principle excludes the generalization of quantum superposition to visible objects bearing observer-dependent outcomes. However, this exclusion is not absolute: it rather means that "Schrödinger's cat" and "Wigner's friend" should be considered "miracle" narratives beyond the domain of science.

Authors: Davi Geiger, Zvi M. Kedem

Physical laws for elementary particles can be described by the quantum dynamics equation given a Hamiltonian. The solutions are probability amplitudes in Hilbert space that evolve over time. A probability density function over position and time is given by the squared magnitude of such a probability amplitude. An entropy can be associated with these probability densities, characterizing the position information of a particle. Coherent states are localized wave packets and may describe the spatial distribution of some particle states. We show that, due to a dispersion property of Hamiltonians in quantum physics, the entropy of coherent states increases over time. We investigate a partition of the Hilbert space into four sets based on whether the entropy is (i) increasing but not constant, (ii) decreasing but not constant, (iii) constant, or (iv) oscillating.

We then postulate that the quantum theory of elementary particles is equipped with a law that entropy (weakly) increases in time; thus states in set (ii) are disallowed, and states in set (iv) cannot complete an oscillation period. The conjugate process plays a key role, transforming states that are allowed into states that are not, and vice versa.

Then, according to this law, quantum theory is not time reversible unless the state is in set (iii), e.g., stationary states (eigenstates of the Hamiltonian). This law limits physical scenarios beyond what conservation laws allow, providing a basis for causal reasoning by defining an arrow of time.
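As a concrete toy illustration of the dispersion-driven entropy growth described above (a minimal Python sketch, not code from the paper; the units, constants, and restriction to a free-particle Gaussian packet are simplifying assumptions), the position-space entropy of a spreading Gaussian wave packet has a closed form and increases monotonically:

```python
import numpy as np

# Toy sketch (not from the paper): differential entropy
#   S(t) = -\int |psi(x,t)|^2 ln|psi(x,t)|^2 dx
# of a free-particle Gaussian wave packet, whose width disperses as
#   sigma(t) = sigma0 * sqrt(1 + (hbar*t / (2*m*sigma0**2))**2).
hbar, m, sigma0 = 1.0, 1.0, 1.0   # arbitrary illustrative units

def position_entropy(t):
    sigma_t = sigma0 * np.sqrt(1.0 + (hbar * t / (2.0 * m * sigma0**2))**2)
    # Closed form for a normalized Gaussian density of width sigma_t
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma_t**2)

for t in (0.0, 1.0, 2.0, 4.0, 8.0):
    print(f"t = {t:4.1f}   S(t) = {position_entropy(t):.4f}")  # grows monotonically with t
```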

Authors: Liang-Liang Sun, Xiang Zhou, Sixia Yu

Finding the physical principles behind quantum mechanics is essential for understanding various quantum features, e.g., quantum correlations, in a theory-independent manner. Here we propose such a principle, namely, no disturbance without uncertainty, stating that the disturbance caused by a measurement to a subsequent incompatible measurement is no larger than the uncertainty of the first measurement, given suitable theory-independent measures of disturbance and uncertainty. When applied to local systems in a multipartite scenario, our principle imposes such a strong constraint on non-signaling correlations that quantum correlations can be recovered in many cases: i. it accounts for the Tsirelson bound; ii. it provides the tightest boundary so far for a three-parameter family of noisy super-nonlocal boxes; and iii. it rules out an almost-quantum correlation that all previous principles, as well as the celebrated quantum criterion due to Navascués, Pironio, and Acín, fail to exclude from the set of quantum correlations. Our results pave the way to understanding the nonlocality exhibited in quantum correlations from local principles.
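For context on the Tsirelson bound invoked in (i), the following minimal sketch (standard textbook CHSH material, not taken from the paper; the singlet correlation function and measurement angles are the usual ones) evaluates the CHSH combination at the optimal quantum settings and compares it with the local and PR-box values:

```python
import numpy as np

# Standard CHSH illustration (not from the paper): singlet-state correlations
# E(a, b) = -cos(a - b) for spin measurements along angles a and b.
def E(a, b):
    return -np.cos(a - b)

# Optimal measurement angles for the singlet state
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, -np.pi / 4

S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(f"|S| quantum     = {abs(S):.4f}")   # 2*sqrt(2) ~ 2.8284, the Tsirelson bound
print("|S| local bound = 2")               # classical (local hidden variable) limit
print("|S| PR box      = 4")               # maximal non-signaling value
```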

The universe is kind of an impossible object. It has an inside but no outside; it’s a one-sided coin. This Möbius architecture presents a unique challenge for cosmologists, who find themselves in the awkward position of being stuck inside the very system they’re trying to comprehend.

It’s a situation that Lee Smolin has been thinking about for most of his career. A physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada, Smolin works at the knotty intersection of quantum mechanics, relativity and cosmology. Don’t let his soft voice and quiet demeanor fool you — he’s known as a rebellious thinker and has always followed his own path. In the 1960s Smolin dropped out of high school, played in a rock band called Ideoplastos, and published an underground newspaper. Wanting to build geodesic domes like R. Buckminster Fuller, Smolin taught himself advanced mathematics — the same kind of math, it turned out, that you need to play with Einstein’s equations of general relativity. The moment he realized this was the moment he became a physicist. He studied at Harvard University and took a position at the Institute for Advanced Study in Princeton, New Jersey, eventually becoming a founding faculty member at the Perimeter Institute.

“Perimeter,” in fact, is the perfect word to describe Smolin’s place near the boundary of mainstream physics. When most physicists dived headfirst into string theory, Smolin played a key role in working out the competing theory of loop quantum gravity. When most physicists said that the laws of physics are immutable, he said they evolve according to a kind of cosmic Darwinism. When most physicists said that time is an illusion, Smolin insisted that it’s real.

Smolin often finds himself inspired by conversations with biologists, economists, sculptors, playwrights, musicians and political theorists. But he finds his biggest inspiration, perhaps, in philosophy — particularly in the work of the German philosopher Gottfried Leibniz, active in the 17th and 18th centuries, who along with Isaac Newton invented calculus. Leibniz argued (against Newton) that there’s no fixed backdrop to the universe, no “stuff” of space; space is just a handy way of describing relationships. This relational framework captured Smolin’s imagination, as did Leibniz’s enigmatic text The Monadology, in which Leibniz suggests that the world’s fundamental ingredient is the “monad,” a kind of atom of reality, with each monad representing a unique view of the whole universe. It’s a concept that informs Smolin’s latest work as he attempts to build reality out of viewpoints, each one a partial perspective on a dynamically evolving universe. A universe as seen from the inside.

Quanta Magazine spoke with Smolin about his approach to cosmology and quantum mechanics, which he details in his recent book, Einstein’s Unfinished Revolution. The interview has been condensed and edited for clarity.

You have a slogan: “The first principle of cosmology must be: There is nothing outside the universe.”

In different formulations of the laws of physics, like Newtonian mechanics or quantum mechanics, there is background structure — structure which has to be specified and is fixed. It’s not subject to evolution, it’s not influenced by anything that happens. It’s structure outside the system being modeled. It’s the framework on which we hang observables — the observer, a clock and so forth. The statement that there’s nothing outside the universe — there’s no observer outside the universe — implies that we need a formulation of physics without background structure. All the theories of physics we have, in one way or another, apply only to subsystems of the universe. They don’t apply to the universe as a whole, because they require this background structure.

If we want to make a cosmological theory, to understand nature on the cosmological scale, we have to avoid what the philosopher Roberto Unger and I called “the cosmological fallacy,” the mistaken belief that we can take theories that apply to subsystems and scale them up to the universe as a whole. We need a formulation of dynamics that doesn’t refer to an observer or measuring instrument or anything outside the system. That means we need a different kind of theory.

You’ve recently proposed such a theory — one in which, as you put it, “the history of the universe is constituted of different views of itself.” What does that mean?

It’s a theory about processes, about the sequences and causal relations among things that happen, not the inherent properties of things that are. The fundamental ingredient is what we call an “event.” Events are things that happen at a single place and time; at each event there’s some momentum, energy, charge or some other measurable physical quantity. The event has relations with the rest of the universe, and that set of relations constitutes its “view” of the universe. Rather than describing an isolated system in terms of things that are measured from the outside, we’re taking the universe as constituted of relations among events. The idea is to try to reformulate physics in terms of these views from the inside, what it looks like from inside the universe.

How do you do that?

There are many views, and each one has only partial information about the rest of the universe. We propose as a principle of dynamics that each view should be unique. That idea comes from Leibniz’s principle of the identity of indiscernibles. Two events whose views are exactly mappable onto each other are the same event, by definition. So each view is unique, and you can measure how distinct one is from another by defining a quantity called the “variety.” If you think of a node on a graph, you can go one step out, two steps out, three steps out. Each step gives you a neighborhood — the one-step neighborhood, the two-step neighborhood, the three-step neighborhood. So for any two events you can ask: How many steps do you have to go out until their views diverge? In what neighborhood are they different? The fewer steps you have to go, the more distinguishable the views are from one another. The idea in this theory is that the laws of physics — the dynamics of the system — work to maximize variety. That principle — that nature wants to maximize variety — actually leads, within the framework I’ve been describing, to the Schrödinger equation, and hence to a recovery, in an appropriate limit, of quantum mechanics.
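A toy illustration of the neighborhood comparison described above (my sketch, using networkx; the helper divergence_radius and the use of plain graph isomorphism as a stand-in for "views exactly mappable onto each other" are simplifying assumptions, not Smolin's actual definition of variety):

```python
import networkx as nx

# Toy sketch of the "how many steps until the views diverge" idea:
# compare the k-step neighborhoods (ego graphs) of two nodes and report
# the first radius at which they are no longer isomorphic.
def divergence_radius(G, u, v, max_radius=10):
    for k in range(1, max_radius + 1):
        view_u = nx.ego_graph(G, u, radius=k)
        view_v = nx.ego_graph(G, v, radius=k)
        if not nx.is_isomorphic(view_u, view_v):
            return k          # the smaller k is, the more distinguishable the views
    return None               # indistinguishable up to max_radius

# Example: a path graph with nodes 0-5 in a line
G = nx.path_graph(6)
print(divergence_radius(G, 0, 2))   # -> 1: an end node and an interior node differ immediately
print(divergence_radius(G, 1, 2))   # -> 2: their one-step neighborhoods still look alike
```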

I know from your book that you’re a realist at heart — you believe strongly in a reality independent of our knowledge of it — and therefore, like Einstein, you think quantum mechanics is incomplete. Does this theory of views help complete what you think is missing in quantum theory?

Einstein — as well as someone called Leslie Ballentine — advocated an “ensemble interpretation” of the wave function. The idea was that the wave function describes an ensemble of possible states. But one day, I was sitting in a cafe working and suddenly I thought: What if the ensemble is real? What if, when you have a wave function describing a single water molecule, it’s actually describing the ensemble of every water molecule in the universe?

So whereas normally we would think that there’s one water molecule but an uncertainty of states, you’re saying that the uncertainty of states is actually the ensemble of all the water molecules in the universe?

Yes. They form an ensemble because they have very similar views. They all interact with one another, because the probability of interaction is determined by the similarity of views, not necessarily their proximity in space.

Things don’t have to be near each other to interact?

In this theory, the similarity of views is more fundamental than space. Often, two events have similar views because they’re close in space. If two people stand next to each other, they have very similar, overlapping views of the universe. But two atoms have many fewer relational properties than big, complex objects like people. So two atoms far apart in space can still have very similar views. That means that at the smallest scale, there should be highly nonlocal interactions, which is exactly what you get with entanglement in quantum mechanics. That’s where quantum mechanics comes from, according to the real-ensemble formulation.

It reminds me of a lot of work that’s going on now in physics that’s finding surprising connections between entanglement and the geometry of space-time.

I think a lot of that work is really interesting. The hypothesis that’s motivating it is that entanglement is fundamental in quantum mechanics, and the geometry of space or space-time emerges from structures of entanglement. It’s a very positive development.

You’ve said that these ideas were inspired by Leibniz’s Monadology. Did you just happen to pull out your Monadology and reread it?

I first read Leibniz at the instigation of Julian Barbour, when I was just out of graduate school. First I read the correspondence between Leibniz and Samuel Clarke, who was a follower of Newton, in which Leibniz criticized Newton’s notion of absolute space and absolute time and argued that observables in physics should be relational. They should describe the relations of one system with another, resulting from their interaction. Later I read the Monadology. I read it as a sketch for how to make a background-independent theory of physics. I do look at my copy from time to time. There is a beautiful quote in there, where Leibniz says, “Just as the same city viewed from different directions appears entirely different … there are, as it were, just as many different universes, which are, nevertheless, only perspectives on a single one, corresponding to the different points of view of each monad.” That, to me, evokes why these ideas are very suitable, not just in physics but for a whole range of things from social policy and postmodernism to art to what it feels like to be an individual in a diverse society. But that’s another discussion!

Your work has been very influenced by philosophy. Looking back historically, people like Einstein and Bohr and John Wheeler all took philosophy very seriously; it directly influenced their physics. It seems to be a trait of great physicists and yet —

And also of not-great physicists.

OK, fair! It just seems that it’s become almost taboo to talk about philosophy in physics today. Has that been your experience?

Not at all. Many of the leading theorists in foundational physics — where the goal is to deepen our knowledge of the fundamental laws — know philosophy very well. As an undergraduate at Hampshire College, I did a lot of physics and some philosophy courses. Then when I went to Harvard for graduate school, I intended to do a double Ph.D. in physics and philosophy, but I got disenchanted with philosophy pretty quickly. I mean, the physicists were arrogant enough. But the philosophers even more so.

Back when we had the revolutions in physics in the first part of the 20th century in Europe, people like Einstein, Bohr, Heisenberg, Schrödinger and others were very well educated in philosophy, and it informed their work as physicists. Then there was this pragmatic turn, where the dominant mode of physics became anti-foundational, anti-philosophy.

The historian of physics David Kaiser at MIT has studied this in detail. He studied quantum mechanics textbooks and lecture notes and saw how, through the 1940s into the 1950s, references to philosophy and to foundational issues disappeared from quantum mechanics courses. Freeman Dyson once said, normally the young people are rebels and the old people are the conservatives, but in his generation it was the reverse. The young people didn’t want to hear about messy philosophy or foundational issues, they just wanted to get out and apply quantum mechanics.

This was great for the explosion of applications of quantum mechanics from the 1940s into the 1970s, through the establishment of the Standard Model, condensed matter physics and so forth. But then fundamental physics got stuck, and part of the reason we got stuck is we reached a set of problems on which you can’t make progress with this pragmatic, anti-foundational culture. I should make clear that those fields where you can assume we know the relevant laws, like condensed matter and astrophysics, continue to thrive. But if your goal is to discover new, deeper laws, you need to mix with philosophers again. And it has been happening much more.

When I started mixing with philosophers, there were a few who really knew physics well, but most didn’t. Today, the young people working in philosophy of physics, for the most part, know physics well. The interchange with philosophy is coming back, and I think it’s a good thing.



De Haro, Sebastian (2019) Theoretical Equivalence and Duality. [Preprint]

Authors: Yusef Maleki, Alireza Maleki

Quantum mechanics imposes a fundamental bound on the minimum time required for a quantum system to evolve between two states of interest. This bound places a limit on the speed of the dynamical evolution of the system, known as the quantum speed limit. We show that black holes can drastically affect the speed limit of a two-level fermionic quantum system undergoing open quantum dynamics. As we demonstrate, the quantum speed limit can be enhanced in the vicinity of a black hole's event horizon in the Schwarzschild spacetime.
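For reference, the standard closed-system quantum speed limits for evolution to an orthogonal state (textbook bounds; the paper itself treats open dynamics near a Schwarzschild horizon) are:

$\tau \ge \tau_{\mathrm{QSL}} = \max\{\pi\hbar/(2\Delta E),\ \pi\hbar/(2\langle E\rangle)\}$,

the Mandelstam-Tamm and Margolus-Levitin bounds, where $\Delta E$ is the energy uncertainty and $\langle E\rangle$ the mean energy above the ground state.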

Authors: Antony Valentini

We compare and contrast two distinct approaches to understanding the Born rule in de Broglie-Bohm pilot-wave theory, one based on dynamical relaxation over time (advocated by this author and collaborators) and the other based on typicality of initial conditions (advocated by the 'Bohmian mechanics' school). It is argued that the latter approach is inherently circular and physically misguided. The typicality approach has engendered a deep-seated confusion between contingent and law-like features, leading to misleading claims not only about the Born rule but also about the nature of the wave function. By artificially restricting the theory to equilibrium, the typicality approach has led to further misunderstandings concerning the status of the uncertainty principle, the role of quantum measurement theory, and the kinematics of the theory (including the status of Galilean and Lorentz invariance). The restriction to equilibrium has also made an erroneously-constructed stochastic model of particle creation appear more plausible than it actually is. To avoid needless controversy, we advocate a modest 'empirical approach' to the foundations of statistical mechanics. We argue that the existence or otherwise of quantum nonequilibrium in our world is an empirical question to be settled by experiment.
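For readers new to the terminology, the dividing line at issue is the Born-rule distribution (a standard gloss added here, not part of the abstract): quantum equilibrium means the configuration density satisfies $\rho(q,t) = |\psi(q,t)|^2$, while quantum nonequilibrium means $\rho(q,t) \neq |\psi(q,t)|^2$. The relaxation approach argues that $\rho$ evolves toward $|\psi|^2$ dynamically, whereas the typicality approach takes equilibrium as given for typical initial conditions.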

Authors: Matteo Carlesso, Mauro Paternostro

The gap between the predictions of collapse models and those of standard quantum mechanics widens with the complexity of the systems involved. Addressing how such a gap scales with the mass or size of the system being investigated paves the way to testing the validity of the collapse theory and to identifying the values of the parameters that characterize it. Here, we review the recently proposed non-interferometric approach to testing collapse models, focusing on the opto-mechanical platform.

Authors: Jonathan Barrett, Robin Lorenz, Ognyan Oreshkov

It is known that the classical framework of causal models is not general enough to allow for causal reasoning about quantum systems. Efforts have been devoted towards generalization of the classical framework to the quantum case, with the aim of providing a framework in which cause-effect relations between quantum systems, and their connection with empirically observed data, can be rigorously analyzed. Building on the results of Allen et al., Phys. Rev. X 7, 031021 (2017), we present a fully-fledged framework of quantum causal models. The approach situates causal relations in unitary transformations, in analogy with an approach to classical causal models that assumes underlying determinism and situates causal relations in functional dependences between variables. We show that for any quantum causal model, there exists a corresponding unitary circuit, with appropriate causal structure, such that the quantum causal model is returned when marginalising over latent systems, and vice versa. We introduce an intrinsically quantum notion that plays a role analogous to the conditional independence of classical variables, and (generalizing a central theorem of the classical framework) show that d-separation is sound and complete in the quantum case. We present generalizations of the three rules of the classical `do-calculus', in each case relating a property of the causal structure to a formal property of the quantum process, and to an operational statement concerning the outcomes of interventions. In addition to the results concerning quantum causal models, we introduce and derive similar results for `classical split-node causal models', which are more closely analogous to quantum causal models than the classical causal models that are usually studied.

Abstract
A physically consistent semi-classical treatment of black holes requires universality arguments to deal with the `trans-Planckian' problem where quantum spacetime effects appear to be amplified such that they undermine the entire semi-classical modelling framework. We evaluate three families of such arguments in comparison with Wilsonian renormalization group universality arguments found in the context of condensed matter physics. Our analysis is framed by the crucial distinction between robustness and universality. Particular emphasis is placed on the quality whereby the various arguments are underpinned by `integrated' notions of robustness and universality. Whereas the principal strength of Wilsonian universality arguments can be understood in terms of the presence of such integration, the principal weakness of all three universality arguments for Hawking radiation is its absence.

The flashier fruits of Albert Einstein’s century-old insights are by now deeply embedded in the popular imagination: Black holes, time warps and wormholes show up regularly as plot points in movies, books, TV shows. At the same time, they fuel cutting-edge research, helping physicists pose questions about the nature of space, time, even information itself.

Perhaps ironically, though, what is arguably the most revolutionary part of Einstein’s legacy rarely gets attention. It has none of the splash of gravitational waves, the pull of black holes or even the charm of quarks. But lurking just behind the curtain of all these exotic phenomena is a deceptively simple idea that pulls the levers, shows how the pieces fit together, and lights the path ahead.

The idea is this: Some changes don’t change anything. The most fundamental aspects of nature stay the same even as they seemingly shape-shift in unexpected ways. Einstein’s 1905 papers on relativity led to the unmistakable conclusion, for example, that the relationship between energy and mass is invariant, even though energy and mass themselves can take vastly different forms. Solar energy arrives on Earth and becomes mass in the form of green leaves, creating food we can eat and use as fuel for thought. (“What is this mind of ours: what are these atoms with consciousness?” asked the late Richard Feynman. “Last week’s potatoes!”) That’s the meaning of E = mc2. The “c” stands for the speed of light, a very large number, so it doesn’t take much matter to produce an enormous amount of energy; in fact, the sun turns millions of tons of mass into energy each second.
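The "millions of tons" figure follows from a one-line estimate (standard solar numbers, added here for concreteness, not given in the article): $\dot m = L_\odot / c^2 \approx (3.8\times 10^{26}\,\mathrm{W}) / (9\times 10^{16}\,\mathrm{m^2/s^2}) \approx 4\times 10^{9}\,\mathrm{kg/s}$, roughly four million metric tons of mass converted to energy every second.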

This endless morphing of matter into energy (and vice versa) powers the cosmos, matter, life. Yet through it all, the energy-matter content of the universe never changes. It’s strange but true: Matter and energy themselves are less fundamental than the underlying relationships between them.

We tend to think of things, not relationships, as the heart of reality. But most often, the opposite is true. “It’s not the stuff,” said the Brown University physicist Stephon Alexander.

The same is true, Einstein showed, for “stuff” like space and time, seemingly stable, unchangeable aspects of nature; in truth, it’s the relationship between space and time that always stays the same, even as space contracts and time dilates. Like energy and matter, space and time are mutable manifestations of deeper, unshakable foundations: the things that never vary no matter what.

“Einstein’s deep view was that space and time are basically built up by relationships between things happening,” said the physicist Robbert Dijkgraaf, director of the Institute for Advanced Study in Princeton, New Jersey, where Einstein spent his final decades.

The relationship that eventually mattered most to Einstein’s legacy was symmetry. Scientists often describe symmetries as changes that don’t really change anything, differences that don’t make a difference, variations that leave deep relationships invariant. Examples are easy to find in everyday life. You can rotate a snowflake by 60 degrees and it will look the same. You can switch places on a teeter-totter and not upset the balance. More complicated symmetries have led physicists to the discovery of everything from neutrinos to quarks — they even led to Einstein’s own discovery that gravitation is the curvature of space-time, which, we now know, can curl in on itself, pinching off into black holes.

Over the past several decades, some physicists have begun to question whether focusing on symmetry is still as productive as it used to be. New particles predicted by theories based on symmetries haven’t appeared in experiments as hoped, and the Higgs boson that was detected was far too light to fit into any known symmetrical scheme. Symmetry hasn’t yet helped to explain why gravity is so weak, why the vacuum energy is so small, or why dark matter remains transparent.

“There has been, in particle physics, this prejudice that symmetry is at the root of our description of nature,” said the physicist Justin Khoury of the University of Pennsylvania. “That idea has been extremely powerful. But who knows? Maybe we really have to give up on these beautiful and cherished principles that have worked so well. So it’s a very interesting time right now.”

Light

Einstein wasn’t thinking about invariance or symmetry when he wrote his first relativity papers in 1905, but historians speculate that his isolation from the physics community during his employment in the Swiss patent office might have helped him see past the unnecessary trappings people took for granted.

Like other physicists of his time, Einstein was pondering several seemingly unrelated puzzles. James Clerk Maxwell’s equations revealing the intimate connection between electric and magnetic fields looked very different in different frames of reference — whether an observer is moving or at rest. Moreover, the speed at which electromagnetic fields propagated through space almost precisely matched the speed of light repeatedly measured by experiments — a speed that didn’t change no matter what. An observer could be running toward the light or rushing away from it, and the speed didn’t vary.

Einstein connected the dots: The speed of light was a measurable manifestation of the symmetrical relationship between electric and magnetic fields — a more fundamental concept than space itself. Light didn’t need anything to travel through because it was itself electromagnetic fields in motion. The concept of “at rest” — the static “empty space” invented by Isaac Newton — was unnecessary and nonsensical. There was no universal “here” or “now”: Events could appear simultaneous to one observer but not another, and both perspectives would be correct.

Chasing after a light beam produced another curious effect, the subject of Einstein’s second relativity paper, “Does the Inertia of a Body Depend Upon Its Energy Content?” The answer was yes. The faster you chase, the harder it is to go faster. Resistance to change becomes infinite at the speed of light. Since that resistance is inertia, and inertia is a measure of mass, the energy of motion is transformed into mass. “There is no essential distinction between mass and energy,” Einstein wrote.

It took several years for Einstein to accept that space and time are inextricably interwoven threads of a single space-time fabric, impossible to disentangle. “He still wasn’t thinking in a fully unified space-time sort of way,” said David Kaiser, a physicist and historian of science at the Massachusetts Institute of Technology.

Unified space-time is a difficult concept to wrap our minds around. But it begins to make sense if we think about the true meaning of “speed.” The speed of light, like any speed, is a relationship — distance traveled over time. But the speed of light is special because it can’t change; your laser beam won’t advance any faster just because it is shot from a speeding satellite. Measurements of distance and time must therefore change instead, depending on one’s state of motion, leading to effects known as “space contraction” and “time dilation.” The invariant is this: No matter how fast two people are traveling with respect to each other, they always measure the same “space-time interval.” Sitting at your desk, you hurtle through time, hardly at all through space. A cosmic ray flies over vast distances at nearly the speed of light but traverses almost no time, remaining ever young. The relationships are invariant no matter how you switch things around.
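Concretely, the invariant referred to here is the space-time interval of special relativity (the standard formula, added for reference): $\Delta s^2 = c^2\,\Delta t^2 - \Delta x^2 - \Delta y^2 - \Delta z^2$. Two observers in relative motion disagree about $\Delta t$ and about the spatial separation, but they always agree on $\Delta s^2$.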

Gravity

Einstein’s special theory of relativity, which came first, is “special” because it applies only to steady, unchanging motion through space-time — not accelerating motion like the movement of an object falling toward Earth. It bothered Einstein that his theory didn’t include gravity, and his struggle to incorporate it made symmetry central to his thinking. “By the time he gets full-on into general relativity, he’s much more invested in this notion of invariants and space-time intervals that should be the same for all observers,” Kaiser said.

Specifically, Einstein was puzzled by a difference that didn’t make a difference, a symmetry that didn’t make sense. It’s still astonishing to drop a wad of crumpled paper and a set of heavy keys side by side to see that somehow, almost magically, they hit the ground simultaneously — as Galileo demonstrated (at least apocryphally) by dropping light and heavy balls off the tower in Pisa. If the force of gravity depends on mass, then the more massive an object is, the faster it should sensibly fall. Inexplicably, it does not.

The key insight came to Einstein in one of his famous thought experiments. He imagined a man falling off a building. The man would be floating as happily as an astronaut in space, until the ground got in his way. When Einstein realized that a person falling freely would feel weightless, he described the discovery as the happiest thought of his life. It took a while for him to pin down the mathematical details of general relativity, but the enigma of gravity was solved once he showed that gravity is the curvature of space-time itself, created by massive objects like the Earth. Nearby “falling” objects like Einstein’s imaginary man or Galileo’s balls simply follow the space-time path carved out for them.

When general relativity was first published, 10 years after the special version, a problem arose: It appeared that energy might not be conserved in strongly curved space-time. It was well-known that certain quantities in nature are always conserved: the amount of energy (including energy in the form of mass), the amount of electric charge, the amount of momentum. In a remarkable feat of mathematical alchemy, the German mathematician Emmy Noether proved that each of these conserved quantities is associated with a particular symmetry, a change that doesn’t change anything.

Noether showed that the symmetries of general relativity — its invariance under transformations between different reference frames — ensure that energy is always conserved. Einstein’s theory was saved. Noether and symmetry have both occupied center stage in physics ever since.
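In its simplest mechanical form (a textbook statement, included here for concreteness), Noether's theorem reads: if the Lagrangian $L(q,\dot q)$ has no explicit time dependence, $\partial L/\partial t = 0$, then the energy $E = \sum_i \dot q_i\,\partial L/\partial \dot q_i - L$ is conserved along solutions; invariance under spatial translations likewise yields conservation of momentum, and invariance under rotations yields conservation of angular momentum.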

Matter

Post Einstein, the pull of symmetry only became more powerful. Paul Dirac, trying to make quantum mechanics compatible with the symmetry requirements of special relativity, found a minus sign in an equation suggesting that “antimatter” must exist to balance the books. It does. Soon after, Wolfgang Pauli, in an attempt to account for the energy that seemed to go missing during the disintegration of radioactive particles, speculated that perhaps the missing energy was carried away by some unknown, elusive particle. It was, and that particle is the neutrino.

Starting in the 1950s, invariances took on a life of their own, becoming ever more abstract, “leaping out,” as Kaiser put it, from the symmetries of space-time. These new symmetries, known as “gauge” invariances, became extremely productive, “furnishing the world,” Kaiser said, by requiring the existence of everything from W and Z bosons to gluons. “Because we think there’s a symmetry that’s so fundamental it has to be protected at all costs, we invent new stuff,” he said. Gauge symmetry “dictates what other ingredients you have to introduce.” It’s roughly the same kind of symmetry as the one that tells us that a triangle that’s invariant under 120-degree rotations must have three equal sides.

Gauge symmetries describe the internal structure of the system of particles that populates our world. They indicate all the ways physicists can shift, rotate, distort and generally mess with their equations without varying anything important. “The symmetry tells you how many ways you can flip things, change the way the forces work, and it doesn’t change anything,” Alexander said. The result is a peek at the hidden scaffolding that supports the basic ingredients of nature.

The abstractness of gauge symmetries causes a certain unease in some quarters. “You don’t see the whole apparatus, you only see the outcome,” Dijkgraaf said. “I think with gauge symmetries there’s still a lot of confusion.”

To compound the problem, gauge symmetries produce a multitude of ways to describe a single physical system — a redundancy, as the physicist Mark Trodden of the University of Pennsylvania put it. This property of gauge theories, Trodden explained, renders calculations “fiendishly complicated.” Pages and pages of calculations lead to very simple answers. “And that makes you wonder: Why? Where does all that complexity in the middle come from? And one possible answer to that is this redundancy of description that gauge symmetries give you.”

Such internal complexity is the opposite of what symmetry normally offers: simplicity. With a tiling pattern that repeats itself, “you only need to look at one little bit and you can predict the rest of it,” Dijkgraaf said. You don’t need one law for the conservation of energy and another for matter where only one will do. The universe is symmetrical in that it’s homogeneous on large scales; it doesn’t have a left or right, up or down. “If that weren’t the case, cosmology would be a big mess,” Khoury said.

Broken Symmetries

The biggest problem is that symmetry as it’s now understood seems to be failing to answer some of the biggest questions in physics. True, symmetry told physicists where to look for both the Higgs boson and gravitational waves — two momentous discoveries of the past decade. At the same time, symmetry-based reasoning predicted a slew of things that haven’t shown up in any experiments, including the “supersymmetric” particles that could have served as the cosmos’s missing dark matter and explained why gravity is so weak compared to electromagnetism and all the other forces.

In some cases, symmetries present in the underlying laws of nature appear to be broken in reality. For instance, when energy congeals into matter via the good old E = mc2, the result is equal amounts of matter and antimatter — a symmetry. But if the energy of the Big Bang created matter and antimatter in equal amounts, they should have annihilated each other, leaving not a trace of matter behind. Yet here we are.

The perfect symmetry that should have existed in the early hot moments of the universe somehow got destroyed as it cooled down, just as a perfectly symmetrical drop of water loses some of its symmetry when it freezes into ice. (A snowflake may look the same in six different orientations, but a melted snowflake looks the same in every direction.)

“Everyone’s interested in spontaneously broken symmetries,” Trodden said. “The law of nature obeys a symmetry, but the solution you’re interested in does not.”

But what broke the symmetry between matter and antimatter?

It would come as a surprise to no one if physics today turned out to be burdened with unnecessary scaffolding, much like the notion of “empty space” that misdirected people before Einstein. Today’s misdirection, some think, may even have to do with the obsession with symmetry itself, at least as it’s currently understood.

Many physicists have been exploring an idea closely related to symmetry called “duality.” Dualities are not new to physics. Wave-particle duality — the fact that the same quantum system is best described as either a wave or a particle, depending on the context — has been around since the beginning of quantum mechanics. But newfound dualities have revealed surprising relationships: For example, a three-dimensional world without gravity can be mathematically equivalent, or dual, to a four-dimensional world with gravity.

If descriptions of worlds with different numbers of spatial dimensions are equivalent, then “one dimension in some sense can be thought of as fungible,” Trodden said.

“These dualities include elements — the number of dimensions — we think about as invariant,” Dijkgraaf said, “but they are not.” The existence of two equivalent descriptions with all the attendant calculations raises “a very deep, almost philosophical point: Is there an invariant way to describe physical reality?”

No one is giving up on symmetry anytime soon, in part because it’s proved so powerful and also because relinquishing it means, to many physicists, giving up on “naturalness” — the idea that the universe has to be exactly the way it is for a reason, the furniture arranged so impeccably that you couldn’t imagine it any other way.

Clearly, some aspects of nature — like the orbits of the planets — are the result of history and accident, not symmetry. Biological evolution is a combination of known mechanisms and chance. Perhaps Max Born was right when he responded to Einstein’s persistent objection that “God does not play dice” by pointing out that “nature, as well as human affairs, seems to be subject to both necessity and accident.”

Certain aspects of physics will have to remain intact — causality for example. “Effects cannot precede causes,” Alexander said. Other things almost certainly will not.

One aspect that will surely not play a key role in the future is the speed of light, which grounded Einstein’s work. The smooth fabric of space-time Einstein wove a century ago inevitably gets ripped to shreds inside black holes and at the moment of the Big Bang. “The speed of light can’t remain constant if space-time is crumbling,” Alexander said. “If space-time is crumbling, what is invariant?”

Certain dualities suggest that space-time emerges from something more basic still, the strangest relationship of all: What Einstein called the “spooky” connections between entangled quantum particles. Many researchers believe these long-distance links stitch space-time together. As Kaiser put it, “The hope is that something like a continuum of space-time would emerge as a secondary effect of more fundamental relationships, including entanglement relationships.” In that case, he said, classical, continuous space-time would be an “illusion.”

The high bar for new ideas is that they cannot contradict consistently reliable theories like quantum mechanics and relativity — including the symmetries that support them.

Einstein once compared building a new theory to climbing a mountain. From a higher perspective, you can see the old theory still standing, but it’s altered, and you can see where it fits into the larger, more inclusive landscape. Instead of thinking, as Feynman suggested, with last week’s potatoes, future thinkers might ponder physics using the information encoded in quantum entanglements, which weave the space-time to grow potatoes in the first place.




Last month, a team of physicists reported in Nature that a sound-trapping fluid, analogous to a black hole that traps light, radiates a featureless spectrum of energies, just as Stephen Hawking predicted for the invisible spheres he was famous for studying. But opinions differ about what this sonic analogue of a black hole reveals about the real kind — such as the one recently seen in silhouette in a first-ever photograph.

The question is how to interpret the bizarre analogy between a fluid of rubidium atoms in a lab in Israel and the mysterious astrophysical abysses most often created when huge stars exhaust their fuel and collapse inward.

Some philosophers and physicists argue that the new findings have striking implications for the black hole information paradox, a profound 45-year-old puzzle about whether or how quantum information escapes black holes. Others regard the fluid experiment as an amusing demo that says nothing about black holes or their central mystery.

The paradox sprang from Hawking’s 1974 insight that a black hole isn’t truly black. Its black-looking, spherical “event horizon” marks the vicinity within which its gravity is so strong that even light rays cannot climb out. But Hawking reasoned that the fabric of space-time at the event horizon will experience “quantum fluctuations,” where pairs of particles and antiparticles spontaneously pop up out of the vacuum. Normally, these opposites instantly annihilate, returning energy borrowed fleetingly from the universe’s budget. But a particle and an antiparticle that materialize on either side of a black hole’s event horizon get dragged apart.

Energy is stolen from the vacuum, Hawking realized, in the permanent creation of a new particle, which radiates out from the horizon as “Hawking radiation.” Its accomplice takes the fall, carrying negative energy into the black hole. Black holes lose energy, in other words, as they radiate. They slowly evaporate and shrink, ultimately disappearing completely.

The problem is that, according to Hawking’s calculations, black hole radiation will be random, with a featureless, “thermal” spectrum of energies that carries no information about the black hole or whatever formed or fell in it. This implies that an evaporating black hole destroys information — something quantum mechanics doesn’t allow. Quantum math relies on the premise that information is never lost. As particles shuffle and transform, a record of the past always remains encoded in the present and future. We could theoretically re-create a burned book from its ashes by turning back time.
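For reference, the temperature of the thermal spectrum Hawking derived is given by the standard formula (not quoted in the article): $T_H = \hbar c^3 / (8\pi G M k_B)$. It depends only on the black hole's mass (and, more generally, its charge and spin), which is precisely why the radiation appears to carry no record of whatever fell in.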

In the decades since Hawking radiation was discovered, the information paradox has motivated the quest for a deeper understanding of nature. Today’s physicists widely believe black hole information is preserved — that the quantum nature of gravity somehow modifies event horizons (and corrects Hawking’s calculation) in a way that encrypts the outgoing Hawking radiation with a record of the past. The question is how black hole information gets out.

Years ago, the theoretical physicist Bill Unruh argued that Hawking’s insights about black hole horizons should also apply to “sonic horizons.” This raised the prospect of testing Hawking’s math by analogy, initiating a race to create black hole analogues in the lab. The most successful practitioner, Jeff Steinhauer of the Technion in Haifa, Israel, generates a sonic horizon by accelerating a fluid of rubidium-87 atoms to a supersonic speed. In 2016, Steinhauer made headlines by detecting the acoustic analogue of Hawking radiation. Quantum units of sound, called phonons, popped up in pairs straddling the sonic horizon; one phonon would get swept along by the moving fluid while the other fought its way upstream and escaped.

Now, three years of improvements to the apparatus have “allowed for the quantitative check of Hawking’s predictions,” Steinhauer said. In his new paper, he and three collaborators reported that their sonic radiation is featureless, just as Hawking calculated for black holes. “The discovery gives us hints regarding the information paradox,” Steinhauer said by email. “The thermal form of the spectrum suggests that Hawking radiation carries no information. Thus, we need to look elsewhere to solve the information paradox.”

Most quantum gravity researchers disagree with this assessment, but a group of philosophers who have become interested in analogue black hole experiments think Steinhauer is right.

The key issue is whether space-time at a black hole’s event horizon can be treated as smooth. Both Hawking and Unruh, in their studies of real and sonic black holes, assumed that quantum fluctuations happen on a smooth background. Hawking, in his calculation, glossed over the (unknown) microscopic properties of the space-time fabric at the event horizon, and Unruh likewise treated the fluid in a sonic black hole as smooth, ignoring its composite atoms. It’s this “smoothness approximation” that most quantum gravity researchers find suspect. They think quantum-scale properties of space-time somehow encode information in Hawking radiation.

Steinhauer’s new measurements confirm that in the fluid case, the smoothness approximation works. Moreover, Unruh’s theoretical studies suggest that fluids with diverse microscopic properties will still be smooth on macro scales and emit featureless, thermal Hawking radiation. The philosophers argue that the “universality” of Hawking radiation — its robustness and insensitivity to the fine-grained details of a medium — suggests that the smoothness approximation should also hold for space-time.

“We argue that if it turns out that the modeling assumptions don’t steer you wrong in the acoustic case, that gives you good reason, on the basis of universality considerations, to believe that they don’t steer you wrong in the Hawking case,” said Eric Winsberg, a philosopher of science at the University of South Florida and a co-author of a recent study of analogue black hole experiments. In other words, the new results increase the likelihood “that information in real black holes must be lost.”

But there’s a major catch, which philosophers discussed in another recent paper: Even if the smoothness approximation holds universally for fluids, it might not hold for space-time, which might be stitched out of microscopic parts according to a much stranger pattern. Perhaps, as Winsberg put it, “there are more ways that space-time could deviate from smoothness than are dreamt of in your philosophy.”

For instance, various thought experiments and toy examples suggest that space-time might be holographic — a geometric projection, similar to how a video game universe emerges from a computer chip. The interior of a black hole might be a hologram that projects from information encoded on the event horizon. Daniel Harlow, a quantum gravity theorist and black hole expert at the Massachusetts Institute of Technology, said such a scenario would be expected to add subtle structure to the spectrum of Hawking radiation. The radiation would look thermal, but meaningful patterns would appear “if you fed the entire radiation cloud into a quantum computer and ran some fancy algorithms on it.”

The philosophers concede that exotic possibilities for the quantum-scale properties of space-time “mute the strength” with which Steinhauer’s experiment makes black hole information loss more likely.

Will any of this change anyone’s mind? Different starting beliefs, evidentiary requirements and other factors “can have a big effect on the kinds of inferences scientists make,” said Sean Gryb, a physicist and philosopher at the University of Bristol. Quantum gravity theorists will almost certainly go on thinking that information escapes black holes, even as the minority who believe in information loss feel more confident. Without measuring actual black hole radiation — which is beyond experimental reach — how will experts ever agree? “This is the kind of question philosophers of science have been looking for a definite answer to for a very long time,” Gryb said.



Weatherall, James (2019) Equivalence and Duality in Electromagnetism. [Preprint]
Glick, David (2019) Timelike Entanglement For Delayed-Choice Entanglement Swapping. [Preprint]

Nature Physics, Published online: 24 June 2019; doi:10.1038/s41567-019-0545-1

Restricted Boltzmann machines, a type of stochastic neural network, have been widely used in artificial intelligence applications for decades. They are now finding new life in the simulation of complex wavefunctions in quantum many-body physics.
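A minimal sketch of how a restricted Boltzmann machine can parameterize a many-body wavefunction, in the spirit of the neural-quantum-state ansatz of Carleo and Troyer (the sizes, random parameters, and function names below are illustrative assumptions, not the code behind the article):

```python
import numpy as np

# Illustrative RBM wavefunction ansatz:
#   psi(s) = exp(sum_i a_i s_i) * prod_j 2*cosh(b_j + sum_i W_ji s_i)
# for N spins s_i = +/-1 and M hidden units, with complex parameters a, b, W.
rng = np.random.default_rng(0)
N, M = 6, 12                               # visible spins, hidden units
a = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
b = 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
W = 0.01 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

def rbm_amplitude(s):
    """Unnormalized wavefunction amplitude for a spin configuration s (entries +/-1)."""
    theta = b + W @ s
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

s = rng.choice([-1.0, 1.0], size=N)        # a random spin configuration
print(rbm_amplitude(s))                    # complex amplitude; ratios of such amplitudes drive Monte Carlo sampling
```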
Wuthrich, Christian (2019) Time travelling in emergent spacetime. [Preprint]

Author(s): Alvaro Ortega, Emma McKay, Álvaro M. Alhambra, and Eduardo Martín-Martínez

We study the work cost of processes in quantum fields without the need for projective measurements, which are always ill defined in quantum field theory. Inspired by interferometry schemes, we propose a work distribution that generalizes the two-point measurement scheme employed in quantum thermodynamics...


[Phys. Rev. Lett. 122, 240604] Published Fri Jun 21, 2019

Authors: M. Giammarchi

This paper describes the first experimental evidence of antimatter-wave interference, a process at the heart of Quantum Physics and its interpretation. For ordinary matter particles, interference phenomena have been observed in a variety of cases, ranging from electrons up to complex molecules. Here I present the first demonstration of single-positron quantum interference.

Authors: Tejinder P. Singh

In the first paper of this series, we have introduced the concept of an atom of space-time-matter [STM], which is described by the spectral action of non-commutative geometry, corresponding to a classical theory of gravity. In the present work, we use the Connes time parameter along with the spectral action, to incorporate gravity into trace dynamics. We then derive the spectral equation of motion for the STM atom, which turns out to be the Dirac equation on a non-commutative space.


Authors: Pasquale Bosso

The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.
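One widely used parameterization of the GUP mentioned here (a common convention in the literature; the thesis may use a different one) deforms the canonical commutator as $[\hat x, \hat p] = i\hbar\,(1 + \beta \hat p^2)$, which gives $\Delta x\,\Delta p \ge \tfrac{\hbar}{2}\,(1 + \beta (\Delta p)^2 + \beta \langle \hat p\rangle^2)$ and hence a minimal position uncertainty $\Delta x_{\min} = \hbar\sqrt{\beta}$.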

Authors: Chia-Yi Ju, Adam Miranowicz, Guang-Yin Chen, Franco Nori

Recently, apparent non-physical implications of non-Hermitian quantum mechanics (NHQM) have been discussed in the literature. In particular, the apparent violation of the no-signaling theorem, the discrimination of non-orthogonal states, and the increase of quantum entanglement by local operations were reported, and NHQM was therefore not considered a fundamental theory. Here we show that these and other no-go principles (including the no-cloning and no-deleting theorems) of conventional quantum mechanics are indeed satisfied in any NHQM if its formalism is properly applied. We have developed a modified formulation of NHQM, based on the geometry of Hilbert spaces, which is consistent with conventional quantum mechanics for Hermitian systems. Using this formulation, the validity of these principles can be shown in a simple and uniform way.

Read, James and Menon, Tushar (2019) The limitations of inertial frame spacetime functionalism. [Preprint]
Swanson, Noel (2018) Deciphering the Algebraic CPT Theorem. [Preprint]
Canturk, Bilal (2019) A conceptual frame for giving physical content to the uncertainty principle and the quantum state. [Preprint]
Kastner, Ruth (2019) The "Delayed Choice Quantum Eraser" Neither Erases Nor Delays. Foundations of Physics.
Abstract
This article suggests a fresh look at gauge symmetries, with the aim of drawing a clear line between the a priori theoretical considerations involved, and some methodological and empirical non-deductive aspects that are often overlooked. The gauge argument is primarily based on a general symmetry principle expressing the idea that a change of mathematical representation should not change the form of the dynamical law. In addition, the ampliative part of the argument is based on the introduction of new degrees of freedom into the theory according to a methodological principle that is formulated here in terms of correspondence between passive and active transformations. To demonstrate how the two kinds of considerations work together in a concrete context, I begin by considering spatial symmetries in mechanics. I suggest understanding Mach's principle as a similar combination of theoretical, methodological and empirical considerations, and demonstrate the claim with a simple toy model. I then examine gauge symmetries as a manifestation of the two principles in a quantum context. I further show that in all of these cases the relational nature of physically significant quantities can explain the relevance of the symmetry principle and the way the methodology is applied. In the quantum context, the relevant relational variables are quantum phases.
While quantum field theory could more aptly be called the ‘quantum field framework’—as it encompasses a vast variety of concepts and theories—relativity, both special and general, is more commonly portrayed as less of a ‘general framework’. Viewed from this perspective, the paradigm of analogue space-times (also often called analogue gravity) is to promote the specific theory of general relativity (Einstein gravity) to a framework which covers relativistic phenomena at large. Ultimately, this then also gives rise to new proposals for experiments in the laboratory, as it allows one to move general features of the ‘relativistic framework’ from general relativity to entirely new areas of physics. This allows one to experimentally look into analogues of currently unobservable phenomena of general relativity proper. The only requirement for this to work is the presence of a notion of an upper limit for propagation speeds in this new setting. Systems ...
Castellani, Elena and De Haro, Sebastian (2018) Duality, Fundamentality, and Emergence. [Preprint]

Author(s): H. Chau Nguyen, Huy-Viet Nguyen, and Otfried Gühne

Correlations between distant particles are central to many puzzles and paradoxes of quantum mechanics and, at the same time, underpin various applications such as quantum cryptography and metrology. Originally in 1935, Einstein, Podolsky, and Rosen (EPR) used these correlations to argue against the ...


[Phys. Rev. Lett. 122, 240401] Published Mon Jun 17, 2019

Deng, Natalja (2018) What is temporal ontology? Philosophical Studies, 175 (3). pp. 793-807. ISSN 0031-8116

Authors: Run-Qiu Yang, Yu-Sen An, Chao Niu, Cheng-Yong Zhang, Keun-Young Kim

We comment on some shortcomings of non-unitary-invariant and non-bi-invariant complexity in quantum mechanics/field theory and argue that unitary-invariant and bi-invariant complexity is still a competitive candidate in quantum mechanics/field theory, contrary to the situation for quantum circuits in quantum computation. Based on the unitary invariance of the complexity and intuitions from holographic complexity, we propose a novel complexity formula between two states. Our proposal shows that i) the complexity between certain states in two-dimensional CFTs is given by the Liouville action, which is compatible with the path-integral complexity; ii) it also gives a natural interpretation for both the CV and CA holographic conjectures and identifies what the reference states are in both cases. Our proposal explicitly produces the conjectured time dependence of the complexity: linear growth in chaotic systems. Last but not least, we present interesting relations between the complexity and the Lyapunov exponent: the Lyapunov exponent is proportional to the complexity growth rate in the linear-growth region.

Authors: G.W. Gibbons

Two lectures given in Paris in 1985. They were circulated as a preprint, Solitons and Black Holes in Four Dimensions, Five Dimensions, G.W. Gibbons (Cambridge U.), PRINT-85-0958 (CAMBRIDGE), received Dec 1985, 14 pp., and appeared in print in De Vega, H.J. (Ed.) and Sanchez, N. (Ed.): Field Theory, Quantum Gravity and Strings, 46-59.

I have scanned the original, reformatted it, and corrected various typos.

Authors: M.D.C. Torri, V. Antonelli, L. Miramonti

This work explores a possible Standard Model (S.M.) extension that violates Lorentz invariance while preserving space-time isotropy and homogeneity. In this sense HMSR represents an attempt to introduce an isotropic Lorentz Invariance Violation into the elementary particle S.M. The theory is constructed starting from a modified kinematics that takes into account supposed quantum effects due to interaction with the space-time background. The space-time structure itself is modified, resulting in a pseudo-Finsler manifold. The S.M. extension provided here is inspired by effective field theories, but it preserves covariance with respect to newly introduced modified Lorentz transformations. Geometry perturbations are not considered universal, but particle-species dependent. The non-universal character of the amended Lorentz transformations makes it possible to obtain visible physical effects, detectable in experiments by comparing different perturbations related to different species of interacting particles.

Authors: Louis Marchildon

All investigators working on the foundations of quantum mechanics agree that the theory has profoundly modified our conception of reality. But there ends the consensus. The unproblematic formalism of the theory gives rise to a number of very different interpretations, each of which has consequences on the notion of reality. This paper analyses how the Copenhagen interpretation, von Neumann's state vector collapse, Bohm and de Broglie's pilot wave and Everett's many worlds modify, each in its own way, the classical conception of reality, whose local character, in particular, requires revision.

Authors: Gabriel R. Bengochea, Gabriel León, Elias Okon, Daniel Sudarsky

Recently it has been argued that a correct reading of the quantum fluctuations of the vacuum could lead to a solution to the cosmological constant problem. In this work we critically examine such a proposal, finding it questionable due to conceptual and self-consistency problems, as well as issues with the actual calculations. We conclude that the proposal is inadequate as a solution to the cosmological constant problem.

Authors: A.P. Balachandran, I.M. Burbano, A.F. Reyes-Lega, S. Tabban

The algebraic approach to quantum physics emphasizes the role played by the structure of the algebra of observables and its relation to the space of states. An important feature of this point of view is that subsystems can be described by subalgebras, with partial trace being replaced by the more general notion of restriction to a subalgebra. This, in turn, has recently led to applications to the study of entanglement in systems of identical particles. In the course of those investigations on entanglement and particle identity, an emergent gauge symmetry has been found by Balachandran, de Queiroz and Vaidya. In this letter we establish a novel connection between that gauge symmetry, entropy production and quantum operations. Thus, let A be a system described by a finite dimensional observable algebra and $\omega$ a mixed faithful state. Using the Gelfand-Naimark-Segal (GNS) representation we construct a canonical purification of $\omega$, allowing us to embed A into a larger system C. Using Tomita-Takesaki theory, we obtain a subsystem decomposition of C into subsystems A and B, without making use of any tensor product structure. We identify a group of transformations that acts as a gauge group on A while at the same time giving rise to entropy increasing quantum operations on C. We provide physical means to simulate this gauge symmetry/quantum operation duality.

Authors: S. Sołtan, D. Dopierała, A. Bednorz

A recent group of experiments tested local realism with random choices prepared by humans. These various tests were subject to additional assumptions, which lead to loopholes in the interpretation of almost all of the experiments. Among these assumptions are fair sampling, no signaling, and a faithful quantum model. We examined the data from 9 of the 13 experiments and analyzed the anomalies that occur in view of the above assumptions. We conclude that further tests of local realism need better setup calibration to avoid apparent signaling or the necessity of a complicated underlying quantum model.

Sullivan, Emily (2019) Universality Caused: The case of renormalization group explanation. [Preprint]
Chen, Eddy Keming (2019) Realism about the Wave Function. Philosophy Compass.
Dardashti, Radin and Hartmann, Stephan and Thebault, Karim P Y and Winsberg, Eric (2015) Hawking Radiation and Analogue Experiments: A Bayesian Analysis. [Preprint]
De Haro, Sebastian and Butterfield, Jeremy (2019) On symmetry and duality. Synthese. pp. 1-41.
McKenzie, Alan (2019) Levels of reality: emergent properties of a mathematical multiverse. [Preprint]
Cushman, Matthew (2019) Anthropic Indexical Sampling and Implications for The Doomsday Argument. [Preprint]
De Haro, Sebastian and Butterfield, Jeremy (2019) A Schema for Duality, Illustrated by Bosonization. Foundations of Mathematics and Physics one Century after Hilbert.

Authors: Krishan Saraswat, Niayesh Afshordi

Two seemingly distinct notions regarding black holes have captured the imagination of theoretical physicists over the past decade: First, black holes are conjectured to be fast scramblers of information, a notion that is further supported through connections to quantum chaos and decay of mutual information via AdS/CFT holography. Second, the black hole information paradox has motivated exotic quantum structure near the horizons of black holes (e.g., gravastars, fuzzballs, or firewalls) that may manifest itself through delayed gravitational wave echoes in the aftermath of black hole formation or mergers, and is potentially observable by the LIGO/Virgo observatories. By studying various limits of charged AdS/Schwarzschild black holes we show that, if properly defined, the two seemingly distinct phenomena happen on an identical timescale of $\log(\mathrm{Radius})/(\pi \times \mathrm{Temperature})$. We further comment on the physical interpretation of this coincidence and the corresponding holographic interpretation of black hole echoes.

Authors: P. B. Lerner

The Gedankenexperiment advanced by Frauchiger and Renner in their Nature paper is based on an implicit assumption that one can synchronize stochastic measurement intervals between two non-interacting systems. This hypothesis, the author demonstrates, is equivalent to the complete entanglement of these systems. Consequently, Frauchiger and Renner's postulate Q is too broad and, in general, meaningless. An accurate reformulation of the postulate, Q1, does not seem to entail any paradoxes with measurement. This paper is agnostic with respect to particular interpretations of quantum mechanics. Nor does it refer to the collapse of the wavefunction.

Authors: Yuri G Rudoy, Enock O Oladimeji

This paper presents a detailed investigation of one of the most interesting models in the non-relativistic quantum mechanics of a single massive particle, introduced by G. Poeschl and E. Teller in 1933. This model includes as particular cases two of the most popular and valuable models: the quasi-free particle in a box with impenetrable hard walls (i.e., the model with confinement) and the Bloch quantum harmonic oscillator, which is unconfined in space; both models are frequently and effectively exploited in modern nanotechnology, e.g., in quantum dots and magnetic traps. We give an extensive and elementary exposition of the potentials, wave functions and energy spectra of all these interconnected models. Moreover, the pressure operator is defined following the lines of Hellmann and Feynman, who were the first to introduce this idea, in the late 1930s, in quantum chemistry. By these means the baroenergetic equation of state is obtained and analyzed for all three models; in particular, the absence of pressure for the Bloch oscillator is shown to be due to the infinite width of the box. The generalization of these results to the case of nonzero temperature will be given later.

Authors: Steven B. Giddings

A succinct summary is given of the problem of reconciling observation of black hole-like objects with quantum mechanics. If quantum black holes behave like subsystems, and also decay, their information must be transferred to their environments. Interactions that accomplish this with `minimal' departure from a standard description are parameterized. Possible sensitivity of gravitational wave or very long baseline interferometric observations to these interactions is briefly outlined.

Authors: A. Hariri, D. Curic, L. Giner, J. S. Lundeen

The weak value, the average result of a weak measurement, has proven useful for probing quantum and classical systems. Examples include the amplification of small signals, investigating quantum paradoxes, and elucidating fundamental quantum phenomena such as geometric phase. A key characteristic of the weak value is that it can be complex, in contrast to a standard expectation value. However, typically only either the real or imaginary component of the weak value is determined in a given experimental setup. Weak measurements can be used to, in a sense, simultaneously measure non-commuting observables. This principle was used in the direct measurement of the quantum wavefunction. However, the wavefunction's real and imaginary components, given by a weak value, are determined in different setups or on separate ensembles of systems, putting the procedure's directness in question. To address these issues, we introduce and experimentally demonstrate a general method to simultaneously read out both components of the weak value in a single experimental apparatus. In particular, we directly measure the polarization state of an ensemble of photons using weak measurement. With our method, each photon contributes to both the real and imaginary parts of the weak-value average. On a fundamental level, this suggests that the full complex weak value is a characteristic of each photon measured.

In 1981, many of the world’s leading cosmologists gathered at the Pontifical Academy of Sciences, a vestige of the coupled lineages of science and theology located in an elegant villa in the gardens of the Vatican. Stephen Hawking chose the august setting to present what he would later regard as his most important idea: a proposal about how the universe could have arisen from nothing.

Before Hawking’s talk, all cosmological origin stories, scientific or theological, had invited the rejoinder, “What happened before that?” The Big Bang theory, for instance — pioneered 50 years before Hawking’s lecture by the Belgian physicist and Catholic priest Georges Lemaître, who later served as president of the Vatican’s academy of sciences — rewinds the expansion of the universe back to a hot, dense bundle of energy. But where did the initial energy come from?

The Big Bang theory had other problems. Physicists understood that an expanding bundle of energy would grow into a crumpled mess rather than the huge, smooth cosmos that modern astronomers observe. In 1980, the year before Hawking’s talk, the cosmologist Alan Guth realized that the Big Bang’s problems could be fixed with an add-on: an initial, exponential growth spurt known as cosmic inflation, which would have rendered the universe huge, smooth and flat before gravity had a chance to wreck it. Inflation quickly became the leading theory of our cosmic origins. Yet the issue of initial conditions remained: What was the source of the minuscule patch that allegedly ballooned into our cosmos, and of the potential energy that inflated it?

Hawking, in his brilliance, saw a way to end the interminable groping backward in time: He proposed that there’s no end, or beginning, at all. According to the record of the Vatican conference, the Cambridge physicist, then 39 and still able to speak with his own voice, told the crowd, “There ought to be something very special about the boundary conditions of the universe, and what can be more special than the condition that there is no boundary?”

The “no-boundary proposal,” which Hawking and his frequent collaborator, James Hartle, fully formulated in a 1983 paper, envisions the cosmos having the shape of a shuttlecock. Just as a shuttlecock has a diameter of zero at its bottommost point and gradually widens on the way up, the universe, according to the no-boundary proposal, smoothly expanded from a point of zero size. Hartle and Hawking derived a formula describing the whole shuttlecock — the so-called “wave function of the universe” that encompasses the entire past, present and future at once — making moot all contemplation of seeds of creation, a creator, or any transition from a time before.

“Asking what came before the Big Bang is meaningless, according to the no-boundary proposal, because there is no notion of time available to refer to,” Hawking said in another lecture at the Pontifical Academy in 2016, a year and a half before his death. “It would be like asking what lies south of the South Pole.”

Hartle and Hawking’s proposal radically reconceptualized time. Each moment in the universe becomes a cross-section of the shuttlecock; while we perceive the universe as expanding and evolving from one moment to the next, time really consists of correlations between the universe’s size in each cross-section and other properties — particularly its entropy, or disorder. Entropy increases from the cork to the feathers, aiming an emergent arrow of time. Near the shuttlecock’s rounded-off bottom, though, the correlations are less reliable; time ceases to exist and is replaced by pure space. As Hartle, now 79 and a professor at the University of California, Santa Barbara, explained it by phone recently, “We didn’t have birds in the very early universe; we have birds later on. … We didn’t have time in the early universe, but we have time later on.”

The no-boundary proposal has fascinated and inspired physicists for nearly four decades. “It’s a stunningly beautiful and provocative idea,” said Neil Turok, a cosmologist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada, and a former collaborator of Hawking’s. The proposal represented a first guess at the quantum description of the cosmos — the wave function of the universe. Soon an entire field, quantum cosmology, sprang up as researchers devised alternative ideas about how the universe could have come from nothing, analyzed the theories’ various predictions and ways to test them, and interpreted their philosophical meaning. The no-boundary wave function, according to Hartle, “was in some ways the simplest possible proposal for that.”

But two years ago, a paper by Turok, Job Feldbrugge of the Perimeter Institute, and Jean-Luc Lehners of the Max Planck Institute for Gravitational Physics in Germany called the Hartle-Hawking proposal into question. The proposal is, of course, only viable if a universe that curves out of a dimensionless point in the way Hartle and Hawking imagined naturally grows into a universe like ours. Hawking and Hartle argued that indeed it would — that universes with no boundaries will tend to be huge, breathtakingly smooth, impressively flat, and expanding, just like the actual cosmos. “The trouble with Stephen and Jim’s approach is it was ambiguous,” Turok said — “deeply ambiguous.”

In their 2017 paper, published in Physical Review Letters, Turok and his co-authors approached Hartle and Hawking’s no-boundary proposal with new mathematical techniques that, in their view, make its predictions much more concrete than before. “We discovered that it just failed miserably,” Turok said. “It was just not possible quantum mechanically for a universe to start in the way they imagined.” The trio checked their math and queried their underlying assumptions before going public, but “unfortunately,” Turok said, “it just seemed to be inescapable that the Hartle-Hawking proposal was a disaster.”

The paper ignited a controversy. Other experts mounted a vigorous defense of the no-boundary idea and a rebuttal of Turok and colleagues’ reasoning. “We disagree with his technical arguments,” said Thomas Hertog, a physicist at the Catholic University of Leuven in Belgium who closely collaborated with Hawking for the last 20 years of the latter’s life. “But more fundamentally, we disagree also with his definition, his framework, his choice of principles. And that’s the more interesting discussion.”

After two years of sparring, the groups have traced their technical disagreement to differing beliefs about how nature works. The heated — yet friendly — debate has helped firm up the idea that most tickled Hawking’s fancy. Even critics of his and Hartle’s specific formula, including Turok and Lehners, are crafting competing quantum-cosmological models that try to avoid the alleged pitfalls of the original while maintaining its boundless allure.

Garden of Cosmic Delights

Hartle and Hawking saw a lot of each other from the 1970s on, typically when they met in Cambridge for long periods of collaboration. The duo’s theoretical investigations of black holes and the mysterious singularities at their centers had turned them on to the question of our cosmic origin.

In 1915, Albert Einstein discovered that concentrations of matter or energy warp the fabric of space-time, causing gravity. In the 1960s, Hawking and the Oxford University physicist Roger Penrose proved that when space-time bends steeply enough, such as inside a black hole or perhaps during the Big Bang, it inevitably collapses, curving infinitely steeply toward a singularity, where Einstein’s equations break down and a new, quantum theory of gravity is needed. The Penrose-Hawking “singularity theorems” meant there was no way for space-time to begin smoothly, undramatically at a point.

Hawking and Hartle were thus led to ponder the possibility that the universe began as pure space, rather than dynamical space-time. And this led them to the shuttlecock geometry. They defined the no-boundary wave function describing such a universe using an approach invented by Hawking’s hero, the physicist Richard Feynman. In the 1940s, Feynman devised a scheme for calculating the most likely outcomes of quantum mechanical events. To predict, say, the likeliest outcomes of a particle collision, Feynman found that you could sum up all possible paths that the colliding particles could take, weighting straightforward paths more than convoluted ones in the sum. Calculating this “path integral” gives you the wave function: a probability distribution indicating the different possible states of the particles after the collision.
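As a rough illustration of the sum-over-paths idea described above (a toy sketch of my own, not anything from Hartle and Hawking's actual calculation), the short Python snippet below adds up the phase factors exp(iS/ħ) over a small grid of discretized free-particle paths; the grid sizes, endpoints and units are arbitrary choices made for the example. In Feynman's prescription each path contributes a unit-magnitude phase, and convoluted paths largely cancel by interference, which is one way to read the "weighting" described above.

import numpy as np
import itertools

# Toy sketch: approximate the amplitude for a free particle to go from x=0 at t=0 to x=1 at t=T
# by summing exp(i*S/hbar) over a small grid of piecewise-linear paths.
hbar, m, T = 1.0, 1.0, 1.0
n_steps = 3                          # intermediate time slices (kept tiny so the sum is explicit)
dt = T / (n_steps + 1)
grid = np.linspace(-2.0, 3.0, 25)    # candidate positions at each intermediate slice

def action(intermediate_positions):
    """Classical action of one piecewise-linear path (free particle: kinetic term only)."""
    xs = np.concatenate(([0.0], intermediate_positions, [1.0]))
    velocities = np.diff(xs) / dt
    return np.sum(0.5 * m * velocities ** 2 * dt)

amplitude = 0.0 + 0.0j
for path in itertools.product(grid, repeat=n_steps):
    amplitude += np.exp(1j * action(np.array(path)) / hbar)

print("unnormalized amplitude from the path sum:", amplitude)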

Likewise, Hartle and Hawking expressed the wave function of the universe — which describes its likely states — as the sum of all possible ways that it might have smoothly expanded from a point. The hope was that the sum of all possible “expansion histories,” smooth-bottomed universes of all different shapes and sizes, would yield a wave function that gives a high probability to a huge, smooth, flat universe like ours. If the weighted sum of all possible expansion histories yields some other kind of universe as the likeliest outcome, the no-boundary proposal fails.

The problem is that the path integral over all possible expansion histories is far too complicated to calculate exactly. Countless different shapes and sizes of universes are possible, and each can be a messy affair. “Murray Gell-Mann used to ask me,” Hartle said, referring to the late Nobel Prize-winning physicist, “if you know the wave function of the universe, why aren’t you rich?” Of course, to actually solve for the wave function using Feynman’s method, Hartle and Hawking had to drastically simplify the situation, ignoring even the specific particles that populate our world (which meant their formula was nowhere close to being able to predict the stock market). They considered the path integral over all possible toy universes in “minisuperspace,” defined as the set of all universes with a single energy field coursing through them: the energy that powered cosmic inflation. (In Hartle and Hawking’s shuttlecock picture, that initial period of ballooning corresponds to the rapid increase in diameter near the bottom of the cork.)

Even the minisuperspace calculation is hard to solve exactly, but physicists know there are two possible expansion histories that potentially dominate the calculation. These rival universe shapes anchor the two sides of the current debate.

The rival solutions are the two “classical” expansion histories that a universe can have. Following an initial spurt of cosmic inflation from size zero, these universes steadily expand according to Einstein’s theory of gravity and space-time. Weirder expansion histories, like football-shaped universes or caterpillar-like ones, mostly cancel out in the quantum calculation.

One of the two classical solutions resembles our universe. On large scales, it’s smooth and randomly dappled with energy, due to quantum fluctuations during inflation. As in the real universe, density differences between regions form a bell curve around zero. If this possible solution does indeed dominate the wave function for minisuperspace, it becomes plausible to imagine that a far more detailed and exact version of the no-boundary wave function might serve as a viable cosmological model of the real universe.

The other potentially dominant universe shape is nothing like reality. As it widens, the energy infusing it varies more and more extremely, creating enormous density differences from one place to the next that gravity steadily worsens. Density variations form an inverted bell curve, where differences between regions approach not zero, but infinity. If this is the dominant term in the no-boundary wave function for minisuperspace, then the Hartle-Hawking proposal would seem to be wrong.

The two dominant expansion histories present a choice in how the path integral should be done. If the dominant histories are two locations on a map, megacities in the realm of all possible quantum mechanical universes, the question is which path we should take through the terrain. Which dominant expansion history, and there can only be one, should our “contour of integration” pick up? Researchers have forked down different paths.

In their 2017 paper, Turok, Feldbrugge and Lehners took a path through the garden of possible expansion histories that led to the second dominant solution. In their view, the only sensible contour is one that scans through real values (as opposed to imaginary values, which involve the square roots of negative numbers) for a variable called “lapse.” Lapse is essentially the height of each possible shuttlecock universe — the distance it takes to reach a certain diameter. Lacking a causal element, lapse is not quite our usual notion of time. Yet Turok and colleagues argue partly on the grounds of causality that only real values of lapse make physical sense. And summing over universes with real values of lapse leads to the wildly fluctuating, physically nonsensical solution.

“People place huge faith in Stephen’s intuition,” Turok said by phone. “For good reason — I mean, he probably had the best intuition of anyone on these topics. But he wasn’t always right.”

Imaginary Universes

Jonathan Halliwell, a physicist at Imperial College London, has studied the no-boundary proposal since he was Hawking’s student in the 1980s. He and Hartle analyzed the issue of the contour of integration in 1990. In their view, as well as Hertog’s, and apparently Hawking’s, the contour is not fundamental, but rather a mathematical tool that can be placed to greatest advantage. It’s similar to how the trajectory of a planet around the sun can be expressed mathematically as a series of angles, as a series of times, or in terms of any of several other convenient parameters. “You can do that parameterization in many different ways, but none of them are any more physical than another one,” Halliwell said.

He and his colleagues argue that, in the minisuperspace case, only contours that pick up the good expansion history make sense. Quantum mechanics requires probabilities to add to 1, or be “normalizable,” but the wildly fluctuating universe that Turok’s team landed on is not. That solution is nonsensical, plagued by infinities and disallowed by quantum laws — obvious signs, according to no-boundary’s defenders, to walk the other way.

It’s true that contours passing through the good solution sum up possible universes with imaginary values for their lapse variables. But apart from Turok and company, few people think that’s a problem. Imaginary numbers pervade quantum mechanics. To team Hartle-Hawking, the critics are invoking a false notion of causality in demanding that lapse be real. “That’s a principle which is not written in the stars, and which we profoundly disagree with,” Hertog said.

According to Hertog, Hawking seldom mentioned the path integral formulation of the no-boundary wave function in his later years, partly because of the ambiguity around the choice of contour. He regarded the normalizable expansion history, which the path integral had merely helped uncover, as the solution to a more fundamental equation about the universe posed in the 1960s by the physicists John Wheeler and Bryce DeWitt. Wheeler and DeWitt — after mulling over the issue during a layover at Raleigh-Durham International — argued that the wave function of the universe, whatever it is, cannot depend on time, since there is no external clock by which to measure it. And thus the amount of energy in the universe, when you add up the positive and negative contributions of matter and gravity, must stay at zero forever. The no-boundary wave function satisfies the Wheeler-DeWitt equation for minisuperspace.  
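In schematic form (my own gloss in standard notation, not a quotation from the article), the Wheeler-DeWitt constraint can be written as

$\hat{H}\,\Psi[h_{ij}, \phi] = 0,$

where $\hat{H}$ is the total Hamiltonian operator for the gravitational field $h_{ij}$ and the matter fields $\phi$. Because the wave function is annihilated by the Hamiltonian, it carries no dependence on an external time parameter, and the positive and negative energy contributions cancel, as described above.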

In the final years of his life, to better understand the wave function more generally, Hawking and his collaborators started applying holography — a blockbuster new approach that treats space-time as a hologram. Hawking sought a holographic description of a shuttlecock-shaped universe, in which the geometry of the entire past would project off of the present.

That effort is continuing in Hawking’s absence. But Turok sees this shift in emphasis as changing the rules. In backing away from the path integral formulation, he says, proponents of the no-boundary idea have made it ill-defined. What they’re studying is no longer Hartle-Hawking, in his opinion — though Hartle himself disagrees.

For the past year, Turok and his Perimeter Institute colleagues Latham Boyle and Kieran Finn have been developing a new cosmological model that has much in common with the no-boundary proposal. But instead of one shuttlecock, it envisions two, arranged cork to cork in a sort of hourglass figure with time flowing in both directions. While the model is not yet developed enough to make predictions, its charm lies in the way its lobes realize CPT symmetry, a seemingly fundamental mirror in nature that simultaneously reflects matter and antimatter, left and right, and forward and backward in time. One disadvantage is that the universe’s mirror-image lobes meet at a singularity, a pinch in space-time that requires the unknown quantum theory of gravity to understand. Boyle, Finn and Turok take a stab at the singularity, but such an attempt is inherently speculative.

There has also been a revival of interest in the “tunneling proposal,” an alternative way that the universe might have arisen from nothing, conceived in the ’80s independently by the Russian-American cosmologists Alexander Vilenkin and Andrei Linde. The proposal, which differs from the no-boundary wave function primarily by way of a minus sign, casts the birth of the universe as a quantum mechanical “tunneling” event, similar to when a particle pops up beyond a barrier in a quantum mechanical experiment.

Questions abound about how the various proposals intersect with anthropic reasoning and the infamous multiverse idea. The no-boundary wave function, for instance, favors empty universes, whereas significant matter and energy are needed to power hugeness and complexity. Hawking argued that the vast spread of possible universes permitted by the wave function must all be realized in some larger multiverse, within which only complex universes like ours will have inhabitants capable of making observations. (The recent debate concerns whether these complex, habitable universes will be smooth or wildly fluctuating.) An advantage of the tunneling proposal is that it favors matter- and energy-filled universes like ours without resorting to anthropic reasoning — though universes that tunnel into existence may have other problems.

No matter how things go, perhaps we’ll be left with some essence of the picture Hawking first painted at the Pontifical Academy of Sciences 38 years ago. Or perhaps, instead of a South Pole-like non-beginning, the universe emerged from a singularity after all, demanding a different kind of wave function altogether. Either way, the pursuit will continue. “If we are talking about a quantum mechanical theory, what else is there to find other than the wave function?” asked Juan Maldacena, an eminent theoretical physicist at the Institute for Advanced Study in Princeton, New Jersey, who has mostly stayed out of the recent fray. The question of the wave function of the universe “is the right kind of question to ask,” said Maldacena, who, incidentally, is a member of the Pontifical Academy. “Whether we are finding the right wave function, or how we should think about the wave function — it’s less clear.”

Correction: This article was revised on June 6, 2019, to list Latham Boyle and Kieran Finn as co-developers of the CPT-symmetric universe idea.




When quantum mechanics was first developed a century ago as a theory for understanding the atomic-scale world, one of its key concepts was so radical, bold and counter-intuitive that it passed into popular language: the “quantum leap.” Purists might object that the common habit of applying this term to a big change misses the point that jumps between two quantum states are typically tiny, which is precisely why they weren’t noticed sooner. But the real point is that they’re sudden. So sudden, in fact, that many of the pioneers of quantum mechanics assumed they were instantaneous.

A new experiment shows that they aren’t. By making a kind of high-speed movie of a quantum leap, the work reveals that the process is as gradual as the melting of a snowman in the sun. “If we can measure a quantum jump fast and efficiently enough,” said Michel Devoret of Yale University, “it is actually a continuous process.” The study, which was led by Zlatko Minev, a graduate student in Devoret’s lab, was published on Monday in Nature. Already, colleagues are excited. “This is really a fantastic experiment,” said the physicist William Oliver of the Massachusetts Institute of Technology, who wasn’t involved in the work. “Really amazing.”

But there’s more. With their high-speed monitoring system, the researchers could spot when a quantum jump was about to appear, “catch” it halfway through, and reverse it, sending the system back to the state in which it started. In this way, what seemed to the quantum pioneers to be unavoidable randomness in the physical world is now shown to be amenable to control. We can take charge of the quantum.

All Too Random

The abruptness of quantum jumps was a central pillar of the way quantum theory was formulated by Niels Bohr, Werner Heisenberg and their colleagues in the mid-1920s, in a picture now commonly called the Copenhagen interpretation. Bohr had argued earlier that the energy states of electrons in atoms are “quantized”: Only certain energies are available to them, while all those in between are forbidden. He proposed that electrons change their energy by absorbing or emitting quantum particles of light — photons — that have energies matching the gap between permitted electron states. This explained why atoms and molecules absorb and emit very characteristic wavelengths of light — why many copper salts are blue, say, and sodium lamps yellow.
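As a quick numerical illustration of that energy-matching rule (my own example using an approximate textbook value for sodium, not a figure from the article), the emitted wavelength follows from E = hc/λ:

# Minimal sketch of Bohr's relation between an atomic energy gap and the emitted photon.
# The ~2.1 eV gap of sodium's 3p -> 3s transition (approximate textbook value, assumed here)
# recovers the familiar yellow ~589 nm sodium-lamp line.
H_C_EV_NM = 1239.84          # h*c expressed in eV*nm
sodium_gap_ev = 2.105        # approximate 3p -> 3s energy difference

wavelength_nm = H_C_EV_NM / sodium_gap_ev
print(f"emitted wavelength: {wavelength_nm:.0f} nm")  # ~589 nm, i.e. yellow light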

Bohr and Heisenberg began to develop a mathematical theory of these quantum phenomena in the 1920s. Heisenberg’s quantum mechanics enumerated all the allowed quantum states, and implicitly assumed that jumps between them are instant — discontinuous, as mathematicians would say. “The notion of instantaneous quantum jumps … became a foundational notion in the Copenhagen interpretation,” historian of science Mara Beller has written.

Another of the architects of quantum mechanics, the Austrian physicist Erwin Schrödinger, hated that idea. He devised what seemed at first to be an alternative to Heisenberg’s math of discrete quantum states and instant jumps between them. Schrödinger’s theory represented quantum particles in terms of wavelike entities called wave functions, which changed only smoothly and continuously over time, like gentle undulations on the open sea. Things in the real world don’t switch suddenly, in zero time, Schrödinger thought — discontinuous “quantum jumps” were just a figment of the mind. In a 1952 paper called “Are there quantum jumps?,” Schrödinger answered with a firm “no,” his irritation all too evident in the way he called them “quantum jerks.”

The argument wasn’t just about Schrödinger’s discomfort with sudden change. The problem with a quantum jump was also that it was said to just happen at a random moment — with nothing to say why that particular moment. It was thus an effect without a cause, an instance of apparent randomness inserted into the heart of nature. Schrödinger and his close friend Albert Einstein could not accept that chance and unpredictability reigned at the most fundamental level of reality. According to the German physicist Max Born, the whole controversy was therefore “not so much an internal matter of physics, as one of its relation to philosophy and human knowledge in general.” In other words, there’s a lot riding on the reality (or not) of quantum jumps.

Seeing Without Looking

To probe further, we need to see quantum jumps one at a time. In 1986, three teams of researchers reported them happening in individual atoms suspended in space by electromagnetic fields. The atoms flipped, at random moments, between a “bright” state, where they could emit a photon of light, and a “dark” state that did not emit, remaining in one state or the other for periods of between a few tenths of a second and a few seconds before jumping again. Since then, such jumps have been seen in various systems, ranging from photons switching between quantum states to atoms in solid materials jumping between quantized magnetic states. In 2007 a team in France reported jumps that correspond to what they called “the birth, life and death of individual photons.”

In these experiments the jumps indeed looked abrupt and random — there was no telling, as the quantum system was monitored, when they would happen, nor any detailed picture of what a jump looked like. The Yale team’s setup, by contrast, allowed them to anticipate when a jump was coming, then zoom in close to examine it. The key to the experiment is the ability to collect just about all of the available information about it, so that none leaks away into the environment before it can be measured. Only then can they follow single jumps in such detail.

The quantum systems the researchers used are much larger than atoms, consisting of wires made from a superconducting material — sometimes called “artificial atoms” because they have discrete quantum energy states analogous to the electron states in real atoms. Jumps between the energy states can be induced by absorbing or emitting a photon, just as they are for electrons in atoms.

Devoret and colleagues wanted to watch a single artificial atom jump between its lowest-energy (ground) state and an energetically excited state. But they couldn’t monitor that transition directly, because making a measurement on a quantum system destroys the coherence of the wave function — its smooth wavelike behavior  — on which quantum behavior depends. To watch the quantum jump, the researchers had to retain this coherence. Otherwise they’d “collapse” the wave function, which would place the artificial atom in one state or the other. This is the problem famously exemplified by Schrödinger’s cat, which is allegedly placed in a coherent quantum “superposition” of live and dead states but becomes only one or the other when observed.

To get around this problem, Devoret and colleagues employ a clever trick involving a second excited state. The system can reach this second state from the ground state by absorbing a photon of a different energy. The researchers probe the system in a way that only ever tells them whether the system is in this second “bright” state, so named because it’s the one that can be seen. The state to and from which the researchers are actually looking for quantum jumps is, meanwhile, the “dark” state — because it remains hidden from direct view.

The researchers placed the superconducting circuit in an optical cavity (a chamber in which photons of the right wavelength can bounce around) so that, if the system is in the bright state, the way that light scatters in the cavity changes. Every time the bright state decays by emission of a photon, the detector gives off a signal akin to a Geiger counter’s “click.”

The key here, said Oliver, is that the measurement provides information about the state of the system without interrogating that state directly. In effect, it asks whether the system is in, or is not in, the ground and dark states collectively. That ambiguity is crucial for maintaining quantum coherence during a jump between these two states. In this respect, said Oliver, the scheme that the Yale team has used is closely related to those employed for error correction in quantum computers. There, too, it’s necessary to get information about quantum bits without destroying the coherence on which the quantum computation relies. Again, this is done by not looking directly at the quantum bit in question but probing an auxiliary state coupled to it.

The strategy reveals that quantum measurement is not about the physical perturbation induced by the probe but about what you know (and what you leave unknown) as a result. “Absence of an event can bring as much information as its presence,” said Devoret. He compares it to the Sherlock Holmes story in which the detective infers a vital clue from the “curious incident” in which a dog did not do anything in the night. Borrowing from a different (but often confused) dog-related Holmes story, Devoret calls it “Baskerville’s Hound meets Schrödinger’s Cat.”

To Catch a Jump

The Yale team saw a series of clicks from the detector, each signifying a decay of the bright state, arriving typically every few microseconds. This stream of clicks was interrupted approximately every few hundred microseconds, apparently at random, by a hiatus in which there were no clicks. Then after a period of typically 100 microseconds or so, the clicks resumed. During that silent time, the system had presumably undergone a transition to the dark state, since that’s the only thing that can prevent flipping back and forth between the ground and bright states.

So here in these switches from “click” to “no-click” states are the individual quantum jumps — just like those seen in the earlier experiments on trapped atoms and the like. However, in this case Devoret and colleagues could see something new.

Before each jump to the dark state, there would typically be a short spell where the clicks seemed suspended: a pause that acted as a harbinger of the impending jump. “As soon as the length of a no-click period significantly exceeds the typical time between two clicks, you have a pretty good warning that the jump is about to occur,” said Devoret.
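A minimal sketch of that warning criterion (my own toy example with hand-picked numbers, not the Yale team's analysis): clicks arrive with some typical spacing, and a no-click interval much longer than that spacing is flagged as a likely jump in progress.

# Toy illustration: bright-state clicks arrive every few microseconds; a jump to the dark
# state silences them. Flag a warning once a no-click gap greatly exceeds the typical spacing.
mean_click_gap_us = 3.0      # assumed typical spacing between detector clicks
warning_factor = 5.0         # hypothetical threshold: 5x the typical spacing

# Hand-picked inter-click gaps; the final long silence stands in for a jump to the dark state.
gaps_us = [2.1, 3.8, 2.7, 4.4, 1.9, 3.3, 2.5, 120.0]

elapsed = 0.0
for gap in gaps_us:
    if gap > warning_factor * mean_click_gap_us:
        print(f"after {elapsed:.1f} us of steady clicks, a no-click period exceeding "
              f"{warning_factor * mean_click_gap_us:.0f} us signals a likely jump to the dark state")
        break
    elapsed += gap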

That warning allowed the researchers to study the jump in greater detail. When they saw this brief pause, they switched off the input of photons driving the transitions. Surprisingly, the transition to the dark state still happened even without photons driving it — it is as if, by the time the brief pause sets in, the fate is already fixed. So although the jump itself comes at a random time, there is also something deterministic in its approach.

With the photons turned off, the researchers zoomed in on the jump with fine-grained time resolution to see it unfold. Does it happen instantaneously — the sudden quantum jump of Bohr and Heisenberg? Or does it happen smoothly, as Schrödinger insisted it must? And if so, how?

The team found that jumps are in fact gradual. That’s because, even though a direct observation could reveal the system only as being in one state or another, during a quantum jump the system is in a superposition, or mixture, of these two end states. As the jump progresses, a direct measurement would be increasingly likely to yield the final rather than the initial state. It’s a bit like the way our decisions may evolve over time. You can only either stay at a party or leave it — it’s a binary choice — but as the evening wears on and you get tired, the question “Are you staying or leaving?” becomes increasingly likely to get the answer “I’m leaving.”

The techniques developed by the Yale team reveal the changing mindset of a system during a quantum jump. Using a method called tomographic reconstruction, the researchers could figure out the relative weightings of the dark and ground states in the superposition. They saw these weights change gradually over a period of a few microseconds. That’s pretty fast, but it’s certainly not instantaneous.
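To picture what gradually changing weights mean (a schematic sketch of my own, assuming a smooth coherent rotation and a few-microsecond duration, not the paper's reconstructed trajectory), one can write the mid-jump state as a superposition and watch the two populations trade off:

import numpy as np

# Mid-jump the system is in a superposition a(t)|ground> + b(t)|dark>; the weights |a|^2 and
# |b|^2 trade off continuously rather than switching instantaneously.
t_us = np.linspace(0.0, 4.0, 9)         # time during the jump, in microseconds (assumed scale)
theta = (np.pi / 2) * t_us / t_us[-1]   # rotation angle grows smoothly from 0 to pi/2

p_ground = np.cos(theta) ** 2
p_dark = np.sin(theta) ** 2
for t, pg, pd in zip(t_us, p_ground, p_dark):
    print(f"t = {t:.1f} us: P(ground) = {pg:.2f}, P(dark) = {pd:.2f}")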

What’s more, this electronic system is so fast that the researchers could “catch” the switch between the two states as it is happening, then reverse it by sending a pulse of photons into the cavity to boost the system back to the dark state. They can persuade the system to change its mind and stay at the party after all.

Flash of Insight

The experiment shows that quantum jumps “are indeed not instantaneous if we look closely enough,” said Oliver, “but are coherent processes”: real physical events that unfold over time.

The gradualness of the “jump” is just what is predicted by a form of quantum theory called quantum trajectories theory, which can describe individual events like this. “It is reassuring that the theory matches perfectly with what is seen,” said David DiVincenzo, an expert in quantum information at Aachen University in Germany, “but it’s a subtle theory, and we are far from having gotten our heads completely around it.”

The possibility of predicting quantum jumps just before they occur, said Devoret, makes them somewhat like volcanic eruptions. Each eruption happens unpredictably, but some big ones can be anticipated by watching for the atypically quiet period that precedes them. “To the best of our knowledge, this precursory signal has not been proposed or measured before,” he said.

Devoret said that an ability to spot precursors to quantum jumps might find applications in quantum sensing technologies. For example, “in atomic clock measurements, one wants to synchronize the clock to the transition frequency of an atom, which serves as a reference,” he said. But if you can detect right at the start if the transition is about to happen, rather than having to wait for it to be completed, the synchronization can be faster and therefore more precise in the long run.

DiVincenzo thinks that the work might also find applications in error correction for quantum computing, although he sees that as “quite far down the line.” To achieve the level of control needed for dealing with such errors, though, will require this kind of exhaustive harvesting of measurement data — rather like the data-intensive situation in particle physics, said DiVincenzo.

The real value of the result is not, though, in any practical benefits; it’s a matter of what we learn about the workings of the quantum world. Yes, it is shot through with randomness — but no, it is not punctuated by instantaneous jerks. Schrödinger, aptly enough, was both right and wrong at the same time.



Gryb, Sean and Palacios, Patricia and Thebault, Karim P Y (2019) On the Universality of Hawking Radiation. [Preprint]

Nature, Published online: 03 June 2019; doi:10.1038/s41586-019-1287-z

Experiment overturns Bohr’s view of quantum jumps, demonstrating that they possess a degree of predictability and, when completed, are continuous, coherent and even deterministic.
Making a Difference: Essays on the Philosophy of Causation. Edited by Helen Beebee, Christopher Hitchcock and Huw Price. Oxford University Press, 2017. xii + 336 pp.

Authors: Steven B. Giddings, Seth Koren, Gabriel Treviño

Two new observational windows have been opened to strong gravitational physics: gravitational waves, and very long baseline interferometry. This suggests observational searches for new phenomena in this regime, and in particular for those necessary to make black hole evolution consistent with quantum mechanics. We describe possible features of "compact quantum objects" that replace classical black holes in a consistent quantum theory, and approaches to observational tests for these using gravitational waves. This is an example of a more general problem of finding consistent descriptions of deviations from general relativity, which can be tested via gravitational wave detection. Simple models for compact modifications to classical black holes are described via an effective stress tensor, possibly with an effective equation of state. A general discussion is given of possible observational signatures, and of their dependence on properties of the colliding objects. The possibility that departures from classical behavior are restricted to the near-horizon regime raises the question of whether these will be obscured in gravitational wave signals, due to their mutual interaction in a binary coalescence being deep in the mutual gravitational well. Numerical simulation with such simple models will be useful to clarify the sensitivity of gravitational wave observation to such highly compact departures from classical black holes.

Authors: Andreas Aste

This paper addresses the question why quantum mechanics is formulated in a unitary Hilbert space, i.e. in a manifestly complex setting. Investigating the linear dynamics of real quantum theory in a finite-dimensional Euclidean Hilbert space hints at the emergence of a complex structure. A widespread misconception concerning the measurement process in quantum mechanics and the hermiticity of observables is briefly discussed.

Authors: Peter Holland

We develop a trajectory construction of solutions to the massless wave equation in n+1 dimensions and hence show that the quantum state of a massive relativistic system in 3+1 dimensions may be represented by a stand-alone four-dimensional congruence comprising a continuum of 3-trajectories coupled to an internal scalar time coordinate. A real Klein-Gordon amplitude is the current density generated by the temporal gradient of the internal time. Complex amplitudes are generated by a two-phase flow. The Lorentz covariance of the trajectory model is established.

Authors: Michael K.-H. Kiessling

This contribution inquires into Clausius' proposal that "the entropy of the world tends to a maximum." The question is raised whether the entropy of `the world' actually does have a maximum; and if the answer is "Yes!," what such states of maximum entropy look like, and if the answer is "No!," what this could entail for the fate of the universe. Following R. Penrose, `the world' is modelled by a closed Friedman--Lemaitre type universe, in which a three-dimensional spherical `space' is filled with `matter' consisting of $N$ point particles, their large-scale distribution being influenced by their own gravity. As `entropy of matter' the Boltzmann entropy for a (semi-)classical macrostate, and Boltzmann's ergodic ensemble formulation of it for an isolated thermal equilibrium state, are studied. Since the notion of a Boltzmann entropy is not restricted to classical non-relativistic physics, the inquiry will take into account quantum theory as well as relativity theory; we also consider black hole entropy. Model universes having a maximum entropy state and those which don't will be encountered. It is found that the answer to our maximum entropy question is not at all straightforward at the general-relativistic level. In particular, it is shown that the increase in Bekenstein--Hawking entropy of general-relativistic black holes does not always compensate for the Boltzmann entropy of a piece of matter swallowed by a black hole.

Hetzroni, Guy (2019) Gauge and Ghosts. The British Journal for the Philosophy of Science. ISSN 1464-3537
Rivat, Sébastien (2019) Renormalization Scrutinized. Studies in History and Philosophy of Modern Physics. ISSN 1355-2198
Rédei, Miklós (2019) On the tension between physics and mathematics. [Preprint]
Campbell, Douglas Ian and Yang, Yi (2019) Does the Solar System Compute the Laws of Motion? [Preprint]

Authors: Steven B. Giddings

The impressive images from the Event Horizon Telescope sharpen the conflict between our observations of gravitational phenomena and the principles of quantum mechanics. Two related scenarios for reconciling quantum mechanics with the existence of black hole-like objects, with "minimal" departure from general relativity and local quantum field theory, have been explored; one of these could produce signatures visible to EHT observations. A specific target is temporal variability of images, with a characteristic time scale determined by the classical black hole radius. The absence of evidence for such variability in the initial observational span of seven days is not expected to strongly constrain such variability. Theoretical and observational next steps towards investigating such scenarios are outlined.

Authors: Samir D. Mathur

The vacuum must contain virtual fluctuations of black hole microstates for each mass $M$. We observe that the expected suppression for $M\gg m_p$ is counteracted by the large number $Exp[S_{bek}]$ of such states. From string theory we learn that these microstates are extended objects that are resistant to compression. We argue that recognizing this `virtual extended compression-resistant' component of the gravitational vacuum is crucial for understanding gravitational physics. Remarkably, such virtual excitations have no significant effect for observable systems like stars, but they resolve two important problems: (a) gravitational collapse is halted outside the horizon radius, removing the information paradox; (b) spacetime acquires a `stiffness' against the curving effects of vacuum energy; this ameliorates the cosmological constant problem posed by the existence of a Planck scale $\Lambda$.

Authors: Florio M. Ciaglia, Alberto Ibort, Giuseppe Marmo

A new picture of Quantum Mechanics based on the theory of groupoids is presented. This picture provides the mathematical background for Schwinger's algebra of selective measurements and helps to understand its scope and eventual applications. In this first paper, the kinematical background is described using elementary notions from category theory, in particular the notion of 2-groupoids as well as their representations. Some basic results are presented, and the relation with the standard Dirac-Schr\"odinger and Born-Jordan-Heisenberg pictures is succinctly discussed.

Authors: F. Laloë

We discuss a model where a spontaneous quantum collapse is induced by the gravitational interaction, treated classically. Its dynamics couples the standard wave function of a system with the Bohmian positions of its particles, which are considered as the only source of the gravitational attraction. The collapse is obtained by adding a small imaginary component to the gravitational coupling. It predicts extremely small perturbations of microscopic systems, but very fast collapse of QSMDS (quantum superpositions of macroscopically distinct quantum states) of a solid object, varying as the fifth power of its size. The model does not require adding any dimensional constant to those of standard physics.

Authors: Klaus Renziehausen, Ingo Barth

Bohm developed Bohmian mechanics (BM), in which the Schr\"odinger equation is transformed into two differential equations: a continuity equation and an equation of motion similar to the Newtonian equation of motion. This transformation can be executed both for single-particle systems and for many-particle systems. Later, Kuzmenkov and Maksimov used basic quantum mechanics for the derivation of many-particle quantum hydrodynamics (MPQHD), including one differential equation for the mass balance and two differential equations for the momentum balance, and we extended their analysis in previous work [K. Renziehausen, I. Barth, Prog. Theor. Exp. Phys. 2018, 013A05 (2018)] to the case that the particle ensemble consists of different sorts of particles. The purpose of this paper is to show how the differential equations of MPQHD can be derived for such a particle ensemble with the differential equations of BM as a starting point. Moreover, our discussion clarifies that the differential equations of MPQHD are more suitable for an analysis of many-particle systems than the differential equations of BM, because the differential equations of MPQHD depend on a single position vector only, while the differential equations of BM depend on the complete set of all particle coordinates.

Gao, Shan (2019) Are there many worlds? [Preprint]
Halvorson, Hans (2019) There is no invariant, four-dimensional stuff. [Preprint]
Esfeld, Michael (2019) From the measurement problem to the primitive ontology programme. [Preprint]

Author(s): Lev Vaidman

Counterfactual communication, i.e., a communication without particles traveling in the transmission channel, is a bizarre quantum effect. Starting from interaction-free measurements many protocols achieving various tasks from counterfactual cryptography to counterfactual transfer of quantum states w...


[Phys. Rev. A 99, 052127] Published Wed May 29, 2019

Author(s): Andrew Lucas

There may be a universal bound on the dissipative timescale in a many-body quantum system for the decay of a small operator into a combination of large operators.


[Phys. Rev. Lett. 122, 216601] Published Wed May 29, 2019

Nature, Published online: 29 May 2019; doi:10.1038/d41586-019-01592-x

It is extremely difficult to observe the radiation that is thought to be emitted by black holes. The properties of this radiation have now been analysed using an analogue black hole comprising a system of ultracold atoms.

Author(s): Siddhant Das, Markus Nöth, and Detlef Dürr

It is well known that orthodox quantum mechanics does not make unambiguous predictions for the statistics in arrival time (or time-of-flight) experiments. Bohmian mechanics (or de Broglie–Bohm theory) offers a distinct conceptual advantage in this regard, owing to the well-defined concepts of point ...


[Phys. Rev. A 99, 052124] Published Tue May 28, 2019

Author(s): Paul Boes, Jens Eisert, Rodrigo Gallego, Markus P. Müller, and Henrik Wilming

The von Neumann entropy is a key quantity in quantum information theory and, roughly speaking, quantifies the amount of quantum information contained in a state when many identical and independent (i.i.d.) copies of the state are available, in a regime that is often referred to as being asymptotic. ...


[Phys. Rev. Lett. 122, 210402] Published Tue May 28, 2019

Author(s): Philippe Faist, Mario Berta, and Fernando Brandão

Thermodynamics imposes restrictions on what state transformations are possible. In the macroscopic limit of asymptotically many independent copies of a state—as for instance in the case of an ideal gas—the possible transformations become reversible and are fully characterized by the free energy. In ...


[Phys. Rev. Lett. 122, 200601] Published Fri May 24, 2019

Author(s): Marie Ioannou, Jonatan Bohr Brask, and Nicolas Brunner

Quantum theory allows for randomness generation in a device-independent setting, where no detailed description of the experimental device is required. Here we derive a general upper bound on the amount of randomness that can be certified in such a setting. Our bound applies to any black-box scenario...
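For orientation, certified randomness in this setting is conventionally quantified by the min-entropy of the output distribution (a standard measure, not introduced by this paper):

H_{\min}(X) = -\log_2 \max_x p(x),

i.e., the number of nearly uniform random bits extractable per run is governed by how large the most probable outcome can be under any quantum strategy compatible with the observed statistics.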


[Phys. Rev. A 99, 052338] Published Thu May 23, 2019

Author(s): Dmitry A. Abanin, Ehud Altman, Immanuel Bloch, and Maksym Serbyn

The route of a physical system toward equilibrium and thermalization has been the subject of discussion and controversy since the time of Boltzmann. This Colloquium reviews the recent progress in understanding many-body localization, a phase of matter in which quantum mechanics and disorder conspire to prohibit thermalization altogether. Many new phenomena emerge in lieu of conventional statistical mechanics and may be observed in systems of ultracold atoms, superconducting qubits, and certain quantum materials.


[Rev. Mod. Phys. 91, 021001] Published Wed May 22, 2019

Author(s): Askery Canabarro, Samuraí Brito, and Rafael Chaves

The ability to witness nonlocal correlations lies at the core of foundational aspects of quantum mechanics and its application in the processing of information. Commonly, this is achieved via the violation of Bell inequalities. Unfortunately, however, their systematic derivation quickly becomes unfe...


[Phys. Rev. Lett. 122, 200401] Published Wed May 22, 2019

Publication date: Available online 17 May 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Adam Koberinski

Abstract

In this paper I will focus on the case of the discovery of parity nonconservation in weak interactions in the period spanning 1947–1957, and the lessons this episode provides for successful theory construction in HEP. I aim to (a) summarize the history into a coherent story for philosophers of science, and (b) use the history as a case study for the epistemological evolution of the understanding of weak interactions in HEP. I conclude with some philosophical lessons regarding theory construction in physics.

Author(s): Daniel Harlow and Hirosi Ooguri

Insights from the AdS/CFT correspondence provide a glimpse of what global kinematical properties of viable quantum theories of gravity might be.


[Phys. Rev. Lett. 122, 191601] Published Fri May 17, 2019

Publication date: Available online 14 May 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Jonathan F. Schonfeld

Abstract

I argue that the marquee characteristics of the quantum-mechanical double-slit experiment (point detection, random distribution, Born rule) can be explained using Schr\"odinger's equation alone, if one takes into account that, for any atom in a detector, there is a small but nonzero gap between its excitation energy and the excitation energies of all other relevant atoms in the detector (isolated-levels assumption). To illustrate the point, I introduce a toy model of a detector. The form of the model follows common practice in quantum optics and cavity QED. Each detector atom can be resonantly excited by the incoming particle, and then emit a detection signature (e.g., a bright flash of light) or dissipate its energy thermally. Different atoms have slightly different resonant energies per the isolated-levels assumption, and the projectile preferentially excites the atom with the closest energy match. The toy model makes it easy to estimate the probability that any atom is resonantly excited, and also the probability that a detection signature is produced before being overtaken by thermal dissipation. The end-to-end detection probability is the product of these two probabilities, and is proportional to the absolute square of the incoming wavefunction at the atom in question, i.e., the Born rule. I consider how closely a published neutron interference experiment conforms to the picture developed here; I show how this paper's analysis steers clear of creating a scenario with local hidden variables; I show how the analysis steers clear of the irreversibility implicit in the projection postulate; and I discuss possible experimental tests of this paper's ideas. Hopefully, this is a significant step toward realizing the program of solving the measurement problem within unitary quantum mechanics envisioned by Landsman, among others.
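Schematically, the final step of that argument has the following form (my paraphrase, with hypothetical symbols): if P_exc(k) \propto |\psi(\mathbf{r}_k)|^2 is the probability that detector atom k, located at \mathbf{r}_k, is resonantly excited by the incoming wave, and P_sig is the probability, independent of the incoming wavefunction, that an excited atom produces a detection signature before thermal dissipation takes over, then the end-to-end detection probability is

P_det(k) = P_exc(k)\, P_sig \propto |\psi(\mathbf{r}_k)|^2,

which is the Born rule for detection at \mathbf{r}_k.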

Publication date: Available online 11 May 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Katsuaki Higashi

Abstract

According to a conventional view, there exists no common cause model of quantum correlations satisfying locality requirements. Indeed, Bell's inequality is derived from some locality requirements and the assumption that a common cause exists, and the violation of the inequality has been experimentally verified. On the other hand, some researchers have argued that the derivation of the inequality implicitly assumes the existence of a common common cause for multiple correlations, and that this assumption is unreasonably strong. According to their idea, what is necessary for explaining the quantum correlations is a common cause for each correlation. However, Graßhoff et al. showed that when there are three pairs of perfectly correlated events and a common cause of each correlation exists, we cannot construct a common cause model that is consistent with quantum mechanical predictions and also meets several locality requirements. In this paper, first, as a consequence of the fact shown by Graßhoff et al., we will confirm that there exists no local common cause model when a two-particle system is in any maximally entangled state. After that, based on Hardy's famous argument, we will prove that there exists no local common cause model when a two-particle system is in any non-maximally entangled state. Therefore, it will be concluded that for any entangled state, there exists no local common cause model. This reveals that the non-existence of a common cause model satisfying locality is not limited to a particular state like the singlet state.
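For orientation, the locality constraint at issue is the one expressed by the standard CHSH form of Bell's inequality (quoted here as background, not from the paper): any local hidden-variable or local common cause model satisfies

|E(a,b) + E(a,b') + E(a',b) - E(a',b')| \le 2,

whereas a maximally entangled two-qubit state reaches 2\sqrt{2} for suitable measurement settings; the paper's contribution is to extend the exclusion of local common cause models from maximally entangled states to all non-maximally entangled states via a Hardy-type argument.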

Author(s): Eyuri Wakakuwa, Akihito Soeda, and Mio Murao

We prove a trade-off relation between the entanglement cost and classical communication round complexity of a protocol in implementing a class of two-qubit unitary gates by two distant parties, a key subroutine in distributed quantum information processing. The task is analyzed in an information the...


[Phys. Rev. Lett. 122, 190502] Published Thu May 16, 2019

Author(s): F. Laloë

We discuss a model of spontaneous collapse of the quantum state that does not require adding any stochastic processes to the standard dynamics. The additional ingredient with respect to the wave function is a position in the configuration space which drives the collapse in a completely deterministic...


[Phys. Rev. A 99, 052111] Published Tue May 14, 2019

Author(s): Alexander R. H. Smith

We generalize a quantum communication protocol introduced by Bartlett et al. [New J. Phys. 11, 063013 (2009)], in which two parties communicating do not share a classical reference frame, to the case where changes of their reference frames form a one-dimensional noncompact Lie group. Alice sends to ...


[Phys. Rev. A 99, 052315] Published Fri May 10, 2019

Author(s): P.-P. Crépin, C. Christen, R. Guérout, V. V. Nesvizhevsky, A.Yu. Voronin, and S. Reynaud

We propose to use quantum interferences to improve the accuracy of the measurement of the free-fall acceleration \bar{g} of antihydrogen in the gravitational behavior of antihydrogen at rest (GBAR) experiment. This method uses most antiatoms prepared in the experiment and it is simple in its principle, a...


[Phys. Rev. A 99, 042119] Published Mon Apr 22, 2019

Author(s): Florian Fröwis, Matteo Fadel, Philipp Treutlein, Nicolas Gisin, and Nicolas Brunner

The quantum Fisher information (QFI) of certain multipartite entangled quantum states is larger than what is reachable by separable states, providing a metrological advantage. Are these nonclassical correlations strong enough to potentially violate a Bell inequality? Here, we present evidence from t...
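For context, the standard metrological entanglement witness behind this question (a textbook bound, not a result of this paper): for N qubits and a collective spin generator, every separable state obeys

F_Q \le N,

so observing F_Q > N certifies multipartite entanglement; the question addressed here is whether states exhibiting such a quantum Fisher information advantage can also violate a Bell inequality.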


[Phys. Rev. A 99, 040101(R)] Published Wed Apr 17, 2019

Author(s): Krzysztof Ptaszyński and Massimiliano Esposito

We report two results complementing the second law of thermodynamics for Markovian open quantum systems coupled to multiple reservoirs with different temperatures and chemical potentials. First, we derive a nonequilibrium free energy inequality providing an upper bound for a maximum power output, wh...
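For context, the standard second law statement for such setups reads (background only, not the paper's new inequalities): the entropy production rate of a system coupled to reservoirs \nu at temperatures T_\nu is

\dot{\Sigma} = \frac{dS_{\mathrm{sys}}}{dt} - \sum_\nu \frac{\dot{Q}_\nu}{T_\nu} \ge 0,

where \dot{Q}_\nu is the heat current flowing into the system from reservoir \nu; the results described here complement this inequality with further bounds.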


[Phys. Rev. Lett. 122, 150603] Published Tue Apr 16, 2019

Author(s): Andrea Crespi, Francesco V. Pepe, Paolo Facchi, Fabio Sciarrino, Paolo Mataloni, Hiromichi Nakazato, Saverio Pascazio, and Roberto Osellame

The decay of an unstable system is usually described by an exponential law. Quantum mechanics predicts strong deviations of the survival probability from the exponential: Indeed, the decay is initially quadratic, while at very large times it follows a power law, with superimposed oscillations. The l...
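The initial quadratic behavior mentioned here is the standard short-time expansion of the survival probability (a textbook result, not specific to this experiment): for a state |\psi\rangle evolving under a Hamiltonian H,

P(t) = |\langle\psi| e^{-iHt/\hbar} |\psi\rangle|^2 = 1 - \frac{(\Delta H)^2}{\hbar^2}\, t^2 + O(t^4),

where (\Delta H)^2 is the energy variance of the initial state; this is the same quadratic onset that underlies the quantum Zeno effect.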


[Phys. Rev. Lett. 122, 130401] Published Wed Apr 03, 2019

Volume 5, Issue 2, pages 80-97

Aurélien Drezet [Show Biography]

Dr. Aurélien Drezet was born in Metz, France, in 1975. He received his Ph.D. degree in experimental physics from the University Joseph Fourier at Grenoble in 2002, where he worked on nano-optics and near-field scanning optical microscopy. After 6 years as a post-doc in Austria and France, he is currently head of the Nano-Optics and Forces team at Institut Néel in Grenoble (which belongs to the French national scientific research centre, CNRS, and is also associated with the University Joseph Fourier in Grenoble). His activities range from experimental and theoretical plasmonics to near-field scanning optical microscopy in both the classical and quantum regimes. He is also strongly involved in scientific work and discussions concerning quantum foundations in general and Bohmian mechanics in particular.

This is an analysis of the recently published article "Quantum theory cannot consistently describe the use of itself" by D. Frauchiger and R. Renner [1]. Here I decipher the paradox and analyze it from the point of view of the de Broglie-Bohm hidden-variable theory (i.e., Bohmian mechanics). I also analyze the problem from the perspective of the Copenhagen interpretation (i.e., the Bohrian interpretation) and show that both views are self-consistent and do not lead to any contradiction with a 'single-world' description of quantum theory.

Full Text Download (655k)

Volume 5, Issue 2, pages 69-79

Mohammed Sanduk [Show Biography]

Mohammed Sanduk is an Iraqi-born British physicist. He was educated at the University of Baghdad and the University of Manchester. Before his undergraduate studies, he published a book in particle physics entitled "Mesons". Sanduk has worked in industry and academia, and his last post in Iraq was head of the Laser and Opto-electronics Engineering department at Nahrain University in Baghdad. Owing to his interest in the philosophy of science, he was a member of the academic staff of Pontifical Babel College for Philosophy. Sanduk is working with the Department of Chemical and Process Engineering at the University of Surrey. Sanduk is interested in the transport of charged particles, magnetohydrodynamics, and renewable energy technology. In addition, Sanduk is interested in the foundations of quantum mechanics and the philosophy of science & technology.

The imaginary i in the formulation of quantum mechanics is accepted as part of the axioms of the theory, and thus there is no perceived need to explain its origin. Since 2012, in a project outside quantum mechanics, there has been an attempt to complexify a real function and build an analogy for relativistic quantum mechanics. In that theoretical attempt, a partial observation technique is proposed as one of the reasons behind the appearance of the imaginary i. The present article throws light on that attempt at complexification and tries to explain the physical logic behind the complex phase factor. This physical process of partial observation acts as a process of physicalization of a virtual model. According to the positive results of the analogy, the imaginary i that appears in the quantum mechanics formulation may likewise be related to a case of partial observation.

Full Text Download (621k)

Volume 5, Issue 2, pages 51-68

Daniel Shanahan [Show Biography]

Dan Shanahan is an independent researcher with a passion for foundational issues in quantum theory and relativity. Born in Perth, Western Australia, he studied physics at the Universities of NSW and Sydney.

Effects associated in quantum mechanics with a divisible probability wave are explained as physically real consequences of the equal but opposite reaction of the apparatus as a particle is measured. Taking as illustration a Mach-Zehnder interferometer operating by refraction, it is shown that this reaction must comprise a fluctuation in the reradiation field of complementary effect to the changes occurring in the photon as it is projected into one or other path. The evolution of this fluctuation through the experiment will explain the alternative states of the particle discerned in self-interference, while the maintenance of equilibrium in the face of such fluctuations becomes the source of the Born probabilities. In this scheme, the probability wave is a mathematical artifact, epistemic rather than ontic, and akin in this respect to the simplifying constructions of geometrical optics.

Full Text Download (361k)

Volume 5, Issue 2, pages 16-50

Edward J. Gillis [Show Biography]

Ed Gillis received his B.A. in Philosophy from the University of Michigan, and his Ph.D. in Physics from the University of Colorado for research on the relationship between quantum nonlocality and relativity. He has authored several papers on quantum foundations, dealing, in particular, with connections between wave function collapse and elementary processes, how these connections might lead to an explanation of the no-superluminal-signaling principle in fundamental physical terms, and possible tests for collapse. He has also worked as an engineer on the development of sensor systems and control algorithms based on the information provided by those systems.

The assumption that wave function collapse is a real occurrence has very interesting consequences – both experimental and theoretical. Besides predicting observable deviations from linear evolution, it implies that these deviations must originate in nondeterministic effects at the elementary level in order to prevent superluminal signaling, as demonstrated by Gisin. This lack of determinism implies that information cannot be instantiated in a reproducible form in isolated microsystems (as illustrated by the No-cloning theorem). By stipulating that information is a reproducible and referential property of physical systems, one can formulate the no-signaling principle in strictly physical terms as a prohibition of the acquisition of information about spacelike-separated occurrences. This formulation provides a new perspective on the relationship between relativity and spacetime structure, and it imposes tight constraints on the way in which collapse effects are induced. These constraints indicate that wave function collapse results from (presumably small) nondeterministic deviations from linear evolution associated with nonlocally entangling interactions. This hypothesis can be formalized in a stochastic collapse equation and used to assess the feasibility of testing for collapse effects.
Full Text Download (336k)

Author(s): James Klatzow, Jonas N. Becker, Patrick M. Ledingham, Christian Weinzetl, Krzysztof T. Kaczmarek, Dylan J. Saunders, Joshua Nunn, Ian A. Walmsley, Raam Uzdin, and Eilon Poem

Experiments demonstrate a quantum-coherence-induced power increase for quantum heat engines over their classical counterparts.


[Phys. Rev. Lett. 122, 110601] Published Wed Mar 20, 2019

Publication date: Available online 9 March 2019

Source: Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics

Author(s): Tom McLeish, Mark Pexton, Tom Lancaster

Abstract

There has been growing interest in systems in condensed matter physics as a potential source of examples of both epistemic and ontological emergence. One of these case studies is the fractional quantum Hall state (FQHS). In the FQHS a system of electrons displays a type of holism due to a pattern of long-range quantum entanglement that some argue is emergent. Indeed, in general, quantum entanglement is sometimes cited as the best candidate for one form of ontological emergence. In this paper we argue that there are significant formal and physical parallels between the quantum FQHS and classical polymer systems. Neither type of system can be explained simply by considering an aggregation of local microphysical properties alone, since important features of each are globally determined by topological features. As such, we argue that if the FQHS is a case of ontological emergence then it is not due to the quantum nature of the system, and classical polymer systems are ontologically emergent as well.