Weekly Papers on Quantum Foundations (25)

Gao, Shan (2019) Quantum theory is incompatible with relativity: A new proof beyond Bell’s theorem and a test of unitary quantum theories. [Preprint]

Authors: Antoine Suarez

Physicians define death as the “irreversible” breakdown of all brain functions, including those of the brain stem. By “irreversible” they mean damage that is beyond the human capacity to restore the patient’s healthy state. Along the same lines, I propose to complete the definition of quantum physics in [1] with Principle D (Detection): “Detection outcomes (like death) are ordinarily irreversible and observer-independent”. It is then argued that this principle excludes generalizing quantum superposition to visible objects bearing observer-dependent outcomes. However, this exclusion is not absolute: it rather means that “Schrödinger’s cat” and “Wigner’s friend” should be considered “miracle” narratives beyond the domain of science.

Authors: Davi Geiger, Zvi M. Kedem

Physical laws for elementary particles can be described by the quantum dynamics equation given a Hamiltonian. The solutions are probability amplitudes in Hilbert space that evolve over time. A probability density function over position and time is given by the squared magnitude of such a probability amplitude. An entropy can be associated with these probability densities, characterizing the position information of a particle. Coherent states are localized wave packets and may describe the spatial distribution of some particle states. We show that, due to a dispersion property of Hamiltonians in quantum physics, the entropy of coherent states increases over time. We investigate a partition of the Hilbert space into four sets based on whether the entropy is (i) increasing but not constant, (ii) decreasing but not constant, (iii) constant, or (iv) oscillating.

We then postulate that the quantum theory of elementary particles is equipped with a law that entropy (weakly) increases in time; thus states in set (ii) are disallowed, and states in set (iv) cannot complete an oscillation period. A key role is played by the conjugate process, which transforms allowed states into disallowed ones, and vice versa.

Then, according to this law, quantum theory is not time reversible unless the state is in partition (iii), e.g., stationary states (eigenstates of the Hamiltonian). This law limits physical scenarios beyond what conservation laws allow, providing causal reasoning by defining an arrow of time.
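
For orientation, a natural candidate for the entropy invoked here (our assumption; the authors’ precise definition may differ) is the differential entropy of the position density,

    S(t) = - \int |\psi(x,t)|^2 \, \ln |\psi(x,t)|^2 \, dx ,

where |\psi(x,t)|^2 is the probability density obtained from the amplitude. The postulated law then amounts to dS/dt \ge 0 for physically allowed states.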

Authors: Liang Liang Sun, Xiang Zhou, Sixia Yu

Finding the physical principles lying behind quantum mechanics is essential for understanding various quantum features, e.g., quantum correlations, in a theory-independent manner. Here we propose such a principle, namely, no disturbance without uncertainty: the disturbance caused by a measurement to a subsequent incompatible measurement is no larger than the uncertainty of the first measurement, equipped with suitable theory-independent measures for disturbance and uncertainty. When applied to local systems in a multipartite scenario, our principle imposes such a strong constraint on non-signaling correlations that quantum correlations can be recovered in many cases: (i) it accounts for the Tsirelson bound; (ii) it provides the tightest boundary so far for a three-parameter family of noisy super-nonlocal boxes; and (iii) it rules out an almost-quantum correlation that all previous principles, as well as the celebrated quantum criterion due to Navascués, Pironio, and Acín, fail to exclude from the set of quantum correlations. Our results pave the way to understanding the nonlocality exhibited in quantum correlations from local principles.
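
For the benchmark in point (i), recall the standard CHSH setting (textbook background, not part of the abstract): with correlators E(a,b) for measurement settings a, a' and b, b',

    S_{CHSH} = E(a,b) + E(a,b') + E(a',b) - E(a',b'),

local hidden-variable models obey |S_{CHSH}| \le 2, quantum mechanics obeys Tsirelson’s bound |S_{CHSH}| \le 2\sqrt{2}, and general non-signaling correlations can reach |S_{CHSH}| = 4. The principle proposed above is meant to pick out the quantum value from within the non-signaling set.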

The universe is kind of an impossible object. It has an inside but no outside; it’s a one-sided coin. This Möbius architecture presents a unique challenge for cosmologists, who find themselves in the awkward position of being stuck inside the very system they’re trying to comprehend.

It’s a situation that Lee Smolin has been thinking about for most of his career. A physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada, Smolin works at the knotty intersection of quantum mechanics, relativity and cosmology. Don’t let his soft voice and quiet demeanor fool you — he’s known as a rebellious thinker and has always followed his own path. In the 1960s Smolin dropped out of high school, played in a rock band called Ideoplastos, and published an underground newspaper. Wanting to build geodesic domes like R. Buckminster Fuller, Smolin taught himself advanced mathematics — the same kind of math, it turned out, that you need to play with Einstein’s equations of general relativity. The moment he realized this was the moment he became a physicist. He studied at Harvard University and took a position at the Institute for Advanced Study in Princeton, New Jersey, eventually becoming a founding faculty member at the Perimeter Institute.

“Perimeter,” in fact, is the perfect word to describe Smolin’s place near the boundary of mainstream physics. When most physicists dived headfirst into string theory, Smolin played a key role in working out the competing theory of loop quantum gravity. When most physicists said that the laws of physics are immutable, he said they evolve according to a kind of cosmic Darwinism. When most physicists said that time is an illusion, Smolin insisted that it’s real.

Smolin often finds himself inspired by conversations with biologists, economists, sculptors, playwrights, musicians and political theorists. But he finds his biggest inspiration, perhaps, in philosophy — particularly in the work of the German philosopher Gottfried Leibniz, active in the 17th and 18th centuries, who along with Isaac Newton invented calculus. Leibniz argued (against Newton) that there’s no fixed backdrop to the universe, no “stuff” of space; space is just a handy way of describing relationships. This relational framework captured Smolin’s imagination, as did Leibniz’s enigmatic text The Monadology, in which Leibniz suggests that the world’s fundamental ingredient is the “monad,” a kind of atom of reality, with each monad representing a unique view of the whole universe. It’s a concept that informs Smolin’s latest work as he attempts to build reality out of viewpoints, each one a partial perspective on a dynamically evolving universe. A universe as seen from the inside.

Quanta Magazine spoke with Smolin about his approach to cosmology and quantum mechanics, which he details in his recent book, Einstein’s Unfinished Revolution. The interview has been condensed and edited for clarity.

You have a slogan: “The first principle of cosmology must be: There is nothing outside the universe.”

In different formulations of the laws of physics, like Newtonian mechanics or quantum mechanics, there is background structure — structure which has to be specified and is fixed. It’s not subject to evolution, it’s not influenced by anything that happens. It’s structure outside the system being modeled. It’s the framework on which we hang observables — the observer, a clock and so forth. The statement that there’s nothing outside the universe — there’s no observer outside the universe — implies that we need a formulation of physics without background structure. All the theories of physics we have, in one way or another, apply only to subsystems of the universe. They don’t apply to the universe as a whole, because they require this background structure.

If we want to make a cosmological theory, to understand nature on the cosmological scale, we have to avoid what the philosopher Roberto Unger and I called “the cosmological fallacy,” the mistaken belief that we can take theories that apply to subsystems and scale them up to the universe as a whole. We need a formulation of dynamics that doesn’t refer to an observer or measuring instrument or anything outside the system. That means we need a different kind of theory.

You’ve recently proposed such a theory — one in which, as you put it, “the history of the universe is constituted of different views of itself.” What does that mean?

It’s a theory about processes, about the sequences and causal relations among things that happen, not the inherent properties of things that are. The fundamental ingredient is what we call an “event.” Events are things that happen at a single place and time; at each event there’s some momentum, energy, charge or other various physical quantity that’s measurable. The event has relations with the rest of the universe, and that set of relations constitutes its “view” of the universe. Rather than describing an isolated system in terms of things that are measured from the outside, we’re taking the universe as constituted of relations among events. The idea is to try to reformulate physics in terms of these views from the inside, what it looks like from inside the universe.

How do you do that?

There are many views, and each one has only partial information about the rest of the universe. We propose as a principle of dynamics that each view should be unique. That idea comes from Leibniz’s principle of the identity of indiscernibles. Two events whose views are exactly mappable onto each other are the same event, by definition. So each view is unique, and you can measure how distinct one is from another by defining a quantity called the “variety.” If you think of a node on a graph, you can go one step out, two steps out, three steps out. Each step gives you a neighborhood — the one-step neighborhood, the two-step neighborhood, the three-step neighborhood. So for any two events you can ask: How many steps do you have to go out until their views diverge? In what neighborhood are they different? The fewer steps you have to go, the more distinguishable the views are from one another. The idea in this theory is that the laws of physics — the dynamics of the system — work to maximize variety. That principle — that nature wants to maximize variety — actually leads, within the framework I’ve been describing, to the Schrödinger equation, and hence to a recovery, in an appropriate limit, of quantum mechanics.
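
As a toy illustration of the “how many steps until the views diverge” idea (our own sketch in Python, not Smolin’s actual definition of variety), one can treat events as nodes of a graph and compare label-free signatures of their k-step neighborhoods:

    def view_signature(adj, node, k):
        """Label-free 'view' from `node` out to radius k: for each distance d,
        the sorted degree sequence of the nodes first reached at that distance."""
        dist = {node: 0}
        frontier = {node}
        sig = []
        for d in range(1, k + 1):
            frontier = {m for n in frontier for m in adj[n] if m not in dist}
            for m in frontier:
                dist[m] = d
            sig.append(tuple(sorted(len(adj[m]) for m in frontier)))
        return tuple(sig)

    def divergence_radius(adj, a, b, max_k=10):
        """Smallest k at which the views from events a and b differ; None if they
        stay indistinguishable out to max_k (Leibniz: then they would count as
        the same event)."""
        for k in range(1, max_k + 1):
            if view_signature(adj, a, k) != view_signature(adj, b, k):
                return k
        return None

    # Toy graph of "events": a path 0-1-2-3-4-5-6.
    path = {i: set() for i in range(7)}
    for i in range(6):
        path[i].add(i + 1)
        path[i + 1].add(i)

    print(divergence_radius(path, 2, 3))  # 2: their views differ two steps out
    print(divergence_radius(path, 1, 5))  # None: mirror-image nodes, identical views

The smaller the divergence radius, the more easily two views are told apart; a variety-like quantity can then be built by summing, over all pairs of events, a decreasing function of this radius, and the proposed dynamics works to maximize it.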

I know from your book that you’re a realist at heart — you believe strongly in a reality independent of our knowledge of it — and therefore, like Einstein, you think quantum mechanics is incomplete. Does this theory of views help complete what you think is missing in quantum theory?

Einstein — as well as someone called Leslie Ballentine — advocated an “ensemble interpretation” of the wave function. The idea was that the wave function describes an ensemble of possible states. But one day, I was sitting in a cafe working and suddenly I thought: What if the ensemble is real? What if, when you have a wave function describing a single water molecule, it’s actually describing the ensemble of every water molecule in the universe?

So whereas normally we would think that there’s one water molecule but an uncertainty of states, you’re saying that the uncertainty of states is actually the ensemble of all the water molecules in the universe?

Yes. They form an ensemble because they have very similar views. They all interact with one another, because the probability of interaction is determined by the similarity of views, not necessarily their proximity in space.

Things don’t have to be near each other to interact?

In this theory, the similarity of views is more fundamental than space. Often, two events have similar views because they’re close in space. If two people stand next to each other, they have very similar, overlapping views of the universe. But two atoms have many fewer relational properties than big, complex objects like people. So two atoms far apart in space can still have very similar views. That means that at the smallest scale, there should be highly nonlocal interactions, which is exactly what you get with entanglement in quantum mechanics. That’s where quantum mechanics comes from, according to the real-ensemble formulation.

It reminds me of a lot of work that’s going on now in physics that’s finding surprising connections between entanglement and the geometry of space-time.

I think a lot of that work is really interesting. The hypothesis that’s motivating it is that entanglement is fundamental in quantum mechanics, and the geometry of space or space-time emerges from structures of entanglement. It’s a very positive development.

You’ve said that these ideas were inspired by Leibniz’s Monadology. Did you just happen to pull out your Monadology and reread it?

I first read Leibniz at the instigation of Julian Barbour, when I was just out of graduate school. First I read the correspondence between Leibniz and Samuel Clarke, who was a follower of Newton, in which Leibniz criticized Newton’s notion of absolute space and absolute time and argued that observables in physics should be relational. They should describe the relations of one system with another, resulting from their interaction. Later I read the Monadology. I read it as a sketch for how to make a background-independent theory of physics. I do look at my copy from time to time. There is a beautiful quote in there, where Leibniz says, “Just as the same city viewed from different directions appears entirely different … there are, as it were, just as many different universes, which are, nevertheless, only perspectives on a single one, corresponding to the different points of view of each monad.” That, to me, evokes why these ideas are very suitable, not just in physics but for a whole range of things from social policy and postmodernism to art to what it feels like to be an individual in a diverse society. But that’s another discussion!

Your work has been very influenced by philosophy. Looking back historically, people like Einstein and Bohr and John Wheeler all took philosophy very seriously; it directly influenced their physics. It seems to be a trait of great physicists and yet —

And also of not-great physicists.

OK, fair! It just seems that it’s become almost taboo to talk about philosophy in physics today. Has that been your experience?

Not at all. Many of the leading theorists in foundational physics — where the goal is to deepen our knowledge of the fundamental laws — know philosophy very well. As an undergraduate at Hampshire College, I did a lot of physics and some philosophy courses. Then when I went to Harvard for graduate school, I intended to do a double Ph.D. in physics and philosophy, but I got disenchanted with philosophy pretty quickly. I mean, the physicists were arrogant enough. But the philosophers even more so.

Back when we had the revolutions in physics in the first part of the 20th century in Europe, people like Einstein, Bohr, Heisenberg, Schrödinger and others were very well educated in philosophy, and it informed their work as physicists. Then there was this pragmatic turn, where the dominant mode of physics became anti-foundational, anti-philosophy.

The historian of physics David Kaiser at MIT has studied this in detail. He studied quantum mechanics textbooks and lecture notes and saw how, through the 1940s into the 1950s, references to philosophy and to foundational issues disappeared from quantum mechanics courses. Freeman Dyson once said, normally the young people are rebels and the old people are the conservatives, but in his generation it was the reverse. The young people didn’t want to hear about messy philosophy or foundational issues, they just wanted to get out and apply quantum mechanics.

This was great for the explosion of applications of quantum mechanics from the 1940s into the 1970s, through the establishment of the Standard Model, condensed matter physics and so forth. But then fundamental physics got stuck, and part of the reason we got stuck is we reached a set of problems on which you can’t make progress with this pragmatic, anti-foundational culture. I should make clear that those fields where you can assume we know the relevant laws, like condensed matter and astrophysics, continue to thrive. But if your goal is to discover new, deeper laws, you need to mix with philosophers again. And it has been happening much more.

When I started mixing with philosophers, there were a few who really knew physics well, but most didn’t. Today, the young people working in philosophy of physics, for the most part, know physics well. The interchange with philosophy is coming back, and I think it’s a good thing.


De Haro, Sebastian (2019) Theoretical Equivalence and Duality. [Preprint]

Authors: Yusef Maleki, Alireza Maleki

Quantum mechanics imposes a fundamental bound on the minimum time required for a quantum system to evolve between two states of interest. This bound sets a limit on the speed of the dynamical evolution of the system, known as the quantum speed limit. We show that black holes can drastically affect the speed limit of a two-level fermionic quantum system undergoing open quantum dynamics. As we demonstrate, the quantum speed limit can be enhanced in the vicinity of a black hole’s event horizon in the Schwarzschild spacetime.
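
For context, one standard form of the quantum speed limit (textbook background, not from the abstract) is the Mandelstam-Tamm bound on the time needed to evolve between orthogonal states,

    \tau \ge \frac{\pi \hbar}{2 \Delta E},

where \Delta E is the energy uncertainty of the state; the companion Margolus-Levitin bound replaces \Delta E by the mean energy above the ground state. The claim above is that open dynamics near a Schwarzschild horizon can substantially modify bounds of this kind for a two-level system.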

Authors: Matteo Carlesso, Mauro Paternostro

The gap between the predictions of collapse models and those of standard quantum mechanics widens with the complexity of the systems involved. Addressing how this gap scales with the mass or size of the system being investigated paves the way to testing the validity of collapse theories and to identifying the values of the parameters that characterize them. Here, we review the recently proposed non-interferometric approach to testing collapse models, focusing on the opto-mechanical platform.

Authors: Antony Valentini

We compare and contrast two distinct approaches to understanding the Born rule in de Broglie-Bohm pilot-wave theory, one based on dynamical relaxation over time (advocated by this author and collaborators) and the other based on typicality of initial conditions (advocated by the ‘Bohmian mechanics’ school). It is argued that the latter approach is inherently circular and physically misguided. The typicality approach has engendered a deep-seated confusion between contingent and law-like features, leading to misleading claims not only about the Born rule but also about the nature of the wave function. By artificially restricting the theory to equilibrium, the typicality approach has led to further misunderstandings concerning the status of the uncertainty principle, the role of quantum measurement theory, and the kinematics of the theory (including the status of Galilean and Lorentz invariance). The restriction to equilibrium has also made an erroneously-constructed stochastic model of particle creation appear more plausible than it actually is. To avoid needless controversy, we advocate a modest ‘empirical approach’ to the foundations of statistical mechanics. We argue that the existence or otherwise of quantum nonequilibrium in our world is an empirical question to be settled by experiment.

Authors: Jonathan Barrett, Robin Lorenz, Ognyan Oreshkov

It is known that the classical framework of causal models is not general enough to allow for causal reasoning about quantum systems. Efforts have been devoted towards generalization of the classical framework to the quantum case, with the aim of providing a framework in which cause-effect relations between quantum systems, and their connection with empirically observed data, can be rigorously analyzed. Building on the results of Allen et al., Phys. Rev. X 7, 031021 (2017), we present a fully-fledged framework of quantum causal models. The approach situates causal relations in unitary transformations, in analogy with an approach to classical causal models that assumes underlying determinism and situates causal relations in functional dependences between variables. We show that for any quantum causal model, there exists a corresponding unitary circuit, with appropriate causal structure, such that the quantum causal model is returned when marginalising over latent systems, and vice versa. We introduce an intrinsically quantum notion that plays a role analogous to the conditional independence of classical variables, and (generalizing a central theorem of the classical framework) show that d-separation is sound and complete in the quantum case. We present generalizations of the three rules of the classical `do-calculus’, in each case relating a property of the causal structure to a formal property of the quantum process, and to an operational statement concerning the outcomes of interventions. In addition to the results concerning quantum causal models, we introduce and derive similar results for `classical split-node causal models’, which are more closely analogous to quantum causal models than the classical causal models that are usually studied.

Abstract

A physically consistent semi-classical treatment of black holes requires universality arguments to deal with the `trans-Planckian’ problem where quantum spacetime effects appear to be amplified such that they undermine the entire semi-classical modelling framework. We evaluate three families of such arguments in comparison with Wilsonian renormalization group universality arguments found in the context of condensed matter physics. Our analysis is framed by the crucial distinction between robustness and universality. Particular emphasis is placed on the quality whereby the various arguments are underpinned by `integrated’ notions of robustness and universality. Whereas the principal strength of Wilsonian universality arguments can be understood in terms of the presence of such integration, the principal weakness of all three universality arguments for Hawking radiation is its absence.

The flashier fruits of Albert Einstein’s century-old insights are by now deeply embedded in the popular imagination: Black holes, time warps and wormholes show up regularly as plot points in movies, books, TV shows. At the same time, they fuel cutting-edge research, helping physicists pose questions about the nature of space, time, even information itself.

Perhaps ironically, though, what is arguably the most revolutionary part of Einstein’s legacy rarely gets attention. It has none of the splash of gravitational waves, the pull of black holes or even the charm of quarks. But lurking just behind the curtain of all these exotic phenomena is a deceptively simple idea that pulls the levers, shows how the pieces fit together, and lights the path ahead.

The idea is this: Some changes don’t change anything. The most fundamental aspects of nature stay the same even as they seemingly shape-shift in unexpected ways. Einstein’s 1905 papers on relativity led to the unmistakable conclusion, for example, that the relationship between energy and mass is invariant, even though energy and mass themselves can take vastly different forms. Solar energy arrives on Earth and becomes mass in the form of green leaves, creating food we can eat and use as fuel for thought. (“What is this mind of ours: what are these atoms with consciousness?” asked the late Richard Feynman. “Last week’s potatoes!”) That’s the meaning of E = mc². The “c” stands for the speed of light, a very large number, so it doesn’t take much matter to produce an enormous amount of energy; in fact, the sun turns millions of tons of mass into energy each second.
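
The “millions of tons” figure is a quick consequence of E = mc² and the measured solar luminosity (a back-of-the-envelope check added here, not in the original article):

    \frac{dm}{dt} = \frac{L_\odot}{c^2} \approx \frac{3.8 \times 10^{26}\ \mathrm{W}}{(3.0 \times 10^8\ \mathrm{m/s})^2} \approx 4 \times 10^{9}\ \mathrm{kg/s},

roughly four million metric tons of mass radiated away as energy every second.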

This endless morphing of matter into energy (and vice versa) powers the cosmos, matter, life. Yet through it all, the energy-matter content of the universe never changes. It’s strange but true: Matter and energy themselves are less fundamental than the underlying relationships between them.

We tend to think of things, not relationships, as the heart of reality. But most often, the opposite is true. “It’s not the stuff,” said the Brown University physicist Stephon Alexander.

The same is true, Einstein showed, for “stuff” like space and time, seemingly stable, unchangeable aspects of nature; in truth, it’s the relationship between space and time that always stays the same, even as space contracts and time dilates. Like energy and matter, space and time are mutable manifestations of deeper, unshakable foundations: the things that never vary no matter what.

“Einstein’s deep view was that space and time are basically built up by relationships between things happening,” said the physicist Robbert Dijkgraaf, director of the Institute for Advanced Study in Princeton, New Jersey, where Einstein spent his final decades.

The relationship that eventually mattered most to Einstein’s legacy was symmetry. Scientists often describe symmetries as changes that don’t really change anything, differences that don’t make a difference, variations that leave deep relationships invariant. Examples are easy to find in everyday life. You can rotate a snowflake by 60 degrees and it will look the same. You can switch places on a teeter-totter and not upset the balance. More complicated symmetries have led physicists to the discovery of everything from neutrinos to quarks — they even led to Einstein’s own discovery that gravitation is the curvature of space-time, which, we now know, can curl in on itself, pinching off into black holes.

Over the past several decades, some physicists have begun to question whether focusing on symmetry is still as productive as it used to be. New particles predicted by theories based on symmetries haven’t appeared in experiments as hoped, and the Higgs boson that was detected was far too light to fit into any known symmetrical scheme. Symmetry hasn’t yet helped to explain why gravity is so weak, why the vacuum energy is so small, or why dark matter remains transparent.

“There has been, in particle physics, this prejudice that symmetry is at the root of our description of nature,” said the physicist Justin Khoury of the University of Pennsylvania. “That idea has been extremely powerful. But who knows? Maybe we really have to give up on these beautiful and cherished principles that have worked so well. So it’s a very interesting time right now.”

Light

Einstein wasn’t thinking about invariance or symmetry when he wrote his first relativity papers in 1905, but historians speculate that his isolation from the physics community during his employment in the Swiss patent office might have helped him see past the unnecessary trappings people took for granted.

Like other physicists of his time, Einstein was pondering several seemingly unrelated puzzles. James Clerk Maxwell’s equations revealing the intimate connection between electric and magnetic fields looked very different in different frames of reference — whether an observer is moving or at rest. Moreover, the speed at which electromagnetic fields propagated through space almost precisely matched the speed of light repeatedly measured by experiments — a speed that didn’t change no matter what. An observer could be running toward the light or rushing away from it, and the speed didn’t vary.

Einstein connected the dots: The speed of light was a measurable manifestation of the symmetrical relationship between electric and magnetic fields — a more fundamental concept than space itself. Light didn’t need anything to travel through because it was itself electromagnetic fields in motion. The concept of “at rest” — the static “empty space” invented by Isaac Newton — was unnecessary and nonsensical. There was no universal “here” or “now”: Events could appear simultaneous to one observer but not another, and both perspectives would be correct.

Chasing after a light beam produced another curious effect, the subject of Einstein’s second relativity paper, “Does the Inertia of a Body Depend Upon Its Energy Content?” The answer was yes. The faster you chase, the harder it is to go faster. Resistance to change becomes infinite at the speed of light. Since that resistance is inertia, and inertia is a measure of mass, the energy of motion is transformed into mass. “There is no essential distinction between mass and energy,” Einstein wrote.

It took several years for Einstein to accept that space and time are inextricably interwoven threads of a single space-time fabric, impossible to disentangle. “He still wasn’t thinking in a fully unified space-time sort of way,” said David Kaiser, a physicist and historian of science at the Massachusetts Institute of Technology.

Unified space-time is a difficult concept to wrap our minds around. But it begins to make sense if we think about the true meaning of “speed.” The speed of light, like any speed, is a relationship — distance traveled over time. But the speed of light is special because it can’t change; your laser beam won’t advance any faster just because it is shot from a speeding satellite. Measurements of distance and time must therefore change instead, depending on one’s state of motion, leading to effects known as “space contraction” and “time dilation.” The invariant is this: No matter how fast two people are traveling with respect to each other, they always measure the same “space-time interval.” Sitting at your desk, you hurtle through time, hardly at all through space. A cosmic ray flies over vast distances at nearly the speed of light but traverses almost no time, remaining ever young. The relationships are invariant no matter how you switch things around.
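
In symbols (standard special relativity, added here for concreteness), the quantity all inertial observers agree on is the space-time interval

    \Delta s^2 = c^2 \Delta t^2 - \Delta x^2 - \Delta y^2 - \Delta z^2,

even though \Delta t and \Delta x individually differ from observer to observer (time dilation and space contraction). The proper time \Delta s / c is what a clock riding along measures, which is why the near-light-speed cosmic ray accumulates almost none of it.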

Gravity

Einstein’s special theory of relativity, which came first, is “special” because it applies only to steady, unchanging motion through space-time — not accelerating motion like the movement of an object falling toward Earth. It bothered Einstein that his theory didn’t include gravity, and his struggle to incorporate it made symmetry central to his thinking. “By the time he gets full-on into general relativity, he’s much more invested in this notion of invariants and space-time intervals that should be the same for all observers,” Kaiser said.

Specifically, Einstein was puzzled by a difference that didn’t make a difference, a symmetry that didn’t make sense. It’s still astonishing to drop a wad of crumpled paper and a set of heavy keys side by side to see that somehow, almost magically, they hit the ground simultaneously — as Galileo demonstrated (at least apocryphally) by dropping light and heavy balls off the tower in Pisa. If the force of gravity depends on mass, then the more massive an object is, the faster it should sensibly fall. Inexplicably, it does not.

The key insight came to Einstein in one of his famous thought experiments. He imagined a man falling off a building. The man would be floating as happily as an astronaut in space, until the ground got in his way. When Einstein realized that a person falling freely would feel weightless, he described the discovery as the happiest thought of his life. It took a while for him to pin down the mathematical details of general relativity, but the enigma of gravity was solved once he showed that gravity is the curvature of space-time itself, created by massive objects like the Earth. Nearby “falling” objects like Einstein’s imaginary man or Galileo’s balls simply follow the space-time path carved out for them.

When general relativity was first published, 10 years after the special version, a problem arose: It appeared that energy might not be conserved in strongly curved space-time. It was well-known that certain quantities in nature are always conserved: the amount of energy (including energy in the form of mass), the amount of electric charge, the amount of momentum. In a remarkable feat of mathematical alchemy, the German mathematician Emmy Noether proved that each of these conserved quantities is associated with a particular symmetry, a change that doesn’t change anything.

Noether showed that the symmetries of general relativity — its invariance under transformations between different reference frames — ensure that energy is always conserved. Einstein’s theory was saved. Noether and symmetry have both occupied center stage in physics ever since.
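
A minimal textbook illustration of Noether’s result (not part of the article): for a system with Lagrangian L(q, \dot{q}, t), if L has no explicit time dependence (time-translation symmetry), then

    E = \dot{q}\,\frac{\partial L}{\partial \dot{q}} - L

is conserved; if L is unchanged by the shift q \to q + \epsilon (space-translation symmetry), then the momentum p = \partial L / \partial \dot{q} is conserved. Time-translation symmetry thus underwrites energy conservation and spatial-translation symmetry underwrites momentum conservation, the pattern Noether generalized to the reference-frame symmetries of general relativity.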

Matter

Post Einstein, the pull of symmetry only became more powerful. Paul Dirac, trying to make quantum mechanics compatible with the symmetry requirements of special relativity, found a minus sign in an equation suggesting that “antimatter” must exist to balance the books. It does. Soon after, Wolfgang Pauli, in an attempt to account for the energy that seemed to go missing during the disintegration of radioactive particles, speculated that perhaps the missing energy was carried away by some unknown, elusive particle. It was, and that particle is the neutrino.

Starting in the 1950s, invariances took on a life of their own, becoming ever more abstract, “leaping out,” as Kaiser put it, from the symmetries of space-time. These new symmetries, known as “gauge” invariances, became extremely productive, “furnishing the world,” Kaiser said, by requiring the existence of everything from W and Z bosons to gluons. “Because we think there’s a symmetry that’s so fundamental it has to be protected at all costs, we invent new stuff,” he said. Gauge symmetry “dictates what other ingredients you have to introduce.” It’s roughly the same kind of symmetry as the one that tells us that a triangle that’s invariant under 120-degree rotations must have three equal sides.

Gauge symmetries describe the internal structure of the system of particles that populates our world. They indicate all the ways physicists can shift, rotate, distort and generally mess with their equations without varying anything important. “The symmetry tells you how many ways you can flip things, change the way the forces work, and it doesn’t change anything,” Alexander said. The result is a peek at the hidden scaffolding that supports the basic ingredients of nature.

The abstractness of gauge symmetries causes a certain unease in some quarters. “You don’t see the whole apparatus, you only see the outcome,” Dijkgraaf said. “I think with gauge symmetries there’s still a lot of confusion.”

To compound the problem, gauge symmetries produce a multitude of ways to describe a single physical system — a redundancy, as the physicist Mark Trodden of the University of Pennsylvania put it. This property of gauge theories, Trodden explained, renders calculations “fiendishly complicated.” Pages and pages of calculations lead to very simple answers. “And that makes you wonder: Why? Where does all that complexity in the middle come from? And one possible answer to that is this redundancy of description that gauge symmetries give you.”

Such internal complexity is the opposite of what symmetry normally offers: simplicity. With a tiling pattern that repeats itself, “you only need to look at one little bit and you can predict the rest of it,” Dijkgraaf said. You don’t need one law for the conservation of energy and another for matter where only one will do. The universe is symmetrical in that it’s homogeneous on large scales; it doesn’t have a left or right, up or down. “If that weren’t the case, cosmology would be a big mess,” Khoury said.

Broken Symmetries

The biggest problem is that symmetry as it’s now understood seems to be failing to answer some of the biggest questions in physics. True, symmetry told physicists where to look for both the Higgs boson and gravitational waves — two momentous discoveries of the past decade. At the same time, symmetry-based reasoning predicted a slew of things that haven’t shown up in any experiments, including the “supersymmetric” particles that could have served as the cosmos’s missing dark matter and explained why gravity is so weak compared to electromagnetism and all the other forces.

In some cases, symmetries present in the underlying laws of nature appear to be broken in reality. For instance, when energy congeals into matter via the good old E = mc², the result is equal amounts of matter and antimatter — a symmetry. But if the energy of the Big Bang created matter and antimatter in equal amounts, they should have annihilated each other, leaving not a trace of matter behind. Yet here we are.

The perfect symmetry that should have existed in the early hot moments of the universe somehow got destroyed as it cooled down, just as a perfectly symmetrical drop of water loses some of its symmetry when it freezes into ice. (A snowflake may look the same in six different orientations, but a melted snowflake looks the same in every direction.)

“Everyone’s interested in spontaneously broken symmetries,” Trodden said. “The law of nature obeys a symmetry, but the solution you’re interested in does not.”

But what broke the symmetry between matter and antimatter?

It would come as a surprise to no one if physics today turned out to be burdened with unnecessary scaffolding, much like the notion of “empty space” that misdirected people before Einstein. Today’s misdirection, some think, may even have to do with the obsession with symmetry itself, at least as it’s currently understood.

Many physicists have been exploring an idea closely related to symmetry called “duality.” Dualities are not new to physics. Wave-particle duality — the fact that the same quantum system is best described as either a wave or a particle, depending on the context — has been around since the beginning of quantum mechanics. But newfound dualities have revealed surprising relationships: For example, a three-dimensional world without gravity can be mathematically equivalent, or dual, to a four-dimensional world with gravity.

If descriptions of worlds with different numbers of spatial dimensions are equivalent, then “one dimension in some sense can be thought of as fungible,” Trodden said.

“These dualities include elements — the number of dimensions — we think about as invariant,” Dijkgraaf said, “but they are not.” The existence of two equivalent descriptions with all the attendant calculations raises “a very deep, almost philosophical point: Is there an invariant way to describe physical reality?”

No one is giving up on symmetry anytime soon, in part because it’s proved so powerful and also because relinquishing it means, to many physicists, giving up on “naturalness” — the idea that the universe has to be exactly the way it is for a reason, the furniture arranged so impeccably that you couldn’t imagine it any other way.

Clearly, some aspects of nature — like the orbits of the planets — are the result of history and accident, not symmetry. Biological evolution is a combination of known mechanisms and chance. Perhaps Max Born was right when he responded to Einstein’s persistent objection that “God does not play dice” by pointing out that “nature, as well as human affairs, seems to be subject to both necessity and accident.”

Certain aspects of physics will have to remain intact — causality for example. “Effects cannot precede causes,” Alexander said. Other things almost certainly will not.

One aspect that will surely not play a key role in the future is the speed of light, which grounded Einstein’s work. The smooth fabric of space-time Einstein wove a century ago inevitably gets ripped to shreds inside black holes and at the moment of the Big Bang. “The speed of light can’t remain constant if space-time is crumbling,” Alexander said. “If space-time is crumbling, what is invariant?”

Certain dualities suggest that space-time emerges from something more basic still, the strangest relationship of all: What Einstein called the “spooky” connections between entangled quantum particles. Many researchers believe these long-distance links stitch space-time together. As Kaiser put it, “The hope is that something like a continuum of space-time would emerge as a secondary effect of more fundamental relationships, including entanglement relationships.” In that case, he said, classical, continuous space-time would be an “illusion.”

The high bar for new ideas is that they cannot contradict consistently reliable theories like quantum mechanics and relativity — including the symmetries that support them.

Einstein once compared building a new theory to climbing a mountain. From a higher perspective, you can see the old theory still standing, but it’s altered, and you can see where it fits into the larger, more inclusive landscape. Instead of thinking, as Feynman suggested, with last week’s potatoes, future thinkers might ponder physics using the information encoded in quantum entanglements, which weave the space-time to grow potatoes in the first place.


Last month, a team of physicists reported in Nature that a sound-trapping fluid, analogous to a black hole that traps light, radiates a featureless spectrum of energies, just as Stephen Hawking predicted for the invisible spheres he was famous for studying. But opinions differ about what this sonic analogue of a black hole reveals about the real kind — such as the one recently seen in silhouette in a first-ever photograph.

The question is how to interpret the bizarre analogy between a fluid of rubidium atoms in a lab in Israel and the mysterious astrophysical abysses most often created when huge stars exhaust their fuel and collapse inward.

Some philosophers and physicists argue that the new findings have striking implications for the black hole information paradox, a profound 45-year-old puzzle about whether or how quantum information escapes black holes. Others regard the fluid experiment as an amusing demo that says nothing about black holes or their central mystery.

The paradox sprang from Hawking’s 1974 insight that a black hole isn’t truly black. Its black-looking, spherical “event horizon” marks the vicinity within which its gravity is so strong that even light rays cannot climb out. But Hawking reasoned that the fabric of space-time at the event horizon will experience “quantum fluctuations,” where pairs of particles and antiparticles spontaneously pop up out of the vacuum. Normally, these opposites instantly annihilate, returning energy borrowed fleetingly from the universe’s budget. But a particle and an antiparticle that materialize on either side of a black hole’s event horizon get dragged apart.

Energy is stolen from the vacuum, Hawking realized, in the permanent creation of a new particle, which radiates out from the horizon as “Hawking radiation.” Its accomplice takes the fall, carrying negative energy into the black hole. Black holes lose energy, in other words, as they radiate. They slowly evaporate and shrink, ultimately disappearing completely.

The problem is that, according to Hawking’s calculations, black hole radiation will be random, with a featureless, “thermal” spectrum of energies that carries no information about the black hole or whatever formed or fell in it. This implies that an evaporating black hole destroys information — something quantum mechanics doesn’t allow. Quantum math relies on the premise that information is never lost. As particles shuffle and transform, a record of the past always remains encoded in the present and future. We could theoretically re-create a burned book from its ashes by turning back time.
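
Concretely, the premise that “information is never lost” is the unitarity of quantum time evolution (a standard statement, added for clarity):

    |\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad U^\dagger U = \mathbb{1} \quad\Rightarrow\quad |\psi(0)\rangle = U^\dagger(t)\,|\psi(t)\rangle,

so the initial state can in principle always be reconstructed from the final one. A black hole that evaporates into exactly thermal, uncorrelated radiation would map many distinct initial states onto the same final state, which no unitary evolution can do; that is the tension Hawking’s calculation creates.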

In the decades since Hawking radiation was discovered, the information paradox has motivated the quest for a deeper understanding of nature. Today’s physicists widely believe black hole information is preserved — that the quantum nature of gravity somehow modifies event horizons (and corrects Hawking’s calculation) in a way that encrypts the outgoing Hawking radiation with a record of the past. The question is how black hole information gets out.

Years ago, the theoretical physicist Bill Unruh argued that Hawking’s insights about black hole horizons should also apply to “sonic horizons.” This raised the prospect of testing Hawking’s math by analogy, initiating a race to create black hole analogues in the lab. The most successful practitioner, Jeff Steinhauer of the Technion in Haifa, Israel, generates a sonic horizon by accelerating a fluid of rubidium-87 atoms to a supersonic speed. In 2016, Steinhauer made headlines by detecting the acoustic analogue of Hawking radiation. Quantum units of sound, called phonons, popped up in pairs straddling the sonic horizon; one phonon would get swept along by the moving fluid while the other fought its way upstream and escaped.

Now, three years of improvements to the apparatus have “allowed for the quantitative check of Hawking’s predictions,” Steinhauer said. In his new paper, he and three collaborators reported that their sonic radiation is featureless, just as Hawking calculated for black holes. “The discovery gives us hints regarding the information paradox,” Steinhauer said by email. “The thermal form of the spectrum suggests that Hawking radiation carries no information. Thus, we need to look elsewhere to solve the information paradox.”

Most quantum gravity researchers disagree with this assessment, but a group of philosophers who have become interested in analogue black hole experiments think Steinhauer is right.

The key issue is whether space-time at a black hole’s event horizon can be treated as smooth. Both Hawking and Unruh, in their studies of real and sonic black holes, assumed that quantum fluctuations happen on a smooth background. Hawking, in his calculation, glossed over the (unknown) microscopic properties of the space-time fabric at the event horizon, and Unruh likewise treated the fluid in a sonic black hole as smooth, ignoring its composite atoms. It’s this “smoothness approximation” that most quantum gravity researchers find suspect. They think quantum-scale properties of space-time somehow encode information in Hawking radiation.

Steinhauer’s new measurements confirm that in the fluid case, the smoothness approximation works. Moreover, Unruh’s theoretical studies suggest that fluids with diverse microscopic properties will still be smooth on macro scales and emit featureless, thermal Hawking radiation. The philosophers argue that the “universality” of Hawking radiation — its robustness and insensitivity to the fine-grained details of a medium — suggests that the smoothness approximation should also hold for space-time.

“We argue that if it turns out that the modeling assumptions don’t steer you wrong in the acoustic case, that gives you good reason, on the basis of universality considerations, to believe that they don’t steer you wrong in the Hawking case,” said Eric Winsberg, a philosopher of science at the University of South Florida and a co-author of a recent study of analogue black hole experiments. In other words, the new results increase the likelihood “that information in real black holes must be lost.”

But there’s a major catch, which philosophers discussed in another recent paper: Even if the smoothness approximation holds universally for fluids, it might not hold for space-time, which might be stitched out of microscopic parts according to a much stranger pattern. Perhaps, as Winsberg put it, “there are more ways that space-time could deviate from smoothness than are dreamt of in your philosophy.”

For instance, various thought experiments and toy examples suggest that space-time might be holographic — a geometric projection, similar to how a video game universe emerges from a computer chip. The interior of a black hole might be a hologram that projects from information encoded on the event horizon. Daniel Harlow, a quantum gravity theorist and black hole expert at the Massachusetts Institute of Technology, said such a scenario would be expected to add subtle structure to the spectrum of Hawking radiation. The radiation would look thermal, but meaningful patterns would appear “if you fed the entire radiation cloud into a quantum computer and ran some fancy algorithms on it.”

The philosophers concede that exotic possibilities for the quantum-scale properties of space-time “mute the strength” with which Steinhauer’s experiment makes black hole information loss more likely.

Will any of this change anyone’s mind? Different starting beliefs, evidentiary requirements and other factors “can have a big effect on the kinds of inferences scientists make,” said Sean Gryb, a physicist and philosopher at the University of Bristol. Quantum gravity theorists will almost certainly go on thinking that information escapes black holes, even as the minority who believe in information loss feel more confident. Without measuring actual black hole radiation — which is beyond experimental reach — how will experts ever agree? “This is the kind of question philosophers of science have been looking for a definite answer to for a very long time,” Gryb said.


Weatherall, James (2019) Equivalence and Duality in Electromagnetism. [Preprint]
Glick, David (2019) Timelike Entanglement For Delayed-Choice Entanglement Swapping. [Preprint]

Nature Physics, Published online: 24 June 2019; doi:10.1038/s41567-019-0545-1

Restricted Boltzmann machines, a type of stochastic neural network, have been widely used in artificial intelligence applications for decades. They are now finding new life in the simulation of complex wavefunctions in quantum many-body physics.

Wuthrich, Christian (2019) Time travelling in emergent spacetime. [Preprint]

Author(s): Alvaro Ortega, Emma McKay, Álvaro M. Alhambra, and Eduardo Martín-Martínez

We study the work cost of processes in quantum fields without the need for projective measurements, which are always ill defined in quantum field theory. Inspired by interferometry schemes, we propose a work distribution that generalizes the two-point measurement scheme employed in quantum thermodynamics…
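
For reference, the two-point measurement (TPM) scheme being generalized defines the work distribution through projective energy measurements at the start and end of the protocol (standard definition, not quoted from the abstract):

    P(W) = \sum_{n,m} p_n \, p_{m|n} \, \delta\!\left(W - (E'_m - E_n)\right),

where p_n is the probability of finding initial energy E_n and p_{m|n} is the conditional probability of ending with final energy E'_m; the projective measurements this requires are precisely what become ill defined in quantum field theory.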

[Phys. Rev. Lett. 122, 240604] Published Fri Jun 21, 2019
