Scientist Study

Your source for the latest science & space news

Wednesday, 1 December 2021

Mysterious clouds could offer new clues on dark matter


The hunt for gravitational waves, ripples in space and time caused by major cosmic cataclysms, could help solve another of the Universe's burning mysteries: whether the ultralight particles thought to form boson clouds around black holes are a leading contender for dark matter.

Researchers are using powerful instruments that detect gravitational waves from up to billions of light years away, such as Advanced LIGO (the Laser Interferometer Gravitational-Wave Observatory), Advanced Virgo and KAGRA, to locate potential boson clouds.

Boson clouds, made up of ultralight subatomic particles that are almost impossible to detect, have been suggested as a possible source of dark matter—which accounts for about 85 percent of all matter in the Universe.

Now a major new international study, carried out by the LIGO-Virgo-KAGRA Collaboration and co-led by researchers from The Australian National University (ANU), offers one of the best leads yet for hunting down these subatomic particles: searching for gravitational waves generated by boson clouds circling black holes.

Dr. Lilli Sun, from the ANU Centre for Gravitational Astrophysics, said the study was the first all-sky survey in the world tailored to look for predicted gravitational waves coming from possible boson clouds near rapidly spinning black holes.

"It is almost impossible to detect these ultralight boson particles on Earth," Dr. Sun said.

"The particles, if they exist, have extremely small mass and rarely interact with other matter—which is one of the key properties that dark matter seems to have. Dark matter is material that cannot be seen directly, but we know that dark matter exists because of the effect it has on objects that we can observe.

"But by searching for gravitational waves emitted by these clouds we may be able to track down these elusive boson particles and possibly crack the code of dark matter. Our searches could also allow us to rule out certain ultralight boson particles that our theories say could exist but actually don't."

Dr. Sun, also an Associate Investigator at the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav), said gravitational wave detectors allow researchers to probe the energy that such clouds, if they exist, extract from rapidly rotating black holes.

"We believe these black holes trap a huge number of boson particles in their powerful gravity field, creating a cloud corotating with them. This delicate dance continues for millions of years and keeps generating gravitational waves that hurtle through space," she said.

While the researchers haven't yet detected gravitational waves from boson clouds, Dr. Sun said gravitational wave science had "opened doors that were previously locked to scientists."

"Gravitational-wave discoveries not only provide information about mysterious compact objects in the Universe, like black holes and neutron stars, they also allow us to look for new particles and dark matter," she said.

"Future gravitational wave detectors will certainly open more possibilities. We will be able to reach deeper into the Universe and discover more insights about these particles.

"For example, the discovery of boson clouds using gravitational wave detectors would bring important insights about dark matter and help advance other searches for dark matter. It would also advance our understanding of particle physics more broadly."

In another significant breakthrough, the study also shed more light on the chance of boson clouds existing in our own galaxy by taking into consideration their ages.

Dr. Sun said the strength of any gravitational wave depends on the age of the cloud, with older ones sending out weaker signals.

"The boson cloud shrinks as it loses energy by sending out gravitational waves," Dr. Sun said.

"We learned that a particular type of boson cloud younger than 1,000 years is not likely to exist anywhere in our galaxy, while clouds that are up to 10 million years old are not likely to exist within about 3,260 light years from Earth."

Reference: 

All-sky search for gravitational wave emission from scalar boson clouds around spinning black holes in LIGO O3 data, arXiv:2111.15507v1 [astro-ph.HE]. https://arxiv.org/abs/2111.15507

Planetary scientists discover brief presence of water in Arabia Terra on Mars


As part of a team from Northern Arizona University (NAU) and Johns Hopkins University, NAU Ph.D. candidate Ari Koeppel recently discovered that water was once present in a region of Mars called Arabia Terra.

Arabia Terra is in the northern latitudes of Mars. Named in 1879 by Italian astronomer Giovanni Schiaparelli, this ancient land covers an area slightly larger than the European continent. Arabia Terra contains craters, volcanic calderas, canyons and beautiful bands of rock reminiscent of sedimentary rock layers in the Painted Desert or the Badlands.

These layers of rock, and how they formed, were the research focus for Koeppel; his advisor Christopher Edwards, an associate professor in NAU's Department of Astronomy and Planetary Science; and Andrew Annex, Kevin Lewis and undergraduate student Gabriel Carrillo of Johns Hopkins University. Their study, titled "A fragile record of fleeting water on Mars," was funded by the NASA Mars Data Analysis Program and recently published in the journal Geology.

"We were specifically interested in using rocks on the surface of Mars to get a better understanding of past environments three to four billion years ago and whether there could have been climatic conditions that were suitable for life on the surface," Koeppel said. "We were interested in whether there was stable water, how long there could have been stable water, what the atmosphere might have been like and what the temperature on the surface might have been like." 

To get a better understanding of what happened to create the rock layers, the scientists focused on thermal inertia, a measure of how strongly a material resists changes in temperature. Sand, with small and loose particles, gains and loses heat quickly, while a solid boulder stays warm long after dark. By looking at surface temperatures, the team could determine the physical properties of rocks in their study area, including whether a material was loose and eroding away even when it otherwise looked solid.
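
For readers who want the quantity itself: thermal inertia is conventionally defined as the square root of the product of a material's thermal conductivity, density and specific heat capacity. The short sketch below uses rough, illustrative material values (assumptions for the purpose of the example, not figures from the study) to show why loose sand and solid rock respond so differently; a low value flags loose, weakly cemented material of the kind the team identified in Arabia Terra.

```python
# Thermal inertia I = sqrt(k * rho * c), in SI units of J m^-2 K^-1 s^-1/2.
# The material values below are rough illustrative numbers, not the study's data.
from math import sqrt

def thermal_inertia(k, rho, c):
    """k: conductivity (W/m/K), rho: density (kg/m^3), c: specific heat (J/kg/K)."""
    return sqrt(k * rho * c)

# Loose, dry sand: poor conductor, low density -> heats up and cools down quickly.
print("loose sand ~", round(thermal_inertia(k=0.2, rho=1300, c=800)))
# Solid basalt boulder: well consolidated -> stays warm long after dark.
print("solid rock ~", round(thermal_inertia(k=2.0, rho=2900, c=840)))
```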

"No one had done an in-depth thermal inertia investigation of these really interesting deposits that cover a large portion of the surface of Mars," Edwards said.

To complete the study, Koeppel used remote sensing instruments on orbiting satellites. "Just like geologists on Earth, we look at rocks to try to tell stories about past environments," Koeppel said. "On Mars, we're a little bit more limited. We can't just go to a rock outcrop and collect samples—we're pretty reliant on satellite data. So, there are a handful of satellites orbiting Mars, and each satellite hosts a collection of instruments.  Each instrument plays its own role in helping us describe the rocks that are on the surface."

Through a series of investigations using this remotely gathered data, they looked at thermal inertia, plus evidence of erosion, the condition of the craters and what minerals were present.

"We figured out these deposits are much less cohesive than everyone previously thought they were, indicating that this setting could only have had water for only a brief period of time," said Koeppel. "For some people, that kind of sucks the air out of the story because we often think that having more water for more time means there's a greater chance of life having been there at one point. But for us, it's actually really interesting because it brings up a whole set of new questions. What are the conditions that could have allowed there to be water there for a brief amount of time? Could there have been glaciers that melted quickly with outbursts of huge floods? Could there have been a groundwater system that percolated up out of the ground for only a brief period of time only to sink back down?"

Koeppel started his college career in engineering and physics but switched to studying the geological sciences while earning his master's degree at The City College of New York. He came to NAU to work with Edwards and immerse himself in the thriving planetary science community of Flagstaff.

"I got into planetary science because of my excitement for exploring worlds beyond Earth. The universe is astoundingly big; even Mars is just the tip of the iceberg," Koeppel said. "But we've been studying Mars for a few decades now, and at this point, we have a huge accumulation of data. We're beginning to study it at levels that are comparable to ways we've been able to study Earth, and it's a really exciting time for Mars science."

Reference:

Ari H.D. Koeppel et al, A fragile record of fleeting water on Mars, Geology (2021). DOI: 10.1130/G49285.1

Shrinking qubits for quantum computing with atom-thin materials


For quantum computers to surpass their classical counterparts in speed and capacity, their qubits—superconducting circuits that can exist in a continuum of superpositions of two binary states—need to be on the same wavelength. Achieving this, however, has come at the cost of size. Whereas the transistors used in classical computers have been shrunk down to nanometer scales, superconducting qubits these days are still measured in millimeters—one millimeter is one million nanometers.

Combine qubits together into larger and larger circuit chips, and you end up with, relatively speaking, a big physical footprint, which means quantum computers take up a lot of physical space. These are not yet devices we can carry in our backpacks or wear on our wrists.

To shrink qubits down while maintaining their performance, the field needs a new way to build the capacitors that store the energy that "powers" the qubits. In collaboration with Raytheon BBN Technologies, Wang Fong-Jen Professor James Hone's lab at Columbia Engineering recently demonstrated a superconducting qubit capacitor built with 2D materials that's a fraction of previous sizes.

Until now, engineers building qubit chips have had to use planar capacitors, which set the necessary charged plates side by side. Stacking those plates would save space, but the metals used in conventional parallel-plate capacitors interfere with qubit information storage. In the current work, published on November 18 in Nano Letters, Hone's Ph.D. students Abhinandan Antony and Anjaly Rajendra sandwiched an insulating layer of boron nitride between two charged plates of superconducting niobium diselenide. These layers are each just a single atom thick and held together by van der Waals forces, the weak interaction between electrons. The team then combined their capacitors with aluminum circuits to create a chip containing two qubits with an area of 109 square micrometers and just 35 nanometers thick—that's 1,000 times smaller than chips produced under conventional approaches.
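
For a rough sense of scale, the ideal parallel-plate formula C = ε0·εr·A/d shows why an atomically thin spacer lets the capacitor shrink: with a dielectric only 35 nanometers thick, a plate area of order 100 square micrometers already gives capacitance on the order of the 100 femtofarads a superconducting qubit typically needs. The plate area and the boron nitride permittivity in the sketch below are assumptions for illustration, not figures from the paper.

```python
# Ideal parallel-plate capacitance, C = eps0 * eps_r * A / d.
# Plate area and boron nitride relative permittivity are illustrative assumptions.
EPS0 = 8.854e-12                      # vacuum permittivity, F/m

def plate_capacitance(area_m2, gap_m, eps_r):
    """Capacitance in farads for an ideal parallel-plate capacitor."""
    return EPS0 * eps_r * area_m2 / gap_m

area = 10e-6 * 10e-6                  # assumed ~10 um x 10 um plates
gap = 35e-9                           # ~35 nm boron nitride spacer, per the article
print(f"{plate_capacitance(area, gap, eps_r=3.5) * 1e15:.0f} fF")   # of order 100 fF
```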

When they cooled their qubit chip down to just above absolute zero, the qubits found the same wavelength. The team also observed key characteristics indicating that the two qubits were becoming entangled and acting as a single unit, a phenomenon known as quantum coherence, which would mean the qubit's quantum state could be manipulated and read out via electrical pulses, said Hone. The coherence time was short (a little over 1 microsecond, compared with about 10 microseconds for a conventionally built coplanar capacitor), but this is only a first step in exploring the use of 2D materials in this area, he said.

Separate work published on arXiv in August from researchers at MIT also took advantage of niobium diselenide and boron nitride to build parallel-plate capacitors for qubits. The devices studied by the MIT team showed even longer coherence times—up to 25 microseconds—indicating that there is still room to further improve performance.

From here, Hone and his team will continue refining their fabrication techniques and test other types of 2D materials to increase coherence times, which reflect how long the qubit is storing information. New device designs should be able to shrink things down even further, said Hone, by combining the elements into a single van der Waals stack or by deploying 2D materials for other parts of the circuit.

"We now know that 2D materials may hold the key to making quantum computers possible," Hone said. "It is still very early days, but findings like these will spur researchers worldwide to consider novel applications of 2D materials. We hope to see a lot more work in this direction going forward."

Reference: 

Abhinandan Antony et al, Miniaturizing Transmon Qubits Using van der Waals Materials, Nano Letters (2021). DOI: 10.1021/acs.nanolett.1c04160

Researchers generate, for the first time, a vortex beam of atoms and molecules


Vortices may conjure a mental image of whirlpools and tornadoes—spinning bodies of water and air—but they can also exist on much smaller scales. In a new study published in Science, researchers from the Weizmann Institute of Science, together with collaborators from the Technion-Israel Institute of Technology and Tel Aviv University, have created, for the first time, vortices made of a single atom. These vortices could help answer fundamental questions about the inner workings of the subatomic world and be used to enhance a variety of technologies—for example, by providing new capabilities for atomic microscopes.

Scientists have long been striving to produce various types of nano-scale vortices in the lab, with recent focus on creating vortex beams—streams of particles having spinning properties—where even their internal quantum structure can be made to spin. Vortices made up of elementary particles, electrons and photons, have been created experimentally in the past, but until now vortex beams of atoms have existed only as a thought experiment. "During a theoretical debate with Prof. Ido Kaminer from the Technion, we came up with an idea for an experiment that would generate vortices of single atoms," says Dr. Yair Segev, who has recently completed his Ph.D. studies in the group of Prof. Edvardas Narevicius of Weizmann's Chemical and Biological Physics Department.

In classical physics, spinning objects are often characterized by a property known as angular momentum. Just as linear momentum describes the effort needed to stop a moving object in its tracks, angular momentum describes the effort needed to stop an object from spinning. Vortices—characterized by the circulation of flux around an axis—embody this property perfectly in their relentless spin.

However, the very basic property of angular momentum, which characterizes naturally occurring vortices both big and small, takes on a different twist on the quantum scale. Unlike their classical physics equivalents, quantum particles cannot take on any value of angular momentum; rather, they can only take on values in discrete portions, or "quanta." Another difference is the way in which a vortex particle carries its angular momentum—not as a rigid, spinning propeller, but as a wave that flows and twists around its own axis of motion.
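
In standard quantum mechanical notation (a textbook form, not anything specific to this experiment), a vortex beam carries its angular momentum as a phase that winds around the propagation axis:

\[
\psi(r,\phi,z) \propto f(r,z)\, e^{i\ell\phi},
\qquad
\hat{L}_z\,\psi = \ell\hbar\,\psi,
\qquad \ell = 0, \pm 1, \pm 2, \ldots
\]

The integer \(\ell\) is the discrete "quantum" of angular momentum mentioned above, and the winding phase forces the wave's intensity to vanish on the axis, which is why the detector images described later in the article look like donuts.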

These waves can be shaped and manipulated similarly to how breakwaters are used to direct the flow of seawater close to shore, but on a much smaller scale. "By placing physical obstacles in an atom's path, we can manipulate the shape of its wave into various forms," says Alon Luski, a Ph.D. student in Narevicius's group. Luski and Segev, who led the research along with Rea David from their group, collaborated with colleagues from Tel Aviv University to develop an innovative approach for directing the movement of atoms. They created patterns of nanometric "breakwaters" called gratings—tiny ceramic discs, several hundreds of nanometers in diameter, with specific slit patterns. When the slits are arranged into a fork-like shape, each atom that passes through them behaves like a wave that flows through a physical obstacle, in this way acquiring angular momentum and emerging as a spinning vortex. These "nano-forks" were produced through a nano-fabrication process that was developed specifically for this experiment by Dr. Ora Bitton and Hila Nadler, both of Weizmann's Chemical Research Support Department.

To generate and observe atomic vortices, the researchers aim a supersonic beam of helium atoms at these forked gratings. Before reaching the gratings, the beam passes through a system of narrow slits that blocks some of the atoms, transmitting only the atoms that behave more like large waves—those that are better suited to being shaped by the gratings. When these "wavy" atoms interact with the "forks," they are shaped into vortices, and their intensity is recorded and photographed by a detector.

This results in a donut-shaped image constructed from millions of vortexed helium atoms that collide with the detector. "When we saw the donut-shaped image, we knew we had succeeded in creating vortices of these helium atoms," says Segev. Much like the "eye" of the storm, the center of these "donuts" represents the space where each atomic vortex is calmest—the intensity of the waves there is zero, so no atoms are found there. "The 'donuts' are the fingerprint of a series of different vortex beams," explains Narevicius.

During the experiments, the researchers made an odd observation. "We saw that next to the perfectly shaped donuts, there were two small spots of 'noise' as well," says Segev. "At first we thought this was a hardware malfunction, but after extensive investigation we realized that what we're looking at are actually unusual molecules, each made of two helium atoms, that were joined together in our beams." In other words, they had generated vortices of not only atoms but also of molecules.

Although the researchers used helium in their experiments, the experimental setup may accommodate studies of other elements and molecules. It could also be used to study hidden subatomic properties, such as the charge distribution of protons or neutrons that may be revealed only when an atom is spinning. Luski gives the example of a mechanical clock: "Mechanical clocks are made of tiny gears and cogs, each moving at a certain frequency, similarly to the internal structure of an atom. Now imagine taking that clock and spinning it—this motion could change the internal frequency of the gears, and the internal structure could be expressed in the properties of the vortex as well."

In addition to offering a new way of studying the very basic properties of matter, atomic vortex beams might find use in several technological applications, such as in atomic microscopy. The interaction between spinning atoms and any investigated material could lead to the discovery of novel properties of that material, adding significant, previously inaccessible data to many future experiments.

Reference: 

Alon Luski et al, Vortex beams of atoms and molecules, Science (2021). DOI: 10.1126/science.abj2451

Tuesday, 30 November 2021

Physicists create time crystals with quantum computers


There is a huge global effort to engineer a computer capable of harnessing the power of quantum physics to carry out computations of unprecedented complexity. While formidable technological obstacles still stand in the way of creating such a quantum computer, today's early prototypes are still capable of remarkable feats.

For example, the creation of a new phase of matter called a "time crystal." Just as a crystal's structure repeats in space, a time crystal repeats in time and, importantly, does so infinitely and without any further input of energy—like a clock that runs forever without any batteries. The quest to realize this phase of matter has been a longstanding challenge in theory and experiment—one that has now finally come to fruition.

In research published Nov. 30 in Nature, a team of scientists from Stanford University, Google Quantum AI, the Max Planck Institute for Physics of Complex Systems and Oxford University detail their creation of a time crystal using Google's Sycamore quantum computing hardware.

"The big picture is that we are taking the devices that are meant to be the quantum computers of the future and thinking of them as complex quantum systems in their own right," said Matteo Ippoliti, a postdoctoral scholar at Stanford and co-lead author of the work. "Instead of computation, we're putting the computer to work as a new experimental platform to realize and detect new phases of matter."

For the team, the excitement of their achievement lies not only in creating a new phase of matter but in opening up opportunities to explore new regimes in their field of condensed matter physics, which studies the novel phenomena and properties brought about by the collective interactions of many objects in a system. (Such interactions can be far richer than the properties of the individual objects.)

"Time-crystals are a striking example of a new type of non-equilibrium quantum phase of matter," said Vedika Khemani, assistant professor of physics at Stanford and a senior author of the paper. "While much of our understanding of condensed matter physics is based on equilibrium systems, these new quantum devices are providing us a fascinating window into new non-equilibrium regimes in many-body physics."

What a time crystal is and isn't

The basic ingredients to make this time crystal are as follows: The physics equivalent of a fruit fly and something to give it a kick. The fruit fly of physics is the Ising model, a longstanding tool for understanding various physical phenomena—including phase transitions and magnetism—which consists of a lattice where each site is occupied by a particle that can be in two states, represented as a spin up or down.

During her graduate school years, Khemani, her doctoral advisor Shivaji Sondhi, then at Princeton University, and Achilleas Lazarides and Roderich Moessner at the Max Planck Institute for Physics of Complex Systems stumbled upon this recipe for making time crystals. They were studying non-equilibrium many-body localized systems—systems where the particles get "stuck" in the state in which they started and can never relax to an equilibrium state—and were interested in exploring the phases that might develop in such systems when they are periodically "kicked" by a laser. Not only did they find stable non-equilibrium phases, they found one where the spins of the particles flipped between patterns that repeat in time forever, at a period twice that of the driving period of the laser, thus making a time crystal.

The periodic kick of the laser establishes a specific rhythm to the dynamics. Normally the "dance" of the spins should sync up with this rhythm, but in a time crystal it doesn't. Instead, the spins flip between two states, completing a cycle only after being kicked by the laser twice. This means that the system's "time translation symmetry" is broken. Symmetries play a fundamental role in physics, and they are often broken—explaining the origins of regular crystals, magnets and many other phenomena; however, time translation symmetry stands out because unlike other symmetries, it can't be broken in equilibrium. The periodic kick is a loophole that makes time crystals possible.
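
To make the recipe concrete, below is a minimal numerical sketch of a periodically kicked, disordered Ising chain of just a few spins. It is an illustrative toy, not the Sycamore experiment or the authors' model: the couplings, fields, kick imperfection and chain length are all assumed values, and the simulation is exact only because the chain is tiny. What it demonstrates is the period doubling described above: the spin pattern it prints repeats only after two drive periods.

```python
# Toy model of the kicked, disordered Ising chain described above (illustrative only;
# all parameter values are assumptions). One Floquet period = free Ising evolution
# followed by a near-perfect global spin flip. The printed magnetization of spin 0
# alternates sign each period, so the pattern repeats only every TWO drive periods.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
N = 6                                               # number of spins (small, for exact simulation)
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def on_site(op, site):
    """Embed a single-spin operator into the N-spin Hilbert space."""
    mats = [I2] * N
    mats[site] = op
    full = mats[0]
    for m in mats[1:]:
        full = np.kron(full, m)
    return full

Zs = [on_site(Z, i) for i in range(N)]
Xs = [on_site(X, i) for i in range(N)]

# Disordered Ising couplings and fields (randomness helps "localize" the system).
J = rng.uniform(0.5, 1.5, N - 1)
h = rng.uniform(0.0, 1.0, N)
H_ising = sum(J[i] * Zs[i] @ Zs[i + 1] for i in range(N - 1)) + sum(h[i] * Zs[i] for i in range(N))

U_ising = expm(-1j * H_ising)                       # evolve for one drive period (T = 1)
eps = 0.03                                          # small imperfection in the kick
U_kick = expm(-1j * (np.pi / 2) * (1 - eps) * sum(Xs))  # near-perfect global spin flip
U_F = U_kick @ U_ising                              # one full Floquet cycle

# Start from the product state |010101> and record <Z_0> once per drive period.
state = np.zeros(2 ** N, dtype=complex)
state[int("01" * (N // 2), 2)] = 1.0
magnetization = []
for _ in range(20):
    magnetization.append(float(np.real(state.conj() @ (Zs[0] @ state))))
    state = U_F @ state

print(magnetization[:8])   # signs alternate +, -, +, -, ...: a period-doubled response
```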

The doubling of the oscillation period is unusual, but not unprecedented. And long-lived oscillations are also very common in the quantum dynamics of few-particle systems. What makes a time crystal unique is that it's a system of millions of things that are showing this kind of concerted behavior without any energy coming in or leaking out.

"It's a completely robust phase of matter, where you're not fine-tuning parameters or states but your system is still quantum," said Sondhi, professor of physics at Oxford and co-author of the paper. "There's no feed of energy, there's no drain of energy, and it keeps going forever and it involves many strongly interacting particles."

While this may sound suspiciously close to a "perpetual motion machine," a closer look reveals that time crystals don't break any laws of physics. Entropy—a measure of disorder in the system—remains stationary over time, marginally satisfying the second law of thermodynamics by not decreasing.

Between the development of this plan for a time crystal and the quantum computer experiment that brought it to reality, many experiments by many different teams of researchers achieved various almost-time-crystal milestones. However, providing all the ingredients in the recipe for "many-body localization" (the phenomenon that enables an infinitely stable time crystal) had remained an outstanding challenge.

For Khemani and her collaborators, the final step to time crystal success was working with a team at Google Quantum AI. Together, this group used Google's Sycamore quantum computing hardware to program 20 "spins" using the quantum version of a classical computer's bits of information, known as qubits.

Revealing just how intense the interest in time crystals currently is, another time-crystal experiment was published in Science this month. That crystal was created using qubits within a diamond by researchers at Delft University of Technology in the Netherlands.

Quantum opportunities

The researchers were able to confirm their claim of a true time crystal thanks to special capabilities of the quantum computer. Although the finite size and coherence time of the (imperfect) quantum device meant that their experiment was limited in size and duration—so that the time crystal oscillations could only be observed for a few hundred cycles rather than indefinitely—the researchers devised various protocols for assessing the stability of their creation. These included running the simulation forward and backward in time and scaling its size.

"We managed to use the versatility of the quantum computer to help us analyze its own limitations," said Moessner, co-author of the paper and director at the Max Planck Institute for Physics of Complex Systems. "It essentially told us how to correct for its own errors, so that the fingerprint of ideal time-crystalline behavior could be ascertained from finite time observations."

A key signature of an ideal time crystal is that it shows indefinite oscillations from all states. Verifying this robustness to choice of states was a key experimental challenge, and the researchers devised a protocol to probe over a million states of their time crystal in just a single run of the machine, requiring mere milliseconds of runtime. This is like viewing a physical crystal from many angles to verify its repetitive structure.

"A unique feature of our quantum processor is its ability to create highly complex quantum states," said Xiao Mi, a researcher at Google and co-lead author of the paper. "These states allow the phase structures of matter to be effectively verified without needing to investigate the entire computational space—an otherwise intractable task."

Creating a new phase of matter is unquestionably exciting on a fundamental level. In addition, the fact that these researchers were able to do so points to the increasing usefulness of quantum computers for applications other than computing. "I am optimistic that with more and better qubits, our approach can become a main method in studying non-equilibrium dynamics," said Pedram Roushan, researcher at Google and senior author of the paper.

"We think that the most exciting use for quantum computers right now is as platforms for fundamental quantum physics," said Ippoliti. "With the unique capabilities of these systems, there's hope that you might discover some new phenomenon that you hadn't predicted."

Reference: 

Mi, X. et al, Time-Crystalline Eigenstate Order on a Quantum Processor, Nature (2021). DOI: 10.1038/s41586-021-04257-w

How the act of measuring a quantum particle transforms it into an everyday object


The quantum world and our everyday world are very different places. In a publication that appeared as the "Editor's Suggestion" in Physical Review A this week, UvA physicists Jasper van Wezel and Lotte Mertens and their colleagues investigate how the act of measuring a quantum particle transforms it into an everyday object.

Quantum mechanics is the theory that describes the tiniest objects in the world around us, ranging from the constituents of single atoms to small dust particles. This microscopic realm behaves remarkably differently from our everyday experience—despite the fact that all objects in our human-scale world are made of quantum particles themselves. This leads to intriguing physical questions: why are the quantum world and the macroscopic world so different, where is the dividing line between them, and what exactly happens there?

Measurement problem

One particular area where the distinction between quantum and classical becomes essential is when we use an everyday object to measure a quantum system. The division between the quantum and everyday worlds then amounts to asking how 'big' a measurement device must be before it can display the properties of a quantum system in our everyday world. Finding out the details of measurement, such as how many quantum particles it takes to create a measurement device, is called the quantum measurement problem.

As experiments probing the world of quantum mechanics become ever more advanced and involve ever larger quantum objects, the invisible line where pure quantum behavior crosses over into classical measurement outcomes is rapidly being approached. In their article, Van Wezel, Mertens and their colleagues take stock of current models that attempt to solve the measurement problem, particularly those that do so by proposing slight modifications to the one equation that rules all quantum behavior: Schrödinger's equation.

Born's rule

The researchers show that such amendments can in principle lead to consistent proposals for solving the measurement problem. However, it turns out to be difficult to create models that satisfy Born's rule, which tells us how to use Schrödinger's equation for predicting measurement outcomes. The researchers show that only models with sufficient mathematical complexity (in technical terms: models that are non-linear and non-unitary) can give rise to Born's rule and therefore have a chance of solving the measurement problem and teaching us about the elusive crossover between quantum physics and the everyday world.
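
For reference, Born's rule in its standard textbook form (quoted here for context, not taken from the paper): if a system is prepared in a state \(|\psi\rangle\) and a measurement can return outcomes associated with states \(|a_i\rangle\), the probability of obtaining outcome \(a_i\) is

\[
P(a_i) = \bigl|\langle a_i | \psi \rangle\bigr|^2 .
\]

Any modification of Schrödinger's equation that hopes to solve the measurement problem must still reproduce these probabilities, which is exactly the constraint the authors show only non-linear, non-unitary models can satisfy.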

Reference: 

Lotte Mertens et al, Inconsistency of linear dynamics and Born's rule, Physical Review A (2021). DOI: 10.1103/PhysRevA.104.052224

Strong winds power electric fields in the upper atmosphere, NASA's ICON finds


What happens on Earth doesn't stay on Earth.

Using observations from NASA's ICON mission, scientists presented the first direct measurements of Earth's long-theorized dynamo on the edge of space: a wind-driven electrical generator that spans the globe 60-plus miles above our heads. The dynamo churns in the ionosphere, the electrically charged boundary between Earth and space. It's powered by tidal winds in the upper atmosphere that are faster than most hurricanes and rise from the lower atmosphere, creating an electrical environment that can affect satellites and technology on Earth.

The new work, published today in Nature Geoscience, improves our understanding of the ionosphere, which helps scientists better predict space weather and protect our technology from its effects.

Launched in 2019, ICON, short for Ionospheric Connection Explorer, is a mission to untangle how Earth's weather interacts with the weather in space. Radio and GPS signals zip through the ionosphere, which is home to auroras and the International Space Station. Empty pockets or dense swells of electrically charged particles can disrupt these signals.

Scientists who study the atmosphere and space weather have long included Earth's dynamo in their models because they knew it had important effects. But with little information, they had to make some assumptions about how it works. Data from ICON is the first concrete observation of winds fueling the dynamo, eventually influencing space weather, to feed into those models.

"ICON's first year in space has shown predicting these winds is key to improving our ability to predict what happens in the ionosphere," said Thomas Immel, ICON principal investigator at University of California, Berkeley, and lead author of the new study.

Earth's sky-high generator

The ionosphere is like a sloshing sea of electrically charged particles, created by the Sun and intermixed with the neutral upper atmosphere. Sandwiched between Earth and space, the ionosphere responds to changes from both the Sun above and Earth below. How much influence comes from each side is what researchers are interested in figuring out. Studying a year of ICON data, the researchers found much of the change they observed originated in the lower atmosphere.

Generators work by repeatedly moving an electricity-carrying conductor—like a copper wire—through a magnetic field. Filled with electrically charged gases called plasma, the ionosphere acts like a wire, or rather, a tangled mess of wires: Electricity flows right through. Like the dynamo in Earth's core, the dynamo in the atmosphere produces electromagnetic fields from motion.

Strong winds in the thermosphere, a layer of the upper atmosphere known for its high temperatures, push current-carrying plasma in the ionosphere across invisible magnetic field lines that arc around Earth like an onion. The wind tends to push on chunky, positively charged particles more than small, negatively charged electrons. "You get pluses moving differently than minuses," said co-author Brian Harding, a physicist at University of California, Berkeley. "That's an electric current."
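
In textbook form (a standard way of writing the physics, not the paper's specific model), this wind-driven generator is captured by a generalized Ohm's law in which the neutral wind u appears alongside the electric field:

\[
\mathbf{J} = \boldsymbol{\sigma} \cdot \left( \mathbf{E} + \mathbf{u} \times \mathbf{B} \right),
\]

where J is the current density, σ the ionospheric conductivity tensor, E the electric field, u the neutral wind velocity and B Earth's magnetic field. The u × B term is the "wire moving through a magnetic field" of the generator analogy above.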

In most generators, these components are bound tightly so they stay put and act predictably. But the ionosphere is free to move however it likes. "The current generates its own magnetic field, which fights Earth's magnetic field as it's passing through," Immel said. "So you end up with a wire trying to get away from you. It's a messy generator."

Following the whims of the ionosphere is key to predicting space weather's potential impacts. Depending on which way the wind blows, plasma in the ionosphere shoots out into space or plummets toward Earth. This behavior results from the tug-of-war between the ionosphere and Earth's electromagnetic fields.

The dynamo, which lies at the lower end of the ionosphere, has remained a mystery for so long because it's difficult to observe. Too high for scientific balloons and too low for satellites, it has eluded many of the tools researchers have to study near-Earth space. ICON is uniquely equipped to investigate this part of the ionosphere from above by taking advantage of the upper atmosphere's natural glow to detect the motion of plasma.

ICON simultaneously observes powerful winds and migrating plasma. "This was the first time we could tell how much the wind contributes to the ionosphere's behavior, without any assumptions," said Astrid Maute, another study co-author and ICON scientist at the National Center for Atmospheric Research in Boulder, Colorado.

Only in the past decade or so, Immel said, have scientists realized just how much those rising winds vary. "The upper atmosphere wasn't expected to change rapidly," he said. "But it does, day to day. We're finding this is all due to changes driven up from the lower atmosphere."

Wind power

The winds we know best are those that skim the surface of Earth, from gentle breezes to bracing gusts that blow one way and then the other.

High-altitude winds are a different beast. From 60 to 95 miles above the ground, in the lower thermosphere, winds can blast in the same direction at the same speed—around 250 mph—for a few hours at a time before suddenly reversing direction. (By comparison, winds in the strongest Category 5 hurricanes tear at 157 mph or more.)

These dramatic shifts are the result of waves of air, called tides, born at Earth's surface when the lower atmosphere heats up during the day then cools down at night. They surge through the sky daily, carrying changes from below.

The farther the atmosphere stretches away from the surface, the thinner it becomes and the less turbulence there is to disrupt these motions. That means small tides generated near the surface can grow much larger when they reach the upper atmosphere. "Changes in the winds up there are mostly controlled by what happens below," Harding said.
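
A back-of-the-envelope way to see why the tides grow (a standard wave argument, not a result quoted from the study): if a vertically propagating wave loses little energy on its way up, the energy flux carried by its wind perturbation u' stays roughly constant, so

\[
\rho\, u'^2 \approx \text{const} \quad \Rightarrow \quad u' \propto \rho^{-1/2},
\]

and because atmospheric density ρ falls by many orders of magnitude between the surface and the lower thermosphere, a modest tide below becomes a powerful wind above.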

ICON's new wind measurements help scientists understand these tidal patterns that span the globe and their effects.

Tides ripple up through the sky, building in strength and growing before gusting through the ionosphere. The electric dynamo whirs in response.

The scientists analyzed the first year of ICON data, and found high-altitude winds strongly influence the ionosphere. "We traced the pattern of how the ionosphere moves, and there was a clear wave-like structure," Harding said. Changes in the wind, he explained, directly corresponded to the dance of plasma 370 miles above Earth's surface.

"Half of the motion of the plasma can be attributed to the winds that we observe right there on that same magnetic field line," Immel said. "That tells you it's an important observation to make if you want to predict what plasma is doing."

ICON's first year of observations coincided with solar minimum, the quiet phase of the Sun's 11-year activity cycle. During this time, the Sun's behavior was a low, constant hum. "We know the Sun's not doing much, but we saw a lot of variability from below, and then remarkable changes in the ionosphere," Immel said. That told the researchers they could rule out the Sun as the main influence.

As the Sun ramps up to its active phase, scientists will be able to study more complex changes and interactions between space and Earth's atmosphere.

Immel said he is excited to have this confirmation of long-held ionosphere theories. "We found half of what causes the ionosphere to behave as it does right there in the data," he said. "This is what we wanted to know."

Still, Maute said, "This leaves room to explore what else is contributing to the ionosphere's behavior."

Reference:  

Thomas J. Immel et al, Regulation of ionospheric plasma velocities by thermospheric winds, Nature Geoscience (2021). DOI: 10.1038/s41561-021-00848-4

Xenobots: Team builds first living robots that can reproduce


To persist, life must reproduce. Over billions of years, organisms have evolved many ways of replicating, from budding plants to sexual animals to invading viruses.

Now scientists have discovered an entirely new form of biological reproduction—and applied their discovery to create the first-ever, self-replicating living robots.

The same team that built the first living robots ("Xenobots," assembled from frog cells and reported in 2020) has discovered that these computer-designed and hand-assembled organisms can swim out into their tiny dish, find single cells, gather hundreds of them together, and assemble "baby" Xenobots inside their Pac-Man-shaped "mouths." A few days later, these babies become new Xenobots that look and move just like their parents.

And then these new Xenobots can go out, find cells, and build copies of themselves. Again and again.

"With the right design—they will spontaneously self-replicate," says Joshua Bongard, a computer scientist and robotics expert at the University of Vermont who co-led the new research.

The results of the new research were published November 29, 2021, in the Proceedings of the National Academy of Sciences.

Into the Unknown

The Xenobots are assembled from embryonic stem cells of the frog Xenopus laevis; in the frog, these cells would develop into skin. "They would be sitting on the outside of a tadpole, keeping out pathogens and redistributing mucus," says Michael Levin, a professor of biology and director of the Allen Discovery Center at Tufts University and co-leader of the new research. "But we're putting them into a novel context. We're giving them a chance to reimagine their multicellularity."

And what they imagine is something far different than skin. "People have thought for quite a long time that we've worked out all the ways that life can reproduce or replicate. But this is something that's never been observed before," says co-author Douglas Blackiston, the senior scientist at Tufts University who assembled the Xenobot "parents" and developed the biological portion of the new study.

"This is profound," says Levin. "These cells have the genome of a frog, but, freed from becoming tadpoles, they use their collective intelligence, a plasticity, to do something astounding." In earlier experiments, the scientists were amazed that Xenobots could be designed to achieve simple tasks. Now they are stunned that these biological objects—a computer-designed collection of cells—will spontaneously replicate. "We have the full, unaltered frog genome," says Levin, "but it gave no hint that these cells can work together on this new task," of gathering and then compressing separated cells into working self-copies.

"These are frog cells replicating in a way that is very different from how frogs do it. No animal or plant known to science replicates in this way," says Sam Kriegman, the lead author on the new study, who completed his Ph.D. in Bongard's lab at UVM and is now a post-doctoral researcher at Tuft's Allen Center and Harvard University's Wyss Institute for Biologically Inspired Engineering.

On its own, the Xenobot parent, made of some 3,000 cells, forms a sphere. "These can make children but then the system normally dies out after that. It's very hard, actually, to get the system to keep reproducing," says Kriegman. But with an artificial intelligence program working on the Deep Green supercomputer cluster at UVM's Vermont Advanced Computing Core, an evolutionary algorithm was able to test billions of body shapes in simulation—triangles, squares, pyramids, starfish—to find ones that allowed the cells to be more effective at the motion-based "kinematic" replication reported in the new research.
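
The search loop itself is conceptually simple, even though the real runs took months on a supercomputer. The sketch below is a generic evolutionary algorithm, not the authors' pipeline: their candidates were simulated Xenobot body shapes scored in a physics engine, whereas here the "shape" is a small binary grid and replication_score is a hypothetical stand-in for that physics-based fitness.

```python
# A generic evolutionary-search sketch (illustrative only; not the study's code).
# Candidates are 5x5 occupancy grids; `replication_score` is a placeholder for the
# physics-simulation fitness the real study used (how many generations of "children"
# a simulated design could assemble).
import random

GRID, POP_SIZE, GENERATIONS = 5, 50, 100

def random_shape():
    return [[random.randint(0, 1) for _ in range(GRID)] for _ in range(GRID)]

def mutate(shape, rate=0.05):
    # Flip each cell with a small probability to create a slightly different design.
    return [[cell ^ (random.random() < rate) for cell in row] for row in shape]

def replication_score(shape):
    # Placeholder objective: favor designs with roughly eight filled cells.
    filled = sum(sum(row) for row in shape)
    return -abs(filled - 8)

population = [random_shape() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    ranked = sorted(population, key=replication_score, reverse=True)
    parents = ranked[: POP_SIZE // 5]                     # keep the top 20% of designs
    children = [mutate(random.choice(parents)) for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=replication_score)
print(best, replication_score(best))
```

Swapping the placeholder objective for a full physics simulation is what turns a loop like this into the months-long supercomputer search Kriegman describes below.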

"We asked the supercomputer at UVM to figure out how to adjust the shape of the initial parents, and the AI came up with some strange designs after months of chugging away, including one that resembled Pac-Man," says Kriegman. "It's very non-intuitive. It looks very simple, but it's not something a human engineer would come up with. Why one tiny mouth? Why not five? We sent the results to Doug and he built these Pac-Man-shaped parent Xenobots. Then those parents built children, who built grandchildren, who built great-grandchildren, who built great-great-grandchildren." In other words, the right design greatly extended the number of generations.

Kinematic replication is well-known at the level of molecules—but it has never been observed before at the scale of whole cells or organisms.

"We've discovered that there is this previously unknown space within organisms, or living systems, and it's a vast space," says Bongard, a professor in UVM's College of Engineering and Mathematical Sciences. "How do we then go about exploring that space? We found Xenobots that walk. We found Xenobots that swim. And now, in this study, we've found Xenobots that kinematically replicate. What else is out there?"

Or, as the scientists write in the Proceedings of the National Academy of Sciences study: "life harbors surprising behaviors just below the surface, waiting to be uncovered."

Responding to Risk

Some people may find this exhilarating. Others may react with concern, or even terror, to the notion of a self-replicating biotechnology. For the team of scientists, the goal is deeper understanding.

"We are working to understand this property: replication. The world and technologies are rapidly changing. It's important, for society as a whole, that we study and understand how this works," says Bongard. These millimeter-sized living machines, entirely contained in a laboratory, easily extinguished, and vetted by federal, state and institutional ethics experts, "are not what keep me awake at night. What presents risk is the next pandemic; accelerating ecosystem damage from pollution; intensifying threats from climate change," says UVM's Bongard. "This is an ideal system in which to study self-replicating systems. We have a moral imperative to understand the conditions under which we can control it, direct it, douse it, exaggerate it."

Bongard points to the COVID epidemic and the hunt for a vaccine. "The speed at which we can produce solutions matters deeply. If we can develop technologies, learning from Xenobots, where we can quickly tell the AI: 'We need a biological tool that does X and Y and suppresses Z'—that could be very beneficial. Today, that takes an exceedingly long time." The team aims to accelerate how quickly people can go from identifying a problem to generating solutions—"like deploying living machines to pull microplastics out of waterways or build new medicines," Bongard says.

"We need to create technological solutions that grow at the same rate as the challenges we face," Bongard says.

And the team sees promise in the research for advancements toward regenerative medicine. "If we knew how to tell collections of cells to do what we wanted them to do, ultimately, that's regenerative medicine—that's the solution to traumatic injury, birth defects, cancer, and aging," says Levin. "All of these different problems are here because we don't know how to predict and control what groups of cells are going to build. Xenobots are a new platform for teaching us."

Reference: 

Sam Kriegman et al, Kinematic self-replication in reconfigurable organisms, Proceedings of the National Academy of Sciences (2021). DOI: 10.1073/pnas.2112672118