Your source for the latest science & space press releases


Thursday, 6 January 2022

Matter and antimatter seem to respond equally to gravity


As part of an experiment to measure—to an extremely precise degree—the charge-to-mass ratios of protons and antiprotons, the RIKEN-led BASE collaboration at CERN, Geneva, Switzerland, has found that, within the uncertainty of the experiment, matter and antimatter respond to gravity in the same way.

Matter and antimatter pose some of the most interesting problems in physics today. The two are essentially equivalent: where a particle carries a positive charge, its antiparticle carries a negative one, but in every other respect they appear identical. Yet one of the great mysteries of modern physics, known as "baryon asymmetry," is that despite this apparent equivalence, the universe seems to be made up almost entirely of matter, with very little antimatter. Naturally, scientists around the world are trying hard to find some difference between the two that could explain why we exist.

As part of this quest, scientists have explored whether matter and antimatter interact similarly with gravity, or whether antimatter would experience gravity in a different way than matter, which would violate Einstein's weak equivalence principle. Now, the BASE collaboration has shown, within strict boundaries, that antimatter does in fact respond to gravity in the same way as matter.

The finding, published in Nature, actually came from a different experiment, which was examining the charge-to-mass ratios of protons and antiprotons, one of the other important measurements that could determine the key difference between the two.

The measurements involved 18 months of work at CERN's antimatter factory. To make them, the team confined antiprotons and negatively charged hydrogen ions, which served as a proxy for protons, in a Penning trap. In this device, a particle follows a cyclical trajectory with a frequency, close to the cyclotron frequency, that scales with the trap's magnetic-field strength and the particle's charge-to-mass ratio. By feeding antiprotons and negatively charged hydrogen ions into the trap one at a time, the team was able to measure, under identical conditions, the cyclotron frequencies of the two particle types and so compare their charge-to-mass ratios. According to Stefan Ulmer, the leader of the project, "By doing this, we were able to obtain a result that they are essentially equivalent, to a degree four times more precise than previous measures. To this level of CPT invariance, causality and locality hold in the relativistic quantum field theories of the Standard Model."
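The frequency comparison at the heart of the experiment can be sketched with a quick back-of-envelope calculation. The Python below is purely illustrative, not the collaboration's analysis: it assumes a round 2 T trap field rather than BASE's actual parameters, and it neglects the H⁻ ion's small binding energies.

```python
import math

# Physical constants (CODATA values)
E_CHARGE = 1.602176634e-19      # elementary charge, C
M_PROTON = 1.67262192369e-27    # proton mass, kg
M_ELECTRON = 9.1093837015e-31   # electron mass, kg

def cyclotron_frequency(charge, mass, b_field):
    """Cyclotron frequency f_c = q*B / (2*pi*m) for a particle in a magnetic field."""
    return charge * b_field / (2 * math.pi * mass)

B = 2.0  # illustrative field in tesla, NOT the actual BASE trap value

# Antiproton: same mass as the proton; charge magnitude e
f_pbar = cyclotron_frequency(E_CHARGE, M_PROTON, B)

# H- ion: a proton plus two electrons (binding energies neglected here)
m_hminus = M_PROTON + 2 * M_ELECTRON
f_hminus = cyclotron_frequency(E_CHARGE, m_hminus, B)

# The H- ion is ~0.109% heavier, so its cyclotron frequency is ~0.109% lower;
# comparing the two frequencies compares the charge-to-mass ratios directly.
print(f"f_c(antiproton) = {f_pbar / 1e6:.3f} MHz")
print(f"f_c(H-)         = {f_hminus / 1e6:.3f} MHz")
print(f"frequency ratio = {f_pbar / f_hminus:.9f}")
```

The real experiment corrects this simple ratio for the H⁻ ion's electron binding energies and measures it to parts-per-trillion precision; the sketch only shows why the frequency ratio tracks the charge-to-mass ratio.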

Interestingly, the group used the measurements to test a fundamental physics law known as the weak equivalence principle. According to this principle, different bodies in the same gravitational field should undergo the same acceleration in the absence of frictional forces. Because the BASE experiment was placed on the surface of the Earth, the proton and antiproton cyclotron-frequency measurements were made in the gravitational field on the Earth's surface, and any difference between the gravitational interaction of protons and antiprotons would result in a difference between the cyclotron frequencies.

By sampling the gravitational field of the Earth as the planet orbited the Sun, the scientists found that matter and antimatter respond to gravity in the same way to within three parts in 100; in other words, the gravitational accelerations of matter and antimatter agree to within 3%.

Ulmer adds that these measurements could lead to new physics. He says, "The 3% accuracy of the gravitational interaction obtained in this study is comparable to the accuracy goal of the gravitational interaction between antimatter and matter that other research groups plan to measure using free-falling anti-hydrogen atoms. If the results of our study differ from those of the other groups, it could lead to the dawn of a completely new physics."

Reference: 

Stefan Ulmer, A 16-parts-per-trillion measurement of the antiproton-to-proton charge–mass ratio, Nature (2022). DOI: 10.1038/s41586-021-04203-w

Wednesday, 5 January 2022

Physicists watch as ultracold atoms form a crystal of quantum tornadoes


The world we experience is governed by classical physics. How we move, where we are, and how fast we're going are all determined by the classical assumption that we can only exist in one place at any one moment in time.

But in the quantum world, the behavior of individual atoms is governed by the eerie principle that a particle's location is a probability. An atom, for instance, has a certain chance of being in one location and another chance of being at another location, at the same exact time.

When particles interact, purely as a consequence of these quantum effects, a host of odd phenomena should ensue. But observing such purely quantum mechanical behavior of interacting particles amid the overwhelming noise of the classical world is a tricky undertaking.

Now, MIT physicists have directly observed the interplay of interactions and quantum mechanics in a particular state of matter: a spinning fluid of ultracold atoms. Researchers have predicted that, in a rotating fluid, interactions will dominate and drive the particles to exhibit exotic, never-before-seen behaviors.

In a study published today in Nature, the MIT team has rapidly rotated a quantum fluid of ultracold atoms. They watched as the initially round cloud of atoms first deformed into a thin, needle-like structure. Then, at the point when classical effects should be suppressed, leaving solely interactions and quantum laws to dominate the atoms' behavior, the needle spontaneously broke into a crystalline pattern, resembling a string of miniature, quantum tornadoes.

"This crystallization is driven purely by interactions, and tells us we're going from the classical world to the quantum world," says Richard Fletcher, assistant professor of physics at MIT.

The results are the first direct, in-situ documentation of the evolution of a rapidly-rotating quantum gas. Martin Zwierlein, the Thomas A. Frank Professor of Physics at MIT, says the evolution of the spinning atoms is broadly similar to how Earth's rotation spins up large-scale weather patterns.

"The Coriolis effect that explains Earth's rotational effect is similar to the Lorentz force that explains how charged particles behave in a magnetic field," Zwierlein notes. "Even in classical physics, this gives rise to intriguing pattern formation, like clouds wrapping around the Earth in beautiful spiral motions. And now we can study this in the quantum world."

The study's coauthors include Biswaroop Mukherjee, Airlia Shaffer, Parth B. Patel, Zhenjie Yan, Cedric Wilson, and Valentin Crépel, who are all affiliated with the MIT-Harvard Center for Ultracold Atoms and MIT's Research Laboratory of Electronics.

Spinning stand-ins


In the 1980s, physicists began observing a new family of matter known as quantum Hall fluids, which consists of clouds of electrons floating in magnetic fields. Instead of repelling each other and forming a crystal, as classical physics would predict, the particles adjusted their behavior to what their neighbors were doing, in a correlated, quantum way.

"People discovered all kinds of amazing properties, and the reason was, in a magnetic field, electrons are (classically) frozen in place—all their kinetic energy is switched off, and what's left is purely interactions," Fletcher says. "So, this whole world emerged. But it was extremely hard to observe and understand."

In particular, electrons in a magnetic field move in very small motions that are hard to see. Zwierlein and his colleagues reasoned that, because the motion of atoms under rotation occurs at much larger length scales, they might be able to use ultracold atoms as stand-ins for electrons and watch identical physics.

"We thought, let's get these cold atoms to behave as if they were electrons in a magnetic field, but that we could control precisely," Zwierlein says. "Then we can visualize what individual atoms are doing, and see if they obey the same quantum mechanical physics."

Weather in a carousel


In their new study, the physicists used lasers to trap a cloud of about 1 million sodium atoms, and cooled the atoms to temperatures of about 100 nanokelvins. They then used a system of electromagnets to generate a trap to confine the atoms, and collectively spun the atoms around, like marbles in a bowl, at about 100 rotations per second.

The team imaged the cloud with a camera, capturing a perspective similar to a child's when facing towards the center on a playground carousel. After about 100 milliseconds, the researchers observed that the atoms spun into a long, needle-like structure, which reached a critical, quantum thinness.

"In a classical fluid, like cigarette smoke, it would just keep getting thinner," Zwierlein says. "But in the quantum world, a fluid reaches a limit to how thin it can get."

"When we saw it had reached this limit, we had good reason to think we were knocking on the door of interesting, quantum physics," adds Fletcher, who, with Zwierlein, published the results up to this point in a previous Science paper. "Then the question was, what would this needle-thin fluid do under the influence of purely rotation and interactions?"

In their new paper, the team took their experiment a crucial step further, to see how the needle-like fluid would evolve. As the fluid continued to spin, they observed a quantum instability starting to kick in: The needle began to waver, then corkscrew, and finally broke into a string of rotating blobs, or miniature tornadoes—a quantum crystal, arising purely from the interplay of the rotation of the gas, and forces between the atoms.

"This evolution connects to the idea of how a butterfly in China can create a storm here, due to instabilities that set off turbulence," Zwierlein explains. "Here, we have quantum weather: The fluid, just from its quantum instabilities, fragments into this crystalline structure of smaller clouds and vortices. And it's a breakthrough to be able to see these quantum effects directly."

Reference: 

Martin Zwierlein, Crystallization of bosonic quantum Hall states in a rotating quantum gas, Nature (2022). DOI: 10.1038/s41586-021-04170-2. 

Richard J. Fletcher et al, Geometric squeezing into the lowest Landau level, Science (2021). DOI: 10.1126/science.aba7202

Chinese tokamak facility keeps plasma nearly five times as hot as the Sun's core for 17 minutes


Good news for fusion energy progress, and a new world record for the Chinese Academy of Sciences: its Experimental Advanced Superconducting Tokamak (EAST), or "artificial sun," has maintained a plasma at 70 million degrees Celsius (126 million °F) for 1,056 seconds.

High-temperature plasma is a critical part of many large-scale fusion energy initiatives, which attempt to replicate some of the conditions that make the Sun a powerful enough fusion reactor to warm our solar system, with the goal of eventually supplying safe, clean energy for humankind.

Heat can be viewed as an energetic vibration of atoms, and at ultra-high temperatures this vibration becomes so extreme that atoms randomly smash into one another with enough speed to jam their nuclei together, fusing them and creating a new atomic element.

If you're using lightweight atoms from the lower end of the periodic table – like the Sun does, fusing hydrogen into helium – the new atom weighs less than the original two combined, and the difference in mass is released as energy. At the core of the Sun, temperatures around 15 million °C (27 million °F) fuse about 620 million metric tons of hydrogen into about 616 million metric tons of helium every second, converting some 4 million tons of matter into energy.
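The arithmetic behind that figure is a direct application of E = mc². A minimal Python sanity check (rounded constants, not a solar model):

```python
# Rough check of the Sun's mass-to-energy conversion rate via E = m*c^2.
C = 2.998e8  # speed of light, m/s

hydrogen_in = 620e6 * 1000    # kg of hydrogen fused per second (620 million metric tons)
helium_out = 616e6 * 1000     # kg of helium produced per second
mass_converted = hydrogen_in - helium_out   # ~4 million metric tons, in kg

power = mass_converted * C**2  # watts
print(f"mass converted per second: {mass_converted:.1e} kg")
print(f"implied power output:      {power:.2e} W")
```

The result lands close to the measured solar luminosity of roughly 3.8 × 10²⁶ W, which is a reassuring consistency check on the quoted tonnages.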

A small proportion of this eventually reaches us here on Earth as electromagnetic radiation, supplying us with visible light, ultraviolet light, infra-red, radio waves, X-rays and gamma rays, and without this generous solar gift of energy, life as we know it would never have been possible.

Tokamak-style fusion reactors like the International Thermonuclear Experimental Reactor (ITER) obviously don't have the colossal scale and gravity of the Sun, but they aim to heat up hydrogen atoms – specifically, deuterium and tritium isotopes – to a point where they begin smashing together, fusing and releasing energy that can both be harvested, and sustain the reaction as additional hydrogen atoms are fed in.

ITER's target temperature is 150 million °C (270 million °F). China's EAST facility, which is a key contributor to the ITER project, has hit this mark already, reaching 160 million °C (288 million °F) for 20 seconds, and holding 120 million °C (216 million °F) for 101 seconds in separate experiments announced last May.

The latest experiment tested the Chinese tokamak's capability to endure extreme temperatures over longer periods, sustaining a plasma nearly five times as hot as the Sun's core for some 1,056 seconds, or 17 minutes and 36 seconds. Nobody had ever sustained a high-temperature plasma for 1,000 seconds before, so this is an important milestone.

It's natural to wonder how these insane temperatures can possibly exist on Earth without causing the entire tokamak facility to melt down or burn to a crisp. Essentially, the donut shape of the tokamak's inner chamber is lined with the most heat-resistant materials available – tungsten and carbon, for example. Since even these would be destroyed if exposed to hundreds of millions of degrees, the superheated plasma is squashed right into the middle of the chamber, as far from the walls as possible, using powerful magnetic fields.

Most importantly, though, these extraordinary temperatures are achieved in a tiny amount of plasma relative to the size of the chamber, so the energy dissipates rapidly before it reaches the walls.

It's important to clarify that EAST has not created a fusion reaction here, just a sustained, superheated plasma similar to the kind that will eventually be used to create fusion, so it's a long way from being energy-positive. Tokamak-style fusion is still many years from that lofty goal. The globe-spanning ITER project, already described as the most expensive science experiment of all time and the most complicated engineering project in human history, will vent the heat its fusion reactions generate rather than attempt to capture and use it.

Indeed, we'll likely have to wait for a "DEMO" class successor to the ITER facility, like the one planned by EUROfusion, before we see a large tokamak generating useful amounts of electricity. Where ITER is shooting for a Q value of 10 – putting in 50 MW of thermal energy and generating 500 MW of gross thermal output – the EU's DEMO reactor aims to put in 80 MW and generate some 2 GW, for a Q factor of 25.
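The Q value is simply gross thermal power out divided by heating power in, so the two targets above reduce to one line of arithmetic each. A trivial Python sketch:

```python
def q_factor(p_out_mw, p_in_mw):
    """Fusion energy gain Q: gross thermal power out divided by heating power in (both in MW)."""
    return p_out_mw / p_in_mw

# ITER target: 50 MW in, 500 MW gross thermal out
print(f"ITER Q = {q_factor(500, 50):.0f}")

# DEMO target: 80 MW in, ~2 GW (2000 MW) gross thermal out
print(f"DEMO Q = {q_factor(2000, 80):.0f}")
```

Note that Q here compares thermal power only; net electricity generation would also have to cover conversion losses and the plant's own consumption, which is why Q = 1 "breakeven" is far from a practical power station.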

That's currently planned to begin operation in 2051. Ah well, 29 years away is better than 30.


Tuesday, 4 January 2022

Black holes and dark matter—are they one and the same?


Primordial black holes created in the first instants after the Big Bang—tiny ones smaller than the head of a pin and supermassive ones covering billions of miles—may account for all of the dark matter in the universe.

That's the implication of a new model of the early universe created by astrophysicists at Yale, the University of Miami, and the European Space Agency (ESA). If proven true with data from the soon-to-launch James Webb Space Telescope, the discovery would transform scientists' understanding of the origins and nature of both dark matter and black holes.

Dark matter—which has never been directly observed—is thought to constitute the majority of matter in the universe and act as the unseen scaffolding upon which galaxies form and develop. Physicists have spent years testing a variety of dark matter candidates, including hypothetical particles such as sterile neutrinos, Weakly Interacting Massive Particles (WIMPS), and axions.

Black holes, on the other hand, have been observed. A black hole is a point in space where matter is so tightly compacted it creates intense gravity. Not even light can resist its pull. Black holes are found at the centers of most galaxies.

The new study, accepted for publication in The Astrophysical Journal, harkens back to a theory first proposed in the 1970s by physicists Stephen Hawking and Bernard Carr. At the time, Hawking and Carr argued that in the first fraction of a second after the Big Bang, tiny fluctuations in the density of the universe may have created an undulating landscape with "lumpy" regions that had extra mass. These lumpy areas would collapse into black holes.

Although the theory never gained traction within the wider scientific community, the new study suggests that, if modified slightly, it could have explanatory power after all.

If most of the primordial black holes were "born" at a size roughly 1.4 times the mass of Earth's sun, they could potentially account for all dark matter, said Yale professor of astronomy and physics Priyamvada Natarajan, the paper's theorist.

Natarajan and her colleagues say their new model shows that the first stars and galaxies would have formed around black holes in the early universe. Also, she said, primordial black holes would have had the ability to grow into supermassive black holes by feasting on gas and stars in their vicinity, or by merging with other black holes.

"Primordial black holes, if they do exist, could well be the seeds from which all supermassive black holes form, including the one at the center of the Milky Way," Natarajan said.

"What I find personally super exciting about this idea is how it elegantly unifies the two really challenging problems that I work on—that of probing the nature of dark matter and the formation and growth of black holes—and resolves them in one fell swoop," she added.

The James Webb telescope's mission will be to find the first galaxies that formed in the early universe and see stars forming planetary systems.

The new study's first author is Nico Cappelluti, a former Yale Center for Astronomy & Astrophysics Prize postdoctoral fellow who is now an assistant professor of physics at the University of Miami. Günther Hasinger, ESA's director of science, is the study's second author.

"Our study shows that without introducing new particles or new physics, we can solve mysteries of modern cosmology from the nature of dark matter itself to the origin of super-massive black holes," Cappelluti said.

Primordial black holes also may resolve another cosmological puzzle: the excess of infra-red radiation, synced with X-ray radiation, that has been detected from distant, dim sources scattered around the universe. Natarajan and her colleagues said growing, primordial black holes would present "exactly" the same radiation signature.

Best of all, the existence of primordial black holes can be proven—or disproven—in the near future, courtesy of the James Webb Space Telescope and ESA's Laser Interferometer Space Antenna (LISA) mission announced for the 2030s.

If dark matter is composed of primordial black holes, more stars and galaxies would have formed around them in the early universe—precisely the epoch that the James Webb telescope will be able to see. LISA, meanwhile, will be able to pick up gravitational wave signals from early mergers of primordial black holes.

"If the first stars and galaxies already formed in the so-called 'dark ages,' Webb should be able to see evidence of them," Hasinger said.

Natarajan added, "It was irresistible to explore this idea deeply, knowing it had the potential to be validated fairly soon."

Reference: 

Nico Cappelluti et al, Exploring the high-redshift PBH-ΛCDM Universe: early black hole seeding, the first stars and cosmic radiation backgrounds, arXiv:2109.08701 [astro-ph.CO]. arxiv.org/abs/2109.08701

Monday, 3 January 2022

Self‐healing crystal voids in double perovskite nanocrystal


From the Terminator to Spiderman's suit, self-repairing robots and devices abound in sci-fi movies. In reality, though, wear and tear reduce the effectiveness of electronic devices until they need to be replaced. What if the cracked screen of your mobile phone could heal itself overnight, or the solar panels providing energy to satellites could continually repair the damage caused by micro-meteorites?

The field of self-repairing materials is rapidly expanding, and what used to be science fiction might soon become reality, thanks to Technion—Israel Institute of Technology scientists who developed eco-friendly nanocrystal semiconductors capable of self-healing. Their findings, recently published in Advanced Functional Materials, describe how a group of materials called double perovskites displays self-healing properties after being damaged by the radiation of an electron beam. Perovskites, first discovered in 1839, have recently garnered scientists' attention thanks to unique electro-optical characteristics that make them highly efficient in energy conversion despite their inexpensive production. A special effort has gone into the use of lead-based perovskites in highly efficient solar cells.

The Technion research group of Professor Yehonadav Bekenstein from the Faculty of Material Sciences and Engineering and the Solid-State Institute at the Technion is searching for green alternatives to the toxic lead and engineering lead-free perovskites. The team specializes in the synthesis of nano-scale crystals of new materials. By controlling the crystals' composition, shape, and size, they change the material's physical properties.

Nanocrystals are the smallest material particles that remain naturally stable. Their size makes certain properties more pronounced and enables research approaches that would be impossible on larger crystals, such as imaging using electron microscopy to see how atoms in the materials move. This was, in fact, the method that enabled the discovery of self-repair in the lead-free perovskites.

The perovskite nanoparticles were produced in Prof. Bekenstein's lab using a short, simple process that involves heating the material to 100°C for a few minutes. When Ph.D. students Sasha Khalfin and Noam Veber examined the particles using a transmission electron microscope, they discovered the exciting phenomenon. The high voltage electron beam used by this type of microscope caused faults and holes in the nanocrystals. The researchers were then able to explore how these holes interact with the material surrounding them and move and transform within it.

They saw that the holes moved freely within the nanocrystal but avoided its edges. The researchers developed code that analyzed dozens of videos made using the electron microscope to understand the movement dynamics within the crystal. They found that holes formed on the surface of the nanoparticles and then moved to energetically stable areas inside. The holes' inward movement was hypothesized to be driven by the organic molecules coating the nanocrystals' surface. Once these organic molecules were removed, the group discovered, the crystal spontaneously ejected the holes to the surface and out, returning to its original pristine structure—in other words, the crystal repaired itself.

This discovery is an important step towards understanding the processes that enable perovskite nanoparticles to heal themselves, and paves the way to their incorporation in solar panels and other electronic devices.

Reference: 

Sasha Khalfin et al, Self‐Healing of Crystal Voids in Double Perovskite Nanocrystals Is Related to Surface Passivation, Advanced Functional Materials (2021). DOI: 10.1002/adfm.202110421

Quantum imaging: Pushing the boundaries of optics


Quantum mechanically entangled light particles break down the boundaries of conventional optics and allow a glimpse into previously invisible wavelength ranges, thus bringing about new possibilities for imaging techniques, microscopy and spectroscopy. Unearthing these possibilities and creating technological solutions was the goal of the Fraunhofer lighthouse project QUILT, the results of which are now available.

Light can do some amazing things. For example, light particles (photons) can be entangled upon creation, which connects them inextricably to one another in terms of their properties, not just across great distances, but also across different wavelength ranges. These entangled photons are the tools used by the Fraunhofer researchers in the project "QUILT—Quantum Methods for Advanced Imaging Solutions." They are using the photons to develop quantum optical solutions for wavelength ranges that have thus far proven to be virtually inaccessible. These wavelengths provide us with valuable information beyond the light of the visible spectrum: Short-wave ultraviolet radiation, for example, can be used to make the tiniest structures in cells visible. Infrared radiation provides information about noxious gases in the air or the composition of plastics, and long-wave terahertz radiation can be used to precisely determine the thickness of coating and paint layers. There is thus great potential in the fields of biomedical diagnostics, material testing or process and environmental analytics. The only issue is that creating and detecting these light waves requires significantly more resources than those used in imaging techniques for visible ranges.

New detection principle for different methods


For four years, teams of researchers from six Fraunhofer Institutes have been working with external organizations, supported by an advisory board with representatives from industry and science, to find ways of using the entangled photon pairs in different measurement methods in imaging, spectroscopy and metrology—making the invisible visible. The underlying principle is that while one photon has a wavelength that can be captured on camera, the other is designed to interact with the object under examination in the invisible range. The entanglement, dubbed by Einstein as "spooky action at a distance," means that the information gathered by the second photon is transferred to the first, making it visible to the camera.

During this project, the partners have done important pioneering work for scientific and technical development in this relatively new field. The first ever use of the new detection principle for terahertz radiation was demonstrated. This could, for example, improve methods for investigating materials in the future. A quantum optical counterpart has been developed to the classical Fourier-transform infrared (FTIR) spectrometer, which is used in areas such as process analytics to examine gas samples. The project also produced the first ever video created using imaging with undetected light and the world's first 2D image captured and reconstructed using "quantum ghost imaging" with asynchronous detection. Above all, ghost imaging is well suited to biological and medical applications, in which light-sensitive cell samples can be observed over a long period because the new processes use less light. This can help to improve diagnosis.

Cornerstone for industrial applications


The project has resulted in the submission and granting of seven patents, high-profile scientific publications and demonstrators for quantum-based imaging, spectroscopy and optical tomography. The researchers intend to use these to continue exploring new unconventional fields of application for quantum-based methods together with industry partners. Innovative branches of industry, such as environmental technology and medical engineering, are of particular interest. For the exchange of ideas in the international scientific community, in 2018 the QUILT consortium initiated an annual series of seminars, "Sensing with Quantum Light," which has become the leading platform in the field.


“Invisibility Cloaks” May Soon Be Real: Creating Invisibility With Superconducting Materials


Invisibility devices may soon no longer be the stuff of science fiction. A new study published in the De Gruyter journal Nanophotonics by lead authors Huanyang Chen at Xiamen University, China, and Qiaoliang Bao suggests the use of the material molybdenum trioxide (α-MoO3) to replace expensive and difficult-to-produce metamaterials in the emerging technology of novel optical devices.

The idea of an invisibility cloak may sound more like magic than science, but researchers are currently hard at work producing devices that can scatter and bend light in such a way that it creates the effect of invisibility.

Thus far these devices have relied on metamaterials—a material that has been specially engineered to possess novel properties not found in naturally occurring substances or in the individual particles of that material—but the study by Chen and co-authors suggests the use of α-MoO3 to create these invisibility devices.

Possessing some unique properties, this material can provide an excellent platform for controlling energy flow. The team's simulation results showed that when cylindrical or rolled-up α-MoO3 structures replace metamaterials, the simplified invisibility concentrator achieves the electromagnetic invisibility and energy-concentration effects that a near-perfect invisibility device would demonstrate.

As a result, the study shows that hyperbolic materials such as α-MoO3 and vanadium pentoxide (V2O5) could serve as a new basis for transformation optics, opening the possibility of photonic devices beyond invisibility concentrators, including improved infrared imaging and detection systems.

Transformation optics has been a hot topic in physics over recent decades thanks to the discovery that the path light takes through a continuous medium can be the same as its propagation through a curved space that has undergone a coordinate transformation.

The consequence of this is that the behavior of light can be manipulated as it passes through a material, something that has led to the creation of a multitude of novel optical devices, such as invisibility cloaks—a camouflage material that could cover an object and bend light around it making it almost disappear—and other optical illusion devices.

"It is the first time that 2D materials have been used for transformation optical devices. Usually, we need metamaterials but this is much simpler," says Chen. The researcher continued by explaining that the first application for the results of this study might be a large size energy concentrator capable of improving such devices. "We are now performing experiments by rolling up the α-MoO3, the results of which we hope will appear very soon."

Reference: 

Tao Hou et al, Invisibility concentrator based on van der Waals semiconductor α-MoO3, Nanophotonics (2021). DOI: 10.1515/nanoph-2021-0557

Sunday, 2 January 2022

What the thermodynamics of clocks tell us about the mysteries of time


Surprising new insights about the strange physics underlying how clocks work could transform our understanding of time’s arrow – and hint at how time works at the quantum scale

A CENTURY ago, two intellectual giants met to debate the nature of time. One was the French philosopher Henri Bergson, a superstar whose fans caused the first traffic jam on Broadway in New York as they flocked to one of his earlier appearances. He believed there was more to time than something that can be measured by clocks, captured by mathematics or explained by psychology. He argued that the way we experience it, with a duration and direction, could only be revealed through philosophy.

Bergson’s opponent, a physicist called Albert Einstein, disagreed. After developing his theories of relativity he believed time was a physical entity, separate from human consciousness, that could speed up or slow down. Einstein thought that time was interwoven in space in a static cosmos called the block universe which lacks a clear past, present or future.

Almost 100 years later, the question of why the time we perceive is so different from the time postulated in physics is still hotly debated. Now, fresh clues are starting to suggest the devices we use to measure time might be crucial to arriving at an answer.

Those clues relate to the fact that in general relativity, clocks are incorporated into the theory as perfectly idealised objects, with smooth readings that are accurate no matter how much you zoom in, when they actually are anything but. “Clocks are physical things which are made up of physical systems, and so we kind of know that idealisation can’t be right,” says Emily Adlam at the Rotman Institute of Philosophy at Western University in Canada. “A more realistic understanding of clocks may ultimately be the key to understanding time.”

We can measure time using anything that goes through a change – sundials use the shifting sun, water clocks tap the flow of water and even the temperature of a cup of tea can help us estimate when it was brewed. Today, we mostly use sophisticated mechanical and atomic clocks, which can measure time much more accurately than a cup of tea, because they tick reliably with a certain frequency.

Since astronomer Christiaan Huygens invented the first pendulum clock in the 17th century, we have been steadily improving the accuracy of scientific clocks, with phenomenal results. Nowadays, the best machines can measure each second so accurately that they wouldn’t miss a beat in 20 billion years, longer than the age of the universe. Impressive. But it turns out there may be a price to pay for such accuracy.
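To put that figure in context, losing at most one second over 20 billion years corresponds to a fractional inaccuracy of roughly one part in a billion billion. A quick back-of-the-envelope check (the 20-billion-year figure comes from the text; the rest is plain arithmetic):

```python
# If a clock drifts by at most one second over 20 billion years, its
# fractional inaccuracy is one second divided by that span in seconds.
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # ~3.16e7 seconds
span_s = 20e9 * SECONDS_PER_YEAR             # 20 billion years, in seconds
fractional_error = 1.0 / span_s
print(f"fractional inaccuracy ~ {fractional_error:.1e}")   # ~1.6e-18
```

That is consistent with the stability reported for today's best optical atomic clocks, which sit at the level of parts in 10^18.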

To produce their ticks, clocks need a source of energy. A grandfather clock must be wound up and a wall clock is powered by a battery. The most accurate atomic clocks, with ticks that correspond to electromagnetic signals given off by atoms changing energy levels, are driven by high-powered lasers.

This isn’t surprising. But rather than just requiring energy to run each mechanical part, new research suggests something more might be at play. Clocks could be a type of so-called thermodynamic machine, with fundamental constraints on their performance set by the underlying physics. If this is true, not only will it mean there could be a limit to how accurately we can measure time’s passing, it “will have a huge impact on how philosophers think about time”, says Gerard Milburn, a quantum physicist at the University of Queensland, Australia.

We know of two types of thermodynamic machine. The first comprises heat engines – things like fridges and combustion engines – which have a maximum efficiency set by thermodynamics. The second group encompasses information storage devices, like DNA and hard discs. In these, thermodynamics tells us the cost of erasing information. If clocks are a third, it would mean there are limits on how accurately we can tell the time, due to the involvement of energy’s messy cousin, entropy.

The maximum efficiency of heat engines was determined by engineer Sadi Carnot in 1824, before entropy was defined. But his calculation paved the way for the discovery of the second law of thermodynamics, which says any closed system – something that nothing can enter or leave – will increase in entropy, a measure of disorder or randomness, over time.

Low entropy means high order. If the atoms in a box of gas clustered in one corner rather than being spread out chaotically, entropy would be low. But because there are fewer ways for atoms to be ordered than disordered, making the latter more likely, closed systems – like the universe – tend towards disorder. A cup of hot tea loses heat to its surroundings, raising overall entropy, but never spontaneously heats up. This creates an arrow of time.
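The counting argument behind this can be made concrete with a toy model (a sketch for illustration, not from the article): imagine N atoms that can each sit in either half of a box. There is exactly one arrangement with every atom on the left, but 2^N arrangements in total, and Boltzmann's formula S = k_B ln W turns that count into an entropy.

```python
import math

def prob_all_in_one_half(n_atoms):
    # Each atom independently picks a half, so only 1 of 2**n
    # arrangements is fully ordered.
    return 0.5 ** n_atoms

def boltzmann_entropy(n_microstates, k_b=1.0):
    # S = k_B * ln(W), here in units of k_B
    return k_b * math.log(n_microstates)

print(prob_all_in_one_half(100))    # ~7.9e-31: ordered states are vanishingly rare
print(boltzmann_entropy(2 ** 100))  # ~69.3 k_B for the full, disordered state space
```

Even for just 100 atoms, the odds of spontaneously finding the ordered arrangement are about one in 10^30, which is why closed systems drift towards disorder and never back.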

The second law is the only law of physics that is irreversible in time. Because of this, thermodynamics is used to explain the arrow of time we perceive. But the second law doesn’t tell the whole story. There is still the question of why we only ever experience time moving forwards – many physicists today argue that this is simply an illusion.

If the arrow of time in thermodynamics could be linked to the practical reality of measuring time, then thermodynamics could help explain how we perceive time after all, says Adlam. What is needed, she says, is a direct link between thermodynamics and practical timekeeping – something explaining why all clocks run in the same direction as the entropy increase of the universe. Find this link, and we might just answer some of the questions Einstein and Bergson were at odds over. In search of this connection, a handful of researchers are turning to clocks.

A few years ago, Paul Erker at the Institute for Quantum Optics and Quantum Information in Vienna, Austria, teamed up with Marcus Huber at the Vienna University of Technology in an attempt to understand what clocks really are. They started off by modelling quantum clocks, simple systems in which the flow of energy is easy to track. In a 2017 paper, they and their colleagues showed a clock made of just three atoms – one hot, one cold and one “ticking” thanks to the energy flow between the other two – should dissipate more energy the more accurate it is. This was a big step, but still purely theoretical. By 2020, they were ready to test it.

Teaming up with researchers including Natalia Ares at the University of Oxford and Edward Laird at Lancaster University, both in the UK, the researchers built a simple pendulum clock from a suspended membrane of silicon nitride with a thickness of about 50 nanometres. “You could think of it more like a drum than a pendulum,” says Laird. They made their tiny drum vibrate, with each vibration corresponding to one tick of the clock. The strength of the vibrations could be increased by applying an electric field. To determine the clock’s accuracy – how regularly the ticks occurred – they connected it to an electrical circuit including a voltmeter. “It is a beautiful experiment,” says Milburn.

The crux of that experiment was that the clock became more accurate as more energy was supplied to the drum. And the more accurate it was, the more entropy it produced. This was the first result to explain why clocks move forwards in time, because as they measure time, they increase entropy, an irreversible process. “This research gives a very nice explicit link between the thermodynamic arrow of time and perceptual time,” says Adlam.

Carlo Rovelli at Aix-Marseille University in France agrees the work sharpens our understanding of the strict relationship between time and heat. “Simply put, if there is no heat involved, there is no way to distinguish the past from the future,” he says. The research strengthens his thermal time hypothesis, which argues that time emerges from the laws of thermodynamics on the macroscopic scale of humans, regardless of what is going on at a microscopic level.

Crucially, the research also shows that the arrow of time isn’t something only humans can experience. “It doesn’t really matter if it’s a conscious agent who observes the clock or a device, such as a detector,” says Huber. The entropy still increases. “It’s true for anything.” Rather than being a consequence of our consciousness, this suggests the way we perceive time may be physically built into the process of timekeeping. If so, Bergson’s argument falls apart and Einstein looks right to have believed time is a physical entity.

This isn’t the first time a link between energy cost and the accuracy of clocks has been explored. A similar relationship between accuracy and energy cost has been seen in the biochemical clocks that operate inside ocean-dwelling cyanobacteria, helping them generate the chemicals needed for photosynthesis early in the morning before the sun rises. The parallel isn’t exact, though, partly because these are living organisms, not mechanical clocks. “Evolution probably places additional constraints on what it means for a clock to be good, beyond the energetic constraints of precision,” says Jordan Horowitz at the University of Michigan.

But not all clocks entirely follow the rules, it would seem. The most accurate atomic clocks appear more efficient than the research predicts. These clocks involve complex circuits, detectors and feedback, making their energy flow difficult to model. Both Erker and Huber are confident they will be shown to obey the same constraint. “I’m not able to prove this statement yet,” says Erker. “But my hunch definitely goes in this direction.”

If he’s right, it would have meaning beyond proving an arrow of time exists outside of our consciousness. The link between clocks and thermodynamics may also reflect time on a smaller scale. If there is a limit on how accurately we can resolve time, could this be a sign that time itself isn’t perfectly smooth, but instead lumpy – packed into tiny units in the same way that light comes in photons?

Answering this could be tricky. To probe space-time at this tiniest of scales, below those we can currently reach with our best particle accelerators, would require vast amounts of energy. At a certain level of energy, you would expect to create a black hole that would swallow the entire experiment, suggesting it is impossible to resolve time perfectly. “You end up with a sort of fundamental limit on the sensitivity to which you can measure a time interval,” says Adlam. This might be related to the limit caused by thermodynamics, she says, but the link isn’t clear yet.

Probing time at minuscule scales is exciting, but what Huber is most thrilled about relates to quantum mechanics and a mystery called the measurement problem. “I have a long-standing obsession with it,” he says.

Unlike relativity, in which time is local and relative, quantum mechanics assumes there is a universal background time. Time in quantum mechanics doesn’t have an arrow: equations work equally well forwards as backwards in time. But sometimes this reversibility can be broken. When we measure a quantum system, the act of measuring causes the system to collapse from a superposition, a mix of different possible states, into a specific outcome. This cannot be reversed, creating an arrow of time. How time manages both to have, and not have, an arrow is just one of quantum mechanics’ many puzzles. But if the thermodynamic arrow can explain our perceptual time arrow, maybe it can explain the quantum one too.

This is what Huber wants to tackle next. We know that whenever we measure something, we affect it, but the nitty-gritty of this process is often ignored in quantum mechanics. According to Huber, the act of measuring should create a flow of energy that may be best described by the laws of thermodynamics. “I think the measurement postulate is the second law in disguise,” he says. Perhaps quantum measurements, like clocks, create rising entropy and hence an emergent arrow of time.

Erker, on the other hand, points out the research could also help to test ideas that combine the notoriously clashing theories of quantum mechanics and general relativity into a quantum theory of gravity. Such tests are extremely hard because gravity is so weak. You either need to put massive objects in a quantum superposition state to probe gravitational effects – which is tricky, and has so far only been done with molecules of up to 2000 atoms – or you need to be able to make incredibly precise measurements, and quantum clocks could help with that. “If we could build clocks that are accurate on very short timescales, we could actually build tabletop quantum experiments that test for gravitational effects,” says Erker.

Any theory that explains gravity and quantum mechanics needs to describe how clocks work on the quantum scale. “All this research on understanding what clocks really are and how they kind of interact with quantum mechanics and with relativity is probably an important step to understanding how those theories fit together,” says Adlam.

Bergson and Einstein’s debate cost the physicist the Nobel prize for general relativity. The president of the Nobel committee said that, while it was complex, “it will be no secret that the famous philosopher Bergson in Paris has challenged this theory”. Instead, Einstein won for the less-glamorous photoelectric effect. But a century on, Einstein now seems the real winner of the debate. The next question is whether there will ever be a way to merge his theory of general relativity with quantum mechanics. On that, only time will tell.

Source: Link

Saturday, 1 January 2022

Large Hadron Collider will reach for the edge of physics


THE Large Hadron Collider (LHC) at CERN near Geneva, Switzerland, will start running again after a three-year shutdown and delays due to the covid-19 pandemic. The particle collider – known for its role in the discovery of the Higgs boson, which gives mass to all other fundamental particles – will return in 2022 with upgrades that give it a power boost.

Work has been under way to conduct tests on the collider and calibrate new equipment. Now, it is gearing up for experiments that could give physicists the data needed to expand the standard model, our best description of how particles and forces interact.

Phil Allport at the University of Birmingham in the UK says the upgrades could allow new measurements that give us insight into the way the Higgs boson decays, leading to a better understanding of how it fits into the standard model.

“These measurements shed light on what’s happening at the highest energies that we can reach, which tells us about phenomena in the very early universe,” he says. They will also allow us to test ideas that try to account for things that aren’t fully described by the standard model, he says.

This includes mysteries that have plagued physicists for decades, such as the so-called hierarchy problem, which deals with the vast discrepancy between the mass of the Higgs and those of other fundamental particles, plus dark energy and dark matter, the unexplained phenomena that make up most of the universe.

“All of these things require extensions to the standard model of particle physics to accommodate, and all of those theories make predictions. And the best place to look to test those predictions is usually in the highest energies achievable,” says Allport. He says the LHC upgrades also pave the way to entirely new observations that signal a departure from the standard model.

Part of the upgrade work has been to increase the power of the injectors that supply highly accelerated particle beams to the collider. Prior to the last shutdown in 2018, protons could reach an energy of 6.5 teraelectronvolts, but the upgrades mean this can now be pushed to 6.8 teraelectronvolts.
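Since the two proton beams collide head-on, the collision energy is simply the sum of the two beam energies (the protons' rest mass is negligible at this scale). A quick check of what the upgrade buys, using the beam energies quoted above:

```python
# Center-of-mass energy of two counter-rotating beams colliding head-on
# is the sum of the two beam energies (beam rest masses are negligible).
def collision_energy_tev(beam_energy_tev):
    return 2 * beam_energy_tev

before = collision_energy_tev(6.5)   # pre-2018 runs: 13.0 TeV
after = collision_energy_tev(6.8)    # after the upgrade: 13.6 TeV
gain_pct = 100 * (after - before) / before
print(f"{after} TeV, a {gain_pct:.1f}% increase")
```

A 4.6 per cent rise in collision energy may sound modest, but the rates of the rarest, heaviest processes grow much faster than linearly with energy, so the gain in physics reach is larger than the raw number suggests.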

Rende Steerenberg at CERN says that these more powerful beams will cause collisions at higher energies than ever before, and other upgrades in the future will also allow more particles to be collided at the same time.

There are already plans for further improvements in 2024, which will narrow the LHC’s beams and drastically increase the number of collisions that take place. The 2018 run saw around 40 collisions every time a pulse of protons passed each other, but upgrades will push this to between 120 and 250. At that point, the LHC will take on a new name, the High Luminosity Large Hadron Collider, and it should begin experiments in 2028.

There are still many tests to be run before the power of the new components can be unleashed. Scientists at CERN hope to finish these by late February and then slowly ramp up to a small number of full-power collisions in May. The frequency of these collisions will be increased in June, which is when Steerenberg says “meaningful” physics will begin.

Source: Link

Friday, 31 December 2021

Using Magnets To Toggle Nanolasers Leads to Better Photonics


Controlling nanolasers with magnets lays the groundwork for more robust optical signaling.

A magnetic field can be used to switch nanolasers on and off, shows new research from Aalto University. The physics underlying this discovery paves the way for the development of optical signals that cannot be disturbed by external disruptions, leading to unprecedented robustness in signal processing.

Lasers concentrate light into extremely bright beams that are useful in a variety of domains, such as broadband communication and medical diagnostics devices. About ten years ago, extremely small and fast lasers known as plasmonic nanolasers were developed. These nanolasers are potentially more power-efficient than traditional lasers, and they have been of great advantage in many fields—for example, nanolasers have increased the sensitivity of biosensors used in medical diagnostics.

So far, switching nanolasers on and off has required manipulating them directly, either mechanically or with the use of heat or light. Now, researchers have found a way to remotely control nanolasers.

“The novelty here is that we are able to control the lasing signal with an external magnetic field. By changing the magnetic field around our magnetic nanostructures, we can turn the lasing on and off,” says Professor Sebastiaan van Dijken of Aalto University.

The team accomplished this by making plasmonic nanolasers from different materials than normal. Instead of the usual noble metals, such as gold or silver, they used magnetic cobalt-platinum nanodots patterned on a continuous layer of gold and insulating silicon dioxide. Their analysis showed that both the material and the arrangement of the nanodots in periodic arrays were required for the effect.

Photonics advances towards extremely robust signal processing


The new control mechanism may prove useful in a range of devices that make use of optical signals, but its implications for the emerging field of topological photonics are even more exciting. Topological photonics aims to produce light signals that are not disturbed by external disruptions. This would have applications in many domains by providing very robust signal processing.

“The idea is that you can create specific optical modes that are topological, that have certain characteristics which allow them to be transported and protected against any disturbance,” explains van Dijken. “That means if there are defects in the device or because the material is rough, the light can just pass them by without being disturbed, because it is topologically protected.”

So far, creating topologically protected optical signals using magnetic materials has required strong magnetic fields. The new research shows that the effect of magnetism in this context can be unexpectedly amplified using a nanoparticle array of a particular symmetry. The researchers believe their findings could point the way to new, nanoscale, topologically protected signals.

“Normally, magnetic materials can cause a very minor change in the absorption and polarization of light. In these experiments, we produced very significant changes in the optical response – up to 20 percent. This has never been seen before,” says van Dijken.

Academy Professor Päivi Törmä adds that “these results hold great potential for the realization of topological photonic structures wherein magnetization effects are amplified by a suitable choice of the nanoparticle array geometry.”

These findings are the result of a long-lasting collaboration between the Nanomagnetism and Spintronics group led by Professor van Dijken and the Quantum Dynamics group led by Professor Törmä, both in the Department of Applied Physics at Aalto University.

Reference:

Freire-Fernández, F., Cuerda, J., Daskalakis, K.S. et al. Magnetic on–off switching of a plasmonic laser. Nat. Photon. 16, 27–32 (2022). DOI: 10.1038/s41566-021-00922-8

Wednesday, 29 December 2021

Birds Have a Mysterious 'Quantum Sense'. Scientists Have Now Seen It in Action


Seeing our world through the eyes of a migratory bird would be a rather spooky experience. Something about their visual system allows them to 'see' our planet's magnetic field, a clever trick of quantum physics and biochemistry that helps them navigate vast distances.

In early 2021, scientists from the University of Tokyo announced they had, for the first time ever, directly observed a key reaction hypothesized to be behind birds' (and many other creatures') talents for sensing the direction of Earth's poles.

Importantly, this is evidence of quantum physics directly affecting a biochemical reaction in a cell – something we've long hypothesized but haven't seen in action before. 

Using a tailor-made microscope sensitive to faint flashes of light, the team watched a culture of human cells containing a special light-sensitive material respond dynamically to changes in a magnetic field.

The change the researchers observed in the lab matched what would be expected if a quirky quantum effect was responsible for the illuminating reaction.

"We've not modified or added anything to these cells," said biophysicist Jonathan Woodward.

"We think we have extremely strong evidence that we've observed a purely quantum mechanical process affecting chemical activity at the cellular level."

So how are cells, particularly human cells, capable of responding to magnetic fields?

While there are several hypotheses out there, many researchers think the ability is due to a unique quantum reaction involving photoreceptors called cryptochromes.

Cryptochromes are found in the cells of many species and are involved in regulating circadian rhythms. In species of migratory birds, dogs, and other creatures, they're linked to the mysterious ability to sense magnetic fields.

In fact, while most of us can't see magnetic fields, human cells definitely contain cryptochromes. And there's evidence that, even though it's not a conscious ability, humans are still capable of detecting Earth's magnetism.

To see the reaction within cryptochromes in action, the researchers bathed a culture of human cells containing cryptochromes in blue light, causing them to fluoresce weakly. As they glowed, the team swept magnetic fields of various frequencies repeatedly over the cells.

They found that each time the magnetic field passed over the cells, their fluorescence dipped around 3.5 percent – enough to show a direct reaction.

How can a magnetic field affect a photoreceptor? It all comes down to something called spin – an innate property of electrons.

We already know that spin is significantly affected by magnetic fields. Arrange electrons in the right way around an atom, and collect enough of them together in one place, and the resulting mass of material can be made to move using nothing more than a weak magnetic field like the one that surrounds our planet.

This is all well and good if you want to make a needle for a navigational compass. But with no obvious signs of magnetically-sensitive chunks of material inside pigeon skulls, physicists have had to think smaller.

In 1975, a Max Planck Institute researcher named Klaus Schulten developed a theory on how magnetic fields could influence chemical reactions. 

It involved something called a radical pair. A garden-variety radical is an atom or molecule with an electron in its outer shell that isn't partnered with a second electron.

Sometimes, these bachelor electrons can adopt a wingman in another atom to form a radical pair. The two stay unpaired, but thanks to a shared history are considered entangled, which in quantum terms means their spins will eerily correspond no matter how far apart they are.

Since this correlation can't be explained by ongoing physical connections, it's purely a quantum activity, something even Albert Einstein considered 'spooky'. 

In the hustle-bustle of a living cell, their entanglement will be fleeting. But even these briefly correlating spins should last just long enough to make a subtle difference in the way their respective parent atoms behave.

In this experiment, as the magnetic field passed over the cells, the corresponding dip in fluorescence suggests that the generation of radical pairs had been affected.
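Schulten's radical-pair idea can be sketched in a few lines of code (a minimal two-spin model with ħ = 1, ignoring hyperfine couplings; the numbers are illustrative, not from the study). When a magnetic field makes the two entangled electrons precess at slightly different rates, the pair oscillates between its "singlet" and "triplet" spin arrangements, and because those two forms react differently, the field leaves a mark on the chemistry:

```python
import cmath
import math

def singlet_prob(delta_omega, t):
    """Probability that a radical pair born as a singlet is still a
    singlet after time t, when the field makes the two electron spins
    precess at rates differing by delta_omega.

    In the |up,down>, |down,up> subspace the Zeeman Hamiltonian is
    diagonal with energies +/- delta_omega/2 (hbar = 1), so the two
    components of the singlet simply pick up opposite phases.
    """
    a_ud = cmath.exp(-1j * delta_omega * t / 2) / 2 ** 0.5   # |up,down> amplitude
    a_du = -cmath.exp(+1j * delta_omega * t / 2) / 2 ** 0.5  # |down,up> amplitude
    overlap = (a_ud - a_du) / 2 ** 0.5   # projection back onto the singlet
    return abs(overlap) ** 2             # works out to cos^2(delta_omega * t / 2)

print(singlet_prob(0.0, 5.0))             # no field difference: stays a singlet, 1.0
print(singlet_prob(0.5, math.pi / 0.5))   # fully converted to triplet, ~0.0
```

The singlet probability oscillates as cos²(Δω·t/2), so even a weak field, by nudging Δω, shifts how much of the pair population reacts from each form – which is the kind of change the fluorescence dip is thought to reveal.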

An interesting consequence of the research is that even weak magnetic fields could indirectly affect other biological processes. While evidence that magnetism affects human health is weak, experiments like this one could prove to be another avenue for investigation.

"The joyous thing about this research is to see that the relationship between the spins of two individual electrons can have a major effect on biology," said Woodward.

Of course, birds aren't the only animals to rely on our planet's magnetosphere for direction. Species of fish, worms, insects, and even some mammals have a knack for it. We humans might even be cognitively affected by Earth's faint magnetic field.

Evolution of this ability could have delivered a number of vastly different mechanisms based on different physics.

Having evidence that at least one of them connects the weirdness of the quantum world with the behavior of a living thing is enough to force us to wonder what other bits of biology arise from the spooky depths of fundamental physics.

This research was published in PNAS.

Source: Link

Tuesday, 28 December 2021

Large helium nanodroplets splash like water upon surface collisions


While working with helium nanodroplets, scientists at the Department of Ion Physics and Applied Physics led by Fabio Zappa and Paul Scheier have come across a surprising phenomenon: When the ultracold droplets hit a hard surface, they behave like drops of water. Ions with which they were previously doped thus remain protected on impact and are not neutralized.

At the Department of Ion Physics and Applied Physics, Paul Scheier's research group has been using helium nanodroplets to study ions with methods of mass spectrometry for around 15 years. Using a supersonic nozzle, tiny, superfluid helium nanodroplets can be produced with temperatures of less than one kelvin. They can very effectively be doped with atoms and molecules. In the case of ionized droplets, the particles of interest are attached to the charges, which are then measured in the mass spectrometer. During their experiments, the scientists have now stumbled upon an interesting phenomenon that has fundamentally changed their work. "For us, this was a game-changer," says Fabio Zappa from the nano-bio-physics team. "Everything at our lab is now done with this newly discovered method." The researchers have now published the results of their studies in Physical Review Letters.

A surprising phenomenon


When charged particles are fired at a metal plate, the particles are normally neutralized by the many free electrons on the metal surface. They can then no longer be measured in the mass spectrometer. But when the ions are packed in a helium nanodroplet, they remain protected on impact and fly off in all directions with a few weakly bound helium atoms. "The ions are apparently protected by the helium," Zappa says. He doesn't yet fully understand the underlying mechanism. "But there is some evidence that the helium loses its superfluid property before impact and then behaves like a liquid, splashing away from the surface and only then partially evaporates." Another possible reason could be that the first droplets evaporate at the surface, forming a layer of gas that slows down subsequent droplets and, in this way, protects them from evaporation. Only further investigations will show if one of these explanations is correct or if there are other reasons. The fact that this method also works with negative ions, which are normally very fragile, suggests to the scientists that the previously unknown effect is a strong one.

Nanotechnology benefits


With this discovery, Paul Scheier's team not only improved their own measurement method, but also gained important insights for other research groups that, for example, deal with the deposition of nanoparticles on surfaces. "Metal nanoparticles are a great example of this," Scheier recounts. "In many modern technologies, metal nanoparticles are found that have very specific properties." The fact that the generation of such nanofilms can often be very inefficient could also be related to the phenomenon now discovered in Innsbruck.

Reference: 

Paul Martini et al, Splashing of Large Helium Nanodroplets upon Surface Collisions, Physical Review Letters (2021). DOI: 10.1103/PhysRevLett.127.263401