Science News

27 Jan. 2023

Prairie voles have long been heralded as models of monogamy. Now, a study suggests that the “love hormone” once thought essential for their bonding — oxytocin — might not be so necessary after all.

Interest in the romantic lives of prairie voles (Microtus ochrogaster) was first sparked more than 40 years ago, says Devanand Manoli, a biologist at the University of California, San Francisco. Biologists trying to capture voles to study would frequently catch two at a time, because “what they were finding were these male-female pairs,” he says. Unlike many other rodents with their myriad partners, prairie voles, it turned out, mate for life (SN: 10/5/15).

Pair-bonded prairie voles prefer each other’s company over a stranger’s and like to huddle together both in the wild and in the lab. Because other vole species don’t have social behaviors as complex as prairie voles do, prairie voles have become a popular animal system for studying how social behavior evolves.

Research over the last few decades has implicated a few hormones in the brain as vital for proper vole manners, most notably oxytocin, which is also important for social behavior in humans and other animals.

Manoli and colleagues thought the oxytocin receptor, the protein that detects and reacts to oxytocin, would be the perfect test target for a new genetic engineering method based on CRISPR technology, which uses molecules from bacteria to selectively turn off genes. The researchers used the technique on vole embryos to create animals born without functioning oxytocin receptors. The team figured that the rodents wouldn’t be able to form pair-bonds — just like voles in past experiments whose oxytocin activity was blocked with drugs.

Instead, Manoli says, the researchers got “a big surprise.” The voles could form pair-bonds even without oxytocin, the team reports in the March 15 Neuron.

“I was very surprised by their results,” says Larry Young, a biologist at Emory University in Atlanta, who was not involved with the study but has studied oxytocin in prairie voles for decades.

A key difference between the new study and past studies that used drugs to block oxytocin is the timing of exactly when the hormone’s activity is turned off. With drugs, the voles are adults and have had exposure to oxytocin in their brains before the shutoff. With CRISPR, “these animals are born never experiencing oxytocin signaling in the brain,” says Young, whose research group has recently replicated Manoli’s experiment and found the same result.

It may be, Young says, that pair-bonding is controlled by a brain circuit that typically becomes dependent on oxytocin through exposure to it during development, like a symphony trained by a conductor. Suddenly remove that conductor and the symphony will sound discordant, whereas a jazz band that’s never practiced with a conductor fares just fine without one.

Manoli agrees that the technique’s timing matters. A secondary reason for the disparity, he says, could be that drugs often have off-target effects, such that the chemicals meant to block oxytocin could have been doing other things in the voles’ brains to affect pair-bonding. But Young disagrees. “I don’t believe that,” he says. “The [drug] that people use is very selective,” not even binding to the receptor of oxytocin’s closest molecular relative, vasopressin. 

Does this result mean that decades of past work on pair-bonding has been upended? Not quite.

“It shows us that this is a much more complicated question,” Manoli says. “The pharmacologic manipulations … suggested that [oxytocin] plays a critical role. The question is, what is that role?”

The seemingly startling new result makes sense if you look at the big picture, Manoli says. The ability for voles to pair-bond is “so critical for the survival of the species,” he says. “From a genetics perspective, it may make sense that there isn’t a single point of failure.”

The group now hopes to look at how other hormones, like vasopressin, influence pair-bonding using this relatively new genetic technique. They are also looking more closely at the voles’ behavior to be sure that the CRISPR gene editing didn’t alter it in a way they haven’t noticed yet.

In the game of vole “love,” it looks like we’re still trying to understand all the players.

27 Jan. 2023

As far back as roughly 25,000 years ago, Ice Age hunter-gatherers may have jotted down markings to communicate information about the behavior of their prey, a new study finds.

These markings include dots, lines and the symbol “Y,” and often accompany images of animals. Over the last 150 years, the mysterious depictions, some dating back nearly 40,000 years, have been found in hundreds of caves across Europe.

Some archaeologists have speculated that the markings might relate to keeping track of time, but the specific purpose has remained elusive (SN: 7/9/19). Now, a statistical analysis, published January 5 in Cambridge Archaeological Journal, presents evidence that past people may have been recording the mating and birthing schedule of local fauna.

By comparing the marks to the animals’ life cycles, researchers showed that the number of dots or lines in a given image strongly correlates to the month of mating across all the analyzed examples, which included aurochs (an extinct species of wild cattle), bison, horses, mammoth and fish. What’s more, the position of the symbol “Y” in a sequence was predictive of birth month, suggesting that “Y” signifies “to give birth.”

The finding is one of the earliest records of a coherent notational system, the researchers say. It indicates that people at the time were able to interpret the meaning of an item’s position in a sequence and plan ahead for the distant future using a calendar of sorts — reinforcing the suggestion that they were capable of complex cognition.

Based on the position of the “Y” in this line-drawing reproduction, the chamois (a “goat-antelope”) gave birth in the second month after the snowmelt, researchers say. B. Bacon et al/Cambridge Archaeological Journal 2023

“This is a really big deal cognitively,” says Ben Bacon, an independent researcher based in London. “We’re dealing with a system that has intense organization, intense logic to it.”

A furniture conservator by day, Bacon spent years poring over scientific articles to compile more than 800 instances of these cave markings. From that research and his reading of the literature, he reasoned that the dots corresponded to the 13 lunar cycles in a year. But he suspected that the hunter-gatherers would have been more concerned with seasonal changes than with the moon itself.

In the new paper, he and colleagues argue that rather than pinning a calendar to astronomical events like the equinox, the hunter-gatherers started their calendar year with the snowmelt in the spring. Not only would the snowmelt be a clear point of origin, but the meteorological calendar would also account for differences in timing across locations.

For example, though snowmelt would start on different dates in different latitudes, bison would always mate approximately four lunar cycles — or months — after that region’s snowmelt, as indicated by four dots or lines.

“This is why it’s such a clever system, because it’s based on the universal,” Bacon says. “Which means if you migrate from the Pyrenees to Belgium, you can just use the same calendar.”

He needed data to test his idea. After compiling the markings, he worked with academic researchers to pin down the timing of migration, mating and birth for common Ice Age animals targeted by hunter-gatherers, using archaeological data or comparisons with similar modern animals. Next, the researchers determined whether the marks aligned significantly with these important life events under the proposed calendar. When the team ran the statistical analysis, the results strongly supported Bacon’s theory.

When explaining the markings, “we’ve argued for notational systems before, but it’s always been fairly speculative as to what the people were counting and why they were counting,” says Brian Hayden, an archaeologist at Simon Fraser University in Burnaby, British Columbia, who peer-reviewed the paper. “This adds a lot more depth and specificity to why people were keeping calendars and how they were using them.”

Linguistic experts argue that, given the lack of conventional syntax and grammar, the marks wouldn’t be considered writing. But that doesn’t make the finding inherently less exciting, says paleoanthropologist Genevieve von Petzinger of the Polytechnic Institute of Tomar in Portugal, who wasn’t involved in the study. Writing systems are often mistakenly considered a pinnacle of achievement, when in fact writing would be developed only in cultural contexts where it’s useful, she says. Instead, it’s significant that the marks provide a way to keep records outside of the mind.

“In a way, that was the huge cognitive leap,” she says. “Suddenly, we have the ability to preserve [information] beyond the moment. We have the ability to transmit it across space and time. Everything starts to change.”

The debate over these marks’ meanings continues. Archaeologist April Nowell doesn’t buy many of the team’s assumptions. “It boggles my mind why one would need a calendar … to predict that animals were going to have offspring in the spring,” says Nowell, of the University of Victoria in British Columbia. “The amount of information that this calendar is providing, if it really is a calendar, is quite minimal.”

Hayden adds that, while the basic pattern would still hold, some of the cave marks had “wiggle room for interpretation.” The next step, he says, will be to review and verify the interpretations of the marks.

27 Jan. 2023

Patricia Hidalgo-Gonzalez saw the future of energy on a broiling-hot day last September.

An email alert hit her inbox from the San Diego Gas & Electric Company. “Extreme heat straining the grid,” read the message, which was also pinged as a text to 27 million people. “Save energy to help avoid power interruptions.”

It worked. People cut their energy use. Demand plunged, blackouts were avoided and California successfully weathered a crisis exacerbated by climate change. “It was very exciting to see,” says Hidalgo-Gonzalez, an electrical engineer at the University of California, San Diego who studies renewable energy and the power grid.

This kind of collective societal response, in which we reshape how we interact with the systems that provide us energy, will be crucial as we figure out how to live on a changing planet.

Earth has warmed at least 1.1 degrees Celsius since the 19th century, when the burning of coal, oil and other fossil fuels began belching heat-trapping gases such as carbon dioxide into the atmosphere. Scientists agree that only drastic action to cut emissions can keep the planet from blasting past 1.5 degrees of warming — a threshold beyond which the consequences become even more catastrophic than the rising sea levels, extreme weather and other impacts the world is already experiencing.

The goal is to achieve what’s known as net-zero emissions, where any greenhouse gases still entering the atmosphere are balanced by those being removed — and to do it as soon as we can.

Scientists say it is possible to swiftly transform the ways we produce and consume energy. To show the way forward, researchers have set out paths toward a world where human activities generate little to no carbon dioxide and other greenhouse gases — a decarbonized economy.

The key to a decarbonized future lies in producing vast amounts of new electricity from sources that emit little to none of the gases, such as wind, solar and hydropower, and then transforming as much of our lives and our industries as possible to run off those sources. Clean electricity needs to power not only the planet’s current energy use but also the increased demands of a growing global population.

Once humankind has switched nearly entirely to clean electricity, we will also have to counter­balance the carbon dioxide we still emit — yes, we will still emit some — by pulling an equivalent amount of carbon dioxide out of the atmosphere and storing it somewhere permanently.

Achieving net-zero emissions won’t be easy. Getting to effective and meaningful action on climate change requires overcoming decades of inertia and denial about the scope and magnitude of the problem. Nations are falling well short of existing pledges to reduce emissions, and global warming remains on track to charge past 1.5 degrees perhaps even by the end of this decade.

Yet there is hope. The rate of growth in CO2 emissions is slowing globally — down from 3 percent annual growth in the 2000s to half a percent annual growth in the last decade, according to the Global Carbon Project, which quantifies greenhouse gas emissions.

There are signs annual emissions could start shrinking. And over the last two years, the United States, by far the biggest cumulative contributor to global warming, has passed several pieces of federal legislation that include financial incentives to accelerate the transition to clean energy. “We’ve never seen anything at this scale,” says Erin Mayfield, an energy researcher at Dartmouth College.

Though the energy transition will require many new technologies, such as innovative ways to permanently remove carbon from the atmosphere, many of the solutions, such as wind and solar power, are in hand — “stuff we already have,” Mayfield says.

The current state of carbon dioxide emissions

Of all the emissions that need to be slashed, the most important is carbon dioxide, which comes from many sources such as cars and trucks and coal-burning power plants. The gas accounted for 79 percent of U.S. greenhouse gas emissions in 2020. The next most significant greenhouse gas, at 11 percent of emissions in the United States, is methane, which comes from oil and gas operations as well as livestock, landfills and other land uses.

The amount of methane may seem small, but it is mighty — over the short term, methane is more than 80 times as efficient at trapping heat as carbon dioxide is, and methane’s atmospheric levels have nearly tripled in the last two centuries. Other greenhouse gases include nitrous oxide, which comes from sources such as applying fertilizer to crops or burning fuels and accounts for 7 percent of U.S. emissions, and human-made fluorinated gases such as hydrofluorocarbons, which account for 3 percent.

Globally, emissions are dominated by large nations that produce lots of energy. The United States alone emits around 5 billion metric tons of carbon dioxide each year. It has contributed more greenhouse gas emissions throughout history than any other nation and ceded the title of top annual emitter to China only in the mid-2000s. India ranks third.

Because the United States has produced the largest share of carbon pollution to date, many researchers and advocates argue that it has a moral responsibility to take the global lead on cutting emissions. And the United States has the most ambitious goals of the major emitters, at least on paper. President Joe Biden has said the country is aiming to reach net-zero emissions by 2050. Leaders in China and India have set net-zero goals of 2060 and 2070, respectively.

Under the auspices of a 2015 international climate change treaty known as the Paris agreement, 193 nations plus the European Union have pledged to reduce their emissions. The agreement aims to keep global warming well below 2 degrees, and ideally to 1.5 degrees, above preindustrial levels. But it is insufficient. Even if all countries cut their emissions as much as they have promised under the Paris agreement, the world would likely blow past 2 degrees of warming before the end of this century. 

Every nation continues to find its own path forward. “At the end of the day, all the solutions are going to be country-specific,” says Sha Yu, an earth scientist at the Pacific Northwest National Laboratory and University of Maryland’s Joint Global Change Research Institute in College Park, Md. “There’s not a universal fix.”

But there are some common themes for how to accomplish this energy transition — ways to focus our efforts on the things that will matter most. These are efforts that go beyond individual consumer choices such as whether to fly less or eat less meat. They instead penetrate every aspect of how society produces and consumes energy.

Such massive changes will need to overcome a lot of resistance, including from companies that make money off old forms of energy as well as politicians and lobbyists. But if society can make these changes, it will rank as one of humanity’s greatest accomplishments. We will have tackled a problem of our own making and conquered it.

Here’s a look at what we’ll need to do.

Make as much clean electricity as possible

To meet the need for energy without putting carbon dioxide into the atmosphere, countries would need to dramatically scale up the amount of clean energy they produce. Fortunately, most of that energy would be generated by technologies we already have — renewable sources of energy including wind and solar power.

“Renewables, far and wide, are the key pillar in any net-zero scenario,” says Mayfield, who worked on an influential 2021 report from Princeton University’s Net-Zero America project, which focused on the U.S. economy.

The Princeton report envisions wind and solar power production roughly quadrupling by 2030 to get the United States to net-zero emissions by 2050. That would mean building many new solar and wind farms, so many that in the most ambitious scenario, wind turbines would cover an area the size of Arkansas, Iowa, Kansas, Missouri, Nebraska and Oklahoma combined.

Such a scale-up is only possible because prices to produce renewable energy have plunged. The cost of wind power has dropped nearly 70 percent, and solar power nearly 90 percent, over the last decade in the United States. “That was a game changer that I don’t know if some people were expecting,” Hidalgo-Gonzalez says.

Globally the price drop in renewables has allowed growth to surge; China, for instance, installed a record 55 gigawatts of solar power capacity in 2021, for a total of 306 gigawatts or nearly 13 percent of the nation’s installed capacity to generate electricity. China is almost certain to have had another record year for solar power installations in 2022.

Challenges include figuring out ways to store and transmit all that extra electricity, and finding locations to build wind and solar power installations that are acceptable to local communities. Other types of low-carbon power, such as hydropower and nuclear power, which comes with its own public resistance, will also likely play a role going forward.

Get efficient and go electric

The drive toward net-zero emissions also requires boosting energy efficiency across industries and electrifying as many aspects of modern life as possible, such as transportation and home heating.

Some industries are already shifting to more efficient methods of production, such as steelmaking in China that incorporates hydrogen-based furnaces that are much cleaner than coal-fired ones, Yu says. In India, simply closing down the most inefficient coal-burning power plants provides the most bang for the buck, says Shayak Sengupta, an energy and policy expert at the Observer Research Foundation America think tank in Washington, D.C. “The list has been made up,” he says, of the plants that should close first, “and that’s been happening.”

To achieve net-zero, the United States would need to increase its share of electric heat pumps, which heat houses much more cleanly than gas- or oil-fired appliances, from around 10 percent in 2020 to as much as 80 percent by 2050, according to the Princeton report. Federal subsidies for these sorts of appliances are rolling out in 2023 as part of the new Inflation Reduction Act, legislation that contains a number of climate-related provisions.

Shifting cars and other vehicles away from burning gasoline to running off of electricity would also lead to significant emissions cuts. In a major 2021 report, the National Academies of Sciences, Engineering and Medicine said that one of the most important moves in decarbonizing the U.S. economy would be having electric vehicles account for half of all new vehicle sales by 2030. That’s not impossible; electric car sales accounted for nearly 6 percent of new sales in the United States in 2022, which is still a low number but nearly double the previous year.

Make clean fuels

Some industries, such as manufacturing and transportation, can’t be fully electrified using current technologies — battery-powered airplanes, for instance, will probably never be feasible for long-duration flights. Technologies that still require liquid fuels will need to switch from gas, oil and other fossil fuels to low-carbon or zero-carbon fuels.

One major player will be fuels extracted from plants and other biomass, which take up carbon dioxide as they grow and release it when they are burned or break down, making them roughly carbon neutral over their life cycle. To create biofuels, farmers grow crops whose harvest is then processed in conversion facilities into fuels such as hydrogen. Hydrogen, in turn, can be substituted for more carbon-intensive substances in various industrial processes such as making plastics and fertilizers — and maybe even as fuel for airplanes someday.

In one of the Princeton team’s scenarios, the U.S. Midwest and Southeast would become peppered with biomass conversion plants by 2050, so that fuels can be processed close to where crops are grown. Many of the biomass feedstocks could potentially grow alongside food crops or replace other, nonfood crops.

Cut methane and other non-CO2 emissions

Greenhouse gas emissions other than carbon dioxide will also need to be slashed. In the United States, the majority of methane emissions come from livestock, landfills and other agricultural sources, as well as scattered sources such as forest fires and wetlands. But about one-third of U.S. methane emissions come from oil, gas and coal operations. These may be some of the first places that regulators can target for cleanup, especially “super emitters” that can be pinpointed using satellites and other types of remote sensing.

In 2021, the United States and the European Union unveiled what became a global methane pledge endorsed by 150 countries to reduce emissions. There is, however, no enforcement of it yet. And China, the world’s largest methane emitter, has not signed on.

Nitrous oxide emissions could be reduced by improving soil management techniques, and fluorinated gases by finding alternatives and improving production and recycling efforts.

Sop up as much CO2 as possible

Once emissions have been cut as much as possible, reaching net-zero will mean removing and storing an equivalent amount of carbon to what society still emits.

One solution already in use is to capture carbon dioxide produced at power plants and other industrial facilities and store it permanently somewhere, such as deep underground. Globally there are around 35 such operations, which collectively capture around 45 million tons of carbon dioxide annually. About 200 new plants are on the drawing board to be operating by the end of this decade, according to the International Energy Agency.

The Princeton report envisions carbon capture being added to almost every kind of U.S. industrial plant, from cement production to biomass conversion. Much of the carbon dioxide would be liquefied and piped along more than 100,000 kilometers of new pipelines to deep geologic storage, primarily along the Texas Gulf Coast, where underground reservoirs can be used to trap it permanently. This would be a massive infrastructure effort. Building this pipeline network could cost up to $230 billion, including $13 billion just for early buy-in from local communities and permitting.

Another way to sop up carbon is to get forests and soils to take up more. That could be accomplished by converting crops that are relatively carbon-intensive, such as corn to be used in ethanol, to energy-rich grasses that can be used for more efficient biofuels, or by turning some cropland or pastures back into forest. It’s even possible to sprinkle crushed rock onto croplands, which accelerates natural weathering processes that suck carbon dioxide out of the atmosphere.

Another way to increase the amount of carbon stored in the land is to reduce the amount of the Amazon rainforest that is cut down each year. “For a few countries like Brazil, preventing deforestation will be the first thing you can do,” Yu says.

When it comes to climate change, there’s no time to waste

The Princeton team estimates that the United States would need to invest at least an additional $2.5 trillion over the next 10 years for the country to have a shot at achieving net-zero emissions by 2050. Congress has begun ramping up funding with two large pieces of federal legislation it passed in 2021 and 2022. Those steer more than $1 trillion toward modernizing major parts of the nation’s economy over a decade — including investing in the energy transition to help fight climate change.

Between now and 2030, solar and wind power, plus increasing energy efficiency, can deliver about half of the emissions reductions needed for this decade, the International Energy Agency estimates. After that, the primary drivers would need to be increasing electrification, carbon capture and storage, and clean fuels such as hydrogen.

A lot of the technology needed for a future with fewer carbon dioxide emissions is already available. The Ivanpah Solar Electric Generating System in the Mojave Desert focuses sunlight to generate steam. That steam spins turbines to make electricity. ADAMKAZ/E+/GETTY IMAGES

The trick is to do all of this without making people’s lives worse. Developing nations need to be able to supply energy for their economies to develop. Communities whose jobs relied on fossil fuels need to have new economic opportunities.

Julia Haggerty, a geographer at Montana State University in Bozeman who studies communities that are dependent on natural resources, says that those who have money and other resources to support the transition will weather the change better than those who are under-resourced now. “At the landscape of states and regions, it just remains incredibly uneven,” she says.

The ongoing energy transition also faces unanticipated shocks such as Russia’s invasion of Ukraine, which sent energy prices soaring in Europe, and the COVID-19 pandemic, which initially slashed global emissions but later saw them rebound.

But the technologies exist for us to wean our lives off fossil fuels. And we have the inventiveness to develop more as needed. Transforming how we produce and use energy, as rapidly as possible, is a tremendous challenge — but one that we can meet head-on. For Mayfield, getting to net-zero by 2050 is a realistic goal for the United States. “I think it’s possible,” she says. “But it doesn’t mean there’s not a lot more work to be done.”

26 Jan. 2023

Birds that dive underwater — such as penguins, loons and grebes — may be more likely to go extinct than their nondiving kin, a new study finds.

Many water birds have evolved highly specialized bodies and behaviors that facilitate diving. Now, an analysis of the evolutionary history of more than 700 water bird species shows that once a bird group gains the ability to dive, the change is irreversible. That inflexibility could help explain why diving birds have an elevated extinction rate compared with nondiving birds, researchers report in the Dec. 21 Proceedings of the Royal Society B.

“There are substantial morphological adaptations for diving,” says Catherine Sheard, an evolutionary biologist at the University of Bristol in England, who was not involved with the study. For instance, birds that plunge into the water from the air, such as gannets and some pelicans, may have tweaks to the neck muscles and the bones in the chest. 

It’s possible that some diving birds are evolving under an evolutionary “ratchet,” where adaptations to exploit a certain food source or habitat unlock some new opportunities, but also encourage ever more specialized evolutionary tailoring. These birds may become trapped in their ways, increasing their risk of extinction. That’s especially true if their habitat rapidly changes in some negative way, possibly because of human-caused climate change (SN: 1/16/20).

Evolutionary biologists Josh Tyler and Jane Younger investigated the evolution of diving in Aequorlitornithes, a collection of 727 water bird species across 11 bird groups. The team divided species into either nondiving birds, or one of three diving types: foot-propelled pursuit (such as loons and grebes), wing-propelled pursuit (like penguins and auks) and the plunge divers.

Diving has evolved at least 14 separate times in the water birds, but there were no instances where diving birds reverted to a nondiving form, the researchers found.

The scientists also explored the link between diving and the development of new species, or their demise, in various bird lineages. Among 236 diving bird species, 75, or 32 percent, were part of lineages that are experiencing 0.02 more species extinctions per million years than the generation of new species. This elevated extinction rate was more common in the wing-propelled and foot-propelled pursuit divers compared with plunge divers. Bird lineages that don’t dive, on the other hand, generated 0.1 more new species per million years than the rate of species dying out.

“The more specialized you become, the more reliant you are on a particular diet, foraging strategy or environment,” says Tyler, of the University of Bath in England. “The range of environments available for foraging is much larger for the nondiving birds than for the specialist divers, and this may play into their ability to adapt and thrive.”

Within diving bird groups, the less specialized, the better. Take penguins, a group that has become the subject of a fair share of conservation concern (SN: 8/1/18). The researchers point out that gentoo penguins (Pygoscelis papua) — which have a broad diet and may actually be as many as four very recently diverged species — have larger population sizes than related chinstrap penguins (P. antarcticus), which eat mostly krill.

The International Union for the Conservation of Nature considers both penguin species to be of “least concern” in terms of imminent extinction risk. But chinstrap numbers are declining in some areas, while gentoo population numbers remain generally stable.

If some diving birds are being trapped in their environments by their own adaptations, that doesn’t bode well for their long-term survival, say Tyler and Younger, who is at the University of Tasmania in Hobart.

According to the IUCN, 156 species, or about one-fifth, of the 727 species of water birds are considered vulnerable, endangered or critically endangered. The researchers calculate that of the 75 diving bird species from lineages with heightened extinction rates, 24 species, or nearly one-third, are already listed as threatened.

25 Jan. 2023

The Arctic today is a hostile place for most primates. But a series of fossils found since the 1970s suggest that wasn’t always the case.

Dozens of fossilized teeth and jaw bones unearthed in northern Canada belonged to two species of early primates — or at least close relatives of primates — that lived in the Arctic around 52 million years ago, researchers report January 25 in PLOS ONE. These remains are the first primate-like fossils ever discovered in the Arctic and tell of a groundhog-sized animal that may have skittered across trees in a swamp that once existed above the Arctic Circle.  

The Arctic was significantly warmer during that time. But creatures still had to adapt to extreme conditions such as long winter months without sunlight. These challenges make the presence of primate-like creatures in the Arctic “incredibly surprising,” says coauthor Chris Beard, a paleontologist at the University of Kansas in Lawrence. “No other primate or primate relative has ever been found this far north so far.”

Between frigid temperatures, limited plant growth and months of perpetual darkness, living in the modern Arctic isn’t easy. This is especially true for primates, which evolved from small, tree-dwelling creatures that largely fed on fruit (SN: 6/5/13). To this day, most primates — humans and a few other outliers like Japan’s snow monkeys excepted — tend to stick to tropical and subtropical forests, largely found around the equator.

But these forests haven’t always been confined to their present location. During the early Eocene Epoch, which started around 56 million years ago, the planet underwent a period of intense warming that allowed forests and their warm-loving residents to expand northward (SN: 11/3/15).

Scientists know about this early Arctic climate in part because of decades of paleontological work on Ellesmere Island in northern Canada. These digs revealed that the area was once dominated by swamps not unlike those found in the southeastern United States today. This ancient, warm, wet Arctic environment was home to a wide array of heat-loving animals, including giant tapirs and crocodile relatives.

A groundhog-sized early primate, Ignacius dawsonae, that lived during the Eocene evolved special teeth and strong jaws to survive the pervasive winter darkness above the Arctic Circle (illustrated). Kristen Miller/Biodiversity Institute/Univ. of Kansas (CC-BY 4.0)

For the new study, Beard and his colleagues examined dozens of teeth and jawbone fossils found in the area, concluding that they belong to two species, Ignacius mckennai and Ignacius dawsonae. These two species belonged to a now-extinct genus of small mammals that was widespread across North America during the Eocene. The Arctic variants probably made their way north as the planet warmed, taking advantage of the new habitat opening up near the poles.

Scientists have long debated whether this lineage can be considered true primates or whether they were simply close relatives. Regardless, it’s still “really weird and unexpected” to find primates or their relatives in the area, says Mary Silcox, a vertebrate paleontologist at the University of Toronto Scarborough.

For one thing, Ellesmere Island was already north of the Arctic Circle 52 million years ago. So while conditions may have been warmer and wetter, the swamp was plunged into continuous darkness during the winter months.

Newly arrived Ignacius would have had to adapt to these conditions. Unlike their southern kin, the Arctic Ignacius had unusually strong jaws and teeth suited to eating hard foods, the researchers found. This may have helped these early primates feed on nuts and seeds over the winter, when fruit wasn’t as readily available.

This research can shed light on how animals can adapt to live in extreme conditions. “Ellesmere Island is arguably the best deep time analog for a mild, ice-free Arctic,” says Jaelyn Eberle, a vertebrate paleontologist at the University of Colorado Boulder.

Studying how plants and animals adapted to this remarkable period in Arctic history, Beard says, could offer clues to the Arctic’s future residents.

25 Jan. 2023

Shape-shifting liquid metal robots might not be limited to science fiction anymore.

Miniature machines can switch from solid to liquid and back again to squeeze into tight spaces and perform tasks like soldering a circuit board, researchers report January 25 in Matter.

This phase-shifting property, which can be controlled remotely with a magnetic field, is thanks to the metal gallium. Researchers embedded magnetic particles in the metal so that its movements can be directed with magnets. This new material could help scientists develop soft, flexible robots that can shimmy through narrow passages and be guided externally.

Scientists have been developing magnetically controlled soft robots for years. Most existing materials for these bots are made of either stretchy but solid materials, which can’t pass through the narrowest of spaces, or magnetic liquids, which are fluid but unable to carry heavy objects (SN: 7/18/19).

In the new study, researchers blended both approaches after finding inspiration from nature (SN: 3/3/21). Sea cucumbers, for instance, “can very rapidly and reversibly change their stiffness,” says mechanical engineer Carmel Majidi of Carnegie Mellon University in Pittsburgh. “The challenge for us as engineers is to mimic that in the soft materials systems.”

So the team turned to gallium, a metal that melts at about 30° Celsius — slightly above room temperature. Rather than connecting a heater to a chunk of the metal to change its state, the researchers expose it to a rapidly changing magnetic field to liquefy it. The alternating magnetic field generates electricity within the gallium, causing it to heat up and melt. The material resolidifies when left to cool to room temperature.

Since magnetic particles are sprinkled throughout the gallium, a permanent magnet can drag it around. In its solid form, the material can be moved by a magnet at a speed of about 1.5 meters per second. The upgraded gallium can also carry about 10,000 times its weight.

External magnets can still manipulate the liquid form, making it stretch, split and merge. But controlling the fluid’s movement is more challenging, because the particles in the gallium can freely rotate and have unaligned magnetic poles as a result of melting. Because of their various orientations, the particles move in different directions in response to a magnet.

Majidi and colleagues tested their strategy in tiny machines that performed different tasks. In a demonstration straight out of the movie Terminator 2, a toy person escaped a jail cell by melting through the bars and resolidifying in its original form using a mold placed just outside the bars.

On the more practical side, one machine removed a small ball from a model human stomach by melting slightly to wrap itself around the foreign object before exiting the organ. But gallium on its own would turn to goo inside a real human body, since the metal is a liquid at body temperature, about 37° C. A few more metals, such as bismuth and tin, would be added to the gallium in biomedical applications to raise the material’s melting point, the authors say. In another demonstration, the material liquefied and rehardened to solder a circuit board.

Although this phase-shifting material is a big step in the field, questions remain about its biomedical applications, says biomedical engineer Amir Jafari of the University of North Texas in Denton, who was not involved in the work. One big challenge, he says, is precisely controlling magnetic forces inside the human body that are generated from an external device.

“It’s a compelling tool,” says robotics engineer Nicholas Bira of Harvard University, who was also not involved in the study. But, he adds, scientists who study soft robotics are constantly creating new materials.

“The true innovation to come lies in combining these different innovative materials.”

25 Jan. 2023

The worst procrastinators probably won’t be able to read this story. It’ll remind them of what they’re trying to avoid, psychologist Piers Steel says.

Maybe they’re dragging their feet going to the gym. Maybe they haven’t gotten around to their New Year’s resolutions. Maybe they’re waiting just one more day to study for that test.

Procrastination is “putting off to later what you know you should be doing now,” even if you’ll be worse off, says Steel, of the University of Calgary in Canada. But all those tasks pushed to tomorrow seem to wedge themselves into the mind — and it may be harming people’s health.

In a study of thousands of university students, scientists linked procrastination to a panoply of poor outcomes, including depression, anxiety and even disabling arm pain. “I was surprised when I saw that one,” says Fred Johansson, a clinical psychologist at Sophiahemmet University in Stockholm. His team reported the results January 4 in JAMA Network Open.

The study is one of the largest yet to tackle procrastination’s ties to health. Its results echo findings from earlier studies that have gone largely ignored, says Fuschia Sirois, a behavioral scientist at Durham University in England, who was not involved with the new research.

For years, scientists didn’t seem to view procrastination as something serious, she says. The new study could change that. “It’s that kind of big splash that’s … going to get attention,” Sirois says. “I’m hoping that it will raise awareness of the physical health consequences of procrastination.”

Procrastinating may be bad for the mind and body

Whether procrastination harms health can seem like a chicken-and-egg situation.

It can be hard to tell if certain health problems make people more likely to procrastinate — or the other way around, Johansson says. (It may be a bit of both.) And controlled experiments on procrastination aren’t easy to do: You can’t just tell a study participant to become a procrastinator and wait and see if their health changes, he says.

Many previous studies have relied on self-reported surveys taken at a single time point. But a snapshot of someone makes it tricky to untangle cause and effect. Instead, in the new study, about 3,500 students were followed over nine months, so researchers could track whether procrastinating students later developed health issues.

On average, these students tended to fare worse over time than their prompter peers. They were slightly more stressed, anxious, depressed and sleep-deprived, among other issues, Johansson and colleagues found. “People who score higher on procrastination to begin with … are at greater risk of developing both physical and psychological problems later on,” says study coauthor Alexander Rozental, a clinical psychologist at Uppsala University in Sweden. “There is a relationship between procrastination at one time point and having these negative outcomes at the later point.”

The study was observational, so the team can’t say for sure that procrastination causes poor health. But results from other researchers also seem to point in this direction. A 2021 study tied procrastinating at bedtime to depression. And a 2015 study from Sirois’ lab linked procrastinating to poor heart health.

Stress may be to blame for procrastination’s ill effects, data from Sirois’ lab and other studies suggest. She thinks that the effects of chronic procrastinating could build up over time. And though procrastination alone may not cause disease, Sirois says, it could be “one extra factor that can tip the scales.”

No, procrastinators are not lazy

Some 20 percent of adults are estimated to be chronic procrastinators. Everyone might put off a task or two, but chronic procrastinators make it their lifestyle, says Joseph Ferrari, a psychologist at DePaul University in Chicago, who has been studying procrastination for decades. “They do it at home, at school, at work and in their relationships.” These are the people, he says, who “you know are going to RSVP late.”

Though procrastinators may think they perform better under pressure, Ferrari has reported the opposite. They actually worked more slowly and made more errors than non-procrastinators, his experiments have shown. And when deadlines are slippery, procrastinators tend to let their work slide, Steel’s team reported last year in Frontiers in Psychology.

For years, researchers have focused on the personalities of people who procrastinate. Findings vary, but some scientists suggest procrastinators may be impulsive, prone to worry and likely to have trouble regulating their emotions. One thing procrastinators are not, Ferrari emphasizes, is lazy. They’re actually “very busy doing other things than what they’re supposed to be doing,” he says.

In fact, Rozental adds, most research today suggests procrastination is a behavioral pattern.

And if procrastination is a behavior, he says, that means it’s something you can change, regardless of whether you’re impulsive.

Why procrastinators should be kind to themselves

When people put off a tough task, they feel good — in the moment.

Procrastinating is a way to sidestep the negative emotions linked to the task, Sirois says. “We’re sort of hardwired to avoid anything painful or difficult,” she says. “When you procrastinate, you get immediate relief.” A backdrop of stressful circumstances — say, a worldwide pandemic — can strain people’s ability to cope, making procrastinating even easier. But the relief it provides is only temporary, and many seek out ways to stop dawdling.

Researchers have experimented with procrastination treatments that run the gamut from the logistical to the psychological. What works best is still under investigation. Some scientists have reported success with time-management interventions. But the evidence for that “is all over the map,” Sirois says. That’s because “poor time management is a symptom not a cause of procrastination,” she adds.

For some procrastinators, seemingly obvious tips can work. In his clinical practice, Rozental advises students to simply put down their smartphones. Silencing notifications or studying in the library rather than at home can quash distractions and keep people on task. But that won’t be enough for many people, he says.

Hard-core procrastinators may benefit from cognitive behavioral therapy. In a 2018 review of procrastination treatments, Rozental found that this type of therapy, which involves managing thoughts and emotions and trying to change behavior, seemed to be the most helpful. Still, not many studies have examined treatments, and there’s room for improvement, he says.

Sirois also favors an emotion-centered approach. Procrastinators can fall into a shame spiral where they feel uneasy about a task, put the task off, feel ashamed for putting it off and then feel even worse than when they started. People need to short-circuit that loop, she says. Self-forgiveness may help, scientists suggested in one 2020 study. So could mindfulness training.

In a small trial of university students, eight weekly mindfulness sessions reduced procrastination, Sirois and colleagues reported in the January Learning and Individual Differences. Students practiced focusing on the body and meditating during unpleasant activities, and they discussed the best ways to take care of themselves. A little self-compassion may snap people out of their spiral, Sirois says.

“You made a mistake and procrastinated. It’s not the end of the world,” she says. “What can you do to move forward?”

24 Jan. 2023

SEATTLE — Luke Skywalker’s home planet in Star Wars is the stuff of science fiction. But Tatooine-like planets in orbit around pairs of stars might be our best bet in the search for habitable planets beyond our solar system.

Many stars in the universe come in pairs. And lots of those should have planets orbiting them (SN: 10/25/21). That means there could be many more planets orbiting around binaries than around solitary stars like ours. But until now, no one had a clear idea about whether those planets’ environments could be conducive to life. New computer simulations suggest that, in many cases, life could imitate art.

Earthlike planets orbiting some configurations of binary stars can stay in stable orbits for at least a billion years, researchers reported January 11 at the American Astronomical Society meeting. That sort of stability, the researchers propose, would be enough to potentially allow life to develop, provided the planets aren’t too hot or cold.

Of the planets that stuck around, about 15 percent stayed in their habitable zone — a temperate region around their stars where water could stay liquid — most or even all of the time.

The researchers ran simulations of 4,000 configurations of binary stars, each with an Earthlike planet in orbit around them. The team varied things like the relative masses of the stars, the sizes and shapes of the stars’ orbits around each other, and the size of the planet’s orbit around the binary pair.

The scientists then tracked the motion of the planets for up to a billion years of simulated time to see if the planets would stay in orbit over the sorts of timescales that might allow life to emerge.

A planet orbiting binary stars can get kicked out of the star system due to complicated interactions between the planet and stars. In the new study, the researchers found that, for planets with large orbits around star pairs, only about 1 out of 8 were kicked out of the system. The rest were stable enough to continue to orbit for the full billion years. About 1 in 10 settled in their habitable zones and stayed there.

Of the 4,000 planets that the team simulated, roughly 500 maintained stable orbits that kept them in their habitable zones at least 80 percent of the time.

“The habitable zone … as I’ve characterized it so far, spans from freezing to boiling,” said Michael Pedowitz, an undergraduate student at the College of New Jersey in Ewing who presented the research. Their definition is overly strict, he said, because they chose to model Earthlike planets without atmospheres or oceans. That’s simpler to simulate, but it also allows temperatures to fluctuate wildly on a planet as it orbits.

“An atmosphere and oceans would smooth over temperature variations fairly well,” says study coauthor Mariah MacDonald, an astrobiologist also at the College of New Jersey. An abundance of air and water would potentially allow a planet to maintain habitable conditions, even if it spent more of its time outside of the nominal habitable zone around a binary star system.

The number of potentially habitable planets “will increase once we add atmospheres,” MacDonald says, “but I can’t yet say by how much.”

She and Pedowitz hope to build more sophisticated models in the coming months, as well as extend their simulations beyond a billion years and include changes in the stars that can affect conditions in a solar system as it ages.

The possibility of stable and habitable planets in binary star systems is a timely issue, says Penn State astrophysicist Jason Wright, who was not involved in the study.

“At the time Star Wars came out,” he says, “we didn’t know of any planets outside the solar system, and wouldn’t for 15 years. Now we know that there are many and that they orbit these binary stars.”

These simulations of planets orbiting binaries could serve as a guide for future experiments, Wright says. “This is an under-explored population of planets. There’s no reason we can’t go after them, and studies like this are presumably showing us that it’s worthwhile to try.”

24 Jan. 2023

A 120-million-year-old fossil bird found in China could offer some new clues about how landbound dinosaurs evolved into today’s flying birds. The dove-sized Cratonavis zhui sported a dinosaur-like head atop a body similar to those of today’s birds, researchers report in the January Nature Ecology & Evolution.

The flattened specimen came from the Jiufotang Formation, an ancient body of rock in northeastern China that is a hotbed for preserved feathered dinosaurs and archaic birds. CT scans revealed that Cratonavis had a skull that was nearly identical (albeit smaller) to those of theropod dinosaurs like Tyrannosaurus rex, paleontologist Li Zhiheng of the Chinese Academy of Sciences in Beijing and colleagues report. This means that Cratonavis still hadn’t evolved the mobile upper jaw found in modern birds (SN: 5/2/18).

Researchers used CT scans to digitally reconstruct the flattened Cratonavis specimen. The scans revealed that the creature had a theropod’s head and a bird’s body. Wang Min

It’s among just a handful of specimens that belong to a recently identified group of intermediate birds known as the jinguofortisids, says Luis Chiappe, a paleontologist at the Natural History Museum of Los Angeles County who was not involved in the study. Its dino-bird mishmash “is not unexpected.” Most birds discovered from the Age of Dinosaurs exhibited more primitive, toothed heads than today’s birds, he says. But the new find “builds on our understanding of this primitive group of birds that are at the base of the tree of birds.”

Cratonavis also had an unusually elongated scapula and hallux, or backward-facing toe. Rarely seen in Cretaceous birds, enlarged shoulder blades might have compensated for the bird’s otherwise underwhelming flight mechanics, the researchers say. And that hefty big toe? It bucks the trend of shrinking metatarsals seen as birds continued to evolve. Cratonavis might have used this impressive digit to hunt like today’s birds of prey, Li’s team says.

Filling those shoes may have been too big of a job for Cratonavis, though. Given its size, Chiappe says, the dino-headed bird would have most likely been a petite hunter, taking down the likes of beetles, grasshoppers and the occasional lizard rather than terrorizing the skies.

23 Jan. 2023

No backside, no problem for some young sea spiders.

The creatures can regenerate nearly complete parts of their bottom halves — including muscles, reproductive organs and the anus — or make do without them, researchers report January 23 in Proceedings of the National Academy of Sciences.

The ability to regrow body parts isn’t super common, but some species manage to pull it off. Some sea slug heads can craft an entirely new body (SN: 3/8/21). Sea spiders and some other arthropods — a group of invertebrates with an exoskeleton — can regrow parts of their legs. But researchers thought new legs were the extent of any arthropod’s powers, perhaps because tough exteriors somehow stop them from regenerating other body parts.

Microscope images show the stages of regeneration in a juvenile sea spider: the last quarter of its body, including two legs and the anal tubercle, is amputated; after the first molt, short stubs appear on a new rear body segment; a new anal tubercle and legs start taking shape; and finally the anal tubercle and legs are fully reformed.

A mishap first tipped off evolutionary biologist Georg Brenneis that sea spiders (Pycnogonum litorale) might be able to handle more complex repairs too. He accidentally injured one young specimen that he was working on in the lab with forceps. “It wasn’t dead, it was moving, so I just kept it,” says Brenneis, of the University of Vienna. Several months later, the sea spider had an extra leg instead of a scar, he and evolutionary biologist Gerhard Scholtz of Humboldt University of Berlin reported in 2016 in The Science of Nature.

In the new study, most of the 19 young spiders recovered and regrew missing muscles and other parts of their lower halves after amputation, though the regeneration wasn’t always perfect. Some juveniles sported six or seven legs instead of eight.

None of the four adults regenerated. That may be because adults no longer shed their skin as they grow, suggesting that regeneration and molting are somehow linked, Brenneis says. Two young sea spiders also didn’t regenerate at all. The animals survived with only four legs and without an anus. Instead of pooping, the pair regurgitated waste out of their mouths.

Microscope images show a young sea spider that did not regenerate after amputation, surviving with only four legs.

Next up is figuring out whether other arthropods also regenerate more than scientists thought, and how sea spiders do it, Brenneis says. “I would like to see how it works.”

23 Jan. 2023

Our planet may have had a recent change of heart.

Earth’s inner core may have temporarily stopped rotating relative to the mantle and surface, researchers report in the January 23 Nature Geoscience. Now, the direction of the inner core’s rotation may be reversing — part of what could be a roughly 70-year-long cycle that may influence the length of Earth’s days and its magnetic field — though some researchers are skeptical.

“We see strong evidence that the inner core has been rotating faster than the surface, [but] by around 2009 it nearly stopped,” says geophysicist Xiaodong Song of Peking University in Beijing. “Now it is gradually mov[ing] in the opposite direction.”

Such a profound turnaround might sound bizarre, but Earth is volatile (SN: 1/13/21). Bore through the ever-shifting crust and you’ll enter the titanic mantle, where behemoth masses of rock flow viscously over spans of millions of years, sometimes upwelling to excoriate the overlying crust (SN: 1/11/17, SN: 3/2/17, SN: 2/4/21). Delve deeper and you’ll reach Earth’s liquid outer core. Here, circulating currents of molten metals conjure our planet’s magnetic field (SN: 9/4/15). And at the heart of that melt, you’ll find a revolving, solid metal ball about 70 percent as wide as the moon.

This is the inner core (SN: 1/28/19). Studies have suggested that this solid heart may rotate within the liquid outer core, compelled by the outer core’s magnetic torque. Researchers have also argued the mantle’s immense gravitational pull may apply an erratic brake on the inner core’s rotation, causing it to oscillate.  

Evidence for the inner core’s fluctuating rotation first emerged in 1996. Geophysicist Paul Richards of Columbia University’s Lamont-Doherty Earth Observatory in Palisades, N.Y., and Song, then also at Lamont-Doherty, reported that over a span of three decades, seismic waves from earthquakes took different amounts of time to traverse Earth’s solid heart.

The researchers inferred that the inner core rotates at a different speed than the mantle and crust, causing the time differences. The planet spins roughly 360 degrees in a day. Based on their calculations, the researchers estimated that the inner core, on average, rotates about 1 degree per year faster than the rest of Earth.
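
To put those two figures side by side, here is a minimal back-of-the-envelope sketch. It uses only the numbers quoted above (the roughly 1 degree per year differential and Earth’s 360-degrees-per-day spin); the variable names and the calculation itself are illustrative, not taken from the study.

```python
# Scale of the inner core's differential rotation, restating figures quoted above.
surface_spin_deg_per_day = 360.0     # Earth rotates roughly 360 degrees in a day
differential_deg_per_year = 1.0      # inner core about 1 degree per year faster (1996 estimate)

days_per_year = 365.25
surface_spin_deg_per_year = surface_spin_deg_per_day * days_per_year

# Fractional speedup of the inner core relative to the mantle and crust
fractional_speedup = differential_deg_per_year / surface_spin_deg_per_year
print(f"Fractional speedup: {fractional_speedup:.2e}")  # ~7.6e-06, well under a thousandth of a percent

# At 1 degree per year, the inner core gains one full extra lap roughly every:
print(f"{360.0 / differential_deg_per_year:.0f} years per extra lap")  # ~360 years
```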

But other researchers have questioned that conclusion, some suggesting that the core spins slower than Song and Richards’ estimate or doesn’t spin differently at all.  

In the new study, while analyzing global seismic data stretching back to the 1990s, Song and geophysicist Yi Yang — also at Peking University — made a surprising observation.

Before 2009, seismic waves generated by sequences and pairs of repeating earthquakes — known as multiplets and doublets — traveled at different rates through the inner core. This indicated the waves from recurring quakes were crossing different parts of the inner core, and that the inner core was rotating at a different pace than the rest of Earth, aligning with Song’s previous research.

But around 2009, the differences in travel times vanished. That suggested the inner core had ceased rotating with respect to the mantle and crust, Yang says. After 2009, the differences returned, and the researchers inferred from the pattern of the waves’ paths through the inner core that it was now rotating in the opposite direction relative to the rest of Earth.

The researchers then pored over records of Alaskan earthquake doublets dating to 1964. While the inner core appeared to rotate steadily for most of that time, it seems to have made another reversal in rotation in the early 1970s, the researchers say.

Song and Yang infer that the inner core may oscillate with a roughly 70-year periodicity — switching directions every 35 years or so. Because the inner core is gravitationally linked to the mantle and magnetically linked to the outer core, the researchers say these oscillations could explain known 60- to 70-year variations in the length of Earth’s days and the behavior of the planet’s magnetic field. However, more work is needed to pin down what mechanisms might be responsible.

But not all researchers are on board. Yang and Song “identif[y] this recent 10-year period [that] has less activity than before, and I think that’s probably reliable,” says geophysicist John Vidale of the University of Southern California in Los Angeles, who was not involved in the research. But beyond that, Vidale says, things get contentious.

In 2022, he and a colleague reported that seismic waves from nuclear tests show the inner core may reverse its rotation every three years or so. Meanwhile, other researchers have proposed that the inner core isn’t moving at all. Instead, they say, changes to the shape of the inner core’s surface could explain the differences in wave travel times.

Future observations will probably help disentangle the discrepancies between these studies, Vidale says. For now, he’s unruffled by the purported chthonic standstill. “In all likelihood, it’s irrelevant to life on the surface, but we don’t actually know what’s happening,” he says. “It’s incumbent on us to figure it out.”

23 Jan. 2023

A crucial link in the life cycle of one parasitic plant may be found in a surprising place — the bellies of the descendants of an ancient line of rabbits.

Given their propensity for nibbling on gardens and darting across suburban lawns, it can be easy to forget that rabbits are wild animals. But a living reminder of their wildness can be found on two of Japan’s Ryukyu Islands, if you have the patience to look for it: the endangered Amami rabbit, a “living fossil” that looks strikingly similar to ancient Asian rabbits.

One estimate suggests there are fewer than 5,000 of the animals left in the wild. The lives of Amamis (Pentalagus furnessi) are shrouded in mystery due to their rarity, but they seem to play a surprising ecological role as seed dispersers, researchers report January 23 in Ecology.

Seed dispersal is the main point in a plant’s life cycle when it can move to a new location (SN: 11/14/22). So dispersal is crucially important for understanding how plant populations are maintained and how species will respond to climate change, says Haldre Rogers, a biologist at Virginia Tech in Blacksburg, who was not involved with the study. Despite this, seed dispersal hasn’t received much attention, she says. “We don’t know what disperses the seeds of most plants in the world.”

Locals from the Ryukyu Islands were the first to notice that the “iconic yet endangered” Amami rabbit was nibbling on the fruit of another local species, the plant Balanophora yuwanensis, says Kenji Suetsugu, a biologist at Kobe University in Japan.

Rabbits generally like to eat vegetative tissue from plants, like leaves and stems, and so haven’t been thought to contribute much to spreading seeds, which are often housed in fleshy fruits.

To confirm what the locals reported, Suetsugu and graduate student Hiromu Hashiwaki set up camera traps around the island to catch the rabbits in the act. The researchers were able to record rabbits munching on Balanophora fruits 11 times, but still needed to check whether the seeds survived their trip through the bunny tummies.

A camera trap captured this Amami rabbit munching on the fruit of the parasitic plant Balanophora yuwanensis. (Kenji Suetsugu and Hiromu Hashiwaki)

So the team headed out to the subtropical islands and scooped up rabbit poop, finding Balanophora seeds inside that could still be grown. By swallowing the seeds and pooping them out elsewhere, the Amami rabbits were clearly acting as seed dispersers.

Balanophora plants are parasitic and don’t have chlorophyll, so they can’t use photosynthesis to make food of their own (SN: 3/2/17). Instead, they suck energy away from a host plant. This means where their seeds end up matters, and the Amami rabbits “may facilitate the placement of seeds near the roots of a compatible host” by pooping in underground burrows, Suetsugu says. “Thus, the rabbits likely provide a crucial link between Balanophora and its hosts” that remains to be further explored, he says.

Understanding the ecology of an endangered species like the Amami rabbit can help with conserving both it and the plants that depend on it.

An animal need not be in obvious peril for a change in its number to affect seed dispersal, with potentially negative consequences for the ecosystem. For example, “we think of robins as super common … but they’ve declined a lot in the last 50 years,” Rogers says. “Half as many robins means half as many seeds are getting moved around, even though no one’s worried about robins as a conservation issue.”

23 Jan. 2023
covers of the November 19, 2022 & December 3, 2022 issues

In full swing

The swaying feeling in jazz music that compels feet to tap may arise from near-imperceptible delays in musicians’ timing, Nikk Ogasa reported in “Jazz gets its swing from small, subtle delays” (SN: 11/19/22, p. 5).

Reader Oda Lisa, a self-described intermediate saxophonist, has noticed these subtle delays while playing. “I recorded my ‘jazzy’ version of a beloved Christmas carol, which I sent to a friend of mine,” Lisa wrote. “She praised my effort overall, but she suggested that I get a metronome because the timing wasn’t consistent. My response was that I’m a slave to the rhythm that I hear in my head. I think now I know why.”

On the same page

Murky definitions and measurements impede social science research, Sujata Gupta reported in “Fuzzy definitions mar social science” (SN: 11/19/22, p. 10).

Reader Linda Ferrazzara found the story thought-provoking. “If there’s no consensus on the terms people use … then there can be no productive discussion or conversation. People end up talking and working at cross-purposes with no mutual understanding or progress,” Ferrazzara wrote.

Fly me to the moon

Space agencies are preparing to send the next generation of astronauts to the moon and beyond. Those crews will be more diverse in background and expertise than the crews of the Apollo missions, Lisa Grossman reported in “Who gets to go to space?” (SN: 12/3/22, p. 20).

“It is great to see a broader recognition of the work being done to make spaceflight open to more people,” reader John Allen wrote. “Future space travel will and must accommodate a population that represents humanity. It won’t be easy, but it will be done.”

The story also reminded Allen of the Gallaudet Eleven, a group of deaf adults who participated in research done by NASA and the U.S. Navy in the 1950s and ’60s. Experiments tested how the volunteers responded (or didn’t) to a range of scenarios that would typically induce motion sickness, such as a ferry ride on choppy seas. Studying how the body’s sensory systems work without the usual gravitational cues from the inner ear allowed scientists to better understand motion sickness and the human body’s adaptation to spaceflight.

Sweet dreams are made of this

A memory-enhancing method that uses sound cues may boost an established treatment for debilitating nightmares, Jackie Rocheleau reported in “Learning trick puts nightmares to bed” (SN: 12/3/22, p. 11).

Reader Helen Leaver shared her trick to a good night’s sleep: “I learned that I was having strong unpleasant adventures while sleeping, and I would awaken hot and sweaty. By eliminating the amount of heat from bedding and an electrically heated mattress pad, I now sleep well without those nightmares.”

Pest perspectives

In “Why do we hate pests?” (SN: 12/3/22, p. 26), Deborah Balthazar interviewed former Science News Explores staff writer Bethany Brookshire about her new book, Pests. The book argues that humans — influenced by culture, class, colonization and much more — create animal villains.

The article prompted reader Doug Clapp to reflect on what he considers pests or weeds. “A weed is a plant in the wrong place, and a pest is an animal in the wrong place,” Clapp wrote. But what’s considered “wrong” depends on the humans who have power over the place, he noted. “Grass in a lawn can be a fine thing. Grass in a garden choking the vegetables I’m trying to grow becomes a weed. Mice in the wild don’t bother me. Field mice migrating into my house when the weather cools become a pest, especially when they eat into my food and leave feces behind,” Clapp wrote.

The article encouraged Clapp to look at pests through a societal lens: “I had never thought of pests in terms of high-class or low-class. Likewise, the residual implications of [colonization]. Thanks for provoking me to consider some of these issues in a broader context.”

23 Jan. 2023

More than a century ago, scientists proved that carbon dioxide in Earth’s atmosphere could act like a thermostat — adding more CO2 would turn up the heat, removing it would chill the planet. But back then, most scientists thought that Earth’s climate system was far too large and stable to change quickly, that any fluctuations would happen over such a long timescale that it wouldn’t matter much to everyday life (SN: 3/12/22, p. 16).

Now all it takes is a look at the Weather Channel to know how wrong scientists were. Things are changing fast. Last year alone, Europe, South Asia, China, Japan and the American West endured deadly, record-breaking heat waves (SN: 12/17/22 & 12/31/22, p. 38). As I write this, torrential rains are bringing death and destruction to California. And with levels of climate-warming gases continuing to increase in the atmosphere, extreme weather events will become even more frequent.

Given the vastness of this threat, it’s tempting to think that any efforts that we make against it will be futile. But that’s not true. Around the world, scientists and engineers; entrepreneurs and large corporations; state, national and local governments; and international coalitions are acting to put the brakes on climate change. Last year, the United States signed into law a $369 billion investment in renewable energy technologies and other responses (SN: 12/17/22 & 12/31/22, p. 28). And the World Bank invested $31.7 billion to assist other countries.

In this issue, contributing correspondent Alexandra Witze details the paths forward: which responses will help the most, and which remain challenging. Shifting to renewable energy sources like wind and solar should be the easiest. We already have the technology, and costs have plunged over the last decade. Other approaches that are feasible but not as far along include making industrial processes more energy efficient, trapping greenhouse gases and developing clean fuels. Ultimately, the goal is to reinvent the global energy infrastructure. Societies have been retooling energy infrastructures for centuries, from water and steam power to petroleum and natural gas to nuclear power and now renewables. This next transformation will be the biggest yet. But we have the scientific understanding and technological savvy to make it happen.

This cover story kicks off a new series for Science News, The Climate Fix. In future issues, we will focus on covering solutions to the climate crisis, including the science behind innovations, the people making them happen, and the social and environmental impacts. You’ll also see expanded climate coverage for our younger readers, ages 9 and up, at Science News Explores online and in print.

With this issue, we also welcome our new publisher, Michael Gordon Voss. He comes to us with deep knowledge of the media industry, experience in both for-profit and nonprofit publishing and a love of science. Before joining Science News Media Group, Voss was publisher of Stanford Social Innovation Review, and vice president and associate publisher at Scientific American. With his arrival, publisher Maya Ajmera takes on her new role as executive publisher. Under her leadership, we have seen unprecedented growth. We’re fortunate to have these two visionaries directing our business strategy amid a rapidly changing media environment.

20 Jan. 2023

In Appalachia’s coal country, researchers envision turning toxic waste into treasure. The pollution left behind by abandoned mines is an untapped source of rare earth elements.

Rare earths are a valuable set of 17 elements needed to make everything from smartphones and electric vehicles to fluorescent bulbs and lasers. With global demand skyrocketing and China having a near-monopoly on rare earth production — the United States has only one active mine — there’s a lot of interest in finding alternative sources, such as ramping up recycling.

Pulling rare earths from coal waste offers a two-for-one deal: By retrieving the metals, you also help clean up the pollution.

Long after a coal mine closes, it can leave a dirty legacy. When some of the rock left over from mining is exposed to air and water, sulfuric acid forms and pulls heavy metals from the rock. This acidic soup can pollute waterways and harm wildlife.

Recovering rare earths from what’s called acid mine drainage won’t single-handedly satisfy rising demand for the metals, acknowledges Paul Ziemkiewicz, director of the West Virginia Water Research Institute in Morgantown. But he points to several benefits.

Unlike ore dug from typical rare earth mines, the drainage is rich with the most-needed rare earth elements. Plus, extraction from acid mine drainage also doesn’t generate the radioactive waste that’s typically a by-product of rare earth mines, which often contain uranium and thorium alongside the rare earths. And from a practical standpoint, existing facilities to treat acid mine drainage could be used to collect the rare earths for processing. “Theoretically, you could start producing tomorrow,” Ziemkiewicz says.

From a few hundred sites already treating acid mine drainage, nearly 600 metric tons of rare earth elements and cobalt — another in-demand metal — could be produced annually, Ziemkiewicz and colleagues estimate.

Currently, a pilot project in West Virginia is taking material recovered from an acid mine drainage treatment site and extracting and concentrating the rare earths.

If such a scheme proves feasible, Ziemkiewicz envisions a future in which cleanup sites send their rare earth hauls to a central facility to be processed, and the elements separated. Economic analyses suggest this wouldn’t be a get-rich scheme. But, he says, it could be enough to cover the costs of treating the acid mine drainage.

20 Jan. 2023

Our modern lives depend on rare earth elements, and someday soon we may not have enough to meet growing demand.

Because of their special properties, these 17 metallic elements are crucial ingredients in computer screens, cell phones and other electronics, compact fluorescent lamps, medical imaging machines, lasers, fiber optics, pigments, polishing powders, industrial catalysts – the list goes on and on (SN Online: 1/16/23). Notably, rare earths are an essential part of the high-powered magnets and rechargeable batteries in the electric vehicles and renewable energy technologies needed to get the world to a low- or zero-carbon future.

In 2021, the world mined 280,000 metric tons of rare earths — roughly 32 times as much as was mined in the mid-1950s. And demand is only going to increase. By 2040, experts estimate, we’ll need up to seven times as much rare earth material as we use today.

Satisfying that appetite won’t be easy. Rare earth elements are not found in concentrated deposits. Miners must excavate huge amounts of ore, subject it to physical and chemical processes to concentrate the rare earths, and then separate them. The transformation is energy intensive and dirty, requiring toxic chemicals and often generating a small amount of radioactive waste that must be safely disposed of. Another concern is access: China has a near monopoly on both mining and processing; the United States has just one active mine (SN Online: 1/1/23).

For most of the jobs rare earths do, there are no good substitutes. So to help meet future demand and diversify who controls the supply — and perhaps even make rare earth recovery “greener” — researchers are looking for alternatives to conventional mining.   

Proposals include everything from extracting the metals from coal waste to really out-there ideas like mining the moon. But the approach most likely to make an immediate dent is recycling. “Recycling is going to play a very important and central role,” says Ikenna Nlebedim, a materials scientist at Ames National Laboratory in Iowa and the Department of Energy’s Critical Materials Institute. “That’s not to say we’re going to recycle our way out of the critical materials challenge.”

Still, in the rare earth magnet market, for instance, recycling could satisfy as much as a quarter of the demand for rare earths within about 10 years, based on some estimates. “That’s huge,” he says.

But before the rare earths in an old laptop can be recycled as regularly as the aluminum in an empty soda can, there are technological, economic and logistical obstacles to overcome.

Why are rare earths so challenging to recycle?

Recycling seems like an obvious way to get more rare earths. It’s standard practice in the United States and Europe to recycle from 15 to 70 percent of other metals, such as iron, copper, aluminum, nickel and tin. Yet today, only about 1 percent of rare earth elements in old products are recycled, says Simon Jowitt, an economic geologist at the University of Nevada, Las Vegas.

“Copper wiring can be recycled into more copper wiring. Steel can just be recycled into more steel,” he says. But a lot of rare earth products are “inherently not very recyclable.”

Rare earths are often blended with other metals in touch screens and similar products, making removal difficult. In some ways, recycling rare earths from tossed-out items resembles the challenge of extracting them from ore and separating them from each other. Traditional rare earth recycling methods also require hazardous chemicals such as hydrochloric acid and a lot of heat, and thus a lot of energy. On top of the environmental footprint, the cost of recovery may not be worth the effort given the small yield of rare earths. A hard disk drive, for instance, might contain just a few grams; some products offer just milligrams.

Chemists and materials scientists, though, are trying to develop smarter recycling approaches. Their techniques put microbes to work, ditch the acids of traditional methods or attempt to bypass extraction and separation.

Microbial partners can help recycle rare earths

One approach leans on microscopic partners. Gluconobacter bacteria naturally produce organic acids that can pull rare earths, such as lanthanum and cerium, from spent catalysts used in petroleum refining or from fluorescent phosphors used in lighting. The bacterial acids are less environmentally harmful than hydrochloric acid or other traditional metal-leaching acids, says Yoshiko Fujita, a biogeochemist at Idaho National Laboratory in Idaho Falls. Fujita leads research into reuse and recycling at the Critical Materials Institute. “They can also be degraded naturally,” she says.

In experiments, the bacterial acids can recover only about a quarter to half of the rare earths from spent catalysts and phosphors. Hydrochloric acid can do much better — in some cases extracting as much as 99 percent. But bio-based leaching might still be profitable, Fujita and colleagues reported in 2019 in ACS Sustainable Chemistry & Engineering.

In a hypothetical plant recycling 19,000 metric tons of used catalyst a year, the team estimated annual revenues to be roughly $1.75 million. But feeding the bacteria that produce the acid on-site is a big expense. In a scenario in which the bacteria are fed refined sugar, total costs for producing the rare earths are roughly $1.6 million a year, leaving around just $150,000 in profits. Switching from sugar to corn stalks, husks and other harvest leftovers, however, would slash costs by about $500,000, raising profits to about $650,000.
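
Those figures imply a thin margin, which a quick calculation makes plain. The sketch below simply restates the numbers reported above; it is illustrative arithmetic, not the study’s own cost model, and the variable names are invented for clarity.

```python
# Rough profit arithmetic for the hypothetical bio-leaching plant described above
# (numbers restated from the 2019 ACS Sustainable Chemistry & Engineering estimate).
annual_revenue = 1_750_000             # roughly $1.75 million per year
costs_with_refined_sugar = 1_600_000   # total annual costs when feeding the bacteria refined sugar

profit_sugar = annual_revenue - costs_with_refined_sugar
print(f"Profit with refined-sugar feedstock: ${profit_sugar:,}")       # ~$150,000

# Switching to corn stalks, husks and other harvest leftovers cuts costs by about $500,000
cost_savings_from_crop_waste = 500_000
profit_crop_waste = profit_sugar + cost_savings_from_crop_waste
print(f"Profit with crop-waste feedstock:    ${profit_crop_waste:,}")  # ~$650,000
```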

One experimental recycling approach uses organic acids made by bacteria to extract rare earths from waste products. This reactor at the Idaho National Laboratory prepares an organic acid mixture for such recycling. (Idaho National Lab)

Other microbes can also help extract rare earths and take them even further. A few years ago, researchers discovered that some bacteria that metabolize rare earths produce a protein that preferentially grabs onto these metals. This protein, lanmodulin, can separate rare earths from each other, such as neodymium from dysprosium — two components of rare earth magnets. A lanmodulin-based system might eliminate the need for the many chemical solvents typically used in such separation. And the waste left behind — the protein — would be biodegradable. But whether the system will pan out on a commercial scale is unknown.

How to pull rare earths from discarded magnets

Another approach already being commercialized skips the acids and uses copper salts to pull the rare earths from discarded magnets, a valuable target. Neodymium-iron-boron magnets are about 30 percent rare earth by weight and the single largest application of the metals in the world. One projection suggests that recovering the neodymium in magnets from U.S. hard disk drives alone could meet about 5 percent of the world’s demand outside of China before the end of the decade.

Nlebedim led a team that developed a technique that uses copper salts to leach rare earths out of shredded electronic waste that contains magnets. Dunking the e-waste in a copper salt solution at room temperature dissolves the rare earths in the magnets. Other metals can be scooped out for their own recycling, and the copper can be reused to make more salt solution. Next, the rare earths are solidified and, with the help of additional chemicals and heating, transformed into powdered minerals called rare earth oxides. The process, which has also been used on material left over from magnet manufacturing that typically goes to waste, can recover 90 to 98 percent of the rare earths, and the material is pure enough to make new magnets, Nlebedim’s team has demonstrated.

In a best-case scenario, using this method to recycle 100 tons of leftover magnet material might produce 32 tons of rare earth oxides and net more than $1 million in profits, an economic analysis of the method suggests.

That study also evaluated the approach’s environmental impacts. Compared with producing one kilogram of rare earth oxide via one of the main types of mining and processing currently used in China, the copper salt method has less than half the carbon footprint. It produces an average of about 50 kilograms of carbon dioxide equivalent per kilogram of rare earth oxide versus 110, Nlebedim’s team reported in 2021 in ACS Sustainable Chemistry & Engineering.

But it’s not necessarily greener than all forms of mining. One sticking point is that the process requires toxic ammonium hydroxide and roasting, which consumes a lot of energy, and it still releases some carbon dioxide. Nlebedim’s group is now tweaking the technique. “We want to decarbonize the process and make it safer,” he says.

Meanwhile, the technology seems promising enough that TdVib, an Iowa company that designs and manufactures magnetic materials and products, has licensed it and built a pilot plant. The initial aim is to produce two tons of rare earth oxides per month, says Daniel Bina, TdVib’s president and CEO. The plant will recycle rare earths from old hard disk drives from data centers.

Noveon Magnetics, a company in San Marcos, Texas, is already making recycled neodymium-iron-boron magnets. In typical magnet manufacturing, the rare earths are mined, transformed into metal alloys, milled into a fine powder, magnetized and formed into a magnet. Noveon knocks out those first two steps, says company CEO Scott Dunn.

After demagnetizing and cleaning discarded magnets, Noveon directly mills them into a powder before building them back up as new magnets. Unlike with other recycling methods, there’s no need to extract and separate the rare earths out first. The final product can be more than 99 percent recycled magnet, Dunn says, with a small addition of virgin rare earth elements — the “secret sauce,” as he puts it — that allows the company to fine-tune the magnets’ attributes.

Compared with traditional magnet mining and manufacturing, Noveon’s method cuts energy use by about 90 percent, Miha Zakotnik, Noveon’s chief technology officer, and other researchers reported in 2016 in Environmental Technology & Innovation. Another 2016 analysis estimated that for every kilogram of magnet produced via Noveon’s method, about 12 kilograms of carbon dioxide equivalent are emitted. That’s about half as much of the greenhouse gas as conventional magnets.

Dunn declined to share what volume of magnets Noveon currently produces or how much its magnets cost. But the magnets are being used in some industrial applications, for pumps, fans and compressors, as well as some consumer power tools and other electronics.

To help with recycling, Apple developed the robot Daisy (shown), which can dismantle 23 models of iPhones. Other robots in the works — Taz and Dave — will specialize in recovering rare earth magnets. (Apple)

Rare earth recycling has logistical hurdles

Even as researchers clear technological hurdles, there are still logistical barriers to recycling. “We don’t have the systems for collecting end-of-life products that have rare earths in them,” Fujita says, “and there’s the cost of dismantling those products.” For a lot of e-waste, before rare earth recycling can begin, you have to get to the bits that contain those precious metals.

Noveon has a semiautomated process for removing magnets from hard disk drives and other electronics.

Apple is also trying to automate the recycling process. The company’s Daisy robot can dismantle iPhones. And in 2022, Apple announced a pair of robots called Taz and Dave that facilitate the recycling of rare earths. Taz can gather magnet-containing modules that are typically lost during the shredding of electronics. Dave can recover magnets from taptic engines, Apple’s technology for providing users with tactile feedback when, say, tapping an iPhone screen.

Even with robotic aids, it would still be a lot easier if companies just designed products in a way that made recycling easy, Fujita says.

No matter how good recycling gets, Jowitt sees no getting around the need to ramp up mining to feed our rare earth–hungry society. But he agrees recycling is necessary. “We’re dealing with intrinsically finite resources,” he says. “Better we try and extract what we can rather than just dumping it in the landfill.”

19 Jan. 2023

Today’s red jungle fowl — the wild forebears of the domesticated chicken — are becoming more chickenlike. New research suggests that a large proportion of the wild fowl’s DNA has been inherited from chickens, and relatively recently.

Ongoing interbreeding between the two birds may threaten wild jungle fowl populations’ future, and even hobble humans’ ability to breed better chickens, researchers report January 19 in PLOS Genetics.

Red jungle fowl (Gallus gallus) are forest birds native to Southeast Asia and parts of South Asia. Thousands of years ago, humans domesticated the fowl, possibly in the region’s rice fields (SN: 6/6/22). 

“Chickens are arguably the most important domestic animal on Earth,” says Frank Rheindt, an evolutionary biologist at the National University of Singapore. He points to their global ubiquity and abundance.  Chicken is also one of the cheapest sources of animal protein that humans have.

Domesticated chickens (G. gallus domesticus) were known to be interbreeding with jungle fowl near human settlements in Southeast Asia. Given the unknown impacts on jungle fowl and the importance of chickens to humankind, Rheindt and his team wanted to gather more details. Wild jungle fowl contain a store of genetic diversity that could serve as a crucial resource for breeding chickens resistant to diseases or other threats.

The researchers analyzed and compared the genomes — the full complement of an organism’s DNA — of 63 jungle fowl and 51 chickens from across Southeast Asia. Some of the jungle fowl samples came from museum specimens collected from 1874 through 1939, letting the team see how the genetic makeup of jungle fowl has changed over time. 

Over the last century or so, wild jungle fowl’s genomes have become increasingly similar to chickens’. Between about 20 and 50 percent of the genomes of modern jungle fowl originated in chickens, the team found. In contrast, many of the roughly 100-year-old jungle fowl had a chicken-ancestry share in the range of a few percent.

The rapid change probably comes from human communities expanding into the region’s wilderness, Rheindt says. Most modern jungle fowl live in close vicinity to humans’ free-ranging chickens, with which they frequently interbreed. 

Such interbreeding has become “almost the norm now” for any globally domesticated species, Rheindt says, such as dogs hybridizing with wolves and house cats crossing with wildcats. Pigs, meanwhile, are mixing with wild boars and ferrets with polecats.

Wild populations that interbreed with their domesticated counterparts could pick up physical or behavioral traits that change how the hybrids function in their ecosystem, says Claudio Quilodrán, a conservation geneticist at the University of Geneva not involved with this research. 

The effect is likely to be negative, Quilodrán says, since some of the traits coming into the wild population have been honed for human uses, not for survival in the local environment. 

Wild jungle fowl have also lost genetic diversity as they’ve interbred. The birds’ heterozygosity — a measure of a population’s genetic diversity — is now just a tenth of what it was a century ago.

“This result is initially counterintuitive,” Rheindt says. “If you mix one population with another, you would generally expect a higher genetic diversity.”

But domesticated chickens have such low genetic diversity that certain versions of jungle fowl genes are being swept out of the population by a tsunami of genetic homogeneity. The whittling down of these animals’ genetic toolkit may leave them vulnerable to conservation threats.

“Having lots of genetic diversity within a species increases the chance that certain individuals contain the genetic background to adapt to a varied range of different environmental changes and diseases,” says Graham Etherington, a computational biologist at the Earlham Institute in Norwich, England, who was not involved with this research.

A shallower jungle fowl gene pool could also mean diminished resources for breeding better chickens. The genetics of wild relatives are sometimes used to bolster the disease or pest resistance of domesticated crop plants. Jungle fowl genomes could be similarly valuable for this reason.

“If this trend continues unabated, future human generations may only be able to access the entirety of ancestral genetic diversity of chickens in the form of museum specimens,” Rheindt says, which could hamper chicken breeding efforts using the wild fowl genes. 

Some countries such as Singapore, Rheindt says, have started managing jungle fowl populations to reduce interbreeding with chickens.

19 Jan. 2023

The night sky has been brightening faster than researchers realized, thanks to the use of artificial lights at night. A study of more than 50,000 observations of stars by citizen scientists reveals that the night sky grew about 10 percent brighter, on average, every year from 2011 to 2022.

In other words, a baby born in a region where roughly 250 stars were visible every night would see only 100 stars on their 18th birthday, researchers report in the Jan. 20 Science.

The perils of light pollution go far beyond not being able to see as many stars. Too much brightness at night can harm people’s health, send migrating birds flying into buildings, disrupt food webs by drawing pollinating insects toward lights instead of plants and may even interrupt fireflies trying to have sex (SN: 8/2/17; SN: 8/12/15).

“In a way, this is a call to action,” says astronomer Connie Walker of the National Optical-Infrared Astronomy Research Laboratory in Tucson. “People should consider that this does have an impact on our lives. It’s not just astronomy. It impacts our health. It impacts other animals who cannot speak for themselves.”

Walker works with the Globe at Night campaign, which began in the mid-2000s as an outreach project to connect students in Arizona and Chile and now has thousands of participants worldwide. Contributors compare the stars they can see with maps of what stars would be visible at different levels of light pollution, and enter the results on an app.

“I’d been quite skeptical of Globe at Night” as a tool for precision research, admits physicist Christopher Kyba of the GFZ German Research Centre for Geosciences in Potsdam. But the power is in the sheer numbers: Kyba and colleagues analyzed 51,351 individual data points collected from 2011 to 2022.

“The individual data are not precise, but there’s a whole lot of them,” he says. “This Globe at Night project is not just a game; it’s really useful data. And the more people participate, the more powerful it gets.”

Those data, combined with a global atlas of sky luminance published in 2016, allowed the team to conclude that the night sky’s brightness increased by an average 9.6 percent per year from 2011 to 2022 (SN: 6/10/16).

Most of that increase was missed by satellites that collect brightness data across the globe. Those measurements saw just a 2 percent increase in brightness per year over the last decade.
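
The gap between those two annual rates compounds quickly. Here is a minimal sketch of that compounding, using only the 9.6 percent and 2 percent figures and the 2011 to 2022 span quoted above; the 18-year line echoes the newborn example earlier in the story.

```python
# Compounding the two annual sky-brightening rates quoted above.
citizen_rate = 0.096   # per year, Globe at Night citizen-science analysis
satellite_rate = 0.02  # per year, satellite measurements

years_of_study = 2022 - 2011   # the 11-year span analyzed
print(f"Citizen-science estimate: sky ~{(1 + citizen_rate) ** years_of_study:.1f}x brighter")    # ~2.7x
print(f"Satellite-based estimate: sky ~{(1 + satellite_rate) ** years_of_study:.1f}x brighter")  # ~1.2x

# Over the 18-year horizon of the newborn example above:
print(f"Over 18 years at 9.6 percent per year: ~{(1 + citizen_rate) ** 18:.1f}x brighter")       # ~5.2x
```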

There are several reasons for that, Kyba says. Since the early 2010s, many outdoor lights have switched from high-pressure sodium lightbulbs to LEDs. LEDs are more energy efficient, which has environmental benefits and cost savings.

But LEDs also emit more short-wavelength blue light, which scatters off particles in the atmosphere more than sodium bulbs’ orange light, creating more sky glow. Existing satellites are not sensitive to blue wavelengths, so they underestimate the light pollution coming from LEDs. And satellites may miss light that shines toward the horizon, such as light emitted by a sign or from a window, rather than straight up or down.

Satellites have missed some of the light pollution from LEDs, which emit in blue wavelengths. This image from the International Space Station shows LEDs in the center of Milan glowing brighter than the orange lights in the suburbs. (Samantha Cristoforetti, NASA, ESA)

Astronomer and light pollution researcher John Barentine was not surprised that satellites underestimated the problem. But “I was still surprised by how much of an underestimate it was,” he says. “This paper is confirming that we’ve been undercounting light pollution in the world.”

The good news is that no major technological breakthroughs are needed to help fix the problem. Scientists and policy makers just need to convince people to change how they use light at night — easier said than done.

“People sometimes say light pollution is the easiest pollution to solve, because you just have to turn a switch and it goes away,” Kyba says. “That’s true. But it’s ignoring the social problem — that this overall problem of light pollution is made by billions of individual decisions.”

Some simple solutions include dimming or turning off lights overnight, especially floodlighting or lights in empty parking lots.

Kyba shared a story about a church in Slovenia that switched from four 400-watt floodlights to a single 58-watt LED, shining behind a cutout of the church to focus the light on its facade. The result was a 96 percent reduction in energy use and much less wasted light, Kyba reported in the International Journal of Sustainable Lighting in 2018. The church was still lit up, but the grass, trees and sky around it remained dark.
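
That 96 percent figure follows directly from the wattages quoted. A quick check, restating the numbers from the paragraph above:

```python
# Energy-use reduction for the Slovenian church retrofit described above.
old_floodlights_watts = 4 * 400   # four 400-watt floodlights
new_led_watts = 58                # single 58-watt LED behind a cutout

reduction = 1 - new_led_watts / old_floodlights_watts
print(f"Energy use cut by about {reduction:.0%}")   # ~96%
```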

“If it was possible to replicate that story over and over again throughout our society, it would suggest you could really drastically reduce the light in the sky, still have a lit environment and have better vision and consume a lot less energy,” he says. “This is kind of the dream.”

Barentine, who leads a private dark-sky consulting firm, thinks widespread awareness of the problem — and subsequent action — could be imminent. For comparison, he points to a highly publicized oil slick fire on the Cuyahoga River, outside of Cleveland, in 1969 that fueled the environmental movement of the 1960s and ’70s, and prompted the U.S. Congress to pass the Clean Water Act.

“I think we’re on the precipice, maybe, of having the river-on-fire moment for light pollution,” he says.

19 Jan. 2023

A type of bacteria that’s overabundant in the nasal passages of people with hay fever may worsen symptoms. Targeting that bacteria may provide a way to rein in ever-running noses.

Hay fever occurs when allergens, such as pollen or mold, trigger an inflammatory reaction in the nasal passages, leading to itchiness, sneezing and overflowing mucus. Researchers analyzed the composition of the microbial population in the noses of 55 people who have hay fever and those of 105 people who don’t. There was less diversity in the nasal microbiome of people who have hay fever and a whole lot more of a bacterial species called Streptococcus salivarius, the team reports online January 12 in Nature Microbiology.  

S. salivarius was 17 times more abundant in the noses of allergy sufferers than the noses of those without allergies, says Michael Otto, a molecular microbiologist at the National Institute of Allergy and Infectious Diseases in Bethesda, Md. That imbalance appears to play a part in further provoking allergy symptoms. In laboratory experiments with allergen-exposed cells that line the airways, S. salivarius boosted the cells’ production of proteins that promote inflammation.

And it turns out that S. salivarius really likes runny noses. One prominent, unpleasant symptom of hay fever is the overproduction of nasal discharge. The researchers found that S. salivarius binds very well to airway-lining cells exposed to an allergen and slathered in mucus — better than a comparison bacterium that also resides in the nose.

The close contact appears to be what makes the difference. It means that substances on S. salivarius’ surface that can drive inflammation — common among many bacteria — are close enough to exert their effect on cells, Otto says.

Hay fever, which disrupts daily activities and disturbs sleep, is estimated to affect as many as 30 percent of adults in the United States. The new research opens the door “to future studies targeting this bacteria” as a potential treatment for hay fever, says Mahboobeh Mahdavinia, a physician scientist who studies immunology and allergies at Rush University Medical Center in Chicago.

But any treatment would need to avoid harming the “good” bacteria that live in the nose, says Mahdavinia, who was not involved in the research.

The proteins on S. salivarius’ surface that are important to its ability to attach to mucus-covered cells might provide a target, says Otto. The bacteria bind to proteins called mucins found in the slimy, runny mucus. By learning more about S. salivarius’ surface proteins, Otto says, it may be possible to come up with “specific methods to block that adhesion.”

18 Jan. 2023

High-tech shrink art may be the key to making tiny electronics, 3-D nanostructures or even holograms for hiding secret messages.

A new approach to making tiny structures relies on shrinking them down after building them, rather than making them small to begin with, researchers report in the Dec. 23 Science.

The key is spongelike hydrogel materials that expand or contract in response to surrounding chemicals (SN: 1/20/10). By inscribing patterns in hydrogels with a laser and then shrinking the gels down to about one-thirteenth their original size, the researchers created patterns with details as small as 25 billionths of a meter across.
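
Those two numbers imply how coarse the laser writing can afford to be before the gel is shrunk. A rough sketch of that arithmetic (the one-thirteenth shrink factor and 25-nanometer feature size are from the study; the calculation itself is just illustrative):

```python
# Rough feature-size arithmetic for the shrinking-hydrogel approach described above.
shrink_factor = 1 / 13     # gels shrink to about one-thirteenth their original size
final_feature_nm = 25      # smallest details after shrinking, in nanometers

# Feature size the laser actually needs to write before the gel is shrunk
written_feature_nm = final_feature_nm / shrink_factor
print(f"Laser-written feature size before shrinking: ~{written_feature_nm:.0f} nm")  # ~325 nm
```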

At that level of precision, the researchers could create letters small enough to easily write this entire article along the circumference of a typical human hair.

Biological scientist Yongxin Zhao and colleagues deposited a variety of materials in the patterns to create nanoscopic images of Chinese zodiac animals. By shrinking the hydrogels after laser etching, several of the images ended up roughly the size of a red blood cell. They included a monkey made of silver, a gold-silver alloy pig, a titanium dioxide snake, an iron oxide dog and a rabbit made of luminescent nanoparticles.

These two dragons, each roughly 40 micrometers long, were made by depositing cadmium selenide quantum dots onto a laser-etched hydrogel. The red stripes on the left dragon are each just 200 nanometers thick. (The Chinese University of Hong Kong, Carnegie Mellon University)

Because the hydrogels can be repeatedly shrunk and expanded with chemical baths, the researchers were also able to create holograms in layers inside a chunk of hydrogel to encode secret information. Shrinking a hydrogel hologram makes it unreadable. “If you want to read it, you have to expand the sample,” says Zhao, of Carnegie Mellon University in Pittsburgh. “But you need to expand it to exactly the same extent” as the original. In effect, knowing how much to expand the hydrogel serves as a key to unlock the information hidden inside.  

But the most exciting aspect of the research, Zhao says, is the wide range of materials that researchers can use on such minute scales. “We will be able to combine different types of materials together and make truly functional nanodevices.”
