LSD’s grip on brain protein could explain drug’s long-lasting effects

Locked inside a human brain protein, the hallucinogenic drug LSD takes an extra-long trip.

New X-ray crystallography images reveal how an LSD molecule gets trapped within a protein that senses serotonin, a key chemical messenger in the brain. The protein, called a serotonin receptor, belongs to a family of proteins involved in everything from perception to mood.

The work is the first to decipher the structure of such a receptor bound to LSD, which gets snared in the protein for hours. That could explain why “acid trips” last so long, study coauthor Bryan Roth and colleagues report January 26 in Cell. It’s “the first snapshot of LSD in action,” he says. “Until now, we had no idea how it worked at the molecular level.”
But the results might not be that relevant to people, warns Cornell University biophysicist Harel Weinstein.

Roth’s group didn’t capture the main target of LSD, a serotonin receptor called 5-HT2A, instead imaging the related receptor 5-HT2B. That receptor is “important in rodents, but not that important in humans,” Weinstein says.

Roth’s team has devoted decades to working on 5-HT2A, but the receptor has “thus far been impossible to crystallize,” he says. The predicted structure of 5-HT2A, though, is very similar to that of 5-HT2B, he says.

LSD, or lysergic acid diethylamide, was first cooked up in a chemist’s lab in 1938. It was popular (and legal) for recreational use in the early 1960s, but the United States later banned the drug (also known as blotter, boomer, Purple Haze and electric Kool-Aid).

It’s known for altering perception and mood — and for its unusually long-lasting effects. An acid trip can run some 15 hours, and at high doses, effects can linger for days. “It’s an extraordinarily potent drug,” says Roth, a psychiatrist and pharmacologist at the University of North Carolina School of Medicine in Chapel Hill.
Scientists have known for decades that LSD targets serotonin receptors in the brain. These proteins, which are also found in the intestine and elsewhere in the body, lodge within the outer membranes of nerve cells and relay chemical signals to the cells’ interiors. But no one knew exactly how LSD fit into the receptor, or why the drug was so powerful.

Roth and colleagues’ work shows the drug hunkered deep inside a pocket of the receptor, grabbing onto an amino acid that acts like a handle to pull down a lid. It’s like a person holding the door of a storm cellar closed during a tornado, Roth says.

When the team did additional molecular experiments, tweaking the lid’s handle so that LSD could no longer hang on, the drug slipped out of the pocket faster than when the handle was intact. That was true whether the team used receptor 5-HT2B or 5-HT2A, Roth says. (Though the researchers couldn’t crystallize 5-HT2A, they were able to grow the protein inside cells in the lab for use in their other experiments.) The results suggest that LSD’s grip on the receptor is what keeps it trapped inside. “That explains to a great extent why LSD is so potent and why it’s so long-lasting,” Roth says.

David Nutt, a neuropsychopharmacologist at Imperial College London, agrees. He calls the work an “elegant use of molecular science.”

Weinstein remains skeptical. The 5-HT2A receptor is the interesting one, he maintains. A structure of that protein “has been needed for a very long time.” That’s what would really help explain the hallucinogenic effects of LSD, he says.

If chewing sounds irk you, blame your brain

The sound of someone slurping coffee or crunching an apple can be mildly annoying — but it leaves some people seething. These people aren’t imagining their distress, new research suggests. Anger and anxiety in response to everyday sounds of eating, drinking and breathing come from increased activity in parts of the brain that process and regulate emotions, scientists report February 2 in Current Biology.

People with this condition, called misophonia, are often dismissed as just overly sensitive, says Jennifer Jo Brout, a clinical psychologist not involved with the study. “This really confirms that it’s neurologically based,” says Brout, founder of the Sensory Processing and Emotion Regulation Program at Duke University Medical Center.
Researchers played sounds to 20 people with misophonia and 22 people without. Some sounds were neutral, such as rain falling. Others, like a wailing baby, were annoying to both groups of people but didn’t cause a misophonic response. A third set were sounds known to cause distress in people with misophonia — chewing and breathing noises.

MRI brain scans showed that both groups of people reacted similarly to the neutral and annoying sounds. But misophonics responded far more dramatically to the chewing and breathing. They showed more activity in their anterior insular cortex, a brain structure involved in emotional processing. Scientists found structural differences, too — more connections from the anterior insular cortex to structures like the amygdala and the hippocampus, which also help with processing emotions.
People with misophonia also showed increased heart rate and skin conductivity. That’s the same sort of fight-or-flight response that gets triggered when facing a wild animal or a public speaking engagement.
Sounds most people ignore in their day-to-day listening create a very strong emotional response in misophonics, says study coauthor Sukhbinder Kumar, a cognitive neuroscientist at Newcastle University in Newcastle upon Tyne, England. Their brains are ascribing extra importance to certain sounds. But it’s still unclear why only specific sounds cause a reaction.

Mysteries of time still stump scientists

The topic of time is both excruciatingly complicated and slippery. The combination makes it easy to get bogged down. But instead of an exhaustive review, journalist Alan Burdick lets curiosity be his guide in Why Time Flies, an approach that leads to a light yet supremely satisfying story about time as it runs through — and is perceived by — the human body.

Burdick doesn’t restrict himself to any one aspect of his question. He spends time excavating what he calls the “existential caverns,” where philosophical questions, such as the shifting concept of now, dwell. He describes the circadian clocks that keep bodies running efficiently, making sure our bodies are primed to digest food at mealtimes, for instance. He even covers the intriguing and slightly insane self-experimentation by the French scientist Michel Siffre, who crawled into caves in 1962 and 1972 to see how his body responded in places without any time cues.
In the service of his exploration, Burdick lived in constant daylight in the Alaskan Arctic for two summery weeks, visited the master timekeepers at the International Bureau of Weights and Measures in Paris to see how they precisely mete out the seconds and plunged off a giant platform to see if time felt slower during moments of stress. The book not only deals with fascinating temporal science but also how time is largely a social construct. “Time is what everybody agrees the time is,” one researcher told Burdick.
That subjective truth also applies to the brain. Time, in a sense, is created by the mind. “Our experience of time is not a cave shadow to some true and absolute thing; time is our perception,” Burdick writes. That subjective experience becomes obvious when Burdick recounts how easily our brains’ clocks can be swayed. Emotions, attention (SN: 12/10/16, p. 10) and even fever can distort our time perception, scientists have found.

Burdick delves deep into several neuroscientific theories of how time runs through the brain (SN: 7/25/15, p. 20). Here, the story narrows somewhat in an effort to thoroughly explain a few key ideas. But even amid these details, Burdick doesn’t lose the overarching truth — that for the most part, scientists simply don’t know the answers. That may be because there is no one answer; instead, the brain may create time by stitching together a multitude of neural clocks.
After reading Why Time Flies, readers will be convinced that no matter how much time passes, the mystery of time will endure.

Germanium computer chips gain ground on silicon — again

First germanium integrated circuits

Integrated circuits made of germanium instead of silicon have been reported … by researchers at International Business Machines Corp. Even though the experimental devices are about three times as large as the smallest silicon circuits, they reportedly offer faster overall switching speed. Germanium … has inherently greater mobility than silicon, which means that electrons move through it faster when a current is applied. — Science News, February 25, 1967

UPDATE:
Silicon circuits still dominate computing. But demand for smaller, high-speed electronics is pushing silicon to its physical limits, sending engineers back for a fresh look at germanium. Researchers built the first compact, high-performance germanium circuit in 2014, and scientists continue to fiddle with its physical properties to make smaller, faster circuits. Although not yet widely used, germanium circuits and those made from other materials, such as carbon nanotubes, could help engineers make more energy-efficient electronics.

Ricin poisoning may one day be treatable with new antidote

WASHINGTON — It has been used by an assassin wielding a poisoned umbrella and sent in a suspicious letter to a president.

Ricin, the potent toxin and bioterrorism agent, has no antidote and can cause death within days. But a cocktail of antibodies could one day offer victims at least a slim window for treatment.

A new study presented February 7 at the American Society for Microbiology’s Biothreats meeting reveals a ricin antidote that, in mice, works even days after exposure to the toxin. A second study presented at the meeting offers a potential explanation for how such an antidote might work.
Doctors need some way to deal with ricin poisoning, said Patrick Cherubin, a cell biologist at the University of Central Florida in Orlando. Immunologist Nicholas Mantis agreed: “There is no specific treatment or therapy whatsoever.”

Though ricin has an innocuous origin (it’s found in castor beans), the poison is anything but harmless. It’s dangerous and relatively easy to spread — rated by the U.S. Centers for Disease Control and Prevention as a category B bioterrorism agent, just behind the highest-risk category A agents such as anthrax, plague and Ebola.

Ricin poisoning is rare but has featured in some high-profile cases. In 1978, Bulgarian writer Georgi Markov was hit in the thigh with a ricin-poisoned pellet shot from an umbrella gun. A few days later, he was dead. In 2013, a letter addressed to President Barack Obama tested positive for granules of the deadly toxin. A Texas woman had ordered castor bean seeds and lye online, for a do-it-yourself approach to making ricin. No one was injured.

Symptoms of ricin poisoning depend on how the toxin enters the body, and how much gets in. Inhaling ricin can make breathing so difficult the skin turns blue. Ingesting ricin can cause diarrhea, vomiting and seizures. Death can come as soon as 36 hours after exposure.

Ricin is known as an RIP — a scary-sounding acronym that stands for ribosome-inactivating protein, said Mantis, of the New York State Department of Health in Albany. In the cell, ribosomes serve as tiny protein factories. After ricin exposure, “the whole machinery comes to a screeching halt,” Mantis said. For cells, shutting down protein factories for too long is a death sentence.
Scientists have developed two vaccines for ricin, though neither is available yet for use in humans. A vaccine may be “good for soldiers going into the field,” said biochemist Ohad Mazor of the Israel Institute for Biological Research in Ness Ziona. But unvaccinated people are out of luck.
Mazor and colleagues developed a new treatment that could potentially help. The treatment is a mixture of three proteins called neutralizing antibodies; they grab onto ricin and don’t easily let go. With antibodies hanging onto its back, ricin has trouble slipping into cells and wreaking its usual havoc.
Even 48 hours after inhaling ricin, roughly 73 percent of mice treated with the antibodies (22 out of 30) survived, the team reported at the meeting and in a paper published in the March 1 Toxicon. Untreated mice died within a week.

Previous antibody treatments for ricin work well only if mice are treated within hours after exposure, Mazor said. For poisoned humans, that may not be long enough to diagnose the problem. Mazor doesn’t know how his antibodies might work in people, but he’d like to follow up his mouse work with studies in monkeys or pigs.

Scientists haven’t figured out exactly how antibodies help animals recover, but another study presented at the meeting offers a clue. Cherubin and colleagues added ricin to monkey cells in a dish, and then tracked how much protein was manufactured by the cells.

At high enough levels, ricin exposure shuttered the factories as expected. But when researchers stopped exposing cells to the toxin, protein synthesis started up again and cells recovered. “You need ongoing toxin delivery to eventually kill the cell,” Cherubin said. It’s possible that antibody treatments could cut off ricin delivery to cells, letting them bounce back from poisoning, said study coauthor Ken Teter, also a cell biologist at the University of Central Florida.

Helium’s inertness defied by high-pressure compound

Helium — the recluse of the periodic table — is reluctant to react with other elements. But squeeze the element hard enough, and it will form a chemical compound with sodium, scientists report.

Helium, a noble gas, is one of the periodic table’s least reactive elements. Originally, the noble gases were believed incapable of forming any chemical compounds at all. But after scientists created xenon compounds in the early 1960s, a slew of other noble gas compounds followed. Helium, however, has largely been a holdout.
Although helium was known to hook up with certain elements, the bonds in those compounds were weak, or the compounds were short-lived or electrically charged. But the new compound, called sodium helide or Na2He, is stable at high pressure, and its bonds are strong, an international team of scientists reports February 6 in Nature Chemistry.

As a robust helium compound, “this is really the first that people ever observed,” says chemist Maosheng Miao of California State University, Northridge, who was not involved with the research.

The material’s properties are still poorly understood, but it is unlikely to have immediate practical applications — scientists can create it only in tiny amounts at very high pressures, says study coauthor Alexander Goncharov, a physicist at the Carnegie Institution for Science in Washington, D.C. Instead, the oddball compound serves as inspiration for scientists who hope to produce weird new materials at lower pressures. “I would say that it’s not totally impossible,” says Goncharov. Scientists may be able to tweak the compound, for example, by adding or switching out elements, to decrease the pressure needed.

To coerce helium to link up with another element, the scientists, led by Artem Oganov of Stony Brook University in New York, first performed computer calculations to see which compounds might be possible. Sodium, calculations predicted, would form a compound with helium if crushed under enormously high pressure. Under such conditions, the typical rules of chemistry change — elements that refuse to react at atmospheric pressure can sometimes become bosom buddies when given a squeeze.

So Goncharov and colleagues pinched small amounts of helium and sodium between a pair of diamonds, reaching pressures more than a million times that of Earth’s atmosphere, and heated the material with lasers to temperatures above 1,500 kelvins (about 1200° Celsius). By scattering X-rays off the compound, the scientists could deduce its structure, which matched the one predicted by calculations.
“I think this is really the triumph of computation,” says Miao. In the search for new compounds, computers now allow scientists to skip expensive trial-and-error experiments and zero in on the best candidates to create in a laboratory.

Na2He is an unusual type of compound known as an electride, in which pairs of electrons are cloistered off, away from any atoms. But despite the compound’s bizarre nature, it behaves somewhat like a commonplace compound such as table salt, in which negatively charged chloride ions alternate with positively charged sodium ions. In Na2He, the isolated electron pairs act like the negative ions in such a compound, and the eight sodium atoms surrounding each helium atom are the positive ions.

“The idea that you can make compounds with things like helium which don’t react at all, I think it’s pretty interesting,” says physicist Eugene Gregoryanz of the University of Edinburgh. But, he adds, “I would like to see more experiments” to confirm the result.

The scientists’ calculations also predicted that a compound of helium, sodium and oxygen, called Na2HeO, should form at even lower pressures, though that one has yet to be created in the lab. So the oddball new helium compound may soon have a confirmed cousin.

New, greener catalysts are built for speed

Platinum, one of the rarest and most expensive metals on Earth, may soon find itself out of a job. Known for its allure in engagement rings, platinum is also treasured for its ability to jump-start chemical reactions. It’s an excellent catalyst, able to turn standoffish molecules into fast friends. But Earth’s supply of the metal is limited, so scientists are trying to coax materials that aren’t platinum — aren’t even metals — into acting like they are.

For years, platinum has been offering behind-the-scenes hustle in catalytic converters, which remove harmful pollutants from auto exhaust. It’s also one of a handful of rare metals that move along chemical reactions in many well-established industries. And now, clean energy technology opens a new and growing market for the metal. Energy-converting devices like fuel cells being developed to power some types of electric vehicles rely on platinum’s catalytic properties to transform hydrogen into electricity. Even generating the hydrogen fuel itself depends on platinum.

Without a cheaper substitute for platinum, these clean energy technologies won’t be able to compete against fossil fuels, says Liming Dai, a materials scientist at Case Western Reserve University in Cleveland.

To reduce the pressure on platinum, Dai and others are engineering new materials that have the same catalytic powers as platinum and other metals — without the high price tag. Some researchers are replacing expensive metals with cheaper, more abundant building blocks, like carbon. Others are turning to biology, using catalysts perfected by years of evolution as inspiration. And when platinum really is best for a job, researchers are retooling how it is used to get more bang for the buck.
Moving right along
Catalysts are the unsung heroes of the chemical reactions that make human society tick. These molecular matchmakers are used in manufacturing plastics and pharmaceuticals, petroleum and coal processing and now clean energy technology. Catalysts are even inside our bodies, in the form of enzymes that break food into nutrients and help cells make energy.
During any chemical reaction, molecules break chemical bonds between their atomic building blocks and then make new bonds with different atoms — like swapping partners at a square dance. Sometimes, those partnerships are easy to break: A molecule has certain properties that let it lure away atoms from another molecule. But in stable partnerships, the molecules are content as they are. Left together for a very long period of time, a few might eventually switch partners. But there’s no mass frenzy of bond breaking and rebuilding.

Catalysts make this breaking and rebuilding happen more efficiently by lowering the activation energy — the threshold amount of energy needed to make a chemical reaction go. Starting and ending products stay the same; the catalyst just changes the path, building a paved highway to bypass a bumpy dirt road. With an easier route, molecules that might take years to react can do so in seconds instead. A catalyst doesn’t get used up in the reaction, though. Like a wingman, it incentivizes other molecules to react, and then it bows out.
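The effect of lowering that threshold can be sketched with the Arrhenius relation, k = A·exp(−Ea/RT), which ties a reaction’s rate to its activation energy Ea. The barrier values below are hypothetical, chosen only to show how sharply the rate responds:

```python
import math

# Arrhenius relation: k = A * exp(-Ea / (R * T)).
# A catalyst lowers the activation energy Ea; the two barriers
# below are invented, purely to illustrate the effect.
R = 8.314   # gas constant, J/(mol*K)
T = 298.0   # room temperature, K

def rate_constant(ea_kj_per_mol, prefactor=1.0):
    """Relative rate constant for a barrier of ea_kj_per_mol kJ/mol."""
    return prefactor * math.exp(-ea_kj_per_mol * 1000.0 / (R * T))

# Same reaction, uncatalyzed (100 kJ/mol) vs. catalyzed (70 kJ/mol)
speedup = rate_constant(70.0) / rate_constant(100.0)
print(f"Lowering Ea from 100 to 70 kJ/mol speeds the reaction ~{speedup:,.0f}-fold")
```

Even a modest drop in the barrier multiplies the rate by orders of magnitude — the paved highway in the metaphor above.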

A hydrogen fuel cell, for example, works by reacting hydrogen gas (H2) with oxygen gas (O2) to make water (H2O) and electricity. The fuel cell needs to break apart the atoms of the hydrogen and oxygen molecules and reshuffle them into new molecules. Without some assistance, the reshuffling happens very slowly. Platinum propels those reactions along.
Platinum works well in fuel cell reactions because it interacts just the right amount with both hydrogen and oxygen. That is, the platinum surface attracts the gas molecules, pulling them close together to speed along the reaction. But then it lets its handiwork float free. Chemists call that “turnover” — how efficiently a catalyst can draw in molecules, help them react, then send them back out into the world.
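Chemists put a number on that turnover with the turnover frequency (TOF): product molecules formed per active site per second. A minimal sketch of the bookkeeping, with invented values:

```python
# Turnover frequency (TOF) = molecules converted / (active sites * time).
# All values are hypothetical, purely to show the arithmetic.
molecules_converted = 6.0e20  # product molecules formed during the run
active_sites = 1.0e18         # catalytic surface atoms available
elapsed_seconds = 60.0        # length of the run

tof = molecules_converted / (active_sites * elapsed_seconds)
print(f"TOF: {tof:.1f} reactions per site per second")
```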

Platinum isn’t the only superstar catalyst. Other metals with similar chemical properties also get the job done — palladium, ruthenium and iridium, for example. But those elements are also expensive and hard to get. They are so good at what they do that it’s hard to find a substitute. But promising new options are in the works.
Carbon is key
Carbon is a particularly attractive alternative to precious metals like platinum because it’s cheap, abundant and can be assembled into many different structures.

Carbon atoms can arrange themselves into flat sheets of orderly hexagonal rings, like chicken wire. Rolling these chicken wire sheets — known as graphene — into hollow tubes makes carbon nanotubes, which are stronger than steel for their weight. But carbon-only structures don’t make great catalysts.

“Really pure graphene isn’t catalytically active,” says Huixin He, a chemist at Rutgers University in Newark, N.J. But replacing some of the carbon atoms in the framework with nitrogen, phosphorus or other atoms changes the way electric charge is distributed throughout the material. And that can make carbon behave more like a metal. For example, nitrogen atoms sprinkled like chocolate chips into the carbon structure draw negatively charged electrons away from the carbon atoms. The carbon atoms are left with a more positive charge, making them more attractive to the reaction that needs a nudge.

That movement of electrical charge is a prerequisite for a material to act as a catalyst, says Dai, who has pioneered the development of carbon-based, metal-free catalysts. His lab group demonstrated in 2009 in Science that clumps of nitrogen-containing carbon nanotubes aligned vertically — like a fistful of uncooked spaghetti — could stand in for platinum to help break apart oxygen inside fuel cells.
To perfect the technology, which he has patented, Dai has been swapping in different atoms in different combinations and experimenting with various carbon structures. Should the catalyst be a flat sheet of graphene or a forest of rolled up nanotubes, or some hybrid of both? Should it contain just nitrogen and carbon, or a smorgasbord of other elements, too? The answer depends on the specific application.

In 2015 in Science Advances, Dai demonstrated that nitrogen-studded nanotubes worked in acid-containing fuel cells, one of the most promising designs for electric vehicles.

Other researchers are playing their own riffs on the carbon concept. Producing graphene’s orderly structure requires just the right temperature and specific reaction conditions. Amorphous carbon materials — in which the atoms are randomly clumped together — can be easier to make, Rutgers’ He says.

In one experiment, He’s team started with liquid phytic acid, a substance made of carbon, oxygen and phosphorus. Microwaving the liquid for less than a minute transformed it into a sooty black powder that she describes as a sticky sort of sand.

“Phytic acid strongly absorbs microwave energy and changes it to heat so fast,” she says. The heat rearranges the atoms into a jumbled carbon structure studded with phosphorus atoms. Like the nitrogen atoms in Dai’s nanotubes, the phosphorus atoms changed the movement of electric charge through the material and made it catalytically active, He and colleagues reported last year in ACS Nano.

The sooty phytic acid–based catalyst could help move along a different form of clean energy: It sped up a reaction that turns a big, hard-to-use molecule found in cellulose — a tough, woody component of plants — into something that can react with other molecules. That product could then be used to make fuel or other chemicals. He is still tweaking the catalyst to make it work better.

He’s catalyst particles get mixed into the chemical reaction (and later need to be strained out). These more jumbled carbon structures with nitrogen or phosphorus sprinkled in can work in fuel cells, too — and, she says, they’re easier to make than graphene.

Enzyme-inspired energy
Rather than design new materials from the bottom up, some scientists are repurposing catalysts already used in nature: enzymes. Inside living things, enzymes are involved in everything from copying genetic material to breaking down food and nutrients.

Enzymes have a few advantages as catalysts, says M.G. Finn, a chemist at Georgia Tech. They tend to be very specific for a particular reaction, so they won’t waste much energy propelling undesired side reactions. And because they can evolve, enzymes can be tailored to meet different needs.

On their own, enzymes can be too fragile to use in industrial manufacturing, says Trevor Douglas, a chemist at Indiana University in Bloomington. For a solution, his team looked to viruses, which already package enzymes and other proteins inside protective cases.

“We can use these compartments to stabilize the enzymes, to protect them from things that might chew them up in the environment,” Douglas says. The researchers are engineering bacteria to churn out virus-inspired capsules that can be used as catalysts in a variety of applications.
His team mostly uses enzymes called hydrogenases, but other enzymes can work, too. The researchers put the genetic instructions for making the enzymes and for building a protective coating into Escherichia coli bacteria. The bacteria go into production mode, pumping out particles with the hydrogenase enzymes protected inside, Douglas and colleagues reported last year in Nature Chemistry. The protective coating keeps chunky enzymes contained, but lets the molecules they assist get in and out.

“What we’ve done is co-opt the biological processes,” Douglas says. “All we have to do is grow the bacteria and turn on these genes.” Bacteria, he points out, tend to grow quite easily. It’s a sustainable system, and one that’s easily tailored to different reactions by swapping out one enzyme for another.

The enzyme-containing particles can speed along generation of the hydrogen fuel, he has found. But there are still technical challenges: These catalysts last only a couple of days, and figuring out how to replace them inside a consumer device is hard.

Other scientists are using existing enzymes as templates for catalysts of their own design. The same family of hydrogenase enzymes that Douglas is packaging into capsules can be a launching point for lab-built catalysts that are even more efficient than their natural counterparts.

One of these hydrogenases has an iron core plus an amine — a nitrogen-containing string of atoms — hanging off. Just as the nitrogen worked into Dai’s carbon nanotubes affected the way electrons were distributed throughout the material, the amine changes the way the rest of the molecule acts as a catalyst.

Morris Bullock, a researcher at Pacific Northwest National Laboratory in Richland, Wash., is trying to figure out exactly how that interaction plays out. He and colleagues are building catalysts with cheap and abundant metals like iron and nickel at their core, paired with different types of amines. By systematically varying the metal core and the structure and position of the amine, they’re testing which combinations work best.

These amine-containing catalysts aren’t ready for prime time yet — Bullock’s team is focused on understanding how the catalysts work rather than on perfecting them for industry. But the findings provide a springboard for other scientists to push these catalysts toward commercialization.

Sticking with the metals
These new types of catalysts are promising — many of them can speed up reactions almost as well as a traditional platinum catalyst. But even researchers working on platinum alternatives agree that making sustainable and low-cost catalysts isn’t always as simple as removing the expensive and rare metals.

“The calculation of sustainability is not completely straightforward,” Finn says. Though he works with enzymes in his lab, he says, “a platinum-based catalyst that lasts for years is probably going to be more sustainable than an enzyme that degrades.” It might end up being cheaper in the long run, too. That’s why researchers working on these alternative catalysts are pushing to make their products more stable and longer-lasting.
“If you think about a catalyst, it’s really the atoms on the surface that participate in the reaction. Those in the bulk may just provide mechanical support or are just wasted,” says Younan Xia, a chemist at Georgia Tech. Xia is working on minimizing that waste.
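A back-of-the-envelope model of a cubic particle shows how quickly buried atoms pile up; the particle sizes here are arbitrary, for illustration only:

```python
# Fraction of atoms sitting on the surface of an n x n x n atomic cube.
# The (n - 2)^3 interior atoms never touch the reaction -- they are
# the "wasted" bulk Xia describes.
def surface_fraction(n):
    total = n ** 3
    interior = max(n - 2, 0) ** 3
    return (total - interior) / total

for edge in (5, 10, 100):
    print(f"{edge:>3} atoms per edge: {surface_fraction(edge):.1%} on the surface")
```

The bigger the solid particle, the smaller the share of metal doing any work — which is exactly the waste hollow nanostructures aim to eliminate.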

One promising approach is to shape platinum into what Xia dubs “nanocages” — instead of a solid cube of metal, just the edges remain, like a frame.

That drive for efficiency is also why many scientists haven’t given up on metal. “I don’t think you can say, ‘Let’s do without metals,’ ” says James Clark, a chemist at the University of York in England. “Certain metals have a certain functionality that’s going to be very hard to replace.” But, he adds, there are ways to use metals more efficiently, such as using nanoparticle-sized pieces that have a higher surface area than a flat sheet, or strategically combining small amounts of a rare metal with cheaper, more abundant nickel or iron. Changing the structure of the material on a nanoscale level also can make a difference.

In one experiment, Xia started with cubes of a different rare metal, palladium. He coated the palladium cubes with a thin layer of platinum just a few atoms thick — a pretty straightforward process. Then, a chemical etched away the palladium inside, leaving a hollow platinum skeleton. Because the palladium is removed from the final product, it can be used again and again. And the nanocage structure leaves less unused metal buried inside than a large flat sheet or a solid cube, Xia reported in 2015 in Science.

Since then, Xia’s team has been developing more complex shapes for the nanocages. An icosahedron, a ball with 20 triangular faces, worked especially well. The slight disorder in the structure — the atoms don’t crystallize quite perfectly — helped make it four times as active as a commercial platinum catalyst. He has made similar cages out of other rare metals like rhodium that could work as catalysts for other reactions.

It’ll take more work before any of these new catalysts fully dethrone platinum and other precious metals. But once they do, that’ll leave more precious metals to use in places where they can truly shine.

Bacteria genes offer new strategy for sterilizing mosquitoes

A pair of bacterial genes may enable genetic engineering strategies for curbing populations of virus-transmitting mosquitoes.

Bacteria that make the insects effectively sterile have been used to reduce mosquito populations. Now, two research teams have identified genes in those bacteria that may be responsible for the sterility, the groups report online February 27 in Nature and Nature Microbiology.

“I think it’s a great advance,” says Scott O’Neill, a biologist with the Institute of Vector-Borne Disease at Monash University in Melbourne, Australia. People have been trying for years to understand how the bacteria manipulate insects, he says.

Wolbachia bacteria “sterilize” male mosquitoes through a mechanism called cytoplasmic incompatibility, which affects sperm and eggs. When an infected male breeds with an uninfected female, his modified sperm kill the eggs after fertilization. When he mates with a likewise infected female, however, her eggs remove the sperm modification and develop normally.

Researchers from Vanderbilt University in Nashville pinpointed a pair of genes, called cifA and cifB, connected to the sterility mechanism of Wolbachia. The genes are located not in the DNA of the bacterium itself, but in a virus embedded in its chromosome.

When the researchers took the two genes from the Wolbachia strain found in fruit flies and inserted the pair into uninfected male Drosophila melanogaster, the flies could no longer reproduce with healthy females, says Seth Bordenstein, a coauthor of the study published in Nature. But modified uninfected male flies could successfully reproduce with Wolbachia-infected females, perfectly mimicking how the sterility mechanism functions naturally.

The ability of infected females to “rescue” the modified sperm reminded researchers at the Yale School of Medicine of a toxin and its antidote.

They theorized that the gene pair consisted of a toxin gene, cidB, and an antidote gene, cidA. The researchers inserted the toxin gene into yeast, activated it, and saw that the yeast was killed. But when both genes were present and active, the yeast survived, says Mark Hochstrasser, a coauthor of the study in Nature Microbiology.

Hochstrasser’s team also created transgenic flies, but used the strain of Wolbachia that infects common Culex pipiens mosquitoes.

Inserting the two genes into males could be used to control populations of Aedes aegypti mosquitoes, which can carry diseases such as Zika and dengue.

The sterility effect from Wolbachia doesn’t always kill 100 percent of the eggs, says Bordenstein. Adding additional pairs of the genes to the bacteria could make the sterilization more potent, creating a “super Wolbachia.”

You could also avoid infecting the mosquitoes altogether, says Bordenstein. By inserting the two genes into uninfected males and releasing them into populations of wild mosquitoes, you could “essentially crash the population,” he says.

Hochstrasser notes that the second method is safer in case Wolbachia have any long-term negative effects.

O’Neill, who directs a research program called Eliminate Dengue that releases Wolbachia-infected mosquitoes, cautions against mosquito population control through genetic engineering because of public concerns about the technology. “We think it’s better that we focus on a natural alternative,” he says.

Earth’s mantle may be hotter than thought

Temperatures across Earth’s mantle are about 60 degrees Celsius higher than previously thought, a new experiment suggests. Such toasty temperatures would make the mantle runnier than earlier research suggested, a development that could help explain the details of how tectonic plates glide on top of the mantle, geophysicists report in the March 3 Science.

“Scientists have been arguing over the mantle temperature for decades,” says study coauthor Emily Sarafian, a geophysicist at the Woods Hole Oceanographic Institution in Massachusetts and at MIT. “Scientists will argue over 10 degree changes, so changing it by 60 degrees is quite a large jump.”

The mostly solid mantle sits between Earth’s crust and core and makes up around 84 percent of Earth’s volume. Heat from the mantle fuels volcanic eruptions and drives plate tectonics, but taking the mantle’s temperature is trickier than dropping a thermometer down a hole.

Scientists know from the paths of earthquake waves and from measures of how electrical charge moves through Earth that a boundary in the mantle exists a few dozen kilometers below Earth’s surface. Above that boundary, mantle rock can begin melting on its way up to the surface. By mimicking the extreme conditions in the deep Earth — squeezing and heating bits of mantle that erupt from undersea volcanoes or similar rocks synthesized in the lab — scientists can also determine the melting temperature of mantle rock. Using these two facts, scientists have estimated that temperatures at the boundary depth below Earth’s oceans are around 1314° C to 1464° C when adjusted to surface pressure.

But the presence of water in the collected mantle bits, primarily peridotite rock, which makes up much of the upper mantle, has caused problems for researchers’ calculations. Water can drastically lower the melting point of peridotite, but researchers can’t prevent the water content from changing over time. In previous experiments, scientists tried to completely dry peridotite samples and then manually correct for measured mantle water levels in their calculations. The scientists, however, couldn’t tell for sure if the samples were water-free.

The measurement difficulties stem from the fact that peridotite is a mix of the minerals olivine and pyroxene, and the mineral grains are too small to experiment with individually. Sarafian and colleagues overcame this challenge by inserting spheres of pure olivine large enough to study into synthetic peridotite samples. These spheres exchanged water with the surrounding peridotite until they had the same dampness, and so could be used for water content measurements.

Using this technique, the researchers found that the “dry” peridotite used in previous experiments wasn’t dry at all. In fact, the water content was spot on for the actual wetness of the mantle. “By assuming the samples are dry, then correcting for mantle water content, you’re actually overcorrecting,” Sarafian says.

The new experiment suggests that, if adjusted to surface pressure, the mantle under the eastern Pacific Ocean, where two tectonic plates diverge, for example, would be around 1410° C, up from 1350° C. A hotter mantle is less viscous and more malleable, Sarafian says. Scientists have long been puzzled about some of the specifics of plate tectonics, such as to what extent the mantle resists the movement of the overlying plate. That resistance depends in part on the mix of rock, temperature and how melted the rock is at the boundary between the two layers (SN: 3/7/15, p. 6). This new knowledge could give researchers more accurate information on those details.

The revised temperature is only for the melting boundary in the mantle, so “it’s not the full story,” notes Caltech geologist Paul Asimow, who wrote a perspective on the research in the same issue of Science. He agrees that the team’s work provides a higher and more accurate estimate of that adjusted temperature, but he doesn’t think the researchers should assume temperatures elsewhere in the mantle would be boosted by a similar amount. “I’m not so sure about that,” he says. “We need further testing of mantle temperatures.”

Ancient dental plaque tells tales of Neandertal diet and disease

Dental plaque preserved in fossilized teeth confirms that Neandertals were flexible eaters and may have self-medicated with an ancient equivalent of aspirin.

DNA recovered from calcified plaque on teeth from four Neandertal individuals suggests that those from the grasslands around Belgium’s Spy cave ate woolly rhinoceros and wild sheep, while their counterparts from the forested El Sidrón cave in Spain consumed a menu of moss, mushrooms and pine nuts.

The evidence bolsters an argument that Neandertals’ diets spanned the spectrum from carnivory to herbivory, depending on the resources available to them, Laura Weyrich, a microbiologist at the University of Adelaide in Australia, and her colleagues report March 8 in Nature.

The best-preserved Neandertal remains were from a young male from El Sidrón whose teeth showed signs of an abscess. DNA from a diarrhea-inducing stomach bug and several gum disease pathogens turned up in his plaque. Genetic material from poplar trees, which contain the pain-killing aspirin ingredient salicylic acid, and from a plant mold that makes the antibiotic penicillin hints that he may have used natural medication to ease his ailments.

The researchers were even able to extract an almost-complete genetic blueprint, or genome, for one ancient microbe, Methanobrevibacter oralis. At roughly 48,000 years old, it’s the oldest microbial genome yet sequenced, the researchers report.