Asteroid barrage, ancient marine life boom not linked

An asteroid bombardment that some say triggered an explosion of marine animal diversity around 471 million years ago actually had nothing to do with it.

Precisely dating meteorites from the salvo, researchers found that the space rock barrage began at least 2 million years after the start of the Great Ordovician Biodiversification Event. So the two phenomena are unrelated, the researchers conclude January 24 in Nature Communications.

Some scientists had previously proposed a causal link between the two events: Raining debris from an asteroid breakup (SN: 7/23/16, p. 4) drove evolution by upsetting ecosystems and opening new ecological niches. The relative timing of the impacts and biodiversification was uncertain, though.

Geologist Anders Lindskog of Lund University in Sweden and colleagues examined 17 crystals buried alongside meteorite fragments. Gradual radioactive decay of uranium atoms inside the crystals allowed the researchers to accurately date the sediment layer to around 467.5 million years ago. Based in part on this age, the researchers estimate that the asteroid breakup took place around 468 million years ago. That’s well after fossil evidence suggests that the diversification event kicked off.
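
The date itself rests on the standard uranium-lead decay relation: the older the crystal, the more radiogenic lead it has accumulated relative to its parent uranium. Here is a minimal sketch of that arithmetic, assuming a simple 206Pb/238U system with purely radiogenic lead; the ratio used below is illustrative, not a value reported in the study.

```python
import math

LAMBDA_U238 = 1.55125e-10  # decay constant of uranium-238, per year

def u_pb_age_years(pb206_u238):
    """Age implied by a radiogenic-206Pb to 238U ratio: t = ln(1 + D/P) / lambda."""
    return math.log(1.0 + pb206_u238) / LAMBDA_U238

print(u_pb_age_years(0.0752) / 1e6)  # roughly 467.5 million years
```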

Other forces such as climate change and shifting continents instead promoted biodiversity, the researchers propose.

LSD’s grip on brain protein could explain drug’s long-lasting effects

Locked inside a human brain protein, the hallucinogenic drug LSD takes an extra-long trip.

New X-ray crystallography images reveal how an LSD molecule gets trapped within a protein that senses serotonin, a key chemical messenger in the brain. The protein, called a serotonin receptor, belongs to a family of proteins involved in everything from perception to mood.

The work is the first to decipher the structure of such a receptor bound to LSD, which gets snared in the protein for hours. That could explain why “acid trips” last so long, study coauthor Bryan Roth and colleagues report January 26 in Cell. It’s “the first snapshot of LSD in action,” he says. “Until now, we had no idea how it worked at the molecular level.”

But the results might not be that relevant to people, warns Cornell University biophysicist Harel Weinstein.

Roth’s group didn’t capture the main target of LSD, a serotonin receptor called 5-HT2A, instead imaging the related receptor 5-HT2B. That receptor is “important in rodents, but not that important in humans,” Weinstein says.

Roth’s team has devoted decades to working on 5-HT2A, but the receptor has “thus far been impossible to crystallize,” he says. The predicted structure of 5-HT2A, though, is very similar to that of 5-HT2B, he says.

LSD, or lysergic acid diethylamide, was first cooked up in a chemist’s lab in 1938. It was popular (and legal) for recreational use in the early 1960s, but the United States later banned the drug (also known as blotter, boomer, Purple Haze and electric Kool-Aid).

It’s known for altering perception and mood — and for its unusually long-lasting effects. An acid trip can run some 15 hours, and at high doses, effects can linger for days. “It’s an extraordinarily potent drug,” says Roth, a psychiatrist and pharmacologist at the University of North Carolina School of Medicine in Chapel Hill.

Scientists have known for decades that LSD targets serotonin receptors in the brain. These proteins, which are also found in the intestine and elsewhere in the body, lodge within the outer membranes of nerve cells and relay chemical signals to the cells’ interiors. But no one knew exactly how LSD fit into the receptor, or why the drug was so powerful.

Roth and colleagues’ work shows the drug hunkered deep inside a pocket of the receptor, grabbing onto an amino acid that acts like a handle to pull down a lid. It’s like a person holding the door of a storm cellar closed during a tornado, Roth says.

When the team did additional molecular experiments, tweaking the lid’s handle so that LSD could no longer hang on, the drug slipped out of the pocket faster than when the handle was intact. That was true whether the team used receptor 5-HT2B or 5-HT2A, Roth says. (Though the researchers couldn’t crystallize 5-HT2A, they were able to grow the protein inside cells in the lab for use in their other experiments.) The results suggest that LSD’s grip on the receptor is what keeps it trapped inside. “That explains to a great extent why LSD is so potent and why it’s so long-lasting,” Roth says.
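
In binding-kinetics terms, a drug’s residence time on a receptor is roughly the reciprocal of its dissociation rate constant, so a lid that slows escape translates directly into hours of occupancy. A schematic illustration follows; the rate constants are invented for scale and are not values from the paper.

```python
def residence_time_hours(k_off_per_s):
    """Mean residence time of a bound drug is about 1 / k_off."""
    return 1.0 / k_off_per_s / 3600.0

# Invented-for-scale numbers: a lid-stabilized k_off near 1e-5 per second
# keeps the drug bound for ~28 hours; loosen the lid tenfold and the stay
# drops to under 3 hours.
print(residence_time_hours(1e-5))  # ~27.8
print(residence_time_hours(1e-4))  # ~2.8
```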

David Nutt, a neuropsychopharmacologist at Imperial College London, agrees. He calls the work an “elegant use of molecular science.”

Weinstein remains skeptical. The 5-HT2A receptor is the interesting one, he maintains. A structure of that protein “has been needed for a very long time.” That’s what would really help explain the hallucinogenic effects of LSD, he says.

Mysteries of time still stump scientists

The topic of time is both excruciatingly complicated and slippery. The combination makes it easy to get bogged down. But instead of an exhaustive review, journalist Alan Burdick lets curiosity be his guide in Why Time Flies, an approach that leads to a light yet supremely satisfying story about time as it runs through — and is perceived by — the human body.

Burdick doesn’t restrict himself to any one aspect of his question. He spends time excavating what he calls the “existential caverns,” where philosophical questions, such as the shifting concept of now, dwell. He describes the circadian clocks that keep bodies running efficiently, making sure our bodies are primed to digest food at mealtimes, for instance. He even covers the intriguing and slightly insane self-experimentation by the French scientist Michel Siffre, who crawled into caves in 1962 and 1972 to see how his body responded in places without any time cues.

In the service of his exploration, Burdick lived in constant daylight in the Alaskan Arctic for two summery weeks, visited the master timekeepers at the International Bureau of Weights and Measures in Paris to see how they precisely mete out the seconds and plunged off a giant platform to see if time felt slower during moments of stress. The book deals not only with fascinating temporal science but also with how time is largely a social construct. “Time is what everybody agrees the time is,” one researcher told Burdick.

That subjective truth also applies to the brain. Time, in a sense, is created by the mind. “Our experience of time is not a cave shadow to some true and absolute thing; time is our perception,” Burdick writes. That subjective experience becomes obvious when Burdick recounts how easily our brains’ clocks can be swayed. Emotions, attention (SN: 12/10/16, p. 10) and even fever can distort our time perception, scientists have found.

Burdick delves deep into several neuroscientific theories of how time runs through the brain (SN: 7/25/15, p. 20). Here, the story narrows somewhat in an effort to thoroughly explain a few key ideas. But even amid these details, Burdick doesn’t lose the overarching truth — that for the most part, scientists simply don’t know the answers. That may be because there is no one answer; instead, the brain may create time by stitching together a multitude of neural clocks.

After reading Why Time Flies, readers will be convinced that no matter how much time passes, the mystery of time will endure.

Germanium computer chips gain ground on silicon — again

First germanium integrated circuits

Integrated circuits made of germanium instead of silicon have been reported … by researchers at International Business Machines Corp. Even though the experimental devices are about three times as large as the smallest silicon circuits, they reportedly offer faster overall switching speed. Germanium … has inherently greater mobility than silicon, which means that electrons move through it faster when a voltage is applied. — Science News, February 25, 1967

UPDATE:
Silicon circuits still dominate computing. But demand for smaller, high-speed electronics is pushing silicon to its physical limits, sending engineers back for a fresh look at germanium. Researchers built the first compact, high-performance germanium circuit in 2014, and scientists continue to fiddle with its physical properties to make smaller, faster circuits. Although not yet widely used, germanium circuits and those made from other materials, such as carbon nanotubes, could help engineers make more energy-efficient electronics.
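
Mobility is simply the proportionality between how fast charge carriers drift and the electric field pushing them, which is why it sets switching speed. A back-of-the-envelope comparison using approximate textbook room-temperature electron mobilities (figures assumed here, drawn from neither report):

```python
# Drift velocity v = mu * E. Approximate room-temperature electron
# mobilities: germanium ~3900 cm^2/(V*s), silicon ~1400 cm^2/(V*s).
MU_GE, MU_SI = 3900.0, 1400.0

def drift_velocity_cm_per_s(mu, field_v_per_cm):
    return mu * field_v_per_cm

E = 100.0  # a modest field of 100 volts per centimeter
print(drift_velocity_cm_per_s(MU_GE, E) / drift_velocity_cm_per_s(MU_SI, E))
# -> ~2.8: electrons in germanium drift nearly three times as fast
```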

Helium’s inertness defied by high-pressure compound

Helium — the recluse of the periodic table — is reluctant to react with other elements. But squeeze the element hard enough, and it will form a chemical compound with sodium, scientists report.

Helium, a noble gas, is one of the periodic table’s least reactive elements. Originally, the noble gases were believed incapable of forming any chemical compounds at all. But after scientists created xenon compounds in the early 1960s, a slew of other noble gas compounds followed. Helium, however, has largely been a holdout.

Although helium was known to hook up with certain elements, the bonds in those compounds were weak, or the compounds were short-lived or electrically charged. But the new compound, called sodium helide or Na2He, is stable at high pressure, and its bonds are strong, an international team of scientists reports February 6 in Nature Chemistry.

As a robust helium compound, “this is really the first that people ever observed,” says chemist Maosheng Miao of California State University, Northridge, who was not involved with the research.

The material’s properties are still poorly understood, but it is unlikely to have immediate practical applications — scientists can create it only in tiny amounts at very high pressures, says study coauthor Alexander Goncharov, a physicist at the Carnegie Institution for Science in Washington, D.C. Instead, the oddball compound serves as inspiration for scientists who hope to produce weird new materials at lower pressures. “I would say that it’s not totally impossible,” says Goncharov. Scientists may be able to tweak the compound, for example, by adding or switching out elements, to decrease the pressure needed.

To coerce helium to link up with another element, the scientists, led by Artem Oganov of Stony Brook University in New York, first performed computer calculations to see which compounds might be possible. Sodium, calculations predicted, would form a compound with helium if crushed under enormously high pressure. Under such conditions, the typical rules of chemistry change — elements that refuse to react at atmospheric pressure can sometimes become bosom buddies when given a squeeze.

So Goncharov and colleagues pinched small amounts of helium and sodium between a pair of diamonds, reaching pressures more than a million times that of Earth’s atmosphere, and heated the material with lasers to temperatures above 1,500 kelvins (about 1200° Celsius). By scattering X-rays off the compound, the scientists could deduce its structure, which matched the one predicted by calculations.

“I think this is really the triumph of computation,” says Miao. In the search for new compounds, computers now allow scientists to skip expensive trial-and-error experiments and zero in on the best candidates to create in a laboratory.

Na2He is an unusual type of compound known as an electride, in which pairs of electrons are cloistered off, away from any atoms. But despite the compound’s bizarre nature, it behaves somewhat like a commonplace compound such as table salt, in which negatively charged chloride ions alternate with positively charged sodium. In Na2He, the isolated electron pairs act like negative ions in such a compound, and the eight sodium atoms surrounding each helium atom are the positive ions.

“The idea that you can make compounds with things like helium which don’t react at all, I think it’s pretty interesting,” says physicist Eugene Gregoryanz of the University of Edinburgh. But, he adds, “I would like to see more experiments” to confirm the result.

The scientists’ calculations also predicted that a compound of helium, sodium and oxygen, called Na2HeO, should form at even lower pressures, though that one has yet to be created in the lab. So the oddball new helium compound may soon have a confirmed cousin.

Earth’s mantle may be hotter than thought

Temperatures across Earth’s mantle are about 60 degrees Celsius higher than previously thought, a new experiment suggests. Such toasty temperatures would make the mantle runnier than earlier research suggested, a development that could help explain the details of how tectonic plates glide on top of the mantle, geophysicists report in the March 3 Science.

“Scientists have been arguing over the mantle temperature for decades,” says study coauthor Emily Sarafian, a geophysicist at the Woods Hole Oceanographic Institution in Massachusetts and at MIT. “Scientists will argue over 10 degree changes, so changing it by 60 degrees is quite a large jump.”

The mostly solid mantle sits between Earth’s crust and core and makes up around 84 percent of Earth’s volume. Heat from the mantle fuels volcanic eruptions and drives plate tectonics, but taking the mantle’s temperature is trickier than dropping a thermometer down a hole.

Scientists know from the paths of earthquake waves and from measures of how electrical charge moves through Earth that a boundary in the mantle exists a few dozen kilometers below Earth’s surface. Above that boundary, mantle rock can begin melting on its way up to the surface. By mimicking the extreme conditions in the deep Earth — squeezing and heating bits of mantle that erupt from undersea volcanoes or similar rocks synthesized in the lab — scientists can also determine the melting temperature of mantle rock. Using these two facts, scientists have estimated that temperatures at the boundary depth below Earth’s oceans are around 1314° C to 1464° C when adjusted to surface pressure.
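
That “adjusted to surface pressure” figure is what geophysicists call a potential temperature: the temperature the rock would register if carried up to the surface without exchanging heat. A rough sketch of the correction, assuming a typical mantle adiabatic gradient; the actual adjustment in such studies is more careful.

```python
ADIABATIC_GRADIENT_K_PER_KM = 0.4  # an assumed, typical mantle adiabat

def potential_temperature_c(temp_at_depth_c, depth_km):
    """Strip away the adiabatic heating acquired between the surface and depth."""
    return temp_at_depth_c - ADIABATIC_GRADIENT_K_PER_KM * depth_km

# Illustrative only: rock at ~1430 degrees C some 50 km down corresponds
# to a potential temperature near 1410 degrees C.
print(potential_temperature_c(1430.0, 50.0))
```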

But the presence of water in the collected mantle bits, primarily peridotite rock, which makes up much of the upper mantle, has caused problems for researchers’ calculations. Water can drastically lower the melting point of peridotite, but researchers can’t prevent the water content from changing over time. In previous experiments, scientists tried to completely dry peridotite samples and then manually correct for measured mantle water levels in their calculations. The scientists, however, couldn’t tell for sure if the samples were water-free.

The measurement difficulties stem from the fact that peridotite is a mix of the minerals olivine and pyroxene, and the mineral grains are too small to experiment with individually. Sarafian and colleagues overcame this challenge by inserting spheres of pure olivine large enough to study into synthetic peridotite samples. These spheres exchanged water with the surrounding peridotite until they had the same dampness, and so could be used for water content measurements.

Using this technique, the researchers found that the “dry” peridotite used in previous experiments wasn’t dry at all. In fact, the water content was spot on for the actual wetness of the mantle. “By assuming the samples are dry, then correcting for mantle water content, you’re actually overcorrecting,” Sarafian says.

The new experiment suggests that, if adjusted to surface pressure, the mantle under the eastern Pacific Ocean where two tectonic plates diverge, for example, would be around 1410°, up from 1350°. A hotter mantle is less viscous and more malleable, Sarafian says. Scientists have long been puzzled about some of the specifics of plate tectonics, such as to what extent the mantle resists the movement of the overlying plate. That resistance depends in part on the mix of rock, temperature and how melted the rock is at the boundary between the two layers (SN: 3/7/15, p. 6). This new knowledge could give researchers more accurate information on those details.

The revised temperature is only for the melting boundary in the mantle, so “it’s not the full story,” notes Caltech geologist Paul Asimow, who wrote a perspective on the research in the same issue of Science. He agrees that the team’s work provides a higher and more accurate estimate of that adjusted temperature, but he doesn’t think the researchers should assume temperatures elsewhere in the mantle would be boosted by a similar amount. “I’m not so sure about that,” he says. “We need further testing of mantle temperatures.”

Ancient dental plaque tells tales of Neandertal diet and disease

Dental plaque preserved in fossilized teeth confirms that Neandertals were flexible eaters and may have self-medicated with an ancient equivalent of aspirin.

DNA recovered from calcified plaque on teeth from four Neandertal individuals suggests that those from the grasslands around Belgium’s Spy cave ate woolly rhinoceros and wild sheep, while their counterparts from the forested El Sidrón cave in Spain consumed a menu of moss, mushrooms and pine nuts.

The evidence bolsters an argument that Neandertals’ diets spanned the spectrum of carnivory and herbivory based on the resources available to them, Laura Weyrich, a microbiologist at the University of Adelaide in Australia, and her colleagues report March 8 in Nature.

The best-preserved Neandertal remains were from a young male from El Sidrón whose teeth showed signs of an abscess. DNA from a diarrhea-inducing stomach bug and several gum disease pathogens turned up in his plaque. Genetic material from poplar trees, which contain the pain-killing aspirin ingredient salicylic acid, and a plant mold that makes the antibiotic penicillin hint that he may have used natural medication to ease his ailments.

The researchers were even able to extract an almost-complete genetic blueprint, or genome, for one ancient microbe, Methanobrevibacter oralis. At roughly 48,000 years old, it’s the oldest microbial genome sequenced, the researchers report.

Shocking stories tell tale of London Zoo’s founding

When Tommy the chimpanzee first came to London’s zoo in the fall of 1835, he was dressed in an old white shirt.

Keepers gave him a new frock and a sailor hat and set him up in a cozy spot in the kitchen to weather the winter. Visitors flocked to get a look at the little ape roaming around the keepers’ lodge, curled up in the cook’s lap or tugging on her skirt like a toddler. Tommy was a hit — the zoo’s latest star.

Six months later, he was dead.

Tommy’s sorrowful story comes near the middle of Isobel Charman’s latest book, The Zoo, a tale of the founding of the Gardens of the Zoological Society of London, known today as the London Zoo. The book lays out a grand saga of human ambition and audacity, but it’s the animals’ stories — their lives and deaths and hardships — that catch hold of readers and don’t let go.

Charman, a writer and documentary producer, resurrects almost three decades of history, beginning in 1824, when the zoo was still just a fantastical idea: a public menagerie of animals “that would allow naturalists to observe the creatures scientifically.”

It was a long, hard path to that lofty dream, though: In the zoo’s early years, exotic creatures were nearly impossible to keep alive. Charman unloads a numbing litany of animal misery that batters the reader like a boxer working over a speed bag. Kangaroos hurl themselves at fences, monkeys attack each other in cramped, dark cages and an elephant named Jack breaks a tusk while smashing up his den. Charman’s parade of horrors boggles the mind, as does the sheer number of animals carted from all corners of the world to the cold, wet enclosures of the zoo.

Her story is an incredible piece of detective work, told through the eyes of many key players and famous figures, including Charles Darwin. Charman plumbs details from newspaper articles, diaries, census records and weather reports to craft a narrative of the time. She portrays a London that’s gritty, grimy and cold, where some aspects of science and medicine seem stuck in the Dark Ages. Doctors still used leeches to bleed patients, and no one had a clue how to care for zoo animals.

Zoo workers certainly tried — applying liniment to sores on a lion’s legs, prescribing opium for a sick puma and treating a constipated llama with purgatives. But nothing seemed to stop the endless conveyor belt that brought living animals in and carried dead ones out. Back then, caring for zoo animals was mostly a matter of trial and error, Charman writes. What seems laughably obvious now — animals need shelter in winter, cakes and buns aren’t proper food for elephants — took zookeepers years to figure out.

Over time the zoo adapted, making gradual changes that eventually improved the lives of its inhabitants. It seemed to morph, finally, from mostly “a playground of the privileged,” as Charman calls it, to a reliable place for scientific study, where curious people could learn about the “wild and wonderful” creatures within.

One of those people was Darwin, whose ideas about human origins clicked into place after he spent time with Jenny the orangutan. Her teasing relationship with her keeper, apparent understanding of language and utter likeness to people helped convince Darwin that humankind was just another branch on the tree of life, Charman writes.

Darwin’s work on the subject wouldn’t be published for decades, but in the meantime, the zoo’s early improvements seemed to have stuck. Over 30 years after Tommy the chimpanzee died in his keeper’s arms, a hippopotamus gave birth to “the first captive-bred hippo to be reared by its mother,” Charman notes. The baby hippo not only survived — she lived for 36 years.

Readers may wonder how standards for animal treatment have changed over time. But Charman sticks to history, rather than examining contrasts to modern zoos. Still, what she offers is gripping enough on its own: a bold, no-holds-barred look at one zoo’s beginning. It was impressive, no doubt. But it wasn’t pretty.

Random mutations play large role in cancer, study finds

Researchers have identified new enemies in the war on cancer: ones that are already inside cells and that no one can avoid.

Random mistakes made as stem cells divide are responsible for about two-thirds of the mutations in cancer cells, researchers from Johns Hopkins University report in the March 24 Science. Across all cancer types, environment and lifestyle factors, such as smoking and obesity, contribute 29 percent of cancer mutations, and 5 percent are inherited.

That finding challenges the common wisdom that cancer is the product of heredity and the environment. “There’s a third cause and this cause of mutations is a major cause,” says cancer geneticist Bert Vogelstein.

Such random mutations build up over time and help explain why cancer strikes older people more often. Knowing that the enemy will strike from within even when people protect themselves against external threats indicates that early cancer detection and treatment deserve greater attention than they have previously gotten, Vogelstein says.

Vogelstein and biomathematician Cristian Tomasetti proposed in 2015 that random mutations are the reason some organs are more prone to cancer than others. For instance, stem cells are constantly renewing the intestinal lining of the colon, which develops tumors more often than the brain, where cell division is uncommon. That report was controversial because it was interpreted as saying that most cancers are the result of “bad luck.” The analysis didn’t include breast and prostate cancers. Factoring in those common cancers might change the results, some scientists said. And because the researchers looked at only cancer within the United States, critics charged that the finding might not hold up when considering places around the world where different environmental factors, such as infections, affect cancer development.

In the new study, Vogelstein, Tomasetti and Hopkins colleague Lu Li examined data from 69 countries about 17 types of cancer, this time including breast and prostate. Again, the researchers found a strong link between cancer and tissues with lots of dividing stem cells. The team also used DNA data and epidemiological studies to calculate the proportions of mutations in cancer cells caused by heredity or environmental and lifestyle factors. Remaining mutations were attributed to random errors — including typos, insertions or deletions of genes, epigenetic changes (alterations of chemical tags on DNA or proteins that affect gene activity) and gene rearrangements. Such errors unavoidably happen when cells divide.
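
The bookkeeping behind that attribution is, at heart, a subtraction: whatever share of mutations cannot be assigned to heredity or to environmental and lifestyle exposures is credited to copying errors. A schematic sketch using the paper’s headline, all-cancer figures (the actual analysis works tissue by tissue and mutation by mutation):

```python
def random_error_share(environment, heredity):
    """Share of cancer mutations left over for DNA-copying mistakes."""
    return 1.0 - environment - heredity

# Headline figures across cancer types: 29% environmental, 5% inherited.
print(random_error_share(0.29, 0.05))  # -> 0.66, about two-thirds
```
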
Usually cancer results after a cell accumulates many mutations. Some people will have accumulated a variety of cancer-associated mutations but won’t get cancer until some final insult goads the cell into becoming malignant (SN: 12/26/15, p. 28). For some tumors, all the mutations may be the hit-and-miss result of cell division mistakes. There’s no way to evade those cancers, Vogelstein says. Other malignancies may spring up as a result of different combinations of heritable, environmental and random mutations. Lung cancer and other tumor types that are strongly associated with environmentally caused mutations could be eluded by avoiding the carcinogen, even when most of the mutations that spur cancer growth arise from random mistakes, Tomasetti says.

“They are venturing into new territory,” says Giovanni Parmigiani, a biostatistician at the Harvard T.H. Chan School of Public Health. Tomasetti, Li and Vogelstein are the first to rigorously estimate the contribution of environment, heredity and DNA-copying errors to cancer, he says. “Perhaps the estimates will improve in the future, but theirs seems like a very solid starting point.”

Now that the Hopkins researchers have pointed it out, the relationship between dividing cells and cancer seems obvious, says biological physicist Bartlomiej Waclaw of the University of Edinburgh. “I don’t think that the existence of this correlation is surprising,” he says. “What’s surprising is that it’s not stronger.”

Some tissues develop cancers more or less often than other tissues with a similar number of cell divisions, Waclaw and Martin Nowak of Harvard University pointed out in a commentary on the Hopkins study, published in the same issue of Science. That suggests some organs are better at nipping cancer in the bud. Discovering how those tissues avoid cancer could lead to new ways to prevent tumors elsewhere in the body, Waclaw suggests.

Other researchers say the Hopkins team is guilty of faulty reasoning. “They are assuming that just because tissues which have high stem cell turnover also have high cancer rates, that one is causing the other,” says cancer researcher Anne McTiernan of the Fred Hutchinson Cancer Research Center in Seattle. “In this new paper, they’ve added data from other countries but haven’t gotten away from this biased thinking.”

Tomasetti and colleagues based their calculations on data from Cancer Research UK that suggest that 42 percent of cancers are preventable. Preventable cancers are ones for which people could avoid a risk factor, such as unprotected sun exposure or tanning bed use, or take positive steps to lower cancer risks, such as exercising regularly and eating fruits and vegetables. But those estimates may not be accurate, McTiernan says. “In reality, it’s very difficult to measure environmental exposures, so our estimates of preventability are likely very underestimated.”

To attribute so many cancer mutations to chance seems to negate public health messages, Waclaw says, and people who spend a lot of time trying to prevent cancer may find it disturbing that 66 percent of cancer-associated mutations are unavoidable. “It’s important to consider the randomness, or bad luck, that comes with cellular division,” he says.

In fact, Tomasetti and Vogelstein stress that their findings are compatible with cancer-prevention recommendations. Avoiding smoking, tanning beds, obesity and other known carcinogens can prevent the “environmental” mutations that combine with inherited and random mutations to tip cells into cancer. Without those final straws loaded from environmental exposures, tumors may be averted or greatly delayed.

People with cancer may be able to take some comfort from the study, says Elaine Mardis, a cancer genomicist at the Nationwide Children’s Hospital in Columbus, Ohio. “Perhaps the positive message here is that, other than known risk factors, such as smoking, radiation exposure and obesity, there is a component of cancer that is simply a consequence of being human.”

Extreme gas loss dried out Mars, MAVEN data suggest

The Martian atmosphere definitely had more gas in the past.

Data from NASA’s MAVEN spacecraft indicate that the Red Planet has lost most of the gas that ever existed in its atmosphere. The results, published in the March 31 Science, are the first to quantify how much gas has been lost with time and offer clues to how Mars went from a warm, wet place to a cold, dry one.

Mars is constantly bombarded by charged particles streaming from the sun. Without a protective magnetic field to deflect this solar wind, the planet loses about 100 grams of its now thin atmosphere every second (SN: 12/12/15, p. 31). To determine how much atmosphere has been lost during the planet’s lifetime, MAVEN principal investigator Bruce Jakosky of the University of Colorado Boulder and colleagues measured and compared the abundances of two isotopes of argon at different altitudes in the Martian atmosphere. Using those measurements and an assumption about the amounts of the isotopes in the planet’s early atmosphere, the team estimates that about two-thirds of all of Mars’ argon gas has been ejected into space. Extrapolating from the argon data, the researchers also determined that the majority of carbon dioxide that the Martian atmosphere ever had also was kicked into space by the solar wind.
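
The measurement works because the lighter argon isotope escapes to space slightly more readily than the heavier one, so the gas left behind grows steadily heavier; comparing today’s isotope ratio with the assumed starting ratio then pins down how much gas is gone. Below is a simple Rayleigh-fractionation sketch of that reasoning; all numbers are illustrative, not MAVEN’s.

```python
def fraction_lost(ratio_today_over_initial, alpha):
    """
    Rayleigh fractionation: R/R0 = f**(alpha - 1), where f is the fraction
    of gas remaining and alpha > 1 means the lighter isotope escapes faster.
    """
    f_remaining = ratio_today_over_initial ** (1.0 / (alpha - 1.0))
    return 1.0 - f_remaining

# Illustrative: a 36Ar/38Ar ratio now at 90% of its assumed starting value,
# with the light isotope escaping ~10% faster (alpha = 1.1), implies that
# about 65% of the argon has been lost.
print(fraction_lost(0.9, 1.1))
```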

A thicker atmosphere filled with carbon dioxide and other greenhouse gases could have insulated early Mars and kept it warm enough for liquid water and possibly life. Losing an extreme amount of gas, as the results suggest, may explain how the planet morphed from lush and wet to barren and icy, the researchers write.