An asteroid bombardment that some say triggered an explosion of marine animal diversity around 471 million years ago actually had nothing to do with it.
Precisely dating meteorites from the salvo, researchers found that the space rock barrage began at least 2 million years after the start of the Great Ordovician Biodiversification Event. So the two phenomena are unrelated, the researchers conclude January 24 in Nature Communications.
Some scientists had previously proposed a causal link between the two events: Raining debris from an asteroid breakup (SN: 7/23/16, p. 4) drove evolution by upsetting ecosystems and opening new ecological niches. The relative timing of the impacts and biodiversification was uncertain, though. Geologist Anders Lindskog of Lund University in Sweden and colleagues examined 17 crystals buried alongside meteorite fragments. Gradual radioactive decay of uranium atoms inside the crystals allowed the researchers to accurately date the sediment layer to around 467.5 million years ago. Based in part on this age, the researchers estimate that the asteroid breakup took place around 468 million years ago. That’s well after fossil evidence suggests that the diversification event kicked off.
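The age arithmetic behind uranium dating is simple exponential decay. As a minimal sketch (the decay constant below is the standard value for uranium-238; the daughter-to-parent ratio is illustrative, not the team's measurement):

```python
import math

LAMBDA_U238 = 1.55125e-10  # decay constant of uranium-238, per year

def u_pb_age(pb206_u238_ratio):
    """Age in years from the ratio of radiogenic lead-206
    to the uranium-238 still remaining in the crystal."""
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_U238

# A ratio near 0.075 corresponds to roughly 467 million years.
```

Because the decay constant is known to high precision, tiny uncertainties in the measured ratio translate into age uncertainties small enough to separate events only a couple of million years apart.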
Other forces such as climate change and shifting continents instead promoted biodiversity, the researchers propose.
Locked inside a human brain protein, the hallucinogenic drug LSD takes an extra-long trip.
New X-ray crystallography images reveal how an LSD molecule gets trapped within a protein that senses serotonin, a key chemical messenger in the brain. The protein, called a serotonin receptor, belongs to a family of proteins involved in everything from perception to mood.
The work is the first to decipher the structure of such a receptor bound to LSD, which gets snared in the protein for hours. That could explain why “acid trips” last so long, study coauthor Bryan Roth and colleagues report January 26 in Cell. It’s “the first snapshot of LSD in action,” he says. “Until now, we had no idea how it worked at the molecular level.” But the results might not be that relevant to people, warns Cornell University biophysicist Harel Weinstein.
Roth’s group didn’t capture the main target of LSD, a serotonin receptor called 5-HT2A, instead imaging the related receptor 5-HT2B. That receptor is “important in rodents, but not that important in humans,” Weinstein says.
Roth’s team has devoted decades to working on 5-HT2A, but the receptor has “thus far been impossible to crystallize,” he says. Predictions of 5-HT2A’s structure, though, are very similar to that of 5-HT2B, he says.
LSD, or lysergic acid diethylamide, was first cooked up in a chemist’s lab in 1938. It was popular (and legal) for recreational use in the early 1960s, but the United States later banned the drug (also known as blotter, boomer, Purple Haze and electric Kool-Aid).
It’s known for altering perception and mood — and for its unusually long-lasting effects. An acid trip can run some 15 hours, and at high doses, effects can linger for days. “It’s an extraordinarily potent drug,” says Roth, a psychiatrist and pharmacologist at the University of North Carolina School of Medicine in Chapel Hill. Scientists have known for decades that LSD targets serotonin receptors in the brain. These proteins, which are also found in the intestine and elsewhere in the body, lodge within the outer membranes of nerve cells and relay chemical signals to the cells’ interiors. But no one knew exactly how LSD fit into the receptor, or why the drug was so powerful.
Roth and colleagues’ work shows the drug hunkered deep inside a pocket of the receptor, grabbing onto an amino acid that acts like a handle to pull down a lid. It’s like a person holding the door of a storm cellar closed during a tornado, Roth says.
When the team did additional molecular experiments, tweaking the lid’s handle so that LSD could no longer hang on, the drug slipped out of the pocket faster than when the handle was intact. That was true whether the team used receptor 5-HT2B or 5-HT2A, Roth says. (Though the researchers couldn’t crystallize 5-HT2A, they were able to grow the protein inside cells in the lab for use in their other experiments.) The results suggest that LSD’s grip on the receptor is what keeps it trapped inside. “That explains to a great extent why LSD is so potent and why it’s so long-lasting,” Roth says.
David Nutt, a neuropsychopharmacologist at Imperial College London, agrees. He calls the work an “elegant use of molecular science.”
Weinstein remains skeptical. The 5-HT2A receptor is the interesting one, he maintains. A structure of that protein “has been needed for a very long time.” That’s what would really help explain the hallucinogenic effects of LSD, he says.
The topic of time is both excruciatingly complicated and slippery. The combination makes it easy to get bogged down. But instead of an exhaustive review, journalist Alan Burdick lets curiosity be his guide in Why Time Flies, an approach that leads to a light yet supremely satisfying story about time as it runs through — and is perceived by — the human body.
Burdick doesn’t restrict himself to any one aspect of his question. He spends time excavating what he calls the “existential caverns,” where philosophical questions, such as the shifting concept of now, dwell. He describes the circadian clocks that keep bodies running efficiently, making sure our bodies are primed to digest food at mealtimes, for instance. He even covers the intriguing and slightly insane self-experimentation by the French scientist Michel Siffre, who crawled into caves in 1962 and 1972 to see how his body responded in places without any time cues. In the service of his exploration, Burdick lived in constant daylight in the Alaskan Arctic for two summery weeks, visited the master timekeepers at the International Bureau of Weights and Measures in Paris to see how they precisely mete out the seconds and plunged off a giant platform to see if time felt slower during moments of stress.

The book deals not only with fascinating temporal science but also with how time is largely a social construct. “Time is what everybody agrees the time is,” one researcher told Burdick. That subjective truth also applies to the brain. Time, in a sense, is created by the mind. “Our experience of time is not a cave shadow to some true and absolute thing; time is our perception,” Burdick writes. That subjective experience becomes obvious when Burdick recounts how easily our brains’ clocks can be swayed. Emotions, attention (SN: 12/10/16, p. 10) and even fever can distort our time perception, scientists have found.
Burdick delves deep into several neuroscientific theories of how time runs through the brain (SN: 7/25/15, p. 20). Here, the story narrows somewhat in an effort to thoroughly explain a few key ideas. But even amid these details, Burdick doesn’t lose the overarching truth — that for the most part, scientists simply don’t know the answers. That may be because there is no one answer; instead, the brain may create time by stitching together a multitude of neural clocks. After reading Why Time Flies, readers will be convinced that no matter how much time passes, the mystery of time will endure.
Integrated circuits made of germanium instead of silicon have been reported … by researchers at International Business Machines Corp. Even though the experimental devices are about three times as large as the smallest silicon circuits, they reportedly offer faster overall switching speed. Germanium … has inherently greater mobility than silicon, which means that electrons move through it faster when a current is applied. — Science News, February 25, 1967
UPDATE: Silicon circuits still dominate computing. But demand for smaller, high-speed electronics is pushing silicon to its physical limits, sending engineers back for a fresh look at germanium. Researchers built the first compact, high-performance germanium circuit in 2014, and scientists continue to fiddle with its physical properties to make smaller, faster circuits. Although not yet widely used, germanium circuits and those made from other materials, such as carbon nanotubes, could help engineers make more energy-efficient electronics.
Helium — the recluse of the periodic table — is reluctant to react with other elements. But squeeze the element hard enough, and it will form a chemical compound with sodium, scientists report.
Helium, a noble gas, is one of the periodic table’s least reactive elements. Originally, the noble gases were believed incapable of forming any chemical compounds at all. But after scientists created xenon compounds in the early 1960s, a slew of other noble gas compounds followed. Helium, however, has largely been a holdout. Although helium was known to hook up with certain elements, the bonds in those compounds were weak, or the compounds were short-lived or electrically charged. But the new compound, called sodium helide or Na2He, is stable at high pressure, and its bonds are strong, an international team of scientists reports February 6 in Nature Chemistry.
As a robust helium compound, “this is really the first that people ever observed,” says chemist Maosheng Miao of California State University, Northridge, who was not involved with the research.
The material’s properties are still poorly understood, but it is unlikely to have immediate practical applications — scientists can create it only in tiny amounts at very high pressures, says study coauthor Alexander Goncharov, a physicist at the Carnegie Institution for Science in Washington, D.C. Instead, the oddball compound serves as inspiration for scientists who hope to produce weird new materials at lower pressures. “I would say that it’s not totally impossible,” says Goncharov. Scientists may be able to tweak the compound, for example, by adding or switching out elements, to decrease the pressure needed.
To coerce helium to link up with another element, the scientists, led by Artem Oganov of Stony Brook University in New York, first performed computer calculations to see which compounds might be possible. Sodium, calculations predicted, would form a compound with helium if crushed under enormously high pressure. Under such conditions, the typical rules of chemistry change — elements that refuse to react at atmospheric pressure can sometimes become bosom buddies when given a squeeze.
So Goncharov and colleagues pinched small amounts of helium and sodium between a pair of diamonds, reaching pressures more than a million times that of Earth’s atmosphere, and heated the material with lasers to temperatures above 1,500 kelvins (about 1230° Celsius). By scattering X-rays off the compound, the scientists could deduce its structure, which matched the one predicted by calculations. “I think this is really the triumph of computation,” says Miao. In the search for new compounds, computers now allow scientists to skip expensive trial-and-error experiments and zero in on the best candidates to create in a laboratory.
Na2He is an unusual type of compound known as an electride, in which pairs of electrons are cloistered off, away from any atoms. But despite the compound’s bizarre nature, it behaves somewhat like a commonplace compound such as table salt, in which negatively charged chloride ions alternate with positively charged sodium. In Na2He, the isolated electron pairs act like negative ions in such a compound, and the eight sodium atoms surrounding each helium atom are the positive ions.
“The idea that you can make compounds with things like helium which don’t react at all, I think it’s pretty interesting,” says physicist Eugene Gregoryanz of the University of Edinburgh. But, he adds, “I would like to see more experiments” to confirm the result.
The scientists’ calculations also predicted that a compound of helium, sodium and oxygen, called Na2HeO, should form at even lower pressures, though that one has yet to be created in the lab. So the oddball new helium compound may soon have a confirmed cousin.
Temperatures across Earth’s mantle are about 60 degrees Celsius higher than previously thought, a new experiment suggests. Such toasty temperatures would make the mantle runnier than earlier research suggested, a development that could help explain the details of how tectonic plates glide on top of the mantle, geophysicists report in the March 3 Science.
“Scientists have been arguing over the mantle temperature for decades,” says study coauthor Emily Sarafian, a geophysicist at the Woods Hole Oceanographic Institution in Massachusetts and at MIT. “Scientists will argue over 10 degree changes, so changing it by 60 degrees is quite a large jump.” The mostly solid mantle sits between Earth’s crust and core and makes up around 84 percent of Earth’s volume. Heat from the mantle fuels volcanic eruptions and drives plate tectonics, but taking the mantle’s temperature is trickier than dropping a thermometer down a hole.
Scientists know from the paths of earthquake waves and from measures of how electrical charge moves through Earth that a boundary in the mantle exists a few dozen kilometers below Earth’s surface. Above that boundary, mantle rock can begin melting on its way up to the surface. By mimicking the extreme conditions in the deep Earth — squeezing and heating bits of mantle that erupt from undersea volcanoes or similar rocks synthesized in the lab — scientists can also determine the melting temperature of mantle rock. Using these two facts, scientists have estimated that temperatures at the boundary depth below Earth’s oceans are around 1314° C to 1464° C when adjusted to surface pressure.
But the presence of water in the collected mantle bits, primarily peridotite rock, which makes up much of the upper mantle, has caused problems for researchers’ calculations. Water can drastically lower the melting point of peridotite, but researchers can’t prevent the water content from changing over time. In previous experiments, scientists tried to completely dry peridotite samples and then manually correct for measured mantle water levels in their calculations. The scientists, however, couldn’t tell for sure if the samples were water-free.
The measurement difficulties stem from the fact that peridotite is a mix of the minerals olivine and pyroxene, and the mineral grains are too small to experiment with individually. Sarafian and colleagues overcame this challenge by inserting spheres of pure olivine large enough to study into synthetic peridotite samples. These spheres exchanged water with the surrounding peridotite until they had the same dampness, and so could be used for water content measurements.
Using this technique, the researchers found that the “dry” peridotite used in previous experiments wasn’t dry at all. In fact, the water content was spot on for the actual wetness of the mantle. “By assuming the samples are dry, then correcting for mantle water content, you’re actually overcorrecting,” Sarafian says. The new experiment suggests that, if adjusted to surface pressure, the mantle under the eastern Pacific Ocean where two tectonic plates diverge, for example, would be around 1410° C, up from the previous estimate of about 1350° C. A hotter mantle is less viscous and more malleable, Sarafian says. Scientists have long been puzzled about some of the specifics of plate tectonics, such as to what extent the mantle resists the movement of the overlying plate. That resistance depends in part on the mix of rock, temperature and how melted the rock is at the boundary between the two layers (SN: 3/7/15, p. 6). This new knowledge could give researchers more accurate information on those details.
The revised temperature is only for the melting boundary in the mantle, so “it’s not the full story,” notes Caltech geologist Paul Asimow, who wrote a perspective on the research in the same issue of Science. He agrees that the team’s work provides a higher and more accurate estimate of that adjusted temperature, but he doesn’t think the researchers should assume temperatures elsewhere in the mantle would be boosted by a similar amount. “I’m not so sure about that,” he says. “We need further testing of mantle temperatures.”
Dental plaque preserved in fossilized teeth confirms that Neandertals were flexible eaters and may have self-medicated with an ancient equivalent of aspirin.
DNA recovered from calcified plaque on teeth from four Neandertal individuals suggests that those from the grasslands around Belgium’s Spy cave ate woolly rhinoceros and wild sheep, while their counterparts from the forested El Sidrón cave in Spain consumed a menu of moss, mushrooms and pine nuts.
The evidence bolsters an argument that Neandertals’ diets spanned the spectrum of carnivory and herbivory based on the resources available to them, Laura Weyrich, a microbiologist at the University of Adelaide in Australia, and her colleagues report March 8 in Nature.
The best-preserved Neandertal remains were from a young male from El Sidrón whose teeth showed signs of an abscess. DNA from a diarrhea-inducing stomach bug and several gum disease pathogens turned up in his plaque. Genetic material from poplar trees, which contain the pain-killing aspirin ingredient salicylic acid, and from a plant mold that makes the antibiotic penicillin hints that he may have used natural medication to ease his ailments.
The researchers were even able to extract an almost-complete genetic blueprint, or genome, for one ancient microbe, Methanobrevibacter oralis. At roughly 48,000 years old, it’s the oldest microbial genome sequenced, the researchers report.
The Martian atmosphere definitely had more gas in the past.
Data from NASA’s MAVEN spacecraft indicate that the Red Planet has lost most of the gas that ever existed in its atmosphere. The results, published in the March 31 Science, are the first to quantify how much gas has been lost with time and offer clues to how Mars went from a warm, wet place to a cold, dry one.
Mars is constantly bombarded by charged particles streaming from the sun. Without a protective magnetic field to deflect this solar wind, the planet loses about 100 grams of its now thin atmosphere every second (SN: 12/12/15, p. 31). To determine how much atmosphere has been lost during the planet’s lifetime, MAVEN principal investigator Bruce Jakosky of the University of Colorado Boulder and colleagues measured and compared the abundances of two isotopes of argon at different altitudes in the Martian atmosphere. Using those measurements and an assumption about the amounts of the isotopes in the planet’s early atmosphere, the team estimates that about two-thirds of all of Mars’ argon gas has been ejected into space. Extrapolating from the argon data, the researchers also determined that most of the carbon dioxide the Martian atmosphere ever had was likewise kicked into space by the solar wind.
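The logic of turning an isotope ratio into a loss estimate can be illustrated with a toy Rayleigh-distillation model. This is only a sketch of the general technique; the fractionation factor and ratios below are placeholders, not MAVEN's actual values or model:

```python
def fraction_lost(ratio_today, ratio_initial, alpha):
    """Toy Rayleigh distillation. As gas escapes to space, the remaining
    fraction f of the atmosphere satisfies R/R0 = f**(alpha - 1), where R
    is the heavy-to-light isotope ratio (e.g. 38Ar/36Ar) and alpha < 1
    because the lighter isotope escapes more easily. Returns 1 - f."""
    f_remaining = (ratio_today / ratio_initial) ** (1.0 / (alpha - 1.0))
    return 1.0 - f_remaining
```

The heavier the measured ratio has grown relative to its assumed starting value, the larger the inferred loss, which is why the team needed both the altitude measurements and an assumption about the early atmosphere's composition.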
A thicker atmosphere filled with carbon dioxide and other greenhouse gases could have insulated early Mars and kept it warm enough for liquid water and possibly life. Losing an extreme amount of gas, as the results suggest, may explain how the planet morphed from lush and wet to barren and icy, the researchers write.
Computers don’t have eyes, but they could revolutionize the way scientists visualize cells.
Researchers at the Allen Institute for Cell Science in Seattle have devised 3-D representations of cells, compiled by computers learning where thousands of real cells tuck their component parts.
Most drawings of cells in textbooks come from human interpretations gleaned by looking at just a few dead cells at a time. The new Allen Cell Explorer, which premiered online April 5, presents 3-D images of genetically identical stem cells grown in lab dishes (composite, above), revealing a huge variety of structural differences. Each cell comes from a skin cell that was reprogrammed into a stem cell. Important proteins were tagged with fluorescent molecules so researchers could keep tabs on the cell membrane, DNA-containing nucleus, energy-generating mitochondria, microtubules and other cell parts. Using the 3-D images, computer programs learned where the cellular parts are in relation to each other. From those rules, the programs can generate predictive transparent models of a cell’s structure (below). The new views, which can capture cells at different time points, may offer clues into their inner workings. The project’s tools are available for other researchers to use on various types of cells. Insights gained from the explorations might lead to a better understanding of human development, cancer, health and diseases.
Researchers have already learned from the project that stem cells aren’t the shapeless blobs they might appear to be, says Susanne Rafelski, a quantitative cell biologist at the Allen Institute. Instead, the stem cells have a definite bottom and top, a proposed structure that’s now confirmed by the combined cell data, Rafelski says. A solid foundation of skeleton proteins forms at the bottom. The nucleus is usually found in the cell’s center. Microtubules bundle together into large fibers that tend to radiate from the top of the cell toward the bottom. During cell division, microtubules form structures called bipolar spindles that are necessary to divvy up DNA.

One surprise was that the membrane surrounding the nucleus gets ruffled, but never completely disappears, during cell division. Near the top of the cell, above the nucleus, stem cells store tubelike mitochondria much the way plumbing and electrical wires are tucked into ceilings. The tubular mitochondria were notable because some researchers thought that since stem cells don’t require much energy, the organelles might separate into small, individual units.
Old ways of observing cells were like trying to get to know a city by looking at a map, Rafelski says. The cell explorer is more like a documentary of the lives of the citizens.
Every year science offers a diverse menu of anniversaries to celebrate. Births (or deaths) of famous scientists, landmark discoveries or scientific papers — significant events of all sorts qualify for celebratory consideration, as long as the number of years gone by is some worthy number, like 25, 50, 75 or 100. Or simple multiples thereof with polysyllabic names.
2017 has more than enough such anniversaries for a Top 10 list, so some worthwhile events don’t even make the cut, such as the births of Stephen Hawking (1942) and Arthur C. Clarke (1917). The sesquicentennial of Michael Faraday’s death (1867) almost made the list, but was bumped at the last minute by a book. Namely:
On Growth and Form, centennial (1917) A true magnum opus, by the Scottish biologist D’Arcy Wentworth Thompson, On Growth and Form has inspired many biologists with its mathematical analysis of physical and structural forces underlying the diversity of shapes and forms in the biological world. Nobel laureate biologist Sir Peter Medawar praised Thompson’s book as “beyond comparison the finest work of literature in all the annals of science that have been recorded in the English tongue.”
Birth of Abraham de Moivre, semiseptcentennial (1667) Born in France on May 26, 1667, de Moivre moved as a young man to London where he did his best work, earning election to the Royal Society. Despite exceptional mathematical skill, though, he attained no academic position and earned a meager living as a tutor. He is most famous for his book The Doctrine of Chances, which was in essence an 18th century version of Gambling for Dummies. It contained major advances in probability theory and in later editions introduced the concept of the famous bell curve. Isaac Newton was impressed; the legend goes that when anyone asked him about probability, Newton said to go talk to de Moivre.
Exoplanets, quadranscentennial (1992) It seems like exoplanets have been around almost forever (and probably actually were), but the first confirmed by Earthbound astronomers were reported just a quarter century ago. Three planets showed up orbiting not an ordinary star, but a pulsar, a rapidly spinning neutron star left behind by a supernova. Astrophysicists Aleksander Wolszczan and Dale Frail found a sign of the planets, first detected with the Arecibo radio telescope, in irregularities in the radio pulses from the millisecond pulsar PSR1257+12. Some luck was involved. In 1990, the Arecibo telescope was being repaired and couldn’t pivot to point at a specific target; instead it constantly watched just one region of the sky. PSR1257+12 just happened to float by.
Birth of Marie Curie, sesquicentennial (1867) No doubt the most famous Polish-born scientist since Copernicus, Curie was born in Warsaw on November 7, 1867, as Maria Sklodowska. Challenged by poverty, family tragedies and poor health, she nevertheless excelled as a high school student. But she then worked as a governess, while continuing as much science education as possible, until her married sister invited her to Paris. There she completed her physics education with honors and met and married another young physicist, Pierre Curie.
Together they tackled the mystery of the newly discovered radioactivity, winning the physics Nobel in 1903 along with radioactivity’s discoverer, Henri Becquerel. Marie continued the work after her husband’s tragic death in 1906; she became the first person to win a second Nobel, awarded in chemistry in 1911 for her discovery of the new radioactive elements polonium and radium.
Laws of Robotics, semisesquicentennial (1942) One of science fiction’s greatest contributions to modern technological philosophy was Isaac Asimov’s Laws of Robotics, which first appeared in a short story in the March 1942 issue of Astounding Science Fiction. Later, those laws formed the motif of his many robot novels and appeared in his famous Foundation Trilogy (and subsequent sequels and prequels). They were:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Much later Asimov added a “zeroth law,” requiring robots to protect all of humankind even if that meant violating the other three laws. Artificial intelligence researchers all know about Asimov’s laws, but somehow have not managed to enforce them on social media. Incidentally, this year is also the quadranscentennial of Asimov’s death in 1992.
First sustained nuclear fission chain reaction, semisesquicentennial (1942) Enrico Fermi, the Italian Nobel laureate, escaped fascist Italy to come to the United States shortly after nuclear fission’s discovery in Germany. Fermi directed construction of the “atomic pile,” or nuclear reactor, on a squash court under the stands of the University of Chicago’s football stadium. Fermi and his collaborators showed that neutrons emitted from fissioning uranium nuclei could induce more fission, creating a chain reaction capable of releasing enormous amounts of energy. Which it later did.
Discovery of pulsars, semicentennial (1967) Science’s awareness of the existence of pulsars turns 50 this year, thanks to the diligence of Irish astrophysicist Jocelyn Bell Burnell. She spent many late-night hours examining the data recordings from the radio telescope she helped to build that first spotted a signal from a pulsar. She recognized that the signal was something special even though others thought it was just a glitch in the apparatus. But she was a graduate student so her supervisor got the Nobel Prize instead of her.
Einstein’s theory of lasers, centennial (1917) Albert Einstein did not actually invent the laser, but he developed the mathematical understanding that made lasers possible. By 1917, physicists knew that quantum physics played a part in the working of atoms, but the details were fuzzy. Niels Bohr had shown in 1913 that an atom’s electrons occupy different energy levels, and that falling from a high energy level to a lower one emits radiation.
Einstein worked out the math describing this process when many atoms have electrons in high-energy states and emit radiation. His analysis of matter-radiation interaction indicated that it would be possible to prepare many atoms in the same high-energy state and then stimulate them to emit radiation all at once. Properly done, all the atoms would emit radiation of identical wavelength with the waves in phase. A few decades later other physicists figured out how to build such a device for use as a powerful weapon or to read bar codes at grocery stores.
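Einstein's bookkeeping is now textbook material. In the standard notation (the A and B coefficients, which the passage above describes but does not spell out), the populations N1 and N2 of the lower and upper energy levels, bathed in radiation of energy density ρ, change as:

```latex
\frac{dN_2}{dt} = B_{12}\,\rho\,N_1 - B_{21}\,\rho\,N_2 - A_{21}\,N_2
```

The three terms are absorption, stimulated emission and spontaneous emission. The stimulated-emission term is the one a laser exploits: each photon it produces matches the stimulating radiation in wavelength and phase, so a medium pumped so that the upper level outnumbers the lower one (a population inversion) amplifies light rather than absorbing it.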
Qubits, quadranscentennial (1992) An even better quantum anniversary than lasers is the presentation to the world of the concept of quantum bits of information. Physicist Ben Schumacher of Kenyon College in Ohio unveiled the idea at a conference in Dallas in 1992 (I was there). A “quantum bit” of information, or qubit, represents the information contained in a quantum particle, which can exist in multiple states at once. A photon, for instance, might be in states of horizontal and vertical polarization simultaneously. Or an electron’s spin could be up and down at the same time.
Such states differ from classical bits of information in a computer, recorded as either a 0 or 1; a quantum bit is both 0 and 1 at the same time. It becomes one or the other only when observed, much like a flipped coin is neither heads nor tails until somebody catches it, or it lands on the 50 yard line. Schumacher’s idea did not get a lot of attention at first, but it eventually became the foundational idea for quantum information theory, a field now booming with efforts to construct a quantum computer based on the manipulation of qubits.
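The "both 0 and 1 until observed" picture can be made concrete in a few lines. A minimal sketch (the function and state names are illustrative, not from quantum information theory's standard libraries):

```python
import math

def outcome_probabilities(alpha, beta):
    """A qubit is a pair of complex amplitudes (alpha, beta).
    A measurement yields 0 with probability |alpha|^2 and
    1 with probability |beta|^2; the two must sum to 1."""
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition: 0 and 1 at once until measured.
plus_state = (1 / math.sqrt(2), 1 / math.sqrt(2))
```

Measuring the equal superposition gives each outcome with probability 1/2, whereas a classical bit always sits definitely in one state; that extra structure is what quantum computers manipulate.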
Birth of modern cosmology, centennial (1917) It might seem unfair that Einstein gets two Top 10 anniversaries in 2017, but 1917 was a good year for him. Before publishing his laser paper, Einstein tweaked the equations of his brand-new general theory of relativity in order to better explain the universe (details in Part 1). Weirdly, Einstein didn’t understand the universe, and he later thought the term he added to his equations was a mistake. But it turns out that today’s understanding of the universe’s behavior — expanding at an accelerating rate — seems to require the term that Einstein thought he had added erroneously. But you can’t expect Einstein to have foreseen everything. He probably had no idea that lasers would revolutionize grocery shopping either.