Why the turtle got its shell

Turtle shells didn’t get their start as natural armor, it seems. The reptiles’ ancestors might have evolved partial shells to help them burrow instead, new research suggests. Only later did the hard body covering become useful for protection.

The findings might also help explain how turtles’ ancestors survived a mass extinction 250 million years ago that wiped out most plants and animals on Earth, scientists report online July 14 in Current Biology.

Most shelled animals, like armadillos, get their shells by adding bony scales all over their bodies. Turtles, though, form shells by gradually broadening their ribs until the bones fuse together. Fossils from ancient reptiles with partial shells made from thickened ribs suggest that turtles’ ancestors began to suit up in the same way.
It’s an unusual mechanism, says Tyler Lyson, a paleontologist at the Denver Museum of Nature and Science who led the study. Thicker ribs don’t offer much in the way of protection until they’re fully fused, as they are in modern turtles. And the modification makes critical functions like moving and breathing much harder — a steep price for an animal to pay. So Lyson suspected there was some advantage other than protection to the partial shells.

He and his colleagues examined fossils from prototurtles, focusing on an ancient South African reptile called Eunotosaurus africanus.

Eunotosaurus shared many characteristics with animals that dig and burrow, the researchers found. The reptile had huge claws and large triceps in addition to thickened ribs.
“We could tell that this animal was very powerful,” says Lyson.
Broad ribs “provide a really, really strong and stable base from which to operate this powerful digging mechanism,” he adds. Like a backhoe, Eunotosaurus could brace itself to burrow into the dirt.

Thanks to a lucky recent find of a fossil preserving the bones around the eyes, the team was even able to tell that the prototurtles’ eyes were well adapted to low light. That’s another characteristic of animals that spend time underground.

Swimming and digging use similar motions, Lyson says, so you would expect to find similar skeletal adaptations in water-dwelling animals. But large claws good for moving dirt suggest a life on land.

Fossils from other prototurtle species also have wider ribs and big claws. So the researchers think these traits may have been important for early turtle evolution in general, not just for Eunotosaurus.

Not everyone is entirely convinced. “It’s a very plausible idea, although many other animals burrow but don’t have these specializations,” says Hans Sues, a paleontologist at the Smithsonian Institution’s National Museum of Natural History. Sues says that it will be important to find and study other turtle ancestors well-adapted to digging to bolster the explanation.

Lyson thinks the prototurtles’ burrowing tendencies might have helped them survive the end-Permian mass extinction around 250 million years ago (SN: 9/19/15, p. 10).

“Lots of animals at this time period burrowed underground to avoid the very, very arid environment that was present in South Africa,” Lyson says. “The burrow provides more climate control.”

Nail-biting and thumb-sucking may not be all bad

There are plenty of reasons to tell kids not to bite their nails or suck their thumbs. Raw fingernail areas pick up infections, and thumb-sucking can eventually move teeth into the wrong place. Not to mention that these habits slop spit everywhere. But they might actually be good for something: Kids who sucked their thumbs or chewed their nails had lower rates of allergic reactions in lab tests, a new study finds.

The results come from a group of more than 1,000 children in New Zealand. When the kids were ages 5, 7, 9 and 11, their parents were asked if the kids sucked their thumbs or bit their nails. At age 13, the kids came into a clinic for an allergen skin prick test. That’s a procedure in which small drops of common allergens such as pet dander, wool, dust mites and fungus are put into a scratch on the skin to see if they elicit a reaction.

Kids whose parents said “certainly” to the question of thumb-sucking or nail-biting were less likely to react to allergens in the skin prick test, respiratory doctor Robert Hancox of the University of Otago in New Zealand and colleagues report July 11 in Pediatrics. And this benefit seemed to last. The childhood thumb-suckers and nail-biters still had fewer allergic reactions at age 32.

The results fit with other examples of the benefits of germs. Babies whose parents cleaned dirty pacifiers by popping them into their own mouths were more protected against allergies. And urban babies exposed to roaches, mice and cats had fewer allergies, too. These scenarios all get more germs in and on kids’ bodies. And that may be a good thing. An idea called the hygiene hypothesis holds that exposure to germs early in life can train the immune system to behave itself, preventing overreactions that may lead to allergies and asthma.

It might be the case that germy mouths bring benefits, but only when kids are young. Hancox and his colleagues don’t know when the kids in their study first started sucking thumbs or biting nails, but having spent time around little babies, I’m guessing it was pretty early.

So does this result mean that parents shouldn’t discourage — or even encourage — these habits? Hancox demurs. “We don’t have enough evidence to suggest that parents change what they do,” he says. Still, the results may offer some psychological soothing, he says. “Perhaps if children have habits that are difficult to break, there is some consolation for parents that there might be a reduced risk of developing allergy.”

Debate accelerates on universe’s expansion speed

A puzzling mismatch is plaguing two methods for measuring how fast the universe is expanding. When the discrepancy arose a few years ago, scientists suspected it would fade away, a symptom of measurement errors. But the latest, more precise measurements of the expansion rate — a number known as the Hubble constant — have only deepened the mystery.

“There’s nothing obvious in the measurements or analyses that have been done that can easily explain this away, which is why I think we are paying attention,” says theoretical physicist Marc Kamionkowski of Johns Hopkins University.
If the mismatch persists, it could reveal the existence of stealthy new subatomic particles or illuminate details of the mysterious dark energy that pushes the universe to expand faster and faster.

Measurements based on observations of supernovas, massive stellar explosions, indicate that distantly separated galaxies are spreading apart at 73 kilometers per second for each megaparsec (about 3.3 million light-years) of distance between them. Scientists used data from NASA’s Hubble Space Telescope to make their estimate, presented in a paper to be published in the Astrophysical Journal and available online at arXiv.org. The analysis pegs the Hubble constant to within experimental errors of just 2.4 percent — more precise than previous estimates using the supernova method.

But another set of measurements, made by the European Space Agency’s Planck satellite, puts the figure about 9 percent lower than the supernova measurements, at 67 km/s per megaparsec with an experimental error of less than 1 percent. That puts the two measurements in conflict. Planck’s result, reported in a paper published online May 10 at arXiv.org, is based on measurements of the cosmic microwave background radiation, ancient light that originated just 380,000 years after the Big Bang.
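The size of the mismatch can be checked with back-of-envelope arithmetic using only the numbers quoted above. A minimal sketch in Python, assuming (as a simplification) that the quoted percent errors are independent one-sigma uncertainties:

```python
# Rough consistency check between the two Hubble constant measurements.
# Values and percent errors are as quoted in the text; treating them as
# independent 1-sigma uncertainties is an illustrative simplification,
# not the teams' actual statistical analysis.

supernova_h0 = 73.0                      # km/s per megaparsec
supernova_err = 0.024 * supernova_h0     # 2.4 percent

planck_h0 = 67.0
planck_err = 0.01 * planck_h0            # under 1 percent

gap = supernova_h0 - planck_h0
combined_err = (supernova_err**2 + planck_err**2) ** 0.5

print(f"gap: {gap:.1f} km/s/Mpc")                       # 6.0
print(f"combined 1-sigma error: {combined_err:.1f}")    # 1.9
print(f"significance: {gap / combined_err:.1f} sigma")  # 3.2
```

A gap of roughly three combined standard deviations is why the result is getting harder to dismiss as a statistical fluke, though the error treatment here is deliberately crude.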

And now, another team has weighed in with a measurement of the Hubble constant. The Baryon Oscillation Spectroscopic Survey also reported that the universe is expanding at 67 km/s per megaparsec, with an error of 1.5 percent, in a paper posted online at arXiv.org on July 11. This puts BOSS in conflict with the supernova measurements as well. To make the measurement, BOSS scientists studied patterns in the clustering of 1.2 million galaxies. That clustering is the result of pressure waves in the early universe; analyzing the spacing of those imprints on the sky provides a measure of the universe’s expansion.

Although the conflict isn’t new (SN: 4/5/14, p. 18), the evidence that something is amiss has strengthened as scientists continue to refine their measurements.
The latest results are now precise enough that the discrepancy is unlikely to be a fluke. “It’s gone from looking like maybe just bad luck, to — no, this can’t be bad luck,” says the leader of the supernova measurement team, Adam Riess of Johns Hopkins. But the cause is still unknown, Riess says. “It’s kind of a mystery at this point.”
Since its birth from a cosmic speck in the Big Bang, the universe has been continually expanding. And that expansion is now accelerating, as galaxy clusters zip away from one another at an ever-increasing rate. The discovery of this acceleration in the 1990s led scientists to conclude that dark energy pervades the universe, pushing it to expand faster and faster.

As the universe expands, supernovas’ light is stretched, shifting its frequency. For objects of known distance, that frequency shift can be used to infer the Hubble constant. But measuring distances in the universe is complicated, requiring the construction of a “distance ladder,” which combines several methods that build on one another.

To create their distance ladder, Riess and colleagues combined geometrical distance measurements with “standard candles” — objects of known brightness. Since a candle that’s farther away is dimmer, if you know its absolute brightness, you can calculate its distance. For standard candles, the team used Cepheid variable stars, which pulsate at a rate that is correlated with their brightness, and type 1a supernovas, whose brightness properties are well-understood.
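The standard-candle logic rests on the inverse-square law: a candle twice as far away appears four times dimmer. Astronomers usually express this through the distance modulus. A minimal sketch, using the well-known peak brightness of type 1a supernovas and an apparent magnitude invented for illustration (not a value from the study):

```python
def candle_distance_parsecs(apparent_mag, absolute_mag):
    """Distance from the astronomical distance modulus:
    m - M = 5 * log10(d / 10 pc), so d = 10 ** ((m - M + 5) / 5)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Type 1a supernovas peak near absolute magnitude -19.3.
# If one is observed at apparent magnitude 15.7 (illustrative value),
# the distance modulus is 35, giving about 100 megaparsecs:
d = candle_distance_parsecs(15.7, -19.3)
print(f"{d:.3g} parsecs")  # about 1e8 parsecs, i.e. 100 megaparsecs
```

The hard part in practice is calibrating the absolute brightness in the first place, which is what the lower rungs of the distance ladder (geometry, then Cepheids) provide.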

Scientists on the Planck team, on the other hand, analyzed the cosmic microwave background, using variations in its temperature and polarization to calculate how fast the universe was expanding shortly after the Big Bang. The scientists used that information to predict its current rate of expansion.

As for what might be causing the persistent discrepancy between the two methods, there are no easy answers, Kamionkowski says. “In terms of exotic physics explanations, we’ve been scratching our heads.”

A new type of particle could explain the mismatch. One possibility is an undiscovered variety of neutrino, which would affect the expansion rate in the early universe, says theoretical astrophysicist David Spergel of Princeton University. “But it’s hard to fit that to the other data we have.” Instead, Spergel favors another explanation: some currently unknown feature of dark energy. “We know so little about dark energy, that would be my guess on where the solution most likely is,” he says.

If dark energy is changing with time, pushing the universe to expand faster than predicted, that could explain the discrepancy. “We could be on our way to discovering something nontrivial about the dark energy — that it is an evolving energy field as opposed to just constant,” says cosmologist Kevork Abazajian of the University of California, Irvine.

A more likely explanation, some experts say, is that a subtle aspect of one of the measurements is not fully understood. “At this point, I wouldn’t say that you would point at either one and say that there are really obvious things wrong,” says astronomer Wendy Freedman of the University of Chicago. But, she says, if the Cepheid calibration doesn’t work as well as expected, that could slightly shift the measurement of the Hubble constant.

“In order to ascertain if there’s a problem, you need to do a completely independent test,” says Freedman. Her team is working on a measurement of the Hubble constant without Cepheids, instead using two other types of stars: RR Lyrae variable stars and red giant branch stars.

Another possibility, says Spergel, is that “there’s something missing in the Planck results.” Planck scientists measure the size of temperature fluctuations between points on the sky. Points separated by larger distances on the sky give a value of the Hubble constant in better agreement with the supernova results. And measurements from a previous cosmic microwave background experiment, WMAP, are also closer to the supernova measurements.

But, says George Efstathiou, an astrophysicist at the University of Cambridge and a Planck collaboration member, “I would say that the Planck results are rock solid.” If simple explanations in both analyses are excluded, astronomers may be forced to conclude that something important is missing in scientists’ understanding of the universe.

Compared with past disagreements over values of the Hubble constant, the new discrepancy is relatively minor. “Historically, people argued vehemently about whether the Hubble constant was 50 or 100, with the two camps not conceding an inch,” says theoretical physicist Katherine Freese of the University of Michigan in Ann Arbor. The current difference between the two measurements is “tiny by the standards of the old days.”

Cosmological measurements have only recently become precise enough for a few-percent discrepancy to be an issue. “That it’s so difficult to explain is actually an indication of how far we’ve come in cosmology,” Kamionkowski says. “Twenty-five years ago you would wave your hands and make something up.”

Science finds many tricks for traveling to the past

Talking about her cover story on what iron-loving elements are telling geologists about the Earth’s deep past, Alexandra Witze likens these rare metals to time travelers. They can tell you, she says, what was happening more than 4.5 billion years ago, during the first 50 million years of our planet’s existence. By then the Earth’s molten interior had begun to settle into its current layer cake form: a dense, solid inner core surrounded by an outer liquid core — both rich in iron and metals such as gold, platinum, ruthenium and others that tend to form alloys with iron. The scarcity of these metals in the outer layers of the planet — the mantle and crust — makes them precious to us.
Their high melting points and other properties help them resist change, allowing geoscientists to use them as fingerprints that mark events in the distant past. With new, more precise analytic techniques, scientists can now measure the amounts of these iron-loving metals relative to other elements to deduce what happened to them over eons of time. These traces are found in some very old rocks, Witze reports (SN: 8/6/16, p. 22), such as 3.8-billion-year-old deposits in Greenland. But the metals also show up as ancient time capsules in younger rock. Studying these traces reveals the imperfect mixing of the mantle and can provide insight into outstanding questions, such as why amounts of these metals differ in the mantles of the moon and Earth.
Science is surprisingly adept at this type of virtual time travel. Researchers have repeatedly come up with ways to discover facts about the distant past. In this issue of Science News alone, several new findings illustrate the ability of science to figure out things that would seem impossibly difficult to know. A black hole in a distant galaxy formed over 13 billion years ago, for example, so long ago that it’s hard to even imagine reconstructing the events that led to its birth. But scientists have now pieced together clues, Christopher Crockett reports (SN: 8/6/16, p. 7), that it formed by the direct collapse of a massive gas cloud, rather than from the death of a massive star (the more common origin of black holes).

Reconstructing the evolution of the tail has been stymied by a lack of fossils from creatures that led the transition from water to land. But that hasn’t stopped scientists eager to explore the biomechanics of fishlike animals attempting to hop out of the water and up a slope. Studies of big-tailed fish called mudskippers highlight the utility of a tail in balancing flipper-hops up a sandy incline, Susan Milius reports (SN: 8/6/16, p. 13). To describe the math, scientists built a robot and made it scale an unsteady hill of shifting poppy seeds or plastic bits. Their conclusion: The tail could have been a big assist to flippered creatures emerging on sandy shores several hundred million years ago.

The story on Homo naledi by Bruce Bower (SN: 8/6/16, p. 12) shows why sometimes scientists might just prefer to actually time travel. Efforts to date the bones of this hominid species have proved frustrating; the latest estimate, 912,000 years old, was deduced from evolutionary trees. Knowing how old H. naledi actually is might reveal the diversity of relatively recent hominid species, and perhaps help piece together the story of how Homo sapiens became the sole survivors. That’s some time travel I’d be interested in booking.

Oldest evidence of cancer in human family tree found

Cancer goes way, way back. A deadly form of this disease and a noncancerous but still serious tumor afflicted members of the human evolutionary family nearly 2 million years ago, two new investigations of fossils suggest.

If those conclusions hold up, cancers are not just products of modern societies, as some researchers have proposed. “Our studies show that cancers and tumors occurred in our ancient relatives millions of years before modern industrial societies existed,” says medical anthropologist Edward Odes of the University of the Witwatersrand in Johannesburg, a coauthor of both new studies. Today, however, pesticides, longer life spans and other features of the industrialized world may increase rates of cancers and tumors.
A 1.6-million- to 1.8-million-year-old hominid, either from the Homo genus or a dead-end line called Paranthropus, suffered from a potentially fatal bone cancer, Odes and colleagues say in one of two papers published in the July/August South African Journal of Science. Advanced X-ray techniques enabled identification of a fast-growing bone cancer in a hominid toe fossil previously unearthed at South Africa’s Swartkrans Cave site, the researchers report. The malignant growth formed a mass of bone both on the toe’s surface and inside the bone.

Until now, the oldest proposed cancer in hominids consisted of an unusual growth on an African Homo erectus jaw fragment dating to roughly 1.5 million years ago. Critics, though, regard that growth as the result of a fractured jaw, not cancer.

A second new study, led by biological anthropologist Patrick Randolph-Quinney, now at the University of Central Lancashire in England, identifies the oldest known benign tumor in a hominid, in a bone from an Australopithecus sediba child. This tumor penetrated deep into a spinal bone, close to an opening for the spinal cord. Nearly 2-million-year-old partial skeletons of the child and an adult of the same species were found in an underground cave at South Africa’s Malapa site (SN: 8/10/13, p. 26).
Although not life-threatening, this tumor would have interfered with walking, running and climbing, the researchers say. People today, especially children, rarely develop such tumors in spinal bones.

“This is the first evidence of such a disease in a young individual in the fossil record,” Randolph-Quinney says.

X-ray technology allowed scientists to create and analyze 3-D copies of the inside and outside of the toe and spine fossils.

But studies of fossil bones alone, even with sophisticated imaging technology, provide “a very small window” for detecting cancers and tumors, cautions paleoanthropologist Janet Monge of the University of Pennsylvania Museum of Archaeology and Anthropology in Philadelphia. Microscopic analysis of soft-tissue cells, which are typically absent on fossils, confirms cancer diagnoses in people today, she says.

Without additional evidence of bone changes in and around the proposed cancer and tumor, Monge won’t draw any conclusions about what caused those growths.

Monge led a team that found a tumor on a 120,000- to 130,000-year-old Neandertal rib bone from Eastern Europe. Whether the tumor was cancerous or caused serious health problems can’t be determined, the scientists concluded in 2013 in PLOS ONE.

Bottom quarks misbehave in LHC experiment

CHICAGO — Theoretical physicists are scratching their heads after scientists presented surprising new studies of a particle known as the bottom quark.

At the new, higher energies recently reached at the Large Hadron Collider particle accelerator, particles containing bottom quarks flew off at an angle more often than expected. Scientists reported the result August 4 at the International Conference on High Energy Physics.

Quarks make up larger particles like the proton and neutron. At the LHC, near Geneva, scientists smash together protons to produce new particles, including bottom quarks.
Those bottom quarks are bound together with other quarks into larger particles known as b hadrons. Scientists with LHCb, an experiment at the LHC, found an unexpected behavior in b hadrons that sped off at an angle from beams of colliding protons, rather than continuing on a nearly parallel trajectory. At high energies, the number of b hadrons flying off at an angle, relative to those at lower energies, was almost twice as large as expected.

The discrepancy could point to a problem with scientists’ predictions of how the particles should behave. Such predictions are based on the theory of how quarks interact, known as quantum chromodynamics, or QCD, which is important for grasping the inner workings of protons and neutrons. “Understanding QCD really sets the basis of our understanding of nature,” says LHCb member Marina Artuso of Syracuse University in New York.

Scientists who make predictions for how b hadrons should behave have had trouble explaining the discrepancy. “Whichever way you turn it, it’s really weird. Which to me, personally, makes it extremely exciting,” says theoretical physicist Michelangelo Mangano of CERN, the European particle physics lab that operates the LHC.

But, he cautions, it’s unlikely to be an indication of phenomena that would upend the standard model of particle physics. Rather, it may be that calculations need further refinement, or that scientists need to tweak their understanding of the proton by altering estimates of the momentum carried by the various particles found inside it.

The issue could also lie with LHCb’s measurement, but the scientists say they are very confident in their result. The team continues to study the data to better characterize the effect.

Scientists get a glimpse of chemical tagging in live brains

For the first time, scientists can see where molecular tags known as epigenetic marks are altered in the brain.

These chemical tags — which flag DNA or its protein associates, known as histones — don’t change the genes but can change gene activity. Abnormal epigenetic marks have been associated with brain disorders such as Alzheimer’s disease, schizophrenia, depression and addiction.

Researchers at Massachusetts General Hospital in Boston devised a tracer molecule that latches on to a protein that removes one type of epigenetic mark known as histone acetylation.

The scientists then used PET scans to detect where a radioactive version of the tracer appeared in the brains of eight healthy young adult men and women, the researchers report in the Aug. 10 Science Translational Medicine. Because the team studied only healthy young volunteers, it can’t yet say whether epigenetic marking changes with age or disease; further studies could answer that question.

Tabby’s star drama continues

A star that made headlines for its bizarre behavior has one more mystery for astronomers to ponder.

Tabby’s star, also known as KIC 8462852, has been inexplicably flickering and fading. The Kepler Space Telescope caught two dramatic drops in light — by up to 22 percent — spaced nearly two years apart. Photographs from other telescopes dating back to 1890 show that the star also faded by roughly 20 percent over much of the last century. Possible explanations for the behavior range from mundane comet swarms to fantastical alien engineering projects (SN Online: 2/2/16).
A new analysis of data from Kepler, NASA’s premier planet hunter, shows that Tabby’s star steadily darkened throughout the telescope’s primary four-year mission. That’s in addition to the abrupt flickers already seen during the same time period. Over the first 1,100 days, the star dimmed by nearly 1 percent. Then the light dropped another 2.5 percent over the following six months before leveling off during the mission’s final 200 days.
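The two dimming phases imply strikingly different rates, which simple arithmetic makes plain (figures taken from the numbers above; durations are approximate and this is an illustration, not the authors’ analysis):

```python
# Back-of-envelope comparison of the two dimming phases in the Kepler
# data: ~1 percent over the first 1,100 days, then ~2.5 percent over
# the following six months. Durations are approximate.

slow_rate = 1.0 / (1100 / 365.25)  # percent per year, first phase
fast_rate = 2.5 / 0.5              # percent per year, six-month phase

print(f"slow phase: {slow_rate:.2f} percent/yr")  # 0.33
print(f"fast phase: {fast_rate:.1f} percent/yr")  # 5.0
print(f"ratio: {fast_rate / slow_rate:.0f}x")     # 15
```

A roughly fifteenfold jump in the dimming rate, on top of the abrupt 20-percent dips, is what makes the star’s behavior so hard to pin on a single mechanism.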

Astronomers Benjamin Montet of Caltech and Josh Simon of the Observatories of the Carnegie Institution of Washington in Pasadena, Calif., report the findings online August 4 at arXiv.org.

The new data support a previous claim that the star faded between 1890 and 1989, a claim that some researchers questioned. “It’s just getting stranger,” says Jason Wright, an astronomer at Penn State University. “This is a third way in which the star is weird. Not only is it getting dimmer, it’s doing so at different rates.”
The slow fading hadn’t been noticed before because data from Kepler are processed to remove long-term trends that might confuse planet-finding algorithms. To find the dimming, Montet and Simon analyzed images from the telescope that are typically used only to calibrate data.
“Their analysis is very thorough,” says Tabetha Boyajian, an astronomer at Yale University who in 2015 reported the two precipitous drops in light (and for whom the star is nicknamed). “I see no flaws in that at all.”

While the analysis is an important clue, it doesn’t yet explain the star’s erratic behavior. “It doesn’t push us in any direction because it’s nothing that we’ve ever encountered before,” says Boyajian. “I’ve said ‘I don’t know’ so many times at this point.”

An object (or objects) moving in front of the star and blocking some of the light is still the favored explanation — though no one has figured out what that object is. The drop in light roughly 1,100 days into Kepler’s mission is reminiscent of a planet crossing in front of a star, Montet says. But given how slowly the light dropped, such a planet (or dim star) would have to live on an orbit more than 60 light-years across. The odds of catching a body on such a wide, slow orbit as it passed in front of the star are so low, says Montet, that you would need 10,000 Kepler missions to see just one. “We figure that’s pretty unlikely.”

An interstellar cloud wandering between Earth and KIC 8462852 is also unlikely, Wright says. “If the interstellar medium had these sorts of clumps and knots, it should be a ubiquitous phenomenon. We would have known about this for decades.” While some quasars and pulsars appear to flicker because of intervening material, the variations are minute and nothing like the 20 percent dips seen in Tabby’s star.

A clump of gas and dust orbiting the star — possibly produced by a collision between comets — is a more likely candidate, although that doesn’t explain the century-long dimming. “Nothing explains all the effects we see,” says Montet.

Given the star’s unpredictable nature, astronomers need constant vigilance to solve this mystery. The American Association of Variable Star Observers is working with amateur astronomers to gather continuous data from backyard telescopes around the globe. Boyajian and colleagues are preparing to monitor KIC 8462852 with the Las Cumbres Observatory Global Telescope Network, a worldwide web of telescopes that can keep an incessant eye on the star. “At this point, that’s the only thing that’s going to help us figure out what it is,” she says.

Fentanyl’s death toll is rising

For some people, fentanyl can be a life-saver, easing profound pain. But outside of a doctor’s office, the powerful opioid drug is also a covert killer.

In the last several years, clandestine drugmakers have begun experimenting with this ingredient, baking it into drugs sold on the streets, most notably heroin. Fentanyl and closely related compounds have “literally invaded the entire heroin supply,” says medical toxicologist Lewis Nelson of New York University Langone Medical Center.

Fentanyl is showing up in other drugs, too. In San Francisco’s Bay Area in March, high doses of fentanyl were laced into counterfeit versions of the pain pill Norco. In January, fentanyl was found in illegal pills sold as oxycodone in New Jersey. And in late 2015, fentanyl turned up in fake Xanax pills in California.
This ubiquitous recipe-tinkering makes it impossible for users to know whether they’re about to take drugs mixed with fentanyl. And that uncertainty has proved deadly. National numbers are hard to come by, but in many regions around the United States, fentanyl-related fatalities have soared in recent years.

Maryland is one of the hardest-hit states. From 2007 to 2012, the number of fentanyl-related deaths hovered around 30 per year. By 2015, that number had grown to 340. A similar rise is obvious in Connecticut, where in 2012, there were 14 fentanyl-related deaths. In 2015, that number was 188.
In Massachusetts, two-thirds of people who died from opioid overdoses in the first half of 2016 showed signs of fentanyl. This wave of fentanyl-related overdoses is “horrendous,” says Daniel Ciccarone of the University of California, San Francisco. What’s worse, he says, “I think it’s here to stay.”
Fentanyl is not a new drug. Introduced in the 1960s, it is still used in hospitals as an anesthetic and is available by prescription to fight powerful pain. What’s new, Ciccarone says, is that clandestine drug manufacturers have discovered that the euphoria-producing opioid can be made cheaply and easily — no poppy fields necessary.

Fentanyl is about 30 to 40 times stronger than heroin and up to 100 times more powerful than morphine, which means that a given effect on the body can be achieved with a much smaller amount of fentanyl. Inadvertently taking a bit of fentanyl can cause big trouble. “It’s a dosing problem,” Nelson says. “Because the drug is so potent, little changes in measurements can have very big implications for toxicity. That’s really the problem.”
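The dosing problem Nelson describes comes down to arithmetic: at roughly 30 to 40 times heroin’s potency, an equivalent fentanyl amount is well under a milligram, so tiny measurement errors loom large. A hypothetical sketch (the potency ratio is from the text; the dose figure is invented purely for illustration and is not medical information):

```python
# Illustrative dosing arithmetic. The potency ratio comes from the
# text; the heroin dose below is a hypothetical round number chosen
# only to show the scale of the problem.

heroin_dose_mg = 30.0     # hypothetical dose, for illustration only
potency_vs_heroin = 35    # fentanyl is ~30-40x stronger; use midpoint

equivalent_fentanyl_mg = heroin_dose_mg / potency_vs_heroin
print(f"equivalent fentanyl: {equivalent_fentanyl_mg:.2f} mg")  # 0.86

# A 1 mg measuring error is trivial for heroin but exceeds the
# entire equivalent fentanyl amount:
error_mg = 1.0
print(f"1 mg error = {error_mg / equivalent_fentanyl_mg:.1f}x the dose")
```

At these scales, the sloppiness of illegal labs translates directly into overdoses.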

That problem is made worse by the variability of illegal drugs — users often don’t know what they’re buying. Illegal labs aren’t pumping out products with carefully calibrated doses or uniform chemical makeup. The drugs change from day to day, making it nearly impossible for a user to know what he or she is about to take, Ciccarone says.

He has seen this struggle up close. Drug users have told him that the products they buy are unpredictable. Another thing people are telling him: “That they and their friends and compatriots are dropping like flies.” Tellingly, some of the most experienced drug users have recently begun doing “tester shots,” small doses to get a sense of the type and dose of drug they’re about to use, Ciccarone says.

Users are right to be wary. Typically, opioids can kill by gradually depressing a person’s ability to breathe. Illicit fentanyl, a recent study suggests, can kill within minutes by paralyzing muscles. Doctors have known that when injected quickly, fentanyl can paralyze chest wall muscles, prevent breathing and kill a person rapidly. That effect, called “wooden chest,” might help explain the rise in fentanyl-related deaths, scientists report in the June Clinical Toxicology.

A quick injection of fentanyl “literally freezes the muscles and you can’t move the chest,” says toxicologist Henry Spiller of the Central Ohio Poison Center in Columbus. That’s why doctors who dispense fentanyl in the hospital intentionally proceed very slowly and keep the opioid-counteracting drug naloxone (Narcan) on hand. “If you give it too fast, we know this occurs,” Spiller says. But it wasn’t known whether this same phenomenon might help explain the death rate of people using the drug illegally.

Spiller and colleagues tested post-mortem concentrations of fentanyl and its breakdown product norfentanyl in 48 fentanyl-related deaths. The body usually begins breaking down fentanyl into norfentanyl within two minutes, an earlier study found. Yet in 20 of the cases, the researchers found no signs of norfentanyl, indicating death came almost immediately after first receiving fentanyl.
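The study's inference rule is simple enough to sketch: since the body starts converting fentanyl into norfentanyl within about two minutes, a post-mortem sample containing fentanyl but no detectable norfentanyl points to near-immediate death. A minimal sketch of that logic, with hypothetical case records and an assumed detection threshold (only the inference rule itself comes from the article):

```python
# Sketch of the metabolite-based inference described in the study:
# fentanyl present but norfentanyl undetectable suggests death came
# within minutes of exposure. Case values below are hypothetical.

def rapid_death_suspected(fentanyl_ng_ml, norfentanyl_ng_ml,
                          detection_limit=0.5):
    """Flag a case as near-immediate death when fentanyl is present
    but its breakdown product norfentanyl is undetectable."""
    return (fentanyl_ng_ml > detection_limit
            and norfentanyl_ng_ml <= detection_limit)

cases = [
    {"fentanyl": 12.0, "norfentanyl": 0.0},  # no metabolite: rapid death
    {"fentanyl": 8.5,  "norfentanyl": 3.2},  # metabolite present: survived longer
    {"fentanyl": 20.1, "norfentanyl": 0.3},  # below detection limit: rapid death
]

flagged = [rapid_death_suspected(c["fentanyl"], c["norfentanyl"])
           for c in cases]
print(flagged)  # [True, False, True]
```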

Naloxone can counteract the effects of opioids if someone nearby can administer the antidote. But for people whose chests quickly freeze from fentanyl, resuscitation becomes far less likely. Fentanyl "is just a bad drug," Spiller says.

Fentanyl's danger is magnified for people not accustomed to taking opioids, such as those addicted to cocaine, a situation illustrated by a recent tragedy in New Haven, Conn.

In late June, New Haven authorities noticed a string of suspicious overdoses that left three people dead. Drug users thought they were buying cocaine, but the drugs contained fentanyl, says analytical toxicologist Kara Lynch of the University of California, San Francisco. As one of the handful of labs capable of testing blood and urine for fentanyl, hers was called on to identify the culprit. Her lab had also spotted fentanyl in Norco tablets back in March.

Lynch’s group uses high-resolution mass spectrometry to detect many drugs’ chemical signatures. But this method reveals only the drugs scientists suspect. “We can look for what we know to look for,” she says. And success depends on getting the samples in the first place.

The logistical hurdles of figuring out exactly what a person took, and how much, and when, are large. Ciccarone contrasts the situation with cases of food poisoning. When people start getting sick, public health officials can figure out what lettuce people ate and test it for pathogens. The same kind of tracking system doesn’t exist for drugs. His efforts to develop a system for testing illegal drugs in Baltimore broke down in part because no one had time to do the work. “The coroner is so busy right now with dead bodies,” he says. “They don’t have the time to test the ‘lettuce.’ ”

In the quest to curb fentanyl-related deaths, scientists and public health officials are searching for new strategies. Spiller advocates a more targeted public health message to users, one that emphasizes that fentanyl is simply a deadly drug, not just a more potent high. Ciccarone says that facilities where drug users can take illegal drugs under the care of medical personnel might reduce the number of fatalities.

For now, the scope of the problem continues to grow, Nelson says. The situation is made worse by the ingenuity of illicit drugmakers, who readily experiment with new compounds. Fentanyl itself can be tweaked to create at least 16 related forms, one of which, acetyl fentanyl, has been linked to overdose deaths. New drugs and new tweaks to old drugs rapidly evolve (SN: 5/16/15, p. 22), Nelson says, creating a game of whack-a-mole in which designer drugs confound public health officials and law enforcement.

“There is no single easy solution to this problem,” he says.

Lyme bacteria swap ‘catch bonds’ to navigate blood vessels

To zip through the bloodstream and spread infection throughout the body, the bacteria that cause Lyme disease take a cue from the white blood cells trying to attack them. Both use specialized bonds to stick to the cells lining blood vessels and move along at their own pace, biologist Tara Moriarty and colleagues report September 6 in Cell Reports.

“It’s really an amazing case of convergent evolution,” says Wendy Thomas, a biologist at the University of Washington in Seattle who wasn’t part of the study. “There’s little structural similarity between the molecules involved in these behaviors, and yet their behavior is the same.”

Traveling through the bloodstream is more like a whitewater rafting adventure than a lazy Sunday afternoon float. It can be a highly efficient way for bacteria to spread from an infection site to set up shop elsewhere in the body, but the microbes need some way to control where they go instead of just being swept away. So to move at their own speed while withstanding the forces of blood flow, bacteria creep along the side walls by steadily making and breaking bonds with other cells, says Moriarty, of the University of Toronto.

Borrelia burgdorferi is a corkscrew-shaped bacterium that causes Lyme disease. It works its way into the human body via a bite from an infected tick but then spreads through the whole body, causing joint pain and neurological problems. Biologists have known that B. burgdorferi can move in and out of the bloodstream, says Mark Wooten, a microbiologist at the University of Toledo in Ohio who wasn’t involved in the work. But this study gives a detailed explanation of exactly how it might do so.

Moriarty and her colleagues lined flow chambers with human endothelial cells to mimic the bloodstream environment. Then her team used high-powered microscopes to watch what happened to bacteria moving through the chamber along with blood cells. A computer program helped the scientists track exactly how individual bacteria navigated the mock bloodstream.

The researchers found that B. burgdorferi making a protein called BBK32 form specialized links called “catch bonds” with the endothelial cells — a technique that white blood cells also use. Catch bonds get stronger when under mechanical stress, helping the bacteria to hold on under pressure. Bungee cord–like structures called tethers work alongside the catch bonds to even out the load placed on the bonds.

But B. burgdorferi need to move to infect, and if they let go of the blood vessel walls completely, they’ll be washed away. So like someone moving hand-over-hand across monkey bars, B. burgdorferi shift their load from bond to bond. As they break one bond, they transfer their load to a new bond, moving steadily forward while remaining continually attached. White blood cells use a similar trick to move across endothelial cells.

B. burgdorferi can also use whiplike appendages called flagella to control their movement through the bloodstream. In B. burgdorferi and other related bacteria, the flagella wrap around the bacteria to help the microbes propel themselves forward like drill bits. The force generated by the B. burgdorferi flagella is greater than the forces trying to rip the bacteria off the blood vessel walls, Moriarty and her colleagues found.

“What that basically means is that bacteria are strong enough to overcome the force that they experience under blood flow, which means they should be theoretically strong enough to get to a spot where they can exit the bloodstream,” says Moriarty. That might allow Lyme bacteria to control when and where they exit the bloodstream to infect other organs.