Debate accelerates on universe’s expansion speed

A puzzling mismatch is plaguing two methods for measuring how fast the universe is expanding. When the discrepancy arose a few years ago, scientists suspected it would fade away, a symptom of measurement errors. But the latest, more precise measurements of the expansion rate — a number known as the Hubble constant — have only deepened the mystery.

“There’s nothing obvious in the measurements or analyses that have been done that can easily explain this away, which is why I think we are paying attention,” says theoretical physicist Marc Kamionkowski of Johns Hopkins University.

If the mismatch persists, it could reveal the existence of stealthy new subatomic particles or illuminate details of the mysterious dark energy that pushes the universe to expand faster and faster.

Measurements based on observations of supernovas, massive stellar explosions, indicate that distantly separated galaxies are spreading apart at 73 kilometers per second for each megaparsec (about 3.3 million light-years) of distance between them. Scientists used data from NASA’s Hubble Space Telescope to make their estimate, presented in a paper to be published in the Astrophysical Journal and available online at arXiv.org. The analysis pins down the Hubble constant with an experimental uncertainty of just 2.4 percent — more precise than previous estimates using the supernova method.

But another set of measurements, made by the European Space Agency’s Planck satellite, puts the figure about 9 percent lower than the supernova measurements, at 67 km/s per megaparsec with an experimental error of less than 1 percent. That puts the two measurements in conflict. Planck’s result, reported in a paper published online May 10 at arXiv.org, is based on measurements of the cosmic microwave background radiation, ancient light that originated just 380,000 years after the Big Bang.

And now, another team has weighed in with a measurement of the Hubble constant. The Baryon Oscillation Spectroscopic Survey also reported that the universe is expanding at 67 km/s per megaparsec, with an error of 1.5 percent, in a paper posted online at arXiv.org on July 11. This puts BOSS in conflict with the supernova measurements as well. To make the measurement, BOSS scientists studied patterns in the clustering of 1.2 million galaxies. That clustering is the result of pressure waves in the early universe; analyzing the spacing of those imprints on the sky provides a measure of the universe’s expansion.

Although the conflict isn’t new (SN: 4/5/14, p. 18), the evidence that something is amiss has strengthened as scientists continue to refine their measurements.
The latest results are now precise enough that the discrepancy is unlikely to be a fluke. “It’s gone from looking like maybe just bad luck, to — no, this can’t be bad luck,” says the leader of the supernova measurement team, Adam Riess of Johns Hopkins. But the cause is still unknown, Riess says. “It’s kind of a mystery at this point.”

Since its birth from a cosmic speck in the Big Bang, the universe has been continually expanding. And that expansion is now accelerating, as galaxy clusters zip away from one another at an ever-increasing rate. The discovery of this acceleration in the 1990s led scientists to conclude that dark energy pervades the universe, pushing it to expand faster and faster.

As the universe expands, supernovas’ light is stretched, shifting its frequency. For objects of known distance, that frequency shift can be used to infer the Hubble constant. But measuring distances in the universe is complicated, requiring the construction of a “distance ladder,” which combines several methods that build on one another.

To create their distance ladder, Riess and colleagues combined geometrical distance measurements with “standard candles” — objects of known brightness. Since a candle that’s farther away is dimmer, if you know its absolute brightness, you can calculate its distance. For standard candles, the team used Cepheid variable stars, which pulsate at a rate that is correlated with their brightness, and type Ia supernovas, whose brightness properties are well-understood.
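The standard-candle logic, and the step from distances to an expansion rate, can be sketched in a few lines of code. This is an illustrative toy under stated assumptions, not the team's actual pipeline; the luminosity, flux and Hubble-constant values are placeholders:

```python
import math

def candle_distance_mpc(absolute_luminosity, apparent_flux):
    """Inverse-square law: flux = L / (4*pi*d^2), so d = sqrt(L / (4*pi*flux)).
    Units are arbitrary but must be consistent; the distance comes out in
    whatever length unit the luminosity/flux pair implies."""
    return math.sqrt(absolute_luminosity / (4 * math.pi * apparent_flux))

def recession_speed_km_s(distance_mpc, hubble_constant=73.0):
    """Hubble's law, v = H0 * d: each megaparsec of distance adds
    H0 km/s of recession speed. 73.0 is the supernova team's value."""
    return hubble_constant * distance_mpc

# A galaxy 100 megaparsecs away recedes at 7,300 km/s under H0 = 73.
print(recession_speed_km_s(100))  # 7300.0
```

The inverse-square step is why a standard candle works: once the absolute brightness is known, the observed dimming alone fixes the distance, and the galaxy's measured recession speed then yields the Hubble constant.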

Scientists on the Planck team, on the other hand, analyzed the cosmic microwave background, using variations in its temperature and polarization to calculate how fast the universe was expanding shortly after the Big Bang. The scientists used that information to predict its current rate of expansion.

As for what might be causing the persistent discrepancy between the two methods, there are no easy answers, Kamionkowski says. “In terms of exotic physics explanations, we’ve been scratching our heads.”

A new type of particle could explain the mismatch. One possibility is an undiscovered variety of neutrino, which would affect the expansion rate in the early universe, says theoretical astrophysicist David Spergel of Princeton University. “But it’s hard to fit that to the other data we have.” Instead, Spergel favors another explanation: some currently unknown feature of dark energy. “We know so little about dark energy, that would be my guess on where the solution most likely is,” he says.

If dark energy is changing with time, pushing the universe to expand faster than predicted, that could explain the discrepancy. “We could be on our way to discovering something nontrivial about the dark energy — that it is an evolving energy field as opposed to just constant,” says cosmologist Kevork Abazajian of the University of California, Irvine.

A more likely explanation, some experts say, is that a subtle aspect of one of the measurements is not fully understood. “At this point, I wouldn’t say that you would point at either one and say that there are really obvious things wrong,” says astronomer Wendy Freedman of the University of Chicago. But, she says, if the Cepheid calibration doesn’t work as well as expected, that could slightly shift the measurement of the Hubble constant.

“In order to ascertain if there’s a problem, you need to do a completely independent test,” says Freedman. Her team is working on a measurement of the Hubble constant without Cepheids, instead using two other types of stars: RR Lyrae variable stars and red giant branch stars.

Another possibility, says Spergel, is that “there’s something missing in the Planck results.” Planck scientists measure the size of temperature fluctuations between points on the sky. Points separated by larger distances on the sky give a value of the Hubble constant in better agreement with the supernova results. And measurements from a previous cosmic microwave background experiment, WMAP, are also closer to the supernova measurements.

But, says George Efstathiou, an astrophysicist at the University of Cambridge and a Planck collaboration member, “I would say that the Planck results are rock solid.” If simple explanations in both analyses are excluded, astronomers may be forced to conclude that something important is missing in scientists’ understanding of the universe.

Compared with past disagreements over values of the Hubble constant, the new discrepancy is relatively minor. “Historically, people argued vehemently about whether the Hubble constant was 50 or 100, with the two camps not conceding an inch,” says theoretical physicist Katherine Freese of the University of Michigan in Ann Arbor. The current difference between the two measurements is “tiny by the standards of the old days.”

Cosmological measurements have only recently become precise enough for a few-percent discrepancy to be an issue. “That it’s so difficult to explain is actually an indication of how far we’ve come in cosmology,” Kamionkowski says. “Twenty-five years ago you would wave your hands and make something up.”

Oldest evidence of cancer in human family tree found

Cancer goes way, way back. A deadly form of this disease and a noncancerous but still serious tumor afflicted members of the human evolutionary family nearly 2 million years ago, two new investigations of fossils suggest.

If those conclusions hold up, cancers are not just products of modern societies, as some researchers have proposed. “Our studies show that cancers and tumors occurred in our ancient relatives millions of years before modern industrial societies existed,” says medical anthropologist Edward Odes of the University of the Witwatersrand in Johannesburg, a coauthor of both new studies. Today, however, pesticides, longer life spans and other features of the industrialized world may increase rates of cancers and tumors.

A 1.6-million- to 1.8-million-year-old hominid, either from the Homo genus or a dead-end line called Paranthropus, suffered from a potentially fatal bone cancer, Odes and colleagues say in one of two papers published in the July/August South African Journal of Science. Advanced X-ray techniques enabled identification of a fast-growing bone cancer on a hominid toe fossil previously unearthed at South Africa’s Swartkrans Cave site, the researchers report. This malignant cancer consisted of a mass of bone growth on both the toe’s surface and inside the bone.

Until now, the oldest proposed cancer in hominids consisted of an unusual growth on an African Homo erectus jaw fragment dating to roughly 1.5 million years ago. Critics, though, regard that growth as the result of a fractured jaw, not cancer.

A second new study, led by biological anthropologist Patrick Randolph-Quinney, now at the University of Central Lancashire in England, identifies the oldest known benign tumor in a hominid in a bone from an Australopithecus sediba child. This tumor penetrated deep into a spinal bone, close to an opening for the spinal cord. Nearly 2-million-year-old partial skeletons of the child and an adult of the same species were found in an underground cave at South Africa’s Malapa site (SN: 8/10/13, p. 26).
Although not life-threatening, this tumor would have interfered with walking, running and climbing, the researchers say. People today, especially children, rarely develop such tumors in spinal bones.

“This is the first evidence of such a disease in a young individual in the fossil record,” Randolph-Quinney says.

X-ray technology allowed scientists to create and analyze 3-D copies of the inside and outside of the toe and spine fossils.

But studies of fossil bones alone, even with sophisticated imaging technology, provide “a very small window” for detecting cancers and tumors, cautions paleoanthropologist Janet Monge of the University of Pennsylvania Museum of Archaeology and Anthropology in Philadelphia. Microscopic analysis of soft-tissue cells, which are typically absent on fossils, confirms cancer diagnoses in people today, she says.

Without additional evidence of bone changes in and around the proposed cancer and tumor, Monge won’t draw any conclusions about what caused those growths.

Monge led a team that found a tumor on a 120,000- to 130,000-year-old Neandertal rib bone from Eastern Europe. Whether the tumor was cancerous or caused serious health problems can’t be determined, the scientists concluded in 2013 in PLOS ONE.

Bottom quarks misbehave in LHC experiment

CHICAGO — Theoretical physicists are scratching their heads after scientists presented surprising new studies of a particle known as the bottom quark.

At the new, higher energies recently reached at the Large Hadron Collider particle accelerator, particles containing bottom quarks flew off at an angle more often than expected. Scientists reported the result August 4 at the International Conference on High Energy Physics.

Quarks make up larger particles like the proton and neutron. At the LHC, near Geneva, scientists smash together protons to produce new particles, including bottom quarks.
Those bottom quarks are bound together with other quarks into larger particles known as b hadrons. Scientists with LHCb, an experiment at the LHC, found an unexpected behavior in b hadrons that sped off at an angle from beams of colliding protons, rather than continuing on a nearly parallel trajectory. At high energies, the number of b hadrons flying off at an angle, relative to those at lower energies, was almost twice as large as expected.

The discrepancy could point to a problem with scientists’ predictions of how the particles should behave. Such predictions are based on the theory of how quarks interact, known as quantum chromodynamics, or QCD, which is important for grasping the inner workings of protons and neutrons. “Understanding QCD really sets the basis of our understanding of nature,” says LHCb member Marina Artuso of Syracuse University in New York.

Scientists who make predictions for how b hadrons should behave have had trouble explaining the discrepancy. “Whichever way you turn it, it’s really weird. Which to me, personally, makes it extremely exciting,” says theoretical physicist Michelangelo Mangano of CERN, the European particle physics lab that operates the LHC.

But, he cautions, it’s unlikely to be an indication of phenomena that would upend the standard model of particle physics. Rather, it may be that calculations need further refinement, or that scientists need to tweak their understanding of the proton by altering estimates of the momentum carried by the various particles found inside it.

The issue could also lie with LHCb’s measurement, but the scientists say they are very confident in their result. The team continues to study the data to better characterize the effect.

Scientists get a glimpse of chemical tagging in live brains

For the first time, scientists can see where molecular tags known as epigenetic marks are altered in the brain.

These chemical tags — which flag DNA or its protein associates, known as histones — don’t change the genes but can change gene activity. Abnormal epigenetic marks have been associated with brain disorders such as Alzheimer’s disease, schizophrenia, depression and addiction.

Researchers at Massachusetts General Hospital in Boston devised a tracer molecule that latches on to a protein that removes one type of epigenetic mark known as histone acetylation.

The scientists then used PET scans to detect where a radioactive version of the tracer appeared in the brains of eight healthy young adult men and women, the researchers report in the Aug. 10 Science Translational Medicine. Because the team studied only healthy young volunteers, it can’t yet say whether epigenetic marking shifts as people grow older or develop a disease — a question further studies could address.

Tabby’s star drama continues

A star that made headlines for its bizarre behavior has served up one more mystery for astronomers to ponder.

Tabby’s star, also known as KIC 8462852, has been inexplicably flickering and fading. The Kepler Space Telescope caught two dramatic drops in light — by up to 22 percent — spaced nearly two years apart. Photographs from other telescopes dating back to 1890 show that the star also faded by roughly 20 percent over much of the last century. Possible explanations for the behavior range from mundane comet swarms to fantastical alien engineering projects (SN Online: 2/2/16).

A new analysis of data from Kepler, NASA’s premier planet hunter, shows that Tabby’s star steadily darkened throughout the telescope’s primary four-year mission. That’s in addition to the abrupt flickers already seen during the same time period. Over the first 1,100 days, the star dimmed by nearly 1 percent. Then the light dropped another 2.5 percent over the following six months before leveling off during the mission’s final 200 days.

Astronomers Benjamin Montet of Caltech and Josh Simon of the Observatories of the Carnegie Institution of Washington in Pasadena, Calif., report the findings online August 4 at arXiv.org.

The new data support a previous claim that the star faded between 1890 and 1989, a claim that some researchers questioned. “It’s just getting stranger,” says Jason Wright, an astronomer at Penn State University. “This is a third way in which the star is weird. Not only is it getting dimmer, it’s doing so at different rates.”

The slow fading hadn’t been noticed before because data from Kepler are processed to remove long-term trends that might confuse planet-finding algorithms. To find the dimming, Montet and Simon analyzed images from the telescope that are typically used only to calibrate data.

“Their analysis is very thorough,” says Tabetha Boyajian, an astronomer at Yale University who in 2015 reported the two precipitous drops in light (and for whom the star is nicknamed). “I see no flaws in that at all.”

While the analysis is an important clue, it doesn’t yet explain the star’s erratic behavior. “It doesn’t push us in any direction because it’s nothing that we’ve ever encountered before,” says Boyajian. “I’ve said ‘I don’t know’ so many times at this point.”

An object (or objects) moving in front of the star and blocking some of the light is still the favored explanation — though no one has figured out what that object is. The drop in light roughly 1,100 days into Kepler’s mission is reminiscent of a planet crossing in front of a star, Montet says. But given how slowly the light dropped, such a planet (or dim star) would have to live on an orbit more than 60 light-years across. The odds of catching a body on such a wide, slow orbit as it passed in front of the star are so low, says Montet, that you would need 10,000 Kepler missions to see just one. “We figure that’s pretty unlikely.”

An interstellar cloud wandering between Earth and KIC 8462852 is also unlikely, Wright says. “If the interstellar medium had these sorts of clumps and knots, it should be a ubiquitous phenomenon. We would have known about this for decades.” While some quasars and pulsars appear to flicker because of intervening material, the variations are minute and nothing like the 20 percent dips seen in Tabby’s star.

A clump of gas and dust orbiting the star — possibly produced by a collision between comets — is a more likely candidate, although that doesn’t explain the century-long dimming. “Nothing explains all the effects we see,” says Montet.

Given the star’s unpredictable nature, astronomers need constant vigilance to solve this mystery. The American Association of Variable Star Observers is working with amateur astronomers to gather continuous data from backyard telescopes around the globe. Boyajian and colleagues are preparing to monitor KIC 8462852 with the Las Cumbres Observatory Global Telescope Network, a worldwide web of telescopes that can keep an incessant eye on the star. “At this point, that’s the only thing that’s going to help us figure out what it is,” she says.

Fentanyl’s death toll is rising

For some people, fentanyl can be a life-saver, easing profound pain. But outside of a doctor’s office, the powerful opioid drug is also a covert killer.

In the last several years, clandestine drugmakers have begun experimenting with this ingredient, baking it into drugs sold on the streets, most notably heroin. Fentanyl and closely related compounds have “literally invaded the entire heroin supply,” says medical toxicologist Lewis Nelson of New York University Langone Medical Center.

Fentanyl is showing up in other drugs, too. In San Francisco’s Bay Area in March, high doses of fentanyl were laced into counterfeit versions of the pain pill Norco. In January, fentanyl was found in illegal pills sold as oxycodone in New Jersey. And in late 2015, fentanyl turned up in fake Xanax pills in California.

This ubiquitous recipe-tinkering makes it impossible for users to know whether they’re about to take drugs mixed with fentanyl. And that uncertainty has proved deadly. National numbers are hard to come by, but in many regions around the United States, fentanyl-related fatalities have soared in recent years.

Maryland is one of the hardest-hit states. From 2007 to 2012, the number of fentanyl-related deaths hovered around 30 per year. By 2015, that number had grown to 340. A similar rise is obvious in Connecticut, where in 2012, there were 14 fentanyl-related deaths. In 2015, that number was 188.

In Massachusetts, two-thirds of people who died from opioid overdoses in the first half of 2016 showed signs of fentanyl. This wave of fentanyl-related overdoses is “horrendous,” says Daniel Ciccarone of the University of California, San Francisco. What’s worse, he says, “I think it’s here to stay.”

Fentanyl is not a new drug. Available in the 1960s, it is still used in hospitals as an anesthetic and is available by prescription to fight powerful pain. What’s new, Ciccarone says, is that clandestine drug manufacturers have discovered that the euphoria-producing opioid can be made cheaply and easily — no poppy fields necessary.

Fentanyl is about 30 to 40 times stronger than heroin and up to 100 times more powerful than morphine, which means that a given effect on the body can be achieved with a much smaller amount of fentanyl. Inadvertently taking a bit of fentanyl can cause big trouble. “It’s a dosing problem,” Nelson says. “Because the drug is so potent, little changes in measurements can have very big implications for toxicity. That’s really the problem.”
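Nelson's "dosing problem" is, at bottom, arithmetic: when a drug is roughly 100 times as potent, the same absolute measuring error becomes a roughly 100-fold larger relative error. A toy sketch, using only the approximate potency ratios quoted above (an illustration of the scaling, not pharmacological guidance):

```python
# Rough potency multipliers relative to morphine, per the figures above
# (illustrative assumptions only, not dosing guidance).
POTENCY_VS_MORPHINE = {"morphine": 1.0, "heroin": 3.0, "fentanyl": 100.0}

def equieffective_dose_mg(drug, morphine_dose_mg):
    """Dose of `drug` with roughly the same effect as a given morphine dose."""
    return morphine_dose_mg / POTENCY_VS_MORPHINE[drug]

# The effect of 10 mg of morphine needs only ~0.1 mg of fentanyl, so an
# error of a fraction of a milligram, trivial for morphine, can multiply
# a fentanyl dose several times over.
print(equieffective_dose_mg("fentanyl", 10.0))  # 0.1
```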

That problem is made worse by the variability of illegal drugs — users often don’t know what they’re buying. Illegal labs aren’t pumping out products with carefully calibrated doses or uniform chemical makeup. The drugs change from day to day, making it nearly impossible for a user to know what he or she is about to take, Ciccarone says.

He has seen this struggle up close. Drug users have told him that the products they buy are unpredictable. Another thing people are telling him: “That they and their friends and compatriots are dropping like flies.” Tellingly, some of the most experienced drug users have recently begun doing “tester shots,” small doses to get a sense of the type and dose of drug they’re about to use, Ciccarone says.

Users are right to be wary. Typically, opioids kill by gradually depressing a person’s ability to breathe. But illicit fentanyl, a recent study suggests, can kill within minutes by paralyzing muscles. Doctors have long known that when injected quickly, fentanyl can paralyze chest wall muscles, prevent breathing and kill a person rapidly. That effect, called “wooden chest,” might help explain the rise in fentanyl-related deaths, scientists report in the June Clinical Toxicology.

A quick injection of fentanyl “literally freezes the muscles and you can’t move the chest,” says toxicologist Henry Spiller of the Central Ohio Poison Center in Columbus. That’s why doctors who dispense fentanyl in the hospital intentionally proceed very slowly and keep the opioid-counteracting drug naloxone (Narcan) on hand. “If you give it too fast, we know this occurs,” Spiller says. But it wasn’t known whether this same phenomenon might help explain the death rate of people using the drug illegally.

Spiller and colleagues tested post-mortem concentrations of fentanyl and its breakdown product norfentanyl in 48 fentanyl-related deaths. The body usually begins breaking down fentanyl into norfentanyl within two minutes, an earlier study found. Yet in 20 of the cases, the researchers found no signs of norfentanyl, indicating death came almost immediately after first receiving fentanyl.

Naloxone can counteract the effects of opioids if someone nearby can administer the antidote. But for people whose chests quickly freeze from fentanyl, resuscitation becomes far less likely. Fentanyl “is just a bad drug,” Spiller says.

Fentanyl’s danger is magnified for people not accustomed to taking opioids, such as those addicted to cocaine, a situation illustrated by a recent tragedy in New Haven, Conn.

New Haven authorities noticed a string of suspicious overdoses in late June, leaving three people dead. Drug users thought they were buying cocaine, but the drugs contained fentanyl, says analytical toxicologist Kara Lynch of the University of California, San Francisco. As one of the handful of labs capable of testing blood and urine for fentanyl, hers was called on to identify the culprit. Her lab spotted fentanyl in Norco tablets back in March.

Lynch’s group uses high-resolution mass spectrometry to detect many drugs’ chemical signatures. But this method reveals only the drugs scientists suspect. “We can look for what we know to look for,” she says. And success depends on getting the samples in the first place.

The logistical hurdles of figuring out exactly what a person took, and how much, and when, are large. Ciccarone contrasts the situation with cases of food poisoning. When people start getting sick, public health officials can figure out what lettuce people ate and test it for pathogens. The same kind of tracking system doesn’t exist for drugs. His efforts to develop a system for testing illegal drugs in Baltimore broke down in part because no one had time to do the work. “The coroner is so busy right now with dead bodies,” he says. “They don’t have the time to test the ‘lettuce.’ ”

In the quest to curb fentanyl-related deaths, scientists and public health officials are searching for new strategies. Spiller advocates a more targeted public health message to users, one that emphasizes that fentanyl is simply a deadly drug, not just a more potent high. Ciccarone says that facilities where drug users can take illegal drugs under the care of medical personnel might reduce the number of fatalities.

For now, the scope of the problem continues to grow, Nelson says. The situation is made worse by the ingenuity of illicit drugmakers, who readily experiment with new compounds. Fentanyl itself can be tweaked to create at least 16 related forms, one of which, acetyl fentanyl, has been linked to overdose deaths. New drugs and new tweaks to old drugs rapidly evolve (SN: 5/16/15, p. 22), Nelson says, creating a game of whack-a-mole in which designer drugs confound public health officials and law enforcement.

“There is no single easy solution to this problem,” he says.

Water plays big role in shaping dwarf planet Ceres

Ice volcanoes, patches of water ice and a slew of hydrated minerals paint a picture of dwarf planet Ceres as a geologically active world — one where water has played a starring role. That’s the theme of six papers in the Sept. 2 Science that describe data collected by the Dawn spacecraft.

A 4-kilometer-high mountain dubbed Ahuna Mons, with its bowl-shaped summit and ridged flanks, has the appearance of a cryovolcano — one that erupts water instead of magma. The relatively young Oxo crater also appears to be home to splotches of frozen water. Given that ice should last only tens to hundreds of years on Ceres’ surface, the patches must be recent additions, possibly exposed by a landslide or impact with a meteorite. The surface is also slathered with a class of minerals known as phyllosilicates — silicon-bearing substances that form in the presence of water — which further support the idea that water has been present throughout Ceres’ history.

Ceres is the largest body between Mars and Jupiter. Dawn has been orbiting Ceres since March 6, 2015 (SN: 4/4/15, p. 9), studying its geology and composition to better understand the formation of rocky worlds.

Preteen tetrapods identified by bone scans

Better bone scanning of fossils offers a glimpse of preteen life some 360 million years ago.

Improved radiation scanning techniques reveal accumulating growth zones in chunks of four fossil upper forelimb bones from salamander-shaped beasts called Acanthostega, scientists report online September 7 in Nature. Vertebrate bones typically show annual growth zones diminishing in size around the time of sexual maturity. But there’s no sign of that slowdown in these four individuals from East Greenland’s mass burial of Acanthostega, says study coauthor Sophie Sanchez of Uppsala University in Sweden. They were still juveniles.

The bones came from tropical Greenland of the Devonian Period. Aquatic vertebrates were developing four limbs, which would serve tetrapods well when vertebrates eventually conquered land. This mass die-off doomed at least 20 individuals, presumably when a dry spell after a flood trapped them all in a big, vanishing puddle.

This find makes the strongest case yet for identifying genuine youngsters among ancient tetrapods, Sanchez says. She suspects the other trapped individuals could have been juveniles too.

Not many other species were found in the mass burial. So young tetrapods may have stuck together much as today’s young fish school, Sanchez speculates. The limb shape clearly indicates that the youngsters took a long time to start adding hard bone to the initial soft cartilage, she says. So these early tetrapods were at least 6-year-olds and probably 10 years old or more.

For identifying stages of life, the improved technique “allows greater resolution and rigor, so in that regard it is a plus,” says Neil Shubin of the University of Chicago, who studies a fossil fish called Tiktaalik that has some tetrapod-like features. There are Tiktaalik preteens, too, he notes.

What interests Nadia Fröbisch of Museum für Naturkunde in Berlin is that some of the Acanthostega individuals were different sizes but had reached the same stage of bone development. She muses that they might even have been developing along different trajectories of growth, a flexibility that would be useful in a changeable environment.

Primordial continental crust re-created in lab

New experiments have re-created the genesis of Earth’s first continents.

By putting the squeeze on water and oceanic rocks under intense heat, researchers produced material that closely resembles the first continental crust, created around 4 billion years ago. The work suggests that thick slabs of oceanic crust helped build the first continents: After plate tectonics pushed the thick slabs underground, the rocks melted, transformed and then erupted to the surface to make continents, the researchers report online August 31 in Geology.

This continental origin story relies on two characteristics that make Earth unlike other rocky planets in the solar system, says study coauthor Alan Hastie, a geologist at the University of Birmingham in England. Earth has both oceans and a network of shifting tectonic plates that can force sections of the planet’s exterior underground, a process known as subduction. “Without liquid oceans and without subduction from plate tectonics, you don’t get continents,” Hastie says. “The only reason I’m sitting here on land today is because of this process.”

The scenario proposed by Hastie and colleagues doesn’t necessarily require active plate tectonics to work, says geochemist Kent Condie of the New Mexico Institute of Mining and Technology in Socorro. Plate tectonics may have started hundreds of millions of years after the first continental crust formed. If thick enough, oceanic crust could have sunk deep enough on its own to create continental crust without the need for subduction, Condie says. “We shouldn’t make the assumption that we need subduction.”

Just after Earth formed, only oceanic crust and stacks of volcanic rock coated the planet’s surface. Continental crust — which is made of less dense rock than oceanic crust and therefore rises to higher elevations — came perhaps hundreds of millions of years later. The oldest continental crust still around today, found in Greenland, dates back to about 4 billion years ago.

Re-creating the formation of the earliest continental crust involves a lot of trial and error. Scientists compress bits of rock at high temperatures and pressures that mimic the sinking of various rock types into the planet’s depths. The rocks transform into different minerals under the intense heat and pressure. The goal is to create rock that looks like ancient continental crust. Using this “cook and look” method, scientists have gotten a few decent matches, but never anything that perfectly replicated the first continents.

Last year, Hastie and colleagues reported finding Jamaican rocks that closely resembled early continental crust, only much younger. The researchers wondered whether the nearby Caribbean Ocean Plateau was partially to blame for the odd rocks. Ocean plateaus are slabs of oceanic crust thickened by hot plumes of material that rise from Earth’s depths. This thick crust, while somewhat rare today, was probably more common billions of years ago when Earth’s interior was much hotter, Hastie says.

Using a special press, the researchers squeezed and melted small samples of water and ocean plateau rock at pressures of up to 2.2 gigapascals — equivalent to three adult African elephants stacked on a postage stamp — and at temperatures up to 1,000° Celsius. These extreme conditions imitate the fate of a chunk of ocean plateau around 30 to 45 kilometers thick forced deep underground.

The experiment transformed the water and rock into a dead ringer for the oldest known continental crust. Once created underground, the new crust would have erupted to the surface via volcanism and formed the forerunners of the modern continents, Hastie says.

Old-school contraptions still work for weighing astronauts

New method to measure mass in space devised — A scale for measuring weight in space that does not depend upon the attraction of gravity has been devised…. In [William Thornton’s] method, the weight of the mass is determined [by] mechanically oscillating a weight in a tray. The heavier the mass, the slower the oscillation rate. The scale is tied to an electronic unit measuring the time required for five cycles of oscillation. A reference to a chart gives the mass’s weight. — Science News, October 1, 1966

UPDATE
Not much has changed. The International Space Station has two spring-based contraptions for weighing in astronauts. An individual rides the Body Mass Measurement Device like a pogo stick — in four or five bounces, it calculates weight. The Space Linear Acceleration Mass Measurement Device uses springs to pull an astronaut; the acceleration reveals weight. In 2012, researchers in Europe experimented with compact computer imaging technology — developed for video games — using photos to estimate mass based on a person’s shape and size.
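Thornton's oscillation trick drops out of the spring-mass period formula, T = 2π√(m/k): the more mass on the spring, the longer each cycle takes. A minimal sketch of the inversion, with a made-up spring constant and timings rather than the real hardware's values:

```python
import math

def mass_from_period(period_s, spring_k=600.0):
    """Invert T = 2*pi*sqrt(m/k) to get m = k * (T / (2*pi))**2.
    spring_k (N/m) is an illustrative value, not the ISS device's."""
    return spring_k * (period_s / (2 * math.pi)) ** 2

def mass_from_five_cycles(total_time_s, spring_k=600.0):
    """Average over five cycles, as the 1966 device did, then invert."""
    return mass_from_period(total_time_s / 5, spring_k)

# A heavier rider oscillates more slowly, so a longer period
# means more mass:
print(mass_from_period(2.0) < mass_from_period(2.2))  # True
```

Timing several cycles and averaging, as both the 1966 scale and the station's pogo-stick-style device do, smooths out errors in catching the start and end of any single oscillation.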