New process encourages ice to slip, slide away

Ice removal may soon become a lot easier. Researchers have developed a new method for making ice-phobic surfaces by altering the density and slipperiness of spray-on polymer coatings.

The process, reported online March 11 in Science Advances, could lead to a wide range of long-lasting ice-repellent products including windshields, airplane wings, power cables and frozen food packaging, researchers say.

Scientists know that ice detaches more easily from softer, less dense materials. Anish Tuteja, a materials science engineer at the University of Michigan in Ann Arbor, and colleagues found that adjusting the density of the rubber polymers used to make the coatings and adding silicone or other lubricants, such as vegetable oil, cod-liver oil and safflower oil, amplifies the effect.
In multiple laboratory and field tests, ice slid off treated surfaces under its own weight or when it was pushed by mild wind. The researchers further tested the coatings’ durability on various surfaces such as metal license plates and glass panes. The coatings performed well through two Michigan winters and retained their ice-repelling properties after controlled exposure to icing and heat cycles, corrosive substances such as hydrochloric acid, and wear and tear.

The process has already yielded more than 100 different coatings tailored for specific surfaces, including metal, glass, wood, plastic and cardboard. Tuteja says his team is working on licensing the materials for commercial use.

Racing for answers on Zika

Sometimes science does not move fast enough, despite much hard work and effort. That’s true in the case of the Zika virus outbreak currently marching through the Americas. As we report in a collection of stories, much remains unclear, including the relationship between Zika infection and microcephaly and how best to combat the mosquitoes that spread the disease. So far, however, evidence does suggest that this little-known (and previously largely ignored) virus may indeed target the nervous system, probably triggering Guillain-Barré syndrome in a small percentage of patients. The virus could even pose as-yet undiscovered health risks that may take years to untangle. While scientists will no doubt eventually be able to answer many of the public’s pressing questions, it may be too late for many.
The human microbiome — the collection of bacteria and other microorganisms that live in and on us — is no global emergency, but the public is apparently just as impatient with science’s progress on practical advice about it. This issue features two articles about new results from this hot field. Laura Sanders details surprising ways that gut microbes can meddle with the brain, hinting that certain microbial mixes may influence depression and other mental disorders. And Meghan Rosen describes the microbiome’s role in malnutrition, suggesting that resetting children’s microbes may be a useful treatment. It’s hard not to conclude that manipulating the bacteria in your body could offer a path to better health and happiness.
Judging from the shelves at Whole Foods, that is what many makers of probiotic supplements would like you to believe. And it may well turn out to be true — studies have linked the microbiome to metabolic and digestive issues such as obesity, irritable bowel syndrome and inflammatory bowel disease. But science hasn’t yet come up with broad recommendations for the best ways to tend your personal microfloral garden. And since the Food and Drug Administration regulates supplements as foods, not as medicines, probiotic pills may vary in quality and even in actual ingredients; makers don’t have to prove that probiotics are safe or effective.

Notably, none of the researchers that Sanders asked while reporting “Microbes and the mind” said that they regularly take probiotic supplements. They also said that any effects on the brain, while fascinating, are probably subtle for most people — otherwise you’d notice a mood change every time you took antibiotics. In Rosen’s story about malnutrition, researcher François Leulier says: “We can envision some therapy solutions, but we’re still at the basic research level.” It’s just too early to start megadosing, he says, even for very sick kids.

To fill in the gap, people look to anecdote. Or, sometimes knowingly, they engage in uncontrolled self-experiments with an N of 1, fueled by the Internet (see the website Quantified Self) and DIY culture. The data gleaned from these personal trials may help individuals, but they can’t answer big questions.

An eager public — and intriguing science — is propelling microbiome research along. Zika research is sprinting after an elusive and mysterious foe, trying to stop the damage from the virus and learn from a vast natural experiment. In both cases, science must move more swiftly if it is to catch up.

Quake risk in parts of central U.S. as high as in fault-filled California

Northern Oklahoma is just as susceptible to a damaging earthquake within the next year as the most quake-prone areas of California. That’s because earthquakes are no longer just a natural hazard, the U.S. Geological Survey says. In its new quake hazards forecast released March 28, the agency for the first time has included artificially triggered seismicity.

An increased risk in the central United States largely stems from sites where fluids, such as wastewater from fracking, are injected underground (SN: 8/9/14, p. 13). Rising fluid pressure underground can unclamp faults and unleash earthquakes (SN: 7/11/15, p. 10). From 1973 to 2008, an average of 24 potentially damaging quakes rattled the central United States each year. From 2009 to 2015, an uptick in fracking activity helped send that number skyrocketing to an average of 318 quakes per year, with a record-setting 1,010 tremors in 2015 alone. Around 7 million people currently live and work in central and eastern U.S. areas vulnerable to shaking from quakes roughly above magnitude 2.7, USGS scientists estimate.

Human-caused quakes aren’t as powerful as their natural counterparts (the strongest induced quake in the United States clocked in at magnitude 5.6 in 2011 compared with the magnitude 7.8 San Francisco temblor in 1906, for instance). But the potential for more powerful shakes exists, the scientists warn. The new hazard assessment should help regulators revise building codes to better prepare for the rising risk.
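For a rough sense of the gap between those two magnitudes, the standard energy-magnitude relation (radiated energy grows as 10 to the power of 1.5 times the magnitude) implies the 1906 quake released on the order of 2,000 times more energy than the 2011 induced quake. A minimal sketch of that arithmetic, which is a generic seismology rule of thumb rather than anything from the USGS assessment:

```python
# Rule-of-thumb comparison using the Gutenberg-Richter energy-magnitude
# relation, log10(E) ~ 1.5*M + constant; the constant cancels in a ratio.
def energy_ratio(m_big, m_small):
    return 10 ** (1.5 * (m_big - m_small))

print(f"{energy_ratio(7.8, 5.6):.0f}x")  # roughly 2,000 times more energy
```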

New habitat monitoring tools find hope for tigers

There’s still enough forest left — if protected wisely — to meet the goal of doubling the number of wild tigers (Panthera tigris) by 2022, says an international research team.

That ambitious target, set by a summit of 13 tiger-range nations in 2010, aims to reverse the species’ alarming plunge toward extinction. Forest loss, poaching and dwindling prey have driven tiger numbers below 3,500 individuals.

The existing forest habitat could sustain the doubling if, for instance, safe-travel corridors connect forest patches, according to researchers monitoring forest loss with free, anybody-can-use-’em Web tools. Previously, habitat monitoring was piecemeal, in part because satellite imagery could be expensive and required special expertise, says Anup Joshi of the University of Minnesota in St. Paul. But Google Earth Engine and Global Forest Watch provide faster, easier, more consistent ways to keep an eye out for habitat losses as small as 30 meters by 30 meters (the space revealed in a pixel).
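For a flavor of how such monitoring works in practice, here is a minimal sketch using the Google Earth Engine Python API with the Hansen Global Forest Change layers that underlie Global Forest Watch. The asset ID, band name and rectangle below are illustrative assumptions, not details taken from the tiger study:

```python
import ee  # Google Earth Engine Python API (requires an authenticated account)

ee.Initialize()

# Hansen Global Forest Change data, mapped at roughly 30-meter resolution.
# Asset ID assumed here; newer releases use different version strings.
gfc = ee.Image('UMD/hansen/global_forest_change_2015')
loss = gfc.select('loss')  # 1 where forest was lost since 2000, else 0

# Hypothetical rectangle standing in for one tiger landscape.
region = ee.Geometry.Rectangle([84.0, 27.0, 85.0, 28.0])

# Sum the area (in square meters) of all 30 m loss pixels inside the region.
loss_area = loss.multiply(ee.Image.pixelArea()).reduceRegion(
    reducer=ee.Reducer.sum(), geometry=region, scale=30, maxPixels=1e9)

print(loss_area.getInfo())
```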
Over 14 years of data, the 76 major tiger landscapes altogether lost less than 8 percent of their forest, the researchers report April 1 in Science Advances. Finding so little loss is “remarkable and unexpected,” they write. But 10 of those landscapes account for most of the losses — highlighting the challenges conservationists, and tigers, face.

Possible source of high-energy neutrino reported

Scientists may have found the cosmic birthplace of an ultra-high energy neutrino. They point the finger at a blazar — a brilliantly luminous galaxy that shoots a jet of radiation in the direction of Earth — 9 billion light years away.

If the link between the blazar and neutrino is real, scientists would be closer to long-sought answers about where such power-packing particles come from. Violent astronomical accelerators boost some neutrinos to high energies, but scientists have never been able to convincingly identify their sources.
Neutrinos are aloof elementary particles that rarely interact with other matter — they can sail straight through the Earth, and trillions of them zip through your body every second without a trace. On December 4, 2012, the neutrino in question (which scientists have affectionately nicknamed Big Bird) slammed into the Antarctic ice with an energy of around 2 million billion electron volts. The neutrino observatory IceCube glimpsed the aftermath of the collision and measured its energy with sensitive detectors embedded deep in the ice (SN Online: 04/07/14), leaving scientists hustling to pinpoint its source.

The blazar flared up at just the right time and place to be a prime suspect, researchers report in a paper accepted for publication in a peer-reviewed journal. The result, now available online at arXiv.org, strengthens the case that blazars are the source of such high-energy neutrinos, but it is no smoking gun.

After the neutrino was detected, a team of astrophysicists scoured the heavens for energetic galaxies with TANAMI, short for Tracking Active Galactic Nuclei with Austral Milliarcsecond Interferometry, a network of telescopes peering into space at a variety of wavelengths. That team reported one likely candidate blazar.

But the candidate is not a surefire match, says IceCube leader Francis Halzen of the University of Wisconsin–Madison, who was not involved with the analysis. IceCube could determine the neutrino’s direction only to within about 15 degrees on the sky, and the blazar flare-up continued for several months. The probability of such a chance concurrence between an unrelated neutrino and blazar is about 5 percent, the researchers say — too big to rule out coincidence. “It’s a very intriguing result,” says Halzen, “but it’s not a proof.”

The matchup between the blazar and neutrino is noteworthy, even though the researchers can’t fully rule out the possibility that the match is a fluke, says astrophysicist Xiang-Yu Wang of Nanjing University in China, who was not involved with the research. “Given that the two events are very unique … I think it’s convincing.” Wang and colleagues have expanded on the result: In a paper accepted for publication in Physical Review Letters, they use the difference in arrival time between the neutrino and light from the blazar’s outburst — assuming the two are related — to test Einstein’s special and general theories of relativity. Certain theories of quantum gravity predict a delay in the arrival of a neutrino. (Einstein came out unscathed.)
The authors of the blazar study declined to comment on the result, citing the embargo policy of the journal where the paper will be published.

To convincingly identify a blazar as the source of a neutrino, Halzen says, scientists will need a better measurement of the neutrino’s direction, connected to a short-lived blazar outburst. In the future, Halzen says, IceCube will send out “astronomical telegrams” when it detects a neutrino, directing telescopes to take a look, perhaps catching a blazar in the act.

EPA underestimates methane emissions

The U.S. Environmental Protection Agency has a methane problem — and that could misinform the country’s carbon-cutting plans. Recent studies suggest that the agency’s reports fail to capture the full scope of U.S. methane emissions, including “super emitters” that contribute a disproportionate share of methane release. Those EPA reports influence the country’s actions to combat climate change and the regulation of methane-producing industries such as agriculture and natural gas production.

With EPA’s next annual methane report due to be published by April 15, early signs suggest that the agency is taking steps to fix the methane mismatch. A preliminary draft of the report revises the agency’s methane calculations for 2013 — the most recent year reported — upward by about 27 percent for the natural gas and petroleum sectors, a difference of about 2 million metric tons.
Yet it’s unclear how that and other revisions will factor into final methane emission totals in the upcoming report. The draft estimates that U.S. methane emissions from all sources in 2014 were about 28 million metric tons, up slightly from the revised estimate for 2013 and well above the original 2013 estimate of 25.453 million metric tons. But the totals in the draft don’t include updates to emission estimates from the oil and gas industry.
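As a back-of-envelope consistency check on the oil-and-gas revision above (these are inferences, not numbers from the draft), a 27 percent upward revision amounting to about 2 million metric tons implies a pre-revision sector estimate of roughly 7 to 8 million metric tons:

```python
revision_fraction = 0.27   # ~27 percent upward revision for oil and gas
revision_amount = 2.0      # ~2 million metric tons of methane

pre_revision = revision_amount / revision_fraction   # ~7.4 million metric tons
post_revision = pre_revision + revision_amount       # ~9.4 million metric tons
print(round(pre_revision, 1), round(post_revision, 1))
```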
“EPA is reviewing the substantial body of new studies that have become available in the past year on the natural gas and petroleum sector,” says EPA spokesperson Enesta Jones. The agency is also gathering feedback from scientists and industry experts to further improve its reporting.

Methane, which makes up the bulk of natural gas, originates from natural sources, such as wetlands, as well as from human activities such as landfills, cattle ranches (SN: 11/28/15, p. 22) and the oil and gas industry. Globally, human activities release about 60 percent of the 600 million metric tons of methane emitted into the atmosphere each year. Once in the air, methane prevents some of Earth’s heat from escaping into space, causing a warming effect. Methane emissions currently account for about a quarter of human-caused global warming.

The EPA’s underestimation of U.S. methane emissions comes down to accounting. EPA samples emissions from known methane sources, such as cows or natural gas pipelines, and works out an average. That average is then multiplied by the nation’s number of cows, lengths of pipe and other methane sources. Results from this method disagree with satellite and land-based observations that measure changes in the total amount of methane in the air. A 2013 report in the Proceedings of the National Academy of Sciences found that U.S. methane emissions based on atmospheric measurements are about 50 percent larger than EPA estimates (SN Online: 11/25/13).
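In other words, the official inventory is built bottom-up: an average emission rate per source is multiplied by a count of sources and summed over categories, while atmospheric studies work top-down from what is actually in the air. A minimal sketch of that bookkeeping, with entirely made-up numbers, shows how the two can diverge when sampled averages miss unusually leaky sources:

```python
# Bottom-up inventory: (average emissions per source) x (number of sources),
# summed over categories. All numbers are illustrative placeholders.
emission_factors = {   # metric tons of methane per source per year
    'dairy_cow': 0.1,
    'km_of_gas_pipeline': 0.5,
    'landfill': 1000.0,
}
source_counts = {
    'dairy_cow': 9_000_000,
    'km_of_gas_pipeline': 500_000,
    'landfill': 2_000,
}

bottom_up = sum(emission_factors[k] * source_counts[k] for k in emission_factors)

# Top-down estimate from atmospheric measurements (also a placeholder);
# the PNAS study put it roughly 50 percent above the inventory.
top_down = 1.5 * bottom_up

print(f"bottom-up: {bottom_up:,.0f} t/yr   top-down: {top_down:,.0f} t/yr")
```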
EPA’s reports don’t just misjudge the scale of emissions, they also miss the long-term trend, recent work suggests. EPA reported that U.S. methane emissions remained largely unchanged from 2002 to 2014. But researchers report online March 2 in Geophysical Research Letters that emissions of the greenhouse gas rose more than 30 percent over that period. The United States could be responsible for as much as 30 to 60 percent of the global increase in methane emissions over the last decade, the study’s authors conclude. “We’re definitely not a small piece of that pie,” says Harvard University atmospheric scientist Alex Turner, who coauthored the study.
Correctly tracking methane is important, Turner says, because over a 100-year period, the warming impact of methane is more than 25 times that of the same amount of CO2. Methane levels have also risen faster: Since the start of the industrial revolution, methane concentrations have more than doubled while CO2 has risen more than 40 percent.

While methane is more potent than CO2, the atmosphere holds roughly 200 times more CO2 than methane. Furthermore, methane stays in the atmosphere for only around 12 years before being absorbed by soil or breaking apart in chemical reactions. “If we reduce methane emissions, the climate responds very quickly and global warming would slow down almost immediately,” says Cornell University earth systems scientist Robert Howarth. “CO2, on the other hand, has an influence that will go hundreds to even thousands of years into the future.”

Turner and colleagues tracked methane across the continental United States using land stations that measure methane in the air and satellite observations that record dips in the infrared radiation frequencies absorbed and reemitted by methane. The researchers compared these methane measurements with those taken over Bermuda and the North Pacific Ocean — places upwind of the United States and far from major methane sources.

From 2002 through 2014, methane concentrations over the continental United States grew faster than those over the oceans, the researchers found. The difference was most pronounced over the central United States, where methane concentrations rose nearly twice as fast as in the rest of the country. Natural gas drilling and production boomed in the central United States during the period studied, though the researchers could not precisely trace the source of the additional methane.
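A stripped-down version of that comparison fits a linear trend to the continental record and to the upwind background, then differences the slopes. The numbers below are synthetic stand-ins for the real station and satellite data:

```python
import numpy as np

years = np.arange(2002, 2015)

# Synthetic annual-mean methane concentrations in parts per billion,
# standing in for the upwind (Bermuda / North Pacific) and continental records.
background = 1770 + 2.0 * (years - 2002)
continental = 1790 + 3.5 * (years - 2002)

bg_slope = np.polyfit(years, background, 1)[0]    # ppb per year
us_slope = np.polyfit(years, continental, 1)[0]

print(f"excess U.S. trend: {us_slope - bg_slope:.1f} ppb/yr")
# A positive excess trend points to growing emissions within the country,
# over and above the global rise already carried in by upwind air.
```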

Turner and colleagues say they’re now working with EPA to reconcile the methane estimates. EPA will provide small-scale estimates of methane emissions down to a 10-kilometer-wide grid. By combining that grid with space and land observations, scientists should be able to isolate where methane mismatches are the most pronounced.

While Turner’s research can’t pinpoint the exact origins of the additional methane, other studies point to the oil and gas industry. The numbers that the EPA uses to tabulate methane emissions assume that equipment is functioning as intended, says Stanford University sustainability engineer Adam Brandt. Malfunctioning equipment can spew huge amounts of methane. That became abundantly – and visibly – clear last October when the largest U.S. methane leak in history began in an underground storage facility near Los Angeles. The leak released 97,100 metric tons of methane, equivalent to the annual greenhouse gas emissions of 572,000 cars, before being permanently sealed in February, researchers estimated in the March 18 Science.
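The cars comparison is a unit conversion: methane tonnage times a global warming potential gives CO2-equivalent tonnage, which is then divided by a typical car's annual emissions. A back-of-envelope version, where the warming potential and per-car figure are assumptions rather than values from the Science paper:

```python
leak_ch4_tons = 97_100       # metric tons of methane from the Los Angeles-area leak
gwp_100yr = 28               # assumed 100-year global warming potential of methane
car_co2_tons_per_year = 4.7  # assumed annual CO2 emissions of an average car

co2_equivalent = leak_ch4_tons * gwp_100yr          # ~2.7 million metric tons CO2e
car_years = co2_equivalent / car_co2_tons_per_year  # on the order of 580,000 car-years
print(f"{co2_equivalent:,.0f} t CO2e, about {car_years:,.0f} car-years")
```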

Super methane emitters are a big problem elsewhere, too, though typically on a much smaller scale than the California leak, researchers report in the June 2016 Environmental Pollution. Surveying emissions from 100 natural gas leaks around Boston, the researchers found that 7 percent of leaks contributed half of the total methane released. In 2014, a different research team reported in Environmental Science & Technology that 19 percent of pneumatic controllers used at natural gas production sites accounted for 95 percent of all controller emissions.

Monitoring and quick repairs can stamp out rogue methane sources quickly, Brandt says. “This is a problem that’s easier to fix than it is to understand,” he says.

Nightshade plants bleed sugar as a call to ants for backup

Herbivores beware: Take a bite out of bittersweet nightshade (Solanum dulcamara), and you might have an ant problem on your hands. The plants produce a sugary goo that serves as an indirect defense, attracting ants that eat herbivores, Tobias Lortzing of Berlin’s Free University and colleagues write April 25 in Nature Plants.

Observations of wild nightshade plants in Germany suggest that plants that ooze goo attract more ants (mostly European fire ants, or Myrmica rubra) than undamaged plants. In greenhouse experiments, those ants fed on both the goo and roving slugs and flea beetle larvae, substantially reducing leaf damage. Leaf-munching adult flea beetles and, to a lesser degree, slugs prompted the goo production. The ants didn’t attack the beetles but did protect the plant from slugs and beetle larvae.

Plenty of other plants produce defensive nectars via organs called nectaries, and nightshades’ bleeding may be a unique, primitive version of that protective strategy, the scientists report.

Why Labrador retrievers are obsessed with food

Labrador retrievers tend to be more prone to obesity and more eager to scarf down their kibble than other dog breeds. Eleanor Raffan of the University of Cambridge and her colleagues chalk this trend up — at least in part — to a suspect gene.

The team found that, among a small group of assistance dogs, a form of a gene called POMC that was missing a chunk of DNA was more common in obese Labs than in lean ones. This held true on a larger scale, too: Of 411 Labs in the United Kingdom and United States, 22 percent carried the deletion mutation. Across other breeds, only Labradors and flat-coated retrievers, a close relative, carried the gene variant, which also correlated with greater weight and food-begging tendencies, the team reports May 3 in Cell Metabolism.

POMC plays a role in a metabolic pathway, and the deletion may inhibit the production of proteins that regulate hunger, the researchers suspect. (That might explain why the variant turned up in about 75 percent of assistance dogs, which are trained using food motivation.)

Here are a few more things for the childproofing list

There’s nothing like having kids to open your eyes to the world’s dangers. With two little rascals in tow, grocery stores, dentists’ offices and even grandparents’ homes morph into death traps full of sharp, poisonous and heavy things. Short of keeping a tight grip on little hands, there’s not much you can do to childproof absolutely everything when you’re out and about. At home, it’s easier to make rooms safe for kids: Cover electrical outlets, keep drugs and potentially poisonous stuff out of reach, bolt dressers to the wall, and so on.

But every so often, I come across a study that points out an unexpectedly dangerous object. Clearly, none of these things rise to Bag O’Glass danger levels. But in the spirit of The More You Know, here are five objects that carry hidden risks to children:

Laundry pods
These cute, candy-colored packets can be irresistible to children — and toxic when eaten. Since 2012, when single-load pods for laundry detergent became popular, poison control centers have been fielding calls about toddlers who got ahold of pods. From 2013 to 2014, over 22,000 U.S. children under age 6 were exposed to these pods, mostly by eating them, data from the National Poison Data System show. And in just that two-year period, cases of laundry pod exposure rose 17 percent, scientists reported in the May Pediatrics.

Those numbers are particularly worrisome because laundry pods appeared to be more dangerous than regular laundry detergent (liquid or powder) and dishwasher detergent in any form (pod, liquid or powder). In a small number of kids, eating laundry pods caused serious trouble, including coma, respiratory arrest and cardiac arrest. Two children died, scientists wrote in the Pediatrics paper.

Tiny turtles
Oh, they’re adorable, but turtles can carry salmonella, bacteria that come with diarrhea, fever and cramps. Kids are particularly susceptible, and infections can be severe for them. Recognizing this risk, the FDA banned the sale of small turtles (shell less than 4 inches long) in 1975.
Yet in recent years, small turtles have slowly crawled back into children’s grubby little hands, carrying salmonella with them, scientists reported in January in Pediatrics. From 2011 to 2013, turtles were implicated in eight multistate Salmonella outbreaks that hit children younger than 5 especially hard; among the 473 people affected by the outbreaks, the median age was 4.

Big TVs
I’m not talking about the dangers of screen time here. I mean the television itself. Today’s flat screen TVs are more wobbly than the older, heavier tube-based TVs. Every 30 minutes, a kid is treated in the emergency room for a TV-related injury — that’s more than 17,000 children in the United States per year and increasing. And little heads and necks are the most frequently injured body parts.

Liquid nicotine
Along with the rise of e-cigarettes come refill cartridges, most of which contain concentrated liquid nicotine in flavors such as cherry crush, vanilla and mint. These appealing flavors mask nicotine that can be dangerous to kids. In 2015, poison control centers reported over 3,000 incidents of unintentional nicotine exposure, many of them in children. In comparison, just 271 exposures were reported in 2011.

That worrisome increase prompted the Child Nicotine Poisoning Prevention Act of 2015, signed into law by President Obama on January 28, requiring nicotine cartridges to be packaged in child-proof containers — a no-brainer.

Trampolines
Maniacal bouncing is clearly exhilarating for children, but also risky. I say this as a childhood-double-bounce survivor, so I understand the appeal. But just a note of caution: These springy injury machines come with a constellation of scary medical stats. Concussions, broken bones, sprains and neck injuries are signature trampoline troubles. A survey of a national injury database showed that broken bones accounted for 29 percent of all trampoline injuries reported to emergency departments, scientists reported in 2014 in the Journal of Pediatric Orthopaedics. The vast majority (93 percent) of those fractures occurred in children 16 and under.

Attempts to make trampolines safer — by putting a net around the perimeter, for instance — don’t seem to lower injury rates, an Australian study found. That’s why the American Academy of Pediatrics, the Canadian Paediatric Society, the American Academy of Orthopaedic Surgeons and other groups all urge caution, or an outright ban.

Space experts say sending humans to Mars worth the risk

WASHINGTON — There’s a long-standing joke that NASA is always 20 years from putting astronauts on Mars. Mission details shared at a recent summit show that the space agency is right on schedule. A to-do list from 2015 looks remarkably similar to one compiled in 1990. One difference: NASA is now building a rocket and test-driving technologies needed to get a crew to Mars. But the specifics for the longest road trip in history — and what astronauts will do once they arrive — remain an open question.

“Are we going to just send them there to explore and do things that we could do robotically though slower, or can we raise the bar?” asked planetary scientist Jim Bell during the Humans to Mars summit. “We need to make sure that what these folks are being asked to do is worthy of the risk to their lives,” said Bell, of Arizona State University in Tempe.
The three-day symposium, which ended May 19, was organized by Explore Mars Inc., a nonprofit dedicated to putting astronauts on Mars by the 2030s.

While the summit didn’t break new scientific ground, it did bring together planetary scientists, space enthusiasts and representatives from both NASA and the aerospace industry to talk about the challenges facing a crewed mission to Mars and rough ideas for how to get there.

Part of the appeal in sending humans is the pace of discovery. Drilling just one hole with the Curiosity rover, which has been exploring Gale Crater on Mars since August 2012 (SN: 5/2/2015, p. 24), currently takes about a week. “It’s a laborious, frustrating, wonderful — frustrating — multiday process,” said Bell.

Humans also can react to novel situations, make quick decisions and see things in a way robotic eyes cannot. “A robot explorer is nowhere near as good as what a human geologist can do,” says Ramses Ramirez, a planetary scientist at Cornell University. “There’s just a lot more freedom.”

Researchers saw the human advantage firsthand in 1997 when they sent a rover called Nomad on a 45-day trek across the Atacama Desert in Chile. Nomad was controlled by operators in the United States to simulate operating a robot on another planet. Humans at the rover site provided a reality check on the data Nomad sent back. “There was a qualitative difference,” says Edwin Kite, a planetary scientist at the University of Chicago. And it wasn’t just that the geologists could do things faster. “The robots were driving past evidence of life that humans were finding very obvious.”
To get astronauts ready to explore Mars, the Apollo program is a good template, said Jim Head, a geologist at Brown University who helped train the Apollo astronauts. “Our strategy was called t-cubed: train them, trust them and turn them loose.” While each of the moon expeditions had a plan, the astronauts were trusted to use their judgment. Apollo 15 astronaut David Scott, for example, came across a chunk of deep lunar crust that researchers hoped to find although it wasn’t at a planned stop. “He spotted it three meters away,” said Head. “He saw it shining and recognized it immediately. That’s exploration.”

Despite a lack of clear goals for a jaunt to Mars, NASA is forging ahead. The Orion crew capsule has already been to space once; a 2014 launch atop a Delta IV Heavy rocket sent an uncrewed Orion 5,800 kilometers into space before it splashed down in the Pacific Ocean (SN Online: 12/5/2014). And construction of the Space Launch System, a rocket intended to hurl humans at the moon and Mars, is under way. The first test flight, scheduled for October 2018, will send Orion on a multiday uncrewed trip around the moon. NASA hopes to put astronauts onboard for a lunar orbit in 2021.

Meanwhile, the crew aboard the International Space Station is testing technologies that will keep humans healthy and happy during an interplanetary cruise. Astronaut Scott Kelly recently completed a nearly yearlong visit to the station intended to reveal the effects of long-duration space travel on the human body (SN Online: 2/29/2016). And on April 10, a prototype inflatable habitat — the Bigelow Expandable Activity Module — arrived at the station and was attached to a docking port six days later. The station crew will inflate the module for the first time on May 26. No one will live in it, but over the next two years, astronauts will collect data on how well the habitat handles radiation, temperature extremes and run-ins with space debris.
Beyond that, the plans get fuzzy. The general idea is to construct an outpost in orbit around the moon as a testing and staging ground starting in the late 2020s. The first crew to Mars might land on the planet — or might not. One idea is to set up camp in Mars orbit; from there, astronauts could operate robots on the surface without long communication delays. Another idea has humans touching down on one of Mars’ two moons, Phobos or Deimos. When crews do land on the Martian surface, NASA envisions establishing a base from which astronauts could plan expeditions.

With so few details, it’s difficult for the space agency to identify specific technologies to invest in. “There have been lots of studies — we get a lot of grief that it’s nothing but studies,” said Bret Drake, an engineer at the Aerospace Corp. in El Segundo, Calif. “But out of the studies, there are a lot of common things that come to the top no matter what path you take.”

Any mission to Mars has to support astronauts for roughly 500 to 1,000 days. The mission has to deal with round-trip communication delays of up to 42 minutes. It will need the ability to land roughly 40-ton payloads on the surface of Mars (current robotic missions drop about a ton). Living off the land is also key, making use of local water and minerals. And astronauts need the ability to not just survive, but drive around and explore. “We want to land in a safe place, which is going to be geologically boring, but we want to go to exciting locations,” said Drake.
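The 42-minute figure is simply light travel time: near its most distant from Earth, Mars sits roughly 2.5 astronomical units away, so a radio signal needs about 21 minutes each way. A quick check, with the 2.5 AU separation as an approximate worst case rather than a mission parameter:

```python
AU_KM = 149_597_870.7     # kilometers per astronomical unit
C_KM_PER_S = 299_792.458  # speed of light in km/s

distance_au = 2.5         # rough Earth-Mars separation near solar conjunction
one_way_s = distance_au * AU_KM / C_KM_PER_S
print(f"one way: {one_way_s / 60:.0f} min, round trip: {2 * one_way_s / 60:.0f} min")
# about 21 minutes one way, about 42 minutes round trip
```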

The technical and logistical challenges might be the easiest part. “We do know enough to pull this off,” Ramirez says. “The biggest problem is political will.” Congress has yet to sign off on funding this adventure (nor has NASA presented a budget — expected to be in the hundreds of billions of dollars), and future administrations could decide to kill it.

Multiple summit speakers stressed the importance of using technology that is proven or under development — no exotic engines or rotating artificial gravity habitats for now. And a series of small missions — baby steps to the moon and an asteroid before committing to Mars — could show progress that might help keep momentum (and public interest) alive.

“We thought going to the moon was impossible, but we got there,” says Ramirez. “If we dedicate ourselves as a nation to do something crazy, we’ll do it. I have no doubt.”