Court dismisses appeal in National Security Law for HK violation case

The Court of Final Appeal in the Hong Kong Special Administrative Region (HKSAR) on Tuesday dismissed the appeal of a former university student who had pleaded guilty to violating the National Security Law (NSL) for Hong Kong, a ruling that experts described as "instructive, authoritative, and binding" and as upholding the spirit of the rule of law.

Lui Sai-yu, a former student at Hong Kong Polytechnic University, was sentenced to five years in prison in April 2022 by the District Court of the HKSAR after pleading guilty to "inciting others to commit secession." Lui did not accept the decision and lodged an appeal. On November 30, 2022, the High Court of the HKSAR dismissed the appeal and upheld the sentence.

The District Court of the HKSAR had set the starting point for Lui's prison term at five years and six months, then deducted six months to reflect his guilty plea, RTHK reported on Tuesday.

Normally a defendant who pleads guilty receives a one-third reduction, but the six-month reduction was the maximum allowed under the NSL for Hong Kong, which specifies that those who commit a serious secession offense shall serve a sentence of at least five years but no more than 10 years, the report said.
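
The arithmetic here is simple but worth making explicit: a one-third discount on a 66-month starting point would be 22 months, yet the five-year statutory floor for serious cases leaves only six months of headroom. The sketch below is purely illustrative of that capping logic as described in the report, not a statement of the court's own reasoning; the variable names are ours.

```python
# Illustrative sketch of the sentencing arithmetic described above (values in months).
STARTING_POINT = 66        # 5 years 6 months, the District Court's starting point
STATUTORY_MINIMUM = 60     # NSL floor for a "serious" secession offense

ordinary_discount = STARTING_POINT // 3              # usual one-third for a guilty plea: 22 months
headroom = STARTING_POINT - STATUTORY_MINIMUM        # room above the floor: 6 months

applied_discount = min(ordinary_discount, headroom)  # the statutory floor caps the discount
sentence = STARTING_POINT - applied_discount

print(f"sentence: {sentence // 12} years {sentence % 12} months")  # 5 years 0 months
```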

In Tuesday's judgment, the judges rejected the appellant's argument that five years should have been the starting point for sentencing - which would have allowed actual sentences to fall below that threshold, according to the report.

This case is of special significance in determining the legislative intent of the NSL for Hong Kong regarding the establishment of a mandatory minimum sentence. The dispute on appeal was whether the five-year minimum sentence for "serious cases" is a "sentencing guideline" or the "final sentence," Louis Chen, a member of the Election Committee and general secretary of the Hong Kong Legal Exchange Foundation, told the Global Times on Tuesday.

Article 33 of the NSL for Hong Kong clearly states that the circumstances warranting lighter or reduced punishments do not include a guilty plea; on that ground alone, a defendant should not be given a sentence below the mandatory minimum. Taking the crime of murder, which carries a mandatory life sentence, as an example, Chen emphasized that mandatory punishments truly reflect the severity of the crime. The sentencing mechanism of the NSL for Hong Kong should prioritize deterrence, and not all mitigating factors apply, he said.

Willy Fu, a law professor and vice-chairman of the Hong Kong Legal Exchange Foundation, also welcomed and supported the ruling made by the court.

Fu pointed out that the NSL for Hong Kong is a national law and therefore holds a paramount position. The law needs to be coherent, compatible, and complementary with local laws. However, when inconsistencies arise between the NSL for Hong Kong and the local laws of the HKSAR, the NSL provisions prevail in accordance with Article 62 of the law.

This principle also applies to the interpretation of sentencing provisions in the NSL for Hong Kong. Therefore, local sentencing laws and principles fully function within the sentencing framework set by the law.

In the judgment released on Tuesday, the Court of Final Appeal in the HKSAR also correctly noted that "[local] sentencing laws must therefore operate in tandem with the NSL to achieve the aim of safeguarding national security, giving priority to NSL provisions in case of inconsistency." 

The law aims to prevent, stop, and punish crimes endangering national security. It should adhere to the rule of law, respect and safeguard human rights, combat the very small number of criminals endangering national security, protect the legitimate rights and interests of the majority of citizens, maintain HKSAR's prosperity and stability, and ensure the steady and far-reaching practice of One Country, Two Systems, Fu said. 

The court's ruling on the mandatory sentencing guidelines for the crime of secession under the NSL for Hong Kong, specifically regarding cases of "serious circumstances," which require a prison sentence of five to 10 years, clearly indicates that mandatory punishments reflect the severity of the crime. 

"This ruling is instructive, authoritative, and binding, upholding the spirit of the rule of law. Its significance is profound and deserves the support of the general public," Fu said. 

Coastal cities launch emergency monitoring of marine environment and food over Fukushima contaminated water dumping

Multiple coastal cities, including Wenzhou in East China's Zhejiang Province, Guangzhou in South China's Guangdong Province and Sanya in South China's Hainan Province, have launched emergency monitoring of the marine environment and aquatic food in response to Japan's dumping of Fukushima contaminated water, in a bid to safeguard China's marine environment.

The South Zhejiang Institute of Radiation Medicine and Nuclear Technology in Wenzhou, East China's Zhejiang Province, established a special working group for marine radioactive pollution monitoring in the East China Sea last Friday after Japan started dumping Fukushima nuclear-contaminated wastewater.

The group will conduct real-time monitoring of the nuclear-contaminated seawater area and has preliminarily planned to carry out sampling and monitoring in the sea area every two months, so as to promptly respond to and monitor the potential impact of Japan's Fukushima nuclear-contaminated wastewater on the East China Sea, Wenzhou Daily reported on Sunday.

According to Wan Xinlong, leader of the special working group, the institute has been proactively conducting background radiation investigations of the relevant sea areas since last year in response to the potential pollution risks posed by the discharge of Fukushima nuclear-contaminated wastewater. It has so far conducted two rounds of background investigations, and its technical expertise and monitoring capabilities have become relatively mature.

“At present, the levels of nuclear radioactivity in seawater, marine creatures, and seabed sediments collected from offshore areas are all within the range of natural background radiation levels, and the nuclear-contaminated wastewater has not yet had an impact on the coastal areas of Wenzhou,” Wan said.

Wan noted that in order to comprehensively understand the distribution and trend of radioactive pollution, the originally planned quarterly sampling and monitoring will be shortened to every two months. The third round of sampling is scheduled to take place this September, with a larger sampling area compared to the previous two rounds.

The special working group will focus on monitoring marine radioactive pollution in the East China Sea region, with increased monitoring frequency, an enlarged monitoring range, and strengthened collection and analysis of samples such as seawater, marine creatures and sediments, in a bid to promptly grasp the concentrations of radioactive pollutants in the ocean and how they change.

A team from Tsinghua Shenzhen International Graduate School created a diffusion model of radioactive materials at the ocean scale, from both macroscopic and microscopic perspectives, to simulate the long-term effects of the Fukushima nuclear-contaminated wastewater dumping program. According to the model, the nuclear pollutants will reach the coastal waters of China 240 days after the initial dumping, and will reach the coast of North America and cover almost the whole North Pacific Ocean within 1,200 days.
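
The Tsinghua model itself is not published in this report; as a rough, hedged illustration of how such ocean-scale transport simulations work in principle, the toy sketch below advances a pollutant field on a coarse 2-D grid with simple advection and diffusion terms. The grid size, current speed and diffusivity are arbitrary placeholder values, not parameters from the Tsinghua study.

```python
import numpy as np

# Toy 2-D advection-diffusion sketch of pollutant transport (placeholder values,
# not the Tsinghua Shenzhen model). The concentration field c is advanced day by day.
NX, NY = 200, 100          # coarse grid
DX = 10_000.0              # cell size: 10 km (m)
DT = 86_400.0              # time step: one day (s)
U, V = 0.05, 0.01          # assumed mean current velocity (m/s)
KAPPA = 100.0              # assumed eddy diffusivity (m^2/s)

c = np.zeros((NY, NX))
c[NY // 2, 5] = 1.0        # unit release near the "source" cell

def step(c):
    # first-order upwind advection (valid since U, V > 0) plus centered diffusion;
    # np.roll gives periodic boundaries, a simplification acceptable for a sketch
    adv_x = U * (c - np.roll(c, 1, axis=1)) / DX
    adv_y = V * (c - np.roll(c, 1, axis=0)) / DX
    lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
           np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4 * c) / DX ** 2
    return c + DT * (KAPPA * lap - adv_x - adv_y)

for _ in range(240):       # the horizon cited for pollutants reaching China's coastal waters
    c = step(c)

print("fraction of cells above trace level:", float((c > 1e-9).mean()))
```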

According to Wan, apart from the institute, relevant domestic departments and institutions are closely monitoring the radiation situation in China's coastal areas and are ready to take immediate measures in case of any abnormal situation.

Wan noted that portable radiation detectors can only detect whether there is hazardous radiation on the surface of objects or in the air but can’t detect the presence of pollutants inside the objects.

Besides, these radiation detectors require a high level of expertise to operate accurately; even a slight deviation could lead to inaccurate measurements. The general public is therefore not advised to buy these detectors without proper knowledge and guidance.

The Guangzhou Center for Disease Control and Prevention (CDC) and the Guangzhou Food Safety Risk Monitoring and Evaluation Center said they will proactively carry out emergency monitoring in response to the Fukushima contaminated water dumping and will continue to monitor artificial radioactive pollution in seafood products sold in Guangzhou to safeguard food safety in the city, Guangzhou Daily reported on Sunday.

The Guangzhou CDC reminded residents to pay attention to authoritative information released by government departments and not to believe rumors or unreliable information that could cause unnecessary panic. It also advised residents to avoid purchasing or consuming food produced in areas affected by radiation contamination, and, when traveling to Japan, not to buy local seafood products or food items of unknown origin as gifts for relatives and friends back in China, so as to reduce the chance of consuming nuclear-contaminated foods.

To prevent food products from radiation-affected areas of Japan from entering Sanya in South China's Hainan Province, the local market supervision bureau conducted thorough inspections over the weekend of the city's supermarkets, wet markets and other food production and distribution units to ensure they are not using or selling food products or food ingredients originating from nuclear radiation-affected areas of Japan, the bureau announced on Sunday.

The market watchdog will continue to intensify inspections of food production and sales companies, especially those dealing with imported and frozen foods, to rigorously prevent substandard food products from entering the market.

China's General Administration of Customs banned the import of all aquatic products originating from Japan starting August 24, 2023, the day Japan started dumping nuclear-contaminated wastewater from the crippled Fukushima Daiichi nuclear power plant into the sea.

The National Nuclear Safety Administration of China's Ministry of Ecology and Environment (MEE) announced on August 24, 2023 that relevant departments were organizing marine radiation environment monitoring of sea areas under China's jurisdiction in 2023.

The MEE said it will continue to strengthen relevant monitoring work and promptly follow and assess the potential impact of the discharge of nuclear-contaminated water from the Fukushima nuclear plant on China’s marine radiation environment.

Govt orders new nuclear power plants to carefully consider water intake safety

The National Nuclear Safety Administration (NNSA) has urged China's newly-built and projected nuclear power plants to fully consider water intake issues, in a bid to ensure the safe operation of nuclear power facilities.

During a recent meeting, the administration emphasized that relevant departments should improve water intake procedures in light of changes in the climate and the marine environment over the years, to further ensure the smooth operation of nuclear power plants, according to a Monday post on the NNSA's official social media account.

The meeting was convened after Japan released nuclear-contaminated water from the crippled Fukushima Daiichi nuclear power plant into the ocean on Thursday. China halted aquatic product imports from Japan that day and condemned the move as showing an irresponsible attitude toward the Chinese people and humanity as a whole.

The meeting underscored that the design of all newly-built and projected nuclear power plants should prioritize the security of water intake. Relevant hydrological, climatic, and marine biome data should be collected and monitored, and then utilized in professional research to address potential challenges in the sector.

New process encourages ice to slip, slide away

Ice removal may soon become a lot easier. Researchers have developed a new method for making ice-phobic surfaces by altering the density and slipperiness of spray-on polymer coatings.

The process, reported online March 11 in Science Advances, could lead to a wide range of long-lasting ice-repellent products including windshields, airplane wings, power cables and frozen food packaging, researchers say.

Scientists know that ice easily detaches from softer, less dense materials. Further adjusting the density of rubber polymers used to make the coatings and adding silicone or other lubricants such as vegetable oil, cod-liver oil and safflower oil amplifies the effect, Anish Tuteja, a materials science engineer at the University of Michigan in Ann Arbor, and colleagues found.
In multiple laboratory and field tests, ice slid off treated surfaces under its own weight or when it was pushed by mild wind. The researchers further tested the coatings’ durability on various surfaces such as metal license plates and glass panes. The coatings performed well through two Michigan winters and retained their ice-repelling properties after controlled exposure to icing and heat cycles, corrosive substances such as hydrochloric acid, and wear and tear.

The process has already yielded more than 100 different coatings tailored for specific surfaces, including metal, glass, wood, plastic and cardboard. Tuteja says his team is working on licensing the materials for commercial use.

Like birds of a feather, sperm flock together

BALTIMORE — When it comes to swimming sperm, it’s not every man for himself. Instead, sperm form groups that swim together, a bit like schools of fish or flocks of birds, physicists have observed.

Understanding the physics underlying such behavior in animals is difficult because their actions arise in part from cognitive processes — birds, for instance, can see what their neighbors are doing and adjust their flight path accordingly. But with sperm, group swimming emerges from the physics of the medium in which they swim, Chih-Kuan Tung of North Carolina A&T State University in Greensboro said in a news conference March 16 at a meeting of the American Physical Society. That makes sperm a simpler system for studying the physics behind a form of coordinated biological action. “They don’t think,” Tung said. “So whatever interaction is happening, we can quantitatively describe it.”

Sperm don’t form groups in ordinary water, Tung said, but they do in viscoelastic fluids such as the mucus of mammalian reproductive tracts. A viscoelastic fluid combines resistance to flow with the ability to restore its previous state when disturbed. Tung and colleagues created such elasticity by adding a polymer to the fluid used for testing the swimming ability of bulls’ sperm. Those experiments showed that it’s the elasticity, not the viscosity, that encourages collective swimming.

Further work will be needed, Tung said, to determine whether such group swimming confers an advantage to sperm seeking an egg. In any event, the new understanding of sperm dynamics could lead to improved methods for in vitro fertilization procedures, he said.

Racing for answers on Zika

Sometimes science does not move fast enough, despite much hard work and effort. That’s true in the case of the Zika virus outbreak currently marching through the Americas. As we report in a collection of stories, much remains unclear, including the relationship between Zika infection and microcephaly and how best to combat the mosquitoes that spread the disease. So far, however, evidence does suggest that this little-known (and previously largely ignored) virus may indeed target the nervous system, probably triggering Guillain-Barré syndrome in a small percentage of patients. The virus could even pose as-yet undiscovered health risks that may take years to untangle. While scientists will no doubt eventually be able to answer many of the public’s pressing questions, it may be too late for many.
It’s not a global emergency, but the public is also apparently impatient with science’s progress on providing practical advice about the human microbiome — the collection of bacteria and other microorganisms that live in and on us. This issue features two articles about new results from this hot field. Laura Sanders details surprising ways that gut microbes can meddle with the brain, hinting that certain microbial mixes may influence depression and other mental disorders. And Meghan Rosen describes the microbiome’s role in malnutrition, suggesting that resetting children’s microbes may be a useful treatment. It’s hard not to conclude that manipulating the bacteria in your body could offer a path to better health and happiness.
Judging from the shelves at Whole Foods, that is what many makers of probiotic supplements would like you to believe. And it may well turn out to be true — studies have linked the microbiome to metabolic and digestive issues such as obesity, irritable bowel syndrome and inflammatory bowel disease. But science hasn’t yet come up with broad recommendations for the best ways to tend your personal microfloral garden. And since the Food and Drug Administration regulates supplements as foods, not as medicines, probiotic pills may vary in quality and even in actual ingredients; makers don’t have to prove that probiotics are safe or effective.

Notably, none of the researchers that Sanders asked while reporting “Microbes and the mind” said that they regularly take probiotic supplements. They also said that any effects on the brain, while fascinating, are probably subtle for most people — otherwise you’d notice a mood change every time you took antibiotics. In Rosen’s story about malnutrition, researcher François Leulier says: “We can envision some therapy solutions, but we’re still at the basic research level.” It’s just too early to start megadosing, he says, even for very sick kids.

To fill in the gap, people look to anecdote. Or, sometimes knowingly, they engage in uncontrolled self-experiments with an N of 1, fueled by the Internet (see the website Quantified Self) and DIY culture. The data gleaned from these personal trials may help individuals, but they can’t answer big questions.

An eager public — and intriguing science — is propelling microbiome research along. Zika research is sprinting after an elusive and mysterious foe, trying to stop the damage from the virus and learn from a vast natural experiment. In both cases, science must move more swiftly if it is to catch up.

Quake risk in parts of central U.S. as high as in fault-filled California

Northern Oklahoma is just as susceptible to a damaging earthquake within the next year as the most quake-prone areas of California. That’s because earthquakes are no longer just a natural hazard, the U.S. Geological Survey says. In its new quake hazards forecast released March 28, the agency for the first time has included artificially triggered seismicity.

An increased risk in the central United States largely stems from sites where fluids, such as wastewater from fracking, are injected underground (SN: 8/9/14, p. 13). Rising fluid pressure underground can unclamp faults and unleash earthquakes (SN: 7/11/15, p. 10). From 1973 to 2008, an average of 24 potentially damaging quakes rattled the central United States each year. From 2009 to 2015, an uptick in fracking activity helped skyrocket that number to 318 annual quakes on average, with a record-setting 1,010 tremors in 2015 alone. Around 7 million people currently live and work in central and eastern U.S. areas vulnerable to shakes stemming from earthquakes roughly above magnitude 2.7, USGS scientists estimate.

Human-caused quakes aren’t as powerful as their natural counterparts (the strongest induced quake in the United States clocked in at magnitude 5.6 in 2011 compared with the magnitude 7.8 San Francisco temblor in 1906, for instance). But the potential for more powerful shakes exists, the scientists warn. The new hazard assessment should help regulators revise building codes to better prepare for the rising risk.

New habitat monitoring tools find hope for tigers

There’s still enough forest left — if protected wisely — to meet the goal of doubling the number of wild tigers (Panthera tigris) by 2022, says an international research team.

That ambitious target, set by a summit of 13 tiger-range nations in 2010, aims to reverse the species’ alarming plunge toward extinction. Forest loss, poaching and dwindling prey have driven tiger numbers below 3,500 individuals.

The existing forest habitat could sustain the doubling if, for instance, safe-travel corridors connect forest patches, according to researchers monitoring forest loss with free, anybody-can-use-’em Web tools. Previously, habitat monitoring was piecemeal, in part because satellite imagery could be expensive and required special expertise, says Anup Joshi of the University of Minnesota in St. Paul. But Google Earth Engine and Global Forest Watch provide faster, easier, more consistent ways to keep an eye out for habitat losses as small as 30 meters by 30 meters (the space revealed in a pixel).
Looking at 14 years of data, 76 major tiger landscapes altogether have lost less than 8 percent of forest, the researchers say April 1 in Science Advances. Finding so little loss is “remarkable and unexpected,” they write. But 10 of those landscapes account for most of the losses — highlighting the challenges conservationists, and tigers, face.
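
Global Forest Watch and Google Earth Engine expose the underlying Landsat-derived loss data at 30-meter resolution; the hedged numpy sketch below shows how a landscape-level loss percentage of the kind reported could be tallied from such rasters. The file names, band encoding and landscape mask are hypothetical placeholders, not the team's actual pipeline.

```python
import numpy as np

# Hypothetical 30 m rasters covering one tiger landscape (placeholder files):
#   forest_baseline.npy - boolean, True where the pixel was forest at the start year
#   loss_year.npy       - 0 = no loss, 1..14 = year of loss within the 14-year window
#   landscape_mask.npy  - boolean mask of the tiger-landscape polygon
forest_baseline = np.load("forest_baseline.npy")
loss_year = np.load("loss_year.npy")
landscape = np.load("landscape_mask.npy")

baseline = forest_baseline & landscape
lost = baseline & (loss_year >= 1) & (loss_year <= 14)

pct_lost = 100.0 * lost.sum() / baseline.sum()
hectares_lost = lost.sum() * 0.09          # each 30 m x 30 m pixel is 0.09 ha
print(f"forest lost over 14 years: {pct_lost:.1f}% ({hectares_lost:,.0f} ha)")
```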

Possible source of high-energy neutrino reported

Scientists may have found the cosmic birthplace of an ultra-high energy neutrino. They point the finger at a blazar — a brilliantly luminous galaxy that shoots a jet of radiation in the direction of Earth — 9 billion light years away.

If the link between the blazar and neutrino is real, scientists would be closer to long-sought answers about where such power-packing particles come from. Violent astronomical accelerators boost some neutrinos to high energies, but scientists have never been able to convincingly identify their sources.
Neutrinos are aloof elementary particles that rarely interact with other matter — they can sail straight through the Earth, and trillions of them zip through your body every second without a trace. On December 4, 2012, the neutrino in question (which scientists have affectionately nicknamed Big Bird) slammed into the Antarctic ice with an energy of around 2 million billion electron volts. The neutrino observatory IceCube glimpsed the aftermath of the collision and measured its energy with sensitive detectors embedded deep in the ice (SN Online: 04/07/14), leaving scientists hustling to pinpoint its source.

The blazar flared up at just the right time and place to be a prime suspect, researchers report in a paper accepted for publication in a peer-reviewed journal. The result, now available online at arXiv.org, strengthens the case that blazars are the source of such high-energy neutrinos, but it is no smoking gun.

After the neutrino was detected, a team of astrophysicists scoured the heavens for energetic galaxies with TANAMI, short for Tracking Active Galactic Nuclei with Austral Milliarcsecond Interferometry, a network of telescopes peering into space at a variety of wavelengths. That team reported one likely candidate blazar.

But the candidate is not a surefire match, says IceCube leader Francis Halzen of the University of Wisconsin–Madison, who was not involved with the analysis. IceCube could determine the neutrino’s direction within only about 15 degrees on the sky, and the blazar flare-up continued for several months. The probability of such a chance concurrence between an unrelated neutrino and blazar is about 5 percent, the researchers say — too big to rule out chance. “It’s a very intriguing result,” says Halzen, “but it’s not a proof.”

The matchup between the blazar and neutrino is noteworthy, even though the researchers can’t fully rule out the possibility that the match is a fluke, says astrophysicist Xiang-Yu Wang of Nanjing University in China, who was not involved with the research. “Given that the two events are very unique … I think it’s convincing.” Wang and colleagues have expanded on the result: In a paper accepted for publication in Physical Review Letters, they use the difference in arrival time between the neutrino and light from the blazar’s outburst — assuming the two are related — to test Einstein’s special and general theories of relativity. Certain theories of quantum gravity predict a delay in the arrival of a neutrino. (Einstein came out unscathed.)
The authors of the blazar study declined to comment on the result, citing the embargo policy of the journal where the paper will be published.

To convincingly identify a blazar as the source of a neutrino, Halzen says, scientists will need a better measurement of the neutrino’s direction, connected to a short-lived blazar outburst. In the future, Halzen says, IceCube will send out “astronomical telegrams” when it detects a neutrino, directing telescopes to take a look, perhaps catching a blazar in the act.

EPA underestimates methane emissions

The U.S. Environmental Protection Agency has a methane problem — and that could misinform the country’s carbon-cutting plans. Recent studies suggest that the agency’s reports fail to capture the full scope of U.S. methane emissions, including “super emitters” that contribute a disproportionate share of methane release. Those EPA reports influence the country’s actions to combat climate change and the regulation of methane-producing industries such as agriculture and natural gas production.

With EPA’s next annual methane report due to be published by April 15, early signs suggest that the agency is taking steps to fix the methane mismatch. A preliminary draft of the report revises the agency’s methane calculations for 2013 — the most recent year reported — upward by about 27 percent for the natural gas and petroleum sectors, a difference of about 2 million metric tons.
Yet it’s unclear how that and other revisions will factor into final methane emission totals in the upcoming report. The draft estimates that U.S. methane emissions from all sources in 2014 were about 28 million metric tons, up slightly from the revised estimate for 2013 and well above the original 2013 estimate of 25.453 million metric tons. But the totals in the draft don’t include updates to emission estimates from the oil and gas industry.
“EPA is reviewing the substantial body of new studies that have become available in the past year on the natural gas and petroleum sector,” says EPA spokesperson Enesta Jones. The agency is also gathering feedback from scientists and industry experts to further improve its reporting.

Methane, which makes up the bulk of natural gas, originates from natural sources, such as wetlands, as well as from human activities such as landfills, cattle ranches (SN: 11/28/15, p. 22) and the oil and gas industry. Globally, human activities release about 60 percent of the 600 million metric tons of methane emitted into the atmosphere each year. Once in the air, methane prevents some of Earth’s heat from escaping into space, causing a warming effect. Methane emissions currently account for about a quarter of human-caused global warming.

The EPA’s underestimation of U.S. methane emissions comes down to accounting. EPA samples emissions from known methane sources, such as cows or natural gas pipelines, and works out an average. That average is then multiplied by the nation’s number of cows, lengths of pipe and other methane sources. Results from this method disagree with satellite and land-based observations that measure changes in the total amount of methane in the air. A 2013 report in the Proceedings of the National Academy of Sciences found that U.S. methane emissions based on atmospheric measurements are about 50 percent larger than EPA estimates (SN Online: 11/25/13).
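
In code form, this bottom-up approach boils down to multiplying per-source emission factors by activity counts and summing, which can then be set against a top-down, atmosphere-based total. The factors and counts in the sketch below are invented placeholders purely to show the structure of the comparison; the 50 percent gap mirrors the PNAS finding mentioned above.

```python
# Schematic bottom-up inventory vs. top-down comparison.
# All emission factors and counts are made-up placeholders, NOT EPA figures.
emission_factors = {          # metric tons CH4 per unit per year (assumed)
    "dairy_cow": 0.12,
    "km_of_gas_pipeline": 0.5,
    "gas_well": 2.0,
}
activity = {                  # number of each source nationwide (assumed)
    "dairy_cow": 9_000_000,
    "km_of_gas_pipeline": 500_000,
    "gas_well": 1_000_000,
}

bottom_up = sum(emission_factors[k] * activity[k] for k in emission_factors)
top_down = 1.5 * bottom_up    # stand-in for an atmosphere-based estimate ~50% larger

gap_pct = 100.0 * (top_down - bottom_up) / bottom_up
print(f"bottom-up: {bottom_up / 1e6:.1f} Mt CH4, top-down: {top_down / 1e6:.1f} Mt CH4, "
      f"gap: {gap_pct:.0f}%")
```
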
EPA’s reports don’t just misjudge the scale of emissions, they also miss the long-term trend, recent work suggests. EPA reported that U.S. methane emissions remained largely unchanged from 2002 to 2014. But researchers report online March 2 in Geophysical Research Letters that emissions of the greenhouse gas rose more than 30 percent over that period. The United States could be responsible for as much as 30 to 60 percent of the global increase in methane emissions over the last decade, the study’s authors conclude. “We’re definitely not a small piece of that pie,” says Harvard University atmospheric scientist Alex Turner, who coauthored the study.
Correctly tracking methane is important, Turner says, because over a 100-year period, the warming impact of methane is more than 25 times that of the same amount of CO2. Methane levels have also risen faster: Since the start of the industrial revolution, methane concentrations have more than doubled while CO2 has risen more than 40 percent.

While methane is more potent than CO2, there is about 200 times less methane in the atmosphere than CO2. Furthermore, methane stays in the atmosphere for only around 12 years before being absorbed by soil or breaking apart in chemical reactions. “If we reduce methane emissions, the climate responds very quickly and global warming would slow down almost immediately,” says Cornell University earth systems scientist Robert Howarth. “CO2, on the other hand, has an influence that will go hundreds to even thousands of years into the future.”

Turner and colleagues tracked methane across the continental United States using land stations that measure methane in the air and satellite observations that record dips in the infrared radiation frequencies absorbed and reemitted by methane. The researchers compared these methane measurements with those taken over Bermuda and the North Pacific Ocean — places upwind of the United States and far from major methane sources.

From 2002 through 2014, methane concentrations over the continental United States grew faster than those over the oceans, the researchers found. The difference was most pronounced over the central United States, where methane concentrations rose nearly twice as fast as in the rest of the country. Natural gas drilling and production boomed in the central United States during the period studied, though the researchers could not precisely trace the source of the additional methane.

Turner and colleagues say they’re now working with EPA to reconcile the methane estimates. EPA will provide small-scale estimates of methane emissions down to a 10-kilometer-wide grid. By combining that grid with space and land observations, scientists should be able to isolate where methane mismatches are the most pronounced.

While Turner’s research can’t pinpoint the exact origins of the additional methane, other studies point to the oil and gas industry. The numbers that the EPA uses to tabulate methane emissions assume that equipment is functioning as intended, says Stanford University sustainability engineer Adam Brandt. Malfunctioning equipment can spew huge amounts of methane. That became abundantly – and visibly – clear last October when the largest U.S. methane leak in history began in an underground storage facility near Los Angeles. The leak released 97,100 metric tons of methane, equivalent to the annual greenhouse gas emissions of 572,000 cars, before being permanently sealed in February, researchers estimated in the March 18 Science.
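
The car-equivalence figure is a greenhouse-warming-potential conversion of the kind discussed earlier in the story. The back-of-the-envelope check below uses the 100-year factor of 25 cited above and an assumed typical per-car emission rate, so it lands near, but not exactly on, the published 572,000-car estimate, whose authors will have used their own factors.

```python
# Rough check on the car-equivalence of the Los Angeles-area leak described above.
LEAK_CH4_TONNES = 97_100       # methane released, from the March 18 Science estimate
GWP_100YR = 25                 # 100-year warming factor for CH4 vs CO2 cited earlier
CAR_T_CO2_PER_YEAR = 4.6       # assumed annual CO2 emissions of a typical passenger car

co2_equivalent = LEAK_CH4_TONNES * GWP_100YR
car_years = co2_equivalent / CAR_T_CO2_PER_YEAR
print(f"{co2_equivalent:,.0f} t CO2e ~ {car_years:,.0f} car-years")  # on the order of half a million cars
```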

Super methane emitters are a big problem elsewhere, too, albeit typically much smaller than the California leak, researchers report in the June 2016 Environmental Pollution. Surveying emissions from 100 natural gas leaks around Boston, the researchers found that 7 percent of leaks contributed half of the total methane released. In 2014, a different research team reported in Environmental Science & Technology that 19 percent of pneumatic controllers used at natural gas production sites accounted for 95 percent of all controller emissions.

Monitoring and quick repairs can stamp out rogue methane sources quickly, Brandt says. “This is a problem that’s easier to fix than it is to understand,” he says.