MIT Predicted in 1972 That Society Will Collapse This Century. New Research Shows We’re on Schedule.

A 1972 MIT study predicted that rapid economic growth would lead to societal collapse in the mid-21st century. A new paper shows we’re unfortunately right on schedule.
July 14, 2021, 1:00pm
Image: Getty 

A remarkable new study by a director at one of the largest accounting firms in the world has found that a famous, decades-old warning from MIT about the risk of industrial civilization collapsing appears to be accurate based on new empirical data. 

As the world looks forward to a rebound in economic growth following the devastation wrought by the pandemic, the research raises urgent questions about the risks of attempting to simply return to the pre-pandemic ‘normal.’


In 1972, a team of MIT scientists got together to study the risks of civilizational collapse. Their system dynamics model, published by the Club of Rome, identified impending ‘limits to growth’ (LtG) that meant industrial civilization was on track to collapse sometime within the 21st century, due to overexploitation of planetary resources.

The controversial MIT analysis generated heated debate, and was widely derided at the time by pundits who misrepresented its findings and methods. But the analysis has now received stunning vindication from a study written by a senior director at professional services giant KPMG, one of the 'Big Four' accounting firms as measured by global revenue.

Limits to growth

The study was published in the Yale Journal of Industrial Ecology in November 2020 and is available on the KPMG website. It concludes that the current business-as-usual trajectory of global civilization is heading toward the terminal decline of economic growth within the coming decade—and at worst, could trigger societal collapse by around 2040.

The study represents the first time a top analyst working within a mainstream global corporate entity has taken the ‘limits to growth’ model seriously. Its author, Gaya Herrington, is Sustainability and Dynamic System Analysis Lead at KPMG in the United States. However, she decided to undertake the research as a personal project to understand how well the MIT model stood the test of time. 

The study itself is not affiliated with KPMG and was not conducted on its behalf, and does not necessarily reflect the views of KPMG. Herrington performed the research as an extension of her master’s thesis at Harvard University and in her capacity as an advisor to the Club of Rome. However, she is quoted explaining her project on the KPMG website as follows: 

“Given the unappealing prospect of collapse, I was curious to see which scenarios were aligning most closely with empirical data today. After all, the book that featured this world model was a bestseller in the 70s, and by now we’d have several decades of empirical data which would make a comparison meaningful. But to my surprise I could not find recent attempts for this. So I decided to do it myself.”


Titled ‘Update to limits to growth: Comparing the World3 model with empirical data’, the study attempts to assess how MIT’s ‘World3’ model stacks up against new empirical data. Previous studies that attempted to do this found that the model’s worst-case scenarios accurately reflected real-world developments. However, the last study of this nature was completed in 2014. 

The risk of collapse 

Herrington’s new analysis examines data across 10 key variables, namely population, fertility rates, mortality rates, industrial output, food production, services, non-renewable resources, persistent pollution, human welfare, and ecological footprint. She found that the latest data most closely aligns with two particular scenarios, ‘BAU2’ (business-as-usual) and ‘CT’ (comprehensive technology). 
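Conceptually, the comparison works like the sketch below: each scenario’s trajectory is scored against observed data with a closeness-of-fit metric, and the lowest-error scenario is the best match. The numbers and the metric here (normalized root-mean-square error) are illustrative placeholders, not Herrington’s actual data or exact method.

```python
import numpy as np

# Hypothetical, made-up series for one variable (e.g., indexed industrial output).
years = np.array([1970, 1980, 1990, 2000, 2010, 2020])
observed = np.array([1.0, 1.3, 1.6, 1.9, 2.1, 2.2])

# Illustrative scenario trajectories standing in for World3 output.
scenarios = {
    "BAU2": np.array([1.0, 1.3, 1.6, 1.9, 2.15, 2.25]),
    "CT":   np.array([1.0, 1.3, 1.6, 1.85, 2.1, 2.3]),
    "SW":   np.array([1.0, 1.25, 1.45, 1.6, 1.7, 1.75]),
}

def normalized_rmse(model, data):
    """Root-mean-square error, normalized by the mean of the observations."""
    return np.sqrt(np.mean((model - data) ** 2)) / np.mean(data)

for name, trajectory in scenarios.items():
    print(f"{name}: fit error = {normalized_rmse(trajectory, observed):.3f}")
# The lowest-error scenarios are the closest fit; Herrington repeated this
# kind of comparison across all 10 variables.
```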

“BAU2 and CT scenarios show a halt in growth within a decade or so from now,” the study concludes. “Both scenarios thus indicate that continuing business as usual, that is, pursuing continuous growth, is not possible. Even when paired with unprecedented technological development and adoption, business as usual as modelled by LtG would inevitably lead to declines in industrial capital, agricultural output, and welfare levels within this century.”

Study author Gaya Herrington told Motherboard that in the MIT World3 models, collapse “does not mean that humanity will cease to exist,” but rather that “economic and industrial growth will stop, and then decline, which will hurt food production and standards of living… In terms of timing, the BAU2 scenario shows a steep decline to set in around 2040.”

The ‘Business-as-Usual’ scenario (Source: Herrington, 2021)

The end of growth? 

In the comprehensive technology (CT) scenario, economic decline still sets in around this date with a range of possible negative consequences, but this does not lead to societal collapse.

The ‘Comprehensive Technology’ scenario (Source: Herrington, 2021)

Unfortunately, the scenario that least closely fits the latest empirical data happens to be the most optimistic pathway, known as ‘SW’ (stabilized world), in which civilization follows a sustainable path and experiences the smallest declines in economic growth, based on a combination of technological innovation and widespread investment in public health and education.

The ‘Stabilized World’ scenario (Source: Herrington, 2021)

Although both the business-as-usual and comprehensive technology scenarios point to the coming end of economic growth in around 10 years, only the BAU2 scenario “shows a clear collapse pattern, whereas CT suggests the possibility of future declines being relatively soft landings, at least for humanity in general.” 

Both scenarios currently “seem to align quite closely” with observed data, Herrington concludes in her study, indicating that the future is open.   

A window of opportunity 

While focusing on the pursuit of continued economic growth for its own sake will be futile, the study finds that technological progress and increased investments in public services could not only avoid the risk of collapse, but lead to a new stable and prosperous civilization operating safely within planetary boundaries. But we really have only the next decade to change course. 

“At this point therefore, the data most aligns with the CT and BAU2 scenarios which indicate a slowdown and eventual halt in growth within the next decade or so, but World3 leaves open whether the subsequent decline will constitute a collapse,” the study concludes. Although the ‘stabilized world’ scenario “tracks least closely, a deliberate trajectory change brought about by society turning toward another goal than growth is still possible. The LtG work implies that this window of opportunity is closing fast.” 


In a presentation at the World Economic Forum in 2020 delivered in her capacity as a KPMG director, Herrington argued for ‘agrowth’—an agnostic approach to growth which focuses on other economic goals and priorities.  

“Changing our societal priorities hardly needs to be a capitulation to grim necessity,” she said. “Human activity can be regenerative and our productive capacities can be transformed. In fact, we are seeing examples of that happening right now. Expanding those efforts now creates a world full of opportunity that is also sustainable.” 

She noted that the unprecedented speed at which vaccines were developed and deployed in response to the COVID-19 pandemic demonstrates that we are capable of responding rapidly and constructively to global challenges if we choose to act. We need exactly such a determined approach to the environmental crisis.

“The necessary changes will not be easy and pose transition challenges but a sustainable and inclusive future is still possible,” said Herrington. 

The best available data suggests that what we decide over the next 10 years will determine the long-term fate of human civilization. Although the odds are on a knife-edge, Herrington pointed to a “rapid rise” in environmental, social, and governance (ESG) priorities as a basis for optimism, signalling the change in thinking taking place in both governments and businesses. She told me that perhaps the most important implication of her research is that it’s not too late to create a truly sustainable civilization that works for all.


A ‘Complete’ Energy Transition Is Needed By 2030 to Avoid ‘Existential Threat’ to Humanity

A Swiss government-backed study suggests that we could see major warming in the near future if we do not have a "fast and complete" renewable energy transition.
May 17, 2021, 1:00pm
Image: Xu Congjun/VCG via Getty Images

Scientists backed by the Swiss government warn that widely accepted global scenarios for reducing carbon emissions to avoid dangerous climate change in reality pose a 60-80 percent probability of breaching the 1.5 degrees Celsius (C) ‘safe limit’ agreed by world nations under the 2015 Paris Agreement.  

Their new preprint paper published in March, which is currently under review with the journal Environmental Research Letters, finds that only a “fast and complete” global renewable energy transition, led by solar power and accomplished before 2030, can reduce this probability to a more acceptable level of around 20 percent. 

Existential threat 

The vast majority of decarbonization scenarios put forward by the UN’s Intergovernmental Panel on Climate Change (IPCC) and the UN Environment Program, especially those which are consistent with actual current levels of carbon emissions, point toward a 60-80 percent probability of breaching the 1.5°C safe limit. 

The IPCC has called for emissions to be halved by 2030, with a net zero target for 2050. 


Even when the more ambitious IPCC decarbonization scenarios are taken into account, the scientists find that “all the IPCC scenarios violate the 1.5°C peak heating target,” with probabilities ranging from 40 percent to more than 80 percent. 

“Such high probabilities of violation would be considered unacceptable in other areas, like engineering or public health,” write co-authors Harald Desing and Rolf Widmer of the Swiss Federal Laboratories for Materials Science and Technology.  

“Growing understanding of the Earth system suggests that peak heating beyond 1.5°C may be an existential threat to the biosphere and therefore also humanity,” they explain in their paper. “Transitions that exceed this vital threshold with a high probability expose future generations to substantial risks without their prior consent.” 

Although some residual probability might still be acceptable for society, they say, by tolerating high probabilities between 40 and 80 percent the current climate discourse “exposes future generations to unprecedented risks without their prior consent.”  

To illustrate the urgency of climate action, the paper shows that even if carbon emissions were to remain constant at 2018 levels, there would be a 66 percent probability of depleting the remaining carbon budget before 2028. Of course, carbon emissions continue to rise every year and are forecast to increase this year by as much as 5 percent. 
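The arithmetic behind that urgency is simple to reproduce. The sketch below uses round public figures rather than the authors’ exact inputs: roughly 420 GtCO2 of budget remaining from 2018 for a 66 percent chance of staying under 1.5°C (per the IPCC’s SR1.5 report), and roughly 42 GtCO2 of annual emissions held flat at 2018 levels.

```python
# Back-of-envelope carbon budget depletion, using assumed round numbers.
remaining_budget_gt = 420  # GtCO2 remaining from 2018 (approximate, 66% chance of <1.5C)
annual_emissions_gt = 42   # GtCO2 emitted per year, held constant at 2018 levels

years_left = remaining_budget_gt / annual_emissions_gt
print(f"Budget exhausted in ~{years_left:.0f} years, i.e., around {2018 + years_left:.0f}")
# ~10 years from 2018, consistent with the paper's 'before 2028' finding.
```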

Fast transition 

However, while this sober warning suggests that the “existential threat” of dangerous climate change beyond the 1.5°C limit is baked into prevailing decarbonisation plans, the paper also shows that a rapid transition to a solar-based renewable energy system can dramatically reduce this risk. But we need to act fast.  

The paper uses the concept of Energy Return on Investment (EROI), the ratio of the energy a resource delivers to the energy invested in obtaining it, to assess the prospects of such a transition. The speed of action would require a temporary increase in fossil fuel emissions above current levels “for the sole purpose of accelerating the growth of renewable energy capacity” and switching off the “fossil engine” before 2030.  
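For readers unfamiliar with the metric, here is a minimal EROI illustration with placeholder numbers, not values from the paper:

```python
# EROI: energy delivered by a resource divided by the energy invested to obtain it.
def eroi(energy_delivered_gj: float, energy_invested_gj: float) -> float:
    return energy_delivered_gj / energy_invested_gj

# Assumed example: a solar farm returning 300 GJ for every 20 GJ embodied in it.
print(eroi(energy_delivered_gj=300.0, energy_invested_gj=20.0))  # -> 15.0
# The higher the EROI of new solar capacity, the less fossil energy the
# transition itself consumes before the 'fossil engine' can be switched off.
```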


“The solar engine grows fastest, when the fossil engine’s output is not replaced by solar energy during the transition… but is increased to utilize all available idle capacity,” says the paper. “Consequently, fossil CO2 emissions will temporarily increase during the transition but will be reduced to zero as soon as the solar engine can replace the fossil engine.”

Lead author Harald Desing told Motherboard: “The fast transition would need to commence beginning of next year or end of 2021 and… would be complete anywhere in between 2024 and 2028.” 

Although this is far faster than all conventional decarbonization scenarios, the authors argue it is the only feasible way to reduce the probability of breaching the 1.5°C limit to a genuinely safer level.   

Overbuilding solar to ‘switch off’ the fossil engine 

Desing explained that such a rapid global solar transition would create further new possibilities for decarbonization due to the potential for clean electrification of key sectors: 

“The fastest possible transition is achieved by investing all overcapacity we have in today’s fossil energy system in building the solar system. And given the fast exponential growth, the size of overcapacity built into the solar system doesn’t matter that much… This indicates that solar overcapacity is fast to obtain… having a negligible effect on the climate but opening unprecedented possibilities.” 
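The growth logic Desing describes can be sketched as a toy reinvestment loop: every year, all the energy the ‘solar engine’ produces goes into building more solar. The starting capacity and the energy payback time below are assumed placeholders, not the paper’s calibrated values.

```python
# Toy model: solar capacity compounds when its own output is reinvested.
payback_time_years = 1.0  # assumed: years of output needed to build a unit of capacity
capacity_tw = 0.5         # assumed starting solar capacity, in TW

for year in range(2022, 2029):
    # One year of output from capacity C can build C / payback_time of new capacity.
    capacity_tw += capacity_tw / payback_time_years
    print(year, f"{capacity_tw:.1f} TW")
# With a ~1-year payback, capacity doubles annually; this exponential compounding
# is why a transition within a single decade is conceivable at all.
```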


Recent peer-reviewed research by Columbia University engineer Marc Perez has shown that overbuilding solar capacity—aiming for 100 percent if not more—creates a level of surplus solar energy which can be used to top off the grid, producing electricity some 75 percent cheaper. Perez found that if the US, for instance, operated an overcapacity solar-dominated grid with limited wind power and energy storage, it could generate a constant source of electricity even cheaper than today’s conventional production without intermittency problems. 

The Swiss paper thus confirms that while building out the global renewable energy system will entail the exploitation of carbon-intensive resources, deploying it correctly can pave the way for shifting all further societal activities, including the manufacturing of renewable energy systems, onto the new solar-based power system: “Fossil fuel use should be increased temporarily with the sole purpose to build the solar engine as fast as possible to subsequently switch off the fossil engine forever,” the paper concludes. 

“As solar overcapacity is fast to build, we see two main advantages,” said Desing. “First, it reduces the need for storage capacity. Storage is energetically expensive, as it either has high embodied energy (batteries, for example) or a low turnaround efficiency (synthetic fuels, for instance).” Second, he said, solar overcapacity can then provide the clean power to sustain direct air capture and storage (DACS) technologies to draw down further carbon from the atmosphere. “To stabilize the climate in the long run, it is necessary to reduce the atmospheric CO2 concentration down to below 350ppm. Solar overcapacity is ideal for this task, operating DACS during peak sun hours and summer, when idle capacity is available.” 


The new research has been funded under the Sustainable Economy National Research Program (NRP) run by the Swiss National Science Foundation. NRPs are administered by the Swiss Federal Council, the seven-member executive council that constitutes the federal government of Switzerland. 

Clean energy abundance?

Two years ago, Desing’s team at the Swiss Federal Laboratories for Materials Science and Technology released a paper published in the peer-reviewed Energies journal which attempted to calculate the “global appropriate technical potential”—the actual quantity of energy that could be produced for societal use given realistic environmental, land-use and other constraints—of a solar-dominated renewable energy system. They found that such a system, if deployed properly with sufficient overcapacity, could generate vastly more energy than the existing fossil fuel-based industrial energy system.  

Their study took into account the need to remain within earth system boundaries while also accounting for demand for chemicals and materials. It found that the technical potential incorporating solar installed across the built environment as well as the world’s deserts would be as high as 71 terawatts (TW) of clean electricity, “which is significantly larger than the current global energy demand.” Currently, global energy consumption is around 17.7 TW, which according to Desing is equivalent to 6.7 TW of electricity.

This is more than ten times the current equivalent electricity consumption, and leaves large quantities of electricity “which can then be transformed into e-fuels and heat where necessary.” 

Even without deserts, Desing told Motherboard, solar concentrated on the built environment alone would still contribute 22 TW of electricity, more than three times the current equivalent consumption.
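Those multiples follow directly from the team’s published figures:

```python
# Checking the article's multiples against the Swiss team's numbers.
current_electricity_tw = 6.7   # electricity-equivalent of today's ~17.7 TW energy demand
potential_with_deserts = 71.0  # TW: built environment plus the world's deserts
potential_built_env = 22.0     # TW: built environment alone

print(round(potential_with_deserts / current_electricity_tw, 1))  # ~10.6x current use
print(round(potential_built_env / current_electricity_tw, 1))     # ~3.3x current use
```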

Desing and his team conclude that this means there is “sufficient renewable energy potentially available to increase energy access for a growing world population as well as for a development towards increasingly closed material cycles within the technosphere”—that is, to devote more energy toward the ‘circular economy’ through increased recycling of raw materials and electrification of industry.  

If the Swiss team’s data is correct—and they have expressed 99 percent confidence in their calculations—a solar transition by 2030 could not only avert the worst risks of dangerous climate change by switching off the age of fossil fuels forever, it would usher in a new age of clean energy abundance, unleashing “unprecedented possibilities” for human civilization.


Study That Said Smokers Get COVID Less Often Retracted for Big Tobacco Ties

The attention-grabbing study found smokers were 23% less likely to catch COVID-19, and now it's been retracted by the journal that published it.
April 23, 2021, 1:00pm
Image: Esther Moreno Martinez / EyeEm via Getty Images

Medical research has always been a crucial component of public health, and the COVID-19 pandemic has made us all hyper-aware of each new study advertising some advancement in our understanding of the virus. Now, an attention-grabbing 2020 study that suggested smoking may reduce the risk of COVID-19 infection has been retracted due to two co-authors having undeclared connections to the tobacco industry. 


The study, “Characteristics and risk factors for COVID-19 diagnosis and adverse outcomes in Mexico: an analysis of 89,756 laboratory-confirmed COVID-19 cases,” was published as an “early view” article in July 2020 in the European Respiratory Journal with the claim that smoking cigarettes made patients 23 percent less likely to be diagnosed with COVID-19. The work was a joint effort by researchers at the University of Patras in Greece, the University of Utah, and institutions in Mexico. It received coverage in the media along with other studies suggesting similar findings, including in the Daily Mail.

This claim is contrary to what many other studies have found regarding the connection between smoking and COVID severity; in fact, in some US states smoking is considered a comorbidity. The claim itself, however, is not what caused the paper to be retracted on March 4. Rather, two co-authors failed to declare their connections to the tobacco industry before submitting their work.

“[O]ne of the authors (José M. Mier) at the time [of submission] had a current and ongoing role in providing consultancy to the tobacco industry on tobacco harm reduction,” the journal wrote in its retraction statement. “And another (Konstantinos Poulas) at the time was a principal investigator for the Greek NGO NOSMOKE.” 


The journal’s statement explains that NOSMOKE has funding connections to the Foundation for a Smoke-Free World (FSFW), which is funded by the tobacco industry. According to the University of Bath’s Tobacco Tactics website, NOSMOKE “develops new vaping products and shares pro e-cigarette research from the tobacco industry,” which is itself invested in developing vaping technology and promoting it as a safer way to ingest nicotine than smoking. Vaporizers are referred to by the tobacco industry and by NOSMOKE as modified risk tobacco products, or MRTPs. NOSMOKE is located at the Patras Science Park, which according to Tobacco Tactics received grants in 2018 from the tobacco industry for the establishment of “an institute, which operates as a research and innovation center in the field of modified risk products (MRTPs) and tobacco harm reduction.”

According to the journal’s retraction notice, undisclosed conflicts of interest are not typically enough reason to retract a study. However, the journal has a strict policy against considering papers from authors with ties to the tobacco industry. On balance, “the editors felt the decision was justified based on the nature of the undisclosed relationship, in the context of the sensitive subject matter presented, and on the need to align the published journal content with the bylaws of the publishing society,” the journal said. Additionally, there was no question of scientific misconduct. 


Konstantinos Farsalinos, one of the study’s co-authors and a frequent collaborator on NOSMOKE-affiliated research who was not named by the journal as having links to the tobacco industry, told Motherboard by email that he considers this retraction decision “unfair and unsubstantiated.”

“I was contacted by the editors of the journal only after the decision for retraction was made,” Farsalinos continued. “I proposed to publicly release the full dataset and the statistical script so that all findings could be independently verified. The editors declined.”

Farsalinos also said that the conflicts related to the study’s retraction were “irrelevant to the study’s main aims,” which were to identify COVID-19 risk factors to prioritize in low-income populations.

Additionally, Theo Giannouchos, the study’s first author who was not one of the authors named as having undisclosed ties to the tobacco industry, pointed out to Motherboard by email that the conclusion the study came to was still by no means an encouragement of smoking.

“It is currently unclear whether nicotine exerts any positive effect, however, there is no doubt that smoking cannot be used as a protective measure and smoking cessation should be encouraged during the COVID-19 pandemic,” the study reads.


The retracted study is also not the only one to hint at a possible connection between smoking and reduced COVID-19 infection. A 2020 study by researchers from Israel’s Clalit Health Services provider, published as a preprint, found a similarly curious possible connection. A French study also uncovered similar findings. Still, many other studies and official guidance from public health organizations indicate that smoking is a risk factor for COVID-19 and for people’s health generally. 

Despite the dishonesty that stems from conflict of interest (COI) omissions, failure to declare COIs is not always a one-way ticket to retraction. But when it comes to the tobacco industry, things can be more complicated. 

“Unfortunately, the tobacco industry and tobacco industry-funded scientists have a long and notorious history of withholding, falsifying and manipulating research data,” said Sven-Eric Jordt, an associate professor in anesthesiology at the Duke University School of Medicine who studies tobacco regulatory science and was not involved in the retracted research, in an interview. “This is why the European Respiratory Society categorically excludes research manuscripts by the tobacco industry and industry-funded scientists.”

Jordt also said that while the “retraction doesn't necessarily say that the results are invalid,” transparently disclosing industry connections in research is essential to promoting public trust in science, which is important now more than ever.

“Many faculty at US universities start companies and provide expert and consulting services to industry, governments, non-profits and law firms,” said Jordt. “Declaring conflicts of interest is a measure to increase public trust in research.”

Facial Recognition Is Racist. Why Aren’t More Cities Banning It?

Pockets of cities and states around the United States have banned police use of facial recognition, but progress is slow.
May 25, 2021, 4:01am
Image: Getty Images

Early in 2020, a coalition of civil rights and community groups in Minneapolis joined forces to campaign for a sweeping city ordinance that would have given residents unprecedented insight into the Minneapolis Police Department’s use of surveillance technologies, and the power to decide how those tools were used in the future. It promised to be an uphill battle. 

Then, on May 25, Minneapolis police murdered George Floyd.


Amid the grief, anger, and calls for wholesale public safety reform that followed, the coalition decided to reset and laser-focus on banning facial recognition—a technology with a well-documented history of bias against women and people of color, and that many feared police were using to identify and punish protesters. 

After months of advocacy, on February 12, the Minneapolis City Council voted unanimously to ban the MPD from using facial recognition, making it only the 16th (and most recent) American city to do so.

“This technology has been used for a long time, and for it only to be in the popular imagination right now and for such a small number of cities to pass [bans on] it is interesting,” Munira Mohammed, a policy associate with the ACLU of Minnesota who worked on the campaign, told Motherboard.

Image: Daisy Wardell

She said it’s impossible to know how the original Public Oversight of Surveillance Technology and Military Equipment (POSTME) ordinance might have fared in Minneapolis had police not killed Floyd and residents not taken to the streets. “It would have been a harder argument to make for your average, middle-aged, white resident of Minneapolis. But after that summer, everyone had a kind of reawakening,” she said. “People were more interested in reforming the department than ever before.”

The effort to ban facial recognition in Minneapolis is illustrative of how hard it has proved to pass similar measures in cities and states around the country. Polls have shown that around half of Americans trust police to use facial recognition, but those same surveys have also found that respondents have misconceptions about the accuracy of the technology. And in many cases, Americans are unaware that their local law enforcement agencies are using facial recognition.


Successful bans have often come after investigative reporting or lawsuits uncovered that police were lying about not using facial recognition, or as in Minneapolis, after tragedies.

In 2019, it looked like facial recognition bans might sweep the country. San Francisco was the first city to ban police use of the technology in May 2019. It was quickly followed by Somerville, Massachusetts, that June, and Oakland that July.

By the end of the year, four other cities—Alameda and Berkeley in California, Brookline and Northampton in Massachusetts—had also introduced bans. The ordinances continued to spread in Massachusetts to Boston, Cambridge, and Springfield through the summer of 2020. This timeline was compiled using data collected by organizers with Fight for the Future.

As civil rights and police reform protests swept across the country following Floyd’s murder and the police killing of Breonna Taylor and other Black Americans, activists hoped the movement would spur a wave of surveillance reforms.

Jackson, Mississippi, and Portland, Maine, became the first cities outside California and Massachusetts to ban police use of facial recognition, in August 2020. The next month, Portland, Oregon, passed the most comprehensive ban in the country, prohibiting not just government use but also many applications of facial recognition by private companies. And in October, Vermont became the first state to ban police use without express permission from the state legislature.


But the pace of the successful efforts has slowed. A rule regulating the use of the technology in Pittsburgh passed, but with loopholes that prompted one city councilor to call it “irrelevant.” Madison, Wisconsin, prohibited some government agencies from using facial recognition, but left in large carve-outs for police.

In December 2020, after months of community organizing and being told by the New Orleans Police Department that the agency did not use facial recognition, activists there finally learned the truth: police in the city were using the technology. They then managed to push through an ordinance that banned police use of facial recognition and imposed a number of other regulations on surveillance technology.

Organizers with Eye on Surveillance, a New Orleans-based privacy group, told Motherboard that they fought through constant pushback and obfuscation from law enforcement and that the measure had to be “dragged across the finish line” and may already be facing efforts to reverse it. “We all put an insane amount of labor into this and were relentless in our work with city council … never backing down and never losing sight of the goal,” said Blair Minnard, an organizer with Eye on Surveillance.

In April, Virginia became the second state and most recent jurisdiction to ban police facial recognition. As with New Orleans, the effort there gained momentum after journalists revealed that police officers were using Clearview AI’s software, sometimes without the knowledge of their own superiors. But rather than a slogging fight, the legislation’s sponsor, Delegate Lashrecse Aird, said she was surprised to find bipartisan support. 


Her original bill would only have required local governments to sign off before police could use facial recognition. But after finding support for an outright ban in the Senate, Aird reworked the legislation and it passed with overwhelming support.

Motherboard asked the organizers of successful facial recognition bans why they believe similar laws haven’t spread further. Mohammed, from the ACLU of Minnesota, said that the law enforcement lobby is a powerful force, especially in state legislatures. That forces many opponents to pursue prohibitions at the city level, limiting the effects and creating opportunities for loopholes that allow police to share technology with other agencies. Without a precipitating tragedy or being able to point to direct evidence of how the technology harms local people, momentum for the efforts can quickly decline.

Marvin Arnold, with Eye on Surveillance, said many local officials have misconceptions about how facial recognition can be and is being used. One of the biggest hurdles the New Orleans coalition had to overcome was convincing a city councilor that facial recognition and other surveillance tools weren’t going to stop an epidemic of illegal trash dumping in her district.

Aird added that it can be exhausting to explain the research underlying facial recognition bans to other politicians—it took her several years and legislative sessions just to capture enough attention.

“I truly believe that the conversations we are not having around AI and facial recognition technology is a huge missed moment and a huge mistake,” she said. “All the challenges from the civil rights movement that we are battling right now, they are showing up from a technological standpoint and we are not paying attention to it. We’re ignoring it.”


Before Texas, Feds Warned of ‘Catastrophic’ Blackouts Due to Fossil Fuel Failures—But Did Nothing

A previously ignored 2018 study called "Surviving a Catastrophic Power Outage" predicted much of what happened in Texas last week with shocking prescience.
February 25, 2021, 3:18pm
Image: THOMAS SHEA/AFP via Getty Images

Three years before Texas experienced the largest forced power outages in American history, President Trump was warned by his own advisors that the US was in danger of experiencing a “catastrophic power outage” that would “outmatch” existing national plans and capabilities. Among the main culprits of this vulnerability, they argued, was an aging electricity grid heavily dependent on oil and natural gas. 


Blackouts that rolled across Texas last week left more than 4 million Americans without power for days in below-freezing temperatures. The power outages also left millions without safe drinking water as water pumps and treatment plants went down. Dozens died as a result of the extreme cold. 

This crisis was not only foreseeable, but warned about repeatedly over many years. Its principal cause was not the shutdown of wind and solar power, as falsely claimed by Texas Governor Greg Abbott. Nor did it have anything to do with the Biden administration’s Green New Deal, as Republican minority leader Kevin McCarthy implied. 

The warning to Trump 

Although the narrative of renewables bringing down the grid has whipsawed across conservative media outlets, the real vulnerabilities were explained in detail in a US government study commissioned by the Trump administration. Before this article, the report’s findings had not been resurfaced since the Texas disaster. 

The study, Surviving a Catastrophic Power Outage, was published by the President’s National Infrastructure Advisory Council (NIAC) in 2018. Its core finding now seems unnervingly prescient: 


“After interviews with dozens of senior leaders and experts and an extensive review of studies and statutes, we found that existing national plans, response resources, and coordination strategies would be outmatched by a catastrophic power outage… that could leave large parts of the nation without power for weeks or months, and cause service failures in other sectors—including water and wastewater, communications, transportation, healthcare, and financial services—that are critical to public health and safety and our national and economic security.”  

The report urged the need for “significant public and private action” to prepare for this eventuality, including recommendations for all levels of government: federal, state, territorial, and local. Yet few if any of these recommendations were actually acted on. 

Although it mentioned a few examples of potential triggers for a catastrophic blackout, from natural disasters like wildfires to “cyber-physical” attacks, the report avoided direct discussion of what might cause such a crisis. 

It’s the oil and gas, stupid!

Even so, the report nowhere mentions renewable energy as a potential reason for America’s growing vulnerability to blackouts. Instead, it devotes an entire section to vulnerabilities in the “Oil and Natural Gas Sector.” 

The report warned that “a break in the flow of electricity could disrupt the flow of gas when backup power runs out, which could cause further fuel interruptions. This growing interdependency creates risks of cascading, mutually-reinforcing failures across both the Electricity Subsector and Oil and Natural Gas Subsector.” 


It went on to describe this risk in considerable, almost excruciating detail, essentially highlighting that the fundamental danger was too much dependence on hydrocarbon energy sources:  

“Reliance on a single fuel creates the danger of ‘common mode failures’ where a lack of natural gas incapacitates multiple generators simultaneously, which could create power outages lasting a month or longer over multiple regions of the US.”  

In other words, the Texas blackout, unprecedented as it was, came nowhere near the scale of the worst-case scenarios identified in this document.

“During a catastrophic outage, on-site fuel supplies for emergency generators will quickly be depleted,” the report went on. “Massive, multi-sector requirements for fuel resupply would occur, and contractors responsible for resupply operations will likely be unable to meet these requirements.”  

It specifically gave New England as an example, noting that “it has a high reliance on natural gas supply and LNG imports to meet winter peak loads.” A limited number of pipelines and supporting gas infrastructure means that if even a single natural gas compressor station or other gas system facility is lost, this “would create recurring energy shortages that would cause frequent and long rolling blackouts.” 

Triggering catastrophe – a forewarning of complex emergencies 

A prolonged blackout could in turn disrupt supply chains, the report found. “Natural gas disruptions could also cause ripple effects across supply chains, whose corruption could pose significant challenges.”  

Especially vulnerable are water supplies, due to the mutual interlinkages between water and energy through mining, fuel production, hydropower, and power plant cooling. Noting that water pumping, treatment, and distribution need energy (along with the discharging of wastewater), the document warned that a major driver of risk is the projected 50 percent rise in water consumption from 2005 to 2030. The other problem is that the thermal power plants that burn fossil fuels require increasing quantities of water for cooling.  


“Without water services, factories shut down, hospitals close, communities are disrupted, and most hotels, restaurants, and businesses cease operations,” the report warned. “If water and wastewater systems failed in communities across multiple states or US regions, the societal consequences and risk to the lives and safety of affected populations would be difficult to overestimate.” 

Head in the sand 

The problem is that the Trump administration’s most important proposed solution to build resiliency was no solution at all. Central to its list of recommendations was the report’s myopic focus on ensuring that “all critical natural gas transmission pipeline infrastructure has the appropriate standards, design, and practices to continue service during a catastrophic power outage and maintain rapid availability.”  

Yet the document failed to address the underlying problems of an aging and outmoded industrial-era national grid that is no longer fit for purpose—incapable of withstanding the converging forces of rising demand and accelerating climate disasters—which helps explain why utilities have done little to upgrade the ailing grid. 

That’s why what happened in Texas was unfortunately a taste of things to come on a business-as-usual trajectory. 

Blackout future? 

Seven years ago, I reported for Motherboard that “industrialized countries face a future of increasingly severe blackouts… due to the proliferation of extreme weather events, the transition to unconventional fossil fuels, and fragile national grids that cannot keep up with rocketing energy demand.” 

The prediction was based on a study published in the Journal of Urban Technology, whose authors, professors Hugh Byrd and Steve Matthewman, described blackouts as “dress rehearsals for the future in which they will appear with greater frequency and severity.” 


Since then, the data has borne out this grim prediction. Over the past two decades, the US has experienced an increasing number and intensity of blackouts: more than 2,500 major outages since 2002, nearly half of which were caused by severe weather events. Weather-related blackouts have increased by 67 percent overall since 2000, affecting two thirds of American states. 

Data from the US Energy Information Administration shows that since 2013, the average duration of power blackouts experienced by US citizens has roughly tripled, from two hours to nearly six by 2018. The US now suffers from 147 big blackouts a year, and rising. 

Texas: oil state in denial 

Texas is among the worst offenders, having experienced 105 blackouts over the last two decades, at increasing frequency. 

In the February 2021 crisis, the state’s own Electric Reliability Council confirmed that natural gas providers were the primary cause of the crisis. Nearly half the state’s gas production was shut down, along with at least one nuclear plant. Meanwhile, wind turbine failures accounted for just 13 percent of outages. In fact, thermal sources such as coal, gas, and nuclear lost nearly twice as much power to the cold as renewable energy sources did. 


And the crisis in Texas did not stay in Texas.  

Blackouts swept across Ohio, Mississippi and other states, and left 2.5 million without power in seven northern Mexican states. A third of US oil production ground to a halt, and vaccination efforts in 20 states were disrupted.

This sort of scenario involving a rolling series of power, water, and infrastructure crises triggered by extreme weather is consistent with the warnings of a separate report commissioned by the Pentagon in 2019. That document went so far as to warn that the US military itself might be at risk of collapsing in two decades under the strain of responding to complex domestic emergencies triggered by climate change.

In this context, the Texas blackouts are a dry run for the severe costs of business-as-usual, pointing to the urgent need to invest in real alternatives.

So what next? 

Perhaps most ironic of all is the fact that the Trump administration had separately commissioned research which identifies the alternative.  

Another 2018 report published by the US Department of Energy’s National Renewable Energy Laboratory (NREL) found that solar panels combined with battery storage could provide a more resilient power system to protect against blackouts than gas or diesel.  

“Diesel generators are often viewed as the default solution for providing resilient power, but they might not always be the most reliable or cost-effective solution,” the report stated. “Reliance on traditional fuel reduces an energy system’s resilience because a disruption or contamination in the fuel supply can cause vulnerabilities. Using solar power to charge on-site energy storage offers unique benefits that traditional diesel-fueled backup power systems cannot. As a result, solar technology combined with energy storage is increasingly being implemented in resilient power system designs.” 

The report concluded that as extreme weather and widespread power outages are increasingly revealing the limitations of traditional solutions, “more businesses and building owners are likely to consider the value of resilience and the viability of PV and storage to avoid outage-related losses.” 

But this hasn’t happened in Texas—where incumbent fossil-dependent utilities hold sway, and appear unwilling to concede ground despite the horrific costs. 

The case for transitioning to a different energy and electric system, however, could not be stronger, especially when that case is inadvertently corroborated by multiple US government research studies commissioned under Trump.
