Zeldin Wants to “Reconsider” the EPA’s GHG Endangerment Finding. He Can’t Bury the Facts on Climate Science.
In a blitz of destructive actions announced last month, EPA Administrator Lee Zeldin specifically called for a reconsideration of the 2009 Endangerment Finding. A formal proposal to reconsider the Finding (and all the agency regulations and actions that depend on it) is expected this month. The science underpinning the Endangerment Finding is airtight, but that won’t stop the Trump administration from setting up a rigged process to try to undo it and give a blank check to polluters. The Union of Concerned Scientists will fight back to defend climate science and protect public health safeguards.
In an earlier post, I laid out some of the history and context for the 2009 science-backed Endangerment Finding and the Cause or Contribute Finding. These findings followed from the landmark 2007 Massachusetts v. EPA Supreme Court ruling, which held that greenhouse gas emissions are unambiguously air pollutants covered by the Clean Air Act. Together, these establish the clear basis for EPA’s authority and responsibility under the Clean Air Act to set pollution limits for heat-trapping emissions from vehicles, power plants, and other sources of these pollutants.
Attacks on the Endangerment Finding and EPA’s Clean Air Act authority from industry interests are nothing new. Importantly, courts have repeatedly upheld both, including in a resounding 2012 decision from the U.S. Court of Appeals for the D.C. Circuit in Coalition for Responsible Regulation v. EPA. But those who have long sought to overturn or weaken regulations limiting heat-trapping emissions now have Administrator Zeldin in their corner. And he has shown himself to be an unbridled purveyor of disinformation and proponent of harmful attacks on bedrock public health protections, as my colleague Julie McNamara highlights.
The details of what will be included in the reconsideration proposal are unclear at this point. But we do know some of the trumped-up lines of attack the Zeldin EPA could advance to try to invalidate these Findings because many of these tired arguments are outlined in EPA’s reconsideration announcement.
Here are the facts:
Fact #1: The science backing the Endangerment Finding is beyond dispute
Every major scientific society endorses the scientific consensus on human-caused climate change driven by GHG emissions. The Fifth National Climate Assessment and the IPCC’s Sixth Assessment Report are two major recent authoritative summaries of peer-reviewed climate science, which show that the science on climate change has only become more dire and compelling since 2009.
The impacts of climate change on human health are also starkly clear and backed by overwhelming evidence. Here’s the main finding from the NCA5 chapter on public health, for instance:
Climate change is harming physical, mental, spiritual, and community health through the increasing frequency and intensity of extreme events, higher incidences of infectious and vector-borne diseases, and declines in food and water security. These impacts worsen social inequities. Emissions reductions, effective adaptation measures, and climate-resilient health systems can protect human health and improve health equity.
As just one example, climate change is contributing to worsening extreme heat, which exerts a punishing toll on people’s health, including that of outdoor workers. Heat is already the leading cause of extreme weather-related deaths in the United States, and studies show that heat-related mortality is on the rise.
Looking around the nation, with communities reeling from extreme heatwaves, intensified hurricanes, catastrophic wildfires and record flooding, climate impacts are the lived reality of all too many people. To deny that or obfuscate about the underlying causes is not only disingenuous, but actively harmful and outright cruel.
Fact #2: The law requires an independent scientific determination of endangerment, unhindered by cost considerations
A Finding of Endangerment under the Clean Air Act is specifically focused on a threshold scientific determination of whether the pollutant under consideration harms public health or welfare. Costs to industry of meeting any subsequent regulations are not relevant per the statute.
The original Endangerment Finding was reached in the context of vehicle emissions, under section 202(a) of the Clean Air Act, partially excerpted below:
The Administrator shall by regulation prescribe (and from time to time revise) in accordance with the provisions of this section, standards applicable to the emission of any air pollutant from any class or classes of new motor vehicles or new motor vehicle engines, which in his judgment cause, or contribute to, air pollution which may reasonably be anticipated to endanger public health or welfare.
In its 2012 decision, the D.C. Circuit was also clear in noting that “[b]y employing the verb ‘shall,’ Congress vested a non-discretionary duty in EPA.” That duty is not circumscribed by cost considerations.
Of course, the impacts of climate change are themselves incredibly costly and those costs are mounting as heat-trapping emissions rise. Unsurprisingly, the social cost of greenhouse gases, a science-based estimate of those costs, is another metric that the Trump EPA is seeking to undermine in yet another blatant attempt to put a thumb on the scale in favor of polluting industries.
Fact #3: EPA used well-established methodologies in its assessment of six GHGs
As noted in the 2009 Endangerment Finding, the EPA defined the pollutant contributing to climate change as “the aggregate group of the well-mixed greenhouse gases” with similar attributes. The attributes include that they are sufficiently long-lived, directly emitted, contribute to climate warming, and are a focus of science and policy.
The EPA used a very well-established scientific methodology to combine emissions of GHGs on the basis of their heat-trapping potential, measured in CO2 equivalents. Passenger cars, light- and heavy-duty trucks, buses, and motorcycles—the transportation sources EPA considered for the original endangerment finding—emit four key greenhouse gases: carbon dioxide, methane, nitrous oxide, and hydrofluorocarbons.
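For readers who want to see the arithmetic, here is a minimal sketch of how CO2-equivalent accounting works. The global warming potential (GWP) values are published 100-year figures from the IPCC Fourth Assessment Report; the emission quantities are hypothetical, for illustration only, and are not EPA figures.

```python
# Minimal sketch of CO2-equivalent accounting. GWP values are
# 100-year figures from the IPCC Fourth Assessment Report; the
# emission quantities below are hypothetical.

GWP_100 = {
    "CO2": 1,          # carbon dioxide (reference gas)
    "CH4": 25,         # methane
    "N2O": 298,        # nitrous oxide
    "HFC-134a": 1430,  # a common mobile air-conditioning refrigerant
}

def co2_equivalent(emissions_tonnes):
    """Weight each gas by its GWP and sum, yielding tonnes of CO2e."""
    return sum(GWP_100[gas] * t for gas, t in emissions_tonnes.items())

# Hypothetical annual fleet emissions in tonnes:
fleet = {"CO2": 1_000_000, "CH4": 50, "N2O": 30, "HFC-134a": 12}
print(f"{co2_equivalent(fleet):,.0f} tonnes CO2e")  # 1,027,350 tonnes CO2e
```

The point of the weighting is simply that a tonne of methane or nitrous oxide traps far more heat than a tonne of carbon dioxide, so the gases can be compared and summed on a common scale.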
False, glib claims in the reconsideration announcement baselessly accuse the 2009 Endangerment Finding of making “creative leaps” and “mysterious” choices. There is nothing mysterious about the heat-trapping attributes of greenhouse gases, nor their impact on public health. It’s called science. Once again, relying on the mountain of evidence in the peer-reviewed scientific literature would make that readily apparent.
Fact #4: EPA has the responsibility and authority to regulate major sources of GHGs
The Cause or Contribute Finding—which specifically established that greenhouse gas emissions from new vehicles contribute to the pollution that harms public health—may also come under attack. This finding has been extended to other major sources of GHGs, including power plants and oil and gas operations. However, the Trump administration could attempt to use accounting tricks to avoid regulating emissions—as it has tried before.
In its first term, the administration attempted multiple underhanded maneuvers along these lines, including in the context of methane and VOC regulations in the oil and gas sector. For these regulations, the administration split up segments of the source category, designated them as separate source categories, used that manipulation to claim an inability to regulate certain segments, and asserted that methane emissions from the remaining segments were too small for regulation to provide additional benefits, so those too could not be regulated. Separately, in the final days of the administration, EPA released an absurd framework attempting to set thresholds for determining “significance,” trialed in the context of power plants.
This irrational approach could be used to artificially segment components of power plants or the power system, for example, and then claim no regulations are required. This kind of rigged math wouldn’t fool a kindergartner, but there’s no telling where this administration might go in its desperate attempt to undo or weaken regulations on greenhouse gas emissions.
Zeldin’s relentless subversion of EPA’s mission
Under Administrator Zeldin, EPA’s mission to protect public health and the environment has been completely subverted. His shocking rhetoric lays bare how far he will go to protect polluters at the expense of the public. Here he is, for instance, crowing about going after 31+ EPA regulations and guidance, as well as the enforcement of pollution standards meant to protect all of us:
“Today is the greatest day of deregulation our nation has seen. We are driving a dagger straight into the heart of the climate change religion…”
EPA even set up an email address that polluters can use to request a presidential exemption from complying with regulations on toxic pollution, such as mercury emissions, regulated under the Clean Air Act!
Zeldin is fervently committed to dismantling public health protections and rolling back enforcement of existing laws passed by Congress. Going after the Endangerment Finding is an integral part of this all-out assault because, in the Trump administration’s harmful calculation, revoking the Finding is a potential means of rolling back all the regulations that depend on it.
Ironically, some utilities and oil and gas companies have spoken out in favor of keeping the Finding intact, as they fear a greater risk of climate damages lawsuits in the absence of EPA authority to regulate greenhouse gases. Of course, this just exposes that they know their products are causing damage. What they seek is the weakest possible exercise of EPA authority so they can continue to reap profits while evading accountability for those harms.
We can fight back with science
But none of this is a foregone conclusion. The legal and scientific basis for the Endangerment Finding is incredibly strong. The false claims Zeldin and other opponents have trotted out are full of bombast but weak on substance.
The science on climate change is so indisputably well established that it’s hard to see how any court would uphold a challenge to it. That’s not to say Zeldin won’t try to find a cabal of fringe “scientists” to attack it, but they’re unlikely to succeed on the merits.
Public comments on the proposal to reconsider the Endangerment Finding can help set the record straight on the facts. And if the Zeldin EPA ignores them and finalizes a sham Finding, or revokes the Finding with a faulty rationale, that will be challenged in court.
UCS will be closely following the details of EPA’s proposal to reconsider the Endangerment Finding when it is released. And we will let you know how you can add your voice to bolster this crucial science-based Finding, and the public health protections that flow from it. So, stay tuned!
Why the Climate Accountability Act Matters to Me—and Wisconsin
Last month, I was invited to speak at a press conference alongside Wisconsinites from across the state for the launch of the Climate Accountability Act. At just sixty words, it’s a simple but powerful bill with the potential to make our communities healthier, advance racial equity, and drive our state’s economy forward:
In the 2025-26 legislative session, the legislature shall pass legislation creating a viable plan to reduce carbon emissions in this state by 52 percent by 2030 and creating a viable plan for achieving carbon neutral emissions in this state by 2050. Any plan enacted under this subsection shall maximize the impact of the plan on improving economic and racial equity.
The bill, introduced by State Representative Supreme Moore Omokunde and State Senator Chris Larson, creates an enforceable timeline with specific objectives, allowing flexibility for discussions of the various technology and policy approaches to come later. With nearly 20 legislative cosponsors and a broad coalition of partners—including the Union of Concerned Scientists (UCS)—signed on in support, Wisconsin has the unique chance to prove that states can, and should, take the lead in preparing communities for a more resilient future.
From technical details to personal stakes
As a Senior Energy Analyst with UCS, my work typically involves diving deep into the details of specific pieces of regulation and legislation in Wisconsin and surrounding states: working on comments about archaic electric metering requirements in Wisconsin, writing testimony about interconnection rules in Michigan, or modeling the role of energy storage in Illinois’ clean energy future.
So, when I first heard of the Climate Accountability Act—at a mere two sentences—I could have brushed it off as too high-level. Call me when we get to the nitty gritty! But in reality, as I pointed out in my remarks at the press conference launch, I have a huge stake in Wisconsin’s climate—this state is my home.
I came to Madison ten years ago to pursue a master’s in electrical engineering. Eventually, I met my wife, bought a home, and welcomed a son into the world who is now three years old. Madison is our home. That’s why I believe the Climate Accountability Act is a critical step for Wisconsin, especially given all the ways the federal government is trying to move us backward on addressing climate change.
The cost of inaction
In my own comments, I highlighted some of the ways that the lack of a climate plan affects Wisconsinites, drawing on my colleagues’ research. In particular, I pointed to the UCS analysis of the negative health and economic impacts of the new fossil gas plants that We Energies is proposing in Oak Creek and Paris, WI.
My colleagues recently led an analysis showing that pollution from these proposed plants could result in nearly $6 billion in health and economic costs over their thirty-year lifespan. That’s a big, scary number—but it translates to an even scarier reality: nearly 400 premature deaths, 300 ER visits, and 900 new cases of asthma.
Since the press conference, our Wisconsin coalition partner RMI has released additional analysis finding that the proposed Oak Creek plant will cost ratepayers more than $1.25 billion in higher energy costs compared to cleaner alternatives.
To underscore the negative impacts of fossil fuels on our grid, I also pointed to key research around resilience. In a study we completed last year, we found that during five major power outages around the U.S., gas plants were more likely to fail while resources like wind and solar helped keep the lights on.
While the Climate Accountability Act doesn’t directly address the proposed plants, the setting in which the utility proposed them is important to understand. Although Wisconsin was one of the first states to enact a renewable portfolio standard (RPS), requiring 10% of the state’s energy to come from renewables, the requirement expired in 2015 with no meaningful updates since. And unlike most of our neighbors in the Midwest, Wisconsin is also unique in lacking a requirement for a detailed, public, forward-looking planning process known as integrated resource planning.
Within this context, Wisconsin utilities are unique in their dogged pursuit of new gas capacity—EIA data indicates that natural gas made up only 7% of U.S. planned capacity additions in 2025, with the bulk of these new plants planned for states without a current clean energy standard (the proposed Wisconsin plants won’t show up in the EIA data unless they are approved by the state). With a robust climate plan, Wisconsin utilities would have to look beyond their legacy preference for fossil fuels and consider cleaner, cheaper, and more reliable alternatives.
Climate change threatens Wisconsin’s future
While I focused on the energy sector in my comments, the impacts of climate change—and the importance of the Climate Accountability Act—are far-reaching for Wisconsin and other Midwest states.
Our 2019 Killer Heat report found that without action on climate change, the number of days each year with a heat index over 100°F in the Midwest will jump 783%, from six to 53. The impact will be felt most acutely by outdoor workers and vulnerable populations, including children, the elderly, and those who are pregnant. The report also highlights how centuries of social and economic discrimination have increased the exposure of BIPOC communities and individuals to the risks of heat-related illnesses and injuries.
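For readers checking the arithmetic, the percentage follows directly from those day counts:

$$\frac{53 - 6}{6} = \frac{47}{6} \approx 7.83 \approx 783\%$$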
In 2020, we highlighted research finding that the combined effects of rising temperatures and increased CO2 concentrations reduce yields from corn and soybean crops in the Midwest, harming local economies.
That’s just a handful of examples focused on the Midwest—at a higher level we know that climate change will also make winter storms worse, increase the risk of wildfires, and lead to more floods.
And, as the sponsors of the Climate Accountability Act recognized with the inclusion of the critical second sentence, these impacts will most acutely affect environmental justice communities and others who have been historically marginalized. Any plan addressing climate change in Wisconsin must focus on addressing these historic harms.
Source: Matt Brusky/Citizen Action
“The fierce urgency of now”
In his opening remarks at the press conference, the bill’s sponsor in the Assembly, Representative Supreme Moore Omokunde, quoted the words of Martin Luther King Jr. while reflecting on “the fierce urgency of now.” I’ll close with his call to action:
“We cannot continue to burn fossil fuels with no plans to seek alternatives that are best for urban and rural Wisconsinites. We must develop a plan now that centers racial and class equality, and gets us on a path to net zero.”
Click here to find out more about the bill and how you can support it.
FIRO to Avoid Water FOMO: How to Save Every Drop with Smart Reservoir Operations in California
Happy Water Week!
Have you ever seen the Sierra Nevada of California from the San Joaquin Valley in the early spring on a clear day? When the Sierra has snow and the air quality allows us to see it from here, that view is second to none.
Every year at this time when I look at the Sierra from the Valley, I know if I see little snow, it means it’s a dry year. When there is plenty of snow like now, I know it means less struggle with water supplies during the summer but also potential floods. Floods can come from rain-on-snow events and from high spring temperatures that melt the snow faster, and climate change is triggering earlier and faster snowmelt.
For us water nerds, this view is more than a beautiful landscape. The snowpack is our main water storage in California after groundwater. This is a photo of the southern part of the Sierra Nevada in 2023 seen from Tulare County. Photo: Ángel S. Fernández-Bou.
Last week’s snow survey confirmed what I saw in the mountains. The California Department of Water Resources reported that the state’s snowpack measured 96% of average at its peak on April 1. There is nuance, since the north got 120% and the south only 84%. We can say this is relatively good news, but we also have to remember that the last three years of near-average snowpack followed a severe drought from 2020 to 2022, the driest three-year period ever recorded in California.
These climatic extremes and the weather whiplash we experience here are becoming more frequent with climate change, and that’s why we need to plan for both flooding and the next dry period that could be just around the corner.
Snowpack of the Sierra Nevada (north, central, and south) presented as the percentage of the historical average snowpack on April 1st. While the north has more than the historical average, the south has less. At the state scale, the snowpack is approximately the historical average, but there will be more water in the north and less in the south than average.
As we mark Water Week 2025, preparing for extremes is critical for modernizing our water management. In past years, supercharged snowmelt has led to flooding and dam safety concerns. For example, in 2017 nearly 200,000 residents had to be evacuated below the Oroville Dam due to fears of collapse after a rain-on-snow event.
Oroville spillway damage in 2017. Source: DWR
That’s why it’s vital that the state is working with the scientific community on a new management strategy to reduce flood risk for downstream communities and benefit water supplies during dry periods. It’s called Forecast Informed Reservoir Operations, or FIRO, a new approach that can help us more flexibly manage water extremes.
Many of the reservoirs in California are managed so they have space to capture flood water, avoiding flooding damage and hazards, while they are also used for water storage. Without FIRO, reservoirs are managed with fixed calendar-based rules that dictate how much water to keep in the reservoir for that time of year. FIRO enables reservoir operators to use forecasts to adjust the amount of water in the reservoir: releasing water ahead of major storms to reduce flood risk, and holding water in the reservoir when no major precipitation is forecast. FIRO benefits both sides of water management by mitigating flood risk and increasing water availability.
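To make the contrast concrete, here is a highly simplified sketch of the two operating rules for a single reservoir. The capacity, flood pool, and inflow numbers are hypothetical; real FIRO rules are built from detailed hydrometeorological modeling for each watershed.

```python
# Simplified contrast between a calendar-based rule and a
# forecast-informed (FIRO-style) rule for one reservoir.
# All volumes are hypothetical acre-feet.

CAPACITY = 100_000    # total reservoir storage
FLOOD_POOL = 20_000   # space a calendar rule reserves all wet season

def calendar_release(storage, month):
    """Fixed rule: keep the flood pool empty through the wet season,
    no matter what the weather is actually expected to do."""
    wet_season = month in (11, 12, 1, 2, 3, 4)
    target = CAPACITY - FLOOD_POOL if wet_season else CAPACITY
    return max(0.0, storage - target)

def firo_release(storage, forecast_inflow):
    """Forecast-informed rule: release only enough to absorb the
    inflow the forecast actually predicts; otherwise hold the water."""
    target = CAPACITY - forecast_inflow
    return max(0.0, storage - target)

storage = 95_000
print(calendar_release(storage, month=1))             # 15,000 released regardless
print(firo_release(storage, forecast_inflow=2_000))   # 0 released: no storm coming
print(firo_release(storage, forecast_inflow=12_000))  # 7,000 released ahead of a storm
```

The calendar rule dumps water in January even in a dry spell; the forecast-informed rule releases only when a storm is actually on the way, which is exactly the water-saving behavior described above.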
FIRO allows reservoir operators to keep water in the reservoir for future uses. In other words, FIRO avoids the fear of missing out (FOMO) on water that you could have stored if you had better forecasting.
FIRO started in California and has since gone worldwide
Once upon a time, there was (and still is) a megadrought in California that peaked in the acute drought of 2012 to 2016. If you’re a water nerd, you may know that drought triggered the Sustainable Groundwater Management Act (SGMA), which itself triggered the need to strategically repurpose about 1,000,000 acres of irrigated cropland in the state. At the time, water managers were watching extremely valuable water being released from reservoirs for flood prevention even when no rain was forecast and there was no snowpack left to melt. And they wanted to do something.
The first FIRO pilot project was in Lake Mendocino on the Russian River in Northern California. There, a group of scientists, water managers, and engineers worked with the Army Corps of Engineers, the National Oceanic and Atmospheric Administration (NOAA), Scripps Institution of Oceanography, and the California Department of Water Resources to find a solution. The key? Our ever-improving hydrometeorological forecasts, which give us more accurate predictions of temperature, precipitation, and streamflow. Our scientific knowledge about climate, meteorology, and hydrology improves every year thanks to federal agencies like NOAA and NASA, and their partnerships with the research community.
The accuracy of weather forecasts has improved a lot over the last decades. At present, we have very high accuracy for a 3-day forecast. With more research and faster supercomputers, we will be able to increase our ability to forecast with greater lead times, which can translate into better control of our reservoir operations.
Since then, FIRO-like approaches have appeared in other parts of the country. For example, Seattle can soon expect a better balance between flood protection and water availability, as operators are planning to use FIRO at the Howard Hanson Dam in the Green River watershed. In the Midwest, Lake Erie has the LEOFS (Lake Erie Operational Forecast System) to better manage water levels affected by seasonal variations and climate change. The Tennessee Valley Authority is also relying on this kind of flood management during extreme precipitation events, especially because of the more frequent hurricanes and climate extremes the South is experiencing.
Outside the United States, countries like Australia and Japan, as well as the Mediterranean region, are also starting to include meteorological forecasts in their reservoir operations.
The benefits of implementing FIRO
FIRO’s power lies in its multifaceted benefits for water management. First, it can improve water availability when communities, farmers, and the environment need it most. By keeping water in reservoirs until meteorological forecasts indicate an actual need for flood prevention, we preserve our most precious resource for our Mediterranean summertime. This approach also offers more accurate flood management compared to calendar-based releases, as water is released only when meteorological forecasts coupled with hydrological models (what we call hydrometeorology) actually indicate a flood risk, rather than based on historical statistics.
FIRO can achieve increased water storage without requiring new infrastructure. In an era when building new dams faces environmental, social, and economic barriers, FIRO maximizes the efficiency of existing infrastructure through smarter operations. The precision offered by hydrometeorological forecasting also allows for more targeted environmental releases, helping ensure that downstream ecological needs are met when needed.
Finally, FIRO can contribute significantly to drought resilience—a critical concern as climate change intensifies dry periods in many regions. By retaining water during nonflood periods in the wet season, communities and farmers can save valuable water to protect themselves against drought conditions that might otherwise deplete water availability faster and trigger water use restrictions.
Potential challenges
Despite its clear advantages, implementing FIRO comes with several challenges that need to be considered. Forecast reliability is very high, particularly along the US West Coast, but not in all areas of the US or the world, and there is always uncertainty in any forecast. While meteorological forecasts become more accurate each year, dam operators must still account for the small-but-not-zero uncertainty in these predictions when managing flood risks. To account for that uncertainty, the use of ensembles and probabilistic forecasts is important. When uncertainty means releasing more water than might be optimal for flood protection, we can mitigate this by directing releases to aquifer recharge projects. In addition to providing underground storage, recharge projects can be used to combat subsidence impacts, protect groundwater levels for domestic wells, help groundwater-dependent ecosystems, and prevent seawater intrusion in coastal regions.
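As a sketch of what an ensemble-based, probabilistic decision can look like, the toy example below pre-releases water only when enough ensemble members forecast a flood-scale inflow. The member values, spill threshold, and 20% trigger are all hypothetical, chosen purely to illustrate the idea.

```python
# Toy example of an ensemble-based release decision. Inflow values,
# the spill threshold, and the 20% trigger are all hypothetical.

ensemble_inflows = [1_200, 1_500, 2_100, 2_400, 3_800,
                    900, 1_700, 2_000, 5_200, 1_400]  # acre-feet, 10 members

def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble members forecasting inflow above threshold."""
    return sum(m > threshold for m in ensemble) / len(ensemble)

SPILL_THRESHOLD = 3_000  # inflow that would encroach on the flood pool
p = exceedance_probability(ensemble_inflows, SPILL_THRESHOLD)

if p >= 0.2:  # pre-release only when enough members agree
    print(f"{p:.0%} of members exceed the threshold: make room now.")
else:
    print(f"Only {p:.0%} exceed the threshold: hold water for supply.")
```

The advantage of reasoning over the whole ensemble rather than a single forecast is that the decision explicitly reflects how much the forecasts disagree, which is exactly the uncertainty operators must manage.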
FIRO also faces implementation barriers on both technical and institutional fronts. Technically, it requires specialized expertise in meteorology, hydrology, and reservoir operations—skill sets that may not always be available in water management agencies. Institutionally, it demands a culture shift away from calendar-based operations toward more dynamic, forecast-based decision making, which can meet resistance in organizations accustomed to traditional approaches. Although transformative changes like FIRO can take time, both the U.S. Army Corps of Engineers and the Bureau of Reclamation are now actively supporting FIRO efforts.
Additionally, the significant variations in climate, topography, and reservoir characteristics across different regions mean that FIRO can’t simply be copied from one watershed to another. Each implementation requires tailored approaches based on local conditions. This variability also underscores the importance of bringing local communities to the decision-making table—they often hold valuable knowledge about watershed behavior and have important stakes in reservoir management outcomes that must be addressed for successful implementation.
The future is FIRO
Meteorological forecasts will continue improving through advances in climate science and supercomputing. Artificial intelligence (AI) is now enhancing this approach, making FIRO more effective. AI integration with weather models promises greater accuracy, enabling more precise decisions about water storage and releases. Highly accurate forecasts may soon extend from days to weeks, giving water managers even more time to prepare for extreme events.
As climate change intensifies both flood and drought extremes, FIRO and similar approaches are a necessity for water management, as recent legislation in California acknowledges in the Atmospheric Rivers Research and Forecast Improvement Program: Enabling Climate Adaptation Through Forecast-Informed Reservoir Operations and Hazard Resiliency (AR/FIRO) Program. AB30 (2023) updated current legislation to explicitly include FIRO as an emerging tool to better manage water scarcity and floods.
But the true revolution of FIRO extends beyond technology—it represents a fundamental shift in how we think about infrastructure. Rather than simply building bigger dams or higher levees, FIRO shows us that sometimes the most powerful solutions come from a smarter use of what we already have. This approach embodies the kind of adaptive thinking required in our changing climate, and reminds me a lot of our cropland repurposing work for smarter multiple uses of the land.
FIRO gives flexibility to water operations, and that flexibility is essential to adapt to climate change and its consequences in our water systems, such as earlier snowmelt, more frequent and extreme floods and droughts, warmer river water, more evaporation from lakes, seawater intrusion, subsidence, and overdrafted aquifers. As we face an uncertain climate future, approaches like FIRO that embrace uncertainty through better science will be crucial to sustaining our communities, economies, and ecosystems. Our water and environmental challenges ahead are immense, but if we trust science and we listen to people, I am optimistic that we can build a more resilient and sustainable water future.
7 Takeaways from Trump’s Disaster Preparedness Executive Order and What It Means for US
Bit by bit, President Trump has been chopping away at the Federal Emergency Management Agency (FEMA), downsizing it through cuts to the agency’s staff, programs, and mission. Reports last week revealed a daunting threat from Department of Homeland Security (DHS) Secretary Noem, who announced in a cabinet meeting: “We’re going to eliminate FEMA.”
Neither Secretary Noem nor President Trump has the legal authority to abolish FEMA—that power lies with Congress. Furthermore, dismantling or eliminating FEMA will endanger millions of people who rely on the agency to help prepare for and recover from disasters.
In fact, legislators from both parties have signaled the need for FEMA by introducing legislation to transform it into a stand-alone, cabinet-level agency, as it was before the Department of Homeland Security was formed. Republican Representative Byron Donalds and Democratic Representative Jared Moskowitz, both of Florida, introduced the bill on March 24, 2025.
Days before the DHS Secretary’s statement, on March 19, the president signed an executive order (EO) titled “Achieving Efficiency Through State and Local Preparedness.” This action follows his earlier EO in January establishing a FEMA review council. While sparse on details, this new order prompts a major change in federal policy.
While details are scarce in the EO, two clear themes emerge: shifting the burden of disaster response from the federal government to state and local governments—even for major disasters that overwhelm states’ capacities—and invoking frames like “streamlining” and “efficiency,” which this administration is already disingenuously using to decimate agency staffing and budgets to the point that agencies cannot fulfill their missions. Both will create significant risk and harm for communities in the path of disasters and reeling from them.
Below I provide a summary of the EO, including the new policy, initiatives, and updates to current federal policies, along with the takeaways for each.
1. Shifting the national resilience and preparedness burden to state and local governments
The most significant piece of policy in the executive order seems simply to put in writing what President Trump has been saying he wants all along: to shift more responsibility for disaster resilience, preparedness, and response onto the shoulders of state, local, tribal, and territorial governments.
If it comes to pass, disaster response and recovery will be more chaotic and ineffective, endangering more lives—especially the elderly, the young, those with disabilities, and others with fewer resources to prepare or evacuate. States, even larger states, don’t have the resources to handle catastrophic disasters. In those cases, governors will ask for a presidentially declared disaster.
Key takeaway: What will change for states?
Effective and well-resourced emergency preparedness and disaster response can mean the difference between life and death. Given that, the lack of details in the executive order (and fact sheet) is baffling, particularly considering the planning, time and level of funding that is needed for disaster preparedness and resilience. States will be unprepared to respond to major disasters on their own. That’s particularly dangerous during the summer months (May-October) when the risks of extreme heat, hurricanes, wildfires and floods tend to peak and collide—a time UCS calls “Danger Season.”
What level of coordinating capabilities, boots on the ground, financial resources and technical expertise can these state and local jurisdictions expect from FEMA when a major disaster is declared? This NPR article speaks to what will be at stake if FEMA is taken out of the equation:
“Without FEMA, states would need to find thousands of additional personnel to inspect damage, distribute disaster aid and plan the rebuilding of public infrastructure. Without federal funding, states would face billions of dollars in recovery costs. After Hurricane Irma in 2017, Florida relied on more than $5.5 billion dollars from the federal government.”
Of course, FEMA also helps with preparedness, as the article points out. And while there is a lot more that Congress and this administration could do to incentivize more emergency readiness by state, local, tribal and territorial governments, this administration hasn’t shown it understands the concept of reducing risk.
2. Develop a national resilience strategy
Section 3 of the executive order calls for a national resilience strategy to be developed within 90 days of the order. Specifically, it calls for the Assistant to the President for National Security Affairs Michael Waltz, in coordination with the Assistant to the President for Economic Policy Kevin Hassett, to develop “a national resilience strategy that articulates the priorities, means, and ways to advance the resilience of the Nation.”
Key takeaway: While a national resilience strategy would sound great under any other administration, I am extremely wary of what such a strategy could mean under Trump 2.0. Furthermore, this strategy already exists! In January 2025 the Biden administration released a National Resilience Strategy, which was one of the many climate resilience-related initiatives the Trump administration revoked. It’s hard to imagine a national resilience strategy that doesn’t put the climate crisis front and center.
A recent Forbes article underscores how climate change has ripple effects throughout the economy: it is changing how and where people live, shaking up real estate and insurance markets, wreaking havoc on local governments, and challenging their ability to provide basic services. And those who are hurt first and worst by climate change-related impacts are often those who live in the riskiest areas with the least resources. This is where and why decision-makers must include an equity lens in any kind of disaster assistance, adaptation, or resilience strategy.
My colleague Melissa Finucane explains that “without a focus on racial equity, disaster policies don’t just leave these communities behind, they in fact compound the health, environmental, and economic challenges being faced.”
It’s hard to imagine a valuable national resilience strategy being developed within a three-month timeframe. Based on what we’ve seen so far, one thing is for sure: we can expect the Trump administration’s national resilience strategy to be radically different from the Biden administration’s plan. It will also likely stand in stark contrast to how climate scientists, policy and planning experts, and emergency and floodplain managers understand resilience and how such strategies should be developed: informed by the latest science and public input, and prioritizing the needs of communities that have the fewest resources.
3. Review and revise national critical infrastructure policies
Section 3(b) directs the President’s National Security Affairs Assistant, in coordination with the Director of the Office of Science and Technology Policy (OSTP) and heads of relevant agencies, to review all critical infrastructure policies within 180 days and recommend “revisions, rescissions, and replacements necessary to achieve a more resilient posture; shift from an all-hazards approach to a risk-informed approach; move beyond information sharing to action; and implement the National Resilience Strategy…”
Key takeaway: Important policies that could be revised include National Security Memorandum 22 (NSM-22) of April 30, 2024, on Critical Infrastructure Security and Resilience, which replaced Presidential Policy Directive 21 (PPD-21). What I am particularly concerned about is what this administration might do to NSM-22, which was updated after a decade to address new, complex threats; while it has not been revoked, it has been taken down from the White House website. The modernized policy builds on PPD-21 and strengthens it by: 1) encouraging the private sector to comply with minimum resilience standards; 2) adding key new developments such as transitioning the energy and transportation sectors away from fossil fuels; and 3) including emerging threats such as climate change and supply chain disruptions.
Given that this president has dismantled many federal advisory councils, I hope the administration will maintain and include the National Infrastructure Advisory Council (NIAC) in this process, as it has been very productive, producing “30 in-depth studies” with many important recommendations—including this related one: that the nation must “Better prepare and respond to disruptions (like Superstorm Sandy) that can ripple across multiple infrastructure systems and paralyze services to entire regions.”
I will keep an eye out for how the administration will implement the “shift from an all-hazards approach to a risk informed approach,” as the language leaves emergency manager-types scratching their heads. NSM-22 updates the term “all hazards” to “all threats, all hazards” and defines it as “a threat or an incident, natural or manmade, that warrants action to protect life, property, the environment, and public health or safety, and to minimize disruptions of Government, social, or economic activities. It includes, but is not limited to: natural disasters, cyber incidents, industrial accidents, pandemics, acts of terrorism, sabotage, supply chain disruptions to degrade critical infrastructure, and disruptive or destructive activity targeting critical infrastructure.”
A risk-informed approach is one that evaluates all potential threats and hazards and identifies exposure and vulnerability. So could this be the administration’s “streamlined” and “efficient” approach to removing any climate change-related risk, such as sea level rise? Or is it instead a throwback to a time prior to PPD-21’s “all-hazards” risk approach, to one more narrowly focused on counterterrorism and infrastructure assets?
4. “Streamlines” the national continuity policy
Section 3(c) of the executive order calls for a review of all national continuity policies, with recommendations to the president from this review within 180 days of the order. Specifically, it states that the President’s National Security Affairs Assistant, in coordination with the heads of relevant agencies, “shall review all national continuity policies and recommend to the President the revisions, rescissions, and replacements necessary to modernize and streamline the approach to national continuity capabilities, reformulate the methodology and architecture necessary to achieve an enduring readiness posture, and implement the National Resilience Strategy described in subsection (a) of this section.”
Key takeaway: Continuity policies are just what they sound like: they provide a coordinated approach in the case of an emergency, whether a natural hazard or another type of disaster. These policies tie into one another, highlighting that the administration could indeed be dismantling FEMA policy by policy, in this case via FEMA’s continuity policy and toolkit, which helps ensure FEMA’s essential functions continue during emergencies and also helps communities understand how to maintain their own functionality.
I’m concerned that a so-called “streamlined” approach to continuity policies could mean critical pieces of budgets get cut and interagency projects and initiatives go unsupported, all of which would make communities less resilient.
5. Review and revise preparedness and response policies
Section 3(d) of the executive order calls on the President’s National Security Affairs Assistant and other relevant agencies’ heads to review national preparedness and response policies and recommend to the President the revisions, rescissions, and replacements necessary to “reformulate the process and metrics for Federal responsibility, move away from an all-hazards approach,” and implement the National Resilience Strategy within 240 days of the order.
Key takeaway: Instead of building on many years of plans and guidance on preparedness, this administration is underscoring its desire to downsize the federal responsibility and to ensure this “reformulation” is reflected in the new national resilience strategy. Within eight months, the Trump administration will have developed new policies and metrics defining the federal role in national preparedness and response and what that role will look like for states and other jurisdictions.
If the administration takes a hatchet to the current policies, we’ll likely see changes to documents such as the National Disaster Recovery Framework, which outlines five main areas including federal support to states and local jurisdictions and emphasizes the need for resilient and sustainable recovery planning, and the National Response Framework, which helps communities, citizens, businesses, and others build response plans. We will also get a better idea of what moving away from an “all-hazards approach” will look like, which doesn’t sound good no matter how you slice it.
6. Develop a national risk register
Section 3(e) of the executive order calls for a national risk register to be developed within 240 days of this order. Specifically, the order charges the President’s National Security Affairs Assistant, in coordination with the Director of the Office of Management and Budget (OMB) and the heads of relevant agencies, to coordinate the development of a national risk register “that identifies, articulates, and quantifies natural and malign risks to our national infrastructure, related systems, and their users. The quantification produced by the National Risk Register shall be used to inform the Intelligence Community, private sector investments, State investments, and Federal budget priorities.”
Key takeaway: To my knowledge, a national risk register does not exist. However, NSM-22 establishes a coordinated approach to federal risk management for critical infrastructure; the Cybersecurity and Infrastructure Security Agency (CISA) has a national risk management center; and FEMA developed a National Risk Index (NRI), which, while it has its flaws, is a tool communities can use to see their level of risk for 18 hazards, with overlays of social vulnerability, community resilience, and expected annual loss. FEMA also had a climate change risk tool, the Future Risk Index, which FEMA under the Trump administration (of course) removed; fortunately, scientists recovered it, and it is available for free on GitHub.
The bottom line: if the Trump administration does not account for climate change in the new risk register, the register will fail at its job of quantifying the risks to the nation’s infrastructure and will create a moral hazard.
7. “Streamlines” preparedness and continuity organizational structure
Section 3(f) calls on the Secretary of Homeland Security to “streamline” the federal government’s current national preparedness and continuity organizational structure, which spans six major functions, within one year, “to ensure State and local governments and individuals have improved communications with Federal officials and a better understanding of the Federal role.” The functions are: 1) the National Essential Functions, 2) Primary Mission Essential Functions, 3) National Critical Functions, 4) Emergency Support Functions, 5) Recovery Support Functions, and 6) Community Lifelines.
Key takeaway: Each of these functional categories plays a role in FEMA’s mission. As with the changes to related policies above, a simplified and streamlined version would do a disservice to FEMA’s ability to fulfill its mission. Agency staff already know what to do, so training the federal family to learn and implement a new organizational structure would take time and resources away from other efforts just as hurricane season approaches.
What’s next?
UCS will closely watch the outcomes of this executive order and this administration’s continued attacks on FEMA because it’s crucial that communities are protected and not sacrificed in the name of harmful and disingenuous efforts purported to advance “efficiency” and “streamline” the federal government’s response to disasters.
Many questions will go unanswered for now, as much of the EO’s implementation will be reviewed, written, and decided behind closed doors. In the interim, the FEMA Review Council just released a request for information asking the public to comment on improvements to, and overall experience with, FEMA during disasters; comments are due May 15, 2025. This is a critical opportunity for the public to weigh in on how important FEMA is in coordinating a “whole-of-government response in the period immediately after a disaster,” as two former FEMA administrators wrote.
Genuine reforms to FEMA should be informed by science, expertise, and the experiences of disaster survivors. Instead, this administration seems hell-bent on a cruel campaign of decimating an agency that millions of people rely on to stay safe and get back on their feet after floods, fires, hurricanes and more.
Given the plans to downsize FEMA even further, including additional staff reductions and a “streamlining” of the agency’s mission, staff roles, and the geographic footprint of its 10 regional offices, we need a bipartisan defense of the agency. This should be a clarion call for lawmakers to keep communities safe by preserving resources meant for FEMA and making science-informed, evidence-based changes to the agency.
Trump National Security Officials: Add NOAA to the Chat for Climate Literacy
Growl. Sigh. Rinse. Repeat.
Yet another resource that belongs to us, the US public, has disappeared down the Trump administration’s memory hole. I just learned from the valiant Environmental Data and Government Initiative that the National Oceanic and Atmospheric Administration has removed the 2024 Climate Literacy Guide from its website (though a data savior has preserved it here). Now, no one can access a fundamental federal resource that helps the public understand climate change via its proper home at https://www.climate.gov/.
Who needs the Climate Literacy Guide? Trump’s Signal crew, that’s who
Anyone who wishes to understand what’s happening to our world—why we keep stacking hottest year on hottest year, why wildfires are so intense, why some hurricanes strengthen so rapidly—can learn from the Climate Literacy Guide.
But some key national security officials could use a new Signal chat, this time discussing the literacy guide to better understand essential principles of climate change science, impacts, and solutions. Bonus: None of this information is classified! And if an accidental invitation is available, I’d love to join officials who notably do use Signal:
Defense Secretary Pete Hegseth, who has just ordered the “elimination of climate defense planning,” scrapping years of Pentagon policy that identified climate as a major and mounting threat to national security.
Vice President JD Vance, who does not acknowledge human-caused climate change.
Secretary of State Marco Rubio, who again can’t quite figure out where he’s supposed to be in the (climate) conversation.
Director of National Intelligence Tulsi Gabbard, who apparently okayed the omission of climate change from the US intelligence community’s annual threat assessment report for the first time in 11 years.
Homeland Security Advisor Stephen Miller, who during the previous Trump administration wasn’t “interested in climate change” even after an internal report showed it was a driver of migration to the US (along with driving enormous human suffering). At the moment, Miller “is more powerful [on immigration] than ever.”
No one seems to understand why Treasury Secretary Scott Bessent was included in the Yemen military attack Signal chat, so I propose swapping him out for Commerce Secretary Howard Lutnick, who promised in his confirmation hearing that he would not dismantle NOAA.
Literacy versus lies
As it happened, NOAA disappeared the guide while I was at the Climate Information Integrity Summit in Brasília, Brazil. The summit, organized by members of the Climate Action Against Disinformation (CAAD) coalition, was a next step in the Brazilian government’s work with the UN and other member states to further progress on the Global Initiative for Information Integrity on Climate Change.
More than 120 key actors from governments, multilateral organizations, and local and international non-profit organizations discussed concrete steps towards safeguarding the integrity of climate information in the lead-up to the next round of international climate negotiations, COP30.
CAAD members (including UCS) see clearly that climate disinformation undermines elections, renewable energy, science, and human rights. That’s why other nations are already taking action to limit the harm disinformation can do, whether the lies for profit come from fossil fuel companies, agribusiness, or the Big Tech companies that run social media platforms and search engines. Climate denial and deception in turn lead to delays in climate action that we simply cannot afford.
Climate literacy is fundamental to climate information integrity. A public armed with that science, plus an understanding of the disinformation playbook that corporate actors keep on running, is a key pillar of defense against the corporations who profit as people suffer.
No wonder the Trump administration, intent on enacting the fossil fuel agenda, doesn’t want us to know and understand what they’re doing to our climate. Authoritarians prefer an uncritical public that lives in ignorance. Heads up—we’re paying attention and we know.
My City Got Disaster Recovery Money, Now What?
In December 2024, state and local governments across the nation were allocated disaster recovery funds to help address the impact of extreme weather on affordable housing, local economies, and public infrastructure.
These funds, known as Community Development Block Grants for Disaster Recovery (CDBG-DR), flow through the Department of Housing and Urban Development (HUD) and have extraordinary potential to re-shape communities for the better. Unfortunately, the Trump administration is already undermining this valuable investment program.
Influencing recovery
Positive recovery outcomes aren’t guaranteed, especially given the growing politicization of disaster recovery. One counter to that politicization, which can delay or divert funds from reaching the most impacted communities, is robust local attention to recovery programs from design to implementation. A recent memo from the Trump administration both clarified points of confusion and rescinded previous guidance to state and local governments that was influenced by years of advocacy from disaster survivors.
Right now, state and local governments, referred to as grantees, are in the process of submitting draft recovery plans to the federal government for initial approval. I’ve previously written about principles these plans should embody. Once plans are approved by HUD, state and local governments must hold a public comment period. The exact dates of public comment will vary by each grantee, but this is a crucial opportunity in the disaster recovery process to shape programs and build community with other disaster survivors.
We’ve compiled a spreadsheet that lists the amounts allocated to each grantee and links either to the initial plan for spending CDBG-DR funds or to the grantee website for disaster recovery. Most of the public comment periods are still open and last week’s memo floated the possibility of an extension of the current timeline.
We encourage residents in impacted communities to engage in the public comment process to shape recovery plans and demonstrate the urgency of advancing resilience. Once public comment periods have closed, feedback is considered for incorporation for a final plan that is submitted to HUD for approval before programs are stood up and long-awaited funds begin to flow. Recent executive orders—and the agency’s insistence that state and local governments abide by them—are complicating an already complex process.
Disaster recovery and executive orders
A week before the memo that rescinded Biden-era guidance to grantees, HUD Secretary Scott Turner rejected the City of Asheville’s initial plan for spending Hurricane Helene recovery dollars on the grounds that the plan’s mention of supporting minority- and women-owned businesses in economic recovery efforts contradicted President Trump’s executive order on diversity, equity, and inclusion. Asheville, which faces a $15 million revenue shortfall after Helene, has since amended its plan.
Cities and states trying to help residents and local economies recover shouldn’t have to spend precious time balancing recovery needs against legally spurious executive orders to access critical funds. As plans are submitted, we’re tracking both how they address housing and infrastructure needs and the potential for politically motivated interference in the recovery programs.
In addition to the anti-DEI executive order, HUD is also requiring compliance with the executive order on English as the Official Language of the United States. Depending on how state and local governments choose to interpret this guidance, recovery may be placed further out of reach for non-English speakers. Ignoring equity in disaster recovery is costly and deadly.
It’s important to remember that many of the Trump administration’s executive orders conflict with federal law and the Constitution. The prevailing wisdom of the courts, and the reason for this administration’s rebukes by the judiciary, is that federal laws passed by Congress and signed by the president supersede executive orders.
These recovery funds are allocated over six years, so survivors engaging with the CDBG-DR process should keep in mind that disaster recovery is a long haul. Survivors from places as different as Texas, New Jersey, and Hawaii have demonstrated the power of residents to shape state, local, and national recovery processes.
As authoritarianism ramps up, we should expect everything from formal processes like public comment on disaster recovery to direct action to come with risk. But as anyone on the climate frontlines can tell you, storms and fires continue no matter the political whims, a fact that makes getting recovery and mitigation right even more important.
With Fewer Weather Balloons, People in US Heartland Will Be Less Prepared for Tornado Season
On February 27, 2025, over 1,000 employees at the National Weather Service (NWS) were illegally fired by the Trump administration on the premise of “making the government more efficient,” even though the agency was already severely understaffed. That same day, due to the job losses, weather balloon launches were suspended at the NWS office in Kotzebue, AK. But it didn’t end there. On March 7, the Albany, NY, and Gray, ME, offices announced partial suspensions of their weather balloon launches. And just last week, on March 20, NWS offices in Omaha, NE, and Rapid City, SD, announced the suspension of their weather balloon launches. Six other NWS offices in states like Nebraska and Wisconsin revealed a reduction in weather balloon launching capacity that same day.
This might not sound like such a big deal, but as we’re gearing up for tornado season, which peaks between April and June, taking weather balloons offline in the Heartland of the United States, also known as Tornado Alley, will directly affect the NWS’s ability to predict severe weather, including tornado-producing thunderstorms. This could lead to more severe weather-related deaths that could have otherwise been avoided.
The current coverage of weather balloon launches in the United States (not including one in Puerto Rico and other launch locations in the Pacific Ocean). The orange dots denote NWS offices with reduced balloon launch capacity (one per day instead of two), and the red dots denote NWS offices with balloon launch suspensions. Figure used with permission from the creator, Chris Vagasky (@coweatherman.bsky.social).
Why do weather balloon observations matter?
Weather balloons are a critical piece of the NWS’s observations infrastructure and have been for nearly a century. They carry radiosondes, instrument packages that report temperature, pressure, wind, relative humidity, and GPS data back to NWS offices, giving us a three-dimensional view of the atmosphere. In the United States, 92 NWS locations release weather balloons, providing data to the NWS and its weather forecasting models.
Weather models use data collected by weather balloons
But why do we care about what’s going on in the upper atmosphere? Well, first of all, this data is invaluable for our weather forecasting models. As you may know, meteorologists use weather models to help predict what will happen to the atmosphere in the future. Models anticipate things like winter storms, severe weather outbreaks, flood-inducing rains, or conditions favorable for wildfire development.
For a weather model to predict the future, it needs an accurate representation of what’s currently going on in the upper atmosphere. By suspending weather balloon launches at multiple locations, we lose data for the weather models, decreasing their predictive skill and degrading daily forecasts as well as outlooks for extreme weather events.
In fact, of the eight types of observations the NWS uses (including aircraft and station observations), weather balloons are the second most important for improving the predictions of weather models. They also cost only about $10 million per year to launch (assuming each balloon costs about $200), compared with the roughly $350 million per year for the GOES-R satellite program, another critical piece of NOAA’s observations infrastructure. Weather balloon launches are so useful for predicting severe weather that NWS offices often launch more than the usual two balloons per day to better inform modeling of a potential tornado outbreak.
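For readers who want to check that cost figure, here is a quick back-of-the-envelope sketch. The inputs are rough assumptions of mine, not official NWS budget numbers, and the result lands in the same low-tens-of-millions range as the figure above:

```python
# Back-of-the-envelope annual cost of the US weather balloon program.
# All inputs are rough assumptions for illustration, not official NWS budget figures.
launch_sites = 92        # NWS locations that release weather balloons
launches_per_day = 2     # routine schedule (offices launch extras in severe weather)
cost_per_launch = 200    # rough cost of one balloon + radiosonde package, in dollars

annual_launches = launch_sites * launches_per_day * 365
annual_cost_musd = annual_launches * cost_per_launch / 1e6

print(f"~{annual_launches:,} launches/year, ~${annual_cost_musd:.1f} million/year")
# -> ~67,160 launches/year, ~$13.4 million/year: the same order of magnitude as
#    the ~$10 million cited above (real schedules vary, and some sites have
#    suspended or reduced launches).
```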
Knowing what’s going on in the upper atmosphere could save lives
Weather models aside, if we know what’s going on in the upper atmosphere, short-term weather forecasting in general becomes a lot easier. What goes on in the upper atmosphere is reflected in weather conditions at the surface.
Imagine you live in central Oklahoma and wake up one morning in mid-May. For the past several days, the NWS and their weather models have been predicting the possibility of a tornado outbreak to the east of where you live. However, observations retrieved by a weather balloon launch that morning revealed favorable conditions for a tornado outbreak to start where you live, rather than to the east of you.
Immediately, the NWS issues a tornado watch for your area, and you and your neighbors prepare for a potential tornado later that day. So, yes, the models were slightly wrong, but at least the NWS was able to provide some prep time given the observations collected by the weather balloon that morning. If the NWS hadn’t released a weather balloon, it might have missed the impending tornado outbreak, and you and your neighbors would have been caught completely off guard.
OK, it sounds like I’m exaggerating, right? Actually, not at all. On October 3, 1979, a devastating F4 tornado struck Windsor Locks, CT with no warning. A 1987 study determined that the lack of warning was due to a lack of upper-atmospheric data (no nearby, timely weather balloon launches), which led to an underestimation of the strength of the thunderstorm that produced the tornado.
Three people lost their lives in that tornado. It’s not science fiction to say that more people could lose their lives in the future given a lack of observation of the upper atmosphere. Because of this, and especially as we head into peak tornado season, it is critical for the NWS to remain fully staffed and fully funded. American lives are on the line.
EPA Staff Stand Firm As Administration Lobs Cuts, Baseless Accusations, and Cruelty
Neither Lee Zeldin, nor Elon Musk, nor President Trump could possibly look Brian Kelly in the eye to tell him to his face that he is lazy.
They cannot tell Kayla Butler she is crooked.
They dare not accuse Luis Antonio Flores or Colin Kramer of lollygagging on the golf course.
If Zeldin, Musk, or Trump knew even a scintilla about them, they would not dare froth at the mouth with their toxic stereotypes about federal civil servants. All four work in Region 5 of the Environmental Protection Agency (EPA), responsible for pollution monitoring, cleanups, community engagement, and emergency hazardous waste response for Illinois, Michigan, Ohio, Indiana, Wisconsin, and Minnesota.
The Midwest is historically so saturated with manufacturing that just those six states generated a quarter of the nation’s hazardous waste back in the 1970s, and it is still today home to a quarter of the nation’s facilities reporting to the EPA’s Toxic Release Inventory Program. When I recently visited Region 5’s main office in Chicago, one enforcement officer, who did not give her name because of the sensitivity of her job, told me there are still toxic sites where “we show up [and] neither the state nor the EPA has ever been [there] to check.”
As irony would have it, I visited the office the same week the Trump administration and Zeldin, President Trump’s new EPA administrator, announced plans to cut 65% of the agency’s budget. Zeldin has since dropped even more bombshells in a brazen attempt to gut the nation’s first line of defense against the poisoning of people, the polluting of the environment, and the proliferation of global warming gases.
Zeldin announced on March 12th more than 30 actions he plans to undertake to weaken or cripple air, water, wastewater, and chemical standards, including eliminating the Office of Environmental Justice and External Civil Rights and getting the EPA out of the business of curbing the carbon dioxide and methane gases fueling global warming. Despite record production that has the United States atop the world for oil, Zeldin said he was throttling down on regulations because they are “throttling the oil and gas industry.”
Last week, the New York Times reported the EPA is considering firing half to three-quarters of its scientists (770 to 1,155 out of 1,540) and closing the Office of Research and Development, the agency’s scientific research office. Zeldin justifies this in part by deriding many EPA programs as “left-wing ideological projects.” He violently brags that he is “driving a dagger straight into the heart of the climate change religion.”
Impact of cuts at EPA felt deeply, broadly
Kelly, Butler, Flores, Kramer, and many others I talked with in Region 5 said all these plans are actually a bayonet ripping out the heart and soul of their mission. They all spoke to me on the condition that they were talking as members of their union, Local 704 of the American Federation of Government Employees. Nicole Cantello, union president and an EPA attorney, said the attacks on her members are unlike anything she’s seen in her more than 30 years with the agency. As much as prior conservative administrations may have criticized the agency, there’s never been one—until now—that tried to “fire everybody.”
Flores, a chemist who analyzes air, water, and soil samples for everything from lead to PCBs, said a decimated EPA means less scrutiny to prevent another Flint water crisis, fewer eyeballs on Superfund sites, and limited ability to investigate toxic contamination after train derailments, such as the incident two years ago in East Palestine, Ohio. He added, “And we have a Great Lakes research vessel that tests the water across all the lakes. It’s important for drinking water, tourism, and fishing. If we get crippled, all that goes into question.”
Butler is a community involvement coordinator who works through Superfund legislation to inform communities about remediation efforts. Superfund sites, the legacy of toxic chemicals used in manufacturing, military operations, mining, and landfills, are so poisonous that they can have cumulative, compounding effects on affected communities, triggering many diseases. A 2023 EPA Inspector General report said the agency needed stronger policies, guidance, and performance measures to “assess and address cumulative impacts and disproportionate health effects on overburdened communities.”
Butler is deeply concerned that cumulative impact assessments will not happen with cuts to the EPA, denying urban neighborhoods and rural communities the scientific resources to fully expose the horror of environmental injustice. “It’s a clear story that they’re trying to erase,” Butler said of the new administration.
For Kelly, an on-site emergency coordinator based out of Michigan, the rollbacks and the erasing of the story of environmental harms have an obvious conclusion. “People will die,” he said. “There will be additional deaths if we roll back these protections.”
What these workers also fear is the slow death of the spirit of civil service among them.
Start with Kelly.
I actually talked to him from Chicago by telephone because he was out in Los Angeles County, deployed to assist with the cleanup of the devastating Eaton Fire that killed 17 people and destroyed more than 9,400 structures.
Between the Eaton Fire and the Palisades Fire, which took another 12 lives and destroyed another 6,800 buildings, the EPA conducted what it said was the largest wildfire hazardous materials cleanup in the history of the agency, and likely the most voluminous lithium battery removal in world history—primarily from the electric and hybrid vehicles and home battery storage people were forced to leave behind as they fled.
During a break, Kelly talked about how nimble he and his colleagues must be. He has worked cleanups of monster storms Katrina, Sandy, and Maria, and the East Palestine train derailment. Normally based in Michigan, he recalled a day he was working in the Upper Peninsula on the cleanup of an old abandoned mine processing site. He received a call from a state environmental emergency official asking him to drop what he was doing because, 20 minutes away, a gasoline tanker truck had flipped over, spilling about 6,000 gallons of gasoline onto the road and down through the storm sewer into local waterways.
When he arrived, Kelly asked the fire chief how he could help. He was asked to set up air monitoring. But then he noticed anxious contractors who were wondering if they were going to get paid for their work. “They’re ordering supplies, they’re putting dirt down to contain this gasoline from getting any further,” Kelly said. “But they’re like, ‘Are we going to get paid for this?’”
“I found the truck driver who was talking to their insurance company. So I get on the phone with the insurance company and say, ‘Hey. This is who I am. This is what’s happening here. You need to come to terms and conditions with these contractors right now or EPA’s going to have to start taking this cleanup over!’”
The insurance company came through. Kelly said he could not have been so assertive with the insurer without a robust EPA behind him.
“It’s one thing to be able go out and respond to these emergencies, but you have to have attorneys on your side,” Kelly said. “You’ve got to have enforcement specialists behind you. You’ve got to have people who are experts in drinking water and air. You can’t just have one person out there on an island by themselves.”
“Cruel for the sake of being cruel”
Butler wonders if whole communities will become remote islands, surrounded by rising tides of pollution. The very morning of our interview, she was informed she was one of the thousands of federal workers across the nation who had their government purchase cards frozen by Elon Musk, the world’s richest human and President Trump’s destroyer of federal agencies. In launching the freeze, Musk claimed with no evidence, “A lot of shady expenditures happening.”
Butler threw shade on that, saying the purchase system is virtually foolproof with multiple layers of vetting and proof of purchase. She uses her purchase card to buy ads and place public notices in newspapers to keep communities informed about remediation of Superfund sites.
She has also used her card to piece together equipment to fit in a van for a mobile air monitor. The monitor assists with compliance, enforcement, and giving communities a read on possible toxic emissions and dust from nearby industrial operations.
“I literally bought the nuts and bolts that feed into this van that allow the scientists to measure all the chemicals, all the air pollution,” Butler said. “I remember seeing the van for the first time after I bought so many things for years. And I was like ‘Wow this is real!’”
Not only was the van real, but air monitoring in general, along with soil monitoring—particularly in places like heavily polluted Southeast Chicago—has been a critical tool of environmental justice, used to get rid of mountains of petcoke dust and to detect neurotoxic manganese dust in the air and lead in backyards.
“Air monitoring created so much momentum for the community and community members to say, ‘this is what we need,’” Butler said.
Kramer is a chemist in quality assurance, working with project planners to devise the most accurate ways of testing for toxic materials, such as for cleanups of sites covered in PFAS—aka ‘forever chemicals’—from fire retardants, or at old industrial sites saturated with PCBs from churning out electrical equipment, insulation, paints, plastics, or adhesives. His job is mostly behind the scenes, but he understood the meaning of his work from one visit to a site to audit the procedures of the Illinois EPA.
The site had a small local museum dedicated to the Native tribes that first occupied the land. “The curator or director told us how the sampling work was going to bring native insects back to the area and different wildlife back to the streams,” Kramer said. “It was kind of a quick offhand conversation, but it gave me a quick snapshot of the work that’s being done.”
Kramer wonders how many more scientists will follow in his footsteps to see that the work keeps getting done. He remembered a painful day recently when a directive came down that he could not talk to contractors, even those who happen to work in the same building as he does.
“I see them every day,” Kramer said. “They come say hi to me. They know my child’s name. Being told that I couldn’t respond if they came to my desk, looked me in the face, and said, ‘good morning,’ is just such an unnecessary wrench into our system that just feels cruel for the sake of being cruel.”
Staff stifled, heartbroken
The culture of fear is particularly stifling for one staffer who did not want to give her name because she is a liaison to elected officials. Before Zeldin took over, she would get an email from an elected official asking if funding for a project was still on track and “30 seconds later,” as she said, the question would be answered.
Her job “is all about relationships,” keeping officials informed about projects. Now, she said just about everything she depends on to do her job has basically come to a halt. “Everyone’s afraid to say anything, answer emails, put anything in writing without getting approval. Just mass chaos all the way to the top.”
Relationships are being upset left and right, according to other staffers. One set of my interviews was with three EPA community health workers who feel they are betraying the communities they serve because their contact with those communities has fluctuated in the first months of the Trump administration. They’ve had to shift from silence to delicately dancing around any conversation that mentions environmental justice or diversity, equity, and inclusion.
They did not want to be named because they did not want to jeopardize the opportunity to still find ways to serve communities historically dumped on with toxic pollution for decades because of racism and classism.
“Literally since January 20, my entire division has been on edge,” said one of the three. “We kind of feel like we’re in the hot seat. A lot of people working on climate are afraid. If you’re working with [people with] lower to moderate income or [places] more populated by people of color, you’re afraid because you don’t want to send off any flags to the administration.”
The tiptoeing is heartbreaking to them because they see firsthand the poisoning of families from chemicals the EPA has regulated. One of the health workers has painful memories of seeing the “devastated” look on mothers’ faces when giving them the results of child lead tests that were well above the hazardous limit. “I feel like I made a promise to them that I would be there for what they needed,” she said. “And I feel like I’ve been forced to go back on that promise.”
Remembering their mission boosts morale
Despite that, and despite President Trump’s baseless ranting, which included saying during the campaign that “crooked” and “dishonest” federal workers were “destroying this country,” these EPA staffers are far from caving in. Nationally, current and former EPA staff last week published an open letter to the nation that said, “We cannot stand by and allow” the assault on environmental justice programs.
Locally in Region 5, the workers’ union has been trying to keep morale from tanking with town halls, trivia nights, lunch learning sessions, and happy hours. In a day of quiet defiance, many of the 1,000 staffers wore stickers in support of the probationary employees that said, “Don’t Fire New Hires.” Several of the people I interviewed said that if Zeldin and the Trump administration really cared about waste and inefficiency, they would not try to fire tens of thousands of probationary workers across the federal system.
One of them noted how the onboarding process, just to begin her probationary year, took five months. “It wastes all this money onboarding them and then eliminating them,” she said. “That’s totally abusing taxpayer dollars if you ask me. It’s hard enough to get people to work here. We’re powered by smart people who went to school for a long time and could make a lot of money elsewhere.” Federal staffers with advanced degrees make 29% less, on average, than counterparts in the private sector, according to a report last year from the Congressional Budget Office.
Individually, several said they maintained their morale by remembering why they came to the EPA in the first place. Flores, whose sense of public service was instilled in him growing up in a military family, said, “I didn’t want to make the next shampoo,” with his chemistry degrees. “I didn’t want to make a better adhesive for a box…the tangible mission of human health and environmental health is very important to me.”
The enforcement officer who wanted to remain anonymous talked about a case where she worked with the state to monitor lead in a fenceline community near a toxic industry. Several children were discovered to have elevated levels of lead in their blood.
“People’s lives are in my hands,” she said. “When we realized how dire the circumstance was, we were able to really speed up our process by working with the company, working with the state and getting a settlement done quick. And now all those fixes are in place. The lead monitoring has returned back to safe levels, and we know that there aren’t going to be any more kids impacted by this facility.”
One of the community health workers I interviewed said her mission means so much to her because at nine years old she lost her mother to breast cancer after exposure to the solvent trichloroethylene (TCE). That carcinogen is used in home, furniture, and automotive cleaning products. The Biden administration banned TCE in its final weeks, but the Trump administration has delayed implementation.
“The loss of her rippled throughout our community,” the worker said of her mother. “She was active in our church, teaching immigrants in our city how to read. The loss of her had such a large impact.” She said if the EPA were gutted, so many people like her mother would be lost too soon. “We play critical roles beyond just laws and regulations,” she said. “We do serve vital functions for communities based on where the need is the most.”
The same worker worried that if an agency as critical to community health as the EPA can be slashed to a shell of itself, there is no telling what is in store next for the nation. “I know people don’t have a lot of sympathy for bureaucrats,” she said. “But I think what is happening to us is a precursor to what happens to the rest of the country. We’re supposed to be this nonpartisan force that’s working for the American people, and attacks to that is a direct attack on the American people.”
One of her co-workers seconded her by saying, “We’re fighting for the American people and we are the American people. We all began this job for a reason. We all have our ‘why.’ And that hasn’t changed just because the administration has changed, because there’s some backlash or people coming after us. Just grounding yourself with people whose ‘why’ is the same as yours helps a lot.”
Our Environmental Movement Outrageously SLAPPed in the Face
In the March 19th verdict in Energy Transfer v Greenpeace, a North Dakota county jury awarded more than $660 million to “one of the largest… energy companies in North America” because Greenpeace supported the efforts of Indigenous Water Protectors in their protests of the Dakota Access Pipeline.
This verdict is an outrage because it undermines Tribal leadership and sovereignty. As Natali Segovia, of the Water Protector Legal Collective, said in the New York Times: “At its core, it’s a proxy war against Indigenous sovereignty using an international environmental organization.”
This verdict is an outrage because it threatens First Amendment rights, including the right to free speech.
This verdict is an outrage because it rewards a SLAPP (Strategic Lawsuit Against Public Participation), an egregious tactic of silencing and intimidation outlawed in 33 states but not in North Dakota.
It’s an outrage that jurors’ conflicts of interest did not disqualify them from service in this trial. It’s a further outrage that one of Energy Transfer’s examples of defamation was Greenpeace’s statement that the Dakota Access Pipeline leaked. The court would not allow an expert witness to testify that the pipeline did, in fact, leak.
Even if Greenpeace wins its appeal, the fact that this suit was allowed to proceed at all is an outrage. This verdict is yet another example of the fossil fuel industry’s agenda being enacted by multiple levels and branches of government. This is more than an outrage. It is a crime that will harm all people and species for generations to come.
We must stand together to overturn this unjust and outrageous verdict. Here at the Union of Concerned Scientists, we’re resisting through the Protect the Protest anti-SLAPP task force—and by organizing a climate accountability campaign targeting the fossil fuel industry.
I’m imagining a few headlines that might have appeared over the past century if social movements had been SLAPPed for successful campaigns against powerful adversaries.
City of Montgomery Wins Bus Boycott Suit, Awarded Damages
What if you’d opened your newspaper in 1957, one year after the Montgomery Bus Boycott ended, and seen this headline? Would you have been outraged?
In reality, the Montgomery Bus Boycott ended in triumph when the City of Montgomery ended racial segregation on its buses. It was coordinated by Dr. Martin Luther King and the Montgomery Improvement Association, with the involvement of key civil rights leaders from Ella Baker to Bayard Rustin. It lasted for 381 days and cost the city approximately $3,000 per day in 1956 dollars—more than $13 million today.
If the city had successfully sued the boycott organizers, would there then have been a Southern Christian Leadership Conference? A Student Nonviolent Coordinating Committee? A March on Washington where Dr. King would deliver the speech from which many of our public officials conveniently cherry-pick one quote and one quote only: “I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character?”
There might well not have been. And that would have been an outrage.
Temperance Movement Owes US Lost Revenue, Enforcement Costs During Prohibition
How about this for a 1934 headline? The 1920 enactment of the 18th Amendment, banning the manufacture, sale, or transportation of alcohol, followed years of activism and lobbying by the Women’s Christian Temperance Union and the Anti-Saloon League, a powerful coalition that included the Industrial Workers of the World and John D. Rockefeller, the NAACP and the Ku Klux Klan.
The Prohibition era lasted for 13 years. In today’s dollars the total cost to the US government in lost revenue alone would be approximately $222.7 billion.
The consequences of Prohibition went far beyond the cost to federal coffers: among other ill effects, it yielded enormous benefits for organized crime. Do we think today that the broad coalition of Prohibition activists should be held liable for the federal government’s loss of revenue after it enacted their policy demands, or for the tremendous societal costs of strengthened crime syndicates? Or do we think that organizing according to our consciences and beliefs is a fundamental right we must continue to enjoy?
Boeing Gets $2 Billion in Damages from Machinists Union After 2008 Strike
No, this didn’t happen. What did: the International Association of Machinists (IAM) struck airplane manufacturer Boeing for eight weeks in 2008, costing the company $1.2 billion in net income ($1.48 billion today).
The union struck Boeing again in 2024. That 53-day action cost Boeing and its suppliers an estimated $9.66 billion.
These are considerable losses for Boeing and the aircraft industry. But the power to strike is the ultimate power of the labor movement. Yes, a prolonged strike costs union members dearly in lost wages and the risk of losing their jobs entirely, but it costs employers dearly too. It’s a game of chicken, and without the ability to strike, the union isn’t driving a car—it’s a pedestrian.
So far, industrial actions such as those taken by the IAM are not subject to the increased power of business to sue for damages. But in an environment where business interests often outweigh the interests of workers, public health and safety, and in the case of climate change, future generations, it’s important to watch closely how juries and courts are thinking about these issues. Because a lot of their thinking is outrageous.
Whose Selfish Agenda Again?
Energy Transfer’s lawyer told the court that Greenpeace had exploited the Dakota Access Pipeline to “promote its own selfish agenda.” I find it hard to contain my outrage.
Greenpeace’s “agenda” is “to ensure the ability of Earth to nurture life in all its diversity.” This is a public-serving mission. Here I speak as one who knows: the Union of Concerned Scientists is a generous employer, but no one is getting into the top 1% of wealth by fighting the insatiable greed of the fossil fuel industry.
Energy Transfer’s agenda is “to safely and reliably deliver the energy that makes our lives possible,” as long as that energy comes from transporting, refining, and ultimately burning the fossil fuels that are wreaking climate destruction now and far into the future. This is a profit-seeking mission. Fossil fuel moguls, from the Rockefellers to the Koch Brothers, have made themselves fabulously rich feeding, and feeding off, its insatiable greed.
The confusion of public and private interests, of what’s good for a company versus what’s good for a sovereign Tribal nation, or for all inhabitants of our planet—I can’t find words.
Apart from outrage.
The Theft, Harm, and Presidential Grift of Privatizing the National Weather Service
This week, as wildfires break out across Texas, life-saving alerts are being issued by the National Weather Service (NWS), informing evacuations ahead of the advancing threat. On the ground, firefighters are using National Oceanic and Atmospheric Administration (NOAA) satellites for wildfire monitoring in real time. This is just one of dozens of emergencies our first responders rely on NOAA and NWS data for on any given week. Simply put: NOAA and the NWS save lives and must be defended against the Trump administration’s ongoing assault.
We are witnessing the vanishing of our own US assets which taxpayers have funded and built over generations to serve the public good. We need those assets and will suffer in their absence. And we may be forced to pay the private sector to dole them back out to us, piecemeal. We need to call the theft, harm and grift what it is—and stop it.
The theft
Since 1849, when the Smithsonian Institution began furnishing telegraph offices with weather instruments, meteorological data have been continuously and systematically collected in the United States. In 1870, Congress established within the US Army’s Signal Service the very 19th-century-named Division of Telegrams and Reports for the Benefit of Commerce and tasked it with issuing weather forecasts and warnings.
Later, the service became a civilian agency when Congress transferred its meteorological responsibilities to the US Weather Bureau under the Department of Agriculture. Today, those duties are carried out by the National Weather Service (NWS), housed at the National Oceanic and Atmospheric Administration (NOAA) under the Department of Commerce. And thanks to that long progression of recognizing the value of investing in weather forecasts and warnings, the American people own the NWS, a public service paid for with your tax dollars. That investment totals about $1.3 billion annually, or about $7 per person in the United States, and it puts much more than that back into the US national economy.
The NWS’s own assessment in 2017 found that private businesses can derive up to $13 billion in economic value from weather knowledge, and that its freely available data power a $7 billion market that creates tailored weather products for businesses and people. Economy-wide, the value of weather and climate information to the US economy exceeds $100 billion annually, roughly 10 times the investment made by taxpayers through federal agencies, such as NOAA, involved in weather-related science and services.
That weather app on your phone, or the weather report on TV? How about the storm forecasts that the airports you fly in and out of receive every three hours, covering the next 36 hours, which are the basis for rerouting or grounding planes? That’s critical for safe air travel, and yes, it was paid for by taxpayers and also belongs to you. The 418 people rescued last year from incidents over water, on land, and in downed aircraft? That was possible because the Coast Guard and the military had access to NOAA’s satellite-aided search and rescue system. All of it is powered by NOAA’s free and public data, available for public safety and business operations alike.
At UCS, we know full well how valuable the data are—we power our own Danger Season extreme weather tracker using the NWS’s daily-updated data (another free service!).
But the valuable data and information that we obtain from NWS is at risk of being stolen. The Trump administration, Elon Musk, and DOGE—the black-box entity that has no actual legal authority to dismantle agencies created by Congress—have signaled as much by illegally invading NOAA headquarters, firing thousands of its staff, and canceling leases on some of its key buildings.
Here we are in the era of presidential overreach, where a Republican-controlled Congress is allowing the executive branch to usurp its powers, and a Democratic minority leadership is unwilling to use its remaining power to block these illegal actions. And that overreach has slipped into the judicial branch, where the Trump administration is openly ignoring judges’ decisions and orders to reverse course on illegal executive action.
But why? The Trump government wants to dismantle the climate and weather science conducted by NOAA because evidence of a world warming from the burning of fossil fuels is a pesky reality for the industry that gave millions to Trump’s campaign. In addition, the administration would like to put behind a paywall the parts it cannot completely eliminate—the NWS. This is not speculation. Just read the chapter on the Department of Commerce in the Trump government’s blueprint for dismantlement, Project 2025. Or, if you can’t stomach the lunacy of the nearly 900-page document, read my blogpost readout of the plan for NOAA and NWS. This is very, very harmful.
The harm
Where is the harm in dismantling—or even simply compromising—NWS and its parent office, NOAA? Without accurate, updated, and free weather information, we lose the ability to prepare ourselves for potentially lethal extreme weather such as hurricanes, heat waves, floods, and snowstorms.
Travel by air becomes an uncertain activity that could kill you (think of the Age of Exploration, when galleons departed with very little certainty of arriving safely on the other side of the world, much less coming back!), as airports will not have reliable and updated storm forecasts. The national economy suffers because weather events account for impactful fluctuations in the country’s GDP and affect the ability of all sectors to provide goods and services. Planning for weather-related risks requires information that can help reduce uncertainty that is costly for business; its absence hampers emergency managers and first responders.
As it turns out, we stand to lose quite a bit of life-saving alert information. I took a look at the number of times the NWS issued an alert that impacted a county (or county equivalent in the territories) each day between 2010 and 2024, a metric I call county-alert days. I use this metric rather than the raw number of alerts because NWS alerts often span multiple counties, so the raw number doesn’t communicate how widely alerts spread across counties.
NWS keeps track of nearly 70 different types of extreme weather, so I grouped them into thirteen categories. Meteorologists may disagree with some of my grouping choices, but I think this serves to illustrate my point: between 2010 and 2024, NWS issued extreme weather alerts that impacted all 3,144 counties (and equivalents) a whopping 3.7 million times.
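To make the metric concrete, here is a minimal sketch of how county-alert days can be tallied. It assumes one plausible reading of the metric (each county counts once per alert category per day) and uses made-up records; the real analysis would draw on the NWS alert archive and its county FIPS codes.

```python
# Minimal sketch of the "county-alert days" metric: each (category, date,
# county) combination counts once, so a single alert spanning many counties
# contributes one county-alert day per county it touches.
# The records below are illustrative, not real NWS data.
from collections import Counter

alerts = [
    # (category, date, FIPS codes of counties covered by the alert)
    ("flood", "2024-05-01", ["17031", "17043"]),  # one alert, two counties
    ("flood", "2024-05-01", ["17031"]),           # same county/date: counted once
    ("heat",  "2024-07-10", ["48201"]),
]

county_alert_days = set()
for category, date, counties in alerts:
    for county in counties:
        county_alert_days.add((category, date, county))

per_category = Counter(category for category, _, _ in county_alert_days)
print(len(county_alert_days), dict(per_category))  # 3 {'flood': 2, 'heat': 1}
```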
I also grouped the alerts geographically according to the Fourth National Climate Assessment regions to show how different regions of the country face different kinds of extreme weather. Wildfires in the Great Plains, the Southwest, and the Northwest have prompted thousands of fire weather (also called “red flag”) alerts from the NWS; historically, alerts in the Southeast and the Northeast are mostly related to flood, cold, heat, and wind. The US Caribbean (that’s Puerto Rico and the US Virgin Islands) has faced floods, dangerous ocean weather and currents, and, in the past two years, extreme heat alerts that were not common before. Note that the small number of storm alerts does not reflect storms’ devastating impact, such as Hurricane María’s in 2017. Finally, Hawai’i and the Pacific Islands have faced much flooding and many storms, and in the last few years have seen red flag alerts for wildfires such as the terrible Maui fires of 2023.
NWS alerting us to potential harm: Between 2010 and 2024, NWS issued extreme weather alerts that impacted all 3,144 counties and county-equivalents in the US 3.7 million times.
Let’s say you live in a coastal community along the Gulf of Mexico. Would you like to know how much storm surge or wind speed you need to prepare for in the face of an incoming hurricane, or when you need to evacuate to higher ground? Well, you could have this information if NOAA could fly its hurricane hunters, those very cool aircraft flown by very brave pilots who soar into hurricanes to collect data that are fed into storm track models to refine projections of intensity, speed, and landfall as hurricanes form, evolve, and intensify rapidly from one day to the next (a hallmark of storms in the climate change era).
But guess what? There is no certainty we will have such information this hurricane season. In February, flight directors and other pilots were fired, but news media reported that some were rehired in March. No clear information is coming through from the administration, so it’s anybody’s guess if there will in fact be planes, pilots, and a flight plan ready to go if and when hurricanes threaten populated areas in the Gulf of Mexico.
For other types of extreme weather worsened by climate change, harm will follow as well: farmers will lose drought monitoring that they rely on to plan and prepare for the season; forest managers and wildfire first responders will lose seasonal and monthly wildfire risk outlooks. Alerts about rapid-onset events such as extreme heat domes and flooding are also at risk of being lost.
Hurricane season and the time of the year when climate change makes extreme weather more likely (we call it Danger Season) are right around the corner. Without our hurricane hunters and their pilots, weather balloons, and forecasters, we are going impaired into seasonal climate and extreme weather dangers that we already know are destroying lives and property.
The presidential grift of what’s ours
So… <deep breath>. Let’s take Project 2025 seriously about its goal of privatizing the NWS—which we definitely should, since in the first two long months of the Trump administration, privatization has reliably been its modus operandi. According to pages 674-677, the theft and the harm will be followed by the further crime of privatizing what we already own and pay for.
What we already own and pay for is giving back dividends in lives and property saved, increasing prosperity, reducing uncertainty about extreme weather impacts, and providing the scientific bedrock of knowledge that can inform how to safeguard us from a climate-changed world. And the unilateral and illegal actions of the administration intend to put this service behind a paywall to make us pay again for it?
Public services exist to provide parity of access for all people in society, regardless of their ability to individually fork out money for such a service. Put those services behind a paywall, and those unable to pay will end up paying twice: once with their tax dollars, and again with their wellbeing or with their lives. Paywalled weather alerts will deprive lower-income individuals, households, and towns of access to life-saving services.
And there are early indications of the privatization to come. The private company WindBorne Systems has offered to backfill atmospheric data no longer collected by weather balloons in Alaska after the local Juneau NWS office lost 10% of its staff to downsizing. While this may look like good corporate citizenship from a technically savvy and well-resourced private company, businesses exist to make money, so it is hard to see how WindBorne will be willing or able to permanently fill the gap in Alaska’s data collection without compensation.
Is this the wasteful spending that President Trump and Musk pledged to root out? Are we supposed to accept the demolition of the jobs, the infrastructure, and the data that saves lives and property and increases prosperity, under the pretense of rooting out a federal workforce that is falsely vilified as being lazy, leeching the system, and wasting taxpayers’ money?
There is a perverse psychology of revenge at play here. In dismantling the federal workforce, the administration’s goal has been, in the words of director of the Office of Management and Budget Russell Vought (and architect of Project 2025), for “the bureaucrats to be traumatically affected.” This vengeful discourse has been embraced by a significant part of the country who gleefully watch the administration’s actions inflicting pain on the federal workforce across the board.
Year after year, billion-dollar disasters, many of them worsened by climate change, destroy and displace communities across the country. And as Danger Season and the heat waves, hurricanes, floods, and wildfires it brings loom over us, people across the country and territories—regardless of political persuasion—will suffer under extreme weather disasters without life-saving information, and without adequately funded and staffed emergency management, recovery, and reconstruction services.
The life- and property-saving value that federal workers bring to the people of this country is on the line. I fear that the consequences of dismantling the country’s weather and climate forecasting enterprise, along with its disaster assistance and recovery agencies, will strike a blow to communities still reeling from previous years’ extreme weather, even as they face this year’s worsening economic challenges from market uncertainty and rising costs of living.
The Trump administration is dismantling institutions, firing expert staff, and stealing data paid for out of our own pockets. Such theft will lead to harm as we lose the information that saves lives, protects property, and enables prosperity across many aspects of daily life in the US. It will also erode how the US regards science and the NWS’s standing as a beloved public good.
The country has invested in, and innovated through, this scientific public service for over a century, not for selling it to the highest bidder, but for the common good. Dismantling NOAA and the National Weather Service is a presidential grift that we must oppose.
When we save science, we save lives. Take action to tell the Trump administration to stop its all-out war on our science and our scientists.
What Is a Climate Model and How Does It Work?
Climate models are the main tool climate scientists use to predict how Earth will respond to more heat-trapping pollutants in the atmosphere.
But what exactly is a climate model? Let’s start off easy by breaking down the phrase “climate model.” The “climate” is simply the weather averaged over a long period of time. A “model” in this case is a physics-based approximation of a complex system. So a climate model is an approximation of the Earth’s weather over a long period of time.
Since their debut in the 1960s, scientists have been improving and increasing the complexity of climate models (check out my History of Climate Models blog), and my colleagues and I at UCS continue to use them today.
General circulation models
When climate scientists reference a climate model, they are generally referring to a general circulation model (GCM), the main tool climate scientists use to simulate and understand how the Earth’s oceans, land, atmosphere, and cryosphere (a word for the planet’s sea and land ice) respond both to changes in the planet’s own internal dynamics and to changes in heat-trapping pollutants.
Just by looking at the name, you can see that a GCM is a model that simulates the circulation of Earth’s different physical systems like the atmosphere and ocean. What causes a circulation? In my blog on the potential collapse of the Atlantic Meridional Overturning Circulation (AMOC), which is the conveyor belt of water moving in the Atlantic Ocean, I discussed how regions around the equator are warmer than the poles due to different amounts of incoming solar radiation, that is, energy from the sun.
The Earth’s climate system doesn’t tolerate imbalances in heat: because warm and cold air and water differ in density, Earth will do everything in its power to mix the cold poles and the hot tropics. The Earth’s atmosphere and oceans create circulations to even out temperature differences between regions; GCMs, or climate models, simulate these circulations quite well.
The AMOC is an oceanic circulation that transports warm surface water from the equatorial region to the North Atlantic and cold, deep water from the North Atlantic back toward the equator. Source: https://oceanservice.noaa.gov/facts/amoc.html
How exactly do GCMs simulate circulations? In order to model the climate system, a GCM uses a set of equations that describe how energy, momentum (e.g., moving air), and water interact and change within the atmosphere and oceans. GCMs represent the Earth as a giant three-dimensional grid and calculate how different variables (e.g., temperature, rainfall) change at each grid point. The models further simulate how heat and other climate variables travel to, and influence the values at, other grid points.
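To make the grid idea concrete, here is a deliberately tiny Python sketch (my own toy illustration, not code from any real GCM, with nearly all of the physics left out) of the core loop: temperature values live on a grid, and each time step mixes heat between neighboring cells:

```python
import numpy as np

# A toy 2D "planet": rows are latitude bands, columns are longitude bands.
nlat, nlon = 18, 36
temp = np.full((nlat, nlon), 288.0)              # start near 15°C everywhere (kelvin)
temp[:3, :] -= 40.0                              # cold band at one pole...
temp[-3:, :] -= 40.0                             # ...and the other
temp[nlat // 2 - 2 : nlat // 2 + 2, :] += 15.0   # warm band at the equator

mixing = 0.1   # fraction of the neighbor-average difference mixed in per step
for step in range(500):
    # Each cell relaxes toward the mean of its four neighbors, a crude
    # stand-in for the circulations that carry heat from warm cells to cold
    # ones. (For simplicity the grid wraps around at the edges, which a real
    # model would not allow at the poles.)
    neighbor_mean = (np.roll(temp, 1, axis=0) + np.roll(temp, -1, axis=0) +
                     np.roll(temp, 1, axis=1) + np.roll(temp, -1, axis=1)) / 4.0
    temp += mixing * (neighbor_mean - temp)

print(f"pole-to-equator contrast after mixing: {temp.max() - temp.min():.1f} K")
```

A real GCM does something analogous in three dimensions, with the full equations for momentum, energy, and moisture in place of this simple mixing rule.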
A climate model splits the Earth into a three-dimensional grid, with calculations of momentum, heat, and water changes at each grid point. Source: https://celebrating200years.noaa.gov/breakthroughs/climate_model/welcome.html
A climate model is made up of many models
In my blog on the history of climate models, I discussed how the first climate model, back in the mid 20th century, was actually a single model of the atmosphere, just one part of the climate system. We know that there are other components of Earth’s climate besides the atmosphere: the ocean, the land, and ice. Today’s climate models are complex precisely because they are made up of all of these components—atmosphere, land, ocean, and ice. We also have scientists who specialize in each component, allowing for further refinement and improvement in predictions of the Earth’s climate system. Today, a climate model is made up of smaller, component models of the atmosphere, ocean, land, and cryosphere.
How exactly do all these different components of Earth’s climate system communicate with each other while a climate simulation is running? Through something called a coupler, which connects the different model components so that data can easily flow between the different sub-models.
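As a rough sketch of the idea (hypothetical class and method names of my own, far simpler than the couplers in real earth system models):

```python
# A minimal, hypothetical coupler: each component model reads the fields it
# needs from a shared exchange and publishes the fields it produces.
class Coupler:
    def __init__(self):
        self.fields = {}                     # e.g. {"sea_surface_temp": 290.0}

    def put(self, name, value):
        self.fields[name] = value

    def get(self, name, default=None):
        return self.fields.get(name, default)

class OceanModel:
    def step(self, coupler):
        # ...solve ocean circulation here, then publish what the atmosphere needs
        coupler.put("sea_surface_temp", 290.0)        # placeholder value (kelvin)

class AtmosphereModel:
    def step(self, coupler):
        sst = coupler.get("sea_surface_temp", 288.0)  # ocean's surface temperature
        # ...use sst as a lower boundary condition for the atmosphere, then
        # publish fields the ocean needs in return
        coupler.put("surface_wind_stress", 0.1)       # placeholder value (N/m^2)

coupler = Coupler()
ocean, atmosphere = OceanModel(), AtmosphereModel()
for step in range(3):             # each iteration is one coupled time step
    ocean.step(coupler)
    atmosphere.step(coupler)
```

The real machinery also handles regridding between components that use different grids and keeps everything synchronized in time, but the basic pattern of exchanging fields each step is the same.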
Modern-day climate models incorporate multiple subcomponents that are integrated by means of a coupler.
Why do we need so many different models? Each model simulates something specific in its respective system. An ocean model calculates ocean circulation (like the AMOC) as well as ocean biogeochemistry, which is the science of how different molecules, such as carbon or nitrogen, cycle through the ocean. A land model will simulate:
- vegetation
- snow cover
- soil moisture
- evapotranspiration (process by which water moves from the land surface or vegetation to the atmosphere)
- river flow
- and carbon storage
A sea-ice model will calculate:
- reflection of incoming sunlight
- air-sea heat exchange
- and moisture interaction between ice and water
An atmospheric model calculates changes in:
- atmospheric circulation
- radiation
- clouds
- and aerosols
You might be thinking: how could we possibly simulate clouds if they’re created from many tiny water droplets and ice crystals? If we were to simulate a cloud and all of its tiny droplets, our three-dimensional grid would have to be extremely detailed. Unfortunately, we don’t have the computing power to perform these kinds of detailed calculations (we also don’t fully understand the dazzling complexity of all the physics involved), so scientists developed something called a parameterization. A parameterization can be thought of as a model within a model.
Let’s say there’s a cloud in the eastern Tropical Pacific Ocean near the Galápagos Islands. This cloud exists under certain atmospheric conditions (temperature, moisture, wind) that support its existence.
If we were to simulate this cloud in a GCM, these atmospheric conditions would first be reported to the cloud parameterization scheme from the main atmospheric model. The parameterization then calculates certain properties of the cloud, like how much sunlight the cloud reflects or how much cloud coverage there is in the cloud’s surroundings. The parameterization then reports back its findings to the main atmospheric model, which allows for continuous communication between the main atmospheric model and the parameterization to follow the cloud through its lifecycle.
Many small-scale processes are parameterized in GCMs. Beyond clouds, air quality and turbulence are also parameterized. Turbulence is just the word for abrupt, small-scale changes in wind (think of a plane suddenly hitting a bump, or a frisbee in the park changing direction or elevation as it catches a sudden gust of wind).
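In code, a parameterization is essentially a function the main model calls with the grid-scale conditions, getting back the bulk effect of the process the grid can’t resolve. A heavily simplified, hypothetical sketch (real cloud schemes involve far more physics, and the numbers here are invented purely for illustration):

```python
def cloud_parameterization(relative_humidity):
    """A toy cloud scheme: estimate bulk cloud properties from grid-scale
    humidity, standing in for droplet physics the grid cannot resolve.
    (Real schemes also use temperature, vertical motion, and much more.)"""
    # Invented rule: cloud fraction ramps up once humidity exceeds 70%.
    cloud_fraction = max(0.0, min(1.0, (relative_humidity - 0.7) / 0.3))
    reflected_sunlight = 0.5 * cloud_fraction   # fraction of sunlight reflected
    return cloud_fraction, reflected_sunlight

# The atmospheric model would call this at every grid point and time step,
# then feed the result back into its radiation calculation.
cover, reflected = cloud_parameterization(relative_humidity=0.85)
print(f"cloud cover: {cover:.2f}, fraction of sunlight reflected: {reflected:.2f}")
```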
What are climate models used for?
The obvious use for climate models is to predict how the Earth’s climate may change given a “forcing” applied to Earth’s atmosphere. A forcing is typically a change in the composition of gases in Earth’s atmosphere, or a change in incoming solar radiation, that leads to a radiative imbalance.
What do I mean by this? A key feature of the Earth’s climate system is that it is always trying to maintain equilibrium—that is, the energy coming into the planet must always equal the energy leaving the planet. Why? Because the whole of the Earth’s climate system is subject to the laws of thermodynamics: energy in = energy out. But if the composition of gases in the atmosphere changes, then this can affect the energy balance.
When CO2 is added to the atmosphere, an energy imbalance is established, and the only way to reach energy equilibrium again is for the planet to warm up. This is why the Earth is warming in response to added CO2 in the atmosphere.
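You can see this logic in a zero-dimensional energy balance calculation, a classic textbook toy rather than a GCM: absorbed sunlight is balanced against outgoing thermal radiation via the Stefan-Boltzmann law, and an added forcing must be balanced by warming. A minimal sketch:

```python
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0      # sunlight arriving at Earth's distance, W m^-2
ALBEDO = 0.3        # fraction of sunlight reflected straight back to space

absorbed = SOLAR * (1 - ALBEDO) / 4       # ~238 W/m^2 averaged over the sphere
t_eq = (absorbed / SIGMA) ** 0.25         # temperature where energy in = energy out

forcing = 3.7                             # rough forcing from doubling CO2, W m^-2
t_new = ((absorbed + forcing) / SIGMA) ** 0.25

print(f"equilibrium temperature: {t_eq:.1f} K")
print(f"warming needed to restore balance: {t_new - t_eq:.2f} K")
# This bare-bones answer (~1 K) is smaller than the ~3°C per doubling that
# models estimate because it leaves out feedbacks like water vapor and ice loss.
```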
In the 1960s, it started to become clear, with the help of climate models and theory, that fossil fuel use would warm the planet. The National Academy of Sciences released The Charney Report in 1979, which used climate models to predict, and warn the U.S. government, that the planet would warm due to fossil fuel emissions (though the U.S. government was warned about global warming as early as 1965). The authors estimated that the world would warm 3°C (5.4°F) given a doubling of atmospheric CO2 based on their climate model simulations in the 1970s.
But this is just one example. You could use a climate model to ask any question about something that would affect the climate system: “What would happen if the Yellowstone supervolcano erupted?” “What if the sun disappeared for five days?” “What if all atmospheric nitrogen was removed?” You can also construct a climate model with any arrangement of continents—for example, a climate model to represent Pangea Earth or a “Waterworld” planet with no continents at all. Some scientists even built a climate model to simulate the climate of Westeros from the Game of Thrones TV show.
Today, climate models are so complex that we can study how climate may be changing on a more regional level. In my research, I’ve run climate models to study how drought in the U.S. Northeast is changing with climate change, how the Earth may start to rapidly warm in the near-future given a change in oceanic warming, and how precipitation patterns might shift in the Southwestern U.S.
Climate models will continue to become more complex and more accurate
GCMs are complex: they are made up of multiple sub-models and rely on parameterizations for small-scale processes. They have been improved upon for decades and are the combined work of climate scientists, physicists, mathematicians, and computer scientists. They’re also remarkably accurate—model simulations run in the 1990s predicted how much the Earth would warm by 2025, and those predictions match our current observations.
In the future, climate models will become even more complex, perhaps resolving small-scale features, like clouds, rather than parameterizing them. We need these improved climate models to better predict and reduce uncertainty of regional climate change. The more scientists can equip society and decision makers with the best available climate science, the more we can sufficiently respond, adapt, and prepare for the changes underway.
The Fossil Fuel Industry’s Lasting Imprint on Global Sea Levels
The fossil fuel industry’s role in driving climate change is undeniable, yet corporate accountability remains a contested space. As the scientific evidence strengthens, courts around the world are increasingly considering the role of major fossil fuel companies in climate-related damages. Our latest research—published today in Environmental Research Letters—adds a critical piece to this legal and scientific puzzle by quantifying how emissions from the world’s largest fossil fuel and cement producers have directly contributed to sea level rise, both historically and in the centuries to come.
Advancing Climate Attribution Science
Attribution science has evolved to the point where we can now link certain climate impacts to emissions from identifiable entities, including corporations. Our study applies the well-established MAGICC7 climate model to trace heat-trapping emissions from the 122 largest fossil fuel and cement producers—the Carbon Majors—and assess their contributions to present-day and future global mean sea level rise.
Our findings are stark: emissions traced to these industrial actors are responsible for 37-58% of the observed global surface temperature increase and 24-37% of historical sea level rise. Moreover, our research projects that these past emissions alone have all but guaranteed an additional 10 to 22 inches (0.26-0.55 meters) of sea level rise by 2300—even if all emissions were to stop today. Importantly, this projected rise is in addition to the sea level rise driven by emissions from all other sources. This long-term impact reflects the delayed response of ocean temperatures and ice sheet dynamics to past greenhouse gas emissions.
These results demonstrate that the damages we are experiencing today, and those that will continue to unfold for centuries, are directly tied to the actions of a small number of corporate actors whose products and deceptive conduct have been driving climate change.
Why This Matters for Climate Litigation
Climate litigation has become a powerful tool for holding corporations accountable for their role in fueling climate change. Cases such as Milieudefensie et al. v. Royal Dutch Shell, Saúl Luciano Lliuya vs. RWE, and Delaware v. BP et al. are among those seeking to hold fossil fuel companies legally accountable for their contributions to climate change.
Our study provides quantitative, peer-reviewed scientific evidence that may help inform litigation strategies in several ways:
- Strengthening Causation Arguments: Courts require clear scientific evidence linking defendants’ actions to damages. Our research quantifies the specific share of global temperature rise and sea level rise that can be attributed to emissions from major fossil fuel producers, reinforcing claims of causation.
- Informing Liability and Damages Assessments: The long-term costs of sea level rise, ranging from infrastructure damage to displacement, are expected to reach trillions of dollars. By establishing a direct link between historical emissions and projected sea level rise, our findings contribute to discussions on liability and potential financial responsibility.
- Countering Industry Defenses: Fossil fuel companies often argue that climate change is the result of collective emissions rather than the responsibility of any particular entity. Our study results directly challenge this premise by demonstrating that a share of sea level rise can be attributed to the products traced to a limited number of companies.
- Emphasizing the Urgency of Action: Delayed emissions reductions all but guarantee future damages. Our study highlights that earlier mitigation efforts could have significantly reduced today’s impacts—and further delays will only increase the severity of future sea level rise and its consequences. The longer action is delayed, the greater the avoidable consequences for coastal communities worldwide.
Scientific research has played a role in informing policy and its importance in litigation is growing. Our study builds on past attribution work that has already been cited in legal arguments worldwide. This growing body of evidence works hand in hand with research showing that fossil fuel companies have long understood the climate consequences of their extraction, production, promotion, and sale of oil, gas, and coal.
Rather than taking responsibility, they have actively misled the public about the dangers and the harms we are now experiencing. The consequences of their actions are no longer speculative; they are quantifiable, they are unfolding before our eyes, and they are disproportionately affecting people and communities with the least capacity to withstand devastating climate impacts.
Looking Ahead
As legal battles over climate accountability continue, science will remain a cornerstone of these efforts. Our study contributes to the broader understanding of how industrial emissions have shaped global climate impacts and provides courts with data to inform their deliberations.
While litigation alone won’t solve the climate crisis, it is one piece of the broader landscape of climate governance. Establishing clear scientific links between emissions and damages is a critical step in ensuring that those responsible are held accountable and that decision-makers have the evidence needed to act.
The scientific reality is clear: emissions traced to major fossil fuel producers have played a significant role in driving present-day sea level rise, and the long-term consequences of these emissions will continue to shape our world for centuries to come.
The Infuriating Story Told by the Corporate and National Carbon Emissions Data
Accountability for past emissions should be a critical part of addressing climate change. But the first step in seeking accountability from the highest emitters, whether corporations or countries, is quantifying their contributions. While the pursuit of accountability should consider their role in creating and spreading disinformation and their deception around climate science and research, their contributions of heat-trapping gases to the atmosphere are an important place to start. Here, I’ll describe the data currently available to quantify these emissions, what they tell us about the drivers of climate change, and how we can achieve accountability for its harms moving forward.
Who are the Carbon Majors?
The Carbon Majors are the largest fossil fuel producers and cement manufacturers, a group to which 67.5% of all fossil fuel and cement emissions can be traced. To put a finer point on the immense impact of just a few organizations, more than one-third of these industrial emissions can be traced to just 26 companies. The Carbon Majors database includes emissions traced to investor-owned companies like ExxonMobil, BP, and Peabody; state-owned entities like Saudi Aramco and Gazprom; and a handful of nation-states with dedicated fossil fuel and cement production, presently or historically, like China and the former Soviet Union.
Earlier today, my colleague Shaina Sadai released a peer-reviewed study that links emissions traced to the Carbon Majors to present-day and future sea level rise. This study adds yet another example of how emissions from these entities are driving climate impacts globally. Previous UCS studies have already linked their emissions to increases in global average temperature, ocean acidification, and area burned by wildfires. When considered with the growing evidence of companies’ deception and disinformation, these studies paint a damning picture of how these companies shaped our world and the inequities that they’ve reinforced globally.
These data also show that although humans have been releasing heat-trapping emissions into the atmosphere since the industrial revolution, 50% of the emissions traced to the Carbon Majors have been released since just 2000. When visualizing data, clarifying the units used is critical. For emissions, this means distinguishing between cumulative historical emissions (all the heat-trapping gases ever emitted over time) and annual emissions (emissions released each year). Both aggregations tell important stories that can help us mitigate and adapt to climate change, but failing to specify how the data are expressed is not only imprecise but can be deliberately misleading. The data in the figure below show annual emissions measured in gigatons of CO2 per year.
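Before turning to the figure, the annual-versus-cumulative distinction is easy to see in code. A tiny illustration with made-up numbers (not the Carbon Majors data):

```python
import numpy as np

# Hypothetical annual emissions over five years, in GtCO2/yr (illustrative only)
annual = np.array([30.0, 32.0, 34.0, 36.0, 38.0])
cumulative = np.cumsum(annual)    # running total of everything ever released

print("annual (GtCO2/yr): ", annual)
print("cumulative (GtCO2):", cumulative)
# A chart of `annual` shows the rate of emissions each year; a chart of
# `cumulative` shows the total burden added to the atmosphere over time.
```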
Source: UCS/Carbon Majors Dataset
As I wrote in an earlier blog that detailed the nitty-gritty and backstory of these data, lawsuits and legal submissions worldwide cite the Carbon Majors Dataset to draw attention to the outsized role of fossil fuel companies in driving the climate crisis, including:
- Lliuya vs RWE, where a Peruvian farmer is suing one of Europe’s largest emitters of heat-trapping gases for its role in increasing the risk of a glacial lake outburst flood, which threatens him and the entire community of Huaraz. This case uses the Carbon Majors Dataset to quantify RWE’s contribution to global historic emissions.
- Greenpeace Italia vs ENI, where affected communities are suing to force ENI, Italy’s largest energy company, to reduce emissions and limit global warming. This case uses the Carbon Majors Dataset and source attribution research to underscore the outsized role of ENI in driving climate change.
- People of California vs Big Oil, where California is suing ExxonMobil, Chevron, BP, Shell and others for misleading advertising, failure to warn, and fraudulent business practices. This filing uses these data to demonstrate that the contribution of heat-trapping gases traced to the defendants is quantifiable.
- Multnomah County vs Big Oil, where the largest county in Oregon, home to Portland, is suing ExxonMobil and others for damages and adaptation costs following 2021’s unprecedented and deadly heatwave. This filing uses the Carbon Majors Dataset to show that emissions attributable to each entity are calculable using the amount, type, and emissions factor associated with each product.
- Commission on Human Rights in the Philippines, which was petitioned to investigate the impact of climate change on human rights and the role of the ‘Carbon Majors’ in driving climate change and obstructing action.
- InterAmerican Court of Human Rights, where Colombia and Chile requested an advisory opinion to clarify states’ human rights obligations in light of climate change. UCS’ joint intervention used these data to highlight the role a handful of corporations play in driving climate change.
When it comes to emissions, fossil fuel companies are not the only entities that have disproportionately contributed to the atmosphere’s ever-increasing concentration of heat-trapping gases. Some countries, like the United States, Russia, China and Germany, have also contributed an outsized amount of emissions to the atmosphere, and as a result should bear a proportionate amount of responsibility for addressing climate change and its impacts.
The data presented below are from the Global Carbon Project and separate from the Carbon Majors data discussed above. This figure displays annual emissions by country, highlighting the massive historical contribution of the United States (nearly 25% of total global emissions), where several large Carbon Majors are headquartered. But more discouraging are the barely visible contributions (shown in purple) of many nations that are now bearing the brunt of climate change impacts—countries like Tonga and Pakistan, among many, many others. Plotted together, these data tell a powerful story about historical contributions and contextualize discussions around future responsibilities.
Source: Global Carbon Project
What do these data mean for climate accountability?
Emissions attributed to both the Carbon Majors and individual countries paint a picture of historical contributions that’s difficult to unsee—and inspires a call for accountability.
In the US, states, counties, and communities are seeking accountability through the courts. These lawsuits primarily focus on fossil fuel companies’ deception and disinformation campaigns, which delayed climate action for years and continue to pollute our public discourse. While industry, trade groups, and their political allies have fought to dismiss the suits, courts across the country, including the Supreme Court just last week, have continued to affirm the right to seek accountability through the courts.
Internationally, high emitting countries continue to benefit from their historical emissions at the expense of many emerging economies that bear the brunt of climate impacts but have contributed the least amount of heat-trapping emissions. In multilateral agreements, powerful countries have resisted and slowed the adoption of mechanisms for accountability sought by more vulnerable nations.
At COP27 of the United Nations Framework Convention on Climate Change, countries around the world established a Loss and Damage Fund to aid developing countries that are most vulnerable to climate change but have contributed the least heat-trapping emissions. Countries initially pledged more than 700 million dollars to the fund. While this appears to be an entry point on the path toward accountability, the amount is far below the estimated need of 300 billion dollars annually by 2035: just 0.2% of what is needed. Further, in early March, the US announced its withdrawal from the Loss and Damage Fund.
But these data don’t tell the full story, particularly regarding the environmental racism and injustice wrought by the fossil fuel industry and high-emitting countries. This is especially evident in Cancer Alley in southern Louisiana, where a high density of petrochemical plants and refineries, combined with scarce regulation and willful neglect, has led to elevated rates of cancer and other health issues, a burden borne disproportionately by Black residents.
The Trump Administration has already reneged on the US’ bare minimum commitments to address issues of climate justice. We’ve seen fossil fuel industry leaders and climate deniers put in positions of authority. The US has pulled out of the Paris Agreement and stopped federal scientists from engaging with the Intergovernmental Panel on Climate Change, a global scientific body.
These data—both for countries and major carbon producers—tell a clear story about the history of human climate pollution, and the responsibilities of large emitters to act as the impacts of climate change grow increasingly severe. The need to hold them accountable, and guarantee they take up that responsibility, must guide our work every day.
How Major Carbon Producers Drive Sea Level Rise and Climate Injustice
In a new study released today, UCS attributes substantial temperature and sea level rise to emissions traced to the largest fossil fuel producers and cement manufacturers. And for the first time, we extend sea level projections into the future, quantifying how past heat-trapping emissions from the fossil fuel industry will impact the world for centuries to come.
The world’s largest fossil fuel and cement producers have known for decades that their products cause climate change, yet they spread disinformation to mislead the public and have profited as people around the world have suffered from ever-worsening climate impacts. Previous attribution research published by my Union of Concerned Scientists colleagues has allowed us to draw causal connections between sources of heat-trapping emissions and resulting impacts, like present-day increases in atmospheric greenhouse gas concentrations, air temperatures, sea levels, ocean acidification, and wildfire burned area. At the same time, social science research has shed light on what the industry knew and when they knew it.
In our new study, we bring together those two lines of research to understand what would have happened if fossil fuels had been phased out following key developments throughout history. We found that heat-trapping emissions traced to major carbon polluters have contributed nearly half of the present-day surface air temperature rise and nearly a third of the observed global mean sea level rise. And critically, we demonstrate how these emissions will cause harm for centuries to come.
The past haunts the future
Our new research quantifies how sea levels will rise for hundreds of years as a result of past emissions traced to products produced and sold by the Carbon Majors. By comparing scenarios with and without industrial fossil fuel development and its associated emissions, we find that past emissions from the Carbon Majors are projected to lead to an additional 0.26-0.55 m (10-22 inches) of sea level rise by the year 2300. While the magnitude of future sea level rise will depend on how emissions evolve this century, our results attributing additional future sea level rise to past emissions are largely unchanged regardless of what future emissions trajectory the world follows.
If the Carbon Majors emissions had ceased after 1990, the long-term sea level rise just from past emissions traced to their products is projected to be an additional 0.17-0.35 m (6-14 inches), showing how just a few decades of emissions can have a big impact on the future. Every delay in phasing out fossil fuels will burden future generations who need to adapt to rising seas and recover from loss and damage due to sea level impacts.
The world that could have been
To represent versions of the world that could have been if different actions had been taken and the world had acted in a timely manner to address the harms of fossil fuels, we develop several counterfactual scenarios. We use the newly updated Carbon Majors database, which quantifies annual emissions associated with coal, oil, gas, and cement production by each of the 122 largest fossil fuel and cement producers from 1854-2022. We explore three counterfactual scenarios, in which we remove the emissions from these companies starting in a particular year:
- 1854 counterfactual: A world where industrial fossil fuel development never occurred. In this scenario we remove emissions from the world’s largest fossil fuel and cement producers starting in the year 1854, the earliest period for which we can reliably know how much fossil fuel different companies were producing.
- 1950 counterfactual: A world where fossil fuels had been phased out when the industry knew that fossil fuels were harming the climate system. In this scenario we remove fossil fuel industry emissions after 1950 when research has shown that companies were internally aware of the harms of their products.
- 1990 counterfactual: A world where the international community had acted swiftly to phase out fossil fuels at the start of international efforts to address climate change. In this scenario we remove industry emissions after 1990, when the international community was first forming the Intergovernmental Panel on Climate Change and the United Nations Framework Convention on Climate Change.
In each set of simulations, we subtract emissions traced to the largest producers from the full emissions that actually occurred, and use a climate model, the MAGICC model, to determine what would have happened if the emissions from these companies had never entered the atmosphere. MAGICC (Model for the Assessment of Greenhouse Gas Induced Climate Change) is a publicly accessible model that has been widely used, including in Intergovernmental Panel on Climate Change (IPCC) reports, to understand how the future climate could respond to different heat-trapping emissions scenarios.
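Conceptually, building a counterfactual is arithmetic on the emissions time series before the climate model is run. Here is a schematic sketch with illustrative numbers, using a crude stand-in for MAGICC in which warming is roughly proportional to cumulative CO2 emissions; this is not our actual workflow, just the shape of the calculation:

```python
import numpy as np

def run_climate_model(emissions):
    """Stand-in for a real emulator like MAGICC: warming taken as roughly
    proportional to cumulative emissions (~0.00045 K per GtCO2)."""
    return 0.00045 * np.cumsum(emissions)

years = np.arange(1990, 2021)
total_emissions = np.linspace(22.0, 35.0, years.size)   # illustrative GtCO2/yr
removed_share = 0.3 * total_emissions                   # illustrative share to remove

factual = run_climate_model(total_emissions)
counterfactual = run_climate_model(total_emissions - removed_share)

attributed = factual[-1] - counterfactual[-1]
print(f"warming attributed to the removed emissions: {attributed:.2f} K "
      f"({100 * attributed / factual[-1]:.0f}% of the factual warming)")
```

The real analysis replaces the one-line emulator with MAGICC’s full treatment of gas cycles, ocean heat uptake, and sea level components, but the subtraction logic is the same.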
Across all scenarios, we find that the world would have been cooler and sea levels lower if fossil fuel emissions had been phased out earlier. We find that heat-trapping emissions traced to the Carbon Majors during 1854-2020 have contributed as much as 57% of the present-day surface air temperature rise and as much as 37% of the observed global mean sea level rise. In the 1950 counterfactual scenario, modern temperatures (averaged over 1990-2020) would have been 0.41-0.66°C above the preindustrial (1850-1900) average and global sea levels would have risen by 0.12-0.17 m. This implies that these companies are responsible for as much as 57% of the present-day air temperature rise and as much as 36% of the present-day sea level rise in this scenario. Impacts are similar in the 1854 and 1950 counterfactuals because relatively little heat-trapping pollution was released between 1854 and 1950 compared with the enormous amount released after 1950.
In the 1990 counterfactual, the Carbon Majors are responsible for as much as 26% of the present-day air temperature rise and as much as 17% of the present-day sea level rise. The 1990 scenario has full historical emissions from 1854-1990 and then emissions from the fossil fuel industry removed 1990-2020. The climate impacts from emissions in recent decades are not yet fully realized, meaning this scenario underestimates the industry’s responsibility.
How does this study compare to what was found in previous UCS research?
The findings of our new research corroborate those of previous UCS studies, affirming the strength of our methods and the accuracy of the models used. By using the newest available emissions data for the Carbon Majors, this study extends this type of attribution research to the present day. The main advancement of this particular study is the look to the future, which the updated methodology allowed us to do.
This research uses the same climate modeling approach used in the 6th IPCC report (2021) to project future temperatures under different emissions scenarios. Previous UCS research had used methods derived from the 5th IPCC report, released in 2014.
One of the biggest differences between the two approaches is how they determine sea level rise. The previously used model was only backward-looking, meaning it could describe sea level rise in the past. The new model accounts for different drivers of sea level change, including ice sheet and glacier models, capturing dynamics that are not evident in the recent past and allowing us to project into the future.
Using research to motivate action
Our research shows that emissions traced to the world’s largest fossil fuel and cement producers have caused global temperatures and sea levels to rise, and that sea levels will continue to rise for hundreds of years in response to heat-trapping emissions that have already occurred. The fossil fuel industry knew by the 1950s that its products were causing climate change, and at any time in the intervening decades these companies could have changed their business model to phase out fossil fuels. Yet they chose to keep producing, and profiting from, these harmful products. These actions have led to worsening climate change that will impact people for centuries to come.
As people around the world experience the devastating impacts of stronger storms, more destructive wildfires, sea level rise, and other detrimental changes, they are calling for those responsible to be held accountable. Communities around the world are pursuing accountability through court cases based on the fact that the fossil fuel industry knowingly deceived the public while producing products that would increase the risks of climate change. Research that traces specific climate impacts to the heat-trapping emissions produced by these companies can help inform this litigation, and researchers can play a role by designing research questions that inform global action. It is long past time for the world to phase out fossil fuels and to secure accountability for the harms that have occurred—and will occur in the coming years. The time to take action is now.
How Do ‘Future Climate Scenarios’ Shape Climate Science and Inform Policy?
The IPCC compiles scientific insights on climate change, informing policymakers and the public about risks and possible actions. One of its core tools is the use of future scenarios. Climate models and climate impact studies use emission scenarios—estimates of potential future changes in heat-trapping emissions—to help us see how choices made about emissions today can shape tomorrow’s climate. If you live in a coastal zone and have looked at maps of future sea level rise or have read about how climate change could be slowed with policy changes to reduce emissions, you’ve likely seen these scenarios in action. In essence, combined with climate models, they provide a way to envision the consequences of different actions or inactions.
Scenarios used in the IPCC are often mentioned in discussions about national climate targets, corporate sustainability plans, extreme weather events, slow onset events like sea level rise, and climate litigation. But what exactly are emission scenarios, how are they structured, and why are they essential for understanding climate change impacts and mitigation strategies?
What Are Future Climate Scenarios?
Scenarios are projections of future human-caused emissions and their effects on the Earth’s climate system. These scenarios are not predictions; they are “what-if” frameworks that allow us to test the likely outcomes of various choices and actions. By examining possible trajectories for global economic development, technology adoption, and policy action (the driving forces behind emissions), these scenarios help us assess a range of potential climate futures.
How Scenarios Have Evolved
Over the years, the IPCC and the scientific community have refined how they develop these scenarios. Initially, the IPCC used four emission storylines from the Special Report on Emissions Scenarios (SRES) as a scientific basis. Recognizing the need for a more flexible and policy-relevant framework, scientists then developed the Representative Concentration Pathways (RCPs) for later IPCC reports. The four RCP scenarios describe different levels of radiative forcing in the atmosphere by 2100; radiative forcing is the change in the energy balance of the Earth’s atmosphere due to heat-trapping emissions. The use of radiative forcing to describe emissions trajectories was then paired with varied societal pathways to generate the Shared Socioeconomic Pathways (SSPs).
The IPCC currently uses five SSPs that represent different ways society could evolve, incorporating everything from energy use to policy decisions that shape the climate future we may experience. These pathways describe different global socioeconomic conditions (e.g., levels of cooperation or competition among countries, technology adoption, and inequality), each paired with a level of radiative forcing. The five Shared Socioeconomic Pathways are:
- SSP1: Taking the green road – A world focused on sustainable development, global cooperation, and green technology adoption. This scenario would lead to the least amount of global warming.
- SSP2: Middle of the road – A scenario where global trends continue along historical patterns, with moderate development and emissions reductions.
- SSP3: A rocky road – A fragmented world with regional conflicts, slow economic growth, and high inequality, leading to continued high emissions.
- SSP4: A divided world – A highly unequal world where some adopt clean technology while much of the population remains dependent on fossil fuels.
- SSP5: Taking the highway – A scenario driven by economic growth and high fossil fuel use, leading to rapid warming.
These scenarios are identified by their social pathway and the approximate level of radiative forcing resulting from the scenario by 2100.
Figure 1: Future annual CO2 emissions in the five illustrative scenarios
Source: Sixth Assessment Report of IPCC Working Group I, 2021
By combining radiative forcing (the climate side, represented by the numbers at the end of each scenario name, e.g., 1.9 and 8.5) with socioeconomic factors, SSP scenarios provide a richer description of how the world might develop and how that development would influence emissions.
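Those forcing numbers can be translated into rough CO2-equivalent concentrations with the widely used simplified expression F ≈ 5.35 × ln(C/C0) W/m² (Myhre et al., 1998). A back-of-envelope sketch, treating all forcing as if it came from CO2 (a simplification; real scenarios include many gases and aerosols):

```python
import math

C0 = 280.0   # preindustrial CO2 concentration, ppm

def co2_equivalent(forcing_wm2):
    """Invert F = 5.35 * ln(C / C0) to get a CO2-equivalent concentration."""
    return C0 * math.exp(forcing_wm2 / 5.35)

for label, forcing in [("SSP1-1.9", 1.9), ("SSP2-4.5", 4.5), ("SSP5-8.5", 8.5)]:
    print(f"{label}: ~{co2_equivalent(forcing):.0f} ppm CO2-equivalent by 2100")
```

The spread is striking: roughly 400 ppm for the lowest pathway versus well over 1,000 ppm for the highest.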
How Do IPCC Scenarios Inform Climate Research?
IPCC scenarios serve as foundational tools in climate research, enabling scientists to explore how different concentrations of heat-trapping gases influence global temperatures, sea level rise, extreme weather events, and broader environmental changes. These scenarios are used in climate models to simulate various outcomes based on emissions trajectories, helping researchers assess climate system responses to different forcings.
One of the primary applications of IPCC scenarios is in global climate modeling. Climate scientists run general circulation models (GCMs) with these scenarios to simulate future climate states under these different emissions pathways. Studies show that high-emission scenarios like SSP5-8.5 lead to significant disruptions in atmospheric circulation patterns, affecting monsoons and mid-latitude storm tracks, compared to the low-emissions scenario SSP1-2.6.
IPCC scenarios also help scientists evaluate changes in the intensity and frequency of events like hurricanes, droughts, and wildfires. A recent review of tropical cyclone studies found that under high-emissions scenarios (SSP5-8.5), storms become more intense and produce heavier rainfall—even if their overall global frequency decreases. Lower-emission scenarios (SSP1-2.6) show more moderate increases, underscoring how different policy choices alter storm behavior.
These scenarios can also reveal how forests, oceans, and other natural systems might absorb or release carbon in the future. Research published in Earth System Science Data examined how, under the high-emission scenario SSP5-8.5, the ability of forests and oceans to absorb CO₂ weakens. Meanwhile, under an intermediate scenario (SSP2-4.5), these natural “carbon sinks” remain more effective for longer.
These are just a few examples of how researchers use these scenarios to help us understand possible climate futures.
Why Understanding These Scenarios Matters
These scenarios illustrate the range of potential climate outcomes based on different emissions trajectories, helping to assess the impact of various policy choices. A world limited to 1.5°C warming contrasts sharply with a high-emissions path. Without these scenarios, it would be nearly impossible to quantify the consequences of different emissions pathways or evaluate which strategies might work best to address climate change. These scenarios provide more than just hypothetical futures; they are tools for informed decision-making. They allow researchers, policymakers, and the public to grasp the potential consequences of inaction versus proactive climate strategies.
As we navigate the challenges of climate change, these scenarios remind us that the future remains open—and that our collective actions today will determine the climate reality we pass on to future generations.
Whose House? Our House. Why We Must Fight the Theft and Butchering of Our Federal Agencies
The ongoing destruction of federal agencies by the Trump team is an illegal effort that not only deprives the American public of essential services and upends the lives and livelihoods of federal workers, but also steals our legacy of investment in taxpayer-funded institutions and functions. Since our country doesn’t work safely or effectively without these institutions and functions, either the thieves will privatize them and make us pay forever for something we built and already own, or we’ll suffer in their absence. Unless we stop them.
Vital federal agencies face fates ranging from near-total destruction in the case of USAID, to deeply diminished functioning in the case of the National Oceanic and Atmospheric Administration (NOAA), even as we face an intense and lengthening wildfire season and approach another hurricane season, to dangerous muzzling in the case of the Centers for Disease Control and Prevention, even as bird flu spreads. The moves are wasteful, harmful, egregious, ill- or uninformed—and in many cases, illegal. They are, as my colleague, Julie McNamara writes, pushing American innovation to the brink. And they are devastatingly costly, not just in wasted taxpayer dollars, but in human lives.
It’s our house and they are hacking it down
Federal agencies represent generational investment in a functional society. They are an asset for today’s generations to pass on in good form to the next. Are reforms important over time? Absolutely. But this is not reform; it’s wreckage. Rather than verbally light my hair on fire for you, here’s a clumsy but apt metaphor for what the destruction means for everyday people.
You have a home.
It’s nothing fancy, but you’ve been building and investing in it for years and now it has all the necessities and basic comforts. You have to pay each month to keep the lights on, and sometimes you need to do repairs and upgrades, but you’re a careful homeowner on a budget and you make it work. Someday you’re going to pass it on to your kids.
These days, though, your partner has different ideas.
One Monday you come home from work to find someone has torn your shed down. Your partner says, “it wasn’t doing anything useful”. You say, “but I was using it. Where am I supposed to keep my bike and my tools? Why was this necessary?” But they are taking a call.
On Tuesday, you come home and all your appliances have been hauled away. Your partner says, “they weren’t working efficiently”. You ask, “how are we supposed to keep our food cold? Or have clean clothes?” Your partner says a little short-term pain is worth the long-term gain; you’ve been signed up for an appliance subscription service. “But we owned those ones”, you say, “they worked just fine. How is this good for us?” But they have turned on a show.
On Wednesday, you come home and all your windows have been smashed. “They said they were drafty,” your partner relays as they board up the empty frames with plywood. “But how will we have any light? How will we get fresh air?” you ask. “I guess we’ll pay for more electricity and ventilation,” they say. You ask, “how is this good for us?” But they don’t hear you over the hammering.
On Thursday, you come home to find your solar panels and the roof beneath them are gone. “I don’t believe in them,” your partner says, as you frantically staple a blue tarp over the hole in your house. “Believe in what?!” you ask. “Solar electricity? Roofs? The sun? How is this good for us?!”
The next morning you wake up in a dark room to the drip of rainwater from your exposed attic. You put on dirty clothes and are fumbling your way downstairs when the jack-hammering starts. Outside, a crew is hacking away at your foundation. “Stop!” you yell. “This is my home! What are you doing?” The foreman checks his clipboard and says, “Well, it’s basically worthless now, so we’re going to clear it out”.
You turn to your partner, who is finally looking confused and afraid, and you ask again, “So tell me, how is this good for us?”
Our federal agencies are vital
Your partner in this story is the people in America who are either initially supportive of these agency cuts or not paying close attention but, in either case, are due for real harm right along with everyone else. Those of us who can go about our lives with a sense of confidence and security do so in no small part due to the existence and effectiveness of our federal agencies. Check your weather app before you get dressed? Thanks, NOAA. Turn on your tap water with confidence that you can drink the coffee you make? Thanks, EPA. Review your kids’ college aid awards over breakfast? Thanks, Education Department. Opt to wear a mask to work because you heard the flu is surging? Thanks, CDC. Talk with your aging mom over lunch about a promising new dementia trial? Thanks, NIH. Ask her how a cousin’s recovery from Hurricane Helene is going? Thanks, FEMA. Stop for some groceries on the way home because a big storm is coming? Thanks again, NOAA…
These agencies and their functions didn’t sprout from the head of some government mastermind. They came to be because we needed and demanded that these functions be filled. They were built over time because we funded them. And they exist today because we need and use them.
Destroying them is theft
Ripping them down, as Elon Musk and DOGE are doing at President Trump’s urging, is not governing in the public interest. It’s ruling by impulse and arrogance and out of the selfish, profit-minded interests of the billionaire class and big polluters. And for the public, it’s the governance equivalent of being carjacked by a gaslighter: violent, illegal, and what the hell—I’m using this car!—all while being told by the carjacker they aren’t taking anything they shouldn’t take…
And like a carjacking, if and when we rescue these agencies from the chop shop, real damage will be done. To replace and rebuild what we had on January 20th will be incredibly costly and, in the near term, impossible: an unparalleled knowledge bank drained by the hemorrhaging of expert staff; skilled delivery of vital services stopped short by the firing of seasoned, dedicated public servants; decades-long data records vital to science permanently compromised by forced gaps in collection; infrastructure—from buildings to work stations—liquidated. These are all things paid for by us—not just for how they serve us today, but for how they will serve us in our unfolding, uncertain future. And these are all things stolen from us.
For what?
The spectacle of the world’s richest man slashing federal programs, services and workers in the name of efficiency would be a bad joke, except for how much it hurts and costs. And for what? Obviously not for efficiency, possibly for ruinous tax breaks for the wealthy, certainly for the privatization of public goods and the colossal grift entailed.
So when we hear of more cuts, we should strongly support and defend the people losing their jobs, and we should feel anger for the blatant destruction and theft of our legacy of investment, say “how dare you,” and fight it all, tooth and nail.
There are also things that this administration is doing of a more blatantly authoritarian nature, like threatening to defund colleges that allow students to exercise their right to free speech, threatening deportation of people for their political views, and working hard to dismantle the free press. They want to rule, not govern, so they are coming for everything that makes a democratic society possible.
And so we need to fight them on every front, get every win we can, punch holes in their fascist power play and petro-masculine money grab. Protecting federal agencies like NOAA from being gutted and privatized is one of those fronts. But fighting on any front is important.
So, what can we do?
What we can do depends on the day and on the kind of risk our personal privilege enables us to take. Not everyone in this country can afford to take risks right now. But for those of us who have privilege, now is the time to use it, and the time to start stretching outside our comfort zone.
For the moment, we have to keep giving Congress hell…
- Over the spring/Easter Congressional recess (April 11th through the 27th), we can go to our members’ local town halls, if they are still holding them. And if they are not, we can demand that they hold them by writing letters to the editor, contacting the local media, building pressure on social media, or standing outside their office with a sign. Republicans have complete control of the federal government; they have no excuse to hide from their voters.
- Write a letter to the editor. Here’s UCS’s LTE guidance for writing an effective one. Feel free to use talking points from this or other UCS blogs!
- Call members of Congress and tell them to defend against these attacks. Here’s a UCS resource for making calls.
- And write them specifically about protecting NOAA. Here’s another UCS resource.
And we can show up for federal agencies and staff…
- Support federal staff and scientists in our communities. Here’s a UCS resource for folks to have on hand.
- Keep our ear to the ground for opportunities to show up in person and demonstrate support for agencies and rejection of the ongoing harm.
- Help to amplify, on social media and other channels, the stories of fired staff and of the stressed staff who remain.
I’ll be the first to say that this is not enough to turn the tide right now; it’s just about being and staying in the fight. At the same time, taking care of ourselves and each other and not burning out is essential. So let’s stay awake to evolving threats, unify in as big and bold a front as we can, and get ready for when it’s time to go bigger and be braver.
The Long History of Climate Models
Climate models are the main tool scientists use to assess how much the Earth’s temperature will change given an increase in fossil fuel pollutants in the atmosphere. As a climate scientist, I’ve used them in all my research projects, including one predicting a change in Southwestern US precipitation patterns. But how exactly did climate models come to be?
Behind climate models today lie decades of both scientific and computer technological advancement. These models weren’t created overnight—they are the cumulative work of the world’s brightest climate scientists, mathematicians, computer scientists, chemists, and physicists since the 1940s.
Believe it or not, climate models are actually part of the driving force for the advancement of computers! Did you know that the second purpose for the world’s first electronic computer was to make a weather forecast? Predicting the weather and climate is a complex problem that combines computer science and physics. It is a form of applied mathematics that unites so many different fields.
This is the first blog in a three-part series. This first one focuses on the history of climate models; the second will discuss what a climate model is and what it is used for; and in the third blog I will explore how climate scientists are integrating the rapidly changing field of machine learning into climate science.
Predicting the weather and climate is a physics problem
Scientists hypothesized early on that we could predict the weather and the longer-term climate using a set of equations that describe the motion and energy transfer of the atmosphere. Because there is often confusion about the difference between weather and climate, keep in mind that the climate is just the weather averaged over a long period of time. The atmosphere around us is a fluid, invisible to the human eye, and we can apply math to that fluid to predict how it will behave in the future. Check out this website for a representation of that fluid in motion.
In 1904, Vilhelm Bjerknes, a founder of modern meteorological forecasting, wrote:
“If it is true, as every scientist believes, that subsequent atmospheric states develop from the preceding ones according to physical law, then it is apparent that the necessary and sufficient conditions for the rational solution of forecasting problems are: 1) A sufficiently accurate knowledge of the state of the atmosphere at the initial time, 2) A sufficiently accurate knowledge of the laws according to which one state of the atmosphere develops from another.”
In other words, if we know the right mathematical equations that predict how atmospheric motion and energy transfer work, and we also know how the atmosphere currently looks, then we should be able to predict what the atmosphere will do in the future!
So why didn’t Vilhelm try to predict the weather in 1904 if he knew it could be done? At the time, observations of the state of the atmosphere were unreliable and sparse. Vilhelm also knew that calculating the future state of the atmosphere would require an immense number of calculations, far too many to do by hand.
It wasn’t until the early 1920s that Lewis Fry Richardson, an English mathematician and physicist, succeeded in doing what had until then been thought impossible. Using a set of equations that describe atmospheric movement, he predicted the weather eight hours into the future. How long did it take him to carry out those calculations? Six weeks. So you can see why it wasn’t possible to make any timely weather prediction with calculations done by hand.
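To get a feel for why this took six weeks by hand, here is the kind of arithmetic involved, boiled down to a toy one-dimensional example (not Richardson’s actual equations): a weather-like field carried along by a steady wind, stepped forward with finite differences. A computer does millions of these small updates; by hand, each one is pencil-and-paper work:

```python
import numpy as np

# Toy 1D advection: a pressure-like blob carried eastward by a constant wind.
n_points = 100
dx = 100_000.0    # grid spacing: 100 km
dt = 600.0        # time step: 10 minutes
wind = 10.0       # wind speed, m/s

x = np.arange(n_points) * dx
field = np.exp(-((x - 2.0e6) ** 2) / (2 * (3.0e5) ** 2))   # a smooth blob

for step in range(48):   # 48 steps of 10 minutes = an 8-hour forecast
    # Upwind finite difference: each point is updated from its western neighbor.
    field = field - wind * dt / dx * (field - np.roll(field, 1))

peak_km = x[np.argmax(field)] / 1000.0
print(f"after 8 hours the blob's peak has moved to x = {peak_km:.0f} km")
```

Each loop iteration here is 100 little update calculations; the real atmosphere demands vastly more, in three dimensions and for several variables at once.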
Computers changed the forecasting game
So how can we predict the weather and climate so well today? Computers! In 1945, the Electronic Numerical Integrator and Computer (ENIAC) was completed in the United States. ENIAC was the first electronic computer that could perform an impressive number of calculations in seconds (here, “impressive” is relative to the era: ENIAC could make about 5,000 calculations per second, while today’s iPhone can make billions of calculations per second). Note that there were computers before ENIAC, but they weren’t fully electronic and weren’t as capable.
Originally, this electronic computer was used for nuclear fallout calculations, but believe it or not, the second use for ENIAC was to perform the calculations necessary to predict the weather. Why? Because weather prediction was the perfect overlap of applied mathematics and physics that required the quick calculations of a computer.
In 1946, one year after the ENIAC was finished, John von Neumann, a Princeton mathematician who was a pioneer of early computers, organized a conference of meteorologists. According to Jule Charney, a leading 20th century meteorologist, “[to] von Neumann, meteorology was par excellence the applied branch of mathematics and physics that stood the most to gain from high-speed computation.”
Von Neumann knew that with ENIAC we could start predicting both the short-term weather and the longer-term climate. In 1950, a successful weather prediction for North America was run on ENIAC, setting the stage for the future of weather and climate prediction.
Two of the ENIAC programmers preparing the computer for Demonstration Day in February 1946. US Army Photo from the ARL Technical Library archives. Left: Betty Jennings; right: Frances Bilas.
Who were the six programmers who ran these calculations? Jean Bartik, Kathleen Antonelli, Marlyn Meltzer, Betty Holberton, Frances Spence, and Ruth Teitelbaum, all women mathematicians recruited by the US Army during World War II as so-called "human computers". I mention this briefly here because women's contributions to the advancement of computer technology and weather forecasting are often overlooked. During Women's History Month, it is even more important to elevate these women and remind folks of their critical contributions.
The first climate model
The weather forecast run on ENIAC in 1950 covered only a short period of time (24 hours) and only North America. To successfully model the climate, we would need a model that simulates decades of Earth time and covers at least one hemisphere of the Earth.
The first general circulation model (GCM)—what climate scientists call climate models—was developed by Norman Phillips in 1956, using a more sophisticated computer than ENIAC that could handle more calculations. By today's standards, however, this first GCM was primitive, representing a highly simplified atmosphere.
After Phillips successfully demonstrated that the climate system could be modeled, four institutions in the United States—UCLA, the Geophysical Fluid Dynamics Laboratory (GFDL) in Princeton, the National Center for Atmospheric Research in Boulder, and Lawrence Livermore National Laboratory—independently developed the first atmosphere-only GCMs in the 1960s. Having four models, each developed slightly differently, contributed to the robustness of the discipline early on in climate modeling.
And thus, the age of climate modeling was born. These GCMs could predict the future state of the atmosphere and its circulation given any change to atmospheric composition (such as heat-trapping pollutants), which is the main application of GCMs to this day.
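To see, in miniature, how a model can translate a change in atmospheric composition into a temperature prediction, here is a zero-dimensional energy-balance model in Python. This is vastly simpler than a GCM, the parameter values are illustrative (the ~3.7 W/m² forcing for doubled CO2 is a standard textbook estimate), and it omits the feedbacks that make real-world warming larger.

```python
# A zero-dimensional energy-balance model: absorbed sunlight (plus any extra
# forcing from added greenhouse gases) must equal emitted longwave radiation.
S0 = 1361.0         # solar constant (W/m^2)
ALBEDO = 0.3        # fraction of sunlight reflected back to space
SIGMA = 5.67e-8     # Stefan-Boltzmann constant (W/m^2/K^4)
EMISSIVITY = 0.612  # effective emissivity, tuned to give a realistic baseline

def equilibrium_temp(forcing=0.0):
    """Solve for the surface temperature (K) that balances the energy budget."""
    absorbed = S0 * (1 - ALBEDO) / 4 + forcing
    return (absorbed / (EMISSIVITY * SIGMA)) ** 0.25

t_base = equilibrium_temp()
t_2xco2 = equilibrium_temp(forcing=3.7)  # ~3.7 W/m^2 for doubled CO2
print(f"Baseline:                   {t_base - 273.15:.1f} C")
print(f"Doubled CO2 (no feedbacks): {t_2xco2 - 273.15:.1f} C")
```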
Today's climate models
GCMs are much more sophisticated today than they were in the 1960s. They run at higher resolution (smaller grid cells, which means more calculations per unit area), they are informed by better physical approximations, and they replicate the climate much more faithfully. With each decade since their inception, sub-models simulating phenomena such as carbon cycling, vegetation, and aerosols have been added, improving GCM complexity and accuracy. Present-day GCMs consider changes not just in the atmosphere, but also in the ocean, the land, and sea ice (see figure below).
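Why does higher resolution cost so much? A back-of-the-envelope sketch with illustrative numbers: halving the horizontal grid spacing doubles the number of grid points in each horizontal direction, and numerical stability typically requires a roughly proportionally smaller timestep, so the total work grows by about a factor of eight.

```python
# Rough cost scaling for doubling horizontal resolution (illustrative only).
coarse_km, fine_km = 200, 100                # grid spacings before and after

factor_east_west   = coarse_km / fine_km     # 2x more points east-west
factor_north_south = coarse_km / fine_km     # 2x more points north-south
factor_timesteps   = coarse_km / fine_km     # ~2x more timesteps for stability

total = factor_east_west * factor_north_south * factor_timesteps
print(f"Approximate cost multiplier: {total:.0f}x")  # ~8x
```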
Modern climate models incorporate multiple sub-components that simulate land, ocean, and sea-ice conditions to inform modeling of atmospheric conditions.
When scientists run a climate model, they are actually running four different models—one for the atmosphere, one for the ocean, one for the land, and one for sea ice—simultaneously. These four models then communicate with each other through something called a "coupler" during the calculation stage.
So, for example, if the ocean model says the temperature of the ocean surface changes by 3°C given a doubling of CO2 in the atmosphere, this information is then relayed to the atmospheric model, which can then respond and change the atmospheric circulation based on that temperature change.
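Here is a deliberately tiny sketch of that idea in Python. The component "models" below are placeholders I made up (real atmosphere and ocean components solve fluid dynamics on global grids, and real couplers also regrid fields and conserve energy); the point is only the choreography: advance each component, then exchange fields through the coupler.

```python
# A toy coupled model: two components advance each timestep, then the
# "coupler" hands the ocean's sea-surface temperature to the atmosphere.
# All numbers and response rates are invented for illustration.

class Ocean:
    def __init__(self):
        self.sst = 15.0  # sea-surface temperature (deg C)

    def step(self, co2_multiplier):
        # Illustrative response: warm gradually toward +3 C under doubled CO2.
        target = 15.0 + 3.0 * (co2_multiplier - 1.0)
        self.sst += 0.1 * (target - self.sst)

class Atmosphere:
    def __init__(self):
        self.surface_air_temp = 14.0  # near-surface air temperature (deg C)

    def step(self, sst):
        # The air relaxes toward the sea-surface temperature beneath it.
        self.surface_air_temp += 0.2 * (sst - self.surface_air_temp)

ocean, atmos = Ocean(), Atmosphere()
for _ in range(100):                  # each loop iteration = one coupled timestep
    ocean.step(co2_multiplier=2.0)    # 1) advance each component model
    atmos.step(ocean.sst)             # 2) coupler passes the ocean's SST along

print(f"SST: {ocean.sst:.1f} C, surface air: {atmos.surface_air_temp:.1f} C")
```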
For more information on what a climate model is, how it works, and how climate scientists use them, check out my climate model explainer blog coming soon.
Climate models improve incrementally through decades of scientific work
Climate models are some of the most reliable models in existence because they have been built upon, tested, and corrected for decades. And while there are some problems we're still working on correcting, overall they can replicate the climate system with incredible skill and accuracy.
Climate models are at the foundation of the scientific consensus around climate change. And at UCS, we use climate models to advance our understanding of the climate system in order to predict how communities are affected by a changing climate, and, importantly, to know who to hold accountable for the climate crisis.
We Are Charting a Path for Science in the Trump Era
This past week was a busy one for the Union of Concerned Scientists.
On Monday, a whopping 48 scientific societies, associations, and organizations—representing almost 100,000 scientists from diverse disciplines—sent a letter, organized by UCS, to members of Congress demanding they protect federally funded scientific research and federal scientists. Anyone who’s worked with any scientific organization on a collective effort knows it is quite the feat to get such incredible unity in the scientific community.
On Thursday, the Campaign Legal Center filed suit on behalf of UCS and other groups against Elon Musk and his so-called Department of Government Efficiency (DOGE) for acting beyond their power to slash federal funding, dismantle federal agencies, and fire federal employees.
On Friday, UCS staff and members rallied with thousands of others at the Stand Up for Science 2025 events in cities and towns across the country. I spoke at the DC rally, and I was impressed to see the turnout and energy of the scientists and science supporters who trekked to the National Mall to tell the world in a unified voice that the administration’s attacks on science are unacceptable and the scientific community will not be silent.
That night, I shared my rally message on MSNBC's "Prime." Here is the full segment on how the Trump administration's all-out assaults on science and scientists are harming real people's health and safety.
This has been a challenging month. Many in the scientific community—and in the general population—have been unclear about what to do in response to the Trump administration’s aggressive and unlawful disruptions to the federal government. The speed and scale at which the Trump Administration has taken a sledgehammer to federal science agencies and the dedicated experts within them has been alarming and disorienting. With limited levers of power across the government to stop these actions, and a complete disregard for policy, process, and law by Trump Administration officials, it is no surprise that people feel disillusioned and powerless.
But we mustn't give in to that feeling. The scientific community has never been one to walk away from a challenging problem. In fact, we pursue such problems. We undoubtedly face an uphill battle in our current environment, but there is a path forward. We must preserve as much as we can in the federal government, prevent new damage from happening, and rebuild from outside the government when necessary.
In my conversation with MSNBC host and White House veteran Symone Sanders Townsend, she noted that no savior is coming to save us, that we need to lead ourselves out of this, and it is the scientists who are now stepping into the streets.
I felt that on Friday as I spoke to a sea of science supporters overlooking the Reflecting Pool. It is us, as the scientific community, who now have the chance to lead, to be brave, and to do everything in our power to insist on an administration and a world that uses science for good.
I’m determined to face the wind and I hope you are, too.
President Trump’s Cabinet of Polluters, Frackers and Climate Crisis Deniers Rushes to Gut Protections
Lee Zeldin was full of pablum in his January Senate confirmation hearing to run the Environmental Protection Agency (EPA). A former member of Congress from Long Island, New York, with scant regulatory experience, Zeldin promised to “defer to the research of the scientists” on whether climate change made oceans more acidic. In even more laudatory language, he said he would “defer to the talented scientists,” on whether Earth had hit thresholds for runaway climate change.
He said he “would welcome an opportunity to read through all the science and research” on pesticides and search for “common sense, pragmatic solutions” on environmental issues. Claiming there was “no dollar large or small that can influence the decisions that I make,” Zeldin went so far as to say, “It is my job to stay up at night, to lose sleep at night, to make sure that we are making our air and our water cleaner.”
It was all a lie. Last week, President Trump said Zeldin was considering firing 65% of EPA's staff, which would amount to nearly 10,000 of the agency's 15,000 workers. The White House later issued a clarification—as if it made any difference—that Zeldin was "committed" to slashing 65% of the agency's budget. The EPA issued a statement saying President Trump and Administrator Zeldin "are in lock step."
Also last week, the news broke that Zeldin is urging the White House to strike down the 2009 EPA finding that global warming gases endanger public health and the environment. That finding, made under the Obama administration, girded federal efforts to reduce vehicle and industrial emissions. The finding, long a legal target for climate deniers, has so far held up, even in an ultra-conservative Supreme Court, but that has not stopped the administration from attacking it. Project 2025, the blueprint organized by the Heritage Foundation to guide this White House, calls for an “update” to the endangerment finding. Leading climate denier and former Trump transition adviser Steve Milloy told the Associated Press last week that without the finding, “everything EPA does on climate goes away.”
This is after Zeldin told senators in written answers for his confirmation that he planned to "learn from EPA career staff about the current state of the science on greenhouse gas emissions and follow all legal requirements." Instead, Zeldin has scientists in a state of bewilderment. In one fell month, he has every employee looking over their shoulder, fearing the dismissal of their work or a tap on the shoulder announcing their own.
Zeldin’s latest “lock-step” actions cap an already-breathtaking first month in running the EPA.
He has launched an illegal effort to claw back $20 billion in EPA clean energy funding, much of it targeted for disadvantaged communities. He placed nearly 170 workers in the Office of Environmental Justice on administrative leave and oversaw the firing of about 400 probationary staff (although some have, for the moment, been brought back after public outcry).
Zeldin has begun a rollback of Biden administration energy efficiency and water conservation regulations for home appliances and fixtures, and is asking Congress to repeal the waivers that allow California to phase out sales of new gasoline-only vehicles and to set stricter emissions standards for heavy-duty trucks. Many other states have decided in recent years to follow California's standards, as they are allowed to do under the Clean Air Act. Combined, these states account for about 40% of the US automobile market.
There are surely many more attempts to come that will turn back the clock on environmental protection.
An EPA led by industry apologists
Zeldin's EPA includes a rogue's gallery from President Trump's first term.
Returning to the EPA in top spots for chemical regulation are Nancy Beck and Lynn Dekleva. Both formerly worked for the American Chemistry Council, the top lobbying arm of chemical manufacturers, and Dekleva spent more than three decades at DuPont, one of the most notorious companies for burying the dangers of PFAS.
In the first Trump administration, Beck was at the center of the suppression of science to resist the most stringent regulations of, or bans on, carcinogenic chemicals such as trichloroethylene, PFAS, methylene chloride, and asbestos. She was also reported to have helped bury the strongest possible health and safety guidelines for communities reopening during the height of the COVID-19 pandemic. Dekleva was accused during her first stint in President Trump's EPA of pressuring employees to approve new chemicals and colluding with industry to weaken the Toxic Substances Control Act.
Zeldin's nominee for deputy administrator, David Fotouhi, is another returnee; he was at the center of the first Trump administration's efforts to strip wetlands protections. When not inside the EPA, Fotouhi has a long record of defending industries in legal battles over standards and in contamination lawsuits involving toxic chemicals such as asbestos, PFAS, PCBs, and coal ash.
Holding high-level positions in the Office of Air and Radiation are Abigale Tardif and Alex Dominguez. Tardif lobbied for the oil and petrochemical industry and was a policy analyst for the Koch-funded network Americans for Prosperity. Dominguez lobbied for the American Petroleum Institute, which opposed the vehicle pollution standards of the Biden administration.
Aaron Szabo has been nominated to be assistant administrator for Air and Radiation. Szabo was a contributing consultant to the Project 2025 chapter on the EPA, which recommends sharply curtailing the agency's monitoring of global warming gases and other pollutants and eliminating the Office for Environmental Justice and External Civil Rights.
Other recent EPA appointees who also contributed to Project 2025 (which President Trump disavowed during the presidential campaign) are Scott Mason and Justin Schwab. Steven Cook, a former lobbyist for plastics, chemicals, and oil refining, and another veteran of the first Trump administration, is also returning.
Zeldin may be inexperienced at regulation, but none of the above are. Kyle Danish, a partner at Van Ness Feldman, a law firm that represents energy clients, told the New York Times, "This group is arriving with more expertise in deploying the machinery of the agency, including to unravel regulations from the prior administration. They all look like they graduated one level from what they did in the first Trump administration."
Same playbook at other agencies
Other agencies responsible for addressing climate change pollution have also quickly deployed the machinery of environmental destruction.
Transportation Secretary Sean Duffy issued a memorandum ordering a review of the Biden administration's fuel economy standards, claiming without evidence that the standards would destroy "thousands" of jobs and "force the electrification" of the nation's auto fleets. This is despite the agency's own analysis showing the rules would save consumers $23 billion in fuel costs and deliver $13 billion in annual health benefits from reduced air pollution.
Secretary Duffy also issued a memorandum canceling the Department of Transportation's plans to address environmental justice for low-income populations and communities of color, its climate change and resilience policies for department assets, and the department's Equity Council. Again, no facts were offered as to why communities disproportionately beset by pollution and pollution-related diseases should be excluded from protection. He was just following President Trump's Orwellian executive order that aims to wipe any consideration of race, gender, climate, equity, and disproportionate impacts from federal programs.
Over in the Interior Department, Secretary Doug Burgum issued a memorandum directing all his assistant secretaries to provide action plans to "suspend, revise, or rescind" more than two dozen regulations. The obvious goal is to plunder more public land and water for the private profit of the fossil fuel and mining industries. Many of the regulations to be revised or killed involve endangered wildlife and plants, landscape and conservation health, the Migratory Bird Treaty, and accounting for the benefits that reducing climate-related pollution brings to public health, property, and agriculture.
In a recent interview on FOX News, Secretary Burgum said he was “completely embracing” the massive shrinking of the federal workforce by the Department of Government Efficiency, a cruel act that means he is just fine with DOGE’s 2,000 job cuts at Interior, including 1,000 in the chronically understaffed National Park Service, which has a $23.3 billion backlog for deferred maintenance.
Climate mockery at Department of Energy
And then we have the reported layoff of between 1,200 and 2,000 workers at the Energy Department, now run by Chris Wright, a former CEO of one of the nation's largest fracking companies. In President Trump's Cabinet, Secretary Wright is the most blunt in dismissing the effects of the climate crisis. In 2023, he said "the hype over wildfires is just hype to justify" climate policies. He said, "There is no climate crisis, and we're not in the midst of an energy transition."
He has doubled down on his rhetoric during his first month in office. Wright told a conservative policy conference in February—without evidence—that net zero goals for carbon emissions by 2050 were "sinister" and "lunacy." Wright also went on FOX Business in February to say that climate change is "nowhere near the world's biggest problem today, not even close."
Despite all the evidence already unfolding that climate change is a factor in the increasing number of billion-dollar weather disasters in the US, and despite a major 2023 study projecting that five million lives a year could be saved around the world by phasing out fossil fuels and their pollution, Wright said a warmer planet with more carbon dioxide is “better for growing plants.” Never mind the communities living in the crosshairs of contamination and climate catastrophe or conservationists who are concerned anew about endangered species.
Wright spent his first month in office postponing Biden-era energy efficiency standards for home appliances, claiming without evidence that the standards have "diminished the quality" of those appliances. His office announced the cancellation of $124 million in contracts, many of them connected to diversity, equity, and inclusion initiatives. He said those contracts were "adding nothing of value to the American people." When asked if he wanted fossil fuels to "come back big time," Wright responded, "Absolutely."
Behind the pablum of confirmation hearings was an iron fist
And over in the Commerce Department, the 6,700 scientists and 12,000 staffers at the National Oceanic and Atmospheric Administration (NOAA) are reeling from the recent first wave of hundreds of layoffs. Many more job losses are threatened, with sources telling major media outlets that the Trump administration and new Secretary Howard Lutnick are considering a 50% cut in staff and a 30% cut in the agency's budget.
It is irrelevant to the Trump administration that NOAA is a bedrock agency that protects the public with its real-time tracking of dangerous storms. It is at the center of long-term federal analysis on climate, the toll in property and life of global warming, the health of our oceans, and the state of our fisheries. Instead of being placed on a pedestal for this central role, NOAA is as much a bullseye for polluters and plunderers as the EPA. Project 2025 calls for the breaking up of NOAA because it “has become one of the main drivers of the climate change alarm industry and, as such, is harmful to future US prosperity.”
Lutnick, a billionaire Wall Street financier, told senators in his January confirmation hearing that he had “no interest” in dismantling NOAA. The firings suggest the dismantling has begun.
When Lee Zeldin promised at his confirmation hearing that he would “defer” to talented scientists on climate change data, it was a mere six days after NOAA and many other weather agencies around the world confirmed that Earth had its hottest year yet in 2024. That was obviously lost on him. In just one month, the only demonstrated deference of Zeldin, Burgum, Wright, Duffy, and Lutnick is to President Trump’s mantra of “drill, baby, drill” and the deregulation of toxic industries.
Left in the wake are demonized and demoralized federal scientists.
In his address to Congress this week, President Trump boasted about ending "environmental restrictions that were making our country far less safe and totally unaffordable." Hopefully it will not take one hurricane too many, one contamination too many, or one disappearing species too many for the country to realize we cannot afford to be without those scientists. We will be far less safe without them.