E&E News [paywalled] recently tackled the subject of evolving climate science. Reporter Chelsea Harvey examined the five assessment reports from the U.N. Intergovernmental Panel on Climate Change (IPCC), which was established in 1988 by the U.N. Environment Programme and the World Meteorological Organization.
The UN tasked the IPCC with assessing the risks from climate change by using the most up-to-date scientific and technical information. The five IPCC reports since 1988 have grown increasingly complex, with the latest published in 2014 (the sixth is due in 2022).
The bottom line over 30 years? The big picture forecast of climate warming, covering a broad range of potential temperature rise, remains the same:
[M]ajor uncertainties about climate sensitivity remain, even though estimates of its value are largely the same as they were in the 1990s. The First and Fifth assessment reports both suggest that a doubling of atmospheric carbon dioxide would increase global temperatures by between 1.5 and 4.5 C.
But the IPCC has been too conservative on some specific topics, like sea level rise:
The First Assessment Report suggested that sea levels would likely rise by about 65 centimeters by the end of the century, under a business-as-usual trajectory, “mainly due to thermal expansion of the oceans and the melting of some land ice.” By the Fifth Assessment Report in 2014, scientists were projecting up to a meter of sea-level rise by the end of the century under a business-as-usual scenario.
Even in the few years since, multiple studies have suggested that the IPCC’s estimates may be too low, taking into account improvements in scientists’ understanding of the physical processes affecting the world’s ice sheets. Some scientists expect the projections reported in the Sixth Assessment Report will be even higher.
And the IPCC underestimated how much warming has already occurred since 1880:
[W]hile the First Assessment Report estimated that global temperatures have warmed by between 0.3 and 0.6 degree Celsius in the past century, the Fifth Assessment Report honed this estimate to about 0.85 C since 1880.
The science has also improved in terms of modeling capability and ability to forecast impacts in specific parts of the globe, as well as attribute particular weather events to climate change with more precision.
Clearly the science over the past 30 years has been too conservative in some respects, which should give us even more motivation to take action on climate. We’ll need to reduce greenhouse gas emissions as much as we can through clean technology deployment, while preparing for the now-unavoidable impacts to come.
We know that city dwellers have a smaller carbon footprint than suburbanites. But now we have a real case study with carbon measurements to document the phenomenon.
Fourteen scientists at the National Center for Atmospheric Research, the University of Utah and several other universities set up a network of carbon dioxide sensors across Salt Lake City and its suburbs. The Washington Post reported on the results:
As suburbs have expanded southwest of Salt Lake City over the last 10 years, carbon dioxide emissions have spiked…
It’s the latest indication that suburban expansion takes an environmental toll, with people driving greater distances and building larger homes that use more energy for heating and cooling.
Similar population growth in the center of Salt Lake City didn’t take the same toll, according to the research. Carbon dioxide emissions in the city center were already higher than in nonurban places. But as the population there grew by 10,000 people, the emissions didn’t increase further.
It’s yet more evidence that encouraging urban growth is one of the most important steps we can take to reduce greenhouse gas emissions. And it’s also a reason why supposedly “environmental” organizations like Sierra Club California that oppose pro-infill measures like SB 827 are actually damaging the environment by doing so.
Cape Town, South Africa, a city of about 4 million people, is just three months away from having to shut down its water supply for residents, barring rain between now and then. Residents will then have to line up at 200 sites around the city to pick up a ration of 6 gallons of water per day per person.
How did this major city, which ironically won an international award for water conservation at the Paris UN climate talks in 2015, end up in this situation? Climate change-induced drought, a growing population, and poor planning are the major culprits. As Warren Tenney from Arizona Municipal Water Users Association explained:
Cape Town’s reservoirs are drying up. There is no precedent in their records for three consecutive years this dry. The extreme drought is compounded by a 79 percent growth in population since 1995, while water storage capacity increased only 15 percent. Plans for developing new water supplies, including a desalination plant, are behind schedule. Steps were not taken early enough to head off this slow-moving disaster. Cape Town is now trying to catch up by lowering water pressure in its distribution system and investing in a far-reaching public information campaign to conserve water. These actions have helped to cut the city’s daily water consumption by 45 percent. If Cape Town can reduce consumption yet another 25 percent, they may make it to the rainy season that is supposed to begin in May – if the drought eases and it rains.
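The quoted figures can be combined into a quick sanity check. This is a rough sketch using the round numbers from the passage above (a 45 percent cut already achieved, with a further 25 percent cut still needed):

```python
# Rough arithmetic on Cape Town's water cuts, using the article's figures.
# These are approximate round numbers, not official city data.
pre_drought = 1.0                               # normalize pre-drought daily use to 1
after_first_cut = pre_drought * (1 - 0.45)      # 45% reduction already achieved
after_second_cut = after_first_cut * (1 - 0.25) # further 25% cut still needed

# Compounding the two cuts leaves the city at roughly 41% of its
# pre-drought consumption if it hits the second target.
print(f"{after_second_cut:.0%} of pre-drought consumption")
```

In other words, reaching the second target would mean the city is living on well under half the water it used before the drought.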
Cape Town’s situation should be particularly alarming for California and other parts of the American West that only get rain during winter seasons. Cape Town has a Mediterranean climate like California with long dry spells, plus a similar agricultural industry. Climate change is already contributing to major droughts on the West Coast, and our growing population could one day face Day Zero conditions as well.
What can be done? The obvious step is to encourage as much water conservation as possible, and use recycled wastewater as much as possible as well. Secondarily, we need to be smarter about our groundwater usage and ensure that we leave as much groundwater in our aquifers as possible (California’s 2014 groundwater legislation is for the first time spurring needed management of this resource here). And finally, we’ll need to explore options to boost supplies through desalination. But this costly and potentially polluting step should be a last resort, after conservation and recycling measures (my Berkeley Law colleague Mike Kiparsky is featured in this Wired article explaining the drawbacks of desalination).
These steps may help other jurisdictions avoid a Day Zero scenario — but for how long? As climate change takes us into unprecedented weather changes, even these actions may not be enough. But that’s no excuse for not trying or not planning.
Scientists have been stunned by a recent surge in climate warming, potentially exacerbated by warm waters during the recent El Niño event, per E&E News [paywalled]:
Global temperatures rose by a record-breaking amount between 2014 and 2016, new research finds.
Over the course of three years, mean surface temperatures jumped by nearly a quarter of a degree Celsius, or more than 0.4 degree Fahrenheit. That’s a whopping 25 percent increase over the total amount of warming the Earth has experienced in the last 150 years. The research is published in the journal Geophysical Research Letters.
“As a climate scientist, it was just remarkable to think that the atmosphere of the planet could warm that much that fast,” Jonathan Overpeck of the University of Arizona, one of the paper’s authors, said in a statement.
Basically, it appears that most of the warming on Earth over the past few decades has been absorbed by the oceans, which then released a lot of that heat during the recent El Niño event.
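The "25 percent" figure in the study checks out with simple arithmetic. Here is a minimal sketch using the article's approximate numbers (the 0.95 degree total is a rough placeholder consistent with the roughly 0.85 C of warming since 1880 cited elsewhere in the reports):

```python
# Back-of-the-envelope check on the "25 percent" claim, using
# approximate figures from the article (not the study's exact data).
jump_c = 0.24           # 2014-2016 surge, degrees Celsius ("nearly a quarter degree")
total_warming_c = 0.95  # rough total warming over the past ~150 years

# The three-year jump as a share of the entire 150-year warming trend.
fraction = jump_c / total_warming_c
print(f"{fraction:.0%}")  # roughly a quarter of the 150-year total
```

That a single three-year window accounts for roughly a quarter of a century and a half of warming is what makes the finding so striking.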
This kind of data isn’t exactly news to those who have seen the changes on the ground, particularly in our western mountains. Take for example the reclusive 67-year-old Billy Barr, who has spent the last 46 years in a remote cabin in the Rocky Mountain woods. In a Denver Post profile, Barr describes how he began taking notes every day on weather in 1974 out of boredom, recording the low and high temperatures, snow-water equivalent and snowpack depth.
He doesn’t necessarily analyze his data. But he’s seeing a trend: It’s getting warmer. The snow arrives later and leaves earlier.
Lately, he’s charting winters with about 11 fewer days with snow on the ground; roughly 5 percent of the winter without snow. In 44 years, he’d counted one December where the average low was above freezing — until December 2017, when the average low was 35 degrees.
More than 50 percent of the record daily highs he’s logged have come since 2010. In December and January this season, he already has counted 11 record daily-high temperatures. Last year he tallied 36 record-high temperatures, the most for one season. Back in the day, he would see about four, maybe five record highs each winter.
Meanwhile, Colorado just had its second-warmest year on record and is now at a 30-year low in snowpack. And in California, a study published in the hydrological science journal Water shows that in Tahoe and the northern Sierra Nevada mountains, the average elevation of the “snow line,” where snow turns to rain during a storm, has risen roughly 236 feet over the last 10 years.
All in all, the global data show more of these on-the-ground changes in our near future.
This has been a tough year for the environment at the national level in the U.S. We’ve seen rollbacks of key protections, gutting of enforcement, efforts to open up resource extraction on public lands, and a general abdication of climate leadership at home and globally.
On the bright side, states and cities have stepped into the void, providing some good news. And nonprofits have had wins at the courts and at various other levels of government. Businesses, meanwhile, have also made progress, particularly on various clean technologies like battery electric vehicles and renewables. Business leaders have also become strong advocates for climate action globally.
And internationally, countries like China are now aggressively tackling carbon emissions, both with carbon markets and other efforts to control pollution, as well as by promoting clean technologies.
With the holidays here, I’ll be off blogging until the new year, which will hopefully bring some better results overall than we’ve seen in 2017 on the environment and other critical challenges facing the globe.
Wishing everyone the best until then. And in the meantime, enjoy Joni Mitchell’s reflective take on the holidays from 1971:
Climate change and the recent five-year drought in California have now rendered 129 million trees dead in the Sierra Nevada mountains. As a result, the state’s forests are more susceptible to catastrophic wildfires like those we’ve seen this fall in the state’s coastal areas. California’s Sierra Nevada Conservancy has an informative webpage and videos dedicated to educating the public about the urgent need to manage the forests better.
The first video describes the immediate need for action, particularly given the forests’ role as a carbon sink, which could turn to carbon emitter without better management:
The next video shows the risk of continued fire suppression and need for more controlled burns:
The final video shows what controlled burns can do to restore the forest ecosystem and provide carbon benefits:
But as UC Berkeley forest expert Van Butsic described in a recent interview with Water Deeply, the solutions will be controversial, most likely requiring “mechanical thinning” in forests that some environmental groups oppose. Yet as the videos above make clear, continued inaction and fire suppression will only exacerbate the environmental catastrophe we are facing in these beautiful mountains.
When President Trump announced in June that the United States would be withdrawing from the Paris climate agreement, France eagerly stepped into the leadership void. French President Macron responded by offering millions in new grant money for climate science research.
The winners were just announced, and 13 of the 18 are American scientists. I spoke to KCBS radio in San Francisco yesterday about the program and what it means for the United States going forward. You can listen to the four-minute clip here:
Sean Illing in Vox.com conducted a fascinating interview with Steven Sloman, a professor of cognitive science at Brown University, about how we arrive at the conclusions we do. In short, the process (and outcomes) are not pretty, as Dr. Sloman relates:
I really do believe that our attitudes are shaped much more by our social groups than they are by facts on the ground. We are not great reasoners. Most people don’t like to think at all, or like to think as little as possible. And by most, I mean roughly 70 percent of the population. Even the rest seem to devote a lot of their resources to justifying beliefs that they want to hold, as opposed to forming credible beliefs based only on fact.
Think about if you were to utter a fact that contradicted the opinions of the majority of those in your social group. You pay a price for that. If I said I voted for Trump, most of my academic colleagues would think I’m crazy. They wouldn’t want to talk to me. That’s how social pressure influences our epistemological commitments, and it often does it in imperceptible ways.
He concludes that if the people around us are wrong about something, there’s a good chance we will be too. Proximity to truth compounds in the same way. And the phenomenon isn’t a partisan problem; it’s a human problem on all sides of political debates.
In some ways, it’s understandable how this dynamic arose in our species. There’s no way one brain can master all topics, so we have to depend on other people to do some thinking for us. This is a perfectly rational response to our condition. It also may explain why traditional societies often relied on a few religious leaders to make a lot of the key decisions for a society that would rather not have to think too hard about broader societal problems and instead focus on problem-solving in their own immediate lives. The problem arises when our beliefs support ideas or policies that are totally unjustified.
So are we doomed to a fate of group-think with the risk of unsupportable beliefs? Dr. Sloman doesn’t think so, noting that some professions train people not to fall into this trap:
People who are more reflective are less susceptible to the illusion. There are some simple questions you can use to measure reflectivity. They tend to have this form: How many animals of each kind did Moses load onto the ark? Most people say two, but more reflective people say zero. (It was Noah, not Moses who built the ark.)
The trick is to not only come to a conclusion, but to verify that conclusion. There are many communities that encourage verification (e.g., scientific, forensic, medical, judicial communities). You just need one person to say, “are you sure?” and for everyone else to care about the justification. There’s no reason that every community could not adopt these kinds of norms. The problem of course is that there’s a strong compulsion to make people feel good by telling them what they want to hear, and for everyone to agree. That’s largely what gives us a sense of identity. There’s a strong tension here.
He’s also pioneering some research on ways to reframe political-type conversations from a focus on what people value to one about actual consequences. As he notes, “when you talk about actual consequences, you’re forced into the weeds of what’s actually happening, which is a diversion from our normal focus on our feelings and what’s going on in our heads.”
This work could contribute to a better understanding about public perceptions around climate change. For example, the denial of basic climate science can certainly be attributed to group-think. But as Sloman posits, reframing the messaging from the science to the outcomes of climate mitigation (such as a cleaner world, less dependence on extractive industries for fuel) might open more in the middle to taking action. We could also focus on training the next generation to be more open-minded on evidence and arguments, as with the scientific, medical and judicial fields.
But just being aware of our mental processing of information and beliefs is a good start to addressing the problem of when those processes take us in the wrong direction.
California’s 2030 climate goals will be a big step forward for the state. We’re already making good progress achieving our 2020 goals (to return to 1990 levels of carbon emissions), with the state likely to hit that goal a bit early thanks to the global recession and the plummeting price of renewables. But the 2030 goals require an additional 5% reduction per year in emissions for the 2020s, to reduce our levels 40% below 1990 emissions. That’s a tall order.
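The "5% per year" figure follows from compound-rate arithmetic. A minimal sketch, assuming emissions start at 1990 levels in 2020 and must fall 40 percent below that by 2030:

```python
# Sketch of the compound-rate math behind the "additional 5% per year" claim.
# Assumes emissions return to 1990 levels by 2020 (the 2020 goal) and must
# reach 40% below 1990 levels by 2030.
start = 1.0    # 2020 emissions, normalized to 1990 levels
target = 0.6   # 2030 target: 40% below 1990 levels
years = 10

# The constant annual reduction rate r satisfies (1 - r)^years = target/start.
annual_cut = 1 - (target / start) ** (1 / years)
print(f"required annual reduction: {annual_cut:.1%}")  # about 5% per year
```

Sustaining a cut of that size every year for a decade, across a growing economy, is what makes the 2030 target such a tall order.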
Electric utilities will be a big part of the solution, but not just because of their efforts to decarbonize the electricity supply. They’re also needed to expand the kinds of things that can run on electricity instead of petroleum or natural gas.
Southern California Edison (SCE) used an analysis from the consulting firm E3 that found the cheapest of three pathways to meeting the state’s 2030 emissions goals entails electrifying 24 percent of light-duty vehicles and 15 percent of medium-duty vehicles, in addition to reaching an 80 percent carbon-free electricity target. It also would require 30 percent of residential and commercial water and space heaters to run on electricity rather than gas.
This pathway seems achievable at a reasonable cost, given the advances in battery technologies on the vehicle side. Still, we will need to keep the federal tax credit in place or find a viable substitute to keep demand for EVs strong in the short run.
On the furnace and water heating side, we’ll need some new, cheaper products to wean buildings off of natural gas and onto clean electricity. But the good news is that achieving the 80% carbon-free electricity goal by 2030 may not be so daunting, given that we may be on track for 60% renewables by 2030 anyway, plus all the large hydropower that doesn’t count under the renewables mandate.
As always with the future, there are plenty of variables and unknowns. But California’s progress to date on clean tech gives us a clear idea of what’s needed — and what the costs may be — to achieve the 2030 goals.
Back in 2009, when I first started working full time on climate change law and policy at Berkeley Law, I saw a presentation by a representative of a Napa winery at a California Assembly select committee hearing on climate change. Thomas A. Thornhill III, a partner at Parducci Wine Cellars/Paul Dolan Vineyards, showed the following slide:
As you can see from the chart, as the temperature warms (the red line), certain Napa grapes just won’t be able to survive anymore in the Valley, such as chardonnay and sauvignon blanc (although it’s good news for raisin production in the Valley, for what that’s worth).
I thought about that slide over Labor Day weekend this year, when a record-breaking heat wave with temperatures up to 117 degrees hit Napa. This new normal of extreme weather destroyed some of Napa and Sonoma’s premium cabernet grapes, as Bloomberg reported:
Vineyard consultant Steve Matthiasson, who also makes wines under his eponymous label, admitted, “The heat wave screwed us up.” While you need warmth to ripen cabernet, you don’t want too much, and this summer Napa had more than two dozen days with temperatures over 100 degrees. Before the grapes were completely ripe, an extreme heat wave on Labor Day weekend, which didn’t cool down at night, caused grape dehydration. As juice evaporated, some of the unripe grapes shriveled into raisins.
As a result, the wine grape crop is likely to be smaller than expected this year in California and beyond, down 5 to 35 percent for some individual blocks of vines.
While this was a high-end casualty of climate change, it’s a demonstration of what will happen to the broader multibillion-dollar agricultural industry across places like California as we veer into an increasingly hotter world.
And that’s nothing to toast.