Combined UCS Blogs

3 Reasons Why Federal Energy R&D is a Wise Investment

UCS Blog - The Equation

Former Texas Governor Rick Perry will head into confirmation hearings in the next few days to become the next Secretary of the Department of Energy (DOE). A colleague of mine penned a great piece about what we might expect from Perry as head of the DOE. One key thing to highlight is that the DOE is responsible for investments in research and development (R&D), particularly related to energy. In an era when Congress is looking to trim federal spending, lawmakers should keep in mind that investments in energy R&D are a wise and proper use of limited federal government resources. Here’s why.

What exactly is R&D, and how much does it cost?

First, what exactly do we mean by R&D? Research and development generally refers to basic and applied research that leads to the creation of new products or the improvement of existing products—companies, for example, typically invest in R&D to maintain their leadership and market share.

So why should the federal government care about R&D? In short, there are many examples of innovation and discoveries that are simply too big and too risky for any single private enterprise to undertake. In these cases, the federal government recognizes the importance to society of research and development and advances technologies to the point where businesses and entrepreneurs can take over.

The benefits to society are broad, ranging from medical technology to clean energy. Federal investments in R&D have traditionally enjoyed bipartisan support, and that should continue.

And the investments are small compared to the overall federal budget. According to the American Association for the Advancement of Science, which has been tracking federal investments in R&D for decades, the federal government spent about $64 billion on non-defense R&D in FY16, amounting to about 1.6 percent of the total federal budget. The DOE received about $14.4 billion in total R&D funding in FY16, a mere 0.36 percent of the total federal budget. And that includes a lot of important basic research that is not what we typically think of as “energy.”
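
To put those percentages in perspective, here is a back-of-the-envelope check (a sketch only; the roughly $4 trillion figure for total FY16 federal outlays is my assumption, not an AAAS number):

    # Rough check of the R&D budget shares cited above (Python).
    total_outlays_bn = 4_000   # total FY16 federal outlays, $ billions (assumption)
    nondefense_rd_bn = 64      # non-defense R&D, $ billions (AAAS figure)
    doe_rd_bn = 14.4           # DOE total R&D, $ billions

    print(f"Non-defense R&D share: {nondefense_rd_bn / total_outlays_bn:.1%}")  # ~1.6%
    print(f"DOE R&D share: {doe_rd_bn / total_outlays_bn:.2%}")                 # ~0.36%

The shares quoted in the post fall right out of the arithmetic, which is the point: these are rounding errors in the context of total federal spending.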

 

Federal agency spending on R&D, including both defense and non-defense programs.

H/T to AAAS and their cool interactive data page, where I pulled the chart above, and where you can geek out over different ways of looking at this information.

So, here are the top three reasons why Congress should maintain support for federal energy R&D programs, even as they consider tightening the federal purse strings.

1 – Federal investments in energy R&D strengthen the economy

Federal R&D stimulates the economy and creates jobs.

  • For starters, some 110,000 people are employed by our national labs, which are managed by the DOE;
  • Universities currently receive some 60 percent of their research funding from the federal government, helping to train the next generation of scientists and engineers in STEM fields;
  • Research conducted at the DOE and the national labs leads to ideas and technology that entrepreneurs can pick up and run with. There is so much demand for new technologies that Oak Ridge National Laboratory last year opened an office in Chattanooga in order to “link local companies to the national laboratory’s resources and expertise.”
  • Created by legislation President Bush signed in 2007, the DOE’s Advanced Research Projects Agency-Energy (ARPA-E) has been successful in overcoming the long-term, high-risk barriers to developing innovative energy technologies. Since 2009, ARPA-E has funded over 400 energy technology projects. As of October 2016, ARPA-E teams had formed 30 new companies, and 45 project teams had attracted more than $1.25 billion in private-sector follow-on funding.
2 – Federal investments in energy R&D are critical to advancing new life-changing technologies

I can’t possibly list in one short blog post how much our collective lives have been improved and changed because of federal investments in energy R&D. Just to name a few:

  • At the DOE, decades of investments in R&D on hydraulic fracturing and horizontal drilling techniques have opened up unconventional oil and gas resources, leading to a dramatic decline in natural gas prices over the last ten years. One can argue about the positives and negatives of this technology (for example, we are very worried about an overreliance on natural gas), but there’s no question that these investments have fundamentally changed our electricity system.
  • R&D for carbon capture and sequestration is critical to making this technology cost effective, which is needed if coal and natural gas are to play a significant part in our energy system moving forward in a carbon-constrained world.
  • In the category of “things most of us never think about,” exhibit A is flipping on the lights. Here too, the DOE’s work modernizing the electricity grid is critical—and Rick Perry understands a thing or two about energy infrastructure.
3 – Federal investments in energy R&D demonstrate and maintain American leadership

Finally, federal investments in R&D ensure that America maintains a competitive advantage globally.

  • Back in 2015, the U.S. signed on to Mission Innovation, a global initiative to accelerate public and private clean energy innovation. Twenty countries joined the initiative, each seeking to double its R&D investments in clean energy over five years. Even though this commitment was made during the Obama administration, it should receive bipartisan support because it will help maintain American leadership in the clean energy space.
  • Drastic reductions in the cost of wind energy are, in part, a result of the $2.4 billion invested by the DOE in wind R&D between 1976 and 2014, which enabled key innovations such as taller turbines, longer blades, and improved electronics.
  • The DOE estimates that its $4.1 billion investment in PV technology R&D from 1975 through 2008 accelerated cost reduction progress by an estimated 12 years, while providing a net economic benefit of $16.5 billion.
  • Public-private R&D investments have also been critical in bringing down costs for LED lighting by 90 percent since 2008.

For those last three bullets, check out this DOE report from 2015.

Conclusion

Governor Perry should recognize the importance of federal R&D investments in energy; from his time in Texas, he certainly understands the value of these investments in terms of local economic development. I’ve highlighted only a small sample of ways that federal investments in energy R&D have benefited society—there are many, many more. And those benefits have come at a relatively small cost to taxpayers.

And the benefits of all federal R&D investments extend beyond the energy sector. America won the space race back in the 1960s because the federal government made it a priority, and ponied up the cash to make it happen. The benefits to society have been far-reaching, not the least of which was inspiring an entire generation of men and women to pursue careers in science and engineering. Congress should remember that federal investments in R&D represent some of the biggest bang-for-the-federal-buck of any use of taxpayer money.

The Importance of Traditional Ecological Knowledge (TEK) When Examining Climate Change

UCS Blog - The Equation

It all started with a simple conversation over lunch. The fuse had been lit, the spark began, and the first step had occurred in my journey, unbeknownst to me at the time. Later that day, I realized, for the first time in my life, I had experiences that were unique. And, I realized I held knowledge. Knowledge that was different from others; knowledge that went beyond the scientific or academic type, and that ran richer, deeper, more extensive. Sitting over sandwiches, sitting with culture, sitting with knowledge.

Accumulating knowledge through experience

Truth be told, it had begun much further back, as far back as I can remember, but blissfully unaware. Lunch with a friend brought it all barreling to the forefront of my destiny, and my ancestors’ wishes. I was talking with a friend about life when he mentioned skeletal remains that had been unearthed underneath a bridge, and how the tribe had been explaining to the non-Native state and local agencies involved that the site was one of the traditional places our Native bands migrated to and from. Traditional Ecological Knowledge (TEK) was far from being on the radar for non-Native agencies at that point, and even though the Native American Graves Protection and Repatriation Act existed, Native Traditional Knowledges (TK) were frequently discounted and dismissed.

I was infuriated at how remains and burial areas could be so flippantly desecrated, and I asked about taking the agencies involved to task to acknowledge our Indigenous Knowledge. I was informed that, at that time, our tribe did not have a document that would convincingly show our position, and all knowledge that had been documented had been brushed off as fable-like stories.

I was initially in shock, because I’d grown up continuously learning information outside of academic schooling confines. I’d never realized how much vital information I had amassed, much of it while simply playing, until it came flying back upon reflection. It reached its peak that day when I returned home to have dinner with my Mom and Dad. I listened as Dad discussed the way Native members of a committee were being brushed off by a state agency as they sat in a meeting that they’d been specifically invited to as tribal hunters and gatherers. The tribal members shared TEK about why the deer in the Western Oregon region were losing hair. Years prior to western scientific information finding an exotic lice species responsible for what is now termed Hair Loss Syndrome, Native tribal members identified the very patterns that had been noted and passed them along through a combination of TEK and TK data. My father, along with other Elders, detailed how the warming trends had allowed a surge of “bugs” to “chew” on the deer, and the massive amount of hair loss that they were all witnessing. They’d outlined the areas in Western Oregon that had been the worst hit, held knowledge collaborations with other hunters and gatherers, and shared ongoing discussions about tribal lands and the other species that were being impacted, directly and indirectly. I sat listening, as I had done so many times before, but literally stopped eating; I suddenly realized I’d reached an awareness level I’d not been at before. There, with those two seemingly innocuous conversations, it began. My journey as a scientist, but more importantly as an educator and facilitator regarding TEK and TK.

Integrating different types of knowledge

Just as with any system, applying and infusing TEK into studies and research is not a guarantee of clarified, clear-cut results in a specific topic or area, and following the Indigenous community’s guidelines is imperative when working with any Indigenous tribe or community. The exploitation and theft of Native communities’ information and resources has left an indelible mark, so sensitive material and intellectual property must be approached with careful consideration and with allowance for Indigenous oversight. This history of theft and destruction is the reason the publication Guidelines for Considering Traditional Knowledges (TKs) in Climate Change Initiatives was developed for protection.

Wild horses are an integral aspect of the TEK as an indicator species for the Duckwater-Shoshone tribe (NV)

Traditional Knowledges are foundational systems within which most Indigenous populations operate. Traditional Ecological Knowledge evolves from generations of experience; a base that is incomparable in terms of the depth, breadth, and holistic perspectives that it provides for a given ecosystem. While there are many forms of knowledge, such as Local Ecological Knowledge (LEK), Farmers’ Ecological Knowledge (FEK), and Fishermen’s Knowledge (FK), TEK is often highly developed in relation to traditional Indigenous areas, and can span hundreds of years back through multiple generations. In most of the nine federally recognized tribes in Oregon, families relied on detailed information being correct, as they lived subsistence lifestyles. Survival depended on informational accuracy and concerted sustainability efforts. Even small environmental indicators, such as squirrel behavior in the fall or caterpillar markings, can illustrate a TEK data set that has been established and relied on for other traditional activities, such as gathering, hunting, or fishing. Much trading occurred between all tribal systems, particularly in the Columbia Gorge at Celilo Falls, and different areas’ TEK was often shared for planning purposes. Western Oregon and Eastern Oregon have very different climate and weather patterns, but reliance on TEK information systems was vital for all Natives. Even today, more traditional aspects of cultural information rely heavily on the reciprocity and sustainability aspects of TEK for maintenance of cultural traditions and traditional value systems.

Traditional Ecological Knowledge is often discounted as “irrelevant” in ideologies that are based in traditional western scientific paradigms. Dr. Kyle Powys Whyte expertly articulates how western scientific assumptions discount TEK. Colonist thought processes are still prevalent, as evidenced in science curricula. Very little Indigenous information is available for students at any level, and the lack of TEK, along with the accompanying biases, is then carried into professional realms. Working to shift this paradigm can be difficult and daunting. As described in Paul Nadasdy’s book Hunters and Bureaucrats: Power, Knowledge, and Aboriginal-State Relations in the Southwest Yukon, when Indigenous people are invited to conferences and workshops, they are expected to use the vocabulary and manner of western science. The invitations come with expectations of addressing one issue at a time. Each issue, or resource, is expected to be divorced from all others, which makes accuracy for TEK experts extremely difficult. TEK is holistic; its expertise regarding a given ecosystem relies on the interdependent behaviors of multiple species and is uniquely separate from other, even nearby, ecosystems. TEK observations, sustainability practices, and active participation in TEK resource use and management rely on information databases that can extend back hundreds of years. These long-held western scientific foundations have often been exclusionary, and TEK still remains the “underdog,” if you will, in western scientific contexts.

Writing, presenting, and collaborating in traditional scientific areas of research and development come with challenges for TEK information systems. Because TEK research is relatively new, and because of its interdisciplinary nature, it has been slow to be accepted and integrated into western science methodology, and funding is not as accessible as it is in other areas. I’m continuously looking for grant funding to continue my research. Increased TEK documentation that is in accordance with the aforementioned Climate and Traditional Knowledges Workgroup (CTKW) guidelines will help build the record of how climate change and other human impacts are affecting the environment.

I strive to contribute information, along with other TEK scientists and Indigenous communities, to illustrate the relevant contributions that TEK makes in scientific communities, and the positive impacts it has in tribal nations. I strive to help change the preconceived notion that TEK is a misnomer, irrelevant to western scientific systems. TEK is somewhat like an outlier data point that, when examined and applied, can illuminate the entire context of a topic. TEK can help to clarify, enhance, and even augment knowledge that has long been believed to have been studied exhaustively. When properly applied, TEK can bring a three-dimensional approach to present-day climate change issues, rather than the usual two-dimensional analyses of the printed page or computer screen that have been traditionally relied upon. TEK offers an integrated system of environmental and timing knowledge that adds a dimension where none has been fully examined previously. It is the Indigenous science that puts faces and names in congruence with places and events, and assists in the long-term assessment of what exactly is going on by looking at long-held trends from the past.

An added dimension in studying climate change

Many scholars are, and have been, examining climate change issues from a very pragmatic, logical, regimented approach that is rooted in western scientific dogmas: everything from temperatures to land-base changes, from impacted agricultural crops to diseases altered by climate change events. This logic and pragmatism provides much-needed information, but difficulties arise when data are deficient in areas of human interaction with the environment and impacts on human cultural issues. Models that are run for specific tasks cannot offer or evaluate qualitative measures of human interaction issues such as cultural impact adaptations, traditional food set shifting, or phenology sequencing in relation to traditional cultural activities. Multiple data sets and models are run daily on issues at hand happening worldwide, lacking the insight that TEK can offer. TEK adds a holistic approach to climate change that no other data set can provide. Through the depth, breadth, and length of documented TEK and TK, there is a wealth of information that models and western science cannot reach through western science approaches alone. Human interaction with and observation of the environment has been commonly relied upon for multiple generations; this type of interaction is noted in petroglyphs and in communities. FEK, FK, and LEK provide a much shorter timeframe and often a more limited dataset than Indigenous TEK, however. There is a realm of information offered that is complementary, or even new in some instances, when TEK is applied and adjusted to examine environmental events that are occurring. Dovetailing TEK and western scientific methodology can provide datasets that address climate change impacts in an effective, holistic manner, and more comprehensively illustrate human interfacing systems.

Traditional canoe of the Quinault Indian Tribe (WA)

My 2013-2014 research work, funded through the Northwest Climate Science Center and Oregon Climate Change Research Institute, involved in-depth research with tribes in the Pacific Northwest. My research examined climate change impacts on Northwest Native traditional culture and practices. My 2015 research work, through the generous support of the Great Basin Landscape Conservation Cooperative, extended that research to include tribes in the Great Basin region. This research involving tribes’ TEK and traditional cultural adaptation responses to climate change, both in the Pacific Northwest and the Great Basin, brought forth new results concerning a time and phenology issue perceived in Native American culture that extends beyond seasonality. This newfound timing issue is based on surrounding environmental cues rather than the linear time sequencing that is common with clocks and calendars devised from abstract time creation. Also encompassed are various levels of anticipated changes, with resulting adaptation response measures by Native communities. Adaptation responses included practices such as traditional food substitutions; adjusting the timing sequences of hunting, gathering, fishing, or ceremonial events; and noting the changes in environmental and ecological cycles. These responses brought forth results that models and data sets could not have produced alone.

Much like analyzing tree rings for fire, disease, and flood events, TEK can offer a broader view of ecological and scientific topics researched and examined that are localized in nature but broad in perspective. Trends that have been documented through generations are more likely to offer detailed long-term data patterns, provide tools for a better analysis, and add more comprehensive insight than stand-alone western scientific methodologies. Items such as basket materials, regalia changes and fluctuations (materials impacted by floods, fires, or other catastrophic events can alter traditional regalia and its use), or even cooking and eating utensils can provide data that can be added into assessing climate-change-related topics such as weather fluctuations, tree material adaptations, food and crop impacts, and any issue relating to environmental composition and human interaction with environmental resources. Even songs or stories that were once assumed to be merely entertainment can prove to be valuable tools in the quest to understand our changing environment and climate change events.

TEK, when applied, has been able to surface information that can further clarify climate change research and analyses, adding to the base knowledge about cycles and anticipated results, and explaining certain impacts with an added depth and breadth that has been lacking in western scientific methods sans TEK. In this time of climate change uncertainty, TEK offers a tool that can be applied for insightful results, bridging the interdisciplinary gap that has existed within the traditional rigor of conventional scientific research. Unconventional methods are now at the forefront of addressing climate change research, information, analyses, and policy. This is one of the many ways that Traditional Knowledges can provide understanding in a rapidly changing world.

Dr. Samantha Chisholm Hatfield is an enrolled member of the Confederated Tribes of Siletz Indians, from the Tututni Band, and is also Cherokee. She earned a doctorate in Environmental Sciences from Oregon State University, focusing on the Traditional Ecological Knowledge (TEK) of Siletz Tribal Members. Dr. Chisholm Hatfield’s specializations include Indigenous TEK, tribal adaptations due to climate change, and Native culture issues. She’s worked with the Oregon Climate Change Research Institute, and successfully completed a post-doctoral research position with the Northwest Climate Science Center. She’s spoken at the national level, including the First Stewards Symposium, the National Congress of American Indians, the Northwest Climate Conference, and webinars. She’s helped coordinate tribal participation for the Northwest Climate Science Center and Oregon State’s Climate Boot Camp workshops. Her dissertation has been heralded nationally by scholars as a template for TEK research, and remains a staple conversation item for academics and at workshops. She is a Native American Longhouse Advisory Board member at Oregon State University, was selected as an H.J. Andrews Forest Visiting Scholar, is actively learning Tolowa and Korean, and continues her traditional cultural practices. In her spare time she dances traditionally at pow wows, spends time with family, and is the owner of a non-profit organization that teaches the game of lacrosse to disadvantaged youth.

 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

New Report Shows Electric Vehicle Technology Advancing Faster Than Anticipated

UCS Blog - The Equation

California’s influential Air Resources Board has just released a comprehensive assessment of the status of the state’s “Advanced Clean Car” regulations. While the report is not only about electric vehicles, the state’s Zero Emission Vehicle program is evaluated in detail. Overall, the findings are very positive on how California’s leadership on clean vehicle policy has spurred much of the auto industry to make new technologies available for consumers.

Electric vehicle technologies quickly moving forward

The 662-page report will take some time to digest, but the top-line findings are clear: electric vehicle technology is moving faster than was anticipated just 5 years ago. California leads the nation with over 250,000 EVs sold to date, and the number of plug-in models available is now approaching 30. The report cites many factors that are accelerating the EV market, including dramatic improvements in battery performance and costs and the rapidly expanding charging infrastructure in California and the other states that have adopted the Zero Emission Vehicle regulation.

Stage is set to go further with clean car policies

The current pace of deployment also puts us on the road to increasing levels of EV adoption, displacing a growing amount of oil use and harmful emissions. One of the key recommendations in the report is that the Air Resources Board should move to adopt new ZEV standards to extend the current provisions, which are set to plateau in 2025. The regulation will need to be strengthened to ensure that the state is on a trajectory to meet both 2030 and later climate targets and the air quality standards in the state’s Central Valley and Los Angeles regions.

Strong signals needed to keep momentum

There are many positive signs for EVs. December 2016 set an all-time high for EV sales in the US, almost double the rate from a year ago. The introduction of the Chevy Bolt, the first long-range battery-electric car at a mass-market price, has generated significant interest, as did Tesla’s announcement of its lower-cost Model 3.

However, not all automakers are shifting to clean technologies with the same effort. Some have zoomed ahead, like General Motors with plug-in cars making up over 5% of their new cars sold in California. On the other hand, several major car companies, like Honda and Fiat Chrysler, have done the bare minimum to comply with California regulations and almost nothing outside of the state. Therefore, the current Zero Emission Vehicle regulation needs to not only continue, but be strengthened. The report lays out the technical evidence for these conclusions.

NOAA and NASA Confirm: 2016 Is Warmest Year on Record

UCS Blog - The Equation

NASA and NOAA held a press conference today, where they confirmed what had been anticipated for a few months now: 2016 broke all records and is officially the warmest year. NOAA measurements put 2016 at 1.69°F (0.94°C) above the 20th century average, making it the third year in a row to break the record. The main point stressed over and over during the press conference is that, even if the numbers themselves are impressive, this is a multi-decadal trend that started in the 1970s, not an isolated, random fact. The significance of this cannot be overstated: in a time when science is basically under attack by a new, incoming administration, facts and data do matter. Or do they?

Image: NOAA

Data and facts have been pointing to a simple conclusion: the earth is warming

Despite the vast number of measurements and actual data from global observation points, many still deny the fact of human-caused climate change. In fact (and unfortunately) many of the Cabinet nominees in the new administration insist that yeah, there may be warming, but we don’t know the actual role of human emissions, and/or we cannot tell what is going to happen. Those are absurd statements, as it has been shown (repeatedly) that (1) carbon dioxide traps heat and exerts major influence on Earth’s temperature when its concentration increases or decreases; (2) warming of the climate system is unequivocal, and since the 1950s, many of the observed changes are unprecedented over decades to millennia; and (3) human influence has been the dominant cause of the observed warming since the mid-20th century.

“I am not a scientist” is code for “I don’t need to know”

Or rather, “I don’t care.” Not everyone is a scientist, obviously. Not everyone needs to know scientific minutiae, formulas, and models. But pretty much everyone understands that science (and scientific data) has brought us knowledge, progress, and many benefits. Denying scientific facts and data to make a misleading point meant to cater to one’s interests will NOT change the facts or the data – and yet, we are seeing it every day at the nomination hearings, especially when it relates to climate (not to mention for the past decade or longer). To say that “we don’t know what will happen” is an actual lie. We DO know what will happen: temperatures will keep going up. What we don’t know is the pace and magnitude of global warming, because that depends on the actual amount of emissions dumped into the atmosphere – a quantity that in turn depends on our energy choices: the implementation of the Paris Agreement, the fulfillment of each nation’s pledges, the successful transition to renewable energy, and the timeline of all these actions.

It is essential to uphold scientific integrity in the new administration

Scientific integrity may be under attack in the new administration. Many Cabinet nominees have strong records of disparaging science or undermining science-based policies. When science turns into a political tool, everyone loses. There is a lot at stake, and therefore it is more important than ever to stand up for science. Scientists are organizing and speaking up, and mobilizing to protect scientific data. Those are steps that shouldn’t need to be taken, but right now are essential so that science doesn’t get swept up under the new administration carpet.

Can Trump Revive the Coal Industry? Lessons from the Petra Nova and Kemper Projects

UCS Blog - The Equation

During the campaign, President-elect Trump promised to revive the coal industry.  As others have reported, gutting EPA regulations designed to protect public health and the climate will have little impact in reviving the industry since the recent decline in burning coal to generate electricity is primarily due to low natural gas prices and cost reductions for wind and solar power. Ironically, rolling back regulations on the oil and natural gas industry would likely make it even more difficult for coal to compete economically.

Trump has also supported the development of so-called “clean coal” as a way to revive the industry. A key component of making coal cleaner is the process of capturing the carbon dioxide (CO2) emissions from burning coal in power plants or industrial facilities, and transporting and storing the CO2 underground so that it cannot add to the atmospheric build-up of carbon emissions that is driving climate change (a process otherwise known as carbon capture and storage, or CCS). While UCS supports CCS as a potential climate solution, we believe the term “clean coal” is an oxymoron because of the other environmental and public health impacts of using coal across its fuel cycle.

CCS is not a new idea. The Obama Administration invested about $4.8 billion in CCS, and the Bush Administration spent millions on R&D, tax credits, and loan guarantees for CCS. Since the primary reason to do CCS is to reduce carbon emissions, one big question is whether Trump will support funding for CCS if he truly believes climate change is a hoax.

With the Petra Nova project in Texas beginning commercial operation on January 10, and the Kemper project in Mississippi currently scheduled to go online by January 31, coal with CCS technology has reached an important milestone.

Both projects received federal incentives from the U.S. Department of Energy (DOE) that were vitally important to their development. Below I discuss some key lessons learned from these projects, the implications for future projects if Rick Perry becomes DOE Secretary, and the longer-term outlook for CCS as a potential climate solution.

Petra Nova post-combustion CO2 capture project at an existing pulverized coal plant near Houston, TX. Source: NRG.

The tale of two CCS projects

The Petra Nova and Kemper projects have a few similarities and several differences that offer valuable lessons and insights for future CCS projects. Both projects will inject CO2 into depleted oil fields for enhanced oil recovery (EOR), sequestering the CO2 while increasing pressure at the aging field to produce more oil and help offset costs. However, the projects use different generation and capture technologies, burn different types of coal, and have considerably different costs (see Table 1).

Petra Nova is a 240 Megawatt (MW) post-combustion capture facility installed at an existing pulverized coal plant near Houston that’s designed to capture 90 percent of the total CO2 emissions. A joint venture between NRG and JX Nippon Oil & Gas Exploration, the plant uses low-sulfur, sub-bituminous coal from the Powder River Basin in Wyoming.

Petra Nova was completed on time and on budget in 2 years for $1 billion. It will also receive up to $190 million in incentives from DOE’s Clean Coal Power Initiative. In addition, NRG and JX Nippon have equity stakes in the oil field where they will be doing EOR, allowing them to capture more value.

Kemper is a new 582 MW integrated gasification combined cycle (IGCC) power plant in Kemper County, Mississippi, that’s designed to capture 65 percent of the project’s CO2 emissions prior to combustion, when the CO2 is still in a relatively concentrated and pressurized form. Owned by Mississippi Power (a subsidiary of Southern Company), the plant will use low-grade lignite coal from a mine next to the plant.

Kemper has been under construction for 6.5 years, is nearly 3 years behind schedule, and the capital cost has more than tripled from $2.2 billion to over $7 billion. It will receive up to $270 million from DOE’s Clean Coal Power Initiative.

One reason for Kemper’s high cost and delay is that they added CCS to a brand new coal IGCC power plant, whereas Petra Nova added CCS to an existing pulverized coal plant. Another reason cited by the New York Times is Southern Company’s mismanagement of Kemper, including allegations of drastically understating the project’s cost and timetable and intentionally hiding problems.

Southern Company’s December filing with the Securities and Exchange Commission (SEC) shows cost increases of $25-35 million per month from Kemper’s most recent delays. They also lost at least $250 million in tax benefits by not placing the plant into service by December 31, 2016.  This is on top of $133 million in tax credits they had to repay the IRS for missing the original May 2014 in-service deadline. In November, they also increased the estimated first year non-fuel operations and maintenance expenses by $68 million.

Customers could pay up to $4.2 billion for Kemper under a cost cap set by the Mississippi Public Service Commission. Southern Company shareholders will absorb nearly $2.7 billion of the cost overruns.

Kemper is also having a negative impact on Fitch ratings for Southern Company. In contrast, Fitch ratings for Cooperative Energy (formerly the South Mississippi Electric Power Association) were recently upgraded from A- to A after bailing out of their 15 percent ownership stake in the project and moving from coal to natural gas and renewables.

Comparison of Kemper and Petra Nova Coal CCS Projects

  • Type of power plant. Kemper: new integrated gasification combined cycle plant. Petra Nova: existing pulverized coal plant retrofit.
  • Generation capacity. Kemper: 582 MW. Petra Nova: 240 MW.
  • Ownership. Kemper: Mississippi Power and KBR. Petra Nova: NRG and JX Nippon Oil & Gas Exploration.
  • Power plant location. Kemper: Kemper County, Mississippi. Petra Nova: near Houston, TX.
  • Type of coal. Kemper: Mississippi lignite. Petra Nova: WY Powder River Basin sub-bituminous, low sulfur.
  • CO2 capture rate. Kemper: 65%. Petra Nova: 90%.
  • CO2 capture volume. Kemper: 3.5 million tons/yr. Petra Nova: 1.6 million tons/yr.
  • CO2 capture type. Kemper: pre-combustion. Petra Nova: post-combustion.
  • CO2 capture method. Kemper: absorption, physical solvent-based process (Selexol). Petra Nova: absorption, chemical solvent-based process (amine).
  • CO2 storage. Kemper: enhanced oil recovery. Petra Nova: enhanced oil recovery.
  • CO2 transportation. Kemper: 61-mile pipeline to storage site. Petra Nova: 82-mile pipeline to storage site.
  • Construction period. Kemper: 6.5 years. Petra Nova: 2 years.
  • Schedule delay. Kemper: 3 years. Petra Nova: no delay.
  • Original capital cost. Kemper: $2.2 billion ($3,780/kW). Petra Nova: $1 billion ($4,167/kW).
  • Final capital cost. Kemper: $7 billion ($12,027/kW). Petra Nova: $1 billion ($4,167/kW).
  • DOE incentives. Kemper: $270 million. Petra Nova: $190 million.

Sources:  MIT, Global CCS Institute, NRG, Kemper project site.

How many CCS demonstration projects are needed?

In 2008, a UCS report titled Coal Power in a Warming World called for building 5-10 full-scale integrated CCS demonstration projects at coal-fired plants in the U.S. to test different generation and capture technologies and to store CO2 at different sequestration sites. Our recommendations were consistent with recommendations from MIT’s 2007 Future of Coal report and a 2007 report by the Pew Center on Global Climate Change. In 2009, former DOE Secretary Steven Chu set a goal for the U.S. to have 10 CCS demonstration projects in service by 2016. In 2010, the White House Interagency Carbon Capture and Storage Task Force also recommended bringing 5 to 10 commercial demonstration projects online by 2016.

Unfortunately, the U.S. is lagging behind these targets.  While there are 8 large scale CCS projects currently operating in the U.S., Petra Nova is the only large scale CCS project operating at a power plant, according to the Global CCS Institute. Besides Kemper, the only other power plant project listed under development is the Texas Clean Energy Project, which recently lost its DOE funding because of escalating costs and missed deadlines (see more below). The high profile FutureGen project in Illinois was also cancelled after DOE discontinued funding in 2015 because of cost increases and construction delays.

The 110 MW Boundary Dam project in Canada is the only other large-scale coal power plant with CCS currently operating outside of the U.S. After encountering several problems during its first year of operation that reduced the capture rate from 90 percent to 40 percent, the project performed much better in 2016. Petra Nova and Kemper will likely have to go through a similar teething process to work out any bugs in the technology before they can be declared successes.

Eight more power plant CCS projects are in different stages of development in China, South Korea, the United Kingdom, and the Netherlands.

Rick Perry’s support for coal

Under the Obama Administration, the DOE’s Office of Fossil Energy and the National Energy Technology Laboratory (NETL) have administered a robust carbon capture R&D program. The primary goal of this program is to lower the cost and energy penalty of second-generation CCS technologies, resulting in a capture cost of less than $40 per tonne of CO2 in the 2020-2025 timeframe. Given Perry’s record of supporting coal as Governor of Texas, there’s a good chance he would support continued R&D funding for coal with CCS projects if he becomes DOE Secretary. Here are a few examples:

  • In 2005, Perry issued a controversial executive order to fast-track the permitting process for 11 coal plants (without CCS) proposed by TXU, now called Energy Futures Holdings. UCS and other groups strongly opposed this coal build-out, which would have been disastrous for the climate. Only 3 of the plants were ultimately built.
  • In 2002, he supported setting-up a clean coal technology council in Texas.
  • In 2009, he signed a bill with tax incentives for clean coal to support projects like the Texas Clean Energy Project, a 400 MW coal gasification with CCS project in West Texas proposed by Summit Power Group. After missing several key deadlines and with the cost nearly doubling to $4 billion, the DOE discontinued funding in May 2016 (after spending $167 million) and asked Congress to reprogram $240 million of the incentives to other R&D efforts. Summit Power said this move would basically kill the project.
The high cost of coal with CCS

CCS advocates often dismiss the high costs of recent projects, arguing that this is expected for first-of-a-kind projects. They claim costs should come down over time through learning, pointing to other technologies like wind and solar as examples.

While it is reasonable to expect that CCS costs will come down, the question is how much and over what time period? Like nuclear power plants, CCS projects tend to be very large, long-lived construction projects that use a lot of concrete and steel, and equipment that is unlikely to be mass-produced in the way more modular technologies like wind turbines and solar panels are manufactured and installed over a much shorter period of time.

Several recent studies project the cost of coal with CCS to be much higher than many other low and zero carbon technologies. For example, the Energy Information Administration’s (EIA) projections from Annual Energy Outlook 2016 show costs for coal with CCS plants in 2022 that are 2-3 times higher than the cost of new onshore wind, utility scale solar, geothermal, and hydropower projects, not including tax incentives (see Table 2 on p. 8). The costs for biopower, advanced nuclear plants, and natural gas combined cycle (NGCC) plants with CCS are also somewhat lower. While EIA projects the costs for coal with CCS plants to decline ~10 percent by 2040, they project the costs for other low carbon technologies to fall by similar or even greater amounts.

Projected levelized costs of coal with CCS compared with other new generation technologies. Source: EIA, AEO 2016.

EIA’s cost projections are consistent with other sources including NREL’s 2016 Annual Technology Baseline (ATB) report and Lazard’s most recent levelized cost of energy analysis.  They all show that adding CCS to natural gas power plants could be much more economic than coal.
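
For readers who want to see how such comparisons are constructed, the sketch below shows a standard levelized cost of energy (LCOE) calculation: annualized capital cost plus operating and fuel costs, divided by annual generation. The input numbers are hypothetical placeholders chosen only to illustrate the mechanics, not values from EIA, NREL, or Lazard:

    # Illustrative LCOE sketch (Python). All inputs are hypothetical.
    def lcoe(capex_per_kw, fixed_om_per_kw_yr, var_om_per_mwh, fuel_per_mwh,
             capacity_factor, discount_rate=0.07, life_yr=30):
        # Capital recovery factor: spreads overnight capital cost into an
        # equivalent annual payment over the plant's lifetime.
        crf = (discount_rate * (1 + discount_rate) ** life_yr
               / ((1 + discount_rate) ** life_yr - 1))
        mwh_per_kw_yr = 8760 * capacity_factor / 1000   # annual MWh per kW
        annual_fixed = capex_per_kw * crf + fixed_om_per_kw_yr
        return annual_fixed / mwh_per_kw_yr + var_om_per_mwh + fuel_per_mwh

    # Hypothetical inputs, for illustration only ($/kW, $/kW-yr, $/MWh):
    print(f"Coal with CCS: ${lcoe(5500, 80, 9, 30, 0.85):.0f}/MWh")
    print(f"Onshore wind:  ${lcoe(1600, 45, 0, 0, 0.40):.0f}/MWh")

With placeholder inputs like these, the capital-heavy CCS plant comes out at roughly twice the cost per MWh of wind, which is the kind of gap the studies above describe: high up-front capital cost dominates the levelized cost even before fuel and capture penalties are counted.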

Other studies show CCS applications at industrial facilities could also be less expensive. Industrial applications of CCS could also be more apt—for example, in industries such as iron and steel and cement production—where alternative low-carbon, affordable technologies don’t exist. In contrast, the power sector has many technologies available today that can generate electricity without carbon emissions.

Role of CCS in addressing climate change

Many experts believe that reaching net zero carbon emissions by mid-century, in line with global climate goals, will likely require some form of CCS, along with nuclear power and a massive ramp-up of renewable energy and energy efficiency. Given the high cost of coal with CCS compared to other alternatives, it’s not surprising that recent studies show it playing a relatively modest role in addressing climate change. However, some studies analyzing the impacts of reducing power sector CO2 emissions 80-90 percent by 2050 show that natural gas or even biopower with CCS could make a more meaningful contribution after 2040.  For example:

  • A 2016 UCS study showed natural gas with CCS could provide 9-28 percent of U.S. electricity by 2050 under a range of deep decarbonization scenarios for the power sector, but no coal with CCS (see Figure 5 below). Natural gas with CCS provided 16 percent of U.S. electricity by 2050 under a Mid-Cost Case and 28 percent under an Optimistic CCS Cost Case.
  • The November 2016 U.S. Mid-Century Strategy for Deep Decarbonization study released by the Obama Administration at the international climate negotiations in Marrakech found that fossil fuels with CCS would provide 20 percent of U.S. electricity generation by 2050 under their “Benchmark” scenario. While natural gas with CCS made the biggest contribution, both coal and bioenergy with CCS also played a role.
  • The 2014 Pathways to Deep Decarbonization in the United States report found that gas with CCS would provide nearly 13 percent of U.S. electricity under their “Mixed” case. They also modeled a High CCS case that “seeks to preserve a status quo energy mix,” in which they assumed CCS would provide 55 percent of U.S. electricity by 2050, split between coal and gas. However, this case also had the highest electricity prices.
  • DOE’s 2017 Quadrennial Energy Review found that under a scenario combining tax incentives with successful federal R&D, coal and gas with CCS could provide 5-7 percent of total U.S. generation in 2040. The scenario assumed a refundable sequestration tax credit of $10/metric ton of CO2 for EOR storage, $50/metric ton of CO2 for saline storage, and a refundable 30 percent investment tax credit for CCS equipment and infrastructure. A rough sense of what such credits would be worth to a project is sketched below.
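
As a rough illustration of the scale of such credits, the sketch below applies the QER scenario’s $10/metric ton EOR credit to the annual capture volumes from the comparison table above; it simplifies by assuming every captured ton qualifies, and it ignores the investment tax credit:

    # Rough annual value of a $10/metric ton EOR sequestration credit
    # for the two projects profiled above (capture volumes from the table).
    eor_credit_per_ton = 10                             # $/metric ton (QER scenario)
    capture_tons_per_yr = {"Kemper": 3.5e6, "Petra Nova": 1.6e6}

    for project, tons in capture_tons_per_yr.items():
        value_m = tons * eor_credit_per_ton / 1e6       # $ millions per year
        print(f"{project}: ~${value_m:.0f} million per year")

On these simplified assumptions, Kemper would earn roughly $35 million per year and Petra Nova roughly $16 million: real money, but small against Kemper’s $7 billion capital cost, which is why credits alone are unlikely to transform the economics.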

Figure 5: U.S. electricity generation mix in 2050 under deep decarbonization scenarios. Source: UCS, The U.S. Power Sector in a Net Zero World, 2016.

Policy implications

Clearly, more DOE-funded R&D is needed to leverage additional private sector investment and demonstrate coal with CCS on a commercial scale using different technologies and at different sequestration sites, like deep saline aquifers. R&D efforts should also be expanded beyond coal to include natural gas and bioenergy power plants, as well as industrial facilities. We also need to develop a strong regulatory and oversight system to ensure that captured CO2 remains permanently sequestered.

Recent policy proposals to increase tax incentives for CCS could also help improve the economic viability of CCS. A price on carbon and higher oil prices for projects using EOR could also make a big difference, but the likelihood of either happening under the Trump Administration is slim. Given the high cost, long lead time, and limited near-term role of CCS as a climate solution, federal efforts should prioritize R&D and deployment incentives on more cost-effective low carbon alternatives like renewable energy and energy efficiency. While studies show coal with CCS could play a modest role in addressing climate change by 2050, it’s unlikely to be enough over the next four years to fulfill Trump’s promises to revive the coal industry.

Two Surprising Facts about NOAA

UCS Blog - The Equation

This week the US Senate committee on Commerce, Science, and Transportation holds the nomination hearing of Wilbur Ross for Secretary of the Department of Commerce. If confirmed, Ross will take the helm of the department that has a critical mission to ‘create the conditions for economic growth and opportunity.’

There are two facts that may surprise you about the major bureau within the Department of Commerce—the National Oceanic and Atmospheric Administration (NOAA).

NOAA includes the oldest civilian science organization, established by Thomas Jefferson

Ferdinand R. Hassler led the first survey of U.S. coasts. He proposed the scientific approach that was adopted for the survey, and he was charged with overseeing the design and manufacture of special scientific instruments to accomplish the goal of the 1807 Act signed by President Jefferson. Source: http://bit.ly/scoast. Image source: NOAA

On Feb 10, 1807, Thomas Jefferson signed the act that led to the formation of what is now the nation’s oldest civilian science “agency,” the Survey of the Coast. The purpose of the act was to survey “the coasts of the United States” including “…any other bank or shoal and the soundings and currents beyond the distance aforesaid to the Gulf Stream, as in his opinion may be especially subservient to the commercial interests of the United States.” (Emphasis added.)

From the very beginning, successful navigation by ships to and from US ports was understood to be foundational to economic activities of the young democracy. The agency may have changed names over time, but it still exists as part of NOAA in the Department of Commerce.

NOAA services support more than one-third of US GDP

According to the NOAA Chief Scientist’s 2016 report, products and services of NOAA affect more than one-third of the Gross Domestic Product (GDP) of the United States. This initially may seem surprising, but it starts to make sense if you think about just a few of the services provided.

NOAA helps ships navigate successfully to and from port by providing accurate nautical charts for the safety of maritime commerce. In addition, NOAA shares timely and accurate information on factors influencing water levels (tides, storm surge, sea level rise, etc.). NOAA also developed the Air Gap system, which updates every 6 minutes to inform ships’ captains about changing conditions and keep ships from running aground or striking a bridge.

Anyone who eats seafood benefits from NOAA’s stewardship of sustainable fisheries and healthy ecosystems, which supports jobs and helps keep our seafood safe. NOAA has improved forecasts for harmful algal blooms. Scientists at the agency and its cooperative institutes conduct research and monitoring on changes in fisheries and marine ecosystems from ocean acidification and temperature changes.

Businesses, farmers, homeowners and nearly everyone living in the US at some point makes important decisions based on weather forecasts. No matter your source of weather, all forecasts are underpinned by observations and models provided by NOAA through its National Weather Service (NWS) and National Environmental Satellite, Data, and Information Service (NESDIS).

National Hurricane Center Hurricane Track Improvement

Recent tropical storm and hurricane forecasts from the National Hurricane Center (NHC) have improved track forecasts over time, narrowing the evacuation region at longer lead times (NOAA).

Working with domestic and international partners, NOAA provides increasingly accurate hurricane tracks, which give people more time to get out of harm’s way. More accurate forecasts help save lives and lessen disruptions to economic activities.

Annual US GDP in 2015 was $18,036.6 billion. Services that support one-third of that output are well worth the investment: NOAA’s budget is equivalent to only a nickel a day for every American (FY16 Budget).
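
That “nickel a day” figure is easy to check with round numbers. Assuming a NOAA budget of roughly $5.8 billion (the approximate FY16 level) and a US population of about 320 million, both my own approximations:

    # Back-of-the-envelope check of the "nickel a day" claim (Python).
    noaa_budget = 5.8e9       # approximate NOAA FY16 budget, dollars (assumption)
    us_population = 320e6     # approximate US population, ~2016 (assumption)

    per_person_per_day = noaa_budget / us_population / 365
    print(f"~${per_person_per_day:.2f} per person per day")   # ~$0.05, a nickel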

Coastal intelligence for the 21st century. NOAA provides the latest science and observations for protecting lives and supporting a growing economy. Source: NOAA

Image credits: NOAA (http://bit.ly/frHassler). Combined image with labels created by B. Ekwurzel (left image: http://bit.ly/hurEvac; right image: http://bit.ly/2jytXsf).

UCS to the NRC: Stop Dragging Your Feet on Important Nuclear Security Updates

UCS Blog - All Things Nuclear

Yesterday, UCS sent a letter to Nuclear Regulatory Commission (NRC) chairman Stephen Burns urging the NRC to quickly issue new versions of two outdated security documents that play a critical role in defining how nuclear plants can be adequately protected against terrorist attacks.

NRC Chair Burns (Source: NRC)

The NRC requires nuclear power plants to be protected against radiological sabotage. The design basis threat, or DBT, specifies the characteristics of the attackers that a nuclear plant’s security plan must be designed to protect against (e.g., how many attackers and what sort of equipment they may have). The DBT includes both physical attacks and cyber attacks, and specifies that the attackers can include both outsiders and insiders.

In addition, the 2005 Energy Policy Act requires that every three years the NRC must stage mock attacks (known as “force-on-force” exercises) at each nuclear power plant to demonstrate that plant security forces can protect against the DBT.

As is the case for many of its other regulations, the NRC issues documents that provide guidance to nuclear reactor owners on acceptable means for meeting these security requirements. The NRC periodically reviews these guidance documents and updates them when appropriate. However, the NRC is taking far longer than usual to revise two important security guidance documents, which have not been updated since 2007 and 2009.

Why?

Because the nuclear industry is blocking the way. As I note in the letter, “finalizing the revisions has been unnecessarily delayed due to extensive, persistent and … unreasonable objections raised by the Nuclear Energy Institute (NEI) and the power reactor licensees to the changes proposed by the NRC staff.”

 

This is Our Moment: Time to Amplify the Energy of the Food Movement

UCS Blog - The Equation

The nomination of our nation’s new Secretary of Agriculture is imminent—likely to occur over the three days prior to Friday’s inauguration, according to Vice-President-elect Mike Pence. As my colleague Nora Gilbert and I recently wrote, we’ll soon know whether the new administration will use this key position to support the rural and farming population that was so instrumental in placing them in power.

As my colleague Karen Stillerman has meticulously documented, however, we can tell a great deal about the new administration’s intentions from Mr. Trump’s choices for other cabinet and diplomatic positions. In brief, we can expect strong support for export-oriented commodity production, rollback of environmental regulations and the undermining of workers’ rights and wages. If these expectations come true they will work directly against the interests of most farmers and rural citizens—and against key pillars of the “good food” movement, which is working for a more healthful, equitable, and sustainable food system.

Before concluding that under such a scenario—the lack of official federal support—the movement for a better food system for all will stall for at least four years, it is well to take a sober look at the current moment in good food matters.

The 2016 general election was good for the good food movement. This isn’t happy talk. While it is true that food and agriculture issues weren’t part of the official electoral discourse, there were victories for key local building blocks of a better food system. It wasn’t just that voters in four cities approved a tax on soda (joined soon thereafter by Cook County, Illinois.) An Oklahoma initiative that sought to protect animal factory farms from regulation was defeated. And four states voted to raise their minimum wage above the anemic $7.25 per hour federal standard.

These are not trivial achievements. More than 2 million low-wage workers stand to benefit from the successful poverty-fighting ballot initiatives of Arizona, Colorado, Maine and Washington. The Oklahoma Farm Bureau and livestock interest groups spent $1 million in their cynical effort to permanently exempt themselves from environmental responsibility in that state. The powerful American Beverage Association spent $38 million to fight the soda initiatives. These formidable forces are not shadow boxing.

Which brings us to the major reason good food advocates should be encouraged. Many things can be said about this election, but these developments make clear that a bright line has been drawn between the interests of a narrow fringe of agribusiness and the broader interests of the nation, including most of its farmers. Most importantly, given the dynamics of this election—ostensibly a contest to overturn entrenched business interests in Washington and reverse growing economic inequality—it is a contest that is too far gone for those narrow agribusiness interests to win. Even if they are ushered directly into leadership of the Department of Agriculture.

How do we know this? The only sector of the food business that is growing is good food, as processors, retailers and restaurateurs know full well because they are scrambling to keep pace with this customer-led trend. That, in turn, is but one indicator of a larger shift in the nation’s food culture.

Americans have become keenly interested in food as a way to improve health, local economies, farmer wellbeing, and justice for food workers. Witness: Breakfast cereal sales have been declining for a decade. Soda sales are at a 30-year low. Red-meat consumption has plummeted for four decades. For the first time in a decade, annual obesity rates declined in four states. Local food sales grew to at least $12 billion in 2014 (from $5 billion in 2008), and some estimates indicate these could reach $20 billion by 2019. In 2013-2014, schools purchased almost $800 million in local food, benefiting both regional economies and more than 23 million children in over 42,000 schools. Such innovations can only be successful with the full support of school administrators and parent associations.

Additionally, Americans are actively seeking ways to support farmers directly. Over 8,600 farmers markets are now set up regularly in the United States, and almost 1,400 farms are listed as offering direct on-farm sales. More than 700 community-supported agriculture schemes have registered with the Department of Agriculture’s directory. Crucially, citizens have understood and are supporting cross-cutting measures to link food purchasing with the wellbeing of farmers, workers, and the environment. This is what the groundbreaking Good Food Purchasing Program enables (the program has been adopted thus far in Los Angeles, San Francisco, and Oakland, and is soon to come to other major American cities). Some 215 food policy councils around the country pursue similar goals, as do state food charters adopted by Michigan and Minnesota.

So, when the new administration’s agricultural advisors purport to speak for “American agriculture” and say that they know better than their clients what the direction of the food system is, they are clearly out of step with both market dynamics and the nation’s food culture. The truth is that, at best, they are speaking about the interests of just 4 percent of the farming population: those who operate at a scale (annual sales of $1 million or more) that can engage with global, export-oriented agribusiness markets. These large industrial operations have little in common with the vast majority of US farms, which numbered about 2.1 million in 2012.

This should be important for the new administration, because the rural and farming population that has supported them will rightfully expect federal policies that are equitable and favor most farmers, not just a sliver of already wealthy and politically entrenched agribusiness interests.

Which brings us to the major reason for hope, and a concrete agenda for the next four years of the food movement. Clearly, the nation’s food system innovations are springing, bottom-up, from communities and from state and local governance, in a largely non-partisan manner. While leadership from the federal level would be welcome, the trend to redirect the food system toward good food has taken hold and is driving the commercial food sector to restructure. We can tell the change is real when the largest companies in the sector are investing serious resources to transform their value chains to meet customer demand. The good food movement must continue applying its pressure and leading this fast-paced local and regional work in pursuit of a socially equalizing agenda for more healthful, sustainable, fair, affordable and humane food production.

Meanwhile, if the incoming federal administration is to make good on the expectations it has created among its supporters, it must reconcile crucial inconsistencies between its outright divisive and violent campaign rhetoric and the actual interests of its major supporters. Foremost among these are:

  • Policies that benefit the majority of the nation’s farmers, of all scales, ethnicities and genders, by supporting fair prices and reinvestment in rural economies and infrastructure;
  • Comprehensive immigration reform, including an end to wage inequality and worker safety exemptions, which otherwise amount to sanctioned labor exploitation, leading directly to poverty and hunger in the midst of one of the wealthiest nations on earth. Without this labor, farms will not work—and no one understands this better than the nation’s farmers;
  • Investment in research, extension and education for regenerative agricultural practices, the kind that reward farm management skills and result in higher profit margins for farmers. Public investment in this area of agricultural science is essential because the private sector is not motivated to develop knowledge that doesn’t result in products (like pesticides and synthetic fertilizers) that can be sold year after year. Studies have shown that each dollar invested in agricultural research returns a $10 benefit to the economy;
  • Increasing the minimum wage to enable food workers and other marginalized members of the working class—disproportionately people of color—to afford fair prices for food and to thrive as full-fledged contributors to a healthy economy.

A constant throughout the swirls and eddies of American history and progress has been the persistence and dedication of citizens to lead at the grassroots level—on the front lines of school boards, city councils, county boards and state legislatures, and through their entrepreneurial innovation—to develop, test and apply the better ideas that work for everyone. It remains to be seen whether the new federal administration will follow through on its promises of creating new jobs and a vibrant economy for those left behind by globalization and economic elitism, for farmers, rural citizens and the working class, but if it does, it will merely be following the shifting food culture. The food movement has risen; it is made up of everyone who eats and wants a better tomorrow; it is already reshaping the food business; and it is a force that cannot be stopped—unless we become dispirited. As my colleagues Mark Bittman, Michael Pollan, Olivier De Schutter and I argue, a moment of truth for the food movement has arrived. We must continue working for what we want, yet amplify the movement’s momentum by making common cause with others who will fight for a better world for us all. What could make the nation greater than that?

Photo: Michael Fleshman/CC-BY-NC-2.0, Flickr

#dieselgate, pt. II: Sergio’s Revenge

UCS Blog - The Equation (text only) -

Today, the Environmental Protection Agency announced that Fiat-Chrysler (FCA) violated the Clean Air Act with sales of Ram 1500 and Jeep Grand Cherokee vehicles powered by diesel engines. This comes on the heels of the Volkswagen (VW) diesel scandal. The engine in question is FCA’s 3.0-liter “EcoDiesel”—which could turn out to be anything but.

What the allegations say

Since this is an ongoing investigation, there are still a number of unanswered questions.  Here is what EPA has alleged:

  • Fiat-Chrysler did not disclose to EPA that certain software affects the operation of the emissions controls devices found in the Ram 1500 and Jeep Grand Cherokee.
  • The software in question shuts off operation of the emissions control device. While this is allowed in extreme cases to protect the reliability and durability of the controls, EPA found numerous operating conditions that would fall into the category of “normal operation and use” and would therefore not be an allowable exception.
  • Taken together, these have the effect of a defeat device—this means that FCA could be liable for cheating federal emissions regulations and emitting smog-forming pollution well above the legal limit.

For his part, Sergio Marchionne, CEO of FCA, says that “there was zero intent on our side” to cheat on emissions regulations, calling such an idea “unadulterated hogwash” and noting that there’s “nothing in common between the VW reality and what we are describing here.”

According to the EPA, Ram 1500 pickups and Jeep Grand Cherokees powered by the 3.0-liter EcoDiesel engine contain software that may behave as a defeat device, releasing tons of excess smog-forming pollution that negatively impacts public health. (Images courtesy of Motor Trend: Jeep Grand Cherokee and Ram 1500)

How does the sequel compare to the original?

FCA is correct to note that this is not quite the same thing as what VW did—in VW’s case, the software was explicitly tuned to alter operations depending solely on whether or not the vehicle was undergoing a test procedure. The software in question in the Ram 1500 and Jeep Grand Cherokee is more subtle: the emissions control devices remain fully operational during lab tests and are turned off during some typical on-road driving situations. The question is whether those situations are narrow enough to fall within the exceptions allowed to protect the durability and reliability of the engine and its emissions control systems. Because the situations identified are so broadly typical of routine driving behavior, these undisclosed shutdown modes were enough to raise a few eyebrows, particularly when considered in tandem.

How do the emissions controls work in the 2014-2016 EcoDiesel Jeep/Ram trucks?

The emissions control system in the Jeep/Ram trucks relies on two separate systems working together to reduce the formation of nitrogen oxides (NOx), one of the key ingredients of smog: exhaust gas recirculation (EGR), which routes exhaust gas back into the engine to reduce the formation of NOx, and selective catalytic reduction (SCR), which injects a fluid that chemically reduces NOx after it forms.

When one of those systems is turned off, the other can often compensate for a short time. For reasons related to the durability of the engine and/or the control systems, there are narrow operating conditions under which such a shut-off is allowed. However, such operation must be disclosed to EPA (which EPA alleges FCA did not do), and it must fall within a narrow band of operating conditions. Key here is that EPA identified a number of situations in which both systems would simultaneously be turned off or reduced in effectiveness—meaning the emissions system would be compromised, regularly emitting excess NOx during normal vehicle operation and use. Such conditions would not arise when the vehicle was tested for emissions, which is another part of the reason this would have the effect of a “defeat device.”

In addition to the EPA allegations, a lawsuit was filed against FCA in December for the same vehicles by consumers who bought these vehicles in part because of the “Eco” and “clean diesel” marketing around them.  Accompanying that lawsuit was data from on-road emissions tests which showed average emissions of around 4-5 times the legal limit measured during real-world driving of a Ram 1500 EcoDiesel, with spikes in NOx as high as 40 times the certified level.

What are the potential health and emissions impacts?

Because of the shortage of details, it is impossible to know the full impact this scandal will have on emissions and public health.  Still, there are a few reasons why the problem here is likely not as severe as the VW scandal.

The vehicles affected by this allegation have only been available since model year (MY) 2014, which means regulators can address the problem much more quickly than in the VW scandal, minimizing the impact on the environment. The average vehicle identified here is likely to have traveled just half the mileage of those affected in the VW scandal. Perhaps most importantly, the levels of excess emissions generated by the vehicles in question appear to be significantly lower than for many of those in the VW scandal—the data provided in the lawsuit show that excess emissions from these vehicles could be around one-third those of the VW diesel cars, at the tailpipe. And in total, MY2014-2016 EcoDiesel Ram 1500 and Jeep Grand Cherokee sales amount to about one-fifth the sales of the VW diesels covered under “dieselgate.” Taken all together, the impact from this could thus be roughly a few percent that of the Volkswagen scandal.
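To make the arithmetic explicit, here is a minimal back-of-envelope sketch in Python that simply multiplies the three rough ratios cited above (fleet size, mileage, and per-mile excess emissions); all three inputs are approximations, so the output is indicative only:

# Back-of-envelope: combine the rough ratios cited above.
fleet_ratio = 1 / 5    # MY2014-2016 EcoDiesel sales vs. VW diesels in "dieselgate"
mileage_ratio = 1 / 2  # newer fleet, so roughly half the miles traveled
excess_ratio = 1 / 3   # excess NOx per mile vs. the VW diesel cars

impact = fleet_ratio * mileage_ratio * excess_ratio
print(f"Estimated impact relative to the VW scandal: {impact:.1%}")
# Prints about 3.3% -- i.e., "roughly a few percent"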

While that may not sound like a lot, these software cheats could have helped contribute to at least a handful of premature deaths and increased hospitalizations from air pollution-related cardiovascular distress.  Aside from the significant damage to public health and the environment, cleaning up this mess will also likely require tens, if not hundreds, of millions of dollars in remediation—hopefully payable by FCA and not the taxpayer.

Is there any good news on the horizon?

Another scandal like this is obviously terrible for the American people, especially those near congested roadways. It is also not great for automakers counting on more efficient diesel engines to meet the vehicle efficiency regulations set out to 2025.

Like the VW problem, this scandal will not be easy to fix.  While FCA believes it can address the issue solely via software updates and has offered to do so, VW said the same thing about its 3.0L diesel engines, and we are still waiting for an approved fix for those vehicles.

EPA is rightly using more real-world emissions testing to complement its lab tests, and similarly subtle emissions violations by other automakers could yet be uncovered. Given how complex and nuanced this investigation’s outcomes are, additional inquiries into other manufacturers are possible—Mercedes, for example, has already been under investigation for similar software.

With the next administration getting ready to take office, it will be important that EPA continue to protect the health and well-being of the American public by remaining vigilant in its oversight of automakers, maintaining a level playing field where all are held equally accountable for their actions.

Rick Perry and the “Texas Approach” to Renewable Energy and Infrastructure

UCS Blog - The Equation (text only) -

Rick Perry—Trump’s pick for the Department of Energy—saw how infrastructure can impact energy development when he was governor of Texas.

Texas has a history of success producing energy. Two keys to this success have been infrastructure and markets, which served to make Texas the leading state for wind energy development.

Every energy supply needs infrastructure. Texas has always taken a “pro-competition” approach to ensuring adequate infrastructure so that new electric energy supplies can compete in the market. The approach paid off in a big way for windpower, at the direction of the Texas legislature. In 2005, the legislature set out a framework of “competitive renewable energy zones” designed to provide infrastructure for increased renewable energy expansion.

How does this work?

The process worked this way: first, estimates were made of energy production, costs, and benefits. Then Texas required commitments of interest from private developers. At the time, renewable energy in Texas weighed in at 3,000 MW. (A single nuclear plant is about 1,000 MW.) Today, Texas has 18,000 MW of wind and another 5,000 MW under construction. The infrastructure made all the difference.

How well does this work?

Texas produces three times as much wind energy as the next most successful state, Iowa. This is in large part thanks to transmission, which is as necessary to windfarms as a road to market is to any farm’s produce.

So, the “Texas approach” could be a great model for the United States more generally. In fact, a collaboration of utilities has described a transmission plan for the eastern United States.

And what about gas pipelines?

Gas fields don’t deliver gas to markets; gas pipelines are needed for that. When the gas industry began expanding in the US with fractured shale developments, it put out a forecast of the pipeline investment that would be needed. The cost of the gas infrastructure in those plans was much higher than the estimated cost of the transmission expansion needed for wind.

By those numbers, a 10-fold increase in renewable energy would require less money for transmission than the pipelines needed for a doubling of natural gas use.

Energy development is something Texas understands. The electric companies are not required to keep buying more windpower, but they do. The old power plants are not shielded from competition, so when electricity from wind can beat coal or natural gas on cost, the electric companies buy whatever is less expensive. Now that solar panel prices have fallen so far, the grid operator in Texas predicts that solar farms will be built much, much faster than new natural gas-burning power plants.

These are the lessons of Texas infrastructure during the Rick Perry years. Now, will Rick Perry bring this approach to the Department of Energy and make transmission a priority?

No President Should Have Absolute Authority to Launch Nuclear Weapons

UCS Blog - The Equation (text only) -

After Donald Trump takes the oath of office later this week, he will be given the codes that allow him to order the launch of nuclear weapons.

At that point, Mr. Trump will inherit a deeply flawed system: one that gives sole and absolute authority to the president to launch US nuclear weapons—and that can put extreme time pressure on him to make that decision.

Minuteman III missile (Source: Dept. of Defense)

One of the narratives that arose during the presidential campaign was that a Trump finger on the nuclear button would increase the risk of nuclear war because he is seen as impulsive and unpredictable.

Whatever the merits of those arguments, the public seemed shocked to learn that the US president has the authority to decide—on his or her own, for whatever reason—to launch nuclear weapons, and that no one has the authority to veto that decision. There are military and political experts in advisory roles, but the final authority rests with the president alone.

It’s time to change that policy. The reasons behind it are now outdated.

On at least one occasion White House officials were worried enough about the mental state of the president that they tried to insert roadblocks in the way of a potential launch decision.

That was in 1974, late in the Nixon administration. US Secretary of Defense James Schlesinger grew concerned that President Nixon still had control of US nuclear weapons despite the fact that the Watergate crisis had rendered him depressed, emotionally unstable, and drinking heavily. Schlesinger instructed the Joint Chiefs of Staff to route “any emergency order coming from the president”—such as a nuclear launch order—through him first.

Why are things set up this way?

The main reason for giving the president launch authority was the concern that a decision to launch a nuclear strike might need to be made very quickly.

During the Cold War, officials feared the Soviet Union might attempt a disarming first strike against US nuclear weapons. The US response was to build warning sensors and create options to launch land-based missiles very quickly on warning of an attack—so-called “launch under attack” options—before the Soviet missiles could land and destroy US missiles in their silos.

To do this, the US put its missiles on hair-trigger alert so they could be launched within a matter of minutes should US sensors warn of an incoming Russian attack. This system gives the president only about 10 minutes to decide whether to launch these missiles. False alarms have plagued the system in the past—leading to the risk of an accidental or mistaken launch.

Because of this extreme time pressure, the system was designed so that the launch decision—arguably the biggest decision in history—is made by a single person, the president.

And this Cold War system is still with us today.

The football. (Source: Smithsonian Inst.)

Everywhere President Trump goes, he will be followed by a military officer carrying a briefcase (called the “football”) containing everything he would need to order a launch within minutes, including secure communications equipment and descriptions of nuclear attack options.

Of course, the president could also order a nuclear launch even if there was no incoming attack. And no one has the authority to stop him from doing so.

Outdated and dangerous

Today, however, the military sees a surprise attack as extremely unlikely. More importantly, it has confidence in the survivability of US nuclear missiles based on submarines at sea, which carry more than half the US deployed arsenal of nuclear weapons. Since these forces are not vulnerable to a first strike, retaliation—and therefore deterrence—does not depend on launching quickly (if it ever did).

This fact allows several important changes in US policy.

First, it means the US can take its land-based missiles off hair-trigger alert and eliminate options for launching on warning of attack.

Presidents Obama and Bush both called for an end to hair-trigger alert, but did not change the current system. This one sensible change would significantly reduce one of the biggest nuclear threats to the US public by eliminating the risk of a mistaken launch, which would almost certainly lead to a retaliatory strike. We have argued strongly that the current practice is dangerous and have repeatedly called on President Obama to remove US missiles from hair-trigger alert.

President Trump should make this happen. His recent statements show he is interested in working with Russia to reduce the nuclear threat. Putin may agree to take Russia’s missiles off alert, since he knows that weaknesses in Russia’s early warning system increase the risk of a mistaken Russian nuclear launch. But Mr. Trump should not give Mr. Putin a veto over taking this step: if Russia drags its feet, the US should not wait to act.

Second, it means that a launch decision does not have to be made within a few minutes, allowing time for more than one person to be involved in making launch decisions.

Removing this time pressure eliminates the rationale for giving the president the sole authority to launch nuclear weapons. Instead, the United States can establish a process to involve others in any decision to use nuclear weapons.

Requiring a decision to be made by even a relatively small group of people—say, the president, vice-president, speaker of the House, and secretaries of State and Defense—would prevent a single person from making an irrational or impulsive decision, but would still involve a small enough group to be manageable in a crisis.

Some members of Congress and outside experts have argued over the years (and most recently) that any first-use of nuclear weapons (as distinct from the launch of a retaliatory nuclear strike) should require a declaration of war by Congress. Thus, a decision to use nuclear weapons—except in response to a nuclear attack—would require the approval of elected officials and would not be solely up to the president.

People were right to feel shocked when they learned that this potentially civilization-ending authority is in the hands of one person.

The current system must be changed—no matter who is sitting in the Oval Office.

Trump and the Nuclear Codes: How To Launch a Nuclear Weapon

UCS Blog - All Things Nuclear (text only) -

There has been a lot of talk about the fact that after his inauguration, Donald Trump will have his finger on the “button” used to launch nuclear weapons. But the president does not actually have a “button.”

Instead, when he becomes president, he will be given the nuclear codes that enable him to launch a nuclear strike.

What does that actually mean?

The Nuclear Football: Speed Matters

The US nuclear launch system is built for speed: it is designed to allow the president to receive warning of an incoming attack, be briefed on the specifics, decide what to do, and launch the response, all within the roughly 30-minute flight time of an attacking long-range missile.

This demanding timeline was established during the Cold War to allow US land-based missiles to be launched on warning of an attack, before the attacking missiles could land and destroy them in their silos. It persists today despite the fact that most US nuclear weapons are deployed on submarines at sea, which are invulnerable to such an attack; quick launch of land-based missiles is therefore no longer needed for deterrence.

The football (Source: Smithsonian Inst.)

The short timeline means that, in practice, the president may have less than a minute to be briefed by military officers and his advisors, and fewer than 10 minutes to decide whether to order a launch.

Warning of attack could come when the president is essentially anywhere, with no time to get to the White House situation room or the Pentagon.

The only way to make this system work is to ensure everything the president needs to order a launch is never far from his side. So he is constantly escorted by a military officer carrying a 45-pound “nuclear briefcase”—typically called the “football.” The football contains, among other things, a secure communications system and the “black book” that lists the menu of preset launch options the Pentagon has drawn up.

The Nuclear Codes: Who’s Who

The president could also, in principle, use the football to order a nuclear strike that was not in response to an attack.

No matter why he ordered a launch, the president needs a way to convince those in the military who actually carry out the launch order that he really is the president and not, for example, a Russian hacker.

That’s where the “nuclear codes” come in. They aren’t codes to actually launch missiles, but to allow the president to prove he is authorized to order a launch.

The codes are essentially a list of letters on a 3×5 card, called the “biscuit,” and they are updated daily. To prove he is the president, he must be able to provide the right code from the list on the card. The card may be carried in the football, although some presidents have chosen to carry it with them.

Once the president has provided the proper codes for the day, no one has the authority to stop the launch process.

The Nuclear Launch Order

Once the president has provided the correct identification codes and his launch option over a secure communication line, officers in the Pentagon’s “war room,” or National Military Command Center (NMCC), send out encrypted messages called Emergency Action Messages (EAMs).

Missile launch officers (Source: Dept. of Defense)

The EAMs are the actual launch orders giving the details of the launch. The EAMs would go, for example, to the officers in the various underground Launch Control Centers who would launch US land-based missiles.

The EAMs specify the launch plan, including the targets and number of warheads to be launched, and the time of the launch. They also contain launch authorization codes that allow the launch crews to confirm that the order is real by comparing it to codes they have in their safes. These are called “sealed-authentication system” (SAS) codes.

Once the EAM arrives, ICBM crews can launch their missiles in 60 seconds and submarine crews can launch in 12 minutes. In that time, the US could launch some or all of the roughly 900 nuclear weapons it keeps ready for immediate launch. The remaining deployed weapons in the US arsenal would take longer to launch. These include aircraft carrying bombs and cruise missiles, and missiles on submarines at sea but out of range of their targets.
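For intuition only, here is a toy Python sketch of that chain of checks. Every code value, name, and data structure below is hypothetical; the sketch mirrors only the sequence described above (biscuit verification, then an EAM, then the crew’s SAS comparison) and says nothing about the real, classified system:

import hmac

# Toy model of the authorization chain described above. All values and
# names are hypothetical; only the sequence of checks mirrors the post.
DAILY_BISCUIT_CODES = {"GOLD-ALPHA-7"}  # hypothetical: today's valid presidential codes
CREW_SAFE_SAS_CODES = {"SAS-19-KILO"}   # hypothetical: codes held in a launch crew's safe

def nmcc_verify_president(provided: str) -> bool:
    # The war room checks the provided code against today's biscuit list.
    return any(hmac.compare_digest(provided, code) for code in DAILY_BISCUIT_CODES)

def build_eam(plan: str) -> dict:
    # The EAM carries the launch plan plus a sealed-authentication (SAS) code.
    return {"plan": plan, "sas_code": "SAS-19-KILO"}

def crew_authenticate(eam: dict) -> bool:
    # The launch crew compares the EAM's SAS code to the codes in their safe.
    return any(hmac.compare_digest(eam["sas_code"], code) for code in CREW_SAFE_SAS_CODES)

if nmcc_verify_president("GOLD-ALPHA-7"):
    eam = build_eam("preset option from the black book")
    print("Crew authenticates order:", crew_authenticate(eam))  # True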

Can President Trump Uphold Scientific Integrity in Government Decisionmaking? New Report Tells What’s At Stake

UCS Blog - The Equation (text only) -

Last week, the US Department of Energy released a revised scientific integrity policy in what was likely the last move by the Obama administration to promote scientific integrity in federal decision-making. But we cannot forget the many steps that preceded it. Today, the Union of Concerned Scientists releases a report, Preserving Scientific Integrity in Federal Policymaking: Lessons from the Past Two Administrations and What’s at Stake under the Trump Administration. The report characterizes the progress, missteps and unfinished business of scientific integrity under the Obama administration and considers what’s at stake under the Trump administration.

How will our new president treat government science? Will his administration follow the new scientific integrity policies now in place at federal agencies? And importantly, will the new policies and practices across the government be able to safeguard science through his presidency?

We’ve certainly seen some alarming indicators of how President Trump will treat science. In addition to spreading false information on climate science, vaccines, and wind farms (to name a few), we continue to see cabinet nominees with strong records of disparaging science or undermining science-based policies. It is hard to imagine a Trump administration in which science won’t be politicized.

To be fair, as the report details, the Obama administration was not without missteps when it comes to politics overriding science (Think: ozone and emergency contraception). But it’s easy to forget where we came from.

Scientific integrity from Bush to Obama: We’ve come a long way, baby

In the early 2000s, reports started trickling out revealing that the George W. Bush administration was misusing science. We heard from government scientists that their work was being suppressed, manipulated, or misused by political forces—and this was happening across federal agencies and across issue areas, from FDA drug approvals to education to endangered species to climate change. The scientific community was caught off guard. Never before had political interference in science been so pervasive across the government.

But the scientific community fought back. The Union of Concerned Scientists organized 15,000 scientists to tell the administration that this disrespect of science would not stand. We surveyed thousands of federal scientists to quantify and document the state of science in federal decision-making. We developed detailed policy recommendations—many of which were ultimately enacted by the next administration. We got strong media coverage, pushed other prominent scientific voices to speak out on this issue, and raised the political price of misusing science for political purposes. The administration ultimately walked back on several political moves where science had been undermined.

This chart from “Preserving Scientific Integrity in Federal Policymaking” assesses the scientific integrity progress made at various federal agencies to date. (Click for full-sized version.)

When the next president came in, scientific integrity was high on the agenda. In his inaugural speech, President Obama vowed to “restore science to its rightful place” and took several steps in his first hundred days to do so. There are now scientific integrity policies at 24 federal agencies. While they vary in quality, the policies are designed to guard against the kind of abuse we saw under the Bush administration. Many federal scientists now have more rights written into their agencies’ policies—rights to share their scientific work with the media and public, rights to review documents based on their science before their public release, and rights to share their work in the scientific community. Many policies also explicitly prohibit political appointees and public affairs staff from manipulating agency science, and some agencies have instated scientific integrity officials to oversee the new policies. We have a long way to go in terms of ensuring these policies are implemented, but we are certainly in a better place than we were eight years ago.

Scientists to Trump: We are ready and watching

So how will government science fare under Trump? Scientists are not just going to wait and see. More than 5,500 scientists have now signed onto a letter asking the president-elect to uphold scientific integrity in his administration. Government scientists are more prepared to recognize losses in scientific integrity than ever before and they are now equipped with more tools for dealing with it when they occur. Scientists everywhere are archiving government data and websites, watching every move of the administration, and prepared to hold the new administration accountable when they misuse science or target scientists. We know what’s at stake. We’ve come too far with scientific integrity to see it unraveled by an anti-science president. It’s worth fighting for. And I won’t sit on the sidelines.

The Truth About Pruitt’s EPA Lawsuits is Even Worse Than You Might Think

UCS Blog - The Equation (text only) -

One well-reported thing about Scott Pruitt, President-elect Trump’s nominee for EPA Administrator, is his penchant for filing lawsuits to block the EPA from enforcing clean air, clean water and climate regulations, rather than suing polluters in his own state of Oklahoma.

This alone ought to provide ample grounds for rejecting his nomination. But a closer look at these lawsuits and the legal arguments Pruitt has advanced (or signed onto) tells an even more disturbing story. The legal arguments are disingenuous, often unprincipled and extreme, and display an unfortunate strategy of saying just about anything to win a case.

Consider these three examples.

Pruitt takes on climate scientists: the 2010 lawsuit challenging the EPA’s “Endangerment” finding

In 2009, the EPA made a long-overdue and wholly unremarkable finding that greenhouse gas emissions from the combustion of fossil fuels may endanger public health and welfare. In this finding, the EPA acknowledged the overwhelming consensus of the scientific community and the multiple lines of independent evidence supporting this conclusion.

While the finding broke no new ground scientifically, it was important legally: when the EPA finds that a pollutant endangers public health or welfare, the Clean Air Act requires the EPA to regulate sources of that pollutant. In this case, that meant power plants, cars, trucks, and other sources that combust coal, oil, and natural gas.

To stop such regulation in its tracks, Scott Pruitt filed a lawsuit to overturn the endangerment finding, which he and his fellow litigants characterized as “arbitrary and capricious.” Believe it or not, Pruitt’s primary argument was that the EPA should not have relied upon the multiple reports on climate change issued by the Intergovernmental Panel on Climate Change (IPCC), established by the United Nations to synthesize the work of thousands of scientists; the US Climate Change Science Program (CCSP), a Bush administration body of 13 federal agencies that issued 21 reports on climate change; and the National Research Council (NRC), the research arm of the National Academy of Sciences.

Pruitt’s legal brief never quite explains what is wrong with relying upon the world’s most prominent experts, but it claims that the EPA in effect wrongly delegated its decisionmaking to these bodies.

Here are the rather sharp words the court used when it unanimously dismissed this claim:

This argument is little more than a semantic trick. EPA simply did here what it and other decisionmakers often must do to make a science-based judgment: it sought out and reviewed existing scientific evidence to determine whether a particular finding was warranted. It makes no difference that much of the scientific evidence in large part consisted of “syntheses” of individual studies and research. Even individual studies and research papers often synthesize past work in an area and then build upon it. This is how science works. EPA is not required to re-prove the existence of the atom every time it approaches a scientific question. [emphasis added] [Page 27]

Take a moment to digest this: the person nominated to head the EPA sued that agency because it relied upon the work of the world’s most knowledgeable scientists when making a finding regarding the most important scientific question of our lifetime—whether humans are causing global warming.

Pruitt’s lawsuits to block mercury reductions using a rigged cost-benefit analysis

Photo: Peggy Davis/CC-BY SA (Flickr)

Mercury has long been known to be one of the most potent neurotoxins: ingestion of even very small amounts can have devastating effects, particularly on children. Coal- and oil-fired power plants are responsible for more than 50 percent of US mercury emissions, which travel long distances and deposit in water bodies, leading to ingestion by fish and by the humans who consume them. Effective technology to control mercury and other toxic pollutants exists and is used at many power plants, but approximately 40 percent of existing plants do not use it.

In 1990 Congress amended the Clean Air Act to specifically authorize the EPA to address mercury emissions (and other air toxics), but no progress was made due to EPA delays and litigation. In 2011, the Obama Administration issued a rule to cut mercury emissions from power plants.

The rule required approximately 40 percent of existing power plants to install the same proven controls that the other 60 percent had already adopted.

The EPA estimated that it would avert up to 11,000 premature deaths, 4,700 heart attacks and 130,000 asthma attacks every year.

Scott Pruitt and others launched a lawsuit to prevent the EPA from cutting mercury and toxic air pollutants from power plants. He scored an initial victory on a technicality—the EPA had failed to consider the cost of regulation at the preliminary stage, when it was deciding whether to regulate mercury. (I call this a technicality because the EPA did perform a formal cost-benefit analysis at the later stage, when it issued the regulation.)

The EPA subsequently complied with the court order and used an updated analysis to support the rule. The analysis showed “monetized” benefits of between $37 billion and $90 billion versus a cost of $9 billion.

Unsatisfied, Pruitt filed a second lawsuit, this time taking aim at the cost-benefit analysis. As with the endangerment finding, Pruitt’s attack led with an absurd argument—this time about how cost-benefit analysis should be done.

When the EPA tallied up the costs of the regulation, it included direct costs, like the cost of installing the pollution control, and indirect costs, like higher electricity prices. Similarly, when the EPA calculated the benefits of the regulation, it considered direct benefits, like improved public health from mercury reduction, but also indirect benefits, like reductions in other pollutants such as smog and sulfur dioxide because the pollution control technology used for mercury also reduces these pollutants.

Pruitt’s new lawsuit claims that the EPA cannot consider these “co-benefits.” Instead, he contended that the EPA should only be allowed to count the benefits from mercury reduction. His argument makes no sense—the whole point of cost-benefit analysis is to determine whether the overall societal benefit of a regulation exceeds its overall cost. And nothing in the Clean Air Act or in past practice requires the EPA to use blinders when judging the benefits. In fact, for years, under both political parties, the EPA has factored in co-benefits, and federal guidance on cost-benefit analysis calls for them to be included.
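A minimal sketch makes the trick plain. The $9 billion cost and the $37-90 billion in total monetized benefits are the figures cited above; the split between direct mercury benefits and co-benefits below is a hypothetical placeholder, since the post does not break it out:

# Hypothetical split of the rule's benefits, for illustration only.
cost = 9.0             # $ billions: the rule's costs, as cited above
direct_benefits = 5.0  # hypothetical: benefits from mercury reduction alone
co_benefits = 45.0     # hypothetical: smog and SO2 reductions from the same controls

full_net = (direct_benefits + co_benefits) - cost  # all benefits counted
rigged_net = direct_benefits - cost                # co-benefits excluded, as Pruitt urges

print(f"Net benefit, all benefits counted: ${full_net:+.0f}B")   # prints $+41B
print(f"Net benefit, co-benefits excluded: ${rigged_net:+.0f}B") # prints $-4B
# Same rule, same costs; only the accounting changes the sign.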

The court has not yet ruled on this specious claim, but it did reject a request to put the rule on hold while it sorted the question out, suggesting the court’s early skepticism of the argument’s merit. Regardless of the ultimate ruling, the bottom line in the case is this: Pruitt indefensibly favored economic analysis of regulations that considers all of their costs, but only some of their benefits.

Pruitt’s interstate pollution lawsuits reveal further hypocrisy

Photo: OK.gov

It is revealing to note that, at the same time Pruitt was suing the EPA to count all the costs (but not all the benefits) of its mercury rule, he argued against factoring in costs in a separate lawsuit. That suit sought to block an EPA rule preventing upwind states from sending pollutants to downwind states in quantities that cause the downwind states to exceed health-based pollution concentration limits.

By way of background, the EPA has tried for years to address the problem of interstate air pollution, but it is fiendishly complex. Many upwind states emit pollutants to multiple downwind states, many downwind states receive pollutants from multiple upwind states, and some states are both upwind and downwind states. Thus, it is difficult to devise a formula to fairly and effectively apportion responsibility.

In 2011, the EPA crafted a “Transport Rule” to address the problem. The agency conducted extensive analysis of the costs involved to determine how expensive it would be, per ton of pollutant reduced, to ensure that upwind states do not cause downwind states’ air quality to exceed federal standards. It then gave each upwind state a pollution “budget” for reducing the pollutants wafting beyond its borders, based on this cost-per-ton benchmark.
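As a rough illustration of how a cost-per-ton benchmark turns into a state budget, here is a small Python sketch. The approach mirrors the rule’s logic as described above, but every number and state name is made up for illustration:

# Hypothetical abatement options per state: (tons of NOx reduced, $ cost per ton).
state_options = {
    "State A": [(10_000, 300), (8_000, 900), (5_000, 2_500)],
    "State B": [(6_000, 450), (4_000, 1_800)],
}

BENCHMARK = 1_000  # $ per ton: adopt every reduction at or below this cost

def pollution_budget(current_emissions: int, options: list) -> int:
    # A state's budget is its current emissions minus all reductions
    # available at or below the cost-per-ton benchmark.
    cheap = sum(tons for tons, cost_per_ton in options if cost_per_ton <= BENCHMARK)
    return current_emissions - cheap

for state, options in state_options.items():
    print(state, "budget:", pollution_budget(100_000, options), "tons")
# State A budget: 82000 tons; State B budget: 94000 tons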

Scott Pruitt and others challenged this rule, arguing—believe it or not—that costs of compliance should not be the yardstick, and pushing instead for an approach that would have been nearly impossible to administer. Not surprisingly, in a 6-2 vote, the Supreme Court rejected his attack.

The bottom line in this case is this: the EPA focused on a problem that states can’t solve on their own (interstate air pollution). They solved the problem using a cost-effectiveness benchmark that is fair to all states, and that conservatives profess to favor. Pruitt’s attack on this approach demonstrates an abandonment of conservative principles in the service of what appears to be his ultimate objective—stopping regulation.

Opposing science to block regulation

Pruitt’s lawsuits clearly demonstrate that he is against regulation, particularly of the oil and gas industry. That much we already knew. But when one looks at the actual cases he has filed and the legal arguments he has advanced, one sees something even more disturbing—a disrespect for science, a penchant for a rigged method of performing cost-benefit analysis, and a lack of interest in helping to police the problem of interstate air pollution—which clearly must be done at the federal level.

This all adds up to someone who uses the law to block good science. This is not acceptable, particularly for the head of the EPA. And that is why I, and twelve other former state environmental protection agency heads, have signed a letter opposing this nomination.

Photo: Gage Skidmore/CC BY-SA (Flickr)

Nine Questions for Ryan Zinke, Donald Trump’s Pick to Lead the Interior Department

UCS Blog - The Equation (text only) -

Montana Congressman Ryan Zinke will begin Senate confirmation hearings today for the post of Secretary of the Interior in Donald Trump’s cabinet. As Secretary, he would oversee America’s 500 million acres of public lands, including the National Park System. Zinke would also have responsibility for timber extraction, livestock grazing, coal mining, and oil and gas leases on public lands and on 1.7 billion acres of seabed on the outer continental shelf. He has called himself a “Teddy Roosevelt Republican” but has voted against air and water protections and voted to weaken the Endangered Species Act and the Antiquities Act. He has questioned climate science, is a vocal supporter of the coal industry, and has served on the board of an oil pipeline company. We have questions for Ryan Zinke.

Avalanche Lake, Glacier National Park, near Whitefish, Montana where Ryan Zinke was raised. Photo: NPS

Do you accept that there is unequivocal scientific evidence for human caused climate change?

Congressman Zinke has said “The climate is changing, I don’t think you can deny that. But climate has always changed,” continuing that “I don’t think there’s any question that man has had an influence” but that “what that influence is, exactly, is still under scrutiny.” And in October 2014, Zinke said “It’s not a hoax, but it’s not proven science either…” In fact, there is overwhelming scientific consensus and unequivocal evidence that the global climate is warming and that, since the 1950s, human influence has been the dominant cause.

Will you support and effectively utilize nationally important climate science programs throughout DOI?

Increasing temperatures, coastal flooding and erosion, more extreme weather events, worsening wildfires and droughts, melting glaciers and thawing permafrost all represent massive risks for natural and cultural resources under the protection of the Department of the Interior (DOI) in the 500 million acres of public lands it manages. DOI’s strategic plan states that it “will bring the best science to bear to understand these consequences and will undertake mitigation, adaptation, and enhancements to support natural resilience…”

The department’s climate science stable is strong, with a formidable Climate Change Response Program in the National Park Service, and a network of eight regional Climate Science Centers (CSCs) managed by the U.S. Geological Survey (USGS) that synthesize climate impacts data to make it usable and relevant for resource managers. DOI also runs a network of 22 Landscape Conservation Cooperatives (LCCs) which bring federal and state agencies together with non-governmental organizations, tribal entities, and academic institutions to manage natural and cultural landscapes across jurisdictional boundaries, with a strong emphasis on integrating climate management.

Are you committed to do everything you can to protect our national parks in the face of climate change?

Climate change is the biggest threat to our national parks. Impacts can already be seen throughout the country, from colonial Jamestown Island to Glacier National Park, just a few miles from where Congressman Zinke was raised, in Whitefish, Montana. The National Park Service is at the cutting edge of international efforts to develop climate adaptation and resilience strategies for protected areas, but it desperately needs more resources to implement effective management strategies. The NPS has a system wide Climate Change Response Strategy and is implementing an ambitious Climate Change Action Plan. Most recently, in January 2017, the NPS published a groundbreaking Cultural Resources Climate Change Strategy (CRCCS) which lays out key policies needed to protect America’s heritage in a rapidly warming world, including advocating the incorporation of climate science into all planning for cultural resources management.

Will you fight for the budget and resources our national parks need?

Americans love their national parks, and in 2016 a record 325 million people visited them. The national park system is the envy of the world, but it is being starved of resources – not just for climate response, but for all aspects of its operations. The latest report from the National Park Service detailed a backlog of nearly $12 billion in deferred maintenance. The parks support $30 billion in economic activity and nearly 300,000 jobs, yet the NPS budget has been cut by 12 percent ($364 million) over the last five years.

Climate change is driving worsening wildfires in the West. Are you committed to helping to coordinate and implement an effective federal response?

Rising temperatures have created a longer wildfire season and drier conditions that, together with forest management strategies, fire suppression policies, and increased development at the wildland-urban interface, are causing bigger, more damaging, and costlier fires. The US Forest Service and DOI are together the coordinating federal agencies for wildfire response and management. UCS has noted the urgent need to update federal wildfire policies and budgeting in line with what we know about the growing influence of climate change. Legislation has been proposed to invest in hazardous fuels management, fire-fighting technology, and improved fire-mapping. However, a budget fix is also needed. In a June 2016 US Senate hearing, Robert Bonnie, Under Secretary for Natural Resources and the Environment, USDA, testified that: “The single most important step Congress can take to advance forest health and resilience is to enact a comprehensive fire budget solution—one that addresses both the growth of fire programs as a percent of the agency’s budget and the compounding annual problem of transferring funds from non-fire programs to cover the cost of fire suppression.” To date, Ryan Zinke’s record shows that his preferred response to the worsening wildfire situation is merely to increase logging and timber extraction, ostensibly to reduce fire risk.

Are you committed to keeping America’s public lands public for the benefit of all Americans?

Congressman Zinke has said giving away public lands is “a non-starter … in Montana, our public lands are part of our heritage.” He even resigned as a member of the Republican Party Platform committee last year in protest at moves to make public land transfers part of the platform. But in January 2017, after being nominated for the position of Interior Secretary, Zinke voted for a House rules package that included a measure to make transfers of public land cost-free and budget neutral, considerably easing the path to future privatization, which would no longer require the costs of transfers to be offset with other spending or budget cuts. Zinke says he hasn’t changed his view on transfers of public lands to state, local or private hands, but can he be trusted to hold the line?

Natural gas production pad at Padre Island National Seashore, Texas. Photo: NPS

Will you fully implement the DOI policy on scientific integrity?

DOI has one of the strongest scientific integrity policies of all government departments. It safeguards against political interference in department science and grants its scientists freedom to communicate their science. The policy aims to ensure that Interior Department decision-making can rely on robust and trustworthy science of the highest quality. It encourages “an environment of rigorous and honest investigation, open discussion, and constructive peer review, free of political influence that is needed for good science to thrive.” DOI scientists and scholars are also encouraged to participate in professional societies and scientific meetings, as well as talk to the media about their work. But the policy is only as good as its implementation. It is crucial that Representative Zinke prioritize scientific integrity at his agency.

Have you changed your mind about supporting air and water safeguards on mining, oil and gas operations?

In 2015 the League of Conservation Voters scored Ryan Zinke at a miserable 3% for his environmental record. He has voted to weaken the Clean Water Rule proposed by the EPA and the US Army Corps of Engineers and to remove safeguards on air and water protections, including from chemicals in fracking and stream pollution from mountain-top mining.

Can we trust you to limit fossil fuel infrastructure that will negatively impact our public lands, environment and cultural resources?

A train taking Montana coal towards the West Coast. Photo: Tim Evanston/Creative Commons

“I always side with Montana Coal Country,” Congressman Zinke said in a 2016 campaign ad, and he has been a major champion of the Gateway Pacific Terminal in Washington, which would be the transport hub for increased coal exports from the western states to Pacific Rim countries. The terminal was denied a permit by the US Army Corps of Engineers on the basis that the Lummi tribe’s treaty-protected fishing rights would be affected, and it has been opposed by communities through which the coal trains would pass.

Zinke also served on the board of an oil pipeline technology company from 2012 to 2015 and has been an outspoken supporter of the Keystone pipeline since being elected to Congress. Zinke is likely to be at the center of any renewed fight over the routing of the Dakota Access pipeline (DAPL). In December 2016, the US Army Corps of Engineers declined to grant an easement for DAPL to pass through land sacred to the Standing Rock Sioux tribe and vital to culturally important water resources. The Corps will now prepare an Environmental Impact Statement for alternative routes, and Zinke, as Interior Secretary, would likely play a central role in the future of the pipeline.

Even Without an Agriculture Secretary, Trump’s Cabinet Says Plenty about Food and Water Plans

UCS Blog - The Equation (text only) -

It’s official. This week’s Veterans Affairs nomination leaves the Trump administration’s Secretary of Agriculture position as the last cabinet slot to be filled. With his inauguration just 7 days away, the president-elect still hasn’t announced his pick for this vital position that touches every American’s life at least three times a day.

But while we wait (and wait, and wait) to see who will run the department that shapes our nation’s food and farm system, it may be instructive to take a look at what some of his other personnel choices say about his intentions in this realm. And particularly, what the Trump team could mean for two of our most basic human needs—food and water.

First, food. On the whole, today’s US agriculture system is skewed to production of commodity crops—chiefly corn and soybeans—the bulk of which become biofuel components, livestock feed, and processed food ingredients. That said, over the last 8 years we’ve seen increased emphasis, from the White House and the USDA, on healthy eating, local food systems, and the like.

But things seem about to change, and how. The president-elect himself reportedly lives on fast food and well-done steaks. And even without an agriculture secretary nomination, Trump’s other appointees to date seem to indicate that unhealthy food and industrial farming are back in force.

Corn is king and beef is back in Trump’s America

It’s hard to believe at a time when US corn production is at an all-time high, but with Trump’s team we might actually get more of this commodity we already have too much of.

The Iowa Corn Growers Association hailed Governor Terry Branstad’s selection last month as ambassador to China, a hire seen as a boon to that state’s corn-heavy farm sector. What does diplomacy in the Far East have to do with corn farmers in Iowa, you ask? China is already a major buyer of US farm commodities such as Iowa corn and pork, and Branstad is expected to press his “old friend” President Xi to ensure that continues. (Not to be left out, the American Soybean Association sounded happy about the Branstad pick as well.)

The ambassador-in-waiting is already plugging corn domestically, telling Iowa Public Radio and the state’s corn farmers that Trump’s chosen EPA head will support the ethanol industry they feed. Oklahoma Attorney General Scott Pruitt, you may have heard, is Trump’s highly controversial pick to head the Environmental Protection Agency (see why this is laughably unacceptable here, here, and here). Pruitt is an oil guy, and on Oklahoma’s behalf he has fought the EPA’s Renewable Fuel Standard, which boosts the ethanol industry by mandating a level of blending with gasoline. But Branstad and both of Iowa’s Senators say King Corn needn’t worry.

Meanwhile, Pruitt has endeared himself to the American Farm Bureau Federation, the chief lobby group for Big Ag, with his rabidly anti-regulatory stance. The Farm Bureau cheered Pruitt’s appointment, describing it as “welcome news to America’s farmers and ranchers – in fact, to all who are threatened by EPA’s regulatory overreach.”

Read: agribusiness won’t have to deal with pesky environmental regulations under Pruitt.

Branstad’s and Pruitt’s nominations are also gifts to the meat industry, given their allegiances to the Iowa pork industry and Oklahoma beef industry respectively, as Tom Philpott over at Mother Jones explained last month.

Throw in hamburger exec Andrew Puzder as Labor Secretary and the interests of industrialized meat and its fast-food purveyors will be well represented in cabinet meetings. (See more reasons to be worried about Puzder here and here.)

A promise of “crystal clear water”

With the food landscape being reshaped more to Trump’s liking, let’s look briefly at water. During the campaign, candidate Trump said that as president he would ensure the country has “absolutely crystal clear and clean water.” (It’s campaign promise #194 on this list.)

I’m glad he recognizes that clean water is a critical resource and something Americans want. But will we get it?

Probably not if it’s up to Scott Pruitt. Pruitt has sued the EPA over a slew of clean air efforts, including its climate, mercury, haze, and ozone rules, but he has also been vehement in his opposition to the agency’s efforts to protect the nation’s waters from pollution. In particular, he wants to kill the Obama EPA’s Clean Water Rule (also known as the “Waters of the US,” or WOTUS, regulation), which expanded the definition of waterways the federal government has the authority to protect under the Clean Water Act. The manufacturing and fossil fuel industries are major backers of the effort to kill the WOTUS rule, and Big Ag (in the form of the Farm Bureau) has joined them.

Is Trump’s USDA pick our last best hope for healthy food and clean water?

This brings us back to the long-delayed USDA nomination. Since the election, we’ve seen a parade of agriculture secretary hopefuls march in and out of Trump Tower. The process has frustrated farmers and confounded other observers (including the current USDA chief). It’s clear that the new USDA head, whoever he or she turns out to be, won’t be confirmed by the Senate until after the inauguration.

Until the president-elect makes an official announcement, it’s impossible to know where he’s going with this important position. And it is important. The US Department of Agriculture is a sprawling bureaucracy made up of 29 agencies and offices, nearly 100,000 employees, and a budget of $155 billion in FY17. Its vision statement:

[T]o provide economic opportunity through innovation, helping rural America to thrive; to promote agriculture production that better nourishes Americans while also helping feed others throughout the world; and to preserve our Nation’s natural resources through conservation, restored forests, improved watersheds, and healthy private working lands.

The emphasis is mine, to highlight that the department is supposed to be looking out for the economic well-being of farmers and their communities, the health and nutrition of all Americans, and the critical natural resources—including water—that we all depend upon.

Let’s hope that whoever takes the helm at the USDA intends to do just that—even if Trump’s other cabinet picks have given us little reason for optimism.

Now, back to waiting…

Governor Kasich Restores Clean Energy Standards in Ohio

UCS Blog - The Equation (text only) -

During the final hours of 2016, Ohio Governor John Kasich vetoed Substitute House Bill 554, legislation that would have made Ohio’s clean energy standards voluntary, effectively extending the two-year freeze. By vetoing the bill, Kasich restored Ohio’s renewable energy and energy efficiency standards, which had been frozen since 2014.

In his veto statement, Governor Kasich noted that HB 554 risked undermining the state’s job progress by taking away some of its energy generation options, and would deal a setback to efforts that are succeeding in helping businesses and homeowners reduce their energy costs through energy efficiency.

A wide array of corporations and advocacy organizations applauded the Governor for vetoing the bill. UCS also released a statement thanking Governor Kasich.

So what happened? Here’s the backstory.

History of the freeze

Ohio’s clean energy standards were created by a Republican-led General Assembly and signed into law by Governor Strickland in 2008, requiring Ohio utilities to procure energy efficiency and renewable energy. The Ohio legislature then enacted a two-year freeze of the standards in 2014, despite their proven track record of delivering economic and environmental benefits for consumers.

In December 2016, the Ohio legislature passed HB 554, which would have suspended enforcement of the increased energy efficiency and renewable energy requirements until 2019, making the standards voluntary goals for that period. This would have extended the clean energy standard freeze for two more years.

Because the Governor vetoed HB 554, the energy efficiency and renewable energy standards will be reinstated. The current clean energy standards (now restored) require utilities to secure 12.5 percent of their power from renewable sources, and achieve a 22 percent cumulative reduction in electricity use through energy efficiency.
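
To make those percentages concrete, here’s a back-of-the-envelope sketch in Python using a hypothetical utility with 10 million MWh of annual retail sales. The sales figure is an illustrative assumption; the actual compliance schedules, baselines, and annual benchmarks are set by statute:

    # Illustrative only: a hypothetical Ohio utility.
    # Real compliance uses statutory baselines and annual benchmarks.
    annual_sales_mwh = 10_000_000  # hypothetical retail sales

    renewable_share = 0.125  # 12.5 percent renewable requirement
    efficiency_cut = 0.22    # 22 percent cumulative efficiency reduction

    renewable_mwh = annual_sales_mwh * renewable_share
    efficiency_mwh = annual_sales_mwh * efficiency_cut

    print(f"Renewable power required: {renewable_mwh:,.0f} MWh per year")
    print(f"Cumulative efficiency savings: {efficiency_mwh:,.0f} MWh")

For a utility that size, the standards work out to 1.25 million MWh of renewable generation per year and 2.2 million MWh of cumulative efficiency savings.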

The Union of Concerned Scientists has been working in coordination with local experts and activists to lift the clean energy standard freeze to create a win-win situation for the state’s economy and the environment.

Benefits of the clean energy standards in Ohio

The clean energy standards have brought substantial benefits to Ohio. The standards are driving a robust clean energy sector, with more than 100,000 clean energy jobs in the state. These jobs include workers in renewable energy generation, clean transmission, energy efficiency, clean fuels, and advanced transportation.

Analysis conducted by Cadmus and the Midwest Energy Efficiency Alliance concluded that 2014 energy efficiency investments in Ohio have yielded, and will continue to generate, net benefits for the Ohio state economy. In 2014 alone, these benefits included thousands of new jobs and more than $175 million in increased statewide income. And over the entire 25-year study period, the 2014 energy efficiency programs are estimated to increase net statewide income by more than $1.2 billion and add almost $1.9 billion of total value to the state’s economy.

More opportunities for Ohio

It’s important for Ohio to take advantage of the five-year extension of the federal production and investment tax credits (PTC and ITC) for wind and solar energy resources. This provides a golden opportunity for Ohio to accelerate renewable energy deployment even further. States and utilities must act quickly to take advantage of the credits, which begin to decline in value in 2017 (for wind) and 2020 (for solar) before phasing out by 2022.

There is a lot of wind potential in the state. According to the Wind Energy Foundation, Ohio has the opportunity to further lower electricity bills, create additional jobs, increase community investment, and reduce pollution by expanding the use of wind. Wind energy could provide 6.4% of Ohio’s electricity by 2020 and increase to 15.5% by 2030, and create cumulative electricity bill savings of $5.35 billion through 2050.

Our work is far from done  

We applaud Governor Kasich for his veto of House Bill 554. Because of his leadership, Ohio’s clean energy standards are now restored.  Governor Kasich has demonstrated that he has a vision for Ohio’s clean energy future. These clean energy standards are working to create jobs, grow the economy, and reduce harmful emissions.

But our work is not done. It’s predicted that energy will be on the Ohio General Assembly’s agenda again this session. Senate President Larry Obhof has stated that Ohio needs a long-term energy plan. So we will celebrate the win, but we are prepared to continue defending clean energy in 2017!

Photo: Howard Johnson/CC BY (Flickr)

Renewable Energy for Companies: Which States Make It Easiest (or Hardest)?

UCS Blog - The Equation (text only) -

If you’re a company looking to get your hands on some renewable energy to power your operations with sources like wind and solar, it turns out some states make that a lot easier than others. Here’s what a new study says about different options for businesses interested in going clean, energy-wise.

The new study, Corporate clean energy procurement index: State leadership and rankings, offers an array of useful perspectives. It comes from the Retail Industry Leaders Association (RILA), the Information Technology Industry Council (ITI), and Clean Edge, the research and advisory firm behind various useful rankings of clean energy progress.

The analysis is aimed at assessing states “based upon the ease with which companies can procure [renewable energy] for their operations located within each state.” The index has 15 metrics in three categories: purchasing from utilities, purchasing from third parties (someone other than your electric utility), and using “Onsite/Direct Deployment Options”—putting solar or wind right on your stores, factories, and warehouses.
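
The report doesn’t spell out its scoring formula in the text, but conceptually the index works like the sketch below: score each metric, roll the metrics up into the three category scores, and combine those into a composite state score. The metric names, values, and equal weighting here are illustrative assumptions, not the study’s actual methodology:

    # Hypothetical sketch of a composite procurement index.
    # Metric names, values, and equal weights are assumptions,
    # not the actual RILA/ITI/Clean Edge methodology.
    state_scores = {
        "utility_purchasing": {"green_tariff": 8, "renewable_mix": 6},
        "third_party":        {"ppa_allowed": 10, "retail_choice": 7},
        "onsite_deployment":  {"net_metering": 9, "interconnection": 5},
    }

    def composite_index(scores):
        # Average the metrics within each category,
        # then average across the three categories.
        category_means = [sum(m.values()) / len(m) for m in scores.values()]
        return sum(category_means) / len(category_means)

    print(f"Composite index: {composite_index(state_scores):.1f}")

Run on these made-up numbers, a state would score 7.5 out of 10; the study ranks states on a composite along these lines.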

And here’s what they found:

Source: RILA, ITI, Clean Edge 2017

The top states are all over the map, literally—from #1 Iowa and #2 Illinois in the middle of the country, to New Jersey, California, and Texas.

As the top performers show, no one region has a lock on making corporate renewables purchases easy. But the authors note that some regions do better:

The Northeast, Midwest, and Mid-Atlantic regions are generally the most favorable regions in the U.S. for corporate customers seeking to power their operations with renewable energy…

Let businesses capture the economic development benefits of renewables… or not

The analysis assesses how much choice and competition for renewable energy purchases exist in each state. One indicator of that is whether companies are allowed to enter into PPAs (power purchase agreements) with third parties, which let companies take advantage of the stable prices renewables are uniquely able to offer and lock in electricity rates over the long term (the sketch after the map below shows why that hedge matters).

The answer is yes, no, or maybe:

Source: RILA, ITI, Clean Edge 2017
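
To see why that price stability matters, here’s a minimal sketch with made-up numbers: a company weighing a fixed-price PPA against utility rates that drift upward and spike in one year. The load, prices, and ten-year horizon are illustrative assumptions, not figures from the report:

    # Illustrative comparison: fixed-price PPA vs. volatile utility rates.
    # All numbers are hypothetical; real PPA terms vary widely by market.
    annual_load_mwh = 50_000
    ppa_price = 45.0  # $/MWh, locked in for the contract term

    # Hypothetical utility rates over ten years, with a spike in year 6.
    utility_prices = [42, 44, 46, 47, 49, 68, 52, 54, 56, 58]  # $/MWh

    ppa_cost = ppa_price * annual_load_mwh * len(utility_prices)
    utility_cost = sum(p * annual_load_mwh for p in utility_prices)

    print(f"Ten-year cost under the PPA:    ${ppa_cost:,.0f}")
    print(f"Ten-year cost at utility rates: ${utility_cost:,.0f}")
    print(f"Difference (plus certainty):    ${utility_cost - ppa_cost:,.0f}")

Even if the totals came out close, much of the PPA’s value is the certainty: the company knows its power cost for the whole term, spike years included.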

As a taste of some of the corporate procurements, the report includes examples of large-scale purchases by some pretty big names:

Source: RILA, ITI, Clean Edge 2017

Broadening the pie

The authors don’t stop at assessing where we are; they also suggest opportunities for a cleaner future. In particular, they offer a few ideas for what states can do to help businesses get access to renewable energy:

  1. “Remove barriers to corporate deployment” of renewables, both onsite and elsewhere
  2. “Support the development of next-generation options” for helping corporate buyers use renewables to save money or hedge against swings in electricity costs
  3. “Expand energy choice options” for commercial and industrial customers in markets that haven’t “restructured”, ones in which electric utilities still own power plants, not just the electric distribution systems
  4. “Ensure that an adequate market exists for renewable purchasing” through utilities or others
  5. Ensure that, in any type of market, renewables “can scale up rapidly”

And, as the report says, while it’s focused on helping the businesses that are members of RILA and ITI in cleaning up their own acts, its findings are also “broadly applicable to many stakeholders, including other business sectors, the military, higher education, and state and local government.”

Businesses seeing the power and value of renewable energy have been important drivers for our transition to energy choices that cut air and water pollution, improve public health, strengthen energy security, and drive economic development.

States can make it easier for leading businesses to play that important role, or not. Clearly many states see the value in making it as easy as possible to get businesses of all stripes and sizes to help us move to clean energy. This new report gives us a chance to see which states those are, and to celebrate them.

EPA (Correctly) Affirms Vehicle Standards, Despite Automaker Misinformation

UCS Blog - The Equation (text only) -

EPA finalized its determination today that the current light-duty vehicle global warming emissions standards for 2022-2025 are appropriate. This decision affirms what we have said all along—manufacturers are currently ahead of schedule on the first round of standards (2012-2016) and continue to show the many pathways to cost-effectively meeting future standards.

This is a big affirmation for both consumers and the country as a whole:

To date, our analysis shows that the standards have saved consumers more than $34 billion in fuel. By 2030, this number will grow to $450 billion, even after taking into account costs for the technology used to drive those fuel economy improvements.

At the same time, we’ve avoided over 130 million metric tons of global warming emissions. The standards are working for consumers and the environment—there’s no reason to tap the brakes on that progress.

And for all their whining about wanting to weaken the standards, the automakers themselves have provided data that shows exactly why we shouldn’t.

Automaker data shows 2012-2016 compliance was easier, cheaper than expected

As I wrote about when the proposal was released, this decision is more than four years in the making and is backed up by a tremendous amount of benchmarking, modeling, and analysis. The large body of evidence gathered continues to point to new and innovative pathways that would allow manufacturers to not just meet but exceed the standards on the books—and each new data point confirms that fact.

In fact, the automakers themselves submitted data showing just how little technology they are actually applying to their vehicles in order to meet today’s standards, with much lower penetrations of complex/expensive technologies than originally anticipated.

The 2012-2016 Final Rule (FR), which automakers initially signed off on, envisioned that a much higher penetration of more costly technologies would be needed (dashed red lines). However, manufacturers have shown innovative new ways to improve upon cheaper technologies as they overachieve on those standards (orange bars), leaving plenty of cost-effective technologies available for deployment out to 2025. (Source: Comments by the Alliance of Automobile Manufacturers, Attachment 2, pp. 40-43)

Outpacing expectations, manufacturers have continued to exceed the standards with even lower-cost technologies, thanks to investments spurred by the need to meet strong standards. This innovation has generated numerous new technology pathways, such as high-compression engines like Mazda’s SkyActiv and 48V mild hybridization—though those technologies are not yet deployed at large scale. This leaves ample room to continue reducing emissions beyond the current 2025 standards with gasoline-powered engines.

As a colleague of mine likes to say, “While automakers continue to pull the lowest hanging fruit, innovation means that the tree is constantly growing new low-hanging fruit.” This is why, historically, industry has consistently overstated the costs of regulation.

Automaker data shows that 2025 standards can be met through gasoline-powered vehicles

Additionally, while the auto companies claim on one hand that more electrification and other pricier technologies will be needed to comply in the future, their own analysis shows that they can comply through the broad deployment of advanced gasoline-powered vehicles.

Analysis submitted by the automakers shows that vehicles in 2025 can meet the standards through the deployment of turbocharged (TC), spark-ignited (SI) gasoline engines, complemented by advanced transmissions (HRST) and stop-start (SS). Note the very low penetrations of electrification required. (Source: Comments by the Alliance of Automobile Manufacturers, Attachment 1, p. 74)

These gasoline-powered vehicles will be substantially more efficient than today’s, incorporating advancements such as:

  • 48V mild hybridization, which allows for efficient electric boosting of smaller engines and improved efficiency of accessories;
  • high-compression engines running on more efficient thermodynamic cycles;
  • dynamic cylinder deactivation, which effectively downsizes the engine in real time to provide the right amount of power at the right time;
  • more efficient transmissions that keep the engine operating at its most efficient point more frequently; and
  • reductions in road load, such as improved aerodynamics and low-rolling-resistance tires, to reduce the amount of energy needed to drive the vehicle in the first place.

Investments in those technologies are buoyed by the certainty of the strong standards which EPA today affirmed, as noted by automakers: “By extending the standards for many years into the future, the agencies provide manufacturers with substantial lead-time, which is of great value in compliance planning.”

Meeting 2025 standards is no problem for automakers, which is why EPA held firm

All of this is to state the obvious: the automakers themselves show that the 2025 standards are achievable, which is part of why EPA has affirmed the standards set in 2012. So when the inevitable onslaught of automaker whining follows this announcement, remember this:

  • Automakers signed on to these standards with much hullabaloo when they were finalized;
  • Automakers are currently ahead of the game, deploying efficient technologies at reduced costs compared to original estimates of compliance;
  • Automaker data submitted in the four years since continues to show that those 2025 standards are achievable with conventional gasoline-powered vehicles (thanks to the continued investment in and deployment of fuel consumption reduction technologies); and
  • Consumers and the environment stand to benefit tremendously by leaving these cost-effective standards in place.

EPA’s decision today confirms that the data is in and crystal clear: the 2022-2025 standards put on the books in 2012 remain feasible for manufacturers and will provide significant benefits for the country and the environment.

Will a Rick Perry DOE Help Limit a Risky Overreliance on Natural Gas?

UCS Blog - The Equation (text only) -

Within the next couple of weeks, former Texas Governor Rick Perry will appear before Congress for hearings on his nomination for Secretary of Energy under the Trump administration. Mr. Perry clearly subscribes to an “all of the above” strategy on energy—and that could leave us exposed to serious consumer and environmental risks from an overreliance on natural gas. There is another way: prioritize clean, cost-effective renewable energy.

Texas leadership on renewable energy

Mr. Perry’s track record shows that there is some room for optimism that he understands the economic opportunities that renewable energy can bring and the role that investments in a modernized electricity grid can play in providing clean, affordable energy.

Indeed, Texas leads the nation in the deployment of renewable energy, with wind energy from Texas making up about a quarter of the nation’s total wind generation. Texas is estimated to have generated nearly 15 percent of its electricity from wind in 2016. And all that wind has helped provide low-cost electricity, jobs, and significant economic benefits for the state.

Mini-graphic: natural gas price volatility vs. renewable energy

But Serious Concerns on Natural Gas

Texas also leads the nation in production of natural gas (and crude oil).

The good news is that cheap natural gas and low-cost renewables are helping drive out polluting coal-fired generation nationwide. (Despite comments from Mr. Trump, most industry experts do not see this trend being reversed.) In its 2016 long-term assessment, the Electric Reliability Council of Texas (ERCOT) projects significant coal retirements in the state by 2031, with low-cost solar replacing nearly all of the coal. The public health benefits of this transition away from coal will be tremendous.

But the rub is that low natural gas prices are quickly leading to a risky overreliance on natural gas for power generation, in Texas and elsewhere. Natural gas price spikes caused by weather or other factors can then put consumers at serious risk of increases in electricity prices and the cost of heating—which poses an especially significant hardship to low-income and fixed-income households.

If efforts to expand US LNG exports proceed, there could be further upward pressure on domestic natural gas prices.  (The DOE plays an important oversight role in granting authorizations for exports and imports.)

Furthermore, the “cheap” prices do not reflect environmental concerns related to fracking, including risks to drinking water.  And they do not account for climate concerns: using natural gas to generate electricity leads to carbon dioxide emissions, and methane is emitted during the production, storage and distribution of natural gas.

Methane leakage from oil and gas operations wastes gas, and there are solutions to help reduce it. In fact, the DOE is playing an important role in funding research on innovative solutions to find and fix methane leaks from oil and gas infrastructure.

News stories indicate that Mr. Perry is taking credit for cuts in pollution in Texas related to a shift from coal to gas. There’s more progress that needs to be made in cleaning up the air in Texas. Also, importantly, a coal-to-gas transition is simply not sufficient to cut carbon emissions in line with climate (and economic) goals. (Take a look at the World Economic Forum’s new 2017 Global Risks Report, which once again points to climate change as one of the leading risks to the global economy.)

Insights from the DOE’s Quadrennial Energy Review

The DOE’s Quadrennial Energy Review lays out a vision for a modernized, more resilient energy system for America. Related to natural gas, the QER highlights the growing interdependence of the power and natural gas systems and the reliability and safety challenges posed by our increased reliance on gas. For example, the report says:

“…overall reliance on gas for electricity has gone up, creating a new interdependence and grid vulnerability.”

and

“Aging, leak-prone natural gas distribution pipelines and associated infrastructures prompt safety and environmental concerns. Most safety incidents involving natural gas pipelines occur on natural gas distribution systems. These incidents tend to occur in densely populated areas.”

As head of the DOE, Mr. Perry will play an important role in helping to figure out how to address these types of challenges, in coordination with other stakeholders.

Wide Open for (Renewable Energy) Business

One clear way to address them is to diversify our electricity mix by promoting clean energy sources like wind and solar. Mr. Perry’s experience in Texas should make clear that leveling the playing field for renewable energy is a very good thing for consumers and for the economy. It means more affordable, clean power and a diversified electricity mix that limits the risk of price spikes. And it helps foster business opportunities and innovation that put the US in a strong position to be a global leader in clean energy. With China, Germany, India, and other nations making a big push on renewables, this is not the time to cede ground on clean energy progress.

However, a cozy relationship with oil and gas interests could jeopardize these opportunities and skew the playing field in favor of fossil fuels including natural gas, with long-term consequences for our economic, energy and climate goals.

A Clean Energy Future

Putting aside the irony of the fact that Mr. Perry once ran on a platform of eliminating the agency he is now in line to lead, if he is confirmed as the Energy Secretary, he must approach the job by putting the public interest ahead of that of fossil fuel companies.

Americans want clean, affordable energy from domestic sources. Surveys repeatedly show that there is strong bipartisan support for renewable energy, with solar energy a strong favorite.

Let’s hope Mr. Perry is listening and will use his new position to foster the clean energy future our country wants and needs, one that limits the risks of an overreliance on natural gas and advances wind, solar, and other forms of clean energy.

 
