

Lisa Suatoni’s Blog

Is anyone tracking the oil? If so, the plan and the data should be made public

Lisa Suatoni

Posted May 25, 2010 in Curbing Pollution, Moving Beyond Oil, Reviving the World's Oceans, Saving Wildlife and Wild Places


The need to track (and assess) the oil

The federal government plays two important roles following an oil spill: (1) providing aid and guidance to the cleanup efforts and (2) conducting a damage assessment to facilitate restoration.  To achieve these goals the government needs to know not only the volume of oil entering the environment, but where it goes and what it impacts.  These roles are not voluntary; they are mandated by law, under the Oil Pollution Act of 1990.

Given that the Deepwater Horizon blowout is offshore, the principal government office tracking the oil at sea and identifying its fate (air, open ocean water, sea bottom, shore) is the Emergency Response Division at the National Oceanic and Atmospheric Administration's (NOAA) Office of Response and Restoration. The Emergency Response Division uses this information to advise the federal on-scene coordinator and the responsible party (in this case, BP) about where and how to concentrate the response efforts. For example, it has been producing and updating oil slick trajectory maps over the past month. This job is essentially 'damage control'.

In addition, the federal government, along with state and tribal 'trustees', is responsible for conducting a natural resource damage assessment (NRDA).  To properly conduct an assessment the trustees need to know the fate of the oil and its byproducts in the environment, and the subsequent impacts on plants, animals, and habitats.  Again, given that the spill originates offshore, NOAA will play a key role in this process by tracking the oil at sea and characterizing its chemical and physical state.  This will be done through the Assessment and Restoration Division of the Office of Response and Restoration.  (The Department of the Interior, which oversees 33 wildlife refuges and several units of national parks/seashores along the Gulf coast, will also play a critical role in the damage assessment process.)  This phase can be viewed as 'damage assessment'.

 

The necessary research

The Gulf oil disaster is unique and challenging in a number of ways.  The release is deep and continuous, and dispersants have been applied extensively and with unprecedented techniques.  Understanding the behavior and fate of the spilled oil will require a carefully designed, aggressive research effort.

In general, tracking the fate of oil released in deep water is significantly more challenging than surface spills.  As a 2002 NRC study (Oil in the Sea III) points out, “[t]he release of oil beneath the surface introduces a number of complications compared to oil released at the surface.  From the standpoint of fate … important complications are enhanced by dissolution in the water column and, perhaps, emulsification.” 

The formation of very small droplets of oil in deep waters can result in oil plumes below the surface that can expand and travel great distances in currents, undetected.  Following a field study of deep sea spills at the Helland Hansen site in the Norwegian Sea, scientists found that they could account for the fate of less than 28% of the oil, as much of it did not surface in the vicinity of the release location during the trial.

As a result of these challenges, field monitoring and careful surveillance are of particular importance in deep sea spills.  Numerical models used to describe the behavior and predict the fate of deep sea oil releases are less developed and reliable than models used to track surface spills.  Consequently, field data used to validate the models are critically important.  Ultimately, the open ocean assessment effort should employ a suite of techniques, including: aerial surveillance to document slick behavior, sonar to track subsurface plumes, autonomous underwater vehicles to sample chemical and physical characteristics of the water, traditional water sampling to characterize the chemistry and toxicity of the oil, bottom sampling to identify contamination of sediments, and sampling of marine life to assess damages.

 

Rising concerns about monitoring and assessment

A recent article in the New York Times raised concerns that efforts to track the subsurface oil have been lagging.  For example, it is not clear why the Emergency Response Division is releasing trajectory maps for the surface oil only, when there are indications that much of the oil resides below the surface of the water.

The federal government has given assurances that the necessary research to track the oil and conduct a full off-shore assessment has commenced.  Yet questions remain about how coordinated and comprehensive the undertaking is.  We believe that making the at-sea assessment plan, and subsequent data, publicly available would go a long way to reducing the growing concern.  There is no clear reason for the current lack of transparency.

The enormous, and potentially unprecedented, challenge facing NOAA's Office of Response and Restoration should be recognized, particularly given that the office has lost one-third of its staff over the past five years.  If the apparent lag in ocean sampling data is real, and is a consequence of being forced to prioritize 'damage control' over 'damage assessment' (e.g., fewer than a dozen of the 1,200 boats out on the water are studying the spill), the government should take notice immediately and provide NOAA the resources necessary to carry out its simultaneous responsibilities.

The prolonged nature of this event offers an historic opportunity to study oil response strategies and to develop a better understanding of the impacts of deepwater oil spills.  It would be a double tragedy to miss this opportunity.


Comments

Weston | May 26, 2010 12:00 AM

Thanks. I really appreciated this perspective.

Dominique Bachelet | May 26, 2010 03:35 PM

Richard Kerr, Eli Kintisch and colleagues have been publishing and posting on the Science magazine web site updates of what is known and unknown about the spill (e.g., "Science of the Oil Spill: Our Reporting Team Tackles Five Key Issues"). There is important information there about the potential cause (methane hydrates), dispersant (Corexit) toxic effects on natural microbes that could transform the oil, and the suite of impacts on plants and animals that use the area both on the coast and in the water. The challenges of detection and estimation of long term impacts are huge. It is essential that all the information gathered about the spill be made readily available to all the scientists who are trying to evaluate the extent of the damage and do something about it. Working for an organization that specializes in making conservation data available to all (CBI), I really want to emphasize the importance of communication and the need for transparency about what data are available. I also want to stress the importance of support for long term monitoring of the area so that not only acute symptoms are recorded but also the long term impacts on the area's fisheries and wetlands, which may last for years to come.

Comments are closed for this post.

About

Switchboard is the staff blog of the Natural Resources Defense Council, the nation’s most effective environmental group. For more about our work, including in-depth policy documents, action alerts and ways you can contribute, visit NRDC.org.
