ICIAM 2011: Math Modelers Dive into the Gulf Oil Spill

February 5, 2012

Barry A. Cipra

In one of the unintentional laugh lines of the 1997 disaster movie Volcano, Tommy Lee Jones, playing the head of a FEMA-like emergency-response agency faced with an impending river of lava, turns to an underling and shouts, "Get me a scientist!" How quaintly 20th-century. In the new, multi-disastered millennium, our ruggedly handsome bureaucrat is more likely to issue the order, "Get me a mathematical modeler!"

If Mac Hyman of Tulane University has his way, Tommy Lee's 21st-century counterpart won't need to shout---the analytic and algorithmic help he needs will already be at his side, having computed the worst- and best-case scenarios and ready to offer calm, actionable advice. At ICIAM 2011, Hyman illustrated some of the ways in which quick-response mathematical modeling could have helped meet one of the major disasters of 2010: BP's Deepwater Horizon oil spill in the Gulf of Mexico.

Hyman's presentation was part of a minisymposium on the oil spill organized by Ben Fusaro of Florida State University. The speakers were Fusaro, Hyman, and Clint Dawson* of the University of Texas at Austin.

The BP disaster began on April 20, 2010, with a fiery explosion on the Deepwater Horizon platform, a 52,000-ton high-tech buoy covering some 10,000 square meters of open water. The explosion killed 11 workers on the platform. That tragedy would have held the public's attention for at most a news cycle. What riveted the nation (and to a lesser extent the rest of the world) for months was the ensuing ecological disaster: Equipment failures left the freshly drilled deep-water well unsealed and crude oil gushing from the seabed.

BP initially downplayed the extent of the leak, estimating it at a mere thousand barrels (42,000 gallons) per day. That estimate quickly quintupled, but it was only after outside experts began studying footage from video cameras sent down to examine the uncapped well that the correct order of magnitude became apparent. It was eventually determined that the gusher began at a rate of around 62,000 barrels per day, slowing to around 53,000 barrels per day by the time the well was capped on July 15, for a total spill of approximately 4.9 million barrels.
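A back-of-the-envelope check shows how those figures hang together. Here is a minimal sketch, assuming (purely for illustration) that the flow declined roughly linearly from the initial to the final rate over the days the well flowed:

```python
# Back-of-the-envelope check of the total-spill figure, assuming
# (hypothetically) a linear decline in flow from blowout to capping.
from datetime import date

days = (date(2010, 7, 15) - date(2010, 4, 20)).days  # 86 days of flow
q_start, q_end = 62_000, 53_000                      # barrels per day
avg_rate = (q_start + q_end) / 2                     # average under linear decline
total = avg_rate * days
print(f"{days} days x {avg_rate:,.0f} bbl/day ~ {total/1e6:.2f} million barrels")
# ~4.95 million barrels, consistent with the ~4.9 million figure above.
```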

In the meantime, evidence of the spill's extent began to shimmer on the surface of the Gulf and wash up on its beaches. The Deepwater disaster effectively shut down commercial fishing in the Gulf and put a dent in the area's tourism, with estimates of short-term costs ranging from hundreds of millions to tens of billions of dollars.

The oil spill's likely long-range effects on the environment are still being calculated; hundreds of species are estimated to be threatened. And topping off the tank of unanswered questions is the ultimate unknowable: How much worse could the disaster have been? Could the spill, for example, have entered the Gulf Loop Current, to be swept from there around the Florida Keys and up the East Coast? How lucky was the absence of major hurricanes in the Gulf that summer?

Such questions cry out for data-driven computational analysis. Mathematical modeling, Hyman notes, can predict where the oil will go, what its ecological impact will be, and what can be done to mitigate the effects. In the aftermath of the Deepwater disaster, the National Science Foundation awarded a slew of "rapid response" grants for the study of some of these questions. Hyman pointed to those of Naphtali Rishe of Florida International University, Andrew Juhl of Columbia University, Richard McLaughlin of the University of North Carolina, Ethan Kubatko of Ohio State University, Andrea Bertozzi of UCLA, Azmy Ackleh of the University of Louisiana at Lafayette, Caz Taylor of Tulane, Xiaobo Tan of Michigan State University, and David Finkelstein of the University of Massachusetts.

Rishe, for example, is working on giving decision-makers---or anyone, for that matter---easy access to everything they need to know about the oil spill. Specifically, a module he is developing for FIU's TerraFly project is tailored to regions affected by the Deepwater disaster. TerraFly, which is open to anyone with a web browser, allows users to "fly" online over aerial imagery and query geospatial databases. The oil spill module will be a geospatial database management system running on hardware tailored to the problem at hand. Its intended applications range from environmental monitoring to the analysis of real estate values.

Kubatko has been studying the what-if scenario of oil from a spill in the Gulf traveling around to the Atlantic side of Florida and from there up along the Intracoastal Waterway (also called "The Ditch"). He is developing a tool for modeling the transport of oil entrained in Gulf Stream eddies so that simulations can identify at-risk areas, including beaches and coastal wetlands. The tool is based on the ADCIRC (Advanced Circulation) model developed by Dawson and colleagues Rick Luettich of the University of North Carolina at Chapel Hill, Randall Kolar of the University of Oklahoma, and Joannes Westerink of the University of Notre Dame. One of ADCIRC's specialties is the modeling of hurricane storm surges.
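ADCIRC itself solves the shallow-water equations on unstructured coastal meshes; what a transport tool layers on top is, in essence, particle tracking through the computed currents. Here is a toy sketch of that Lagrangian step, with an invented steady eddy-plus-drift velocity field standing in for real model output (the field, time step, and particle counts are all illustrative, not Kubatko's):

```python
import numpy as np

# Toy Lagrangian tracer transport: advect oil "particles" through an
# idealized, steady eddy with a uniform eastward drift. A cartoon of the
# particle-tracking step that would sit atop ADCIRC velocity fields.

def velocity(x, y, strength=1.0, drift=0.2):
    """Swirling flow about the origin plus a uniform eastward drift."""
    r2 = x**2 + y**2 + 1e-9
    return -strength * y / r2 + drift, strength * x / r2

def advect(px, py, dt=0.01, steps=2000):
    """March each particle forward with a midpoint (RK2) scheme."""
    for _ in range(steps):
        u1, v1 = velocity(px, py)
        u2, v2 = velocity(px + 0.5 * dt * u1, py + 0.5 * dt * v1)
        px, py = px + dt * u2, py + dt * v2
    return px, py

# Release a small patch of particles and see where the eddy carries it.
rng = np.random.default_rng(0)
px = rng.normal(1.0, 0.05, 500)
py = rng.normal(0.0, 0.05, 500)
fx, fy = advect(px, py)
print(f"patch center moved to ({fx.mean():.2f}, {fy.mean():.2f})")
```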

Several groups are looking at what the oil does and how it moves before reaching the surface or the shoreline. Juhl has been measuring properties of oil in the subsurface plume and the photosynthetic capacity and biomass of organisms affected by the spill. The goal is to estimate the location and extent of plumes, map their spatial extent near the surface, and understand their effect on subsurface oxygen concentrations. McLaughlin and colleagues, meanwhile, have taken a major step toward explaining why plumes persist in the first place, by modeling their behavior in density-stratified water. Laboratory experiments and numerical simulations have shown that whereas oil injected by itself into cold, salty water will rise to the surface, oil mixed with soap forms micro-droplets that rise only until they encounter warmer, less salty water of matching density. That's pertinent because one of the "treatments" for the Deepwater disaster was 800,000 gallons of chemical dispersants pumped directly into the flow of oil at the wellhead.
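The trapping mechanism lends itself to a one-line calculation. A minimal sketch, assuming a linearly stratified water column and a fixed effective droplet density (all numbers hypothetical): a micro-droplet rises until the ambient density falls to its own.

```python
# Illustrative trapping depth for a dispersant-formed micro-droplet,
# assuming (hypothetically) a linear ambient density profile. Real
# plume dynamics are far more involved.

rho_surface = 1023.0   # kg/m^3, warmer/fresher water near the surface
rho_bottom  = 1028.0   # kg/m^3, cold salty water at depth
depth_max   = 1500.0   # m, roughly the depth of the wellhead

rho_droplet = 1026.0   # kg/m^3, hypothetical oil-dispersant micro-droplet

# The droplet rises while it is lighter than its surroundings and stalls
# where ambient density matches its own (z measured downward, in meters):
z_trap = (rho_droplet - rho_surface) / (rho_bottom - rho_surface) * depth_max
print(f"droplet trapped near {z_trap:.0f} m depth")  # ~900 m with these numbers
```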

Tan is developing robotic fish equipped with oil-detecting sensors. The idea is that a school of small oil-sniffing fish can determine where oil plumes are and where they're going, so that beach closures and cleanup efforts can be planned in advance. Needless to say, oil-detecting sensors are not the only instruments that could be installed. The project, according to the proposal abstract, "could usher in a new era in aquatic environmental monitoring."
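The article doesn't describe the project's control algorithms, but a cartoon of cooperative plume-seeking is easy to sketch: each fish takes a local concentration reading, and the school drifts toward whichever member is sensing the most oil (the plume shape, gains, and noise below are all invented):

```python
import numpy as np

# Toy plume-seeking behavior for a school of sensor-equipped robots.
# Purely illustrative; not the robotic-fish project's actual scheme.

def concentration(p, source=np.array([3.0, -2.0])):
    """Hypothetical Gaussian oil plume centered at `source`."""
    return np.exp(-np.sum((p - source)**2, axis=-1) / 4.0)

rng = np.random.default_rng(1)
school = rng.normal(0.0, 1.0, size=(12, 2))    # 12 fish near the origin

for step in range(60):
    readings = concentration(school)
    leader = school[np.argmax(readings)]        # fish smelling the most oil
    # Everyone nudges toward the leader, plus a little exploratory noise.
    school += 0.1 * (leader - school) + rng.normal(0.0, 0.05, school.shape)

print("school center:", school.mean(axis=0).round(2))  # converges near (3, -2)
```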

The poster children of any environmental disaster are the distressed animals whose ecological niches have been nicked. Ackleh and Taylor are working to assess the impact of the oil spill on marine mammals and migratory birds, respectively. Ackleh and colleagues hope to predict the long-term effects on the population dynamics of endangered whale species. They have conducted acoustic experiments, in essence eavesdropping on whale conversations in the Gulf, and are comparing what they hear to similar data collected earlier in the decade. The group is developing statistical and mathematical models to estimate what happened to marine mammal populations in the vicinity of the disaster and what the long-term impact is likely to be. Taylor's group is comparing indicators of fitness for shorebirds wintering in the Gulf at oil-exposed and unaffected control sites. The question is not just what happens to the birds themselves when they ingest oil at their wintering sites, but also what happens to the distant Arctic ecosystem when they try to go back to breed.
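As a flavor of the before-and-after inference such acoustic data can support, here is a minimal sketch, assuming (hypothetically) that call detections over equal survey effort behave like Poisson counts whose rate tracks abundance; the counts themselves are invented:

```python
from math import exp, log, sqrt

# Hypothetical before/after comparison of whale-call detections over
# equal-effort survey periods, treated as Poisson counts. Counts invented.
calls_before, calls_after = 480, 360

ratio = calls_after / calls_before               # relative change in detection rate
se = sqrt(1 / calls_before + 1 / calls_after)    # std. error of the log rate ratio
lo, hi = exp(log(ratio) - 1.96 * se), exp(log(ratio) + 1.96 * se)
print(f"detection-rate ratio {ratio:.2f} (approx. 95% CI {lo:.2f}-{hi:.2f})")
```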

Oil-soaked animals have a poster cousin in pictures of tar-balled beaches. Much of the economic impact on tourism is centered on how long the oil sticks around once it comes ashore. Finkelstein is documenting the geochemical fate of oil in the wake of the spill, using mass spectrometric identification and isotopic characterization. The idea is to test models of biodegradation by characterizing residual organic compounds as a function of exposure time for oil that has washed ashore. Bertozzi is also studying the littoral fate of oil, but from a fluid-mechanics point of view. The flow of oil in water is challenging enough, but the dynamics of oil in sand---or, rather, sand in oil---is even more complex.
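The article doesn't specify which biodegradation models are being tested, but a common baseline is first-order decay of a marker compound with exposure time. A minimal sketch, with made-up field data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical first-order biodegradation: residual concentration of a
# marker compound decays exponentially with exposure time. A common
# baseline, not necessarily Finkelstein's model; data are invented.

def first_order(t, c0, k):
    return c0 * np.exp(-k * t)

t_days = np.array([0, 10, 30, 60, 120, 240], dtype=float)  # days ashore
conc   = np.array([1.00, 0.88, 0.66, 0.45, 0.21, 0.05])    # relative concentration

(c0, k), _ = curve_fit(first_order, t_days, conc, p0=(1.0, 0.01))
print(f"fitted decay rate k = {k:.4f}/day (half-life ~{np.log(2)/k:.0f} days)")
```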

Bertozzi's group has developed mathematical models for the behavior of oil-sand mixtures on slopes. That's particularly relevant to issues in beach clean-up: A beach, after all, necessarily has a slope.
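Models of this kind typically couple an equation for the film height to one for the sand concentration and are considerably more elaborate, but the classical lubrication-theory backbone for a thin viscous film on a slope of angle $\alpha$ gives the flavor:

\[
\frac{\partial h}{\partial t} + \frac{\partial}{\partial x}\!\left( \frac{\rho g \sin\alpha}{3\mu}\, h^{3} \right) = 0,
\]

where $h(x,t)$ is the film thickness, $\mu$ the viscosity, $\rho$ the density, and $g$ gravity. Steeper slopes mean a larger downhill flux, which is why the slope matters for how quickly spilled oil, and the sand entrained in it, redistributes itself.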

As "rapidly" as these research projects have taken shape, the real point, Hyman says, is to have this sort of scientific infrastructure in place before a new oil spill, hurricane, or some other disaster strikes. The mathematical scientists working on these projects should keep in mind the need to present their results to decision makers in a way that's meaningful to the decision-making process. In part, that means emphasizing outcomes and options, rather than long-winded explanations of the fascinating technical aspects of the underlying research; it also means that researchers need to cultivate relationships of trust with the people they'll be advising, so that the people in charge of responding to the next disaster (and the one after that) will know who to turn to and what kind of help to expect. Otherwise, another famous movie line comes to mind: "What we've got here is failure to communicate."

*See "Computational Modeling Challenges and the Gulf Oil Spill," SIAM News, July/August 2010; http://www.siam.org/news/news.php?id=1787.

Barry A. Cipra is a mathematician and writer based in Northfield, Minnesota.

