Computational Modeling Challenges and the Gulf Oil Spill

July 23, 2010


May 12, 2010: a heavy band of oil in the Gulf of Mexico, photographed during an overflight. Photo courtesy of the National Oceanic and Atmospheric Administration.
About a month after the explosion and the onset of the oil leak in the Gulf of Mexico, as oil continued to gush from the uncapped BP well, the National Science Foundation made an emergency award to a group of mathematical scientists and engineers: one million hours on 4096 cores of the Ranger supercomputer at the Texas Advanced Computing Center at the University of Texas at Austin. Among the researchers is Clint Dawson of the university's Institute for Computational Engineering and Sciences, who has been working with collaborators on a two-dimensional circulation model for the region. In a phone conversation with SIAM News, he responded to questions about the current model and the challenges facing the researchers as they work to extend it to three dimensions.

What can you tell us about the development of the model?

We have been working on a hurricane/storm-surge model for about ten years, focusing on the part of the Gulf affected by Katrina, Rita, and other severe storms, with high resolution at the coast. When news of the oil spill emerged, we already had a model covering the affected region. But to model the spill, we needed an additional piece: code for tracking the trajectory of the oil. We obtained that code from two collaborators, Johannes Westerink (a professor of civil engineering at the University of Notre Dame) and Rick Luettich (director of the Institute of Marine Sciences, University of North Carolina, Chapel Hill).
We coupled our circulation model with the particle-tracking code, and we now run the resulting code a few times a day on the Ranger supercomputer. We initialized the particle-tracking code with 20,000 particles, which seems to give pretty good coverage of the extent of the plume for now.
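[A rough sketch of the Lagrangian particle-tracking idea Dawson describes, not the collaborators' code: passive particles are advected by a gridded, depth-averaged current field with a simple midpoint time step. The grid, velocity field, and release point below are invented for illustration; the production model supplies currents on a high-resolution unstructured coastal mesh.]

    import numpy as np

    # Passive particles advected by a depth-averaged current field u(x, y), v(x, y)
    # sampled on a regular grid. The field and grid below are invented for this
    # sketch; the real model works on an unstructured coastal mesh.

    def interp_velocity(u, v, x, y, dx, dy):
        """Bilinear interpolation of gridded velocities to particle positions."""
        i = np.clip((x / dx).astype(int), 0, u.shape[1] - 2)   # column index
        j = np.clip((y / dy).astype(int), 0, u.shape[0] - 2)   # row index
        fx, fy = x / dx - i, y / dy - j                        # fractional offsets
        def bilin(f):
            return ((1 - fx) * (1 - fy) * f[j, i] + fx * (1 - fy) * f[j, i + 1]
                    + (1 - fx) * fy * f[j + 1, i] + fx * fy * f[j + 1, i + 1])
        return bilin(u), bilin(v)

    def advect(x, y, u, v, dx, dy, dt, nsteps):
        """Midpoint (second-order) Lagrangian update applied to all particles at once."""
        for _ in range(nsteps):
            ux, vy = interp_velocity(u, v, x, y, dx, dy)
            xm, ym = x + 0.5 * dt * ux, y + 0.5 * dt * vy      # half step to midpoint
            ux, vy = interp_velocity(u, v, xm, ym, dx, dy)
            x, y = x + dt * ux, y + dt * vy                    # full step
        return x, y

    # Invented current field on a 200 km x 200 km box with 1 km cells
    nx, ny, dx, dy = 200, 200, 1.0e3, 1.0e3
    X, Y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy)
    u = 0.3 + 0.1 * np.sin(2 * np.pi * Y / (ny * dy))          # m/s
    v = 0.1 * np.sin(2 * np.pi * X / (nx * dx))

    # 20,000 particles released around a hypothetical spill site, tracked for 24 hours
    rng = np.random.default_rng(0)
    x0 = 1.0e5 + 5.0e3 * rng.standard_normal(20_000)
    y0 = 1.0e5 + 5.0e3 * rng.standard_normal(20_000)
    x1, y1 = advect(x0, y0, u, v, dx, dy, dt=600.0, nsteps=144)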

What other data do you need?

We use meteorological updates, tidal data, and surface images of the oil plume. The Center for Space Research at UT Austin obtains satellite images, mainly from NOAA but also from other sources. We use those surface images of the spill as initial conditions for our depth-averaged current models.
The National Centers for Environmental Prediction, part of the National Weather Service, post weather forecasts on the web every six hours. We use the NCEP forecasts to update our model, producing a 72-hour water-current forecast.
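[A sketch of how a surface image could seed the particles, again not the group's actual workflow: classify the image into an oil/no-oil mask on a grid, then scatter particles inside the flagged cells. The mask below is synthetic; in practice the resulting positions would be advected in six-hour forcing windows, refreshed with each NCEP update, out to the 72-hour horizon.]

    import numpy as np

    # "mask" is an invented oil/no-oil field on a regular grid; real imagery would
    # first be georeferenced and classified before it could seed the model.

    def seed_from_slick_mask(mask, dx, dy, n_particles, rng):
        """Scatter particles uniformly at random inside grid cells flagged as oil-covered."""
        jj, ii = np.nonzero(mask)                         # (row, column) indices of slick cells
        pick = rng.integers(0, ii.size, n_particles)      # sample cells with replacement
        x = (ii[pick] + rng.random(n_particles)) * dx     # jitter within each chosen cell
        y = (jj[pick] + rng.random(n_particles)) * dy
        return x, y

    # Synthetic elongated slick on a 200 x 200 grid of 1 km cells
    mask = np.zeros((200, 200), dtype=bool)
    mask[90:110, 40:160] = True
    x0, y0 = seed_from_slick_mask(mask, 1.0e3, 1.0e3, 20_000, np.random.default_rng(0))
    # These positions would then be advected in six-hour forcing windows,
    # refreshed with each NCEP forecast, out to the 72-hour forecast horizon.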
We have already used about half of our time on the supercomputer---much of it during the start-up phase. To get the necessary data on tides, for example, we went back to April 1 for a two-month tidal simulation. Now that we are up and running, we do one or two forecast runs a day, each requiring about three hours.
[At this point, Dawson interjected, "I'm looking at some of the results we just generated---it actually looks like an oil spill!"]

Are other models in use? How does your model differ from them?
NOAA developed a code called WFS/ROMS (coupled Weather Forecasting System and Regional Ocean Model System) to study the physics of hurricanes. Our circulation model includes the entire western North Atlantic, with the focus on the Gulf Coast. We are now performing test runs for marshes, wetlands, and river channels along the Louisiana, Mississippi, and Texas coasts. We believe that it's the high resolution of these coastal features that sets our model apart from others.

What can you tell us about the ability of the model to track severe storms and hurricanes?
It looks as if it's going to be an active hurricane season. We have gone back and recovered data from Hurricane Gustav [2008], which struck much the same area now affected by the oil spill. We have also saved data from Katrina, Rita, and just about every other major storm of the last decade.

What are some of the challenges of moving to 3D?
We are trying to get to 3D as quickly as possible. We can run our 2D model in 3D now with no problem---it just takes longer. The huge task is determining the 3D extent of the plume. We have seen renderings of major pieces of the plume, including one on the east side and another on the west side of the spill. We need better data on the subsurface extent of the plume to initialize a 3D model.
There are also algorithmic issues to be worked out, and additional physics that needs to be incorporated in the model. On Monday, something in the model blew up, but the problem was fixed within 24 hours and we were back up and running.

What is your main challenge now?
The main challenge now is getting accurate information on the exact location of the spill, tracking it, and trying to validate the results. The spill is also growing, so we will probably have to increase the number of particles used to track it. It is also splitting into multiple plumes. As the number of particles increases, we may have to parallelize the tracking code to keep the run times within the forecast window.
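[Because the oil is treated as a passive tracer, the particles do not interact, and the tracking step splits naturally across processes. Below is a minimal sketch of that splitting idea using Python's standard multiprocessing module; the production code on Ranger would more likely distribute particles with MPI, and the drift applied here is an invented stand-in for the model currents.]

    import numpy as np
    from multiprocessing import Pool

    # Particles do not interact, so the tracking step splits cleanly across workers.
    # The drift in advect_chunk is an invented stand-in for the real tracking kernel.

    def advect_chunk(chunk):
        """Advect one block of particles independently of all the others."""
        x, y = chunk
        rng = np.random.default_rng()
        x = x + 0.3 * 3600.0 + 50.0 * rng.standard_normal(x.size)   # one hour of drift + spread
        y = y + 0.1 * 3600.0 + 50.0 * rng.standard_normal(y.size)
        return x, y

    def parallel_advect(x, y, n_workers=4):
        """Split the particle arrays into chunks and advect each chunk in its own process."""
        chunks = list(zip(np.array_split(x, n_workers), np.array_split(y, n_workers)))
        with Pool(n_workers) as pool:
            results = pool.map(advect_chunk, chunks)
        return (np.concatenate([r[0] for r in results]),
                np.concatenate([r[1] for r in results]))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x0, y0 = 1.0e5 * rng.random(200_000), 1.0e5 * rng.random(200_000)
        x1, y1 = parallel_advect(x0, y0)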

Can the model track the effects of dispersants?
The spill involves both heavy and light oil, as well as dispersants sprayed on the surface. We can tag particles with different characteristics in the model, but to model the chemical reactions involving the dispersants, we would need to add a chemist to the team. With the appropriate information, it would be possible to model the effects of the dispersants on the oil.
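[A sketch of what 'tagging particles with different characteristics' could look like in practice, with invented parameter values: each particle carries a type and a remaining mass, and a per-type decay rate stands in for the weathering and dispersant chemistry that, as Dawson notes, would require a chemist's input.]

    import numpy as np

    # Each particle carries a type and a remaining mass; the decay rates are
    # invented placeholders, not measured weathering or dispersant chemistry.

    HEAVY, LIGHT, DISPERSED = 0, 1, 2
    decay_per_day = np.array([0.01, 0.05, 0.20])   # fraction of mass lost per day, by type

    n = 20_000
    rng = np.random.default_rng(0)
    particles = {
        "x": 1.0e5 * rng.random(n),
        "y": 1.0e5 * rng.random(n),
        "kind": rng.choice([HEAVY, LIGHT, DISPERSED], size=n, p=[0.5, 0.4, 0.1]),
        "mass": np.ones(n),                        # relative mass remaining
    }

    def weather(particles, days):
        """Reduce each particle's mass according to its type's decay rate."""
        rate = decay_per_day[particles["kind"]]
        particles["mass"] *= (1.0 - rate) ** days
        return particles

    particles = weather(particles, days=3)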

Any other thoughts for now?
This is a good illustration of the benefits of federal investment in research.

