Floods have been in the news lately across the Midwest, and they’re wreaking havoc on farms and land across those regions.
Predicting flood damage and deciding how to respond can be a challenge, and in the Pacific Northwest the task is compounded by complex interactions among storms, tides and other factors.
Oregon State University researchers have developed a new methodology for building computer models that paves the way to better understanding the flood risks faced by coastal communities. That’s good news for communities near estuaries, which are vulnerable to both storm tide inundation and river flooding.
Estuaries are the points where rivers flow into the ocean. Rivers there can be affected by tidal flooding and can experience frequent, periodic changes in salinity, sunlight and oxygen. Predicting those impacts has real value for nearby residents.
Researchers used Washington state’s Grays Harbor for their work, but the methodology can be applied to any area where estuarine flooding could occur. The study was published in a recent issue of Coastal Engineering.
Kai Parker, corresponding author of the study, noted that flooding in areas like the Pacific Northwest is complicated because many processes contribute, including tides, large waves and river flow. “We need to be able to predict water levels on several time scales,” he says.
During a particular storm, short-term predictions can inform decisions about actions including evacuations and road closures. But it’s also important to understand how flooding occurs across longer time scales, so planners can have more information when deciding whether to develop a low-lying land parcel.
Computer model at work
The new computer model uses emulation and statistical techniques, as opposed to traditional models that try to directly reproduce the wide collection of physical processes at play when estuaries flood. The direct reproduction approach requires considerable time and processing power, Parker says.
“The computational expense makes it difficult to study flooding at long time scales,” he says. “The key question we wanted to answer in this study is, ‘Is there a better way to handle long simulation times for computationally expensive flooding models?’”
Parker explains that researchers were able to reduce the complexity of the model using statistical methods to create an emulator. Once the training data set is created and the emulator is trained, additional use of it comes at nearly zero cost, Parker says.
Essentially, the emulator isn’t measuring actual storm input, but recreating it — using key statistics to predict outcomes. The researchers found the emulator performed well, reproducing extreme water levels of recent flooding events at Grays Harbor.
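The workflow described above — run an expensive physics model a limited number of times, fit a cheap statistical surrogate to those runs, then query the surrogate at nearly zero cost — can be sketched in a few lines. This is a minimal illustration only: the function names and forcing variables (tide amplitude, wave height, river discharge) are hypothetical stand-ins, and a simple linear least-squares fit is used where the study’s actual emulator would use a more sophisticated statistical method.

```python
import numpy as np

# Hypothetical stand-in for an expensive hydrodynamic model: maps storm
# forcing statistics (tide amplitude in m, wave height in m, river
# discharge in m^3/s) to a peak water level. A real model run could
# take hours; this toy function is instantaneous.
def expensive_model(tide, wave, discharge):
    return 0.8 * tide + 0.3 * wave + 1e-4 * discharge

# Step 1: build a training set by running the expensive model on a
# sample of forcing conditions (the costly, one-time step).
rng = np.random.default_rng(0)
X = rng.uniform([0.5, 1.0, 100.0], [3.0, 8.0, 5000.0], size=(50, 3))
y = np.array([expensive_model(*row) for row in X])

# Step 2: train the emulator -- here the simplest possible surrogate,
# a linear least-squares fit with an intercept term.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Step 3: predictions are now nearly free, so thousands of flooding
# scenarios can be explored without rerunning the physics model.
def emulate(tide, wave, discharge):
    return float(np.dot(coef, [tide, wave, discharge, 1.0]))
```

Because evaluating the fitted surrogate costs only a dot product, long simulation periods and large ensembles of storm scenarios become tractable — which is the computational bottleneck Parker describes.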
Roughly 140 miles northwest of Portland, Ore., Grays Harbor is a shallow bay — its average depth is roughly 5 meters — with a deepwater navigation channel maintained by the U.S. Army Corps of Engineers.
Grays Harbor covers 235 square kilometers, is fed by five rivers that drain a watershed of more than 7,000 square kilometers and is “subject to an energetic storm and wave climate,” Parker says, providing a solid test for the model.
“Our model is very useful, since we can use it to explore an infinite variety of future flooding scenarios,” Parker says. “This allows us to better understand the risk of flooding in coastal communities, as well as how this risk will change moving into the future.”
The National Oceanic and Atmospheric Administration supported this research, as did the Quinault Treaty area tribal governments.