Can artificial intelligence outsmart natural disasters?


As natural disasters grow more severe in the United States, local governments are increasingly using predictive analytics to understand where and when an emergency will impact their communities.

When fires start in Gilpin County, Colorado, in the United States, they burn hot and grow fast.

Floods in Texas’ San Antonio River basin spill across highways, blocking emergency responders, and Norfolk, Virginia, sees homes inundated by coastal storms.

Earthquakes shaking the US Pacific Northwest risk derailing trains, injuring residents and causing power outages at hospitals.

Prediction and early detection tools – as well as automated responses – aim to help local governments reduce the damage from these kinds of natural disasters.

Today’s tools are warning residents, triggering mitigation and helping first responders react more effectively.

As artificial intelligence (AI) advances, sensors proliferate and data collections grow, prediction and detection technologies are likely to become more precise and effective.

Tarek Ghani, assistant professor of strategy at Washington University’s Olin Business School, and Grant Gordon, former senior director of innovation strategy at the International Rescue Committee, envision using AI to predict disasters in advance, thus enabling responders to take swift actions to prevent or mitigate them.

In their 2021 article, Predictable Disasters: AI And The Future Of Crisis Response, they write that such tools could also anticipate how a crisis would develop, guiding responders to be more effective in their interventions.

Not all disasters are equally accessible to AI, however, and the technology is most reliable at analysing events for which the root causes are well understood, plenty of data is available to train the algorithms, and instances are recurrent enough that the models’ predictions can be compared against reality and fine-tuned, Ghani and Gordon write.

Floods are a strong example.

The San Antonio River Authority (Sara) uses a tool to predict floods 12 hours in advance and inform emergency responders, its senior technical engineer Wayne Tschirhart said.

In Bexar County, Texas, where the tool is currently deployed, rainfall can turn into a full-fledged flood within two hours, but the National Weather Service only updates its predictive models twice a day, meaning a flood could come and go before a new alert is out.

Sara sought to supplement those services with its own, locally focused projections that it reruns every 15 minutes, giving a picture as close to “real time” as possible.

Sara uses data from the National Weather Service’s forecasting products to inform its model.

To ensure the system stays accurate, it compares its estimates against on-the-ground readings it pulls every 10 minutes from water gauges placed at high-risk areas like dams and low-water crossings.
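
In rough terms, the cycle Tschirhart describes – pull the latest forecast, re-run the model, check projections against gauge readings and flag trouble spots – might look something like the Python sketch below. Every function name, data source and threshold here is an illustrative assumption, not Sara’s actual system.

```python
from datetime import datetime

# A minimal sketch of the re-run-and-check cycle described above.
# All sources, names and thresholds are assumptions for illustration.

FLOOD_STAGE_FT = {"low_water_crossing_7": 2.0, "dam_outfall_2": 4.5}  # assumed

def fetch_nws_forecast():
    """Placeholder for pulling National Weather Service forecast products."""
    return {"basin_rainfall_in": 3.1}

def read_stream_gauges():
    """Placeholder for the gauge readings pulled every 10 minutes at dams
    and low-water crossings."""
    return {"low_water_crossing_7": 1.4, "dam_outfall_2": 0.6}

def run_flood_model(forecast, gauges):
    """Placeholder model: projected water level per site over the coming hours."""
    return {site: level + forecast["basin_rainfall_in"] * 0.5
            for site, level in gauges.items()}

def refresh_cycle():
    """One pass of the cycle the authority reruns every 15 minutes."""
    projected = run_flood_model(fetch_nws_forecast(), read_stream_gauges())
    for site, stage in projected.items():
        # Flag any site projected to reach its (assumed) flood stage.
        if stage >= FLOOD_STAGE_FT[site]:
            print(f"{datetime.now():%H:%M} alert: {site} projected at {stage:.1f} ft")

refresh_cycle()
```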

Disaster responders need to balance a desire for early alerts against their need for accuracy.

“We can get up to 12 hours pretty accurately, and it’ll go out to 24 hours; that’s our maximum prediction horizon,” Tschirhart said.

“We want to know... now that the rain has fallen, within the next few hours, what can we expect from this creek as far as flooding is concerned? The further you go out (in your projections), the riskier it gets.”

The idea is to inform emergency responders quickly so they can route around submerged bridges and roadways when rushing to help, rather than being forced to backtrack.

Responders identifying inaccessible areas would also know to call for help from neighbouring jurisdictions that may have easier access.

Earthquakes aren’t as easily – or feasibly – predicted as floods, however.

In fact, the US Geological Survey (USGS) stresses that accurate earthquake predictions are impossible now and in the “foreseeable future”, with scientists able, at best, to give the probability of a significant quake hitting an area “within a certain number of years”.

Researchers and city agencies seeking to minimise quake damage are instead focused on detecting, and responding to, the first signs of a quake as rapidly as possible.

The USGS offers an early detection system in California, Oregon and Washington known as ShakeAlert.

It uses a network of seismic sensors to detect and evaluate the first vibrations of an earthquake (known as primary waves, or P waves), then relay their readings to a data centre.

If four separate sensors register shaking, the system’s algorithms assume it is a real event rather than a false positive caused by, say, a truck hitting a sensor or other non-quake vibrations, explained Bob de Groot, USGS ShakeAlert national coordinator for communication, education, outreach and technical engagement.

Algorithms then estimate the scope, location and severity of the earthquake’s shaking.
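
The corroboration step de Groot describes – waiting for several independent stations to trigger before trusting a detection – can be sketched as below. The four-station count comes from his description; the data structure, time window and everything else are illustrative assumptions, and the real ShakeAlert algorithms are far more sophisticated.

```python
from dataclasses import dataclass

MIN_STATIONS = 4  # number of independent sensors that must register P waves

@dataclass
class PWaveDetection:
    station_id: str
    arrival_time: float   # seconds since epoch
    amplitude: float      # ground-motion amplitude at the station

def is_likely_earthquake(detections: list[PWaveDetection],
                         window_seconds: float = 5.0) -> bool:
    """Treat an event as real only if enough distinct stations trigger within
    a short window; a single triggered sensor (e.g. a truck hitting it) is
    ignored as a probable false positive."""
    if not detections:
        return False
    detections = sorted(detections, key=lambda d: d.arrival_time)
    earliest = detections[0].arrival_time
    stations = {d.station_id for d in detections
                if d.arrival_time - earliest <= window_seconds}
    return len(stations) >= MIN_STATIONS
```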

USGS gives those details to its various message distribution partners – including public transit agencies and companies like Google – which then send out warnings to residents and critical infrastructure operators via phone alerts, app-based notifications and other methods.

Messages should arrive seconds before the earthquake’s damaging secondary waves hit.

Ideally, residents receiving alerts have time to drop, cover and hold on to something to reduce their chances of injury.

Some cities and institutions also automatically trigger public safety responses when they receive ShakeAlerts.

San Francisco’s transit system automatically slows trains, for example.

Other common automations include activating hospitals’ backup generators, opening elevator doors at the nearest floor and shutting water utility valves to avoid the risk of reservoirs emptying.
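
In essence, these automations map an incoming alert to a list of pre-approved actions that run without waiting for a human decision. A minimal Python sketch, with entirely hypothetical action names, message fields and thresholds:

```python
# Illustrative handler for automated responses to an incoming ShakeAlert-style
# message. Real integrations go through licensed technical partners; nothing
# here reflects an actual interface.

def slow_trains(): print("trains slowing")
def start_backup_generators(): print("hospital generators starting")
def open_elevator_doors(): print("elevators stopping at nearest floor")
def close_water_valves(): print("reservoir valves closing")

# Each facility registers the mitigations it wants to fire automatically.
AUTOMATED_ACTIONS = [slow_trains, start_backup_generators,
                     open_elevator_doors, close_water_valves]

def on_shake_alert(message: dict) -> None:
    """Run every registered mitigation as soon as an alert arrives."""
    if message.get("expected_intensity_mmi", 0) >= 4:  # assumed local threshold
        for action in AUTOMATED_ACTIONS:
            action()

on_shake_alert({"magnitude": 5.3, "expected_intensity_mmi": 5})
```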

Ghani and Gordon also thought automation could accelerate post-incident response.

Mitigating large-scale disasters can require extra emergency funding, and automated systems could send financial aid to areas where algorithms calculate a high probability of a serious event occurring.

Different levels of funding could be automatically unlocked as crises worsen, ensuring responders have the resources at hand to get straight to work, rather than be delayed by the need to seek and wait for aid.

“Instead of an operational infrastructure grounded in post-hoc fundraising and service delivery, a future humanitarian system could orient around an operational structure that flexibly increases capacity for rapid response as a crisis worsens,” Ghani and Gordon wrote.

Hitting the limit

ShakeAlert produces its analysis within seconds or tens of seconds of an earthquake’s start, and in earthquake response, every instant of advance notice counts.

To achieve its speed, ShakeAlert can only collect a “snapshot” of information before analysing and transmitting findings – otherwise the warning comes too late. What it offers is a rapid-fire best-guess assessment of the situation.

“With ShakeAlert, we have a trade-off between time and accuracy,” de Groot told GovTech.

“We make sure we’re as accurate as possible in the short(est) amount of time as possible.”

There are other limits, too. People near an earthquake’s epicentre are so close that they are likely to feel the shaking before receiving a warning, because it still takes time for multiple sensors to trigger, algorithms to analyse and partners to send out alerts.

Residents sometimes say they want to receive ShakeAlerts about any earthquake they can notice, not just those that risk injuring them.

But cellular messaging wasn’t built with this kind of speed in mind, and sending phone alerts to large populations – say, an entire city – uses precious seconds.

Adding recipients who don’t absolutely need to know slows the message, de Groot said.

“The more people that an app has to deliver to, the harder it is to move it quickly,” de Groot said. “Even though the cellphone company has the information within a couple of seconds, it takes time to push it out to the phones just because of their systems.”

Instead, USGS and its partners only send cellphone-based alerts about quakes of magnitude 4.5 or higher, and only to people expected to feel at least weak shaking.
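
That delivery policy amounts to a simple two-part filter, sketched below. The magnitude cutoff is the one cited above; the intensity threshold standing in for “weak shaking” is an assumption for illustration, not USGS code.

```python
MIN_MAGNITUDE = 4.5
MIN_INTENSITY_MMI = 3.0  # assumed stand-in for "weak shaking" at the recipient

def should_alert(magnitude: float, predicted_mmi_at_recipient: float) -> bool:
    """Alert only for sufficiently large quakes, and only recipients who are
    predicted to feel the shaking."""
    return (magnitude >= MIN_MAGNITUDE
            and predicted_mmi_at_recipient >= MIN_INTENSITY_MMI)

print(should_alert(5.1, 4.0))   # True: big enough, and the recipient will feel it
print(should_alert(4.2, 5.0))   # False: below the magnitude threshold
```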

The system faces one other handicap: recognising the Big One.

ShakeAlert is likely to struggle to accurately calculate the impact of any earthquake over magnitude eight, due to lack of data, de Groot said.

The world has seen only four magnitude nine earthquakes since 1952, per USGS, and none in the Northwestern US region where ShakeAlert operates; the last quake of that size there struck more than 300 years ago.

That leaves scientists trying to build models using synthetic data and extrapolations from events in other countries – until one hits the US.

Finding the fire

By the time Gilpin County, Colorado, residents see enough smoke to prompt an emergency call, fires are often already out of hand. In remote areas, a tree hit by lightning might smoulder for weeks without a passerby to detect it before it ignites into a full-blown conflagration, said Gilpin County Emergency Manager Nate Whittington.

And smoke is only a vague indicator of where firefighters need to go, as shifting winds can create confusion, Whittington said.

The chase to locate the flames delays the response, and firefighters in the mountainous region may not discover until they arrive whether the fire sits at the top of a steep climb.

Some sites are impassable to firetrucks and too slow to reach on foot, requiring responders to call for helicopter or plane assistance.

The county hopes a detection tool can help it more rapidly find and accurately pinpoint nascent fires.

That means knowing in advance whether to dispatch an aviation team – saving valuable minutes – and catching lightning-struck trees while they’re still only smouldering.

“I’m hoping that these sensors can make it so that we are fighting fires and not fighting wildfires,” Whittington said.

As of March, the county was in the early stages of adopting a fire detection and location system from the firm N5, whose chief revenue officer, Debra Deininger, said Gilpin is the company’s first commercial deployment.

That system uses sensors mounted throughout target areas that are designed to detect chemical traces, smoke particulates and gases in the air as well as take heat readings, CEO Abhishek Motayed said.

Sensors relay readings to a cloud-based algorithm that analyses the data to update digital maps and deliver alerts and coordinates to responders’ mobile phones.

The algorithms are intended to analyse sensor readings to differentiate between smoke from innocuous sources – home chimneys or campfires – and smoke from dangerous fires.

Seeing whether several sensors light up can also help, with multiple sensor activations more likely to confirm a spreading fire, Motayed said.
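
Put together, the per-sensor screening and the multi-sensor corroboration Motayed describes might be sketched as follows. The field names, thresholds and rules are assumptions for illustration; N5’s actual models are proprietary.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    particulates_ugm3: float   # smoke particulate concentration
    co_ppm: float              # carbon monoxide trace gas
    temp_c: float              # local heat reading

def looks_like_wildfire(reading: SensorReading) -> bool:
    """Crude per-sensor screen (assumed thresholds): elevated particulates
    plus heat or gas, meant to pass over chimney or campfire smoke."""
    return (reading.particulates_ugm3 > 150
            and (reading.temp_c > 60 or reading.co_ppm > 9))

def corroborated_alert(readings: list[SensorReading], min_sensors: int = 2) -> bool:
    """Raise a higher-confidence alert only when several sensors trigger,
    which is more likely to indicate a spreading fire."""
    triggered = {r.sensor_id for r in readings if looks_like_wildfire(r)}
    return len(triggered) >= min_sensors
```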

Gilpin County will pilot N5’s system, and Whittington said one incident during earlier testing was particularly promising.

When the forestry service conducts controlled burns, sensors are brought along to collect data that will help the algorithm learn to distinguish between normal and abnormal air conditions.

The evening following one such burn, a pile of vegetation reignited unintentionally; the system detected the unusually hot heat signature.

This abnormal reading prompted N5 to call Whittington, who then contacted dispatch about sending someone to check it out.

He was still on the phone when a 911 call came in, reporting a fire on the pile.

Had its sensors been programmed with the alert thresholds that were still being designed at the time, the tool would have warned of the fire well before the 911 call came in, Whittington said.

“If sensors had been programmed to where we needed them, that notification would have come in 36 minutes before that 911 call,” Whittington said.

Whittington had several goals when selecting a technology solution, such as being able to withstand severe weather, but said he didn’t plan to evaluate its level of effectiveness based on specific metrics.

Instead, he’s taking a broad look and judging the investment as useful or not based on whether it helps save lives or fails to detect a fire.

“The best testament that I’m going to have to this technology is if one of my sensors goes off and I can evacuate all my people before that fire even gets close enough to their house that I have to worry about it.

“Then I can look back and say, ‘Yes, that worked’,” he said.

Sara similarly defines the success of its flood prediction tool in terms of lives potentially saved and time and effort spared by helping emergency responders better direct their resources and efforts.

In one instance, early flood prediction enabled Sara to warn a federal jet engine facility five hours in advance, giving the facility time to hoist sensitive equipment out of the water’s path.

Filling in the blanks

Technologies are also being used to help get ahead of disasters before they start, such as in Norfolk, where the city’s Office of Resilience uses tools to give residents tailored advice on better protecting their homes against flooding.

Messages around flood risks have traditionally been too general, describing all members of a community as facing the same level of risk.

This overlooks how differences in home construction and the frequency and depth of the flooding events they’re exposed to all influence the risks facing a property, Norfolk Coastal Resiliency Manager Matt Simons said.

The elevation of property grades and the elevation of the home matter, as do factors like the building’s age, foundation type, and the existence of a basement or flood vents.

Blanket advice about flood preparation may not feel urgent to residents, either. Simons hopes to provide more meaningful guidance and better encourage residents to take action through a tool that offers recommendations personalised for their individual situations.

Norfolk offers an online Flood Risk Learning Center tool that allows renters and homeowners to plug in their addresses and a few other details, then view information on their chances of experiencing a flood and how high the waters might reach.

The tool also suggests mitigations residents can take to lower flood insurance premiums and reduce total damage, such as filling in basements and relocating utilities out of crawl spaces.

To inform its calculations, the flood risk system draws on flood zone information from the US Federal Emergency Management Agency (Fema), elevation data collected when Norfolk takes lidar readings of the city, and housing records held by the city Real Estate Assessor’s Office.

But data gaps remain that can make it harder to accurately assess flood risk for some areas.

That’s especially the case for lower-income communities, where residents are less likely to engage in activities like refinancing homes or seeking building permits that can result in elevation certificates being shared with City Hall.

Norfolk is hoping machine learning models can fill in those gaps and produce estimates about homes’ first-floor elevations and other data that isn’t otherwise already available.

“Usually, homes do very well at handling damage in the crawlspace, but once a flood enters that first floor of living space, each additional inch starts to dramatically scale up the amount of damage.

“And so that’s kind of a mystery data point.

“The pilot is doing things like filling in data gaps that are very difficult and expensive to get,” Simons said.

Norfolk piloted the machine learning tools in 2021 on communities for which plenty of data was available.

That made it easier to check and verify – or correct – the AI’s predictions.
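
The gap-filling idea can be illustrated with a small sketch: train a model on homes that do have verified first-floor elevations, then estimate the rest and check the errors on held-out homes, much as the pilot did. The features, model type and synthetic data below are assumptions for illustration, not Norfolk’s actual pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 500

# Stand-in features of the kind the article mentions: lidar ground elevation,
# building age from assessor records, foundation type, basement flag.
X = np.column_stack([
    rng.normal(8, 2, n),          # ground elevation from lidar (ft)
    rng.integers(1900, 2020, n),  # year built
    rng.integers(0, 3, n),        # encoded foundation type
    rng.integers(0, 2, n),        # has basement
])
# Synthetic "known" first-floor elevations for homes with certificates.
y = X[:, 0] + 1.5 + 0.5 * X[:, 3] + rng.normal(0, 0.5, n)

model = GradientBoostingRegressor().fit(X[:400], y[:400])

# Evaluate on held-out homes, mimicking the pilot's check against
# communities where plenty of verified data was available.
errors = np.abs(model.predict(X[400:]) - y[400:])
print(f"mean absolute error: {errors.mean():.2f} ft")
```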

Fema’s Hazus Programme supports emergency response, preparedness and recovery activities by modelling an area’s potential damage from natural disasters.

It can make basic projections drawing on “generalised national databases”, or users can feed it local specific information to get more accurate estimates.

Norfolk hopes the machine learning tool will provide data it can use to produce more accurate Hazus damage models.

Looking ahead

As scientists amass more data over the years and as data-collecting technologies like camera-equipped drones and satellites become cheaper and more widespread, algorithmic predictions are likely to become more precise and applicable to new areas.

AI systems haven’t always had enough good data to assess lightning strikes, for example.

But University of Washington (UW) researchers announced in late 2021 that enough information had accumulated.

They created a machine learning algorithm to anticipate where lightning would strike in the southeastern US.

It reportedly predicts strikes two days sooner than a popular physics-based prediction method could.

“Machine learning requires a lot of data – that’s one of the necessary conditions for a machine learning algorithm to do some valuable things,” researcher and UW associate professor of Atmospheric Sciences Daehyun Kim said.

“Five years ago, this would not have been possible because we did not have enough data, even from (the World Wide Lightning Location Network).” – Governing/Tribune News Service
