Data can improve disaster preparedness and response
The devastation is undeniable: Hurricane Harvey drowned Texas, Hurricane Irma ripped along the Florida coastline, and eyes now watch with caution as Hurricane Jose churns in the Atlantic and Hurricane Maria heads for Puerto Rico.
The relentless headlines serve as a reminder that the United States cannot escape extreme weather. The death toll rose in all of these places, but the truth is that those who survive will be left to pick up the pieces for years to come.
The damages from Harvey alone are estimated at around $75 billion, and Hurricane Maria is estimated to cost between $40 billion and $85 billion. Once more pressing concerns like food and water have been addressed, thousands must sort through rotted and molding possessions to rebuild homes and entire communities. Climate change all but guarantees that it will only get worse. Scientists agree that rising sea levels and warming oceans make deadly storms deadlier.
While the storms themselves are unavoidable, there are proactive measures we can take to lessen the damage and destruction. There is data we can use to plan more appropriately for what we know is coming. Right now, the problem is that we are not preparing as thoroughly as we could.
Analysts have mapped out the risks quite extensively. We know that your risk factor goes up if you’re poor: low-income neighborhoods lack protective infrastructure and are less likely to have the flood insurance necessary to rebuild. We know, too, that the National Flood Insurance Program is ill-equipped for climate change, miscalculating the risks.
There are other hazards: aging infrastructure, level land prone to flooding, and zoning that fails to keep pace with development. These are crucial bits of data we should be actively using to prepare.
It can help identify and fortify the neighborhoods sitting in floodplains and target communities where infrastructure upgrades could minimize damage and potentially save lives. Think about the subsidized Houston housing erected on land designated as high risk by the Federal Emergency Management Agency.
We can use it to pre-plan the most efficient evacuation routes for people we can anticipate will need to get to safety quickly. Think of the tumult in Houston. Texas Governor Greg Abbott called for many to evacuate, while local leaders advised residents to stay put. That is largely because such massive evacuations have, in the past, been prone to widespread chaos and confusion, making staying put the safer choice, but by what margin? By analyzing the data in anticipation of a storm, we could communicate effective evacuation policies and plans specific to certain neighborhoods.
It’s important to highlight that this data exists. The problem is one of access: who has it, and who is capable of analyzing it? Right now the answer is not many, so we politicize the preventable; we await the politicians who will say it was a once-in-a-lifetime storm that no one could have foreseen.
Data is squirreled away, and analysis of public health and safety trends is kept out of the public hands where it could do real good. It’s partly finances: governments haven’t prioritized spending enough to offer competitive salaries for data analysts, who are easily enticed away by private firms. Once they leave, the public is left sitting on troves of data with no one skilled enough to mine them.
That skills gap is increasingly apparent at the Federal Emergency Management Agency, which does the crucial work of pinpointing areas that are particularly at risk during extreme weather events.
President Donald Trump left the top FEMA job vacant for the first five months of his presidency, and even now that it is filled, 14 of the agency’s 47 full-time leadership roles are filled only in an “acting capacity.” There is concern that such a shortage will deeply hamper recovery for communities hit by Hurricanes Harvey and Irma.
Indeed, the lack of analysts is a huge impediment not just to recovering from extreme weather but also to preparing for it. Without them, we think short term in a manner that is both reductive and repetitive: we build again and again in floodplains; we anticipate the storm but not the destruction.
Change would require a substantial shift in public policy toward openness and transparency. We need policies that make data readily accessible while remaining mindful of privacy. Just think of what some nonprofits are already accomplishing with the data they have.
Eater Houston, an online food and restaurant guide, has partnered with shelters and frontline workers to keep people affected by Hurricane Harvey well fed.
Throughout the storm, the site has provided valuable information about which food establishments are still open and which are closed. It intends to track, in real time, the food and restaurant industry’s recovery in a way the whole community can access.
That’s just one community example. To truly tap into what we have requires federal action. We can’t leave it to individual municipal governments, or even state governments, to decide how much of a priority it is to pay analysts a high enough salary to keep them. Such a strategy would continue to disproportionately leave poorer communities at risk in a crisis.
If the federal government were to take the lead, it could ensure that the public administrators tasked right now with handling such data are actually prepared to use it.
But it requires so much more than training — the government needs to facilitate a comprehensive public network of policymakers and public administrators who have the skills required to analyze data and take action based on it. They must promote training and jobs, better salaries, and more opportunities to connect with experts in the field to prevent isolated decision-making and to make dealing with data more of an industry standard.
We need a network so that when officials are approving things like new building codes, they aren’t just thinking one year ahead; they’re thinking half a century ahead. We need to expand open data so we can stop touting the probability of a once-in-500-year flood as if the damage were inescapable. It isn’t.
This is our reality and we need to start using all the data at our disposal to prepare.
Anirudh Ruhil is a professor in the Master of Public Administration program at Ohio University’s Voinovich School of Leadership and Public Affairs. He also serves as a quantitative research methodologist/data analyst for the School.