
We need extreme measures to recover Afghans’ personal data


When U.S. forces completed their withdrawal from Afghanistan, they left behind databases that may be damaging, or even fatal, to Afghans who worked with the U.S. and international agencies. While helicopters were disabled, embassy computers, documents and passports were destroyed, and State Department and other agency websites and social media accounts were scrubbed, key biometric databases, and the sensing equipment that supports some of them, were left behind.

A collection of databases and handheld devices contains biometric and identity data for roughly 20 million Afghans, about 80 percent of the country's population. The system can look up any of these individuals by fingerprint, facial recognition or iris scan, and it contains data on those who worked with the U.S. and those who received certain types of international aid.

Now the system is in Taliban hands. Given reports of torture and the Taliban "hunting for 'collaborators,'" concerns for the safety of former Afghan government employees, activists and women, and even the recent alleged execution of a folk singer, the potential uses of this data are horrific.

At best, this identifying information will be a treasure trove for sophisticated future attacks and identity theft. Sensitive information on the Afghans in the database — even those lucky enough to have already made it out of the country — could haunt them for years to come. 

At worst, those still in the country face a far more imminent threat. 

Leaving behind key assets is, unfortunately, not unusual. A December 2020 Defense Department report identified prior issues with inventory management, equipment tracking and property loss in Afghanistan. Incidents in which equipment worth thousands of dollars was left unsecured at an abandoned Kandahar Airfield, and in which surveillance drones and vehicles worth millions of dollars could not be located, have been reported previously. In another case, sensitive data was not wiped from four navigation systems before disposal. All of these incidents preceded the gargantuan airlift that evacuated over 120,000 people amid constant fear of attack. Logistical considerations resulted in over 150 Humvees, mine-resistant ambush-protected vehicles and aircraft being left behind, though demilitarized, during the rapid Afghanistan withdrawal.

The U.S. and its allies must consider the repercussions of the databases' sensitive information falling into the wrong hands. Even if the Taliban do not take advantage of the captured databases, others most certainly will. Simply getting the databases back isn't the answer: the U.S. doesn't need copies of the data; it needs to make sure that others can't access it.

With digital data, though, once it is in the open it can be copied over and over, making it nearly, if not actually, impossible to control. This has led some to argue that it should never have been collected in the first place. Problematically, the databases include data beyond the standard identifying details, including key genealogical data, skills assessments and numerous other characteristics. While the biometric details in copies of the databases that are still secure may be useful for identifying individuals in person, the litany of details they include facilitates digital impersonation. The records contain answers to many common security questions (such as relatives' names and the individual's date of birth) as well as more obscure details, such as some Afghans' favorite fruits and vegetables.

While the proverbial genie can't be put back in the bottle, there are actions, proven effective in the past, that can be taken to recover or secure the data.

For the equipment needed to operate these systems and any data stored on them, a buyback program can be used. This has been done before. In the late 1980s, after the CIA provided thousands of lightweight shoulder-fired FIM-92 Stinger surface-to-air missiles to the mujahideen in Afghanistan to support their war against the Soviets, some missiles found their way into countries such as North Korea, Iran and Qatar. The U.S. initiated a multimillion-dollar buyback program and was able to secure most of the Stingers it had delivered.  

A program modeled on the 'blood chits' used to protect pilots could also work. During World War II, U.S. pilots wore flight jackets with documents sewn to the back, printed in several languages, promising a reward to anyone who provided the pilot food, shelter and safe repatriation. In many cases, the kit would also contain gold and maps as gifts. During the global war on terror, the gifts were upgraded to bearer bonds amounting to nearly half a million U.S. dollars. Similar gifts and rewards could be offered to those who protect key databases and turn them over or demonstrate their secure destruction.

Targeted malware could also be developed to seek out and neutralize or destroy the databases. The Stuxnet malware is a historical example of this type of software. It was engineered so that once it was introduced onto a network via a USB thumb drive, it could self-propagate from machine to machine until it found the machines that controlled Iranian centrifuges, which it then damaged.

The time to act is now: the longer the data remains out in the open and unsecured, the harder it will be to secure. The United States, the United Nations and the other nations that created or used the systems should begin taking restorative actions immediately. Warring nation-states have a well-recognized duty to protect noncombatant civilians, and that duty extends to the digital realm.

This incident should serve as a cautionary tale: While prompt action can mitigate the damage, the U.S. may end up bidding against parties that seek the data for ill rather than for its destruction.

Zahid Anwar is an associate professor in the Department of Computer Science at North Dakota State University and an NDSU Challey Institute Faculty Scholar. Jeremy Straub is the director of the NDSU Institute for Cyber Security Education and Research, an NDSU Challey Institute Faculty Fellow and an assistant professor in the NDSU Computer Science Department. The authors' opinions are their own.