Bringing cognitive science to the intelligence community
Our brains come preprogrammed with certain “heuristics” — mental shortcuts — that help us process and act on the incredible amount of data the world
throws at us every day. In most circumstances, these heuristics are immensely valuable. They help us save brain energy and act quickly and efficiently.
What cognitive science has shown, however, is that there are certain circumstances in which these heuristics backfire — they can bias our thinking in ways that lead to bad decisions. These cognitive flaws are universal, found even in the people who can least afford to be swayed by them.
Our government employs tens of thousands of highly trained intelligence analysts to identify potential security threats and determine the best way to mitigate them. These people are brilliant and deeply dedicated to the United States. But they’re still human, so they’re susceptible to these ingrained biases.
Government intelligence agencies have started to invest in cognitive research. For example, the 2004 Intelligence Reform and Terrorism Prevention Act directed the intelligence community to regularly conduct “alternative analysis…of the information and conclusions in intelligence products” and formulate “a
long-term strategy for scientific advances in the field of intelligence.”
So far, however, only a few small, disconnected projects have emerged.
Federal intelligence agencies would greatly benefit from directing more investment into cognitive research. Key findings could be translated into policies that seriously improve how this country goes about collecting and analyzing intelligence data.
The enemies of this country are evolving — they’re getting better at cloaking their activities and blending into the peaceful citizenry. The United States should evolve, too.
One common cognitive error is “confirmation bias,” in which someone actively seeks out information supporting a pre-existing belief, while ignoring
evidence that might contradict that belief. Consider an employer who has already decided that a job applicant is extremely qualified. There’s a good
chance she’ll ignore evidence that contradicts her preconception.
Another important bias is “anchoring,” in which people rely too heavily on an initial reference point when making a judgment, even if that point is irrelevant. A consumer looking for a used car may fixate on the year it was manufactured and its odometer reading, ignoring the car’s maintenance record.
In everyday life, biases such as these tend to have consequences that are, at worst, mildly negative. Perhaps you hire someone who doesn’t work out,
overpay for a car, or get stuck in an intractable argument during a company board meeting.
But for intelligence analysts, the consequences are much more severe. It’s all too easy to anchor on prior intelligence estimates and pay heed only to evidence that supports prior conclusions.
By seriously investing in cognitive science research and applying its findings, policymakers could make America safer. Just as policymakers regularly invest in technological upgrades, they should invest in neural upgrades, as well. Throughout this process, researchers must make sure analysts are intimately involved in the design, development, and implementation of new protocols aimed at counteracting cognitive bias.
Our country’s intelligence analysts are committed to protecting this country. But like everyone else, they’re relying on three pounds of neurons to make their decisions. And embedded in those neurons are biases that, left uncorrected, can impair their analysis.
Kretz is the lead cognitive scientist for Raytheon User-Centric Analytics, a former intelligence analyst, and a doctoral candidate in cognition and neuroscience at the University of Texas at Dallas.