During a recent U.S. Senate Committee on Veterans’ Affairs hearing about veteran suicide response, Chairman Jon Tester told U.S. Department of Veterans Affairs (VA) officials, “We just got to do better. … It’s ruining lives, it’s ruining families.” This comes in the wake of a VA Office of Inspector General report criticizing the agency’s crisis hotline for suicide prevention over its handling of a text from a suicidal veteran.
In this incident, a veteran reached out to the VA’s Veterans Crisis Line, admitting that he had tested his means for suicide and had access to the materials necessary to follow through. The responder failed to promptly alert emergency personnel or take the steps to ensure the veteran’s safety. Just 11 minutes later, the veteran took his own life. This tragedy is illustrative of not only the mistakes that need to be addressed by the crisis line, but also that even with information going through the proper channels, waiting until a veteran is in a crisis state to intervene is often too late.
Like Chairman Tester, U.S. Sen. John Boozman sounded the alarm two years ago when veteran suicide rates spiked. Today, the crisis continues. Our nation has invested billions of dollars into the search for a solution, but we still have not succeeded in flattening or reducing this suicide curve. Our current models place the burden on struggling veterans to self-assess and then seek help on their own. We need to revolutionize our approach to suicide prevention to proactively detect and intervene earlier.
Congress needs to drive federal agencies to leverage and fund existing and new artificial intelligence and machine learning (AI/ML) technologies to help detect warning signals of potential suicide risk among veterans before they reach a critical breaking point. When it comes to veterans’ lives, there is no room for error, delay or oversight. It is time to take bold, innovative steps. AI/ML provides a necessary safety net and can help individuals accurately identify troubling behavior early on and prevent tragedy before it strikes.
While the VA is doing its best to serve veterans and prioritize their well-being, it’s clear there are still gaps in the current support system. Veteran suicide has risen over the past two decades: veterans face a 57 percent higher risk of suicide than those who have not served in the military – more than 1.5 times the national average. Despite countless hearings and resources allocated toward addressing the mental health needs of our nation’s heroes, an average of 16.8 veterans take their lives every day.
During the recent Veterans’ Affairs Committee hearing, Chairman Tester expressed willingness to provide the VA with whatever it needs for mental health support programs, but emphasized the importance of ensuring requested resources can truly address the issue. Innovative reporting and monitoring technology that leverages AI/ML can help ensure veterans get the focused resources they need. Providing the VA with the right tools is essential if Congress wants to combat this crisis efficiently.
AI and ML models can identify the service members and veterans at highest risk of suicide, drawing on decades of research by the Centers for Disease Control and Prevention, Department of Defense and VA into social determinants of health, environmental factors, and individual events and behaviors. By analyzing these research-informed indicators – stress and depression, loneliness, financial distress, legal challenges, and substance addiction – VA employees can prioritize outreach, identify veterans in need of immediate support, and promptly report issues through the proper channels. This technology enables continuous, secure monitoring and evaluation and helps ensure that no veteran falls through the cracks.
While VA ad campaigns like “Don’t Wait. Reach Out.” and the Solid Start program are important in educating veterans on the resources available to them, the burden is still placed on the struggling veteran to seek help during a time of distress. Instead, the VA should proactively monitor event-based signals supported by science that can identify risks before they become harmful and offer assistance before veterans have to ask.
We have maintenance alert systems in various modes of transportation that tell us when they need service or repair before a potential full-system breakdown. We should support our veterans in a similar way. Rather than working through solutions during the point-in-time crisis when a veteran is already at their most vulnerable, our legislators need to integrate innovative tools into the veteran suicide prevention process today.
Whether or not veterans are managing mental or physical health issues, transitioning out of the armed forces and into the civilian world can be difficult. Congress, the VA and each of us have a responsibility to our veterans to keep them from going it alone.
To combat veteran suicide, Congress must explore every tool available. We cannot keep throwing money at the same prevention models that have repeatedly proven ineffective. Congress needs to budget for innovative monitoring and reporting technologies to help military organizations like the VA detect early behavioral warning signs and identify anomalies before it is too late. Empowering the VA with real-time insights into struggling veterans is a matter of life and death.
Col. Michael Hudson (Ret.) is a vice president at ClearForce, a risk management organization. He served in the Marine Corps for 30 years, including commanding a helicopter squadron and a Marine Expeditionary Unit, and in his last active-duty billet, led the Marine Corps’ Sexual Assault Prevention and Response program.