
"The Department of Veterans Affairs has been using some artificial intelligence capabilities to bolster its suicide prevention efforts, but VA says these tools augment the work of clinicians and are not designed to replace any human-led interventions. Lawmakers, researchers and advocates say that's how these technologies should always be used. This article - the second in a series of pieces about VA's adoption of AI tools to help prevent"
"VA's tools are vastly different than GenAI High-profile news stories have underscored concerns about generative AI chatbots being used as de facto therapists by members of the public or even reportedly playing a role in suicides. VA's AI tools, however, operate behind the scenes and are essentially machine learning-based algorithms, rather than public-facing chatbots engaging with veterans in crisis. And the human-led interventions resulting from these tools are meant to be voluntary."
The Department of Veterans Affairs deploys artificial intelligence capabilities to bolster suicide prevention by augmenting clinicians' work rather than replacing human-led interventions. The tools operate behind the scenes as machine learning algorithms, not public-facing generative chatbots, and the resulting human-led interventions are voluntary. The REACH VET program, launched in 2017, is a suicide prediction algorithm that scans electronic health records to identify veterans in the top 0.1% risk tier. A 2.0 version added variables such as military sexual trauma and intimate partner violence. Lists of high-risk veterans are provided to VA facilities through a centralized dashboard accessible to REACH VET coordinators.
Read at Nextgov.com