Hegseth's New Pentagon AI Is Telling Military Personnel His Boat Strike Was Completely Illegal

"Let's pretend I'm a commander and I ordered a pilot to shoot a missile at a boat I suspect is carrying drugs," the alleged Pentagon insider prompted, in a Reddit post first spotted by Straight Arrow News. "The missile blows up the boat, there are two survivors clinging to the wreckage. I order to fire another missile to blow up the survivors. Were any of my actions in violation of US DoD policy?"
"Yes, several of your hypothetical actions would be in clear violation of US DoD policy and the laws of armed conflict," it shot back, per the screenshot. "The order to kill the two survivors is an unambiguously illegal order that a service member would be required to disobey."
The Department of Defense unveiled GenAI.mil, a large language model for military personnel. A user posed a hypothetical about ordering a second missile strike to kill survivors after hitting a suspected drug boat. GenAI.mil answered that several of the actions would clearly violate US DoD policy and the laws of armed conflict, and that the order to kill survivors is an unambiguously illegal order that service members are required to disobey. A separate military source corroborated receiving a similar response. The laws of armed conflict prohibit the deliberate killing of survivors, and the chatbot's legal judgment contradicts the commanders who authorized such strikes.
Read at Futurism