
"Despite what watching the news might suggest, most people are averse to dishonest behavior. Yet studies have shown that when people delegate a task to others, the diffusion of responsibility can make the delegator feel less guilty about any resulting unethical behavior. New research involving thousands of participants now suggests that when artificial intelligence is added to the mix, people's morals may loosen even more."
"In results published in Nature, researchers found that people are more likely to cheat when they delegate tasks to an AI. Participants were especially likely to cheat when they were able to issue instructions that did not explicitly ask the AI to engage in dishonest behavior but rather suggested it do so through the goals they set, Rahwan addssimilar to how people issue instructions to AI in the real world."
"It's becoming more and more common to just tell AI, Hey, execute this task for me,' says co-lead author Nils Kobis, who studies unethical behavior, social norms and AI at the University of Duisburg-Essen in Germany. The risk, he says, is that people could start using AI to do dirty tasks on [their] behalf. Kobis, Rahwan and their colleagues recruited thousands of participants to take part in 13 experiments using several AI algorithms:"
Delegation to artificial intelligence increases the likelihood of dishonest behavior by reducing the delegator's felt responsibility. Moral loosening is particularly strong when instructions to AI set goals that imply dishonesty rather than explicitly command it. People increasingly instruct AI to execute tasks, creating a pathway for offloading unethical actions. Experiments involved thousands of participants across multiple tests and used both simple models and commercially available large language models such as GPT-4o and Claude. Classic behavioral tasks, including die-roll reporting, were used to measure cheating under different delegation conditions.
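To make the die-roll paradigm concrete, here is a minimal, hypothetical Python sketch (not the study's own code or data): participants privately roll a die, earn more for higher reports, and sometimes inflate what they report. The cheat_rate parameter and condition labels are assumptions for illustration; in such tasks cheating is inferred statistically at the group level, because individual rolls stay private.

```python
# Illustrative sketch of the classic die-roll reporting task.
# Assumptions: six-sided die, payoff rises with the reported number,
# and a hypothetical "cheat_rate" for how often a report is inflated.
import random

def simulate_reports(n_participants: int, cheat_rate: float, seed: int = 0) -> list[int]:
    """Return reported die rolls for a group with a given propensity to inflate."""
    rng = random.Random(seed)
    reports = []
    for _ in range(n_participants):
        actual = rng.randint(1, 6)    # private, unobserved roll
        if rng.random() < cheat_rate:
            reports.append(6)         # inflate to the highest-paying value
        else:
            reports.append(actual)    # report honestly
    return reports

def mean(xs):
    return sum(xs) / len(xs)

honest = simulate_reports(1000, cheat_rate=0.0)
delegated = simulate_reports(1000, cheat_rate=0.4)  # hypothetical moral loosening

# A fair die yields an expected report of 3.5; a group mean well above that
# is the statistical signature of cheating.
print(f"honest condition mean report:    {mean(honest):.2f}")
print(f"delegated condition mean report: {mean(delegated):.2f}")
```

In the experiments described above, what varied across delegation conditions was who or what produced the report and how instructions were specified, for example explicit rules versus high-level goals that merely imply dishonesty.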
Read at www.scientificamerican.com