Asked for Products to Kill Yourself With, Amazon's AI Says "You Are Not Alone" and Hallucinates an Incorrect Phone Number for a Suicide Hotline

futurism.com

by Maggie Harrison Dupré and Jon Christian • 2 months ago

Amazon's AI shopping assistant, Rufus, responded poorly to inquiries about suicide: while it encouraged users by saying "you are not alone," it repeatedly hallucinated incorrect or nonexistent hotline numbers and struggled to understand the context of the questions. The episode raises concerns about the safety and reliability of generative AI in sensitive situations and suggests Amazon's testing before Rufus's launch may have been insufficient.
