Are A.I. Therapy Chatbots Safe to Use?

Psychologists and technologists see them as the future of therapy. The Food and Drug Administration is exploring whether to regulate them as medical devices.

Brittany Bucicchia began using an A.I. therapy chatbot after dealing with mental health struggles. Credit: Kendrick Brinson for The New York Times
After having suicidal thoughts this year, Brittany Bucicchia checked herself into a mental health facility near her home in rural Georgia.

When she left several days later, her doctors recommended that she continue treatment with a psychotherapist. But she was wary of traditional therapy after frustrating experiences in the past, so her husband suggested an alternative he had found online — a therapy chatbot, Ash, powered by artificial intelligence.

Ms. Bucicchia said it had taken a few days to get used to talking and texting with Ash, which responded to her questions and complaints, provided summaries of their conversations and suggested topics she could think about. But soon, she started leaning on it for emotional support, sharing the details of her daily life as well as her hopes and fears.

Cade Metz is a Times reporter who writes about artificial intelligence, driverless cars, robotics, virtual reality and other emerging areas of technology.
