AI as Emotional Support | The New York Times (5/3/23)

"Pi’s boundaries are easy to find. When I tried picking fights, I mostly received kindness in return. “I appreciate you saying that,” Pi’s text gently unfurled on my screen. “I think it’s important to see things from all perspectives, and not to just focus on the negative.”

Over time, that relentless balance wore on me, making my instinct to argue seem ridiculous. Which, I realized, was precisely the point."

Image Credit: Janice Chang, NYT

Meet the new emotional chatbot Pi! Developed by the start-up Inflection AI, Pi has been released to give people a supportive companion. Unlike chatbots built to answer queries or boost productivity, Pi offers personality and conversational flair, creating a feeling of companionship. Critics, however, have raised concerns about privacy and the potential to enable harmful behavior, warning that letting a chatbot act as a pseudotherapist for people with serious mental health challenges carries obvious risks.

What does the creator say? Mustafa Suleyman, CEO of Inflection AI, has stressed the technology's limitations and emphasized that the tool should be transparent about its boundaries and capabilities. All in all, Pi is a fascinating example of the growing push to code personalities into AI bots, and of their uses beyond providing factual knowledge.
