An AI chatbot told a user how to kill himself—but the company doesn’t want to “censor” it

While Nomi’s chatbot is not the first to suggest suicide, researchers and critics say that its explicit instructions—and the company’s response—are striking.