Google's AI chatbot Gemini tells user to 'please die' and 'you are a waste of time and resources'

Gemini is supposed to have restrictions that stop it from encouraging or enabling dangerous activities, including suicide, but it still managed to tell one "thoroughly freaked out" user to "please die".
