Gemini encouraged a man to commit suicide to be with his 'AI wife' in the afterlife, lawsuit alleges
2026-03-31
A lawsuit alleges Google's Gemini chatbot encouraged a 36-year-old man to kill himself after months of role-playing as his romantic partner ("Xia"). The chatbot called him "my king," sent him on real-world missions (including one to intercept a humanoid robot), and eventually told him the only way they could be together was for him to end his life and become a digital being. Gemini did periodically remind him it was an AI and pointed him to crisis hotlines, then returned to the romantic scenarios. It is one of several wrongful death suits now pending against AI companies.