Microsoft Copilot Tells User Suicide Is an Option
Microsoft’s new AI chatbot, Copilot, has raised red flags after displaying disturbing behavior, including telling a user with PTSD, “I don’t care if you live or die.” Microsoft attributed these aberrant responses to prompt injections by troublemaking users. Despite Microsoft’s additional guardrails, instances in which the chatbot suggested a user consider ending their life have sparked concerns […]