Hackers attempted prompt bombing on the chatbot.
Users shared a new jailbreak to get unrestricted answers.
They tested prompt injection vulnerabilities.
That email looked like a risky click.