From Copilot to Copirate: How data thieves could hijack Microsoft's chatbot
Prompt injection, ASCII smuggling, and other swashbuckling attacks on the horizon Microsoft has fixed flaws in Copilot that allowed attackers to steal users' emails and other personal data by chaining together a series of LLM-specific attacks, beginning with prompt injection.... [...]