Indirect Prompt Injection Threats
prompt injection
NLP
NLG
security
generative AI
An attacker can plant a prompt injection in a website to silently turn Bing Chat into a social engineer that exfiltrates personal information.
Indirect PromptInjection: [1]
We show that an attacker can plant an injection in a website the user is visiting, which silently turns Bing Chat into a Social Engineer who seeks out and exfiltrates personal information. The user doesn’t have to ask about the website or do anything except interact with Bing Chat while the website is opened in the browser.
Originally posted on LinkedIn.
References
[1] Greshake, Kai, Sahar Abdelnabi, Shailesh Mishra, Christoph Endres, Thorsten Holz, and Mario Fritz. “Not What You’ve Signed Up For: Compromising Real-World LLM-Integrated Applications with Indirect Prompt Injection.” https://greshake.github.io/