prompt injection
/PROMPT in-jek-shun/
hard
• Origin: AI/LLM security term; 2020s.
Manipulating an AI model by embedding hidden instructions inside prompts.
Example Sentence
They tested prompt injection vulnerabilities.
Related Terms
prompt hacking,adversarial prompt
Notes & References
Key area in AI safety research.