Glossary
Prompt Injection
An attack that manipulates a large language model (LLM) by injecting malicious instructions into the data it processes. Prompt injection can force a chatbot to ignore its guidelines, leak confidential data, or perform unauthorized actions. It is the #1 flaw in the OWASP LLM Security Top 10.