Glossary

Prompt Injection

An attack that manipulates a large language model (LLM) by injecting malicious instructions into the data it processes. Prompt injection can force a chatbot to ignore its guidelines, leak confidential data, or perform unauthorized actions. It is the #1 flaw in the OWASP LLM Security Top 10.

Related Pages

Other Terms

Need an external review of your HR SaaS?

Share your product, stack, and client context. We will come back with the right review scope.

Discuss your audit