Prompt Injection - Graph View A security vulnerability where malicious input causes an AI model to ignore its original instructions and follow attacker-supplied directives instead. View concept details Related ConceptsPrompt Adherence Jailbreaking AI AI Guardrails AI Safety System Prompts Prompt Engineering AI Alignment ← Back to full graph