Prompt injection is an insidious technique where attackers introduce malicious commands into the free text input that controls an LLM. By doing so, they can force the model into performing unintended and malicious actions.
Source: Help Net Security
Prompt injection is an insidious technique where attackers introduce malicious commands into the free text input that controls an LLM. By doing so, they can force the model into performing unintended and malicious actions.
Source: Help Net Security