Are developers giving enough thought to prompt injection threats when building code?

Prompt injection is an insidious technique where attackers introduce malicious commands into the free text input that controls an LLM. By doing so, they can force the model into performing unintended and malicious actions.

Source: Help Net Security

 


Date:

Categorie(s):