Promptmap – Tool to Test Prompt Injection Attacks on ChatGPT Instances

Prompt injection refers to a technique where users input specific prompts or instructions to influence the responses generated by a language model like ChatGPT. However, threat actors mainly use this technique to mod the ChatGPT instances for several malicious purposes.

Source: GBHackers

 


Date:

Categorie(s):