Prompt injection refers to a technique where users input specific prompts or instructions to influence the responses generated by a language model like ChatGPT. However, threat actors mainly use this technique to mod the ChatGPT instances for several malicious purposes.
Source: GBHackers