EXECUTIVE SUMMARY: A newly discovered jailbreak technique called Skeleton Key – a form of direct prompt injection attack – affects numerous generative AI models.
Source: Cyber Talk