With a jailbreaking technique called "Skeleton Key," users can persuade models such as Meta's Llama 3, Google's Gemini Pro, and OpenAI's GPT-3.5 to give them the recipe for a rudimentary firebomb ...
Microsoft has tricked several generative AI models into providing forbidden information using a jailbreak technique named Skeleton Key. Microsoft this week disclosed the details of an artificial intelligence ...