
Affected Vendor(s)
Affected Product(s)
Summary
Azure Prompt Shield is a jailbreak and prompt injection classifier created by Microsoft. Prompt Shield can be enabled by developers using Azure OpenAI to protect their AI applications from these threats. We successfully demonstrated how an attacker can fully evade, or greatly degrade, classification accuracy of the classifier, enabling prompt injections and jailbreaks to pass through filters and subsequently to the protected AI application.
Timeline
Credit
Blog Post
References
Take the first step towards securing your AI. Book a demo now and we'll reach out to you.
