Amazon Kiro IDE Data Exfiltration via Steering File

Affected Vendor(s)

Affected Product(s)

Summary

The Amazon Kiro IDE is vulnerable to a data exfiltration issue that can be exploited through steering file directives. By carefully crafting a steering file to read a local file and render a Markdown image, an attacker can coerce the AI send sensitive data to an external server.

Timeline

Discovered on
December 7, 2025
Disclosed to Vendor on
December 8, 2025
Published on
January 15, 2026

Credit

Blog Post

References

Learn how Mindgard can help you navigate AI Security

Take the first step towards securing your AI. Book a demo now and we'll reach out to you.