This is how malicious hackers could exploit Gemini AI to control a smart home.

This Wired article shows how an indirect prompt injection attack against a Gemini-powered AI assistant could cause the bot to curse in responses and take over smart home controls by turning on the heat unexpectedly or opening blinds in response to saying “thanks.”
In a report dubbed “Invitation is all you need” (Sound familiar?), their Google Calendar invite passed instructions to the AI bot that were triggered by asking for a summary. Google was informed of the vulnerabilities they found in February and said it has already introduced “multiple fixes.”
[Media: https://www.youtube.com/watch?v=qLcR0epseOE]