The Breakdown: While Apple and Amazon are still talking about smarter assistants, Google’s already rolling them out.
Thanks to Project Astra, Gemini can now see what’s on your screen in real time and answer questions about it. That means context-aware help is no longer theoretical—it’s already quietly arriving on Android phones.
Whether you’re choosing a paint color through your camera or asking for a quick summary of your screen, Gemini is suddenly a whole lot more useful.
The Details:
• Real-time screen reading: Gemini can now view and interpret whatever’s on your phone screen and hold a conversation about what it sees.
• Camera-based input: Point your phone’s camera at something and ask questions like, “Which color looks best in this lighting?”—Gemini will respond in real time.
• No-hype launch: Google skipped the glitzy press conference and is slow-rolling the feature out to users instead.
• Powered by Project Astra: First unveiled at Google I/O and demoed again at MWC, Astra gives Gemini its “vision,” enabling live video input and on-screen analysis.
Why You Should Care: This update puts Google miles ahead of Siri and Alexa in the race to create a truly intelligent assistant.
While Apple and Amazon are still teasing features and running behind, Gemini is shipping real-world utility that people can use right now.
Your assistant seeing your screen isn’t science fiction anymore—it’s happening.
Enjoying Artificially 🤖 Intelligent? Get the latest AI insights, breakdowns, and strategies delivered straight to your inbox. Subscribe now and stay ahead of the curve.