Gemini Live Lets AI See the World Like You Do
Google has launched Gemini Live, a feature that allows its AI chatbot to interact with your surroundings in real time. Now rolling out on Pixel 9 and Samsung Galaxy S25 phones, the new capability lets users point their cameras at objects or share their screens and ask the AI questions about what it sees.

How It Works
With the touch of a button, users can activate the live video function and ask questions about whatever the camera sees. In one demo, a user pointed their phone at an aquarium tank and asked Gemini Live to identify the fish. Users can also share their screen while browsing a shopping website and ask the AI to compare products or suggest styles.
Who Gets It and Where
The feature is currently available to Gemini Advanced subscribers and supports 45 languages in select countries. It is restricted to users aged 18 and above and does not work with education or enterprise accounts. While it is initially rolling out to the latest Pixel and Galaxy devices, Google has confirmed that other Android phones will receive the update soon.
From Project Astra to Reality
Gemini Live first made headlines at Google I/O as part of the futuristic Project Astra. Now, it is turning that promise into a reality. Reddit users have even spotted it on Xiaomi phones, suggesting a broader release is underway.
What This Means for AI
This upgrade puts Google's AI in a whole new category: one that does not just hear or read, but sees. Whether you are comparing sneakers, reading a restaurant menu, or trying to identify artwork, Gemini Live brings real-time visual context to conversational AI.