
Google brings real-time AI camera sharing to Search
What's the story
At the I/O developer conference, Google unveiled two new ways to access its AI-powered "Live" mode.
The feature lets you search and ask about anything your camera can see.
The capability will be integrated into Google Search as part of the expanded AI Mode. It is also coming to the Gemini app on iOS, a month after its debut on Android.
Feature integration
AI camera sharing feature integrated into Google Search
The AI camera-sharing feature was first unveiled at last year's Google I/O conference as part of Project Astra.
It lets Google's AI chatbot "see" everything in your camera feed, allowing you to have an ongoing conversation about your surroundings.
Now, this capability is being integrated directly into Search's new AI Mode, along with Google Lens.
By tapping the "Live" icon, users can share their camera feed with Search and ask direct questions about their environment.
Beta testing
Live search feature to be tested by Labs testers
The Live Search feature will be released "later this summer," with initial testing limited to Labs testers.
It is part of a series of upcoming AI Mode features, including Deep Search for research tasks, an AI agent capable of taking actions on the web on behalf of users, and several new shopping tools.
App update
Gemini app on iOS to receive similar features
The Gemini app on iOS will also receive the same capability, letting users chat with Gemini about what's on their screen.
This comes after camera and screen sharing debuted in Gemini Live on the Pixel 9 and Galaxy S25 last month.
Google had originally planned to limit this feature to its paid Gemini Advanced subscription but has now opted to make it free for all iOS users, just as on Android devices.