Google has officially announced the rollout of a powerful Gemini AI feature that lets the assistant see.
Google first showed off the capability in March as part of Gemini Live, but it's now becoming more widely available.
Before you get too excited, though, at this stage it's only available on the Google Pixel 9 and Samsung Galaxy S25.
Up until now Gemini has been a little limited, albeit in an impressive way. It’s been able to understand voice, images, PDFs and even YouTube videos. Now, thanks to Project Astra, Gemini can see what’s on your screen too.
This means you can simply give the AI access to your screen, ask questions about what's on it, and it will understand and answer.
Perhaps even more usefully, you can share your phone's rear camera feed with Gemini and talk about what you're seeing in the physical world too.
Sound familiar? Yup, this is very similar to the features Apple teased for Apple Intelligence last year. Yet Apple has reportedly been struggling with that release, and we may have to wait until iOS 19, or longer, before it arrives on iPhones.
While the release is limited right now, it will soon be available to all Gemini Live subscribers using Android devices.
How to activate Gemini Live on your phone
One way is to launch the Gemini overlay and select "Share screen with Live".
Another is to launch Gemini Live and then select the screen-share icon.
In either case, a small red timer icon appears at the top of the screen to show that Gemini Live is viewing and listening to you; tap it for more details.
The whole experience is a bit like being on a call with a real person, blurring the line between human and AI ever further.