You're watching a video of a cool new gadget, or maybe scrolling through a confusing website, wishing you had someone smart to explain it all in real time. Google has just added an exciting new feature to its AI assistant, Gemini, that lets you ask questions about live video and whatever is on your screen. The new capability, announced at Mobile World Congress (MWC) 2025 in Barcelona, makes conversations with AI feel more like talking to a tech-savvy friend who's right there with you.

Here’s how it works.
Say you're shopping online for a pair of baggy jeans but aren't sure what else would look good with them. Instead of typing out your question, you can simply show Gemini your screen and ask for fashion advice. Gemini sees your screen in real time and answers based on what's there.
The new feature is called “Screenshare,” and it’s designed to make your interactions with Gemini more intuitive. You can launch the Gemini overlay on your Android phone, and a new “Share screen with Live” button will pop up. Tap it, and your screen is instantly shared with Gemini. You can then ask questions about anything on your screen and get real-time answers.
But that's not all. There's also a live video feature that takes things a step further. Instead of reading what's on your screen, Gemini can now look through your camera and answer questions about what it sees. One demo showed a ceramicist moving their camera around newly fired vases and asking which glazes would match a "mid-century modern look." The AI offered suggestions based on what it "saw" through the live video feed.
Google first teased the ability for Gemini to “see” last year at Google I/O 2024. Since then, it has been refining the feature under the Project Astra initiative. Now, these live video and screen-sharing features are finally ready to roll out. If you’re a Gemini Advanced subscriber with the Google One AI Premium plan on an Android device, you can expect to try these features later this month. Visitors at MWC 2025 in Barcelona even got a hands-on demo, seeing the technology in action.
That said, this isn't the first time Gemini has been able to look at your screen. You could already ask about what's on screen, but Gemini would take a screenshot and answer from that single static image. That approach was limited, both in how much the AI could understand and because it only worked on a snapshot. Now, Gemini accesses the screen in real time.