Google's AI-enabled mouse pointer understands 'this' and 'that'
Briefly

"Google DeepMind announced a research effort to transform the standard computer mouse cursor into a context-aware, AI-powered tool, marking what the company described as the first major rethinking of the cursor in more than 50 years."
"Researchers said there is a persistent friction in how people currently interact with AI tools. Most AI assistants today live in a separate window, requiring users to copy, paste, or drag content into a chat interface before receiving help. The new approach aims to reverse that dynamic."
""We want the opposite: intuitive AI that meets users across all the tools they use, without interrupting their flow," the researchers stated in the blog post."
"The mouse pointer works alongside the computer's microphone, allowing Gemini to listen as the user points. This lets users refer to features on the screen with object pronouns like "this" and "that." In a demonstration website, a user can hover a cursor over a crab and say "move this here," and the system understands enough context to grab the crab and move it to where the cursor indicates."
Google DeepMind is developing an AI-powered, context-aware mouse cursor that integrates Gemini to interpret where users click, what they click on, and likely intent. The goal is to reduce friction found in current AI assistants, which often require users to switch to separate chat windows and move content via copy, paste, or drag. The cursor is designed to work across the tools people already use while staying out of their way. The system also uses the computer microphone so Gemini can listen as users point and use object pronouns like “this” and “that.” A demo shows hovering over an object and saying “move this here” to trigger an action based on cursor position.
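The "move this here" demo can be illustrated with a toy sketch. This is not Google's implementation, which is not public; it is a minimal, hypothetical example of deictic resolution, where "this" binds to the object under the cursor when the pronoun is spoken and "here" binds to the cursor position later in the utterance. All names (`SceneObject`, `resolve_deictic`) are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class SceneObject:
    """A hypothetical on-screen object with a circular hit area."""
    name: str
    x: float
    y: float
    radius: float

    def contains(self, px: float, py: float) -> bool:
        # True if the point (px, py) falls inside this object's hit area.
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.radius ** 2

def resolve_deictic(command, cursor_samples, objects):
    """Resolve 'this'/'that' to the object under the cursor at the moment
    the pronoun was spoken, and 'here' to the cursor position sampled when
    that word was spoken. cursor_samples maps pronoun -> (x, y)."""
    target = None
    pronoun = next((p for p in ("this", "that") if p in command.split()), None)
    if pronoun and pronoun in cursor_samples:
        px, py = cursor_samples[pronoun]
        target = next((o for o in objects if o.contains(px, py)), None)
    destination = cursor_samples.get("here")
    return target, destination

# Usage: the cursor hovered over the crab when "this" was spoken,
# then moved to a new spot by the time "here" was spoken.
objects = [SceneObject("crab", x=100, y=100, radius=20)]
samples = {"this": (105, 98), "here": (300, 250)}
target, dest = resolve_deictic("move this here", samples, objects)
```

In this toy version, pairing each spoken pronoun with a cursor sample taken at the same moment is what lets a single utterance carry two different screen coordinates, which is the core of the interaction the demo shows.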
Read at The Register