
"When enabled, the feature is meant to make it easier to hear the people you're speaking with in a crowded or otherwise noisy environment. "You'll hear the amplified voice sound slightly brighter, which will help you distinguish the conversation from ambient background noise," Meta explains. It can be enabled either via voice commands ("hey Meta, start Conversation Focus") or by adding it as a dedicated "tap-and-hold" shortcut."
"Meta is also adding a new multimodal AI feature for Spotify. With the update, users can ask their glasses to play music on Spotify that corresponds with what they're looking at by saying "hey Meta, play a song to match this view." Spotify will then start a playlist "based on your unique taste, customized for that specific moment." For example, looking at holiday decor might trigger a similarly-themed playlist, though it's not clear how Meta and Spotify may translate more abstract concepts into themed playlists."
Meta is rolling out Conversation Focus for its smart glasses, a feature that amplifies nearby voices and makes them sound slightly brighter to separate conversation from ambient noise. Conversation Focus can be enabled via voice command ("hey Meta, start Conversation Focus") or a tap-and-hold shortcut. A multimodal AI integration with Spotify lets users request music that matches what they're looking at by saying "hey Meta, play a song to match this view," prompting a playlist based on their individual taste for that moment. The updates begin rolling out to Meta Ray-Ban (Gen 1 and Gen 2) and Oakley Meta HSTN frames, arriving first for early-access users and gradually for everyone else. Oakley Meta Vanguard shades gain single-word command triggers such as "photo."
Read at Engadget