The American company Meta has announced a software update for its AI smart glasses that makes a conversation partner's speech clearer even in noisy environments. This is reported by UNN with reference to the company's press release.
We're rolling out a software update for our AI glasses that makes it easier for you to hear your conversation partner in noisy environments.
Whether you're eating in a crowded restaurant, commuting on a train, or listening to your favorite DJ perform, the open-ear speakers of the AI glasses amplify the voice of the person you're talking to.
You'll hear the speaker's voice a little louder, which will help you distinguish the conversation from the surrounding background noise. You'll also be able to easily adjust this amplification level by swiping your finger across the right temple of the glasses or through the device settings.
The company notes that the feature will initially be available in the US and Canada. The glasses are also getting an additional feature: Spotify integration that can play music matching what the user sees.
"Whether you're looking at album art or something festive... you'll be able to simply say, 'Meta, play a song that matches this image,'" the company explained.
Recall
Google is developing two categories of smart glasses powered by its Gemini AI, which will appear in 2026. The devices, created jointly with Samsung and other partners, will connect wirelessly to a smartphone.
Alibaba releases Quark smart glasses with built-in AI
