Tech News Vision
What's On

Meta’s smart glasses can now describe what you’re seeing in more detail

By News Room | 15 May 2025 (Updated: 15 May 2025)

Rolling out to all users in the US and Canada in the coming weeks, a new Meta AI setting provides more detailed descriptions of what's in front of users when they ask the smart assistant about their environment. In a short video shared alongside the announcement, Meta AI goes into more detail about the features of a waterside park, including describing grassy areas as being “well manicured.”

The feature can be activated by turning on “detailed responses” in the Accessibility section of the Device settings in the Meta AI app. It is currently limited to users in the US and Canada; Meta says detailed responses will “expand to additional markets in the future,” but provided no details about when that will happen or which countries will get it next.

Meta also confirmed today that its Call a Volunteer feature, first announced last September as part of a partnership with the Be My Eyes organization and released last November in a limited rollout that included the US, Canada, UK, Ireland, and Australia, will “launch in all 18 countries where Meta AI is supported later this month.”

Blind and low vision users of the Ray-Ban Meta smart glasses can use the feature to connect to a network of over 8 million sighted volunteers and get assistance with everyday tasks such as following a recipe or locating an item on a shelf. When a user says, “Hey Meta, Be My Eyes,” a volunteer can see the user's surroundings through a live feed from the glasses' camera and provide descriptions or other assistance through the glasses' open-ear speakers.

© 2025 Tech News Vision. All Rights Reserved.