Tech News Vision
What's On

Meta’s smart glasses can now describe what you’re seeing in more detail

By News Room | 15 May 2025 (Updated: 15 May 2025)

Rolling out to all users in the US and Canada over the coming weeks, Meta AI can now be customized to provide more detailed descriptions of what's in front of users when they ask the assistant about their environment. In a short video shared alongside the announcement, Meta AI goes into greater detail about the features of a waterside park, describing grassy areas, for example, as "well manicured."

The feature can be activated by turning on "detailed responses" in the Accessibility section of the Device settings in the Meta AI app. It's currently limited to users in the US and Canada; Meta says detailed responses will "expand to additional markets in the future," but hasn't said when, or which countries will be next.

Meta also confirmed today that its Call a Volunteer feature, first announced last September as part of a partnership with the Be My Eyes organization and released last November in a limited rollout covering the US, Canada, UK, Ireland, and Australia, will "launch in all 18 countries where Meta AI is supported later this month."

Blind and low-vision users of the Ray-Ban Meta smart glasses can use the feature to connect with a network of over 8 million sighted volunteers and get help with everyday tasks such as following a recipe or locating an item on a shelf. When a user says, "Hey Meta, Be My Eyes," a volunteer can see the user's surroundings through a live feed from the glasses' camera and provide descriptions or other assistance through the glasses' open-ear speakers.
