Meta has unveiled a range of AI-powered glasses that allow users to carry out tasks including checking messages, previewing photos and viewing translations.
The Meta Ray-Ban Display glasses are controlled by the Meta Neural Band, which the company said translates the signals created by the user’s muscle activity into commands for the glasses.
This allows the user to control the experience through hand movement without having to use their mobile device or touch the glasses.
Meta said the glasses have a full-colour, high-resolution display positioned off to the side to avoid obstructing the user’s view.
The company added that the glasses are designed for short interactions and are not on all the time.
The technology will be released at a limited number of bricks-and-mortar retailers in the US at the end of this month, and the company added that it plans to put the glasses on sale in countries including the UK, Canada and Italy early next year.
Speaking at Meta’s annual developer conference, the company’s chief executive Mark Zuckerberg said they were the most advanced AI-powered glasses the company had ever sold.
“Glasses are the ideal form factor for personal superintelligence, because they let you stay present in the moment while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses, and more,” Zuckerberg added.
Other glasses launched in the range include an updated version of Meta’s existing Ray-Ban smart glasses, with improved video cameras and a longer battery life.
Additionally, Meta announced sports-focused glasses developed in partnership with eyewear manufacturer Oakley, which will integrate with fitness apps and smartwatches.