Meta’s Ray-Ban Smart Glasses

Last month, I wrote about the Apple Vision Pro and the particular market segment Apple was targeting with the product. At this point in its evolution, the Vision Pro is most accurately called an entertainment device. It is not, for the most part, a device that someone would wear all day or out in public. While you can do work with it, those in the most creative fields, such as photography or video editing, are the most likely to use it in their daily work.

Meta’s upcoming release of its new Ray-Ban glasses puts an entirely different spin on eyewear technology. Previous versions of this device allowed a user to take pictures and stream video with the built-in camera, but now Meta is putting AI into the glasses. These will still have a built-in camera, along with a microphone and speakers. The glasses can be bought directly from Ray-Ban and start at about $329. You can order them with a prescription, as regular glasses or as sunglasses.

The model for these glasses is completely different from the Vision Pro’s, and that is fantastic, because so much of what we see in technology today is one company copying another. These glasses are not meant to be an entertainment device or to replace your computer screen. They are instead intended to become part of your everyday life as a virtual AI assistant that is with you at all times.

Let’s take a look at a few of the downsides first, and then talk about the possibilities of these devices. One big downside is that they need to be charged fairly often. At the moment, they will only run for four hours on a single charge. This is pretty minimal for anyone who wants to use them throughout the day. Many of us tend to be out and about for longer than four hours at a time.

On the bright side, the glasses still function as regular glasses (because they are) when the technology is not working. Speaking of the bright side, I can imagine that Meta and Ray-Ban are working on solar charging for these glasses, considering most people spend some time in the light during the course of their day. Battery technology is also advancing so quickly that I don’t expect this to be an issue for long.

The only other downside I have discovered involves privacy and the continued growth of surveillance. Being able to record audio and video of people without their knowledge is a scary prospect with these devices. The ability to identify people with facial recognition could also be an invasion of privacy. People can record us today with a number of tools, including their smartphones, but doing so usually exposes the person recording; you can see someone holding up a phone. Recording through normal-looking eyeglasses is a completely different concern. Imagine being able to look at a person in the grocery store and ask Meta to tell you everything it knows about that person.

The New York Times did a tech review of the glasses and had some really fun and interesting experiences, as we all do when playing with new technology. A few things in the review stuck out to me, however, and really hit home. I have a son with a nut allergy and a close family friend who has celiac disease. Grocery shopping for both of them becomes an ordeal as you try to determine whether a food is safe to eat. The glasses turn this into an entirely new experience: You can look at the UPC on a product and ask the AI whether it contains gluten or nuts, and you get a response in seconds. I can see a situation where I could stand in front of the cereal boxes and ask the AI to tell me which of the products I am looking at are both gluten-free and nut-free.
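To make that concrete, here is a minimal sketch of what such a lookup could look like, using the public Open Food Facts database as a stand-in for whatever data source Meta actually queries. The function name and the allergen watchlist are hypothetical illustrations of the workflow, not Meta’s API.

```python
# Sketch: given a UPC read from a camera, query the public
# Open Food Facts database and flag any watchlist allergens.
# Illustrative only; this is not Meta's actual pipeline.
import requests

def check_allergens(upc: str, watchlist=("gluten", "nuts", "peanuts")):
    """Return the product name and any watchlist allergens for a UPC."""
    url = f"https://world.openfoodfacts.org/api/v0/product/{upc}.json"
    data = requests.get(url, timeout=10).json()
    if data.get("status") != 1:
        return None  # product not in the database
    product = data["product"]
    # Allergen tags look like "en:gluten"; strip the language prefix.
    tags = {t.split(":")[-1] for t in product.get("allergens_tags", [])}
    hits = sorted(tags & set(watchlist))
    return {"name": product.get("product_name", "unknown"), "allergens": hits}

result = check_allergens("737628064502")  # a sample UPC
if result:
    print(f"{result['name']}: {result['allergens'] or 'none on the watchlist'}")
```

The real trick in the glasses, of course, is everything before this step: reading the barcode from the camera feed and answering in natural language. But the core of the “is this safe to eat?” question can come down to a simple database lookup like this one.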

It is not clear what Meta wants to do with these glasses in the future. Will they add some sort of display to the lenses? Their current strategy seems two-fold: Create a device that people will wear on a daily basis, and keep those people (including everything they see and do) in the Meta universe. A Facebook account is required. Even without displays on the glasses, Meta could still move toward an augmented reality-ish device. Think of walking up to an AV rack and asking the glasses to identify which products and models are in the rack. How about looking at a device or a tool and asking the glasses how to operate it? The options are endless when you think about navigation while traveling. The glasses could help you in airports, subways and trains, or walking down the street, to find a good place to eat or give you reviews of hotels you are near.

I find the growth of wearable tech over our eyes incredibly exciting and, yes, a bit scary. While these tools can be powerful and should be used to their full extent, we also need to keep in mind that we are, quite literally, looking at the world through someone else’s lenses. These products need to be used and evaluated the same way we would any AI advancement: We have to continually use our own minds and our own knowledge to assess what the device is telling us. In The New York Times review, the AI identified a primate as a giraffe, which was funny only because the viewer already knew the difference and knew it was not a giraffe. It will be far less funny when the glasses tell a wearer false information.
