10 thoughts on Meta's new Ray-Ban Display smart glasses
Meta’s new Ray-Ban smart glasses/AR glasses are impressive. I’ve got some early thoughts.
Meta is calling them Display AI glasses, but I’ve also seen them called Camera Glasses (a terrible category name). I’ll refer to them as smart glasses, because it’s language that most people understand.
You are not going to be able to buy a computing device without some level of AI built in, and these smart glasses require your phone to work anyway.
The line between the different kinds of smart and AR glasses is very blurred (and everything has “AI” now, so that’s not its own category). They are all generally converging in the same direction.
Building in transition lenses was a smart, practical design touch. People who wear glasses also need sunglasses; this is a no-brainer to anyone who wears them. The lenses in these smart glasses transition from regular glasses to sunglasses when you go outside into bright light. That not only means they can function in both indoor and outdoor settings, it also makes the screen much easier to see in bright light, which has long been a problem for smart glasses. And people who wear prescription lenses can wear these wherever they go.
People can’t tell when the display is on. That makes the glasses very discreet and keeps you from looking dorky. However, it also means people won’t be able to tell whether you are looking at something instead of at them. This is probably the future, but are people going to get even worse at eye contact and paying attention? Will managers want their employees wearing these in meetings?
The neural wristband for hand control is clever and needed. It is similar to how you can control the Apple Vision Pro with hand gestures. The Meta smart glasses lack the sensors to do this on their own, but the wristband provides that capability. It lets users navigate the interface without relying on voice and gives them more fine-grained control. It’s also a privacy win because you don’t need to invoke the smart glasses with “Hey Meta.” The days of invoking voice AIs with a verbal command are over; it’s a privacy and security nightmare. All voice AI should be activated with a gesture or button press.
Captioning could be an accessibility game-changer. The smart glasses can display captions on the screen as people talk to you. About half of Americans watch TV with captions on, so clearly there is a need. Not to mention how useful this could be for translation when traveling.
Cooking directions and recipes might be the must-have feature today. Sure, smart glasses work pretty well as a camera for documenting things, but phones have superior cameras, recording, and battery life. Here is a use case that phones cannot match: you can get step-by-step cooking directions without having to touch a screen to advance (with potentially dirty hands) and without having to move your eyes away from what you are prepping and cooking. You’ll be able to see demonstrations of what to do while you are preparing the food in front of you. This kind of augmented intelligence and coaching is one of the things smart glasses can do better than existing computing devices.
The glasses are still tethered. They require a phone to work and rely on Bluetooth. This limits their use cases and potential. It also means they are a device in addition to your phone, so you have to ask yourself: what is this going to do better than the computing device I already have with me? Smart glasses will enter their prime when they can replace smartphones.
There is no ruggedized version. Some of the biggest uses for smart glasses like these would be home improvement, construction, hiking, working out, skiing, sports, etc. These glasses are not built for any of that.
They look pretty good, albeit chunky. Obviously, with today’s tech, they need to be chunky to fit the components and batteries. Meta and Ray-Ban have leaned into the current trend of chunky, hip plastic frames. For some people, these will look good. For others, they will look ridiculous and over the top. For smart glasses to take off, makers will need to figure out how to deliver more styles to appeal to more people and face shapes. This chunky, rectangular style suits certain face shapes better than others; many benefit from a more rounded lens.
These are probably the first smart glasses worth buying. Even if you just buy them to see what the fuss is about and only use them occasionally, they are probably worth it. Everything before now has been largely a tech demo. In hindsight, it is shocking that Google released the janky Google Glass 12 years ago. It was legitimately more than a decade off in terms of usability and utility. I tried it once and was shocked at how bad and useless it was. It was clearly a prototype that needed a lot more iteration.
I’m not sure smart glasses are socially acceptable yet. These are the best-looking ones so far, but they clearly have cameras on them (will they be banned in gyms or other places?). Do people want to talk to someone who has a camera on their face? Would workplaces allow these to be worn?