Three Years with Meta’s Smart Glasses

Meta Ray Bans v1

The next wave of computing, 3 years ago

Three years ago, I bought a pair of smart glasses from Meta/Ray-Ban. Folks like myself were looking for the next big paradigm in tech after social, mobile, and location services had matured. I was building chatbots, voice assistants, and dabbling in VR devices, all candidates for the next computing Big Thing. We had some inkling that the next wave of devices would get us closer to ambient computing or some version of the metaverse, two philosophies about how humans would interact with technology, but nothing had broken through to the mainstream. Little did I know that an impulse purchase had a better shot at getting us to the future than anything else then or now.

I was (and still am) in the ambient computing camp. That’s why I’m so bullish about Meta’s first-mover position to make smart glasses ubiquitous in consumers’ lives. Sorry Google Glass and Snap Specs. And likewise, I’m less enthused by their VR efforts.

VR feels like the opposite of ambient computing. If my eyeballs weren’t staring at a screen in the real world, then what – they were tethered to an immersive screen in pixel-land via goofy goggles? While VR may become successful – perhaps necessary if our habitats become scorched and our societies crash – it’s not personally inspiring to me.

Ambient computing feels like a healthier and more human way that technology can augment the real world. And I wanted to interact more in the real world, after decades of working on social tech. We succeeded in connecting everyone, but what are we doing with those connections? Consuming. Always consuming more and more content. And we’re always connected. Everywhere. All the time. Our phones are constantly asking us for attention, asking us to connect to algo-land and disconnect from our surroundings.

I wanted technology to be less whiny, to demand less of me, to fade into the background until I needed it. That’s the promise of ambient computing.

Enter glasses

With smart glasses, we can stay connected, but the cool thing is that Meta’s glasses allow me to be more present in the physical world, to use technology as I need it in the moment, then move on with my day.

This is also the promise of new gadgets like Humane’s AI pin, but Meta’s smart glasses immediately felt different to me. First, the glasses adapted to my life, vs the other way around. Instead of being a very obvious “This is a new tech device”, Ray Bans looked like normal sunglasses. And normal shades are something I can use every day. Wearing my smart shades became a habit. The Ray Bans didn’t make me look like an idiot (Google Glass), nor did they make me look like a try-hard jabroni (Snap Spectacles) or a wanna-be-influencer (Apple VisionPro). They were the Everlane of glasses: just stylish enough that I’m not embarrassed to wear them, but not so obviously needy for attention. They managed to be individualistically stylish while so normal as to disappear into the background. Non-threatening.

What’s more, v1 was a James Bond device that I had access to. I use these three features the most:

Photos: I love that I can walk down Parisian streets in the summer afternoons when workers and friends and lovers spill out from cafes and onto the sidewalks — and I can snap their gestures with just a button tap, saving the memory for inspiration. Cassis in one hand, cigarettes in the other.

Videos: Or when I’m biking down the Seine and it’s just one of those days I want to remember, with scenes I want to come back to, and I can just start recording the moment with my voice. Or just hanging out with friends after work. The POV rawness resonates with nostalgia.

Music: I love that I can walk my dog and listen to music on the same device, and if she does something funny (as she often does), I can – snap – and it’s stored away.

I like getting new message notifications and dictation in theory, but honestly, I’m a visual person when it comes to text, so I use them rarely. I also don’t make or take calls from the glasses very much. I’m old school about social norms like not having phone conversations in public. This may change if I get a pair of transitions and can wear them indoors.

Biking down the Seine on a Lime bike (RIP)

Not so great things

There are definite areas where things can be improved, and I think the team has already made strides in v2.

Dealing with the battery sucks. The v1 case doesn’t charge consistently, and remembering to get a full charge before going out kills the momentum. I’ve often left the glasses behind because they weren’t charged. And after a while, a passive habit kicks in and I don’t take them out for months. This device could be on my face for 12-14 hours a day. Good battery life and an easy charging experience are essential.

The bluetooth connection to the View app is horrible for v1. I have to constantly re-install and do the complicated button hold-forget-device-disconnect process to get the glasses to pair with the app to access the photos and videos. Honestly, getting assets from the glasses into something that can be shared and stored is my biggest pain point. I’ve heard this process is much simpler for v2.

The sound quality is flat, sometimes buzzy, and when turned up, people in the vicinity can hear what I’m listening to. Sometimes that’s not a bad thing. I was waiting to pick up my daughter from an after-school program and was listening to my Liked list from Spotify when another parent asked me, “Hey, are you listening to Imogen Heap?” I was – I had one of her songs on repeat. We bonded for the next few minutes over mutual playlists – a little moment of human interaction that I get less of nowadays.

The videos and photo quality can be (and will be) much better.

The frames themselves are on the heavy side.

Sidewalk drinks with co-workers.

Ideas I'm excited about

If rumors are true, Meta will come out with v3 at the end of the year. Here are some ideas rumbling around after using these things for the last few years. I think Meta’s glasses could go from a James Bond device to a Jarvis device really quickly:

  • Additional camera angles: It would be so cool to have another camera lens with a different zoom/angle so that I can get a different (close-up, portrait) view. I think these glasses have the potential to be way better than glorified GoPros. Having more control over camera output will go a long way towards consumer adoption. Currently, there’s a sameness to the POV shots you get.
  • AI assistant: I’m already bullish on Meta incorporating a genAI assistant into the glasses. Being able to identify what I’m looking at is such a level-up in the way I interact with the real world. Use cases like having a translator in your ear are immediately useful. I’m not sure about identifying other people, however. An AI as a foundation for different modes of interaction is going to keep many PMs occupied for years.
  • Augmented displays: Perhaps if the rumors are true, then v3 will get us real (tiny) readable displays on the glasses. I’m guessing that the bottom edge might be a good location for a thin strip of display. It will be such an exciting area for design to explore. Imagine walking around and getting helpful info like texts or location/POI summaries or reviews. Or knowing what types of trees and plants you’re passing as you hike. The device might be able to offer warnings and callouts for dangerous situations like crossing a busy street or upcoming storms (if you’re out hiking). Users could get cooking instructions — or even better, feedback on whether the stuff they’re cooking looks right at each stage. Limitless possibilities.
  • Fitness: I often take my smart glasses along for a run or a bike ride. Playing music without my phone is a killer app. Incorporating GPS on these devices could open up fitness tracking and create a whole new category of products. I’d love it if images and videos from my rides could be mapped to the route and synced to my workout stats. Adding social connection, involving my buddies during and after my rides, would also be amazing. Meta already has the raw ingredients to make getting healthy super fun.
  • Directions: Having GPS also enables the glasses to provide turn-by-turn directions. I actually like to get lost when I travel, but there are times when it’s inconvenient. Sometimes breaking out my phone attracts unwanted attention and it feels unsafe. Oftentimes it makes me look like a clumsy tourist.
  • AI curation: The glasses have already helped me to shoot a lot more videos (and photos) and I fully expect Meta to make editing/curating the very best and appropriate ones to share on FB/Insta/etc much easier. I don’t want to go and look at each clip. Just create a highlights video for me. Reels already has all the tools.

A typical walk with the dog.

Why I'm bullish

Beyond product features, there are three big reasons why I’m bullish on Meta Ray Bans becoming the next great consumer device.

One: Meta already has momentum and early market fit.

When I talk to startups, I always ask if their products are 10x better than whatever they’re trying to replace. If not, there’s no way that users will adopt. The big exception here is the novelty or newness of the tech: you may only need to be 5x better if you’re introducing something super new. Based on my experience with v1 and hearing folks talk about their use of v2, the Meta Ray Bans have enough product-market fit. Enough for more investment. They have sales numbers to back this up, but more important than novelty sales (ahem, VisionPro) are retention numbers. Is there strong retention and repeat usage?

For me, the glasses are already everyday items. I use them in short active sessions when capturing photos and video and longer passive sessions to play music. They have two essential ingredients for habitual products: they’re comfortable and useful. They make me feel good without doing anything; and after using them, they make me feel even better. It’s a slam-dunk combo for a consumer device.

Two: This is a huge market worthy of Meta’s investment.

While Snap targeted influencers and cool teens with Spectacles, Meta went for the jugular by going after general consumers. The US sunglass market alone is $24B, and 80% of American consumers purchase a pair of shades. By extending beyond sunglasses to transitions and indoor frames, Meta is making a play for people who don’t wear glasses and for information workers – a very big market. But I don’t think Meta cares very much about hardware sales. Instead, the early product stickiness with teens, young adults, and working adults promises huge future gains for their core business. Glasses might just be the moat that Meta needs to combat declining usage in classic FB and Insta.

The market may be bigger than eyewear. Glasses can eat a sizable chunk of headphone sales in two years. Add to it an augmented reality assistant subscription service and this might be a new meaningful business line for Meta. At the very least, the addressable market should exceed smartwatches and tablets (~$50B). At the very most, this is Meta’s chance to normalize generative AI for the mass consumer.

Three: Meta’s in the pole position in this new consumer tech race

Meta has everything they need to incorporate native genAI, social experiences, and media sharing into the glasses. To put it another way, they have the potential to transform lifestyle consumer tech with this device. They already have two product cycles of learning about where this product can go. And with v2’s multi-modal tests, the team has shown that they can adapt to market changes. Llama integration is something that may not have been on the early roadmaps.

Looking at potential competition, the partnership with Ray-Ban is such a power move. Zuck essentially bought a big-ass ramp into consumer wallets and onto gift lists. It allowed what is a new tech interaction to wrap itself in a well-known and well-loved form factor. Look at who they have a head start over:

  • Amazon: I think Amazon’s a behemoth, but their core value in commerce doesn’t fit with this type of ‘connected life’ mission. They don’t have the social assets to make their device an ecosystem play. Their distribution with Alexa was impressive, so I don’t count them out totally.
  • Apple: Apple probably poses the biggest risk to Meta’s head start. I’m sure Cupertino is developing something similar right now, but they are way behind in making Siri useful. They do have all the components (AI, media, hardware, services) to make a push if they can come out with something for the holidays. Their biggest weakness is a lack of connected social properties that makes the Meta Ray Bans fun. But I wouldn’t count out Apple.
  • Google: I love Google so much, but their strength and attention seem to be in services, and perhaps very divided. Their hardware (Home Assistants, Pixel, Nest, etc.) is successful, but not at a scale investors care about. I’m not sure why Google isn’t more successful here. Without a better explanation, I have to agree with folks who point to their incredibly matrixed culture and lack of urgency in product development. They seem spooked by Microsoft/OpenAI, so I expect their efforts to be spent on building up their LLM ecosystem.
  • Startups like Humane/Rabbit: Do I want another device that stands out, with a totally separate investment in a different ecosystem (data plan, etc.)? Or do I want to be where my friends already are, using something I’m already familiar and comfortable with? Easy answer here for me.


Smart glasses may be the device where Meta can finally vertically integrate and own the entire stack. The Portal never got mass consumer adoption; it trailed Alexa, Google Home, and Apple’s HomePod and never had an original strategy. The glasses’ early use cases, being good at a few things instead of falling down on many, help user adoption here. And with the massive quality uptick from Llama for the inevitable “guy-in-the-chair” assistant feature, I expect some kind of developer relationship or Glass Store to start (especially if AI and AR happens in the next few cycles) before 2025.

How could it work? Consider travel services on this thing. Hopper or TripIt can tell me gate and flight information while I’m on my way to the airport. Hilton or Airbnb can show me turn-by-turn directions and when check-in is available. Yelp can light up ratings or items to order as I’m sitting down at a new eatery.

Many of the current smartphone widgets can be ported to smart glasses. The future will be really cool on these devices.

As an aside, it’s interesting to think about potential partnerships for Meta’s smartglasses. If Apple is the big bad here, it makes sense to have Spotify/Tidal and others on board for music. But what about maps? Or local businesses? Or fitness? How can Meta play nice with Google? Can it drive interest from folks like Peloton or Strava?

Final Personal Takes

I’ll admit that for a few years I was very cynical about Facebook and Meta. It felt like they had hit a ceiling and were in full-on copy & paste mode for product development. Moreover, they were the linchpin in a world where I felt pressured to be connected to the Graph all the time. At first it was friends, but then it became politics and pop culture and sports and all sorts of stuff. I felt helpless to stop. Tech became something that I wanted to get away from rather than engage with.

But with glasses, it feels like there’s a chance to be sane again.

Unlike VR headsets, the emphasis isn’t on consuming but on doing. I wear my glasses outside, and they enable me to interact with the world in a way that’s non-invasive and natural. I use them on walks, on runs, at parties, meeting up with friends — they’re a part of my aspirational life, not a substitute for it.

There’s no feed supplying a steady stream of content. Apart from music (different, I think), if I want information (again, about the world around me), the glasses are there when I need them, and I can close out when I’m done.

I don’t use glasses to connect with the whole world. I use them to connect with my very-present-now world.

There’s one path where Meta sees this device as another play for the business model of content creation > engagement > ads. And to be fair, it can work. Livestreaming from glasses might be a thing to lean into. It’s certainly a feature that has easy tie-ins to help Instagram grow. (Quick note: While building out P00LS, a web3 influencer platform, I was close to gifting a bunch of smart Ray Bans to our DJs so they could livestream their shows to fans.) But I think content creation is just one piece of this device. And perhaps it’s not even the most interesting part.

The more interesting thing for me is the philosophical change that can happen – the way we make use of and retake control from our devices. Meta is in a position to rethink all of this and enable us to connect first with the real world in better ways. And if we use our phones less and glasses more naturally, we might rediscover that connecting with friends and culture and all of that other stuff in the real world can be a lot of fun and mentally healthy.