See that little circle? That’s a camera. | Photo by Vjeran Pavic / The Verge

All around Meta’s Menlo Park campus, cameras stared at me. I’m not talking about security cameras or my fellow reporters’ DSLRs. I’m not even talking about smartphones. I mean Ray-Ban and Meta’s smart glasses, which Meta hopes we’ll all — one day, in some form — wear. I visited Meta for this year’s Connect conference, where just about every hardware product involved cameras. They’re on the Ray-Ban Meta smart glasses that got a software update, the new Quest 3S virtual reality headset, and Meta’s prototype Orion AR glasses. Orion is what Meta calls a “time machine”: a functioning example of what full-fledged AR could look like, years before it will be consumer-ready. But on Meta’s campus, at least, the Ray-Bans were already everywhere. It…

Continue reading…
  • wizardbeard@lemmy.dbzer0.com · 6 hours ago (edited)
    I’ll believe these will take off when I see it. This has been tried before, and the questions of “what’s the use case” and “how does a user control it” still don’t have good answers, in my opinion.

    I’d love just a basic HUD in my glasses for stuff like messages from important contacts, walking directions sometimes, maybe a todo list, a calendar view… but any use case I can imagine would take enough work to control when it displays that it would be just as easy to pull my phone out of my pocket.

    Plus, from using VR headsets, I know there’s still a lot of room for better image quality even with dedicated screens. A projector or transparent display is going to have similar or worse limitations in resolution and clarity.