Tag: glasses

  • Even Realities G1 Smart Glasses Review: Superb Display, But Slow Info

    I’ve been wearing the Even Realities G1 glasses for four months, and while many people have commented on my new frames, only two friends asked if my glasses were “smart.” For someone who wore Google Glass in public and lived to tell the tale, this technological anonymity is high praise indeed. They look like glasses you might actually want to wear, and they don’t draw unnecessary attention to your (OK, my) face.

    But whereas Clark Kent accessed his superpowers by taking off his spectacles, this mild-mannered reporter gets real-time language translation, access to AI, turn-by-turn navigation, and a personal assistant by keeping his glasses on.

    Most smart glasses, like the Ray-Ban Meta, rely on Bluetooth audio, but the G1 features a small but brilliantly effective heads-up display called the Holistic Adaptive Optical System, or HAOS. Look carefully at the lenses and you’ll see a faint rectangle in each eye. This is where a micro-LED optical engine projects crisp, green digital text (640 x 200 pixels). Glance up (you choose the angle via the app) and a seemingly two-foot-wide text homepage appears to float around five feet in front of you. It’s astonishingly clever tech given how light and, well, normal the frames feel.
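    For the curious, the review’s own numbers pin down the display geometry: a roughly two-foot-wide panel floating five feet away subtends an angle you can compute directly. This back-of-the-envelope Python sketch assumes the full 640-pixel width spans that panel, which Even Realities hasn’t confirmed:

    import math

    def angular_size_deg(width_ft: float, distance_ft: float) -> float:
        """Full angular width, in degrees, of a flat panel viewed head-on."""
        return math.degrees(2 * math.atan((width_ft / 2) / distance_ft))

    # The review's figures: a ~2-foot-wide text panel floating ~5 feet away.
    fov = angular_size_deg(2.0, 5.0)
    print(f"Apparent field of view: {fov:.1f} degrees")   # ~22.6 degrees

    # Pixels per degree across the 640-pixel width (a rough sharpness proxy).
    print(f"Angular resolution: {640 / fov:.0f} px/deg")  # ~28 px/deg

    That works out to an apparent field of view of roughly 23 degrees at about 28 pixels per degree, which squares with text that looks crisp but compact.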

    The digitally surfaced lens is actually two bonded lenses but manages to be no thicker or heavier than a standard design. Prescription lenses cost $129 extra and, aside from the occasional glimpse of the projector screen in bright sunshine, work as well as any glasses I’ve ever owned.

    Nestled at the end of each arm you’ll find a rubbery nodule. These contain the batteries, buttons, and antennae that exchange real-time data with your phone over Bluetooth. The glasses are marginally heavier than standard frames, but because the weight is kept away from the nose, they feel good. The frames are made from solid magnesium and have a cool matte finish, with the temples coated in silicone for added grip. Add in screwless hinges and a classic oval shape, and you’ve got a stylish proposition even before you charge them up.

    Photograph: Christopher Haslam

    The charging case is equally well designed and holds enough power to recharge the glasses 2.5 times; the 60-mAh battery in the glasses lasts around 1.5 days per charge.

    So, they’re nice glasses—but what do they actually do?

    Virtual Assistance

    The idea of the G1 is not to replace your smartphone but rather to offer a pared-back interface that gives you help and information when you need it, then vanishes when you don’t.

    After installing the app and syncing the glasses, a glance up shows a screen with the date, time, battery level, and upcoming diary dates (assuming you’ve granted permissions). You can also receive messages and alerts from social and messaging apps. You can’t respond to any messages, though, which seems both odd and a shame given the onboard microphones and transcription software.

    The right side of the main display is for QuickNotes. If you pinch the small box on the right arm, a note will flash up saying “Quick Note Recording.” When you speak, your words will be saved and displayed on the screen when you next look up. If you mention a date, time, or place, the AI assistant will add it to your diary. It’s great if you are a fan of voice notes. I’m not, but as someone who meets new people all the time and remains terrible at remembering names, I loved being able to have names, and even job titles, on display for my eyes only.
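    Under the hood, that diary trick is classic entity extraction: transcribe the note, search it for a date or time phrase, and file a calendar entry. Here’s a hypothetical Python sketch of the flow; it is not Even Realities’ code, and the open source dateparser library merely stands in for whatever model the assistant actually uses:

    from dateparser.search import search_dates

    def process_quick_note(transcript: str, calendar: list) -> None:
        # Look for any date- or time-like phrase in the transcribed note.
        hits = search_dates(transcript, settings={"PREFER_DATES_FROM": "future"})
        if hits:
            phrase, when = hits[0]   # first matching phrase and its datetime
            calendar.append({"when": when, "note": transcript})
        # With or without a date, the note itself stays on the heads-up display.

    calendar = []
    process_quick_note("Lunch with Priya next Tuesday at noon", calendar)
    print(calendar)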

    Translation

    Open up the Translate box on the Even Realities app, choose from one of 13 languages (including Mandarin, Japanese, and Korean), decide what language you’d like things translated into (in this case English), and press Engage. If someone then speaks to you in that language, the G1 glasses will listen, translate, and write the words on your HUD.
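    The loop being described is simple in outline: capture a chunk of audio, run it through a translation model, and paint the result on the display. The G1’s internals aren’t public, so this sketch uses trivial stubs in place of the real microphone, translation, and HUD calls:

    import time

    def record_chunk(seconds: float) -> bytes:
        time.sleep(seconds)        # stand-in for the glasses' microphones
        return b"hola, buenos dias"

    def translate(audio: bytes, src: str, dst: str) -> str:
        return "hello, good morning" if src == "es" else ""  # stand-in model

    def hud_display(text: str) -> None:
        print(f"[HUD] {text}")     # stand-in for the 640 x 200 green display

    def translation_loop(src: str, dst: str = "en", turns: int = 1) -> None:
        for _ in range(turns):     # a real loop would run until disengaged
            text = translate(record_chunk(2.0), src, dst)
            if text:
                hud_display(text)

    translation_loop("es")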

    Annoyingly, however, it’s no Babel fish. In one-on-one conversations it worked OK, and I enjoyed understanding my wife’s rusty Spanish. Similarly, I had success rewatching Squid Game without subtitles. But unless the other person is wearing their own pair to translate my English, it’s one-way traffic.

  • These Smart Glasses Will Read Your Emotions and Watch What You Eat

    “We can manage what we measure, but what we mostly measure are things like money or speed,” Nduka says. “What we can’t really measure is quality. And quality is about emotions. And emotions can be sensed most sensitively with expressions.”

    AI Vision

    Humanity has long asked whether AI can truly know how people feel, and most of the answers come down to, well, probably not. Even without a bunch of advanced cameras and AI smarts, reading emotions can be tricky.

    “Gauging emotion through facial expressions is kind of somewhat debatable,” says Andrew McStay, a professor and director of the Emotional AI Lab at Bangor University in the UK. McStay says that even if the company were using AI to “smooth out” the data collected by the sensors to make it more usable, he’s not convinced it can actually read emotions with accuracy. “I just think there are fundamental flaws and fundamental problems with it.”

    Cultural differences also inform how people display emotion. One person’s smile might signal congeniality or joy, while another’s might be a nervous expression of fear; that type of signaling can vary widely from culture to culture. How emotions register on the face can also fluctuate with neurodivergence, though Emteq says it wants to help neurodivergent users navigate those kinds of awkward social interactions.

    Strand says Emteq is trying to take all of these factors into account, hence the pursuit of more and more data. Emteq is also adamant that its use cases will be wholly vetted and overseen by health care providers or practitioners. The idea is that the tech would be used by therapists, doctors, or dietary consultants to ensure that all the data being collected straight off your face isn’t used for nefarious purposes.

    “You’ve got to be thoughtful about how you deliver information, which is why we have experts in the loop. At least right now,” Strand says. “The data is valuable regardless because it empowers whoever is making the assessment to give good advice. Then it’s a question of what is that advice, and what’s appropriate for that person in their journey. On the mental health side, that’s especially important.”

    Strand envisions therapy sessions where instead of a patient coming in and being encouraged to share details about stressful situations or anxious moments, the therapist might already have a readout of their emotional state over the past week and be able to point out problem areas and inquire about them.

    Nearsighted

    Regardless of how good Emteq’s smart glasses are, they’re going to have to compete with the bigwigs already selling wearable tech with far wider use cases. People might not be interested in sporting a bulky-ish pair of glasses if all they can do is scan your face and look at your food. It’s not far-fetched to imagine these inward-facing sensors being incorporated into something more feature-rich, like Meta’s Ray-Ban smart glasses.

    “This has always been kind of the way with these kinds of products,” McStay says. “These things often start with health, and then quickly they kind of get built out into something which is much more marketing oriented.”

  • Facial recognition Meta Ray-Ban glasses know who you are in real time

    In what might be described as a real-life Black Mirror episode, a Harvard student is using facial recognition with $379 Meta Ray-Ban 2 smart glasses to dig up personal data on every face they see in real time.

    If you’ve ever cared about your privacy, now might be the time to grab the tin foil hat. I’ve already got mine on.

    AnhPhu Nguyen, a junior at Harvard University, uses the livestreaming feature of his Meta Ray-Ban 2 smart glasses while a connected computer monitors the feed in real time. He employs publicly available AI-powered facial recognition software to detect faces and scour the internet for more images of those individuals.

    He then uses databases like voter registration records and online articles to gather names, addresses, phone numbers, next of kin, and even Social Security numbers.

    All of this data is scraped together using an LLM (large language model) similar to ChatGPT, which aggregates it into a searchable profile that’s fed straight back to his phone.
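    Pieced together from the description above, the pipeline has five stages: detect a face in the stream, reverse-search it, scrape public records, have an LLM merge the results, and push the profile to a phone. Nguyen isn’t releasing his software (as noted below), so this Python outline uses stub functions to mark each stage rather than working scraper code:

    from dataclasses import dataclass, field

    @dataclass
    class Profile:
        name: str = "unknown"
        sources: list = field(default_factory=list)

    def detect_faces(frame: bytes) -> list:
        return [frame]                        # stage 1: face detection on the stream

    def reverse_image_search(face: bytes) -> list:
        return ["https://example.com/match"]  # stage 2: find more photos of the face

    def scrape_public_records(urls: list) -> dict:
        return {"name": "J. Doe", "sources": urls}  # stage 3: voter rolls, articles

    def aggregate_with_llm(records: dict) -> Profile:
        return Profile(records["name"], records["sources"])  # stage 4: LLM summary

    for face in detect_faces(b"camera-frame"):
        profile = aggregate_with_llm(scrape_public_records(reverse_image_search(face)))
        print(profile)                        # stage 5: fed straight back to the phone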

    This entire process takes only seconds from being captured discreetly on camera to being displayed on his phone, giving off real-life Cyberpunk 2077 vibes.

    Nguyen has been careful to say that he hasn’t done any of this for nefarious or malicious purposes. He’s even published a short how-to on removing your information from some of the databases he uses to scrape your personal data. He wants to raise awareness of the implications this type of technology presents.

    While he offers a “solution” to help protect yourself, it’s really a small drop in a very large bucket that may well never have a solution. Or maybe the answer will be wearing smart glasses of your own, with infrared lights constantly blinding other facial recognition cameras?

    Unfortunately, bad actors (hackers acting maliciously) have already broken into many websites and databases. In April of this year, information on 3 billion people, including every single Social Security number in existence, was stolen from the background check company National Public Data and posted on the dark web.

    With the proliferation of AI over just the last few years, one has come to expect to see it used in new and inventive ways, even when that carries a negative connotation, like deepfakes and disinformation designed to trick the masses into believing whatever narrative the creator wants.

    For now, Nguyen says he’s not releasing his software, dubbed I-Xray.

    But if a smart college kid has already “cracked the code,” imagine what’s already happening behind the curtain. At least I think that was the lesson Edward Snowden was trying to teach us.

    Technical documentation: https://docs.google.com/



  • Meta won’t say whether it trains AI on smart glasses photos

    Meta’s AI-powered Ray-Bans have a discreet camera on the front, for taking photos not just when you ask them to, but also when their AI features trigger it with certain keywords such as “look.” That means the smart glasses collect a ton of photos, both deliberately taken and otherwise. But the company won’t commit to keeping these images private.

    We asked Meta if it plans to train AI models on the images from Ray-Ban Meta’s users, as it does on images from public social media accounts. The company wouldn’t say.

    “We’re not publicly discussing that,” said Anuj Kumar, a senior director working on AI wearables at Meta, in a video interview with TechCrunch on Monday.

    “That’s not something we typically share externally,” said Meta spokesperson Mimi Huggins, who was also on the video call. When TechCrunch asked for clarification on whether Meta is training on these images, Huggins responded, “we’re not saying either way.”

    Part of the reason this is especially concerning is the Ray-Ban Meta’s new AI feature, which will take lots of these passive photos. Last week, TechCrunch reported that Meta plans to launch a new real-time video feature for Ray-Ban Meta. When activated by certain keywords, the smart glasses will stream a series of images (essentially, live video) into a multimodal AI model, allowing it to answer questions about your surroundings in a low-latency, natural way.

    That’s a lot of images, and they’re photos a Ray-Ban Meta user might not consciously be aware that they’re taking. Say you asked the smart glasses to scan the contents of your closet to help you pick out an outfit. The glasses are effectively taking dozens of photos of your room and everything in it, and uploading them all to an AI model in the cloud.
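    Some rough arithmetic shows how quickly those passive photos pile up. Meta hasn’t published a frame-sampling rate, so the 2-frames-per-second figure below is purely an assumption for illustration:

    def frames_uploaded(session_seconds: float, frames_per_second: float) -> int:
        return int(session_seconds * frames_per_second)

    # A 30-second wardrobe scan at an assumed 2 frames per second:
    print(frames_uploaded(30, 2.0))  # 60 images of your room, sent to a cloud model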

    What happens to those photos after that? Meta won’t say.

    Wearing the Ray-Ban Meta glasses also means you’re wearing a camera on your face. As we found out with Google Glass, that’s not something other people are universally comfortable with, to put it lightly. So you’d think it’s a no-brainer for the company that’s doing it to say, “Hey! All your photos and videos from your face cameras will be totally private, and siloed to your face camera.”

    But that’s not what Meta is doing here.

    Meta has already declared that it is training its AI models on every American’s public Instagram and Facebook posts. The company has decided all of that is “publicly available data,” and we might just have to accept that. It and other tech companies have adopted a highly expansive definition of what is publicly available for them to train AI on, and what isn’t.

    However, surely the world you look at through its smart glasses is not “publicly available.” While we can’t say for sure that Meta is training AI models on your Ray-Ban Meta camera footage, the company simply wouldn’t say for sure that it isn’t.

    Other AI model providers have more clear-cut rules about training on user data. Anthropic says it never trains on a customer’s inputs into, or outputs from, its AI models. OpenAI also says it never trains on user inputs or outputs submitted through its API.

    We’ve reached out to Meta for further clarification here, and will update the story if they get back to us.

  • Meta rethinks smart glasses with Orion

    Meta Connect 2024 was this week, showcasing new hardware and software to support two of the company’s big ambitions: AI and the metaverse. CEO Mark Zuckerberg announced new Quest headsets, updates to Meta’s Llama AI model, and real-time video capabilities of Ray-Ban Meta smart glasses. The biggest reveal, though, was Orion, a true AR glasses prototype touted as “the most advanced glasses the world has ever seen.”

    OpenAI CTO Mira Murati announced this week that she is leaving the company after more than six years. Hours after the announcement, OpenAI’s chief research officer, Bob McGrew, and a research VP, Barret Zoph, also left the company. The high-level departures come less than a week before the start of OpenAI’s annual developer conference.

    One of CloudKitchens’ earliest employees is suing the company. In the lawsuit, Isabella Vincenza alleges wrongful termination, sex discrimination, and a hostile work environment, including an intense “bro culture” at the company. Vincenza also claims that she was “retaliated against for standing up for herself” following her pregnancy and subsequent maternity leave. 


    This is TechCrunch’s Week in Review, where we recap the week’s biggest news. Want this delivered as a newsletter to your inbox every Saturday? Sign up here.


    News

    Talk to me, ChatGPT: OpenAI rolled out Advanced Voice Mode following some delays and controversy. The feature has an updated blue spherical look, five new voices, and improved accent capabilities for customers in ChatGPT’s Plus and Teams tiers. Read more

    YC Demo Day: Y Combinator kicked off its two-day “Demo Day” event showcasing what the most recent YC batch companies are building. Here are the companies worth paying attention to out of the event. (Spoiler alert: They pretty much all use AI.) Read more

    Amazon employees vs. RTO: Amazon CEO Andy Jassy announced that employees will be expected to work from the office five days a week starting in 2025. But an anonymous survey created by workers reveals that many who have grown accustomed to a hybrid work structure are “strongly dissatisfied.” Read more

    How much can a phone wallpaper cost? Marques Brownlee, known on YouTube as MKBHD, launched the wallpaper app Panels, where he’s curating high-quality digital wallpapers from artists. But in order to access high-resolution wallpapers without ads, users need to cough up about $50 per year. Read more

    WordPress vs. WP Engine: A heated legal battle is brewing between WordPress founder and Automattic CEO Matt Mullenweg and WP Engine — which hosts websites built on WordPress — after Mullenweg wrote a blog post calling WP Engine a “cancer to WordPress.” Read more

    X switches up the block feature: X will soon change how its block feature works so that accounts you have blocked will still be able to see your public posts. Elon Musk clarified that blocked accounts still won’t be able to engage with users who have blocked them. Read more

    RevenueCat turns up the heat: Subscription management platform RevenueCat acquired Dipsea, an app offering subscriptions to “spicy” audiobooks. The idea is to bring a subscription-based app in-house to serve as a testing ground for RevenueCat’s new features. Read more

    RIP, TikTok Music: ByteDance is shuttering its music streaming service, TikTok Music, in November. TikTok Music was rooted in a ByteDance product called Resso, and the service was later available in Indonesia, Brazil, Australia, Singapore, and Mexico. Read more

    Meta gets hit with another privacy penalty: Meta has been reprimanded and fined $101.5 million (at current exchange rates) by Ireland’s Data Protection Commission for a 2019 breach that exposed hundreds of millions of Facebook passwords. Read more

    Hands-on with Plaud’s NotePin: TechCrunch’s Brian Heater has been testing Plaud’s $169 ChatGPT-powered NotePin to transcribe meetings and take notes. Unlike other AI pins, Plaud’s product feels like a solution to real issues, he argues. Read more

    Analysis

    Sam Altman goes “god mode”: OpenAI CEO Sam Altman has historically pitched AI as the solution to the world’s problems, despite its significant impact on the environment. In a new rose-colored-glasses blog post, Altman presents an incredibly positive update on the state of AI, hyping its world-changing potential. But, as TechCrunch’s Sarah Perez notes, much of what he writes is seemingly meant to make skeptics see how much AI matters and could well have the opposite result. Read more

  • Meta offers a glimpse through its supposed iPhone killer: Orion

    For years, Silicon Valley and Wall Street have questioned Mark Zuckerberg’s decision to invest tens of billions of dollars into Reality Labs. This week, Meta’s wearables division unveiled a prototype of its Orion smart glasses, a form factor the company believes one day could replace the iPhone. That idea sounds crazy… but maybe a little less crazy than it did a week ago.

    Orion is a prototype headset that combines augmented reality, eye and hand tracking, generative AI, and a gesture-detecting wristband. Through micro LED projectors and silicon carbide lenses (which are quite expensive), Meta seems to have cracked a longstanding AR display challenge. The idea is that you can look through Orion — you know, like a pair of glasses — but also see application windows projected on the lenses that appear as if they’re embedded in the world around you. Ideally, you can use your hands, eyes, and voice to navigate the environment.

    The Orion smart glasses need a wristband and wireless compute puck to work. Image Credits: Meta

    Though to be clear, Meta’s Orion smart glasses are chunkier than your average readers, reportedly cost $10,000 a pop, and won’t be available for sale anytime soon. We’re talking years from now. All the technology in Orion is relatively young, and all of it needs to get cheaper, better, and smaller to work its way into a pair of smart glasses you can buy at the mall. Zuckerberg says the company has already been working on Orion for 10 years, but there’s still no path to a sellable product.

    However, Meta is hardly the only company trying to put a smartphone replacement on your face.

    This month, Snap unveiled its latest generation of Spectacles smart glasses, which are larger than Orion and have a more limited field of view. One former Snap engineer called the latest Spectacles “obviously bad” — though you can actually order them. Google hinted during its I/O conference in May that it, too, is working on a pair of smart glasses, perhaps a revamp of its failed Google Glass experiment from last decade. Apple is reportedly working on AR glasses that sound a lot like Orion. And we can’t rule out Jony Ive’s new startup, LoveFrom, which he recently confirmed is working on an AI wearable with OpenAI (though we don’t know if they’re glasses, a pin, or something else entirely).

    What’s brewing is a race among Big Tech’s richest companies to create a sleek pair of smart glasses that can do everything your smartphone can — and hopefully something more. Meta’s prototype made two things clear: there is something there, but we’re not “there” yet.

    These devices are a notable departure from the Quest virtual reality headsets Meta has been pushing for years now, and from Apple’s Vision Pro. There’s a lot of similar technology involved, like eye tracking and hand tracking, but they feel completely different to use. VR headsets are bulky, uncomfortable to wear, and make people nauseous from staring at the displays. Sunglasses and eyeglasses, on the other hand, are relatively pleasant to wear, and millions of Americans use them every day.

    To Zuckerberg’s credit, he’s been pushing the eyewear form factor for quite a long time, when it certainly was not popular to do so. It’s long been reported that Meta’s CEO hates that his popular social media apps have to be accessed through Apple’s phones (perhaps leading to the ill-fated Facebook Phone). Now, Meta’s competitors are also dipping their toes into eyewear computing.

    Andrew Bosworth, CTO of Meta and head of Reality Labs, wearing a clear pair of Orion smart glasses. (David Paul Morris/Bloomberg via Getty Images)

    Meta’s early investment here seems to be paying off. Zuckerberg gave a keynote presentation of Orion on Wednesday that we won’t be forgetting anytime soon, filling a room full of skeptical journalists with electricity and excitement. TechCrunch has not demoed Orion yet, but initial reviews have been very positive.

    What Meta offers today is the Ray-Ban Meta: a pair of glasses with cameras, microphones, speakers, sensors, an on-device LLM, and the ability to connect to your phone and the cloud. The Ray-Ban Meta is far simpler than Orion, but relatively affordable at $299 — actually not much more than a regular pair of Ray-Bans. They’re kind of like the Spectacles 3 that Snap released a few years ago, though the Ray-Ban Meta glasses appear more popular.

    Despite the vast differences in price and capabilities, Orion and Ray-Ban Meta are more related than you might think.

    “Orion is really the future, and we ultimately want to go for the full holographic experience. You can think about Ray-Ban Meta as our first step there,” said Li-Chen Miller, a VP of product at Meta who leads its wearables team, in an interview with TechCrunch. “We really need to nail the basic things, like making sure it’s comfortable, people want to wear it, and that people find value in it every day.”

    One of the things Meta is trying to nail with Ray-Ban Meta is AI. Currently, the smart glasses use Meta’s Llama models to answer questions about what you see in front of you, by taking pictures and running them through the AI system alongside a user’s verbal requests. The Ray-Ban Meta’s AI features today are far from perfect: The latency is worse than OpenAI’s natural-feeling Advanced Voice Mode; Meta AI requires very specific prompts to work right; it hallucinates; and it doesn’t have a tight integration with many apps, making it less useful than just picking up my iPhone (perhaps by Apple’s design). But Meta’s updates coming later this year aim to address these issues.

    Li-Chen Miller, VP of product during Meta Connect in 2023. (David Paul Morris/Bloomberg via Getty Images)

    Meta announced it will soon release live AI video processing for its Ray-Bans, meaning the smart glasses will stream live video and verbal requests into one of Llama’s multimodal AI models and produce real-time verbal answers based on that input. The glasses are also getting basic features, like reminders, as well as more app integrations. That should make the whole experience a lot smoother, if it works. Miller says these improvements will filter up to Orion, which runs on the same generative AI systems.

    “Some things make more sense for one form factor than the other, but we’re certainly cross-pollinating,” said Miller.

    Likewise, she says some of Orion’s features may filter down as her team focuses on making the AR glasses more affordable. Orion’s various sensors and eye trackers are not cheap technologies. The problem is that Orion has to get both better and more economical.

    Another challenge is typing. Your smartphone has a keyboard, but your smart glasses won’t. Miller worked on keyboards at Microsoft for nearly 20 years before joining Meta, but she says Orion’s lack of a keyboard is “freeing.” She argues that using smart glasses will be a more natural experience than using a phone. You can simply talk, gesture with your hands, and look at things to navigate Orion, all things that come naturally to most people.

    Another device that was criticized for lacking a keyboard was, ironically, the iPhone. Former Microsoft CEO Steve Ballmer infamously laughed at the iPhone in 2007, saying it wouldn’t appeal to business customers because it didn’t have a physical keyboard. People adapted though, and his comments sound naive more than 15 years later.

    I think making Orion feel natural is definitely more of a goal than a reality at this point. The Verge notes in its hands-on review that windows occasionally filled the entire glasses lens, completely obstructing the user’s view of the world around them. That’s far from natural. To get there, Meta will have to improve its AI, typing, AR, and a long list of other features.

    “For Ray-Ban Meta, we kept it very scoped to a few things, and then it does them really well,” said Miller. “Whereas, when you want to build a new, futuristic computing platform [with Orion], we have to do a lot of things, and do them all very well.”

  • Meta Missed Out on Smartphones. Can Smart Glasses Make Up for It?

    Meta has dominated online social connections for the past 20 years, but it missed out on making the smartphones that primarily delivered those connections. Now, in a multiyear, multibillion-dollar effort to position itself at the forefront of connected hardware, Meta is going all in on computers for your face.

    At its annual Connect developer event today in Menlo Park, California, Meta showed off its new, more affordable Oculus Quest 3S virtual reality headset and its improved, AI-powered Ray-Ban Meta smart glasses. But the headliner was Orion, a prototype pair of holographic display glasses that chief executive Mark Zuckerberg said have been in the works for 10 years.

    Zuckerberg emphasized that the Orion glasses—which are available only to developers for now—aren’t your typical smart display. And he made the case that these kinds of glasses will be so interactive that they’ll usurp the smartphone for many needs.

    “Building this display is different from every other screen you’ve ever used,” Zuckerberg said on stage at Meta Connect. Meta chief technology officer Andrew Bosworth had previously described this tech as “the most advanced thing that we’ve ever produced as a species.”

    The Orion glasses, like a lot of heads-up displays, look like the fever dream of techno-utopians who have been toiling away in a highly secretive place called “Reality Labs” for the past several years. One WIRED reporter noted that the thick black glasses looked “chunky” on Zuckerberg.

    As part of the on-stage demo, Zuckerberg showed how Orion glasses can be used to project multiple virtual displays in front of the wearer, respond quickly to messages, video chat, and play games. In the messages example, Zuckerberg noted that users won’t even have to take out their phones. They’ll navigate these interfaces by talking, tapping their fingers together, or simply looking at virtual objects.

    There will also be a “neural interface” built in that can interpret brain signals, using a wrist-worn device that Meta first teased three years ago. Zuckerberg didn’t elaborate on how any of this will actually work or when a consumer version might materialize. (He also didn’t get into the various privacy complications of connecting this rig and its visual AI to one of the world’s biggest repositories of personal data.)

    He did say that the imagery that appears through the Orion glasses isn’t pass-through technology—where external cameras show wearers the real world—nor is it a display or screen that shows the virtual world. It’s a “new kind of display architecture,” he said, that uses projectors in the arms of the glasses to beam light into waveguides in the lenses, which then reflect it into the wearer’s eyes and create volumetric imagery in front of you. Meta has designed this technology itself, he said.

    The idea is that the images don’t appear as flat, 2D graphics in front of your eyes but that the virtual images now have shape and depth. “The big innovation with Orion is the field of view,” says Anshel Sag, principal analyst at Moor Insights & Strategy, who was in attendance at Meta Connect. “The field of view is 72 degrees, which makes it much more engaging and useful for most applications, whether gaming, social media, or just content consumption. Most headsets are in the 30- to 50-degree range.”
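    To put those numbers in perspective, the width a given field of view covers at a fixed distance follows from basic trigonometry. Here’s a quick Python comparison; the 2-meter viewing distance is illustrative, not a figure from Meta:

    import math

    def span_at_distance(fov_deg: float, distance_m: float) -> float:
        """Width, in meters, that a horizontal FOV covers at a given distance."""
        return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

    # Orion's reported 72-degree FOV versus a typical 40-degree headset.
    for fov in (72, 40):
        print(f"{fov} deg -> {span_at_distance(fov, 2.0):.1f} m wide")
    # 72 deg -> 2.9 m wide; 40 deg -> 1.5 m wide

    Nearly twice the virtual screen width at the same distance is the difference Sag is pointing at.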

  • Meta teases Orion, brain-powered true AR glasses in a tiny package

    At Wednesday’s Meta Connect event, CEO Mark Zuckerberg announced Orion, what he described as “the most advanced glasses the world has ever seen.”

    The glasses, which are notably smaller than Snap’s recently announced Spectacles 5, are true AR. Orion utilizes tiny projectors built into the glasses’ temples to create a heads-up display; think the 2024 version of Google Glass.

    The glasses, which Zuckerberg said were a decade in the making, don’t appear to be too far beyond the concept phase at this point. “These glasses exist, they are awesome, and they are a glimpse of a future that I think will be exciting,” the executive noted during the presentation. He added that the team still has a good bit of “fine-tuning” to do before Meta is ready to turn them into an official consumer product.

    Image Credits: Meta

    Notably, along with the standard voice prompts, Orion will be controlled through a “neural interface.” That arrives by way of Meta’s 2019 acquisition of CTRL-Labs, which makes a wristband that will be compatible with the devices.

    The company is positioning the upcoming glasses as a kind of successor to its current livestreaming product, the Ray-Ban Meta. It notes:

    Yet while Ray-Ban Meta opened up an entirely new category of display-less glasses super-charged by AI, the XR industry has long dreamt of true AR glasses — a product that combines the benefits of a large holographic display and personalized AI assistance in a comfortable, all-day wearable form factor. Orion rises to the challenge.

    There are a lot of claims at the moment, such as “Orion has the largest field of view in the smallest AR glasses form to date,” but it’s far too early for any specifics. That claim can, however, be seen as a dig at the new Spectacles, which are extremely large with a very narrow FOV.

    Meta Orion holographic glasses (we see that TechCrunch logo, Meta 👀). Image Credits: Meta

    “That field of view unlocks truly immersive use cases for Orion, from multitasking windows and big-screen entertainment to life-size holograms of people,” Meta notes, “all digital content that can seamlessly blend with your view of the physical world.”

    One key thing Orion does have in common with the new Spectacles is that it will initially be available for developers only. This is, of course, a common move in this world. Companies treat these announcements as a kind of proof-of-concept to get folks excited to develop for the platform.

    Meta’s Orion holographic glasses, battery pack, and wristband. Image Credits: Meta

    Meta’s Ray-Bans were a bit of a surprise hit for the company, especially in the wake of steady, if slow, growth around the Quest line. If the product hews closely to the demos, it’s hard to accuse Zuckerberg of hyperbole compared with the likes of Snapchat Spectacles on one end and Apple’s Vision Pro on the other.

    The recent addition of Meta AI to the Ray-Bans can also be seen as a stepping stone to more fully realized augmented reality glasses. Features like translation and navigation would be even more powerful with a visual element in play.

    There were, unsurprisingly, bumps along the road to this stage. According to one recent report, building Orion cost in the neighborhood of $10,000 per unit. We know that Meta has gotten into the habit of losing money on Quest headsets, but nothing nearly that astronomical.

    The same report also suggests that Meta will deliver a version of the glasses with a significantly smaller HUD when it ships the wristband, ahead of Orion’s eventual arrival.

    “In the next few years, you can expect to see new devices from us that build on our R&D efforts,” Meta writes. “Orion isn’t just a window into the future — it’s a look at the very real possibilities within reach today. From Ray-Ban Meta glasses to Orion, we’ve seen the good that can come from letting people stay more present and empowered in the physical world, while tapping into all that the digital world has to offer.”

  • Meta Teaches Its Ray-Ban Smart Glasses Some New AI Tricks

    The Ray-Ban Meta glasses are the first real artificial-intelligence wearable success story. In fact, they’re quite good. They’ve got that chic Ray-Ban styling, meaning they don’t look as goofy as some of the bulkier, heavier attempts at mixed-reality face computers. The onboard AI agent can answer questions and even identify what you’re looking at using the embedded cameras. People also love using voice commands to capture photos and videos of whatever is right in front of them without whipping out their phone.

    Soon, Meta’s smart glasses are getting some more of these AI-powered voice features. Meta CEO Mark Zuckerberg announced the newest updates to the smart glasses’ software at his company’s Meta Connect event today.

    “The reality is that most of the time you’re not using smart functionality, so people want to have something on their face that they’re proud of and that looks good and that’s, you know, designed in a really nice way,” Zuckerberg said at Connect. “So they’re great glasses. We keep updating the software and building out the ecosystem, and they keep on getting smarter and capable of more things.”

    The company also used Connect to announce its new Meta Quest 3S, a more budget-friendly version of its mixed-reality headsets. It also unveiled a host of other AI capabilities across its various platforms, with new features being added to its Meta AI and Llama large language models.

    The new Ray-Ban Meta Headliner glasses in Caramel and Wayfarer glasses in Shiny Black. Courtesy of Meta

    As far as the Ray-Bans go, Meta isn’t doing too much to mess with a good thing. The smart spectacles got an infusion of AI tech earlier this year, and now Meta is adding more capabilities to the pile, though the enhancements here are pretty minimal. You can already ask Meta AI a question and hear its responses directly from the speakers embedded in the frames’ temple pieces. Now there are a few new things you can ask or command it to do.

    Probably the most impressive is the ability to set reminders. You can look at something while wearing the glasses and say, “Hey, remind me to buy this book next week,” and the glasses will understand what the book is, then set a reminder. In a week, Meta AI will tell you it’s time to buy that book.
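    Functionally, that reminder stitches together three steps: identify the object in view, parse the deadline, and schedule a nudge. The Python below is a hypothetical sketch of the flow, with a stub standing in for Meta AI’s vision model; it is not Meta’s implementation:

    from datetime import datetime, timedelta

    def identify_object(frame: bytes) -> str:
        return "that book"                    # stand-in for a multimodal vision model

    def set_reminder(command: str, frame: bytes, reminders: list) -> None:
        subject = identify_object(frame)      # resolve "this" from the camera frame
        due = datetime.now() + timedelta(weeks=1)  # crude parse of "next week"
        reminders.append((due, f"Buy {subject}"))

    reminders = []
    set_reminder("remind me to buy this book next week", b"camera-frame", reminders)
    print(reminders)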

    Meta says live transcription services are coming to the glasses soon, meaning people speaking in different languages could see transcribed speech in the moment—or at least in a somewhat timely fashion. It’s not clear exactly how well that will work, given that the Meta glasses’ previous written translation abilities have proven to be hit-or-miss.

    Zuckerberg says Meta is also partnering with the Danish mobile app Be My Eyes to bring a feature to the Ray-Ban Meta glasses that connects blind and low-vision people to volunteers who can view live video and talk the wearer through what’s in front of them.

    “I think that not only is this going to be a pretty awesome experience today, but it’s a glimpse of the type of thing that might be more possible with always-on AI.”

    New frame and lens colors are being added, and customers now have the option of transition lenses that adjust their shading depending on the current level of sunlight.

    Meta hasn’t said exactly when these additional AI features will be coming to its Ray-Bans, except that they will arrive sometime this year. With only three months of 2024 left, that means very soon.

  • What to expect at Meta Connect 2024, including Quest 3S and new AR smart glasses

    Meta Connect 2024 is so close, you can almost taste it.

    Kicking off during the week of Sept. 23, the event should see the social media giant roll out hardware and software goodies to intrigue VR gaming enthusiasts, AI aficionados, and smart glasses devotees. But what, specifically, does Meta have up its sleeve?

    We have a few guesses based on credible reports.

    What to expect at Meta Connect 2024

    Last year, the Meta Quest 3 was announced in early June, but it got its full reveal at Meta Connect 2023.

    The headset boasted a sleeker, more comfortable design, as well as new AR capabilities that made it more appealing than its predecessor. Once again, for Meta Connect 2024, the social media giant is expected to drop a new VR headset, but it’s not necessarily an upgrade over the Quest 3.

    Meta Quest 3S

    Rumor has it that Meta is planning on revealing a cheaper, more budget-friendly version of the Quest 3 called “Quest 3S.”

    Intentionally or accidentally, Meta leaked the Quest 3S in its own Meta Quest Link PC app for Windows, as discovered by a Reddit poster. For the uninitiated, this software lets users connect their Meta-branded VR headsets to a PC, allowing them to access more demanding PCVR games with just the Quest Link cable (which helps users siphon graphics power from their PC’s GPU).

    The leaked image appears to show a headset with the body of the Quest 2 (in that it isn’t as sleek as the Quest 3) but with different cameras on the front.

    According to a leaker on X, Quest 3S will have the following:

    • Qualcomm Snapdragon XR2 Gen 2 chip

    • 1,832 x 1,920-pixel resolution per eye

    • Up to 120Hz refresh rate

    • Quest Touch Plus controllers

    • 4 IR tracking cameras

    • 2 IR illuminators for depth sensing

    • 2 4MP cameras for passthrough

    Regarding price, Meta Quest 3S will reportedly have a starting price of $299. For reference, the starting price of the Quest 3 was $499 when it launched last year, so if the reported price is accurate, you’ll be saving $200 with Quest 3S.

    AR smart glasses

    Last year, Meta unveiled the second-generation Ray-Ban Meta Smart Glasses, which are packed with Meta AI.

    Ray-Ban Meta Smart Glasses. Credit: Joe Maldonado / Mashable

    This time around, according to a report from Business Insider, Meta is planning on releasing a new pair of spectacles that are totally unrelated to Ray-Ban Meta Smart Glasses. Called “Orion” internally, these glasses will focus on augmented reality (AR).

    AR incorporates virtual elements into your real-world environment. Meta’s Quest 3 is capable of AR. For example, it has a “passthrough mode” that lets you see your true surroundings, but at the same time, you’ll have the option to see or interact with virtual objects in your space.

    Ray-Ban Meta Smart Glasses, on the other hand, have zero AR capabilities. They can play music, take pictures, capture videos, take calls — and even let you chat with Meta AI. However, they don’t offer an augmented dimension — but Orion, reportedly, will.

    Meta AI

    Meta AI can be found across a myriad of Meta products, including Instagram, WhatsApp, and even the Ray-Ban Meta Smart Glasses.

    Meta Connect 2024. Credit: Meta

    Last year, Meta introduced Instagram-based “Meta AI Personas,” celebrity-lookalike chatbots that didn’t quite resonate with many people, including Mashable’s own AI reporter Cecily Mauran.

    Based on Meta AI, these chatbots featured the likeness of popular, high-profile people (e.g., Padma Lakshmi and Snoop Dogg) while taking on roles like “Creative Writing Partner,” “Travel Expert,” and more.

    However, this year, they got the boot.

    This doesn’t mean that Meta AI won’t continue to be spotlighted during Connect 2024. We’re expecting lots of AI updates during the livestream.

    Meta Connect 2024 will take place on Wednesday, Sept. 25 at 1 p.m. ET.
