Meta Connect 24: Lots of AI, Ray-Ban Glasses and the Quest 3S

When it comes to AI, Meta is at least keeping pace: new functions for Meta AI and the Ray-Ban glasses. A slimmed-down Quest 3S is also coming.

A video screen at Meta Connect 24.


(Image: emw)


A long street leads to the main entrance of Meta Connect. The company's headquarters in Menlo Park is huge, one two-story building after the next, the entrances bearing numbers: 15, 16, ...22... A bit of color on the façades is meant to dress them up, which only partly works from the outside. Inside, however, a large, colorful, hip campus opens up. Mark Zuckerberg's keynote takes place in the adjacent building designed by star architect Frank Gehry – with lots of glass and a park on the roof. At his in-house trade fair, Zuckerberg presents the company's most important innovations, and they set the tone for the year: here he announced the renaming to Meta, here the year of efficiency, and here the change of course – everything toward artificial intelligence.

Everything? Not quite. Meta is still working on the Metaverse: the Quest family of VR headsets is being expanded to include a slim and affordable version. Above all, however, the Ray-Ban smart glasses will benefit from AI, and Meta AI will become significantly more extensive – including the ability to analyze images and to speak. Llama arrives in version 3.2 and understands text as well as images. Surprisingly, Zuckerberg also presents a prototype of holographic glasses, which a team has been working on for ten years. The glasses are capable of holograms, eye tracking and speech.

The presentation of a new and significantly cheaper version of the VR headset, the Quest 3S, shows that there is more to Meta than just AI. Rumors had already been circulating. It is now clear that it will launch at 329.99 euros with 128 GB; a 256 GB version costs 439.99 euros. The headset can be pre-ordered from September 25 and hits the shelves on October 15. According to Meta, the Quest 3S has the "same high-resolution, color mixed reality capabilities and fast performance as the Quest 3". It is positioned as an entry-level device, but also as particularly suitable for families or for people who already own a Quest 2 and want an affordable upgrade. The large Quest 3 (512 GB) will also become cheaper, costing 549.99 euros in the future.

Eva-Maria Weiß tries her hand at Just Dance on the Meta Quest 3S.

(Image: emw)

For the wearer, the Quest 3S feels very similar to the Quest 3; it just looks a little different, with the cameras arranged differently on the front. Just Dance, Minecraft, operating the browser – everything works intuitively and is fun.

For comparison: an Apple Vision Pro costs around 4,000 euros. When it was announced, Zuckerberg said he was looking forward to the competition – it would expand the market, which would also benefit his headsets. He also says the headset can replace a computer.

The Quest 3 and 3S can of course also use Meta AI, the AI assistant that is not available in the EU.

Horizon Worlds is still being developed, so the Metaverse is still there. Fans are sitting in the audience, clapping particularly loudly when the first images are shown. The avatars have been improved and in future it will be possible to watch YouTube in the virtual world.

In Germany, we still have to be patient. Meta AI – that is, all of Meta's AI applications such as the chatbots – still has no launch date in the EU. Zuckerberg has already complained that the EU urgently needs to create uniform regulation. This is not about the AI Act, however, but about data protection: currently, Meta is not allowed to use data from people in the EU to train its AI models. The company also repeatedly runs up against the Digital Services Act and the Digital Markets Act, which are meant to create better conditions for consumers – and US tech companies in particular are no fans of regulation. Nevertheless, according to Zuckerberg, Meta AI is the most widely used AI application in the world – "maybe now, maybe by the end of the year at the latest."

For people in the USA, Canada and Australia, among other countries, Meta AI will gain eyes and a mouth next month. The AI will use the latter to speak via WhatsApp, Instagram, Facebook and Messenger. Zuckerberg considers speech much more natural than typing. Well-known personalities are available as voices, including Judi Dench, Kristen Bell and Keegan-Michael Key.

Initially only in the USA, Meta AI will also be able to see: the AI can analyze photos in the chat, similar to OpenAI's ChatGPT or Google Gemini. Meta AI can then also edit photos within the chat, for example removing an object. More playful edits are possible too, such as placing a cow on a surfboard and putting a cap on it. Google can also remove objects, but only with the right Android device and in the Photos app.

Meta AI turns everyone into a superhero at the drop of a hat.

(Image: emw)

Other new features include automatic lip-sync translations that creators can use for their reels. Creators also get extended options for their own AI assistants, which they can build via AI Studio.

Meta also shows users AI-generated content that Meta AI thinks matches their interests. This is initially a test.

Llama 3.2 is the next version of Meta's open-source model. It is Meta's first vision model, as the company emphasizes, and comes in 11B and 90B variants. Two small models (1B and 3B) for mobile devices – and also for the glasses – are being released as well. According to Meta, they are suitable for Qualcomm and MediaTek hardware and optimized for Arm processors. Meta is also presenting the first Llama Stack distributions. Llama 3.2 is available via llama.com and Hugging Face, as well as through numerous partner platforms such as AWS, AMD, Dell, Google Cloud, Microsoft Azure and more.

Companies can support their presence in WhatsApp and Messenger with AI. There are also new AI tools for advertisers.

In the future, Meta's Ray-Ban glasses will be able to create reminders on voice command and read QR codes. "Meta, remind me to buy shampoo tomorrow." The glasses will say it the following day, and have also created a note in the corresponding app. A practical feature that also works for unpleasant tasks – just don't throw the glasses away because they say out loud what you wanted to suppress. This year, the integrated Meta AI should also be able to see what the wearer sees in real time and process it. The other major AI providers have announced such a function as well, but have not yet delivered it; it remains to be seen who will win the race. Zuckerberg says the glasses are a whole new category of device, especially now thanks to AI.

Eva-Maria Weiß wears Ray-Ban glasses that are too big. The glasses are available in two sizes, but only the large version was available for testing at Connect.

(Image: emw)

There will also be real-time translation when talking to someone in another language, initially only for English, French, Italian and Spanish. Spotify, Amazon Music, Audible and iHeart will be integrated so that they can be controlled by voice command. New Transitions lenses – which tint in the sun – are also available.

Transparency note: The author was invited to Meta Connect in Menlo Park by Meta. Meta covered the travel costs. There were no specifications regarding the type and scope of our reporting.

(emw)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.