Meta AI can now talk, understand images and dub videos
At Meta’s 11th annual Connect conference, CEO Mark Zuckerberg showcased innovations in mixed reality, AI, and wearables.
Meta AI, the company’s AI assistant, was a major focus, with Zuckerberg revealing that it is used by 400 million people every month, and 185 million on a weekly basis.
To make interactions with Meta AI more natural, the company is introducing voice input for the assistant on Facebook, Messenger, WhatsApp, and Instagram, initially in Australia, Canada, New Zealand, and the USA, and only in English. Well-known voices like Awkwafina, Dame Judi Dench, and John Cena will be used to give the AI assistant more personality.
Meta is also experimenting with automatic video dubbing and lip-syncing for Reels on Instagram and Facebook to make content in different languages accessible to a wider audience. This could impact the business models of dedicated dubbing startups like HeyGen.
The AI-supported editing tools are also being expanded, allowing users to ask questions about uploaded photos and add, remove, or change elements in real photos using voice commands, thanks to the vision capabilities of the new Llama 3.2 models.
Meta has also released smaller models with 1B and 3B parameters that are optimized for use on devices like smartphones or future AR headsets.
Overall, Meta appears to be catching up with commercial competitors such as OpenAI, with the performance gap to models like GPT-4 shrinking. Multimodality and the voice interface are two features that end users can actually benefit from in everyday life.
Ray-Ban Meta Smart Glasses become a city guide
The Ray-Ban Meta Smart Glasses are also receiving updates, including setting reminders by voice command, scanning QR codes and phone numbers, and real-time personal city tours thanks to extended Meta AI support. The glasses will soon be able to translate Spanish, French, and Italian into English in real time.
Orion: Meta’s first AR headset prototype
One of the highlights of the conference was the unveiling of Orion, Meta’s first AR headset prototype, which the company says it has been working on for ten years.
The wireless headset weighs less than 100 grams and is equipped with ten customized silicon chips and bright displays with a wide field of view (70 degrees).
It features voice control, hand and eye tracking to navigate through the user interface, and a new, discreet, and “socially acceptable” interaction option via an EMG wristband.
Although Orion is only a prototype, Meta emphasizes that it could become a consumer product, showing how far development has already progressed. Even if this headset never reaches the market, the work could accelerate development of an even better successor. For Meta CEO Mark Zuckerberg, it was particularly surprising that the AI components became market-ready earlier than the actual hardware, the opposite of what he had predicted a few years ago.