Meta’s latest AI research introduces a prototype technology capable of generating immersive virtual objects using only voice commands.
Recently, Mark Zuckerberg showed off the latest prototype from Meta’s AI team during a live event, part of a series aimed at expanding the company’s vision of the metaverse on Horizon Worlds.
Meta’s CEO claims that the new technology, demonstrated in the video below, can create immersive 3D virtual environments, objects, and sounds from users’ voice commands with the help of a new voice-powered assistant called “Builder Bot.”
As the video demonstrates, users in Meta’s metaverse will be able to create and customize their environments without any 3D modeling or coding, as the virtual assistant translates their voice commands into immersive spaces, objects, or even sounds.
According to Zuckerberg, Builder Bot “enables you to describe a world and then it will generate aspects of that world for you,” adding that “as we advance this technology further, you’re going to be able to create nuanced worlds to explore and share experiences with others with just your voice.”
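To make the described pipeline more concrete, here is a minimal, purely illustrative sketch of how a voice-driven scene builder could work: a transcribed command is matched against an asset catalog and applied to a scene. None of this reflects Meta’s actual Builder Bot implementation; the keyword matching, the `ENVIRONMENTS` and `PROPS` catalog, and the `apply_command` helper are hypothetical stand-ins for the speech recognition and generative models the prototype presumably relies on.

```python
# Hypothetical sketch of a voice-command-to-scene pipeline.
# Not Meta's implementation; for illustration only.
from dataclasses import dataclass, field


@dataclass
class SceneObject:
    name: str
    position: tuple[float, float, float] = (0.0, 0.0, 0.0)


@dataclass
class Scene:
    environment: str = "empty"
    objects: list[SceneObject] = field(default_factory=list)


# Hypothetical asset catalog: in a real system these would be generated or
# retrieved 3D assets, not hard-coded names.
ENVIRONMENTS = {"beach", "park", "island"}
PROPS = {"palm tree", "picnic table", "clouds"}


def apply_command(scene: Scene, transcript: str) -> Scene:
    """Update the scene based on an already transcribed voice command."""
    text = transcript.lower()
    for env in ENVIRONMENTS:
        if env in text:
            scene.environment = env
    for prop in PROPS:
        if prop in text:
            scene.objects.append(SceneObject(name=prop))
    return scene


if __name__ == "__main__":
    scene = Scene()
    # Commands resembling the ones spoken in the demo.
    for command in ["let's go to the beach", "add some clouds", "add a picnic table"]:
        scene = apply_command(scene, command)
    print(scene)
```

In practice, a system like the one shown in the demo would replace the keyword lookup with speech recognition, conversational AI, and generative models that produce the 3D content itself, but the overall flow of turning an utterance into scene edits is the same.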
Although the prototype is still at a very early stage, it shows the company’s progress in shaping its vision of the metaverse.
To provide more context, Builder Bot is part of a larger AI project called Project CAIRaoke, which focuses on developing the conversational AI necessary to create these virtual worlds.
Meta’s researchers are not the first to work on generating media from text: last year, for example, OpenAI revealed a neural network capable of generating images from text descriptions.
The advancements made by Project CAIRaoke will profoundly affect the company’s pivot from social networking to metaverse technology, as its CEO describes the department’s work as central to Meta’s future.
“In the metaverse, we’re going to need AI that is built around helping people navigate virtual worlds as well as our physical world with augmented reality,” Zuckerberg said. “When we have glasses on our faces, that will be the first time that an AI system will be able to really see the world from our perspective: See what we see, hear what we hear, and more.”