Several pieces of evidence suggest that AI will form the backbone of the metaverse. AI's role in the metaverse spans several related techniques, including computer vision, natural language processing, blockchain and digital twins.
In February, Meta CEO Mark Zuckerberg showcased a demo of what the metaverse could look like at the company's first virtual event, Inside the Lab. He said the company was working on a new range of generative AI models that would let users generate a virtual world of their own simply by describing it. Zuckerberg announced a slew of upcoming projects, including Project CAIRaoke, "a fully end-to-end neural model for building on-device assistants," intended to help users communicate more naturally with voice assistants. Meta was also working on a universal speech translator capable of direct speech-to-speech translation across all languages. A few months later, Meta made good on its promise. Meta isn't the only tech company with skin in the game, however; companies like NVIDIA have also released AI models aimed at a richer metaverse experience.