Meta Shares New Hints at the Next Stage of its Metaverse Development

Team IMTools

While Meta’s grand metaverse vision is still too far off in the distance for most people to make out just yet, the company is making progress, advancing the VR tools that will eventually form the foundation of its next stage.

Meta shared some new VR elements earlier this week, including in-car VR usage and the expansion of members-only Horizon Worlds spaces. And today, Meta chief Mark Zuckerberg has shared another significant step: users can now customize their home environment within Meta Quest, setting their own preferred destination as their home base.

Quest VR home screen samples

That, much like in-car VR and Horizon groups, may not seem like a big deal in itself. But it’s the expanded potential of these elements that’s of most interest.

As noted by Zuckerberg (via his Instagram Broadcast Channel):

“With AI, soon you’ll be able to generate anything you want [for your VR home base].”

This is the next level of Meta’s generative AI plans – not only is Meta exploring how to integrate generative AI tools into the functions you already use on Facebook and IG, but it’s also looking to simplify VR environment creation, which could be a key step in building more interest in the experience.

Thus far, Meta’s VR environments have left a lot to be desired, while the requirements for building VR experiences are so technically demanding that only high-end developers with both the experience and the resources can create them, limiting personalization and interactivity.

But what if you could simply speak things into existence in VR worlds? What if you could say ‘I want a basketball court in space, with flaming basketballs to play with’, and Meta’s generative AI tools could build that experience for you, with no coding knowledge required?

That could open up a range of new possibilities, which could be the spark that Meta needs to generate more interest in its VR offerings.

Again, we’re not at that stage just yet, but with the emergence of generative AI tech, Meta is now working towards this, building systems that will facilitate more creative, interactive VR visions, which could lead to that next-level experience.

On a related front, Meta also shared some new developer insights into its generative AI work at its AI Infra @ Scale event today, where various Meta engineers and internal experts provided new overviews of projects like its Research SuperCluster for large-scale development, its new data center to support AI tools, and its first-generation, custom silicon chip, which Meta has designed to power its AI recommendation systems.

Mark Zuckerberg holding a Meta AI chip

Those AI systems will facilitate algorithmic improvements, as well as the aforementioned generative tools in its main apps. But they’re also being built with the next stage of VR in mind, which could fast-track creation of a new, personalized, customizable metaverse experience.

Interest in VR has ebbed and flowed over the years, as the technology has advanced, giving us amazing new glimpses of what could be. But those flashes have generally been short-lived, with VR still very much an in-development space.

But with more kids interacting within game worlds like Fortnite and Roblox, and learning to engage via avatars in what could already be considered self-contained metaverse experiences, it stands to reason that this could well be the future of digital connection. If Meta can provide simplified, streamlined VR creation tools, that could be a huge step in the process.

It’s not going to happen overnight, but Meta is on track to invest $16 billion in its metaverse projects this year alone.

That money’s not just vanishing into thin air, and soon, we may get a clearer view of the metaverse concept that Zuck sees in his head.


