Omniverse, Nvidia Corp.’s hyper-realistic real-time 3D graphics collaboration and simulation platform, just got better with a number of advances, including increased performance and new tools for enterprise users and creators.
Nvidia Omniverse acts as a “metaverse” platform that allows artists, engineers, creators and others to collaborate remotely on large-scale virtual worlds, recreating and simulating anything they can imagine, such as cars, airplanes, buildings and factories. Using Omniverse Enterprise, it’s possible to fully simulate factory floors with “digital twins,” enabling an industrial metaverse in which simulated objects move and react with real-world physics.
Today, in a special address at CES 2023, Nvidia announced major updates to Omniverse Enterprise, including support for the Nvidia Ada Lovelace architecture found in the new GeForce RTX 4090 and 4080 graphics cards and the newest data center-scale OVX graphics computing systems.
The Omniverse ecosystem has also expanded to include new connectors that will allow artists and designers to connect more software, such as Adobe Substance 3D Painter, Autodesk Alias, PTC Creo and Kitware’s ParaView.
The company also announced the general availability of Omniverse DeepSearch, an AI-powered service that allows teams to sift quickly through large, untagged databases of 3D models using natural language or 2D reference images in order to find the perfect assets. Using this tool, it will be easier to surface and discover the models that a team needs without having to spend hours poring over a catalog a page at a time using a simple text search.
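Nvidia hasn’t detailed DeepSearch’s internals, but the general technique behind natural-language search over untagged asset libraries is embedding similarity: an encoder (for example, a CLIP-style model) maps each asset’s thumbnail and the user’s text query into a shared vector space, and search becomes a nearest-neighbor lookup. The sketch below illustrates that idea with toy, hypothetical asset names and hand-picked vectors; it is not Nvidia’s API.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical index: in a real system these vectors would come from an
# image/text encoder run over each untagged 3D asset's thumbnail.
asset_index = {
    "asset_001.usd": [0.9, 0.1, 0.0],
    "asset_002.usd": [0.1, 0.8, 0.2],
    "asset_003.usd": [0.2, 0.2, 0.9],
}

def search(query_vec, index, top_k=2):
    """Return the top_k asset names most similar to the query embedding."""
    ranked = sorted(index.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# A query like "red sports car" would be encoded to a vector; here we
# just supply one directly.
print(search([0.85, 0.15, 0.05], asset_index))
```

Because the same encoder can embed both text and images, the identical lookup serves natural-language queries and 2D reference-image queries, which matches the two search modes DeepSearch exposes.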
Omniverse core components also received updates, including the Omniverse Kit software development kit, which now makes it easier to quickly build advanced tools and applications for the platform. Omniverse Create, a reference application for creating large-scale 3D worlds, now includes DLSS 3 and multi-viewport support, making it easier to create large, complex assets. And Omniverse View, another reference application for reviewing 3D scenes, has received additions such as markup, annotation and simple navigation to make presentations more interactive.
Omniverse Nucleus, a database server and collaboration engine launched with Omniverse Enterprise, has been updated with improved information technology tools, such as expanded version control, atomic checkpoints and improved large-file transfers for cloud, on-premises and hybrid workflows.
Zaha Hadid Architects, an architectural firm known for a variety of distinctive cultural building designs, has been putting Omniverse to use in accelerating its own design and development workflows. It’s also using the platform to develop its own custom tools for designing and building out 3D models of buildings in metaverse-style applications.
“We are working with Nvidia to incorporate Omniverse as the connective infrastructure of our tech stack,” said Shajay Bhooshan, associate director at Zaha Hadid Architects. “Our goal is to retain design intent across the various project stages and improve productivity.”
Automaker Mercedes-Benz similarly has been using Omniverse Enterprise to enable what’s known as the “industrial metaverse” at its manufacturing facilities worldwide. With Omniverse, the company can use full-fidelity digital twins in its production environments to allow globally dispersed teams to collaborate in real time, modify assembly and production lines, and see how plans will affect operations, all without disrupting the current site.
Since the digital twins match the real-world factory floors as closely as possible and simulate them precisely, designers and engineers can be certain that the changes to the twins will reflect real-world situations. As a result, once they’ve completed the simulations, they can better predict how a factory needs to be changed before committing millions of dollars or hours of work to change the assembly line.
The creator edition of Omniverse also received updates for animators, creators and developers, delivering AI-powered tools, including generative capabilities added to the Omniverse Launcher. As a result, users will get access to the tools the moment they power up the platform.
The new tools include Omniverse Audio2Face, an artificial intelligence-enabled tool that automatically generates realistic facial expressions from an attached audio file. That means creators don’t need to animate facial expressions frame by frame. Instead, they can let an AI automate the work for them, although they may need to go in and tweak the results a little afterward.
In addition to Audio2Face, Audio2Gesture and Audio2Emotion have received performance updates and are now easier to integrate. Audio2Gesture creates realistic upper-body movements, and Audio2Emotion causes a character to express realistic emotions, such as happiness or sadness, based on the sound of the voice.
“The demand for 3D skills is skyrocketing, but learning 3D can be pretty scary to some, and definitely time consuming,” said Jae Solina, a popular creator from YouTube known as JSFilmz. “But these new platform developments not only let creatives and technical professionals continue to work in their favorite 3D tools but also supercharge their craft and even use AI to assist them in their workflows.”
These AI-assisted tools are in addition to Nvidia Canvas, added in June 2021, which allows artists to generate entire landscapes and expand on them with simple brushstrokes. Soon, all RTX users will be able to download Canvas and create panoramic environments and AI-generated images with a few button presses.