Meet Violet, an AI-powered customer service assistant ready to take your order.
Unveiled this week at GTC, Violet is a cloud-based avatar that represents the latest evolution in avatar development through NVIDIA Omniverse Avatar Cloud Engine (ACE). ACE is a suite of cloud-native AI microservices that makes it easier to build and deploy intelligent virtual assistants and digital humans at scale.
To animate interactive avatars like Violet, developers need to ensure the 3D character can see, hear, understand and communicate with people. But bringing these avatars to life can be incredibly challenging, as traditional methods typically require expensive equipment, specific expertise and time-consuming workflows.
The Violet demo showcases how Omniverse ACE eases avatar development, delivering all the AI building blocks necessary to create, customize and deploy interactive avatars. Whether taking restaurant orders or answering questions about the universe, these AI assistants are easily customizable for virtually any industry, and can help organizations enhance existing workflows and unlock new business opportunities.
Watch the video below to see Violet interact with users, respond to speech prompts and make intelligent recommendations:
How Omniverse ACE Brings Violet to Life
The demo showcases Violet as a fully rigged avatar with basic animation. To create Violet, NVIDIA’s creative team used the company’s Unified Compute Framework (UCF), a fully accelerated framework that enables developers to combine optimized and accelerated microservices into real-time AI applications. UCF helped the team build a graph of microservices for Violet that were deployed in the cloud.
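UCF has its own tooling for describing and deploying these service graphs, so the snippet below is only a rough, hypothetical sketch of the idea in plain Python: stand-in functions are chained the way containerized ASR, dialog, speech-synthesis and animation microservices would be in a real deployment. None of the names here come from UCF or ACE.

```python
# Illustrative sketch only: a microservice graph modeled as chained stages.
# Every name and payload here is hypothetical, not part of UCF or ACE.
from typing import Callable, Dict, List, Tuple

Payload = Dict[str, object]


class AvatarPipeline:
    """Tiny stand-in for a microservice graph: named stages run in order."""

    def __init__(self) -> None:
        self.stages: List[Tuple[str, Callable[[Payload], Payload]]] = []

    def add_stage(self, name: str, fn: Callable[[Payload], Payload]) -> "AvatarPipeline":
        self.stages.append((name, fn))
        return self

    def run(self, payload: Payload) -> Payload:
        # Each stage consumes the previous stage's output, like chained services.
        for name, fn in self.stages:
            payload = fn(payload)
            print(f"[{name}] produced keys: {sorted(payload)}")
        return payload


# Stub "microservices": a real deployment would call ASR, dialog, TTS and
# animation services over the network instead of local functions.
def speech_to_text(msg: Payload) -> Payload:
    return {**msg, "text": "I'd like a veggie burger, please."}


def dialog_manager(msg: Payload) -> Payload:
    return {**msg, "reply": "One veggie burger coming up. Anything to drink?"}


def text_to_speech(msg: Payload) -> Payload:
    return {**msg, "audio": b"<synthesized-waveform>"}


def face_animation(msg: Payload) -> Payload:
    return {**msg, "blendshapes": [0.1, 0.7, 0.3]}  # placeholder facial weights


if __name__ == "__main__":
    pipeline = (
        AvatarPipeline()
        .add_stage("asr", speech_to_text)
        .add_stage("dialog", dialog_manager)
        .add_stage("tts", text_to_speech)
        .add_stage("animation", face_animation)
    )
    pipeline.run({"audio_in": b"<user-microphone-frames>"})
```

In a production pipeline each stage would be a separately deployed and scaled network service rather than a local function, which is what makes the cloud-native microservice approach attractive for avatars that must run at scale.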
Omniverse ACE powers the backend of interactive avatars, essentially acting as Violet’s brain. Additionally, two reference applications are built on ACE: NVIDIA Tokkio and NVIDIA Maxine.
Violet was developed using the Tokkio application workflow, which enables interactive avatars to see, perceive, converse intelligently and provide recommendations to enhance customer service, both online and in places like restaurants and stores.
NVIDIA Maxine delivers a suite of GPU-accelerated AI software development kits and cloud-native microservices for deploying AI features to enhance real-time video communications. Maxine integrates the NVIDIA Riva SDK’s real-time automatic speech recognition and text-to-speech capabilities with real-time “live portrait” photo animation and eye contact features, which enable better communication and understanding.
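Maxine and Riva ship these capabilities as GPU-accelerated SDKs and microservices; purely as a hypothetical illustration of how such features might sit in a per-frame video loop, the sketch below chains stand-in functions for streaming transcription, eye-contact correction and live-portrait animation. The helpers are invented for the example and are not Maxine or Riva APIs.

```python
# Hypothetical per-frame loop for an enhanced video call: captions from ASR,
# plus eye-contact correction and "live portrait" animation on each frame.
# All functions below are stand-ins invented for this sketch.
from dataclasses import dataclass
from typing import Iterable, Iterator, Tuple


@dataclass
class Frame:
    pixels: bytes        # raw video frame (placeholder)
    audio_chunk: bytes   # audio captured alongside this frame


def transcribe(audio_chunk: bytes) -> str:
    # Stand-in for streaming speech recognition (Riva provides this in practice).
    return "partial transcript..."


def correct_eye_contact(pixels: bytes) -> bytes:
    # Stand-in for a gaze-redirection / eye-contact model.
    return pixels


def animate_portrait(reference_photo: bytes, pixels: bytes) -> bytes:
    # Stand-in for driving a still photo with live motion ("live portrait").
    return pixels


def process_call(frames: Iterable[Frame], reference_photo: bytes) -> Iterator[Tuple[bytes, str]]:
    for frame in frames:
        caption = transcribe(frame.audio_chunk)
        enhanced = animate_portrait(reference_photo, correct_eye_contact(frame.pixels))
        yield enhanced, caption


if __name__ == "__main__":
    fake_frames = [Frame(pixels=b"<frame>", audio_chunk=b"<audio>") for _ in range(3)]
    for enhanced, caption in process_call(fake_frames, reference_photo=b"<photo>"):
        print(len(enhanced), caption)
```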
Latest Microservices Expand Possibilities for Avatars
The demo with Violet highlights how developers of digital humans and virtual assistants can use Omniverse ACE to accelerate their avatar development workflows. Omniverse ACE also delivers microservices that enable developers to access the best NVIDIA AI technology, with no coding required.
Some of the latest microservices include:
- Animation AI: Omniverse Audio2Face simplifies animation of a 3D character to match any voice-over track, helping users animate characters for games, films or real-time digital assistants (a rough sketch of this audio-to-animation idea follows this list).
- Conversational AI: Includes the NVIDIA Riva SDK for speech AI and the NVIDIA NeMo Megatron framework for natural language processing, allowing developers to quickly build and deploy cutting-edge applications that deliver high-accuracy, expressive voices and respond in real time.
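Audio2Face’s actual network and interface aren’t reproduced here; the sketch below is only a loose, hypothetical illustration of the audio-to-animation mapping it performs, turning a voice track into per-frame facial pose weights with a trivial loudness heuristic in place of a learned model. All names and values are made up.

```python
# Loose illustration of the audio-to-animation idea: map a voice track to one
# facial-pose dict per animation frame. The "model" is a toy loudness
# heuristic, not the Audio2Face network.
import math
from typing import Dict, List

FPS = 30              # animation frame rate assumed for this sketch
SAMPLE_RATE = 16_000  # audio sample rate assumed for this sketch


def fake_voice_track(seconds: float) -> List[float]:
    # Synthetic sine-wave samples standing in for a recorded voice-over.
    return [math.sin(2 * math.pi * 220 * t / SAMPLE_RATE)
            for t in range(int(seconds * SAMPLE_RATE))]


def audio_to_blendshapes(samples: List[float]) -> List[Dict[str, float]]:
    """Return one facial-pose dict per animation frame of the input audio."""
    hop = SAMPLE_RATE // FPS
    frames: List[Dict[str, float]] = []
    for start in range(0, len(samples) - hop, hop):
        window = samples[start:start + hop]
        energy = sum(abs(s) for s in window) / len(window)
        # Louder audio -> wider mouth opening; a real model would predict many
        # correlated blendshape channels from learned speech features.
        frames.append({"jaw_open": min(1.0, energy * 2.0), "lips_pucker": 0.0})
    return frames


if __name__ == "__main__":
    track = fake_voice_track(seconds=1.0)
    poses = audio_to_blendshapes(track)
    print(f"{len(poses)} frames, first pose: {poses[0]}")
```

A real system would predict many correlated facial channels (lips, jaw, brows) from learned speech features, but the input/output shape (audio in, one pose per animation frame out) is the same.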
AI Avatars Deliver New Transformations Across Industries
The AI avatars that ACE enables will enhance interactive experiences in industries such as gaming, entertainment, transportation and hospitality.
Leading professional-services company Deloitte has worked with NVIDIA to help enterprises deploy transformative applications. At GTC, Deloitte announced that new hybrid-cloud offerings for NVIDIA AI and NVIDIA Omniverse services and platforms, including Omniverse ACE, will be added to the existing Deloitte Center for AI Computing.
“Cloud-based AI models and services are opening up new ways for digital humans to make people feel more connected, and today’s interaction with Violet in the NVIDIA GTC keynote shows a glimpse into the future of AI-powered avatars,” said Vladimir Mastilović, vice president of digital humans technology at Epic Games. “We are delighted to see NVIDIA Omniverse ACE using MetaHumans in Unreal Engine 5 to make it even easier to deploy engaging high-fidelity 3D avatars.”
NVIDIA Omniverse ACE will be available to early-access partners starting later this year, along with the Tokkio reference application for simplified customer-service avatar implementation.
Learn more about Omniverse ACE by joining this session at GTC, and explore all the technologies that go into the creation and animation of realistic, interactive digital humans.
Customers can request the hands-on, web-based Tokkio demo.
Developers and partners can sign up to be notified when ACE is available.
And catch up on the latest announcements from the GTC keynote by NVIDIA founder and CEO Jensen Huang: