Nvidia infuses life into virtual characters via generative AI
The NVIDIA Avatar Cloud Engine (ACE) demo provided a preview of next-generation NPCs (non-player characters) that would be less labour intensive for game developers to build.
Highlights
- Nvidia focuses on improving technology in gaming
- A preview of enhanced NPCs in games was shown in a video at Computex 2023
As generative AI models gain popularity and adoption, Nvidia is next in line to showcase its own, the chipmaker said at the Computex 2023 event. The company unveiled ACE at the event, which promises to make the creation of NPCs less labour intensive for game developers.
To improve the overall process, ACE will make NPCs more intelligent by generating dialogue that reacts to the player's actions, giving characters richer personalities.
“This is the future of video games,” Nvidia CEO, Jensen Huang, told the audience. “Not only will AI contribute to the rendering and the synthesis of the environment, AI will also animate the characters.”
Demo shown by Nvidia of a non-player character
The demo showed an NPC behind the bar conversing with the player inside an AI-generated representation of a ramen eatery. The discussion between player Kai and the MetaHuman NPC Jin is shown in the Unreal Engine 5 video. The video's setting, a ramen shop, was created using RTX Direct Illumination (RTXDI) and DLSS 3.
The engine was created by Nvidia in collaboration with Convai, a Melbourne-based company that specialises in automated intelligent conversation systems. It incorporates Nvidia services NeMo, Riva, and Omniverse Audio2Face to offer customisation of NPC personalities and backstories.
Moreover, the service supports live speech conversations with NPCs and can produce facial animations from audio input.
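The pipeline the article describes (a language model for dialogue, a speech service, and audio-driven facial animation) can be sketched in broad strokes. None of the class or function names below come from Nvidia's actual SDKs; they are hypothetical stand-ins for the roles NeMo, Riva, and Audio2Face play, with placeholder logic, assuming a simple turn-based flow.

```python
# Hypothetical sketch of an ACE-style NPC pipeline. All names here are
# illustrative stand-ins, not real Nvidia APIs.
from dataclasses import dataclass, field

@dataclass
class NPCPersona:
    """Backstory and personality, akin to the customisation ACE offers."""
    name: str
    backstory: str
    history: list = field(default_factory=list)

class DialogueModel:
    """Stand-in for an LLM dialogue service (the role NeMo plays)."""
    def reply(self, persona: NPCPersona, player_line: str) -> str:
        persona.history.append(("player", player_line))
        text = f"{persona.name}: I hear you said '{player_line}'."
        persona.history.append(("npc", text))
        return text

class SpeechSynth:
    """Stand-in for a speech service (the role Riva plays); emits fake samples."""
    def synthesise(self, text: str) -> list:
        return [ord(c) % 16 for c in text]  # placeholder "waveform"

class FaceAnimator:
    """Stand-in for Audio2Face; maps audio to per-frame blendshape weights."""
    def animate(self, audio: list) -> list:
        return [sample / 15.0 for sample in audio]

def npc_turn(persona, dialogue, tts, face, player_line):
    """One conversational turn: text reply, synthesised audio, face frames."""
    text = dialogue.reply(persona, player_line)
    audio = tts.synthesise(text)
    frames = face.animate(audio)
    return text, frames

jin = NPCPersona("Jin", "Runs a ramen shop in the city.")
text, frames = npc_turn(jin, DialogueModel(), SpeechSynth(), FaceAnimator(),
                        "What's good tonight?")
```

The point of the sketch is the separation of concerns: the dialogue model owns the persona and conversation history, while speech and animation are pure transforms downstream, which is why the audio input can drive facial animation independently of how the text was generated.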
Nvidia focuses on infusing characters with AI, says CEO
On the development, Huang said, "Because this character has been infused with artificial intelligence and large language models, it can understand your meaning, and interact with you in a really reasonable way." He added, "All of the facial animation is completely done by the AI. We have made it possible for all kinds of characters to be generated."
Nvidia's generative AI capabilities are already being used by game creators, as shown in the upcoming S.T.A.L.K.E.R. 2: Heart of Chornobyl from GSC Game World. Another independent game developer, Fallen Leaf, used Audio2Face to animate the faces of the characters in its upcoming sci-fi thriller Fort Solis.