
Nvidia GeForce RTX ACE and other AI tools hands-on impressions: How the future of gaming looks

Ever since the introduction of DLSS back in 2018, Nvidia has steadily expanded its presence in the AI space, adding features such as Reflex and Broadcast over the years. While the early days of DLSS were plagued by issues such as artifacting, across its multiple iterations it has become a near-essential feature for balancing ray-traced visual fidelity with the high framerates players desire.

However, looking at the future of AI in video games, Nvidia seems to have many more ideas in the pipeline beyond DLSS and Reflex. I was recently invited to try out a few of these upcoming features at the Nvidia GeForce RTX AI PC Showcase.

Hosted by John Gillooly, Technical Product Marketing Manager, Asia Pacific South at Nvidia, the event laid out the company's vision for the future of not just gaming but the entire entertainment industry, and offered hands-on time with these tools.


NVIDIA ACE might just be the way to interact with NPCs in the future

Nvidia introduced ACE back in January 2024. Avatar Cloud Engine, or ACE, is a suite of technologies that uses generative AI to create interactable digital humans, or NPCs. To put it simply, the technology replaces the traditional dialogue tree with a more natural conversation: you can ask the in-game character anything, and they will respond accordingly, albeit while staying within the game's lore.

Nvidia showcased two different implementations of ACE: one in the upcoming game Mecha BREAK and the other in the Legends tech demo developed by Perfect World.

The first example, in Mecha BREAK, uses the GPU's tensor cores to run the entire AI model locally, which does provide faster interactions with minimal latency, but with rather limited dialogue options.

While I could easily ask the in-game character to swap my equipped mech for a different one or to elaborate on a mech's stats, if I asked about anything outside the game, the response was more often than not "Sorry, I couldn't understand that."

Furthermore, the demo crashed multiple times. Even accounting for the very early, unoptimized build, that left me concerned about performance, especially since the demo systems were all running RTX 4080s or 4090s.

The second ACE demo was the Legends tech demo by Perfect World. Unlike Mecha BREAK, it relies on cloud computing and ChatGPT to answer players, albeit with some intentional guardrails to keep the character's responses lore-accurate.

The demo featured a character from a medieval high-fantasy setting who doesn't know about modern technologies such as cars, but can guide you on which weapon to equip and where to use it. Although the responses took a few seconds, they were much more natural and interactive than in the previous demo.

Yun Ni in Perfect World's Legends got annoyed when I asked her about RTX 5090 leaks (Image via Nvidia)

After trying out the demos, I am cautiously optimistic. While it is certainly a novelty feature at this point, with a potential cost in GPU resources or internet bandwidth, it is far too early in its life cycle to judge or dismiss it completely.

When ray tracing was first introduced on consumer GPUs, it too seemed like a novelty, with most gamers, including me, preferring higher FPS over better lighting in a handful of games.

However, over the last half-decade, it has become an essential feature, with DLSS offsetting the performance hit. I hope the future of Nvidia ACE is equally bright, with widespread adoption of the technology that doesn't considerably affect performance.


Nvidia Audio2Face, ComfyUI, and ChatRTX AI tools hands-on impressions

While ACE was the highlight of the event, it certainly wasn't the only tool on display. A couple of other AI tools were showcased as well, including Audio2Face in Nvidia Omniverse, alongside ComfyUI and ChatRTX.

Audio2Face in Nvidia Omniverse is a developer-focused tool that uses AI to automatically generate facial animations matching a line of dialogue. Beyond lip-syncing, the tool lets you set the face's expression by simulating all of the facial muscles, producing very realistic facial animation.

While the demo didn't showcase specific use cases, the tool could be very beneficial for multilingual games, allowing in-engine facial expressions to adapt to each audio language.

John showcased Star Wars Outlaws running with DLSS, which reminded me I need to finish the game (Image via Nvidia || Sportskeeda)

ComfyUI and Nvidia's ChatRTX offered experiences similar to other AI tools on the market: image generation with Stable Diffusion, and a local chatbot that can access your files to answer queries. However, they stand out by using Nvidia GPUs to run entirely locally, negating latency and server issues and providing a much faster solution.

All in all, Nvidia's upcoming suite of AI tools seems promising and could certainly benefit the gaming industry in the future. I, for one, am looking forward to Nvidia ACE arriving in the next big RPG so I can spend hours immersed in its world.
