Earlier this year at Computex 2023, Nvidia revealed a new technology during its keynote presentation: Nvidia ACE, a ‘custom AI model foundry’ that promised to inject chatbot-esque intelligence into non-player characters in games.
Now, Nvidia has more to say about ACE: namely, Nvidia NeMo SteerLM, a new technique that promises to make it easier than ever for game developers to create characters that act and sound more realistic and organic.
We’ve heard about NeMo before, back when Nvidia revealed its ‘NeMo Guardrails’ software for making sure that large language model (LLM) chatbots such as the ever-present ChatGPT are more “accurate, appropriate, on topic and secure”. NeMo SteerLM acts in a similar but more creative way, allowing game devs to ‘steer’ AI behavior in certain directions with simple sliders; for example, making a character more humorous, or more aggressive and rude.
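Under the hood, SteerLM's trick is conditioning a language model's responses on named attributes, each scored on a numeric scale. To make the 'slider' idea concrete - and to be clear, this is a rough sketch of the general concept, not Nvidia's actual API; the function and attribute names here are my own invention - a developer-facing slider layer might simply translate slider positions into an attribute string prepended to whatever the player says to the NPC:

```python
# Hypothetical sketch of a SteerLM-style slider layer. The attribute
# names and the 0-9 scale illustrate the general idea only; they are
# not taken from Nvidia's real tooling.

def attribute_prompt(character_name: str, sliders: dict[str, int]) -> str:
    """Build a conditioning prefix from per-character slider values (0-9)."""
    for name, value in sliders.items():
        if not 0 <= value <= 9:
            raise ValueError(f"slider '{name}' must be 0-9, got {value}")
    # Sort for a stable, cache-friendly prefix regardless of dict order.
    attrs = ",".join(f"{name}:{value}" for name, value in sorted(sliders.items()))
    return f"<character={character_name}|{attrs}> "

# A grumpy blacksmith: high aggression, a little humor, zero toxicity.
prefix = attribute_prompt("blacksmith", {"humor": 1, "aggression": 8, "toxicity": 0})
# The prefix would then be prepended to the player's line before it
# reaches the LLM, nudging the generated reply toward those traits.
```

In a real pipeline the model would have been trained to respect those attribute tags; the slider UI is just a thin mapping on top.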
I was a bit critical of NeMo Guardrails back when it was originally unveiled, since it raises the question of exactly who programs acceptable behaviors into AI models. In publicly accessible real-world chatbot tools, programmer bias could lead to AI-generated responses that offend some while appearing innocuous to others. But for fictional characters, I’m willing to believe that NeMo has huge potential. Imagine a gameworld where every character can truly react dynamically and organically to the player’s words and actions - the possibilities are endless!
The problems with LLMs in games
Of course, it’s not quite as simple as that. While SteerLM does promise to make the process of implementing AI-powered NPCs a lot more straightforward, there are still issues surrounding the use of LLMs in games in general. Early access title Vaudeville shows that AI-driven narrative games have a long way to go, and that’s only part of the problem.
LLM chatbots such as ChatGPT and Bing AI have proven in the past that they’re not infallible when it comes to remaining on-topic and appropriate. Indeed, when I embarked on a quest to break ChatGPT, I was able to make it say things my editor sadly informed me were not fit for publication. While tools such as Nvidia’s Guardrails can help, they’re not perfect - and as AI models continue to evolve and advance, it may become harder than ever to keep them playing nice.
Even beyond the potential dangers of introducing actual AI models into games - let alone ones with SteerLM’s ‘toxicity’ slider, which on paper sounds like a lawsuit waiting to happen - a major stumbling block to implementing tools like this could actually be hardware-related.
If a game uses local hardware acceleration to power its SteerLM-enhanced NPCs, performance will depend on how capable your PC is at running AI workloads. This introduces an entirely new headache for both game devs and gamers: game quality that varies not with anything the developers can control, but with the hardware in the player’s machine.
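One way developers might sidestep that inconsistency is a graceful fallback: probe the player's hardware once, then route NPC dialogue to a local model, a cloud endpoint, or plain old scripted lines. To be clear, this is purely speculative - the thresholds and backend names below are invented for illustration, not drawn from any real engine or Nvidia tool:

```python
# Speculative sketch of a per-machine fallback for LLM-driven NPCs.
# The capability thresholds and backend names are made up for
# illustration purposes.

def pick_dialogue_backend(has_tensor_cores: bool, vram_gb: float,
                          online: bool) -> str:
    """Choose where NPC dialogue gets generated for this player's machine."""
    if has_tensor_cores and vram_gb >= 8:
        return "local-llm"   # run the model on the player's own GPU
    if online:
        return "cloud-llm"   # stream responses from a remote server
    return "scripted"        # fall back to authored dialogue trees

# An RTX 4090 owner would get local inference...
print(pick_dialogue_backend(True, 24, online=True))    # local-llm
# ...while a GTX 1650 (no Tensor cores) leans on the cloud,
print(pick_dialogue_backend(False, 4, online=True))    # cloud-llm
# or on old-fashioned scripted lines when offline.
print(pick_dialogue_backend(False, 4, online=False))   # scripted
```

Cloud inference would dodge the GPU lottery entirely, at the cost of latency and server bills - which is exactly the trade-off discussed below.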
According to the Steam Hardware Survey, the majority of PC gamers are still using RTX 2000-series or older GPUs. Hell, the current top spot is occupied by the budget GTX 1650, a graphics card that lacks the Tensor cores used by RTX GPUs to carry out high-end machine-learning processes. The 1650 isn’t incapable of running AI-related tasks, but it’s never going to keep up with the likes of the mighty RTX 4090.
I’m picturing a horrible future for PC gaming, where your graphics card determines not just the visual fidelity of the games you play, but the quality of the game itself. For those lucky enough to own, say, an RTX 5000-series GPU, incredibly lifelike NPC dialogue and behavior could be at your fingertips: smarter enemies, more helpful companions, dynamic and compelling villains. The rest of us can get used to dumb and dumber character AI as game devs begin to lean more heavily on LLM-managed NPCs.
Perhaps this will never happen. I certainly hope so, anyway. There’s also the possibility of tools like SteerLM being implemented in a way that doesn’t require local hardware acceleration; that would be great! Gamers should never have to shell out for the very best graphics cards just to get the full experience from a game - but I’ll be honest, my trust in the industry has been sufficiently battered over the last few years that I’m braced for the worst.
from TechRadar - All the latest technology news https://ift.tt/hPpKZtN