Every few years, we get word that computers have bested us humans at another cherished game. In 1997, the IBM computer Deep Blue defeated Garry Kasparov at chess. In 2016, a Google AI system beat top-ranked player Lee Sedol in a five-game match of the Chinese board game Go. And just last year, an AI system known as Pluribus beat five world-class professional poker players in a series of no-limit Texas Hold ’Em games.
This latest victory is particularly significant because poker is a game of imperfect information. The computer doesn’t know what cards its opponents have, and the game involves elements of psychology, intimidation, deception, and raw luck. In multiplayer no-limit poker, humans would seem to have an edge. Nevertheless, Pluribus stomped.
That landmark achievement suggests that AI systems can be dominant in just about any kind of game we puny humans can think up. “It’s safe to say we’re at a superhuman level and that’s not going to change,” Noam Brown, co-creator of Pluribus, told The Verge after the historic matchup.
So is the era of human competitiveness over? Will we carbon-based lifeforms ever again have the bragging rights over machines when it comes to the games that, after all, we invented?
There’s no good answer, says AI researcher Julian Togelius, because we’re asking the wrong questions.
New Game in Town
Instead of looking back with dread at the old, existing games that computers have mastered, says Togelius, a widely acknowledged gaming visionary who co-directs New York University’s Game Innovation Lab, we should be looking forward to the games AI can help us create.
“If there’s one message I have, it’s this: Chess and Go were great, but they’ve had their run and we’re done with that now,” he says. “Stop playing chess and Go.”
Togelius is only half-serious. But he’s right that the tremendous computing power of artificial intelligence is already powering games — video games, largely — that are breaking all the standard molds. The most enduring example is the evolving class of video games that essentially make themselves up as they go.
Using a method known as procedural generation, these games develop new levels algorithmically rather than via manual coding — responding to players’ choices and creating new game elements on the fly. Let’s say you’re in a sword-and-sorcery game and you decide to descend the dark stairway at the end of the catacombs. The game will instantly generate a new level of the dungeon for you to explore, complete with AI-controlled traps and monsters.
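To make the idea concrete, here is a minimal sketch of seeded procedural generation — a toy dungeon-floor generator, not the code of any actual game. The grid symbols, dimensions, and function name are all illustrative assumptions:

```python
import random

def generate_level(seed, width=20, height=10, monster_rate=0.05):
    """Generate a dungeon floor deterministically from a seed.

    '#' = wall, '.' = floor, 'M' = AI-controlled monster, '^' = trap.
    The same seed always yields the same floor, so a game can offer an
    effectively endless dungeon without storing every level it creates."""
    rng = random.Random(seed)
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            if x in (0, width - 1) or y in (0, height - 1):
                row.append('#')          # outer wall
            elif rng.random() < monster_rate:
                row.append('M')          # monster placed algorithmically
            elif rng.random() < 0.03:
                row.append('^')          # hidden trap
            else:
                row.append('.')          # open floor
        grid.append(''.join(row))
    return grid

# Descending the dark stairway just means asking for the next seed:
floor_2 = generate_level(seed=42)
```

The key property is determinism: because the level is a pure function of the seed, nothing has to be hand-built or saved — which is how a game can plausibly hold billions of worlds.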
The critically acclaimed space exploration game No Man’s Sky uses this method to create literally billions of stars and planets. The ubiquitous kids’ game Minecraft makes worlds on the fly, as well. In other popular games, such as Tom Clancy’s The Division and Alien: Isolation, artificial intelligence controls individual combat enemies that adapt to the player’s tactics.
Most of these present-day video games are still limited in what they can generate, Togelius says. They must be programmed with certain parameters so that the AI doesn’t improvise its way into utter incoherence — turning off gravity, say, or spawning three million soldiers in an underground broom closet. To create the next level of AI-generated games, designers are letting the machines improve themselves.
In this approach, the AI assumes the role of both the player and the various game obstacles, competing against itself at lightning speed until it finds optimal game conditions. The process is similar to the way computers win at chess or Go — changing their gameplay to adapt to an opponent’s skills or style. AI-powered digital entities called “agents” approximate the behavior of human players, then match wits with separate AI agents designed to thwart them.
“We develop these procedural personas that play the game with different styles,” Togelius says. “One agent might be a speed runner and try to finish the game in record time. Another might want to collect maximum treasure. Another might want to talk to everyone.”
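One way to picture Togelius’ procedural personas is as different objective functions scoring the same playthrough. The sketch below is a hedged illustration, not his lab’s actual code: the persona names come from his quote, while the simulation stub and scoring details are invented for the example.

```python
import random

# Each "procedural persona" is just a different objective over game outcomes.
# A generator plays a candidate level with every persona and keeps levels
# that reward more than one style of play.
PERSONAS = {
    "speedrunner": lambda result: -result["steps"],          # fastest wins
    "treasure_hunter": lambda result: result["gold"],        # loot maximizer
    "socializer": lambda result: result["npcs_talked_to"],   # talks to everyone
}

def simulate_playthrough(persona, level_seed):
    """Stand-in for a real game simulation: returns the stats personas score.
    Seeded so the same persona on the same level always plays out the same."""
    rng = random.Random(f"{persona}-{level_seed}")
    return {
        "steps": rng.randint(50, 500),
        "gold": rng.randint(0, 100),
        "npcs_talked_to": rng.randint(0, 10),
    }

def evaluate_level(level_seed):
    """Score one candidate level under every play style."""
    return {name: score(simulate_playthrough(name, level_seed))
            for name, score in PERSONAS.items()}

scores = evaluate_level(level_seed=7)  # one score per persona for this level
```

Because the agents play at machine speed, a generator can test thousands of candidate levels this way before a human ever sees one.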
The result is a game that evolves in real time — assessing the player, figuring out his or her level of expertise and preferred game style, then creating a custom gaming experience on the fly. That includes a further twist: rather than simply playing better, AI creates entirely new elements for the game — new levels, new rules, new environments. You’re playing the game, yes, but the game is also playing you.
Talking to Slime
These higher-order games aren’t on the market yet; Togelius expects we’ll see commercial titles in the next few years. But a few other games available now may hold clues about what’s coming. Matthew Guzdial, assistant professor of computer science at the University of Alberta, points to the indie online game Lab Assistant, based on an AI research project at Cornell University.
In the game, a player trains a little green blob with eyes, meant to be sentient chemical slime, to solve laboratory puzzles — manipulating chemical beakers, say. The trick? The slime and the player need to figure out how to communicate.
That little green slime is “actually [powered by] a deep neural network — one of these pretty hefty machine-learning models,” Guzdial says. “You teach it a language that will tell it what to do.”
By issuing text commands, then rewarding the slime for correct choices, the player and the AI can gradually agree on a common language, which could be English, or German, or Klingon, or nonsense, Guzdial says. The AI learns language through repetition, consistency, and syntax. Essentially, the player and AI collaborate on a machine learning project.
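The reward loop Guzdial describes can be sketched in a few lines. This toy version uses a simple lookup table rather than the deep neural network that actually powers the game, and the action names and command word are made up for illustration:

```python
import random
from collections import defaultdict

class Slime:
    """Toy stand-in for the game's model: learns which action each command
    word means purely from the player's rewards."""

    ACTIONS = ["grab_beaker", "pour", "wait"]

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.value = defaultdict(float)  # (word, action) -> learned score

    def act(self, word, explore=0.1):
        if self.rng.random() < explore:
            return self.rng.choice(self.ACTIONS)  # occasionally try something new
        return max(self.ACTIONS, key=lambda a: self.value[(word, a)])

    def reward(self, word, action, r):
        self.value[(word, action)] += r  # reinforce (or punish) the pairing

slime = Slime()
# The word itself is arbitrary -- English, German, Klingon, or nonsense all
# work, as long as the player rewards it consistently:
for _ in range(50):
    action = slime.act("flurb")
    slime.reward("flurb", action, 1.0 if action == "pour" else -0.1)
# After enough consistent feedback, "flurb" reliably means "pour".
```

The point of the sketch is the collaboration: nothing about the word "flurb" matters except that the player keeps rewarding the same pairing, which is why the shared language can be anything the two sides settle on.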
Such collaborative gaming experiences, between humans and AI, represent the future of game development, Togelius says.
“I would like to play a game where you just enter a random world and the system creates a game for you,” Togelius says. “As you interact with the world, it infers what you want. Start shooting at things, and it becomes a shooter. Or start talking and it becomes a diplomacy game, or a dating simulator.”
The Holodeck Grail
Combined with headlong advances in virtual reality, Togelius says, such games could start to approach that holy grail of notional game-nerd conjecture: The Star Trek Holodeck, in which the computer instantly generates any environment the player orders. San Francisco in 1849. A primordial rain forest. An offworld colony.
“Oh, I love Star Trek,” Togelius says. “It’s almost a prerequisite to be a Star Trek fan in this area of work.”
Of course, a true holodeck experience would require massive improvements in adjacent VR technologies — standalone holograms, haptic systems, gesture control. But installing an advanced AI game engine under the hood would add another level of awesome. Holodeck adventures could go beyond passive VR environments and generate interactive challenges for multiple players at the same time.
“Let’s say you and I are playing together,” Togelius says. “I like solving puzzles; you like blowing shit up. The game would create a collaborative scenario where we need to work together. Now we want to cook together, and the game shifts again…”
In Togelius’ vision, each session would become a unique gaming experience. And once the game is over, the system would automatically generate a template for subsequent players.
“We want to get to the point where we’re tearing down the walls between playing a game and creating a game,” Togelius says. Consider the recent success of Speedgate, a new field sport invented by a team of humans and machine learning systems.
Togelius is evangelical on this point. All these stories about humans losing to computers aren’t just mildly depressing, he says; they’re misreading our relationship with machines. We need to stop thinking of AI as a competitor and start reimagining AI as a collaborator.
There’s an old saying that seems relevant here. If you can’t beat ’em…