In September 2010, eight months after the release of their space epic Mass Effect 2, developers BioWare discussed some of the strange statistics they were getting from people playing the game. Amidst the talk about one gamer who took 66 hours to complete the game, and the complaints over why more people didn't play as female Shepard, an interesting nugget of information lay more or less unexplained. PC gamers were more likely to finish the optional quest for Miranda, one of the non-player characters, a quest described by BioWare themselves as having "a touchy-feely plot". Another character - the emotionless, bloodthirsty krogan called Grunt - was favored more by Xbox 360 players.
The more game developers look at their own creations, the more they see unusual patterns emerging. Why do some people prefer to play as support classes? Why do some feel the need to complete every last sidequest? What makes a PC gamer more likely to choose the emotional, family-issues quest over a killing spree?
A growing group of academic researchers think they can explain these phenomena - and moreover, they think that understanding these patterns of behavior might lead to better game design in the future. At this year's Computational Intelligence and Games conference in Seoul, South Korea, I met one such researcher who had some startling results to present about an experiment he conducted in another BioWare RPG - 2002's Neverwinter Nights.
Giel van Lankveld came into computer science via an unusual route - four years studying experimental psychology. His PhD supervisor, Pieter Spronck at the Netherlands' Tilburg University, was interested in pursuing work to help model and understand how gamers behave within games, and hoped that Giel's psychology background would bring a new angle to the research area. Understanding gamers and learning what kind of people they are matters to game researchers because doing it in real time allows games to become "adaptive" - redesigned to be personal to you, in ways so subtle you may not even notice.
I foresee problems.
This *could* be used to create games that truly adapt to the user's play style - and that let users change that style halfway through, for instance by switching from stealth to guns blazing in the second act. Developers could build truly customized game experiences that respond to the player's actions without locking them out of content.
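To make the idea concrete, here is a minimal sketch of what such adaptation could look like under the hood. This is purely illustrative - not any actual BioWare or research system. The `PlayStyleTracker` class, the action labels, and the window size are all assumptions for the example; the key idea is that classifying only a *rolling window* of recent actions lets the game's picture of the player shift when their style changes mid-game.

```python
from collections import deque

class PlayStyleTracker:
    """Hypothetical sketch: infer a player's current style from a
    rolling window of their most recent in-game actions."""

    def __init__(self, window=50):
        # Only the last `window` actions count toward the classification,
        # so an old stealth phase eventually stops dominating.
        self.recent = deque(maxlen=window)

    def record(self, action):
        # Illustrative action labels: "sneak", "hack", "shoot", "melee"
        self.recent.append(action)

    def style(self):
        if not self.recent:
            return "unknown"
        stealth = sum(a in ("sneak", "hack") for a in self.recent)
        combat = sum(a in ("shoot", "melee") for a in self.recent)
        if stealth > combat:
            return "stealth"
        if combat > stealth:
            return "combat"
        return "mixed"

tracker = PlayStyleTracker(window=5)
for a in ["sneak", "hack", "sneak", "shoot", "shoot"]:
    tracker.record(a)
print(tracker.style())  # stealth actions outnumber combat in the window

# The player goes guns blazing in the second act; the rolling window
# lets the classification follow them instead of freezing an early label.
for a in ["shoot", "melee", "shoot", "melee"]:
    tracker.record(a)
print(tracker.style())
```

The game would then vary content based on `style()` - but crucially, because the label is recomputed continuously rather than fixed by early decisions, the player is never permanently boxed in.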
But that's not going to happen.
As soon as this stuff becomes feasible, one game will implement it in a half-good experimental way, become The Next Big Thing, and sell tons of copies. Then publishers will run to their studios and say "ALL YOUR GAMES NEED THIS NOW. IMPLEMENT IT. You can still release this by July, right?" So then you get two dozen games with horrible, poorly implemented versions of the concept that absolutely suck. After two or three years people will point out the flaws, but studio leads and publishers won't want to change it because "this is the way it's done, it's what the players expect."
And so we will be permanently stuck with a crappy implementation of adaptive gaming that stereotypes the user and blocks them off from content based on early decisions.