I’ve been thinking about empathy, and about the role it plays in game design. There’s a fair amount of discussion of ‘empathy games’ – games created by and from the perspective of marginalized creators, games created to push you into another person’s proverbial shoes to walk a proverbial mile. Different versions of these have emerged, from games that attempt to simulate a life of one random person on the globe, based on statistical modeling, to games about one specific person’s specific experience and their understanding of it. Much digital ink has been spilled, as well, advocating that these games are the way towards a more understanding and empathetic future, with implied better outcomes for the communities represented by those games.

That’s not really what I want to talk about, though it’s an important thing to touch on. Most of the lofty rhetoric around these games has borne little fruit – it turns out that walking a mile in a person’s shoes doesn’t really tell you much about them, because you’re still walking with your legs and perceiving the world through your eyes. You can tell yourself that you’ve come to understand them, but all you’ve done is constructed an effigy of them for your imagination to occupy. It is far from empathy. Often, if these games involve significant choice, they end up being turned into min-max exercises by the player – coming to understand the single optimal strategy for ‘winning’ the game, trivializing a life full of uncertainties and incomplete information into an obstacle course to be solved. And, of course, the end result of these gameable ‘lives’ is the exact opposite of empathy, feeding directly into a sort of just-world fallacy.

However, even before we encounter these high-level issues with the ideas underpinning empathy games, let’s question an even more basic assumption: Does empathy lead to kindness?

Empathy is the process of understanding what another creature is thinking and feeling. This is something we do all the time, and it is a vital survival tool. All interpersonal interaction is empathetic to some degree: we predict reactions and respond to those predictions, verbally or otherwise. All communication could be seen, then, as a sort of formalized empathy, codifying and expressing internal processes to make them easier for others to engage with, while they provide the same service in return.

This is lovely, but it reveals a dark truth: Just as there’s nothing inherently kind or morally good about language itself, there’s nothing inherently kind or morally good about empathy itself. Certainly I believe that those most able and inclined to be empathetic are, on average, better moral actors: They understand the potentially painful outcomes of their decisions better and they have a conception of shared moral reality that extends beyond their immediate purview. I also believe the same is, on average, true of people who are good at interpersonal communication, for much the same reason. This is not the same thing as these tools being intrinsically or always good or moral. You can use your bone-deep understanding of another person’s mental state for anticipation, for manipulation, for exploitation. We like to describe these sorts of mental domination tactics as being completely separate from what empathy is, but they seem like two sides of the same coin to me.

Of course, even using empathy aggressively is not inherently immoral. We do it all the time when we play games! In something like a fighting game against a single opponent, you’re constantly trying to understand, predict, and counter their every decision. Fighting game players sometimes call this ‘yomi’, Japanese for ‘reading’, meaning to read the mind of their opponent, but it seems like empathy to me: Every form of understanding the mind, decision-making, and emotional state of another creature seems, to me, to be a form of empathy.

This is part of why we love to compete – for the same reason we love conversation, because it allows us to understand and express bits and pieces of our collective minds to each other. The process of competition is much the same as the process of conversation: “What do you want?” “What do you expect?” “How can I accommodate these desires and expectations?” or even “How can I shape these desires and expectations?” These are questions which, in some form or another, go through one’s mind both in cordial social interactions and during an intense competitive game. Both situations also involve a degree of subterfuge – sometimes you have to conceal your feelings, your desires, your plans, either for some sort of competitive advantage or just to spare someone else discomfort.

Even single-player games interface with this desire for connection. In stealth games, for example, you’re constantly trying to understand where people are around you, where they’re going, what they’re trying to achieve, and how much they know about where you are and what you’re doing. The behaviors controlling these opponents are extremely simple because there tend to be quite a few such opponents and the penalties for failure are often high, but the basic flow of understanding “what is it that this entity understands and desires?” is still present. Because these behaviors are so basic, however, they often end up feeling arbitrary – most enemies can immediately tell the difference between your footsteps and their friends’ from two rooms away, but will watch a door that’s meant to be locked swing open without reacting.
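One way to close that seam, sketched below with entirely hypothetical names and thresholds, is to score every observable event by the same loudness-and-distance rule, so that a swinging door and a footstep pass through the same perceptual filter instead of one being hard-coded as alarming and the other invisible:

```python
import math

# Hypothetical sketch: a guard's hearing, applied uniformly to any event.
HEARING_RANGE = 10.0  # beyond this distance, nothing registers
SUSPICION_FLOOR = 0.2  # quieter signals than this are ignored

def perceive(guard_pos, events):
    """Return the events this guard notices, each with a suspicion score.

    Each event is a dict like:
        {"kind": "footstep", "pos": (x, y), "loudness": 0.6}
    Suspicion falls off linearly with distance, so a nearby opening door
    can register just as strongly as distant footsteps.
    """
    noticed = []
    for event in events:
        d = math.dist(guard_pos, event["pos"])
        if d > HEARING_RANGE:
            continue
        suspicion = event["loudness"] * (1.0 - d / HEARING_RANGE)
        if suspicion > SUSPICION_FLOOR:
            noticed.append((event["kind"], round(suspicion, 2)))
    return noticed
```

The point of the uniform rule is that the designer tunes one falloff curve rather than scripting reactions event-by-event, which is exactly where the arbitrary-feeling gaps come from.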

This is, I think, a very subtle thing that players responded to in the Dark Souls and affiliated series: Though enemy behaviors are extremely simple, they do seem to be placed and scripted as though their intent were to defeat the player rather than, as is the case with many enemies in other games, to be dramatically and satisfyingly defeated. Similarly, the enemy behavior is just sophisticated enough that you can watch them decide on an attack based on your relative positions, then seek to execute that attack and respond to it, creating something akin to a very primitive version of the sensation of fighting games’ ‘yomi’. Of course, many players dislike these traps and ambushes and tactics, seeing in them not a malicious opponent but a malicious game designer – which is, I suppose, also the case.

All these single-player examples, though, are of games which encourage you to understand the intent of extremely basic characters and creatures. Even if they’re presented as clever entities like humans in the narrative layer, usually they end up coming off as simple-minded because of the limitations of their development. Most games don’t bother creating rich interior lives for their characters – for the simple reason that most players wouldn’t notice if they had. It doesn’t take a lot of effort to make a character behave somewhat convincingly, to make them pace and mutter and run towards loud noises and yell and shoot, but it takes a lot to make them understand their world, formulate goals, and act to achieve them.

Still, it’s worth thinking about what we might do to create a simulation sophisticated enough to be worth empathizing with in a deeper and more elaborate way. First, let’s think about how a creature or person takes action:

This is a chart of what decision-making might look like to an entity. The entity’s self is represented by the red box: Every creature has innate desires and certain information about the world it occupies. For a living creature these desires would be subject to change based on that information, but let’s just say this simulation is taking place across little enough time that these desires remain more or less constant. The creature’s information changes based on its observations of the world, and that information combines with the creature’s desires to create a plan of action to achieve those desires. The plan results in actions it takes, which change the world around it, and so forth. Left to its own devices the creature would steadily achieve its objectives (assuming its plans were any good), but there’s also an unknowable number of other creatures whose actions are also affecting the world in ways which change the flow of information, constantly requiring new plans.
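That loop can be sketched as code – a minimal, hypothetical skeleton in which desires stay fixed, information updates through observation, and any change in information (including changes caused by other creatures) forces a new plan:

```python
# Hypothetical sketch of the observe -> plan -> act loop described above.
class Entity:
    def __init__(self, desires):
        self.desires = desires   # held constant over the simulation
        self.information = {}    # the entity's current picture of the world
        self.plan = []           # queue of intended actions

    def observe(self, world_events):
        """Fold new observations into the entity's information.

        Returns True if anything actually changed - other creatures'
        actions arrive here too, which is what invalidates plans.
        """
        changed = False
        for key, value in world_events.items():
            if self.information.get(key) != value:
                self.information[key] = value
                changed = True
        return changed

    def act(self, world_events, make_plan):
        """Replan if the world surprised us, then take the next action."""
        if self.observe(world_events) or not self.plan:
            self.plan = make_plan(self.information, self.desires)
        return self.plan.pop(0) if self.plan else None
```

The `make_plan` function is deliberately left as a parameter here; it is the hard part, and the subject of the third tool below.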

Thus, if we wanted to create a compelling set of artificial behaviors, we’d need three tools:

1) Information. How does the creature perceive the world? What can it see and hear, and how does it parse this into usable data?

2) Desire. This is probably the simplest: just create a world state (or set of such states) that the creature wishes to achieve or maintain – the difficulty of this step is in formulating it in such a way that it can be used for the next step.

3) Plan. This is the tough one. How do you synthesize the information and desire into a plan of action?
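This third step is where techniques like goal-oriented action planning come in: give each available action a set of preconditions and effects, then search for a sequence of actions that carries the creature’s current information to its desired world state. A minimal sketch, with all action names and world-state keys hypothetical:

```python
from collections import deque

# Hypothetical action catalogue: name -> (preconditions, effects).
ACTIONS = {
    "pick_up_key": ({"key_visible": True},  {"has_key": True}),
    "unlock_door": ({"has_key": True},      {"door_locked": False}),
    "open_door":   ({"door_locked": False}, {"door_open": True}),
}

def satisfies(state, goal):
    return all(state.get(k) == v for k, v in goal.items())

def plan(information, desire):
    """Breadth-first search from what the creature knows to what it wants."""
    frontier = deque([(information, [])])
    seen = set()
    while frontier:
        state, steps = frontier.popleft()
        if satisfies(state, desire):
            return steps
        key = tuple(sorted(state.items()))
        if key in seen:
            continue
        seen.add(key)
        for name, (pre, effect) in ACTIONS.items():
            if satisfies(state, pre):
                frontier.append(({**state, **effect}, steps + [name]))
    return None  # no plan achieves the desire from what the creature knows
```

Notice that the planner only consults `information`, not the true world state – so a creature that never saw the key simply cannot plan around the locked door, which is exactly the kind of honest limitation that makes behavior feel motivated rather than scripted.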

We usually don’t model anything like that in games because it’s overkill. If you create a set of creature behaviors capable of inferring from subtle information, players will often feel like it’s cheating; whereas if you create a set of random behaviors, players will often infer a reason for that behavior. Sometimes the cruder and more artificial behavior set ends up feeling more real. However, when we don’t bother to emulate any internal life, we create seams: Creatures ignore information they aren’t scripted to notice and react bizarrely to edge cases. These may be nearly impossible problems to solve completely, but I think it would be neat if we tried. If we had, as we strained to make ever more photo-realistic worlds, established methods for giving characters perceived knowledge of their environment, a set of desires, and a method of formulating plans, would it then be very difficult to create opponents which feel real and substantial? Perhaps even human? If players feel AI that is too smart is cheating, is that just an artifact of the strained pseudo-fidelity of modern games, where everything looks photo-real but nothing meaningfully reacts to the things happening around it?

There’s a natural, easy joy to competition, to testing each other and understanding each other and exceeding each other, which largely doesn’t exist in single-player experiences. What we have instead of competition is naked challenge, a kind of lurid hyper-competition which strips away all the ‘boring’ parts – and, in the end, gives us targets rather than opponents, conquests rather than contests. This is fine. It’s not like these games can’t be fun and interesting and even thought-provoking. And yet I can’t help but wonder what these games would look like today if we’d ever been taught to look at their casts of characters as anything aside from predators or prey.

Of course, even if we gave every creature motivation and observation, there would still be something missing: Empathy. If we wanted to create realistic behavior, we’d also have to give creatures some capacity to observe each other directly, predict actions, and act preemptively based on those observations. Maybe we’d even start to feel bad for massacring them.
