Extra Life
Anti-social tendencies and a nasty case of agoraphobia are just a couple of the symptoms video game detractors throw at hardcore gamers, but they couldn’t be more wrong. In fact, playing video games might actually be good for you.
It’s understandable that video games just don’t seem healthy: the reflected zombie-blue pallor from the screen; the twitching, claw-like movements of the hands; the periodic howls of anger and abuse as you lose your life to yet another 14-year-old sniper taunting you via his headset microphone on Call of Duty. These are certainly not part of a healthy, well-adjusted lifestyle, and every so often a person will die from a non-stop gaming session. The course of action is clear: we should be running around outside, flying kites, climbing trees, grazing knees. Or so it would seem.
However, games are more than first-person shooters, and there’s a great deal of evidence that they’re actually good for you. Most video game players aren’t den-dwelling teenage boys playing with themselves: the average gamer is 30 years old, 45% of gamers are women and 70% play with other people, whether online or in the same room. This is down to the huge boom in social and mobile gaming – FarmVille, Angry Birds and Candy Crush Saga have brought video games to a completely new demographic.
That virtual social experience makes people more social too, according to the Call of (Civic) Duty study published in 2011: people who play video games that encourage cooperation are more likely to help others. That extends into the real world. “Gamers are rapidly learning social skills and prosocial behaviour that might generalise to their peer and family relations outside the gaming environment,” states another review. Essentially there’s a lot more to all that online interaction than discovering new ways to insult someone’s mother.
Older gamers gain benefits beyond the heady rush of crushing candies too. Researchers at the University of Iowa have found that only 10 hours of gaming – 10 hours in total, not per week or per month – is enough to delay a decline in the mental capability of those over 50 years old by seven years. Immersion in virtual reality video games can even dull chronic pain in adults, according to the American Pain Society, not just because games distract; they also influence how the brain reacts to painful stimuli.
A game like The Sims has been shown to improve cognitive flexibility – the ability of the brain to switch quickly and effectively from one task to another. Video games also generally enhance creativity. A study of 500 12-year-olds found that game playing was strongly associated with greater creativity, in a way that the use of other media, such as the internet or TV, was not.
It’s easy to make the case for more or less anything in moderation, especially when, in casual gaming’s case, there are scores of scientific studies backing up its healthy effects. So let’s think about something else: unhealthy gaming. This is not the occasional thumb-taps on an iPhone screen between tube stops, the wholesome family fun of a Nintendo Wii, or countless brain-training games. These aren’t proper games: they’re cognitive hair shirts or throw-away entertainment. Instead, consider the hard-core gamer – and find that even this Morlock, the butt of so much automatic disdain, has much to recommend him.
One of the most popular, influential games of the last five years – with the most hard-core following – is Minecraft. It now has more than 100 million players, and has generated hundreds of millions of dollars in revenue. The game is ostensibly simple: in a blocky, 3D world, you chop down trees, dig up rocks and build structures out of them – digital Lego. From those simple rules, however, worlds of glorious complexity have emerged. People have made working computers in Minecraft, computers that can themselves play Minecraft, and even a 1:1 version of the entirety of Denmark. Many of these worlds are huge collaborations between players, each chipping away at their allotted task. The game recreates, then embellishes, the best aspects of real-world play.
“Play in the real world is often about exploring the unknown world around you and then returning home where you feel safe,” Chris Goetz, a doctoral student at the University of California, Berkeley, has told the LA Times. “Minecraft is a place where kids can work through those same impulses. It’s like kid utopia.” The game is a natural entry point to the world of coding, and teachers are even using it to teach maths, programming and economics in schools. It has therapeutic benefits too: one version of the game, Autcraft, is specifically tailored to autistic children.

Maybe that’s a bit too easy. Minecraft is an indie title and a poster child for creative gaming, even if it is also wildly popular commercially. A game that shares its suffix, World of Warcraft, has more than 12 million regular players and is a byword for obsessive, unhealthy gaming.
Sometimes it even kills them. At least two people have died playing the game for consecutive days and one player killed another in real life, for stealing his virtual sword. These tragic stories make compelling headlines but, less emotively, research points to all the benefits of real-time strategy and role-playing games. A 2013 study carried out at the University of London found that real-time strategy games, such as StarCraft, where you control the building and tactics of various sci-fi armies, improve cognitive flexibility.
Managing resources, manoeuvring multiple units with different abilities and countering your opponent’s strategy as you fight across hazardous planets requires lightning-quick decision making as well as big-picture master plans. This means that games like StarCraft offer a real workout for your brain, the kind of workout that has applications outside of pixelated battlecruisers, as another study found: the more teenagers played strategy games, the better they did in school exams and problem-solving tests the following year.
Time now for the most notorious and maligned game genre of them all: the blockbusting first-person shooter. Ultra-immersive, ultra-violent, and surely no good? Maybe not. To start with, they don’t seem to be resulting in much actual violence amongst children. Since 2003, when Call of Duty et al came of age, children have suffered much less harm and bullying, according to a study published in May by the Crimes Against Children Research Center. That can’t be attributed just to video games, but it seems that, despite all sorts of dire media warnings, the games aren’t having much of an impact. Playing first-person shooters has also been shown to improve sight, especially the ability to discern subtle changes in shades of colour against a uniform background (picking out a virtual sniper hiding in the digital scrag).
That study was led by Daphne Bavelier, professor of brain and cognitive sciences at the University of Rochester, who says that first-person shooter games improve attention spans, task management and multitasking. In fact, they do it better than any other type of video game. A research review paper published in January 2014 found that when first-time gamers are assigned a shooter, compared to any other type of game, they “show faster and more accurate attention allocation and higher spatial resolution in visual processing.”
The benefits of playing shooters were comparable to school and university courses designed to enhance exactly the same attributes, according to a meta-analysis published in Psychological Bulletin by researchers from Northwestern University, and those enhancements transfer to realms beyond video games. The information-rich, fast-paced, navigable 3D world of most shooters requires rapid decision-making as well as attention to detail. They’re pretty much a gym designed for your brain.
Some of these arguments are well-known, so why make them again? As Isabela Granic and others write in their meta-study The Benefits of Playing Video Games, published in January 2014 in American Psychologist: “the nature of these games has changed dramatically in the last decade, becoming increasingly complex, diverse, realistic and social.” This trend will only deepen. Ultra-immersive, virtual-reality platforms like Oculus Rift, combined with Facebook’s billion-strong social reach, mean we’re only at the very start of a new gaming era. We need to be very clear about the positives and negatives video games offer now, so that we’re ready for what’s to come.