Just like rock music in the 1950s, Dungeons & Dragons in the 1980s, and death metal in the 1990s, videogaming has been demonised by parent groups. For decades, gamers were portrayed as obese social outcasts who spent hours in a dark basement hunched over a flashing screen, slowly becoming more aggressive and distanced from reality.
Today, that stereotype couldn’t be further from the truth. The average gamer is as likely to be a university professor or a corporate banker as they are to be a high school or college student.
Furthermore, a growing body of research is showing a whole host of health benefits associated with gaming. The most recent piece of research shows an improvement in vision in those who suffer from congenital cataract disorders.
But it wasn’t always this way. Early studies were quick to demonstrate the negative consequences of gaming, linking time spent in front of a console or PC with outcomes such as increased aggression and depression.
In fact, videogame-related literature in the mid-2000s paints a bleak, violent and anti-social picture of gamers. With the help of the media, videogames quickly became the scapegoat for everything that was going wrong with kids.
But the vast majority of early studies only showed associations between gaming and adolescent decline, providing no evidence that videogames were the actual cause.
More recently, studies have shown no difference in aggression or depression levels between gamers and non-gamers. Similarly, researchers have shown that violent games increase frustration in players because of their difficulty, rather than aggression because of their violence.
Moreover, when researchers tested young gamers before and after they entered their teens, the strongest predictors of increased aggression were increased exposure to family violence and peer influences.
All of a sudden, the link between videogames and real-world violence isn’t so clear. This isn’t to say that videogames have no negative effects, just that they aren’t the root of the problem.
Researchers have shown that, unlike the moderate use of alcohol, cigarettes and coffee (which have proven detrimental effects), gaming has a raft of positive effects.
As touched on earlier, a recent study shows that patients with a rare cataract disorder improved their vision by playing a first-person shooter for about two hours a day, for a month. This improvement led to an increased ability to recognise faces, see small print and allowed them to read two lines lower on an eye chart.
This finding isn’t all that surprising. In a first-person shooter, players have to deal with multiple, fast-moving targets while keeping track of ammunition, available cover and teammates. Given that the brain continually creates new connections throughout our lifetime, gaming is the equivalent of exercise for your eyes and brain.
And the above is not an isolated case. Studies show that gaming can lead to:
- improved reaction time
- improved memory and spatial skills, and even
- decreased blood pressure associated with increased relaxation, regardless of the level of violence in the game being played.
But this next one is my favourite.
University students who played 11-50 hours of games per week were found to have higher grades than non-gamers, and those who played a greater variety of games were more imaginative.
Where was this study when I was growing up?
(Although I’d like to think videogames make you smarter, it’s more likely that gaming was a parental reward to get homework done. As economists know, incentives are powerful drivers.)
There’s even evidence of skills transferring into real-world situations. Surgeons who played videogames for three hours a week made 37% fewer errors and completed operations 27% faster than non-gamers.
Think about that next time you’re selecting a surgeon.
Gaming is a billion-dollar industry and, with games moving from the console and PC to the smartphone, the number of gamers worldwide likely runs into the billions. Videogames are here to stay, and even non-gamers can benefit from helping to direct the rules of future games, similar to how the Red Cross recently suggested the rules of war games be revised according to international humanitarian law.
As well as the many positives already stated, videogames can also be an invaluable, if indirect, source of learning. Although educational games are often boring, numerous games successfully incorporate educational information.
Such games include the Assassin’s Creed series, which combines actual historical events with gameplay, and the Professor Layton series, which involves solving brain teasers to progress the story. Similarly, a new action game currently under development claims to teach gamers how to code.
Games and gamers can also be viewed as an untapped problem-solving resource. Researchers interested in solving complex protein-folding problems created an online game, Foldit, which led gamers to provide valuable contributions to AIDS research.
What if we take this another step forward and use games and virtual worlds to help individuals with social problems? In many games — such as World of Warcraft or the Call of Duty: Modern Warfare series — players are required to interact with others to finish quests or progress the storyline. Some individuals can find it easier to adopt a more confident personality in the virtual world. Could this be used to treat depression in teens?
I’m a gamer and have no problem in letting my kids play videogames (in moderation, of course). I also look forward to the day when I’ll be able to play co-op first-person shooters with my sons (to my wife’s chagrin).
Just like any other technology, gaming has a place in society. And, if managed appropriately, games could provide even greater physical and psychological benefits than already demonstrated. It’s up to society to shape this technology towards a beneficial end.
Until then, I’ll be happy taking Earth back from the Reapers in Mass Effect 3.