Video game players like to think that their hobby has
benefits beyond entertainment: that even though they appear to be sitting and
staring at a screen, they’re actually fine-tuning reflexes, developing
problem-solving abilities, and improving visual acuity.
It’s a compelling idea, and it has some science behind it.
Over the past ten years, a number of studies have shown that video game players
often outperform non-gamers on measures of perception and cognition, and that
video game practice can enhance those abilities.
But a new study suggests the jury is still out on video
games. In a paper recently published in the journal Frontiers in Psychology,
"Do Action Video Games Improve Perception and Cognition?",
psychologists Walter R. Boot,
Daniel P. Blakely, and Daniel J. Simons suggest
that methodological problems call earlier studies into question, and leave the
relationship between games and cognition unclear.
The authors write:
"The conclusion that game training produces unusually broad
transfer is weakened by methodological shortcomings common to most (if not all)
of the published studies documenting gaming effects. The flaws we discuss are
not obscure or esoteric—they are well known pitfalls in the design of clinical
trials and experiments on expertise. Most of these shortcomings are surmountable,
but no published gaming study has successfully avoided them all."
One way gaming studies have stumbled is by specifically
seeking out gamers to participate: Essentially, gamers perform better on
cognitive tests because they’ve heard that gamers perform better on cognitive
tests.
"Imagine that you are recruited to participate in a study
because of your gaming expertise, and the study consists of gamelike computer
tasks. If you know you have been recruited because you are an expert, the
demand characteristics of the experimental situation will motivate you to try
to perform well. In contrast, a non-gamer selected without any mention of
gaming will not experience such demand characteristics, so will be less
motivated. Any difference in task performance, then, would be analogous to a
placebo effect."
The solution, say the authors, is to recruit participants
without mention of video games, and to make sure that subjects don’t have any
reason to link their gaming expertise to the tasks in the study.
Another frequent methodological miscue: assuming that gamers
test better because of improved cognition rather than because they’ve simply
gotten better at the tests themselves.
"Game benefits might reflect shifts in strategy rather than
changes in more basic cognitive or perceptual capacities. Short-term and
long-term game exposure does appear to produce strategy changes. For example,
experienced gamers search more thoroughly than non-gamers, leading to better
change detection performance. Changes in how people approach a task are
interesting and important, but without careful evaluation of strategy shifts,
better expert performance might wrongly be attributed to more fundamental
differences in perception and memory."
None of this means that games don’t improve cognition, or
that further study isn’t warranted; it just means that researchers need to do a
better job of eliminating alternative explanations for their results.
Ultimately, say the authors, games hold tremendous promise
if the evidence bears out: “Such definitive tests could have implications well
beyond the laboratory, potentially helping researchers to develop game
interventions to address disorders of vision and attention, and help remediate
the effects of cognitive aging.”