Friday, January 7, 2011

Digital Natives Get Brain Boost from Technology


The growing question of what technology is doing to the brain in this age of constant electronic distraction takes on new meaning when applied to the teen brain. Well-established research shows that the brain's prefrontal cortex - the area responsible for inhibiting impulses, planning, prioritizing and decision making, commonly referred to as executive function - continues to develop well into the mid-20s, which raises the question of what happens to neurocognitive development in young people who have grown up in the digital age. If you've always had Facebook, text messages, videogames, YouTube and Skype, for example, are you better able to juggle multiple inputs to the brain and do your homework at the same time?

A report on the PBS NewsHour reflects the many ways scientists are exploring how technology may be affecting teen brain development. Dr. Jay Giedd, an NIH researcher, is engaged in a 20-year study tracking brain development through routine testing and brain imaging from childhood through adolescence and beyond, and he continues to find brain changes in subjects who have grown up in the digital age.

While some who study the subject are concerned about this evolution - author Nicholas Carr, for one, and educators like Michael Austin in his blog Boys Name Tzu, who worry how kids growing up in our electronic world will fare in college and beyond - the news is not all bad. Multitasking taxes the brain: we are best able to sustain attention on one task at a time, and doing two things at once uses more energy because multiple brain regions must be activated. But Daphne Bavelier, professor of brain and cognitive sciences at the University of Rochester, has studied what happens in the brain when young people play high-action videogames, and she finds that practice at this activity actually improves vision - that is, the ability to attend to multiple details on-screen - in an experiment using Unreal Tournament, a first-person shoot-'em-up action game. "Action video game play changes the way our brains process visual information," says Bavelier. "After just 30 hours, players showed a substantial increase in the spatial resolution of their vision, meaning they could see figures like those on an eye chart more clearly, even when other symbols crowded in."

This improvement generalizes to focusing on what's happening in multiple areas of the screen as players' skills improve with practice - meaning, perhaps, that teens who grow up playing Call of Duty and Facebooking at the same time may be rewiring their brains to attend to more than one task at a time. "They perform better than non-gamers on certain tests of attention, speed, accuracy, vision and multitasking," says Bavelier.

While the news may be good for these digital natives, digital immigrants - whose brains were not initially wired on multiple stimuli - may not get the same rewiring advantages from practicing electronic multitasking, such as talking on a cellphone while driving. The drain on brain energy still imposes too great a cost.

In fact, these social and neurocognitive changes raise many more questions. Will public policy need to evolve as digital natives grow into adulthood, to reflect their refocused brains? Is there a cost to this rewiring in other areas of the brain, such as the sustained attention needed for activities requiring deeper focus, like reading or studying? Do schools need to adjust to accommodate these new ways of thinking? Are we even thinking at all - or just making way for stimuli to act on our brains, addicted to the adrenaline or dopamine rush that games and social networks produce, or even the oxytocin hit of a new Facebook friend or Twitter follower - our so-called social media - that might come to approximate social interaction? And do these new brains get passed down genetically to this generation's digital natives' children?

We are still at the frontiers of brain science, with intriguing clues about this sophisticated mechanism and its connections to behavior, social interaction, intelligence, and thought - and much still to learn. As brains go, we are still evolving. Like the printing press, the telegraph, the telephone and television before it, today's technology has an impact on human behavior, and what scares one generation becomes the norm for the next. Looking out from today's cutting edge of technology, living and thought, what do you think?

2 comments:

  1. Isn't it strange - the more we learn, the more questions we have :-) I was first introduced to this notion of digital natives through the work of Marc Prensky (PDF) http://bit.ly/fE0xd1. I couldn't agree more that technology impacts human behavior ... we create the technology that at some point creates us. Don't know if you've seen this from Professor Wesch at K-State, but you might find it fits well with this discussion, and perhaps more to the point his follow-up video on students today: http://bit.ly/fvkndR Can we connect the neuroscience research with what we see and experience today?

  2. Every child is born with a good memory. A lot of importance is placed on the cognitive development of an infant. From the moment a child can speak their first word, cognitive enhancers are given to them to improve concentration and memory.
