A thought-provoking blog post by educator Larry Cuban highlights the current fascination with applying brain-based research to the puzzle of effective teaching. While many perceive neuroscience as the new Holy Grail - research about how the brain works that can save humanity from everything from PTSD to bad leadership - the fact is, we still know very little. Current science, at best, has offered a few suggestions about what happens in the brain under certain (laboratory) conditions, how the brain influences behavior and vice versa, where environment fits into the picture, and clues from the emerging science of epigenetics. More intriguing is what we don't know.
It is still worth applying the rules of thumb contributed by cognitive psychologist Daniel Willingham of the University of Virginia. In her ongoing blog, The Answer Sheet, Washington Post journalist Valerie Strauss invites guest blogger Willingham - associate editor of the journal Mind, Brain, and Education and author of the book Why Don't Students Like School? - to share his caveats about tying neuroscience findings to learning:
- the brain is always changing
- the connection between the brain and behavior is not obvious
- deriving useful information for teachers from neuroscience is slow, painstaking work
Interesting idea - that we don't know what the brain is up to. So whether it is understanding our own behavior or having a clue about what techniques best motivate children to learn - or not - the brain has a mind of its own.
Tuesday, March 15, 2011
Friday, January 7, 2011
The growing question of what technology is doing to the brain in this age of constant electronic distraction takes on new meaning when applied to the teen brain. Well-established research shows that the brain's prefrontal cortex - the area responsible for inhibiting impulses, planning, prioritizing, and decision making, commonly referred to as executive function - develops well into the mid-20s. That raises the question of what happens to neurocognitive development in young people who have grown up in the digital age. If you've always had Facebook, text messages, videogames, YouTube, and Skype, for example, are you better able to juggle multiple inputs to the brain and do your homework at the same time?
A report on the PBS NewsHour reflects the many ways scientists are exploring how technology may be affecting teen brain development. Dr. Jay Giedd, an NIH researcher, is engaged in a 20-year study tracking brain development through routine testing and brain imaging from childhood through adolescence and beyond, and he continues to find brain changes in subjects who have grown up in the digital age.
While some who study the subject are concerned about this evolution - author Nicholas Carr, for example, and educators like Michael Austin, who worries in his blog Boys Name Tzu about how kids growing up in our electronic world will fare in college and beyond - the news is not all bad. Multitasking taxes the brain: we are best able to sustain attention on one task at a time, and doing two things at once uses more energy to activate multiple brain regions. But Daphne Bavelier, professor of brain and cognitive sciences at the University of Rochester, has studied what happens in the brain when young people play high-action videogames and found that practice at this activity actually improves vision - that is, the ability to attend to multiple details on-screen, demonstrated in an experiment with Unreal Tournament, a first-person shoot-'em-up action game. "Action video game play changes the way our brains process visual information," says Bavelier. "After just 30 hours, players showed a substantial increase in the spatial resolution of their vision, meaning they could see figures like those on an eye chart more clearly, even when other symbols crowded in."
This improvement generalizes to focusing on what's happening in multiple areas of the screen as players' skills improve with practice - meaning, perhaps, that teens who grow up playing Call of Duty and Facebooking at the same time may be rewiring their brains to attend to more than one task at a time. "They perform better than non-gamers on certain tests of attention, speed, accuracy, vision and multitasking," says Bavelier.
While the news may be good for these digital natives, digital immigrants - whose brains were not initially wired on multiple stimuli - may not get the same rewiring advantages from practicing electronic multitasking, such as talking on a cellphone while driving. The drain on brain energy still imposes too great a cost.
In fact, these social and neurocognitive changes raise many more questions. Will public policy need to evolve as the digital natives grow into adulthood, to reflect their refocused brains? Is there a cost to this rewiring in other areas of the brain, like the sustained attention required for deeper focus in reading or studying? Do schools need to adjust to accommodate these new ways of thinking? Are we even thinking at all - or just making way for stimuli to affect our brains, addicted to the adrenaline or dopamine rush that games and social networks produce, or even the oxytocin connection of a new Facebook friend or Twitter follower, our so-called social media, which might come to approximate social interaction? And do these new brains get passed down genetically to this generation's digital natives' children?
We are still at the frontiers of brain science, with intriguing clues into this sophisticated mechanism and its connections to behavior, social interaction, intelligence, and thought - and much to learn. As brains go, we are still evolving. Like the printing press, the telegraph, the telephone, and television before it, technology has an impact on human behavior, and what scares one generation becomes the norm for the next. Looking out from today's cutting edge of technology, living, and thought, what do you think?