New Research about the Interaction between Technology, Society, and Cognition

May 19th, 2011

One of the best parts of #KellerKerfuffle (decent #hashtag, no?) is that the debate about the role of technology in society has come to the fore. I’ve come across a number of really interesting items I hadn’t seen before. I’ll just summarize them here and analyze them at some point later (really, this just makes my footnoting for an article easier, but I’m making it available for anyone interested).

Maryanne Wolf, author of “Proust and the Squid: The Story and Science of the Reading Brain,” from the Center for Reading and Language Research at Tufts, notes that human brains were never designed to read.

After many years of research on how the human brain learns to read, I came to an unsettlingly simple conclusion: We humans were never born to read. We learn to do so by an extraordinarily ingenuous ability to rearrange our “original parts” — like language and vision, both of which have genetic programs that unfold in fairly orderly fashion within any nurturant environment. Reading isn’t like that.

Books, a form of (primeval) technology, trained our brains to do this!

Gary Small at UCLA notes that the human brain is constantly changing because of technology.

“We know that technology is changing our lives. It’s also changing our brains,” Small said during a recent Open Mind lecture for the Friends of the Semel Institute, a group that supports the institute’s work in researching and developing treatment for illnesses of the mind and brain. Small’s talk centered around his recently published book, “iBrain: Surviving the Technological Alteration of the Modern Mind.”

The human brain is malleable, always changing in response to the environment, Small said. “A young person’s brain, which is still developing, is particularly sensitive. … It’s also the kind of brain that is most exposed to the new technology.”

Small notes the “brain gap” between “digital natives” and “digital immigrants.”

Digital natives — young people born into a world of laptops and cell phones, text messaging and twittering — spend an average of 8 1/2 hours each day exposed to digital technology. This exposure is rewiring their brain’s neural circuitry, heightening skills like multi-tasking, complex reasoning and decision-making, Small said. But there’s a down side: All that tech time diminishes “people” skills, including important emotional aptitudes like empathy.

On the opposite end of the spectrum, digital immigrants, born into a world of pocket calendars you penciled dates into and letters that got sent in the mail, have to work hard to embrace technology without the already-developed brain form and function. The good news, Small said, is that the flexible brain is eminently trainable.

Many “digital immigrants” were simply unable to keep up with the inundation of information present on the web.

He cited a recent UCLA study that assessed the effect of Internet searching on brain activity among volunteers between the ages of 55 and 76 — half of them well-practiced in searching the Internet, the other half not so. Semel Institute researchers used functional magnetic resonance imaging (fMRI) to scan the subjects’ brains while they surfed the ‘Net. The result: Researchers found that the brains of the Web-savvy group reflected about twice as much activity compared to the brains of those who were not Web-savvy.

“A simple, everyday task like searching the Web appears to enhance brain circuitry in older adults,” Small said, “demonstrating that our brains are sensitive and can continue to learn as we grow older.”

This post provides a good summary of the interaction between computers, the Internet, and our brains.

Research shows that each medium offers its own positive attributes: Neuroscience has shown that playing video games stimulates areas of our brains that control working memory, hand and eye coordination and attention and can stimulate and vastly improve our cognitive skills. Reading on the other hand promotes deep thought and exercises areas of the brain responsible for reflection, reasoning and critical analysis. And auditory storytelling stimulates areas of the brain involved with creativity, contextual thinking and executive function.

It could be argued that the Web, which is the ultimate library of words, video, images, interactivity, sharing and conversation, is the quintessential place to learn.


More on how our brains react to technology:

Jonah Lehrer, author of “How We Decide,” also argues that our brains are likely just fine on the Internet. Mr. Lehrer, a former neuroscientist, writes on his blog, The Frontal Cortex, that “given this paucity of evidence, I think it’s far too soon to be drawing firm conclusions about the negative effects of the Web.”

In a recent blog post, Mr. Lehrer notes that the evidence critics use to attack the Web could just as easily be used to argue that we shouldn’t even walk down a city street, as the cognitive load is far too great for our brains to handle. He notes that in 2008, a group of scientists from the University of Michigan conducted a study showing that walking down a city street led to “dramatic decreases in working memory, self-control, visual attention and positive affect.” Mr. Lehrer writes:

When people walk down the street, they are forced to exert cognitive control and top-down attention, and all that mental effort takes a temporary toll on their brain.

Based on this data, it would be easy to conclude that we should avoid the metropolis, that the city street is a hazardous place.


One historical point I have followed is the repeated generational opposition to new forms of technology. In this Op-Ed, Steven Pinker writes that similar objections were made after the invention of the printing press, newspapers, paperbacks and television.

So too with electronic technologies. PowerPoint, we’re told, is reducing discourse to bullet points. Search engines lower our intelligence, encouraging us to skim on the surface of knowledge rather than dive to its depths. Twitter is shrinking our attention spans.

But such panics often fail basic reality checks. When comic books were accused of turning juveniles into delinquents in the 1950s, crime was falling to record lows, just as the denunciations of video games in the 1990s coincided with the great American crime decline. The decades of television, transistor radios and rock videos were also decades in which I.Q. scores rose continuously.

For a reality check today, take the state of science, which demands high levels of brainwork and is measured by clear benchmarks of discovery. These days scientists are never far from their e-mail, rarely touch paper and cannot lecture without PowerPoint. If electronic media were hazardous to intelligence, the quality of science would be plummeting. Yet discoveries are multiplying like fruit flies, and progress is dizzying. Other activities in the life of the mind, like philosophy, history and cultural criticism, are likewise flourishing, as anyone who has lost a morning of work to the Web site Arts & Letters Daily can attest.

The new media have caught on for a reason. Knowledge is increasing exponentially; human brainpower and waking hours are not. Fortunately, the Internet and information technologies are helping us manage, search and retrieve our collective intellectual output at different scales, from Twitter and previews to e-books and online encyclopedias. Far from making us stupid, these technologies are the only things that will keep us smart.

This Room for Debate provides a nice balance of views on the value of technology.

From Nicholas Carr, author of “The Shallows: What the Internet Is Doing to Our Brains” and “The Big Switch: Rewiring the World, From Edison to Google”:

If your boss and your colleagues expect you to be connected all the time, your career may suffer if you go silent. And if your friends are texting, tweeting, and Facebooking around the clock, going offline can leave you feeling socially isolated.

Then again, the only way to change social norms is for individuals to change their behavior. If you lead, maybe others will follow. At the very least, you’ll probably feel calmer, sharper and more in control of your thoughts.

Funny, I want to change social norms in the other direction.

Gary Small, whom I mentioned above, had this to say:

Many of us escalate from multitasking to partial continuous attention: we’re constantly scanning the environment for the next exciting bit of information — the next text message, IM, email, or even land-line phone call. That next ping or buzz or ring interrupts our focus and charges up the dopamine reward system as we anticipate something new and more exciting than the task at hand.

William Powers is the author of “Hamlet’s BlackBerry: A Practical Philosophy for Building a Good Life in the Digital Age”.

First, it’s essential to recognize that this is a completely normal human response to a powerful new technology. People have been addicted to connectedness since the dawn of time. We need it to get ahead in life, learn about the world beyond ourselves, find happiness and meaning. Some of the most accomplished figures in history have struggled with the challenge captured in Matt Richtel’s story, that restless inability to stop connecting.

Socrates was so hooked on the dominant connectedness of his time — oral conversation — he couldn’t bear to spend time outside the walls of Athens. Why take a quiet walk in the country when he could be where the action was, chatting up his friends? A friend showed Socrates that putting some distance between yourself and your busy, connected life does wonders for the mind. Today we just need to learn that same lesson.

From Timothy Lee:

When a new technology enters the social scene, hand-wringing about its social effects is never far behind. So I was not surprised to see Matt Richtel offer the latest contribution to this shopworn genre. The trends he describes are not nearly as novel — or as alarming — as he and the experts he interviews seem to think.

The article quotes Stanford’s Clifford Nass, who warns that excessive use of digital technologies will “diminish empathy by limiting how much people engage with one another.” That may be true for some people, but for most people the reality is just the opposite: the Internet broadens and strengthens our social ties and greatly enhances our ability to engage with one another.

The Internet has allowed me to nurture a number of long-distance friendships that would have withered in a pre-Internet era. I have about a dozen close friends who have moved far away from me. Tools like Twitter, Facebook, instant messaging, and email, have been essential to staying close.

The Internet also enables the creation of entirely new friendships. People with common interests — even quite obscure ones — can find one another and build virtual communities. If they happen to be in the same city, sites such as MeetUp help virtual friendships evolve into “real” ones.

Plus this Room for Debate about whether the brain likes e-books. This passage from Maryanne Wolf, part of which I quoted above, is really interesting:

After many years of research on how the human brain learns to read, I came to an unsettlingly simple conclusion: We humans were never born to read. We learn to do so by an extraordinarily ingenuous ability to rearrange our “original parts” — like language and vision, both of which have genetic programs that unfold in fairly orderly fashion within any nurturant environment. Reading isn’t like that.

Each young reader has to fashion an entirely new “reading circuit” afresh every time. There is no one neat circuit just waiting to unfold. This means that the circuit can become more or less developed depending on the particulars of the learner: e.g., instruction, culture, motivation, educational opportunity.

In brief, this brain learns to access and integrate within 300 milliseconds a vast array of visual, semantic, sound (or phonological), and conceptual processes, which allows us to decode and begin to comprehend a word. At that point, for most of us our circuit is automatic enough to allocate an additional precious 100 to 200 milliseconds to an even more sophisticated set of comprehension processes that allow us to connect the decoded words to inference, analogical reasoning, critical analysis, contextual knowledge, and finally, the apex of reading: our own thoughts that go beyond the text.

This is what Proust called the heart of reading — when we go beyond the author’s wisdom and enter the beginning of our own.

This interesting article profiles a family that is thoroughly wired.