Bill Keller, the Executive Editor of the New York Times, has written an astounding column that reveals his shortsightedness and portends a future in which newspapers like the Times fade steadily into obscurity. Keller previously attacked Arianna Huffington and the Huffington Post, contending that the site merely aggregates content rather than generating it.
Now Keller opens fire on Twitter and Facebook, contending that using these cool new digital tools comes at a price. I see this as the start of a sustained campaign against those who detract from the Times’ prominence.
I don’t mean to be a spoilsport, and I don’t think I’m a Luddite. I edit a newspaper that has embraced new media with creative, prizewinning gusto. I get that the Web reaches and engages a vast, global audience, that it invites participation and facilitates — up to a point — newsgathering. But before we succumb to digital idolatry, we should consider that innovation often comes at a price. And sometimes I wonder if the price is a piece of ourselves.
Keller argues that we are “outsourcing our brains to the cloud.”
Basically, we are outsourcing our brains to the cloud. The upside is that this frees a lot of gray matter for important pursuits like FarmVille and “Real Housewives.” But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity.
And what are the downsides to these new forms of technology?
The most obvious drawback of social media is that they are aggressive distractions. Unlike the virtual fireplace or that nesting pair of red-tailed hawks we have been live-streaming on nytimes.com, Twitter is not just an ambient presence. It demands attention and response. It is the enemy of contemplation. Every time my TweetDeck shoots a new tweet to my desktop, I experience a little dopamine spritz that takes me away from . . . from . . . wait, what was I saying?
My mistrust of social media is intensified by the ephemeral nature of these communications. They are the epitome of in-one-ear-and-out-the-other, which was my mother’s trope for a failure to connect.
Keller doesn’t even view Facebook as social, arguing that this modern form of communication is not social interaction at all:
I’m not even sure these new instruments are genuinely “social.” There is something decidedly faux about the camaraderie of Facebook, something illusory about the connectedness of Twitter. Eavesdrop on a conversation as it surges through the digital crowd, and more often than not it is reductive and redundant. Following an argument among the Twits is like listening to preschoolers quarreling: You did! Did not! Did too! Did not!
As a kind of masochistic experiment, the other day I tweeted “#TwitterMakesYouStupid. Discuss.” It produced a few flashes of wit (“Give a little credit to our public schools!”); a couple of earnestly obvious points (“Depends who you follow”); some understandable speculation that my account had been hacked by a troll; a message from my wife (“I don’t know if Twitter makes you stupid, but it’s making you late for dinner. Come home!”); and an awful lot of nyah-nyah-nyah (“Um, wrong.” “Nuh-uh!!”). Almost everyone who had anything profound to say in response to my little provocation chose to say it outside Twitter. In an actual discussion, the marshaling of information is cumulative, complication is acknowledged, sometimes persuasion occurs. In a Twitter discussion, opinions and our tolerance for others’ opinions are stunted. Whether or not Twitter makes you stupid, it certainly makes some smart people sound stupid.
After some obligatory praise for social media, Keller writes that it is supplanting things he values more, or as he puts it, “things that matter” (to him the two are one and the same):
The shortcomings of social media would not bother me awfully if I did not suspect that Facebook friendship and Twitter chatter are displacing real rapport and real conversation, just as Gutenberg’s device displaced remembering. The things we may be unlearning, tweet by tweet — complexity, acuity, patience, wisdom, intimacy — are things that matter . . . My own anxiety is less about the cerebrum than about the soul, and is best summed up not by a neuroscientist but by a novelist.
Mat Honan at Gizmodo does a masterful job of deconstructing the piece. I want to flag a few of his points, then add my own comments.
First, Keller confuses the medium with the message.
Keller makes the same mistake in dismissing Twitter and Facebook and, well, modernity, that critics ten to twelve years ago made in dismissing blogging: he confuses medium with message. Twitter, and any technology, is what you make of it. If you choose to do superficial things there, you will have superficial experiences. If you use it to communicate with others on a deeper level, you can have more meaningful experiences that make you smarter, build lasting relationships, and generally enhance your life.
Keller provides a sense of history, noting that before the printing press, people memorized vast amounts of information. Now, with the Internet, that kind of memorization is no longer necessary.
Joshua Foer’s engrossing best seller “Moonwalking With Einstein” recalls one colossal example of what we trade for progress. Until the 15th century, people were taught to remember vast quantities of information. Feats of memory that would today qualify you as a freak — the ability to recite entire books — were not unheard of.
Then along came the Mark Zuckerberg of his day, Johannes Gutenberg. As we became accustomed to relying on the printed page, the work of remembering gradually fell into disuse. The capacity to remember prodigiously still exists (as Foer proved by training himself to become a national memory champion), but for most of us it stays parked in the garage.
Honan notes that “previous generations” had similar objections to newspapers and pamphlets. He cites an argument against writing made by Socrates, who never wrote anything down.
[F]or [the use of letters] will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.
Honan is not so concerned, noting that although the way we process information changes, this does not amount to a loss for society.
Technology will change the way we think and interact. Our brains will process information differently and we will interact with each other differently thanks to the tools we use, be they databases, communications mediums, or language itself. But Keller seems to mistake the changing nature of the way our brains work to process information and communicate with us having lost something as a society. That’s just not true.
If we lose the art of penmanship, but gain a greater ability to clearly communicate what is ultimately lost? If we become unable to recognize simple patterns in data with our eyes because we have built machines that can see complex ones our brains could not process in many lifetimes, are we truly intellectually bereft for it?
I’ve made this argument elsewhere. The skills we need for future interactions differ from those Keller learned.
Honan adds:
The thing is not that we’re dumber, or that our cognitive advance has slowed or reversed. It’s that we need different mental abilities to process information and the modern world.
We don’t simply use new technologies, we become immersed in them. We live in an era of information assault. Data is everywhere. Ads come at us from all sides. Email pours into our boxes. The Web, and television, and radio and, yes, fucking newspapers spew information at us like, well, like newspapers once spewed from printing presses before they began drifting into irrelevance.
Memorization was once a tool for preserving information. But today the more important skill is the ability to process and filter it. To quickly decide what needs to be analyzed and responded to, and what ought to be ignored. That’s not a cognitive loss, it’s an evolutionary advancement.
The fact that the skills are different does not make them bad. Memorizing is not nearly as important as the ability to process and analyze vast amounts of information. Short Twitter messages. Supertasking. These are helpful skills for the future.
Similarly, just as we encounter far more data each day, we also encounter many more people. Think back 20 years. How many people did you interact with in a 24-hour period? Almost certainly, all of your interactions were in person or over the telephone. The majority involved speech; a small subset took place via the written word. In technologically advanced societies, that balance has reversed.
If you are like me, most of your daily interactions with other people take place electronically. You probably interact with a greater number of distinct individuals via email, tweets, Facebook updates, chats, and text messages than you do verbally or in person. (Unless you have a job that requires a great deal of public interaction, like, say, a sales clerk at a busy department store.)
Again, you need to be able to process those relationships quickly and efficiently. It’s a basic tool for modern life. Yet that does not mean that your interactions in those mediums are any less genuine, or less soulful, even if they take place more rapidly.
We interact with more people than ever before through technology. Sure, we are not seeing as many of them in person, but this need not vitiate the human aspects of interaction.
I see a huge clash coming soon in our culture between Keller’s generation and ours. Take a look at my liveblogging law review article experiment on the future of legal education. I have started addressing the arguments of those in Keller’s camp.