“The Shallows” by Nicholas Carr


Try reading a book while doing a crossword puzzle; that’s the intellectual environment of the Internet.

The central text in The Shallows is Marshall McLuhan’s 1964 book Understanding Media: The Extensions of Man, which is about how a medium shapes the thoughts of its consumers. His dictum “The medium is the message” is a catchy way of saying that the type of media dictates the content so thoroughly that the two are inseparable. For example, read a book and watch a documentary about the same subject. In probably every instance, the book will contain more nuance and the arguments will be more balanced. The message is shaped by the constraints of the medium through which you’re experiencing it.

Nicholas Carr is worried that ubiquitous connection to the internet is shaping the way we think in harmful ways that we don’t immediately realize. The most convincing example to me was the studies he cites showing how intimately our spatial awareness is connected to our general memory: the more we use GPS to navigate, the less we exercise our spatial abilities, and thus our general memory atrophies. Also, relying on Google to find facts we can’t instantly remember, instead of sifting through our own memory banks, makes us lazier thinkers in general. It’s too early to see how this new technology plays out—we are all guinea pigs here—but a thoughtful meditation on what this new way of thinking leaves behind is important even if there is no going back.

McLuhan understood that whenever a new medium comes along, people naturally get caught up in the information—the “content”—it carries. They care about the news in the newspaper, the music on the radio, the shows on the TV, the words spoken by the person on the far end of the phone line. The technology of the medium, however astonishing it may be, disappears behind whatever flows through it—facts, entertainment, instruction, conversation. When people start debating (as they always do) whether the medium’s effects are good or bad, it’s the content they wrestle over. Enthusiasts celebrate it; skeptics decry it. The terms of the argument have been pretty much the same for every new informational medium, going back at least to the books that came off Gutenberg’s press. Enthusiasts, with good reason, praise the torrents of new content that the technology uncorks, seeing it as signaling a “democratization” of culture. Skeptics, with equally good reason, condemn the crassness of the content, viewing it as a “dumbing down” of culture. One side’s abundant Eden is the other’s vast wasteland.

What both enthusiast and skeptic miss is what McLuhan saw: that in the long run a medium’s content matters less than the medium itself in influencing how we think and act. As our window onto the world, and onto ourselves, a popular medium molds what we see and how we see it—and eventually, if we use it enough, it changes who we are, as individuals and as a society. “The effects of technology do not occur at the level of opinions or concepts,” wrote McLuhan. Rather, they alter “patterns of perception steadily and without any resistance.” The showman exaggerates to make his point, but the point stands. Media work their magic, or their mischief, on the nervous system itself.

The boons are real. But they come at a price. As McLuhan suggested, media aren’t just channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. Whether I’m online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

We become, neurologically, what we think.

It comes as no surprise that neuroplasticity has been linked to mental afflictions ranging from depression to obsessive-compulsive disorder to tinnitus. The more a sufferer concentrates on his symptoms, the deeper those symptoms are etched into his neural circuits. In the worst cases, the mind essentially trains itself to be sick.

The potential for unwelcome neuroplastic adaptations also exists in the everyday, normal functioning of our minds. Experiments show that just as the brain can build new or stronger circuits through physical or mental practice, those circuits can weaken or dissolve with neglect. “If we stop exercising our mental skills,” writes Doidge, “we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead.” Jeffrey Schwartz, a professor of psychiatry at UCLA’s medical school, terms this process “survival of the busiest.” The mental skills we sacrifice may be as valuable as, or even more valuable than, the ones we gain. When it comes to the quality of our thought, our neurons and synapses are entirely indifferent. The possibility of intellectual decay is inherent in the malleability of our brains.

The historical advances in cartography didn’t simply mirror the development of the human mind. They helped propel and guide the very intellectual advances that they documented. The map is a medium that not only stores and transmits information but also embodies a particular mode of seeing and thinking. As mapmaking progressed, the spread of maps also disseminated the mapmaker’s distinctive way of perceiving and making sense of the world. The more frequently and intensely people used maps, the more their minds came to understand reality in the maps’ terms. The influence of maps went far beyond their practical employment in establishing property boundaries and charting routes. “The use of a reduced, substitute space for that of reality,” explains the cartographic historian Arthur Robinson, “is an impressive act in itself.” But what’s even more impressive is how the map “advanced the evolution of abstract thinking” throughout society. “The combination of the reduction of reality and the construct of an analogical space is an attainment in abstract thinking of a very high order indeed,” writes Robinson, “for it enables one to discover structures that would remain unknown if not mapped.” The technology of the map gave to man a new and more comprehending mind, better able to understand the unseen forces that shape his surroundings and his existence.

As the stories of the map and the mechanical clock illustrate, intellectual technologies, when they come into popular use, often promote new ways of thinking or extend to the general population established ways of thinking that had been limited to a small, elite group. Every intellectual technology, to put it another way, embodies an intellectual ethic, a set of assumptions about how the human mind works or should work. The map and the clock shared a similar ethic. Both placed a new stress on measurement and abstraction, on perceiving and defining forms and processes beyond those apparent to the senses.

Language itself is not a technology. It’s native to our species. Our brains and bodies have evolved to speak and to hear words. A child learns to talk without instruction, as a fledgling bird learns to fly. Because reading and writing have become so central to our identity and culture, it’s easy to assume that they, too, are innate talents. But they’re not. Reading and writing are unnatural acts, made possible by the purposeful development of the alphabet and many other technologies. Our minds have to be taught how to translate the symbolic characters we see into the language we understand. Reading and writing require schooling and practice, the deliberate shaping of the brain.

The written word is “a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance.” Those who rely on reading for their knowledge will “seem to know much, while for the most part they know nothing.” They will be “filled, not with wisdom, but with the conceit of wisdom.”

It’s hard for us to imagine today, but no spaces separated the words in early writing. In the books inked by scribes, words ran together without any break across every line on every page, in what’s now referred to as scriptura continua. The lack of word separation reflected language’s origins in speech. When we talk, we don’t insert pauses between each word—long stretches of syllables flow unbroken from our lips. It would never have crossed the minds of the first writers to put blank spaces between words. They were simply transcribing speech, writing what their ears told them to write. (Today, when young children begin to write, they also run their words together. Like the early scribes, they write what they hear.)

Even the earliest silent readers recognized the striking change in their consciousness that took place as they immersed themselves in the pages of a book. The medieval bishop Isaac of Syria described how, whenever he read to himself, “as in a dream, I enter a state when my sense and thoughts are concentrated. Then, when with prolonging of this silence the turmoil of memories is stilled in my heart, ceaseless waves of joy are sent me by inner thoughts, beyond expectation suddenly arising to delight my heart.” Reading a book was a meditative act, but it didn’t involve a clearing of the mind. Readers disengaged their attention from the outward flow of passing stimuli in order to engage it more deeply with an inward flow of words, ideas, and emotions. That was—and is—the essence of the unique mental process of deep reading.

What Turing could not have anticipated was the way his universal machine would, just a few decades after his death, become our universal medium. Because the different sorts of information distributed by traditional media—words, numbers, sounds, images, moving pictures—can all be translated into digital code, they can all be “computed.” Everything from Beethoven’s Ninth to a porn flick can be reduced to a string of ones and zeros and processed, transmitted, and displayed or played by a computer. Today, with the Internet, we’re seeing firsthand the extraordinary implications of Turing’s discovery. Constructed of millions of interconnected computers and data banks, the Net is a Turing machine of immeasurable power, and it is, true to form, subsuming most of our other intellectual technologies. It’s becoming our typewriter and our printing press, our map and our clock, our calculator and our telephone, our post office and our library, our radio and our TV. It’s even taking over the functions of other computers; more and more of our software programs run through the Internet—or “in the cloud,” as the Silicon Valley types say—rather than inside our home computers.

Of the four major categories of personal media, print is now the least used, lagging well behind television, computers, and radio.

When the Net absorbs a medium, it re-creates that medium in its own image. It not only dissolves the medium’s physical form; it injects the medium’s content with hyperlinks, breaks up the content into searchable chunks, and surrounds the content with the content of all the other media it has absorbed. All these changes in the form of the content also change the way we use, experience, and even understand the content.

Dozens of studies by psychologists, neurobiologists, educators, and Web designers point to the same conclusion: when we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. It’s possible to think deeply while surfing the Net, just as it’s possible to think shallowly while reading a book, but that’s not the type of thinking the technology encourages or rewards.

It’s not just that we tend to use the Net regularly, even obsessively. It’s that the Net delivers precisely the kind of sensory and cognitive stimuli—repetitive, intensive, interactive, addictive—that have been shown to result in strong and rapid alterations in brain circuits and functions. With the exception of alphabets and number systems, the Net may well be the single most powerful mind-altering technology that has ever come into general use. At the very least, it’s the most powerful that has come along since the book.

The Net engages all of our senses—except, so far, those of smell and taste—and it engages them simultaneously.

The Net also provides a high-speed system for delivering responses and rewards—“positive reinforcements,” in psychological terms—which encourage the repetition of both physical and mental actions. When we click a link, we get something new to look at and evaluate. When we Google a keyword, we receive, in the blink of an eye, a list of interesting information to appraise. When we send a text or an instant message or an e-mail, we often get a reply in a matter of seconds or minutes. When we use Facebook, we attract new friends or form closer bonds with old ones. When we send a tweet through Twitter, we gain new followers. When we write a blog post, we get comments from readers or links from other bloggers. The Net’s interactivity gives us powerful new tools for finding information, expressing ourselves, and conversing with others. It also turns us into lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.

The Net’s cacophony of stimuli short-circuits both conscious and unconscious thought, preventing our minds from thinking either deeply or creatively.

The depth of our intelligence hinges on our ability to transfer information from working memory to long-term memory and weave it into conceptual schemas. But the passage from working memory to long-term memory also forms the major bottleneck in our brain. Unlike long-term memory, which has a vast capacity, working memory is able to hold only a very small amount of information. In a renowned 1956 paper, “The Magical Number Seven, Plus or Minus Two,” Princeton psychologist George Miller observed that working memory could typically hold just seven pieces, or “elements,” of information. Even that is now considered an overstatement. According to Sweller, current evidence suggests that “we can process no more than about two to four elements at any given time with the actual number probably being at the lower [rather] than the higher end of the scale.” Those elements that we are able to hold in working memory will, moreover, quickly vanish “unless we are able to refresh them by rehearsal.”

Research was painting a fuller, very different picture of the cognitive effects of hypertext. Evaluating links and navigating a path through them, it turned out, involved mentally demanding problem-solving tasks that are extraneous to the act of reading itself. Deciphering hypertext substantially increases readers’ cognitive load and hence weakens their ability to comprehend and retain what they’re reading. A 1989 study showed that readers of hypertext often ended up clicking distractedly “through pages instead of reading them carefully.”

Even though the World Wide Web has made hypertext commonplace, indeed ubiquitous, research continues to show that people who read linear text comprehend more, remember more, and learn more than those who read text peppered with links.

“Auditory and visual working memory are separate, at least to some extent, and because they are separate, effective working memory may be increased by using both processors rather than one.”

The Internet, however, wasn’t built by educators to optimize learning.

Every time we shift our attention, our brain has to reorient itself, further taxing our mental resources. The near-continuous stream of new information pumped out by the Web also plays to our natural tendency to “vastly overvalue what happens to us right now,” as Union College psychologist Christopher Chabris explains. We crave the new even when we know that “the new is more often trivial than essential.”

The ability to skim text is every bit as important as the ability to read deeply. What is different, and troubling, is that skimming is becoming our dominant mode of reading. Once a means to an end, a way to identify information for deeper study, scanning is becoming an end in itself—our preferred way of gathering and making sense of information of all sorts. We’ve reached the point where a Rhodes Scholar like Florida State’s Joe O’Shea—a philosophy major, no less—is comfortable admitting not only that he doesn’t read books but that he doesn’t see any particular need to read them. Why bother, when you can Google the bits and pieces you need in a fraction of a second? What we’re experiencing is, in a metaphorical sense, a reversal of the early trajectory of civilization: we are evolving from being cultivators of personal knowledge to being hunters and gatherers in the electronic data forest.

“The more you multitask, the less deliberative you become; the less able to think and reason out a problem.” You become, he argues, more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought. David Meyer, a University of Michigan neuroscientist and one of the leading experts on multitasking, makes a similar point. As we gain more experience in rapidly shifting our attention, we may “overcome some of the inefficiencies” inherent in multitasking, he says, “but except in rare circumstances, you can train until you’re blue in the face and you’d never be as good as if you just focused on one thing at a time.” What we’re doing when we multitask “is learning to be skillful at a superficial level.” The Roman philosopher Seneca may have put it best two thousand years ago: “To be everywhere is to be nowhere.”

Erasmus, who as a schoolboy had memorized great swaths of classical literature, including the complete works of the poet Horace and the playwright Terence, was not recommending memorization for memorization’s sake or as a rote exercise for retaining facts. To him, memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading. He believed, as the classical historian Erika Rummel explains, that a person should “digest or internalize what he learns and reflect rather than slavishly reproduce the desirable qualities of the model author.” Far from being a mechanical, mindless process, Erasmus’s brand of memorization engaged the mind fully. It required, Rummel writes, “creativeness and judgment.”

Memory, for Seneca as for Erasmus, was as much a crucible as a container. It was more than the sum of things remembered. It was something newly made, the essence of a unique self.

What gives real memory its richness and its character, not to mention its mystery and fragility, is its contingency. It exists in time, changing as the body changes. Indeed, the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals. Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections—a new context.

We don’t constrain our mental powers when we store new long-term memories. We strengthen them. With each expansion of our memory comes an enlargement of our intelligence.

He knew from experience with time-sharing networks that the role of computers would expand beyond the automation of governmental and industrial processes. Computers would come to mediate the activities that define people’s everyday lives—how they learn, how they think, how they socialize. What the history of intellectual technologies shows us, he warned, is that “the introduction of computers into some complex human activities may constitute an irreversible commitment.” Our intellectual and social lives may, like our industrial routines, come to reflect the form that the computer imposes on them.

What makes us most human, Weizenbaum had come to believe, is what is least computable about us—the connections between our mind and our body, the experiences that shape our memory and our thinking, our capacity for emotion and empathy. The great danger we face as we become more intimately involved with our computers—as we come to experience more of our lives through the disembodied symbols flickering across our screens—is that we’ll begin to lose our humanness, to sacrifice the very qualities that separate us from machines.

In explaining how technologies numb the very faculties they amplify, to the point even of “autoamputation,” McLuhan was not trying to romanticize society as it existed before the invention of maps or clocks or power looms. Alienation, he understood, is an inevitable by-product of the use of technology. Whenever we use a tool to exert greater control over the outside world, we change our relationship with the world. Control can be wielded only from a psychological distance. In some cases, alienation is precisely what gives a tool its value. We build houses and sew Gore-Tex jackets because we want to be alienated from the wind and the rain and the cold. We build public sewers because we want to maintain a healthy distance from our own filth. Nature isn’t our enemy, but neither is it our friend. McLuhan’s point was that an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained.

The more that people depended on explicit guidance from software programs, the less engaged they were in the task and the less they ended up learning.

A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition. Their brains become both calmer and sharper. The reason, according to attention restoration theory, or ART, is that when people aren’t being bombarded by external stimuli, their brains can, in effect, relax. They no longer have to tax their working memories by processing a stream of bottom-up distractions. The resulting state of contemplativeness strengthens their ability to control their minds.

