Have We Reached ``Peak Attention''?
Why Lifehacking is ``morphine for a bed of nails''
Talk given November 2008, Fordham University
``Peak oil,'' ``peak water'' and ``peak everything'' -- these are predictions, based on mathematical analysis of supply and demand, that purportedly identify the moment when humanity has reached the maximum level of supply for a given commodity. After the peak, we're bound to drill, mine, bottle or harvest less of the stuff with every passing year.
Part of today's conventional wisdom about digital media is the belief that we're at the point of ``peak attention.'' We have a limited supply of attention, we can't get more out of our brains, and demand for more -- from all that stuff we are supposed to know, track, remember, analyze -- is growing very fast.
As with oil or wheat, this supply-and-demand problem prompts people to look for substitutes that can take on some of the functions of the scarce commodity. Hence the market for what I call ``attention prosthetics'' -- machines that do attending for us (software, for example) and drugs like Ritalin and Adderall that enhance the ability to focus. The market for these things is growing because Americans today fear ADD in their kids, dementia in their elders, and, in themselves, an inadequate capacity to attend, gather, and act on information. We go in for ``lifehacking,'' imagining that there are subtle secrets out there that will help us do better at reading our email and scheduling gym workouts. As for the future, many people welcome talk of implants in the brain and genetic alterations to make attention more abundant. It's likely that these attention prosthetics -- be they drugs, software, gadgets or implants -- will proliferate.
The culture of attention prosthetics feeds the anxiety that it is supposed to assuage: As more of daily life is suffused with the assumption that attention is scarce, we have a harder time imagining any other way to see ourselves. Our stuff confirms the theory, which makes us get more stuff, which further confirms the theory. It seems common sense to believe, as Nobel laureate Herbert Simon put it: ``Information used to be the scarce resource. Now attention is the scarce resource.''
The spirit of this conventional wisdom is embodied, for example, in this much repeated factoid: ``A typical issue of the New York Times contains more information than the average person in the seventeenth century was likely to encounter during his entire lifetime.''
We should be skeptical about this claim that ours is a special age, if only because of the Copernican principle: Astronomers know it is safest to presume, when they study the cosmos, that our circumstances, whether they be in space or in time, are likely to be just average. In other words, the Copernican principle tells them the planet from which they observe, and its galaxy, are probably typical. Similarly, the place where you live is much more likely to be unremarkable than to be an outlier -- much more probably Brooklyn than Antarctica. And the same is true of your era in history. It may be that we live in an era unlike any other in its demands on the human mind. But it's not probable.
And in fact there have been other eras in which people thought demands on attention were outstripping human capacities. Here is Walter Lippmann in 1922:
How cluttered, how capricious, how superfluous and clamorous is the
ordinary urban life of our time. We learn to understand why our addled
minds seize so little with precision, why they are caught up and
tossed about in a kind of tarantella by headlines and catch-words.
And here's Jerome K. Jerome in 1889, describing ``that fretful haste, that vehement striving, that is every day becoming more and more the bane of nineteenth-century life,'' which provides us with ``luxuries that only cloy, with pleasures that bore, with empty show [...] you never know a moment's freedom from anxiety and care, never gain a moment's rest for dreamy laziness.''
I want to suggest a counter-narrative about the effects of media and communications. On this view, we are looking at the wrong end of the question when we worry about the limited supply of attention. We should be looking instead at what we think of as rising demand for attention. What is this demand, really? Is it actually taxing our capacities in new ways?
I believe the ``peak attention'' metaphor, revealing though it be of our culture's anxiety about running out of stuff, is entirely wrong. I believe we have all the attention we need, and that if we are feeling overwhelmed, overstretched, and distracted, it is not due to inevitable facts about the brain, nor to inescapable developments in media. To address our anxieties and calm ourselves down, we must look elsewhere.
What do we mean by attention? Usually one or more of these four concepts:
One is turning the spotlight of consciousness on something -- when we focus, and know we are focusing.
Two is absorption or ``flow'' -- that not entirely conscious ability to be engaged without consciously knowing what we are doing.
Three is what cognitive scientists call executive function, a set of capacities that seem to be associated most with activity in the prefrontal cortex, that are needed for planning, self-monitoring, and comparing options.
Four is unconscious awareness -- all those aspects of the world that we notice without knowing. Much of this awareness is filtered through a set of biases and predispositions that appear to be built into the mind.
For example, recent studies have shown:
- People are more attentive to movements by animals than by cars and trucks. Even people who grew up around vehicles, in urban environments, were more likely to notice an animal's roadside movement than a car's.
- People's decisions about another person were affected by what others said -- even when the ``gossip'' added nothing to what they already knew.
- People write fast when hearing a race car driver's name, slow down when presented with stereotypes of the elderly, and are more lax in their moral judgments if they have washed their hands before discussing an ethical issue.
Let me go back up this list. The fourth aspect of attention, unconscious attending, is important because it's the root of recent findings, in behavioral economics especially, that have overturned our view that people are rational maximizers of utility -- that in our decisions, we behave as ``rational economic man.'' Since we are neither rational nor fully conscious when we make our choices, we cannot easily say whether we are attention-poor. Much of attention takes place outside awareness.
The third meaning of attention on my list -- conscious planning and anticipating -- is the main concern of attention-extending tools and drugs. They're sold by scaring people into thinking that their planning/evaluating/comparing capacities have much more work to do than they did in our ancestors' time. The rhetoric of scarce attention is about helping this executive capacity protect the remaining two aspects on my list -- the ability to lose yourself in some activity, and to turn the spotlight where you want it.
Now, there is no doubt that we're trying to do different things with the spotlight than our ancestors did. We can easily touch far more written, recorded, and video information, and cram our heads with it. But it does not follow that we're doing more.
Here is an example of what past people attended to. It's from the anthropologist Louis Liebenberg, describing a day in the Kalahari Desert:
While tracking down a solitary wildebeest spoor [tracks] of the previous evening !Xo trackers pointed out evidence of trampling which indicated that the animal had slept at that spot. They explained consequently that the spoor leaving the sleeping place had been made early that morning and was therefore relatively fresh. The spoor then followed a straight course, indicating that the animal was on its way to a specific destination. After a while, one tracker started to investigate several sets of footprints in a particular area. He pointed out that these footprints all belonged to the same animal, but were made during the previous days. He explained that the particular area was the feeding ground of that particular wildebeest. Since it was, by that time, about mid-day, it could be expected that the wildebeest may be resting in the shade in the near vicinity.
Now, when someone argues that the New York Times contains more information in a day than a 17th century person sees in a lifetime, they're invoking today's parochial definition of information. They're counting every new demand created by the medium of print -- or web, if that's how you get your Times. But they're assuming that the capacities engaged now were lying fallow before.
Yet people who live, or lived, in a non-digital environment, as hunters or farmers, pay a lot of attention in circumstances in which many of us here would be blind. They attend to other things. Where is the evidence that they attend to less?
I can use Wikipedia to find a picture of a wildebeest, but I couldn't name many trees in Central Park, or guess about tomorrow's weather from the look and smell of today, or name all my great-grandparents or my second cousins. The cliche about the Times is silly because it claims that only our kind of information -- printed, mediated, based in reading -- counts as demand.
If we look at all the information processing that the human brain can do, though, we see that the demand on our attention isn't really an exception to the general rule of human history: People attend to what matters to them, and filter out a lot of other stuff.
Filtering is, in fact, extremely important here. The fact is that attention in any of the four senses I mean -- focus, flow, planning and unconscious noticing -- is not just a matter of perceiving a stimulus. At many levels of explanation, it's about NOT noticing. About ignoring, averaging, setting aside, pruning.
To attend to this lecture you're paying attention to sounds and sights, but you're also ignoring sounds and sights, filtering them out. Your eyes are moving about in their sockets, making very small saccades, or jumps. You don't see me jiggling, though, because those irrelevant movements are filtered out in the process of translating light into neural signals. (When you look up at a star, without a point of reference in a black sky, your visual centers can't correct, and it seems to move.) I'm not pronouncing every ``the'' in exactly the same acoustic form, but you're correcting for the variation. On up the scale, to more complex information processing -- when you recall this meeting, you will have a general impression of the day as good or bad, stimulating or boring, that will be a kind of sum of all the minutes. If you're bored by me but love everybody else speaking, you'll be left with an impression that it was a pretty good morning. Because I will have been filtered out.
This is how memory works. Eric Kandel, who won the Nobel Prize in 2000 for his work on the physiology of memory, points out that the creation of long-term memory is not only a process of recording but also one of forgetting. Average lunches, workdays and errands aren't remarkable, informative or important. As he says of life, ``A lot of it is pretty depressing. It's good to be able to forget.'' People can't recall if they took their daily pill or where they parked because we prune away what is unremarkable, unimportant.
So: The mind has good filters. We're great at tuning out a landscape or a newspaper that offers no motive to scan it. Today we attend much more to what we can take in from digital media. We attend much less to other kinds of information. I see no evidence that we've hit a new limit.
So why are we all so stressed out?
I think this is the reason: We aren't being bombarded by an increased amount of important information. We're being bombarded by cunning techniques of persuasion. We have a mind well-evolved to attend to important information and ignore the rest. What has changed is not how we cope with information but how we sense what is important.
So, that's the issue: How do we decide what is important? What do we have to pay attention to, and what do we have to ignore? We're obsessed with the conscious aspects of attention -- we want to master our email, take charge of our social networking, be efficient multi-taskers -- but we're forgetting about the mounting evidence that we're paying attention all the time to stimuli that are outside conscious awareness -- stimuli that have an effect on us and our actions.
New media, as Tom de Zengotita says, expands the realm of choice. Reality is what we have no choice about. The rest is mediated.
The effect of digital everything is that we pick and choose which songs are on the iPod, which news is on the news reader, which emails go straight into the trash. It makes the world much more about the self. And people seem to like that self-importance. Everyone remembers where he was when Obama was elected, or when JFK was shot. Nobody talked about where ``I'' was on Pearl Harbor day.
There is a big problem, as de Zengotita has noted, with this expanded self -- of so much more information chosen because of its relationship to the receiver: it's much less obvious to us what's important. With fewer reality checks that say ``if you can't track a wildebeest you will starve'' we're susceptible to a sense that, hey, that unread email, that undownloaded blog post, could be important. We don't know. To the extent that information is about the self, then the self is the measure of that information's importance. The self that is inconstant in its minute-by-minute changes, that is highly influenced by circumstances; the self that is, in its perceptions and responses, largely unknown to the conscious mind. As an instrument to judge information, this self is unreliable. We sense this and it makes us nervous.
Instead of a reality test that tells us what to ignore and what to attend, we have inconsistent impulses, shifting with our moods and experiences. The more people assert themselves to control information, in other words, the less certain they are about their filters. In an information-rich world, asserting yourself is the opposite of protecting yourself.
This condition -- self-assertion leading to uncertainty about filters, leading to anxiety that we might miss something, leading to more self-assertion -- makes us vulnerable to rhetorical devices that reach that unconscious part of attention, that speak to the emotions directly. And by ``rhetorical devices'' I follow Richard Lanham in referring to visual appeals as well as those made of words.
Why does this matter?
Because if we think our problem is inherent in modern life, inevitable, then we cannot change our situation.
But if we see that the problem is located not in the world but in our own emotions, then we can do something about this attention problem of ours. We can do something at the personal level, involving self control, and, at the societal level, involving a collective decision to protect ourselves from attention rhetoric. We really would be better off if we could see that we aren't overwhelmed by information. We're overwhelmed by information anxiety.
People who have been raised in other traditions, far from those of our info-glut culture, are used to attending to matters outside themselves, and so they seem to know that ``peak attention'' is a myth. Some of them argue for the value of the natural world, in contrast to the human-made one. Others are religious. These skeptics have something to tell us about the narrowness of what we think of as information. I'll leave you with an example: The essayist and novelist Marilynne Robinson, reflecting on just how much attention we have to spare, if only we'd look differently at our mediated world.
``We find comfort in anxiety because it engrosses our attention, which we have in surplus, and are usually at a loss to employ. And anxiety is a stimulant, like love, like hatred, though generally not so prone to extravagant expression as they are, indeed, even secretive, and therefore liable suddenly to produce great effects from what are apparently very small causes. [...] It is as if we took morphine to help us sleep on a bed of nails.''