Conclusion


The Foraging Brain

I began this book with the allegation that most of our instincts about learning are misplaced, incomplete, or flat wrong. That we invent learning theories out of whole cloth, that our thinking is rooted more in superstition than in science, and that we misidentify the sources of our frustration: that we get in our own way, unnecessarily, all the time. In the chapters that followed, I demonstrated as much, describing landmark experiments and some of the latest thinking about how remembering, forgetting, and learning are all closely related in ways that are neither obvious nor intuitive. I also showed how those unexpected relationships can be exploited by using specific learning techniques.
What I have not done is try to explain why we don’t know all this already.
If learning is so critical to survival, why do we remain so ignorant about when, where, and how it happens? We do it naturally, after all. We think about how best to practice, try new approaches, ask others we think are smarter for advice. The drive to improve never really ends, either. By all rights, we should have developed pretty keen instincts about how best to approach learning. But we haven’t, and the reasons why aren’t at all apparent. No one that I know of has come forward with a convincing explanation, and the truth is, there may not be one.
I do have one of my own, however, and it’s this: School was born yesterday. English class, Intro to Trig, study hall, soccer practice, piano lessons, social studies, art history, the Russian novel, organic chemistry, Zeno’s paradoxes, jazz trumpet, Sophocles and sophomore year, Josephus and gym class, Modern Poetry and Ancient Civilizations: All of it, every last component of what we call education, is a recent invention in the larger scheme of things. Those “ancient” civilizations we studied in middle school? They’re not so ancient, after all. They date from a few thousand years ago, no more. Humans have been around for at least a million years, and for the vast majority of that time we’ve been preoccupied with food, shelter, and safety. We’ve been avoiding predators, ducking heavy weather, surviving by our wits, foraging. And life for foragers, as the Harvard psychologist Steven Pinker so succinctly puts it, “is a camping trip that never ends.”
Our foraging past had some not so obvious consequences for learning. Think for a moment about what it meant, that lifelong camping trip. Hunting and tracking were your reading and writing. Mapping the local environment—its every gully, clearing, and secret garden—was your geometry. The science curriculum included botany, knowing which plant had edible berries and which medicinal properties; and animal behavior, knowing the hunting routines of predators, the feeding habits of prey.
Over the years you’d get an education, all right. Some of it would come from elders and peers, but most of it would be accumulated through experience. Listening. Watching. Exploring the world in ever-widening circles. That is how the brain grew up learning, piecemeal and on the fly, at all hours of the day, in every kind of weather. As we foraged for food, the brain adapted to absorb—at maximum efficiency—the most valuable cues and survival lessons along the way.
It became a forager, too—for information, for strategies, for clever ways to foil other species’ defenses and live off the land. That’s the academy where our brains learned to learn, and it defines who we are and how we came to be human.
Humans fill what the anthropologists John Tooby and Irven DeVore called the “cognitive niche” in evolutionary history. Species thrive at the expense of others, each developing defenses and weapons to try to dominate the niche it’s in. The woodpecker evolved an extraordinary bone structure to pound holes in tough bark and feed on the insects hidden in trees. The brown bat evolved an internal sonar, called echolocation, allowing it to hunt insects at dusk. We evolved to outwit our competitors, by observing, by testing our intuitions, by devising tools, traps, fishhooks, theories, and more.
The modern institution of education, which grew out of those vestigial ways of learning, has produced generations of people with dazzling skills, skills that would look nothing less than magical to our foraging ancestors. Yet its language, customs, and schedules—dividing the day into chunks (classes, practices) and off-hours into “study time” (homework)—have come to define how we think the brain works, or should work. That definition is so well known that it’s taken for granted, never questioned. We all “know” we need to be organized, to develop good, consistent study routines, to find a quiet place and avoid distractions, to focus on one skill at a time, and above all, to concentrate on our work. What’s to question about that?
A lot, it turns out. Take “concentration,” for example, that most basic educational necessity, that mental flow we’re told is so precious to learning. What is concentration, exactly? We all have an idea of what it means. We know it when we see it, and we’d like more of it. Yet it’s an ideal, a mirage, a word that blurs the reality of what the brain actually does while learning.
I remember bringing my younger daughter to my newspaper office one weekend a few years ago when she was twelve. I was consumed with a story I had to finish, so I parked her at an empty desk near mine and logged her into the computer. And then I strapped in at my desk and focused on finishing—focused hard. Occasionally, I looked up and was relieved to see that she was typing and seemed engrossed, too. After a couple hours of intense work, I finished the story and sent it off to my editor. At which point, I asked my daughter what she’d been up to. She showed me. She’d been keeping a moment-to-moment log of my behavior as I worked. She’d been taking field notes, like Jane Goodall observing one of her chimpanzees:
10:46—types
10:46—scratches head
10:47—gets papers from printer
10:47—turns chair around
10:48—turns chair back around
10:49—sighs
10:49—sips tea
10:50—stares at computer
10:51—puts on headset
10:51—calls person, first word is “dude”
10:52—hangs up
10:52—puts finger to face, midway between mouth and chin, thinking pose?
10:53—friend comes to desk, he laughs
10:53—scratches ear while talking
And so on, for three pages. I objected. She was razzing me, naturally, but the phone call wasn’t true, was it? Did I make a call? Hadn’t I been focused the whole time, locked in, hardly looking away from my screen? Hadn’t I come in and cranked out my story without coming up for air? Apparently not, not even close. The truth was, she could never have invented all those entries, all that detail. I did the work, all right, and I’d had to focus on it. Except that, to an outside observer, I looked fidgety, distracted—unfocused.
The point is not that concentration doesn’t exist, or isn’t important. It’s that it doesn’t necessarily look or feel like we’ve been told it does. Concentration may, in fact, include any number of breaks, diversions, and random thoughts. That’s why many of the techniques described in this book might seem unusual at first, or out of step with what we’re told to expect. We’re still in foraging mode to a larger extent than we know. The brain has not yet adapted to “fit” the vocabulary of modern education, and the assumptions built into that vocabulary mask its true nature as a learning organ.
The fact that we can and do master modern inventions like Euclidean proofs, the intricacies of bond derivatives, and the fretboard hardly means those ancient instincts are irrelevant or outmoded. On the contrary, many scientists suspect that the same neural networks that helped us find our way back to the campsite have been “repurposed” to help us find our way through the catacombs of academic and motor domains. Once central to tracking our location in physical space, those networks adjusted to the demands of education and training. We don’t need them to get home anymore. We know our address. The brain’s internal GPS—it long ago evolved internal communities of so-called grid cells and place cells, to spare us the death sentence of getting lost—has retuned itself. It has adapted, if not yet perfectly.
Scientists are still trying to work out how those cells help us find our way in modern-day learning. One encompassing theory is called the Meaning Maintenance Model, and the idea is this: Being lost, confused, or disoriented creates a feeling of distress. To relieve that distress, the brain kicks into high gear, trying to find or make meaning, looking for patterns, some way out of its bind—some path back to the campsite. “We have a need for structure, for things to make sense, and when they don’t, we’re so motivated to get rid of that feeling that our response can be generative,” Travis Proulx, a psychologist at Tilburg University in the Netherlands, told me. “We begin to hunger for meaningful patterns, and that can help with certain kinds of learning.”
Which kinds? We don’t know for sure, not yet. In one experiment, Proulx and Steven J. Heine, a psychologist at the University of British Columbia, found that deliberately confusing college students—by having them read a nonsensical short story based on one by Franz Kafka—improved their performance by almost 30 percent on a test of hidden pattern recognition, similar to the colored egg test we discussed in Chapter 10. The improvements were subconscious; the students had no awareness they were picking up more. “Kafka starts out normally, the first couple pages make you think it’s going to be a standard narrative and then it gets stranger and stranger,” Proulx told me. “Psychologists don’t really have a word for the feeling that he creates, but to me it goes back to the older existentialists, to a nostalgia for unity, a feeling of uncanniness. It’s unnerving. You want to find your way back to meaning, and that’s what we think helps you to extract these very complex patterns in this artificial grammar, and perhaps essential patterns in much more that we’re asked to study.”
When we describe ourselves as being “lost” in some class or subject, that sentiment can be self-fulfilling, a prelude to failure or permission to disengage entirely, to stop trying. For the living brain, however, being lost—literally, in some wasteland, or figuratively, in The Waste Land—is not the same as being helpless. On the contrary, disorientation flips the GPS settings to “hypersensitive,” warming the mental circuits behind incubation, percolation, even the nocturnal insights of sleep. If the learner is motivated at all, he or she is now mentally poised to find the way home. Being lost is not necessarily the end of the line, then. Just as often, it’s a beginning.
• • •

I have been a science reporter for twenty-eight years, my entire working life, and for most of that time I had little interest in writing a nonfiction book for adults. It was too close to my day job. When you spend eight or nine hours a day sorting through studies, interviewing scientists, chasing down contrary evidence and arguments, you want to shut down the factory at the end of the day. You don’t want to do more of the same; you don’t want to do more at all. So I wrote fiction instead—two science-based mysteries for kids—adventures in made-up places starring made-up characters. As far from newspapering as I could get.
The science itself is what turned me around. Learning science, cognitive psychology, the study of memory—call it what you like. The more I discovered about it, the stronger the urge to do something bigger than a news story. It dawned on me that all these scientists, toiling in obscurity, were producing a body of work that was more than interesting or illuminating or groundbreaking. It was practical, and not only that, it played right into the way I had blossomed as a student all those years ago, when I let go of the reins a bit and widened the margins. I was all over the place in college. I lived in casual defiance of any good study habits and also lived—more so than I ever would have following “good” study habits—with the material I was trying to master. My grades were slightly better than in high school, in much harder courses. In a way, I have been experimenting with that approach ever since.
The findings from learning science have allowed me to turn my scattered nonstrategy into tactics, a game plan. These findings aren’t merely surprising. They’re specific and useful. Right now. Today. And the beauty is, they can be implemented without spending a whole lot more time and effort and without investing in special classes, tutors, or prep schools.
In that sense, I see this body of work as a great equalizer. After all, there’s so much about learning that we can’t control. Our genes. Our teachers. Where we live or go to school. We can’t choose our family environment, whether Dad is a helicopter parent or helicopter pilot, whether Mom is nurturing or absent. We get what we get. If we’re lucky, that means a “sensuous education” of the James family variety, complete with tutors, travel, and decades of in-depth, full-immersion learning. If we’re not, then … not.
About the only thing we can control is how we learn. The science tells us that doing a little here, a little there, fitting our work into the pockets of the day is not some symptom of eroding “concentration,” the cultural anxiety du jour. It’s spaced study, when done as described in this book, and it results in more efficient, deeper learning, not less. The science gives us a breath of open air, the freeing sensation that we’re not crazy just because we can’t devote every hour to laser-focused practice. Learning is a restless exercise and that restlessness applies not only to the timing of study sessions but also to their content, i.e., the value of mixing up old and new material in a single sitting.
I’ve begun to incorporate learning science into a broad-based theory about how I think about life. It goes like this: Just as modern assumptions about good study habits are misleading, so, too, are our assumptions about bad habits.
Think about it for a second. Distraction, diversion, catnaps, interruptions—these aren’t mere footnotes, mundane details in an otherwise purposeful life. That’s your ten-year-old interrupting, or your dog, or your mom. That restless urge to jump up is hunger or thirst, the diversion a TV show that’s integral to your social group. You took that catnap because you were tired, and that break because you were stuck. These are the stitches that hold together our daily existence; they represent life itself, not random deviations from it. Our study and practice time needs to orient itself around them—not the other way around.
That’s not an easy idea to accept, given all we’ve been told. I didn’t trust any of these techniques much at first, even after patting my college self on the back for doing everything (mostly) right. Self-congratulation is too easy and no basis for making life changes. It was only later, when I first began to look closely at the many dimensions of forgetting, that my suspicions ebbed. I’d always assumed that forgetting was bad, a form of mental corrosion; who doesn’t?
As I dug into the science, however, I had to reverse the definition entirely. Forgetting is as critical to learning as oxygen, I saw. The other adjustments followed, with trial and error. For example, I like to finish. Interrupting myself a little early on purpose, to take advantage of the Zeigarnik effect, does not come naturally to me. Unfortunately (or fortunately), I have no choice. Being a reporter—not to mention a husband, dad, brother, son, and drinking partner—means having to drop larger projects, repeatedly, before having a chance to sit down and complete them. Percolation, then, is a real thing. It happens for me, all the time, and without it I could never have written this book.
Applying these and other techniques has not made me a genius. Brilliance is an idol, a meaningless projection, not a real goal. I’m continually caught short in topics I’m supposed to know well, and embarrassed by what I don’t know. Yet even that experience smells less of defeat than it once did. Given the dangers of fluency, or misplaced confidence, exposed ignorance seems to me like a cushioned fall. I go down, all right, but it doesn’t hurt as much as it once did. Most important, the experience acts as a reminder to check and recheck what I assume I know (to self-test).
The science of learning is not even “science” to me anymore. It’s how I live. It’s how I get the most out of what modest skills I’ve got. No more than that, and no less.
I will continue to follow the field. It’s hard not to, once you see how powerful the tools can be—and how easily deployed. The techniques I’ve laid out here are mostly small alterations that can have large benefits, and I suspect that future research will focus on applications. Yes, scientists will surely do more basic work, perhaps discovering other, better techniques and more complete theories. The clear value of what’s already there, however, begs for an investigation into how specific techniques, or combinations, suit specific topics. “Spaced interleaving” may be the best way to drive home math concepts, for instance. Teachers might begin to schedule their “final” exam for the first day of class, as well as the last. Late-night, mixed-drill practice sessions could be the wave of the future to train musicians and athletes. Here’s one prediction I’d be willing to bet money on: Perceptual learning tools will have an increasingly central role in advanced training—of surgeons and scientists, as well as pilots, radiologists, crime scene investigators, and more—and perhaps in elementary education as well.
Ultimately, though, this book is not about some golden future. The persistent, annoying, amusing, ear-scratching present is the space we want to occupy. The tools in this book are solid, they work in real time, and using them will bring you more in tune with the beautiful, if eccentric, learning machine that is your brain. Let go of what you feel you should be doing, all that repetitive, overscheduled, driven, focused ritual. Let go, and watch how the presumed enemies of learning—ignorance, distraction, interruption, restlessness, even quitting—can work in your favor.
Learning is, after all, what you do.
