After millions of years of remembering what matters, is the way memory works changing? Drew Turney investigates.
As we offload more of the storage of information onto technology, might we be losing the art of remembering for ourselves?
As long ago as 2002, writer Cory Doctorow, speaking about what would happen if his blog were to disappear, said: ‘huge swaths of acquired knowledge would simply vanish … my blog frees me up from having to remember the minutiae of my life’.
We all know the feeling to some degree: the busier life gets, the more we recruit machinery to stand in for our memory.
But is doing so changing us? When we can reach into the networks and airwaves to pluck information with impunity, is the technosphere our new collective memory (especially when – for the first time in history – virtually no fact is beyond our reach)?
We’ve always relied on external sources (including technology) for information storage, whether it’s putting a calendar note in your phone or asking the oldest tribe member where the waterhole is.
But as a recent study about Google’s effects on memory from New York’s Columbia University found, we’re better able to remember where to find the information we need than the information itself. Subjects were also less likely to remember something when they knew they could look it up online later.
The tacit conclusion is that if we know we have access to knowledge through what’s called distributed cognition or transactive memory (the web, other members of the tribe, etc), we don’t bother remembering it.
Georgetown University neurologist and bioethicist James Giordano calls those mental prompts ‘identicants’. “Rather than relatively complete ideas, we’ll initially recall iconic labels as placeholders to engage technologies to retrieve them,” he says.
Instead of internalising the torrent of information that characterises the modern age, it’s tempting to think that if we could just clear all those messy little factoids out and have machines remember them for us, our mental capacity would be freed for deeper, abstract or creative thought.
But Ian Robertson, psychologist and author of The Winner Effect, warns that even if offloading leaves you less stressed, we can’t think of the brain as a computer with finite disk space. “Your brain doesn’t get full – the permutations of connectivity are almost infinite,” he says. “The more you learn, the more you can learn. More things connect to other aspects of your memory and that makes you more skilled at storing and pulling them out.”
A better way to look at how technology is affecting our memory might be to consider how and why we remember things in the first place. Many memories – even simple ones – are tinged with emotion. Your bank’s phone number means something very different from the mobile number of a beloved in a new relationship, for example.
As Flinders University psychologist Jason McCarley points out, the Columbia University study was conducted with random facts that didn’t necessarily mean anything to the subjects. “It seems less likely we’d offload memory for information that’s meaningful or important,” he says. “So the idea that technology will compromise our general quality of thought or creativity is probably overwrought.”
As Macquarie University psychologist Amanda Barnier adds, we’re not only meaning-making machines; we also add the dimension of context, which makes raw information workable. “If the task of cognition is to make sense of things and make them relevant in everyday life, a computer can’t do that for you.”
We can also choose what information deserves deeper consideration through the simple act of paying closer attention when we know it will do something for us, whereas a computer gives every input equal weight – from a forgettable joke on Facebook to your online banking password. Repeated focus on something files it away beyond the hippocampus (the brain’s ‘memory acquisition’ apparatus) and it becomes another of the millions of mental units available for instant recall.
The emotion and focus of holding information internally also comes with an appreciation of its potential meaning. In fact, having to go beyond our borders for information might even tax the mental resources we should be putting to better use. After all, our brains have evolved to synthesise facts, not signposts. “Knowing is critical as a foundation for new and creative thinking that extends out from that base,” says University of Otago, NZ, psychologist Cliff Abraham.
“If you’re going to be successful in a profession you need to collect a lot of information,” says University of NSW professor of neuropsychiatry Perminder Sachdev. “If you don’t have readily accessible information in your head but just try to get it from other sources, it’s going to be difficult for it to lead to creative thought.”
But when we need to augment what we know and remember with the wisdom of the crowd, technology enables it like never before. “Is technology affecting our memory and how we learn?” asks computational neuroscientist Paul King. “Certainly. For those with curiosity, learning has become more self-directed and dynamic.”
So the question might not be whether technology is affecting the way we remember things, but how. Sure, we’ve never had such repositories of information at our disposal before. But after millions of years of remembering what matters, the way we remember isn’t going to fundamentally change any time soon. Because so much of Cory Doctorow’s minutiae can be stored off-brain efficiently, we may be facing the best of both worlds…