The interface between the self and the Web has been a topic that I think about a lot. I’ve written previously about Sherry Turkle’s work and her new book, Evocative Objects, and some of the ways the porous membrane between a cyber persona and a physical self can almost disappear.
The generational implications of the last ten years of technological development are also provocative. Who knows what will shift and change for my children and their deviced and gadgeted cohorts?
The following excerpt is from an article by Clive Thompson in Wired and suggests some interesting twists on these themes.
This summer, neuroscientist Ian Robertson polled 3,000 people and found that the younger ones were less able than their elders to recall standard personal info. When Robertson asked his subjects to tell him a relative’s birth date, 87 percent of respondents over age 50 could recite it, while less than 40 percent of those under 30 could do so. And when he asked them their own phone number, fully one-third of the youngsters drew a blank. They had to whip out their handsets to look it up.
That reflexive gesture — reaching into your pocket for the answer — tells the story in a nutshell. Mobile phones can store 500 numbers in their memory, so why would you bother trying to cram the same info into your own memory? Younger Americans today are the first generation to grow up with go-everywhere gadgets and services that exist specifically to remember things so that we don’t have to: BlackBerrys, phones, thumb drives, Gmail.
I’ve long noticed this phenomenon in my own life. I can’t remember a single friend’s email address. Hell, sometimes I have to search my inbox to remember an associate’s last name. Friends of mine space out on lunch dates unless Outlook pings them. And when it comes to cultural trivia — celebrity names, song lyrics — I’ve almost given up making an effort to remember anything, because I can instantly retrieve the information online.
In fact, the line between where my memory leaves off and Google picks up is getting blurrier by the second. Often when I’m talking on the phone, I hit Wikipedia and search engines to explore the subject at hand, harnessing the results to buttress my arguments.
My point is that the cyborg future is here. Almost without noticing it, we’ve outsourced important peripheral brain functions to the silicon around us.
And frankly, I kind of like it. I feel much smarter when I’m using the Internet as a mental plug-in during my daily chitchat. Say you mention the movie Once: I’ve never seen it, but in 10 seconds I’ll have reviewed a summary of the plot, the actors, and its cultural impact. Machine memory even changes the way I communicate, because I continually stud my IMs with links, essentially impregnating my very words with extra intelligence.
You could argue that by offloading data onto silicon, we free our own gray matter for more germanely “human” tasks like brainstorming and daydreaming. What’s more, the perfect recall of silicon memory can be an enormous boon to thinking. For example, I’ve been blogging for four years, which means I’ve poured out about a million words’ worth of my thoughts online. This regularly produces the surreal and delightful experience of Googling a topic only to unearth an old post that I don’t even remember writing. The machine helps me rediscover things I’d forgotten I knew — it’s what author Cory Doctorow refers to as an “outboard brain.”
Still, I have nagging worries. Sure, I’m a veritable genius when I’m on the grid, but am I mentally crippled when I’m not? Does an overreliance on machine memory shut down other important ways of understanding the world?
There’s another type of intelligence that comes not from rapid-fire pattern recognition but from slowly ingesting and retaining a lifetime’s worth of facts. You read about the discoveries of Madame Curie and the history of the countries bordering Iraq. You read War and Peace. Then you let it all ferment in the back of your mind for decades, until, bang, it suddenly coalesces into a brilliant insight. (If Afghanistan had stores of uranium, the Russians would’ve discovered nuclear energy before 1917!)
We’ve come to think of human intelligence as being like an Intel processor, able to quickly analyze data and spot patterns. Maybe there’s just as much value in the ability to marinate in the seemingly trivial.
Of course, it’s probably not an either/or proposition. I want both: I want my organic brain to contain vast stores of knowledge and my silicon overmind to contain a stupidly huge amount more.
At the very least, I’d like to be able to remember my own phone number.
Deborah – This disturbs me because of what I perceive as a disconnect between facts, as recited in a factual-information style – mainly a functioning of mind – and information that is an amalgam of directly experienced fact, influencing an almost visceral aspect of memory that engages not only the mind but the sensory apparatus that so shapes how we perceive and remember “things”. I think I would prefer to think myself less “intelligent”, or less in control of the flow of information, because it is not only what I know as fact that might be so important, but how I process facts through my own idiosyncratic perceptual and sensory sense of self – which might leave me less informed, but informed at a deeper level of unity and coherence. It’s a toss-up between quantity and quality, for me. G
G, Thank you for responding and offering up a valid (and important) set of concerns. The excerpt is not a statement of my personal point of view (that sounds like the disclaimer at the beginning of a DVD) but is included because it is provocative. To me, anyway. Those of us who regularly tune into channels not available on iPhones or the Web sometimes take a more detached position on these matters. I hope that isn’t read as cavalier, since that was not my intention. I always like hearing your well-considered point of view.