In making these claims, Pinker reminds us that he enthusiastically joined Leda Cosmides and John Tooby as one of the founders of the most extreme version of cognitive evolutionary psychology (CEP), the "modular brain" theory. We know that the brain has all sorts of very specific specializations, but the notion of opaque, independent functional modules is far from universally accepted. Authors who have published intelligent, scholarly critiques of CEP, or who stress the importance of non-modular aspects of brain function, include Kenan Malik, David Buller, Merlin Donald, Terrence Deacon, Terrence Sejnowski, and Jeffrey Schwartz. The contrast between Deacon and Pinker on the role of language in the evolution of the mind is particularly interesting.
My point is not at all to "debunk" CEP by listing people who offer other kinds of theory; I think CEP is a viable framework that probably gets some things right about the evolution of the mind. My point is that it seems too radical to claim that the mind and brain are simply and entirely modular in the way they would have to be for Pinker's statements above to be completely true. Pinker wants us to believe that the brain is modular and that experience cannot affect general abilities, yet he can't help using the term "deep reflection." It is difficult to imagine how such a thing as "deep reflection" even makes sense in the modular, independent architecture that Pinker insists protects our intellectual functions from the potentially deleterious effects of experience.
Pinker even explicitly acknowledges the work it takes to develop intellectual depth:
It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate.
If it takes so much work to develop intellectual depth, how is it reasonable to then also argue that our thinking can't be affected by experience, or by activities that detract from this developmental process?
Yes, he is right that there are limits to how much the brain is reshaped by experience, but I think he significantly overstates the case. The issue is one of specifics, not the general principle of neuroplasticity: what specific effects do specific activities have on our mind and brain over specific timescales?
Pinker may very well be right that the web is not itself degrading our capacity for reflective thinking the way Nicholas Carr argues it is. Carr perhaps goes over the top when he attributes his failing ability to concentrate specifically to his use of the web. However, Pinker goes too far when he implies that the idea is simply silly in principle. It remains an empirical question, not just a conceptual one, unless we're replacing cognitive neuroscience with Pinkerist modularism.
Pinker also misses a much more important point: that our attitude toward technology affects the way it shapes our daily life. He assumes that the university activities he takes for granted will continue to be valued simply because he himself takes their value for granted. The university was not always there, and there is no reason to assume it will always be there if we stop arguing for its value.
Pinker's ironic conclusion:
And to encourage intellectual depth, don’t rail at PowerPoint or Google. It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate. They are not granted by propping a heavy encyclopedia on your lap, nor are they taken away by efficient access to information on the Internet.
The new media have caught on for a reason. Knowledge is increasing exponentially; human brainpower and waking hours are not. Fortunately, the Internet and information technologies are helping us manage, search and retrieve our collective intellectual output at different scales, from Twitter and previews to e-books and online encyclopedias. Far from making us stupid, these technologies are the only things that will keep us smart.
Is it really our knowledge that is increasing exponentially, or is it available information?
The fact that Pinker seems to conflate the two is exactly the question-begging that ignores the most central question of modern cultural criticism: are we gaining more knowledge because we are exposed to more information?
Against the cultural critics, the smart among us have always managed to take responsibility for their own minds and organize the available information and think deeply enough to create meaningful individual knowledge from it.
Against Pinker and the others who think cultural critics are just "panicking," the fact that some smart people always manage to cultivate knowledge in spite of the challenges offered by new tools doesn't mean that everyone else will automatically inherit that ability.
If we assume that simply having access to a lot of information will make us smart (this is not an exaggeration; it is literally how many Internet optimists think), we will end up missing the real issue.
The real issue is not whether the Internet fries your brain; there is no really good evidence so far that it does. The real issue is whether we recognize and continue to appreciate the work it takes to cultivate knowledge and expertise, or whether we take these things for granted.
Update: Nicholas Carr responds to Pinker on his own RoughType blog.
Update: Commentary by Nick Bilton, with some useful references, sensibly argues that each form of media has its own potential unique value: http://bits.blogs.nytimes.com/2010/06/11/in-defense-of-computers-the-internet-and-our-brains/?ref=technology
Update 6/15: Some of the reviewers do reflect more thoughtfully on bigger-picture issues and ask somewhat deeper questions than just the alarmist one of whether the Internet is "frying our brains." See this review in The New Republic by Todd Gitlin: "The Uses of Half-Truths."
Also: Nicholas Carr and Douglas Rushkoff respond to Pinker on EDGE.
This is a very important essay, covering both Carr and Pinker, and Mangen and Dr. Wolf. But Todd, I hope you can blog about or interview me on my ideas here below:
It is my hunch that reading on screens is not "reading" per se, but a new form of human reading, and I call it "screening" for now, until a better word comes down the line, and it will, someday. Soon. I have been trying to alert the media and newspapers to this, but not one reporter will interview me. I have contacted Newsweek and Time and the NYTimes and the Atlantic and the Boston Globe, and not one outlet will publish my eccentric views on this. But watch: future MRI scan studies at Tufts and UCLA will prove that reading on paper surfaces lights up different parts of our brains than when we read on screens, and that reading on paper is vastly superior for processing of info, retention of info, analysis of info, and critical thinking about the info read. I have no PhD, so nobody listens to me, but let some Times reporter interview Dr. Wolf and Dr. Tenner and Anne Mangen in Norway, and Paul Saffo and Kevin Kelly and Marvin Minsky; they all agree with me. The Times will listen to them. Sharon Begley at Newsweek is writing a big cover story about this now, as is the New York Times Sunday magazine, and Time has a summer cover on this too. See more at my blogs. - Danny Bloom, Tufts 1971
To sum up: reading on screens is not reading per se. It is a new form of human reading, vastly inferior to paper reading. But what does this mean for the future of civilization, and does anybody care? I do.
danbloom AT gmail DOT com
Thanks very much for your comments, Dan.
I really appreciate your frustration. I found that a lot of smart people seemed to be dismissing Carr out of hand, even without reading his book. Now, I don't agree with his most pessimistic message, that the different ways of using attention tend to destroy each other because the brain is permanently reshaped by particular media use. But I do think he makes a reasonable argument for the possibility, and he says a lot of other things that are not particularly implausible.
Sure, he has written some seemingly alarmist articles about Google making us stupid and so on, but the negative over-reaction to his ideas really caught my attention as well. I would guess you are experiencing some of the same. And the harder it is to get people to listen, the more you want to get your ideas out there.
From my perspective, I suspect that there are real, functionally relevant differences between holding a physical print medium in your hands and reading it, especially given the way I taught myself to study: by marking, annotating, diagramming, bookmarking, reviewing, highlighting, etc. Even with the best reading tech, and the highest resolution available, these things are still easier and more efficient with print media. The physical medium is an extension of our mind, an analog, in a way that electronic books are not, at least yet. I remember the position of something I read on the page and the position of the page in the book. My brain maps the knowledge I obtained from the book onto a visual picture of the book in my memory, and this gives me an index for reflective thinking.
Perhaps this will become possible with future reading technologies, and perhaps not. I know I am reluctant to let go of physical books to try to find out!
I do care, very much, but I find that I need to be optimistic about future tech because I don't see myself being able to oppose the cultural and economic forces pushing toward e-books and web media, even with my best arguments. I am not excited about tilting at windmills, as much as I probably seem like a curmudgeon in some of my articles. I am more interested in knowing how I can make better use of the things I can't stop, and keep using the things I find benefit me.