October 20, 2014

Linked In With: A Writer Who Questions the Wisdom of Teaching With Technology

Is Technology Making Your Students Stupid?

Joanie Simon

More educational technology doesn't result in more learning, says Nicholas Carr.


Multimedia—dangerous!

Online research—depthless!

Classroom screens—dubious!

If you're looking for a contrarian take on technology, Nicholas Carr is your man. In 2003 the author touched off a debate about the role of computers in business with his article "IT Doesn't Matter." He caused another kerfuffle five years later with an Atlantic piece, "Is Google Making Us Stupid?"

Now the 51-year-old, Colorado-based writer has published a new book, The Shallows, which warns that the Internet is rewiring our brains and short-circuiting our ability to think. The Chronicle called Mr. Carr to get his opinion about what this means for teaching and research.

Q. The idea of neuroplasticity is central to your argument. What does this mean, and what does it have to do with how the Internet is changing our brains?

A. For a long time, even when I was going to school, we were taught that the structure of the human brain was basically fixed by the time we got to our early 20s. But it's become clear in the last few decades that, in fact, even the adult human brain is quite malleable. And our neural circuitry is always in the process of adapting to circumstances, to environment, and to the tools we use, particularly those for finding information and making sense of information. When you look across all of the evidence, there are very strong suggestions that the way we take in information online or through digital media impedes understanding, comprehension, and learning, mainly because all of those things combine to create a very distracted, very interrupted environment.

Q. You write that educators assumed multimedia would aid learning, but that has been "contradicted by research." Explain.

A. Whenever we have a new information technology, there tends to be a lot of enthusiasm throughout society, but also in the educational community. That was true with hypertext in the 80s and 90s, and I think it continues to be true with multimedia. But what the evidence suggests is that, unless it's very carefully planned with an eye to how the brain processes information, multimedia actually impedes learning rather than enhances it, simply because it divides our attention. Studies pretty clearly show that when our attention is divided, it becomes much more difficult to transfer information from our short-term memory, which is just the very temporary store, to our long-term memory, which is the seat of understanding.

Q. What studies?

A. There's a study called "The Laptop and the Lecture" that divided a class into two sets. One-half of the students could use their laptops in a classroom while listening to a lecture. They were free to surf the Web. And the other half had to keep their laptops closed. And then there was a test of comprehension. And the students who used their laptops scored significantly lower on the comprehension test for how well they could remember the content of the lecture. An interesting twist was that students who visited sites relevant to the content of the lecture actually did even worse on the test than students who browsed unrelated sites. It indicates that, even if you think that allowing students to look at other information relevant to what they're being taught might enhance their learning, it actually appears to have the opposite effect.

Q. Some professors are interested in integrating social technology—blogs, wikis, Twitter—into their teaching. Are you suggesting that is a misguided approach?

A. I'm suggesting that it would be wrong to assume that that path is always the best path. I'm certainly not suggesting that we take a Luddite view of technology and think it's all bad. But I do think that the assumption that the more media, the more messaging, the more social networking you can bring in will lead to better educational outcomes is not only dubious but in many cases is probably just wrong. It has to be a very balanced approach. Educators need to familiarize themselves with the research and see that in fact one of the most debilitating things you can do to students is distract them.

Q. Most people would praise the Internet's power to open access to information for researchers, but you suggest a downside.

A. I should start off by saying—I don't want to come off as a hypocrite—I use the Net for research all the time. It's an extremely valuable way to uncover information. I think that there are at least two possible downsides. We know that as human beings we love new information. That's pretty well proven by psychological research. If given the opportunity to find something new, we'll usually go in that direction, whether that information is trivial or important. And what the Net does is give us huge opportunities to get new information. We can see that in our habits when we're on Facebook or social networking or just on the Web in general. But that same instinct can bleed over even when we're doing formal academic research. Because there's so much information at our fingertips, we can get stuck just constantly uncovering new relevant information and never stopping and actually reading and thinking deeply about any one piece of information.

The other is the study by James Evans that was in Science magazine a couple of years ago. He looked at what happens when academic journals publish their archives online. The assumption is this will be a great boon to research because suddenly all these articles that used to be difficult to find, suddenly we can just search them. And what he discovered is, in fact, the effect was kind of the opposite of what we expected, in that actually the number of articles cited goes down when these journals go online. And also people tend to cite more-recent articles and not go back in time to older ones. His hypothesis there is that we become so dependent on search, and the results from searches are determined by popularity of one sort or another. And the risk of using search for online research is that everybody gets led in the same directions to a smaller number of citations which, as they become ever more popular, become the destination for more and more searches. And ... he suggested that simply the act of flipping through paper copies of journals actually may expose researchers to a wider array of evidence.

Q. If the Internet is making us so distracted, how did you manage to write a 224-page book and read all the dense academic studies that much of it is based on?

A. It was hard. The reason I started writing it was because I noticed in myself this increasing inability to pay attention to stuff, whether it was reading or anything else. When I started to write the book, I found it very difficult to sit and write for a couple of hours on end or to sit down with a dense academic paper. One thing that happened at that time is I moved from outside of Boston, a really highly connected place, to renting a house in the mountains of Colorado. And I didn't have any cellphone service. I had a very slow Internet connection. I dropped off of Facebook. I dropped out of Twitter. I basically stopped blogging for a while. And I fairly dramatically cut back on checking e-mail. After I got over the initial period of panic that I was missing out on information, my abilities to concentrate did seem to strengthen again. I felt in a weird way intellectually or mentally calmer. And I could sit down and write or read with a great deal of attentiveness for quite a long time.

Q. Did you ever wish you could stick a hyperlink in your book?

A. No. There are already enough footnotes. I'm not the only one now who's starting to question all these hyperlinks. Particularly if you're writing something long and thoughtful, do you want to keep encouraging readers to jump out and jump back in? Or is there something to be said for reducing the temptation to hop around and actually encouraging attentiveness?

Q. What do you make of Steven Pinker's critique of you? The Harvard cognitive psychologist is dismissive of one of your key points, that experience can change the brain. He says cognitive neuroscientists "roll their eyes at such talk."

A. I think he's overly dismissive. I say that with great respect. It's important to read his thoughts in the context of a broader battle going on in the world of cognitive neuroscience, between those who, like Pinker, are strongly behind evolutionary psychology, which basically says that our behavior is very much determined by our genetic heritage, versus those who believe that the brain is adaptable—and we're not locked into that kind of behavior, and in fact our brain changes as the environment changes. Pinker's background and Pinker's views are very much antithetical to the "highly adaptive argument." There's a whole lot of neuroscientists who are uncovering evidence that in fact our use of digital media and media multitasking is having a substantial effect on the way we think.

Q. Colleges refer to a screen-equipped space as a "smart classroom." What would you call it?

A. I would call it a classroom that in certain circumstances would be beneficial and in others would actually undermine the mission of the class itself. I would maybe call it a questionable classroom.

Comments

1. davidcomposer - July 06, 2010 at 10:56 am

Regarding "smart classrooms," I have found them to be next to useless in one area where I teach; music theory. Often the best way to illustrate the lecture, and respond to students' questions is to write an example on the chalk/white board, then play it on the piano. It is also necessary to give students an example to work on, then walk around and observe their work. This is very difficult when the classroom is set up like an auditorium, and there is no board to write on. In some subjects, interaction with the students is very important, and smart classrooms are not suitable for this.

I have had better results using media in teaching music history and humanities courses, but the time it takes to prepare the media presentation is considerable, especially when you're an adjunct with four other jobs.

Finally, one of the biggest problems with "smart classrooms" is that often the equipment and software just don't work. Administrators are quick to purchase equipment and software, but are slow to hire the support staff necessary to maintain and update the media (computers don't need health care benefits). I have often come to class with material that is not compatible with the outdated software in a given classroom (again, when you're an adjunct you don't get your own smart classroom). Another problem is that a lot of media equipment is designed for "entertainment" and not for classroom use; e.g., it's hard to navigate track lists, and when you bring up a film clip or a CD track, it starts playing before you've had a chance to introduce it.

2. emmadw - July 07, 2010 at 06:58 am

I'm not convinced by the research quoted to support the claim that multimedia doesn't help with learning.

I see "multimedia" as providing information in more than just text and images: so, for example, video footage of historical events, an animation of blood flow, etc.

To me, saying that students who use laptops in class don't retain as much information as those who don't isn't looking at multimedia; it's looking at attention (and possibly teaching styles). Someone who encourages students to use their laptops, e.g., to research two sides of a debate and present a case in class, is clearly giving a very different session from someone who just lectures for an hour.

Just as a matter of interest, did the laptop using students use them for note-taking, as well as surfing? Were the non-laptop students note-taking?

3. paievoli - July 07, 2010 at 07:09 am

This is exactly what education needs at this point in the global economic structure. We need to put the brakes on all this learning and focus on getting disconnected from the world's research. We need isolationism desperately. We need to focus on a very limited range of concepts from only a handful of people who were taught back in the 1970s. This is exactly what we need now in the history of humankind. To stop and focus on the past....
Great thinking.

4. dlafky - July 07, 2010 at 07:17 am

In defense of the dominance of recent citations it should be noted that peer reviewers seem to be heavily biased in that direction. This prompts academic authors to ruthlessly prune any cites >5 years old. It isn't that we don't read or appreciate earlier work. It's that our peers only care that we are on top of the latest research, particularly their own.

5. robabel - July 07, 2010 at 07:46 am

Books like these and the pundits on either side of the argument, well, it's sort of like having a long debate about using a hammer. One side says, "The hammer is just wonderful - it really helps what I'm trying to do." The other side says, "Hammers are bad because they can be used to knock in screws and screws shouldn't be used like that!"

In other words, I think we all know by now that technology/IT/ICT is just a tool in teaching and learning. You need the right tool to fit the job. Really, that is pretty much the whole deal. Can we stop debating it now?

The real challenge in front of us is understanding what models for teaching and learning fit the varying educational needs around the world - and then applying technology to help make those models more accessible and affordable. Let's see more books and debates on that.

Rob Abel
IMS Global Learning Consortium

6. rjensen65 - July 07, 2010 at 07:53 am

Brain Alterations Caused by the Worldwide Web --- http://www.trinity.edu/rjensen/000aaa/theworry.htm#BrainAlteration
The Dark Side of Education Technology --- http://www.trinity.edu/rjensen/000aaa/theworry.htm
Jensen Comment
I emphatically disagree that online research is depthless. When I almost lived in the Stanford University Library as a doctoral student before the days of the Internet, my research was far more "depthless" than it is now with a searchable World Wide Web of knowledge. Nicholas Carr is depthless and prone to making brainless generalizations about technology that he's not studied in depth.

He's not even studied the real dark side of education technology ---
http://www.trinity.edu/rjensen/000aaa/theworry.htm
Bob Jensen July 7

7. joekling - July 07, 2010 at 08:07 am

I teach an introductory graduate course in educational research online. The "it depends" response to the questions raised by Marc Parry is, of course, the only correct (give that student 1 pt!) answer. If the questions we, as instructors, ask of our students require sustained, possibly even profound, thought, then they will be forced (if only to get a good grade) to use the Internet as a wonderfully accessible library. Browsing the index of a journal, or flipping through the pages of the 1971 bound edition of RER, is, as those of us old enough to recall know, a valuable way to find interesting and relevant research and to contextualize our own questions. The electronic databases available through university libraries (EBSCOhost, ProQuest, etc.) provide a similar "luxury" electronically, if you teach students not to just bring up the article listed in Google Scholar (with 135 citations, dozens of related articles, and its own reference list) but to go to the journal and browse the issues. Many technical applications, such as blogging, can be re-visioned in f2f terms, and a decision about their use becomes fairly clear. Q. Is casual conversation (blogging) what we want in our classrooms? A. It depends. :-)

8. joekling - July 07, 2010 at 08:09 am

Note: the "joekling" comment was submitted by Prue Posner, adjunct instructor at Sage Graduate School of Education, Troy NY.

9. richardtaborgreene - July 07, 2010 at 08:53 am

The ENTIRE discussion is juvenile.

When any rather new thing is new, new, not old, new, that means NOT A LOT of practice, not a lot of comparison, not a lot of competitive performances on it, etc.---lack of practice, comparison, competition, emerging standards, emerging norms of excellence, makes........

ALL NEW THINGS sloppily, badly, and haltingly done. THAT is a trait of all things novel.

So, surprise surprise: interfaces on devices and webs that are four years old lead to new behaviors that are sloppy, bad, halting, and filled with other flaws from the viewpoint of the interfaces, thousands of years old, that they replace----wowie wowie wowie

What sort of moron would ever be surprised by this??????
Yes, you can pander pander---that is, write scare stories for the naive public to make yourself rich, famous, and a center of attention---we are, after all, small-minded monkeys trying to tell lies to everyone about how central and important we are in the universe.

Tripe tripe tripe---all these baby-ish scare stories about how a technology less than 6 years old does not measure up to a technology 100 years older if not centuries older---what sort of moron would ever be surprised by that????

GEEEEEEEEZZZZZ

10. cb_10 - July 07, 2010 at 09:40 am

It's going to be difficult to have an honest conversation about the value of technologically mediated learning if some of the participants can't distinguish between pedagogy and medium. Carr's chief complaints seem to center on certain specific uses of technology rather than the use of technology in general. There are a few flimsy straw men built into his argument as well.

For example, Carr's description of "the laptop and the lecture" study presents what to my mind is a fairly weak bit of analysis. Students were distracted by their laptops from learning what the professor at the front of the class was presenting? I am shocked, shocked I tell you! (You have to imagine me saying this in Claude Rains' voice from Casablanca.) Of course they were distracted, and of course too many competing flows of information diminish comprehension.

But Carr's answer to that painfully obvious (and frequently made) observation doesn't seem very original. Carefully planned learning with technology has been regularly advocated. There are some debates about what works best, but I don't know of anyone who is suggesting that we just throw things up willy-nilly.

The observations Evans makes about scientific articles may be very accurate, but Carr misunderstands (or misrepresents) the very problem he's presenting. The problem is the way searches are conducted, not the overreliance on them. Build a better search engine, rather than restrict the availability of the information.

That confusion of medium and practice seems to be Carr's big problem. He fairly offers that technology is important and that he himself uses it regularly, but then suggests that it's too distracting. Yet the methods of research he advocates have their own distractions. Flipping through journals offers the nice feel of the page under the finger, but one can easily be distracted by all those other articles. Carr doesn't explain what is different about this type of distraction; he just assumes it isn't the same.

The final quote in the article seems characteristic of Carr's confusion. "A questionable classroom?" That could be any classroom, with any instructor who hasn't given careful thought to the instruction and instructional activities. Carr may get attention raising questions about technology in the classroom, but his questions are muddled and have less to do with technology and the valuable resources being developed online than they do with learning in general.

Carr is focused on the wrong targets.

11. walkerst - July 07, 2010 at 10:08 am

Yes, of course there are good and poor uses of technology. And it isn't the technology that is to blame for poor student learning, when that occurs. But what people aren't saying here is that technology is clearly a very easy and tempting distraction. That's the problem with web access and laptops in classrooms - not that they are inherently bad or good, only that it becomes too easy to not listen to what is being taught, while fooling oneself that one is, in fact, listening.

12. chewy18 - July 07, 2010 at 10:20 am

Technology adds much to the learning experience if students are offered guidance as to what they are reading. Not all websites offer credible, i.e., valid information. I frequently assign extra-credit assignments requiring students to search the Supreme Court website. Other times they are required to cite their sources and check for validity. Then I do the same when the assignments are turned in. Budget cuts that force some faculty to use WebCT for exams are causing some problems as well. I have yet to give an exam where students have not asked for clarification on some question. Oh, but wait: those are not recognition/recall tests; they are analytical tests, which seem to be going the way of the Edsel in favor of one-line questions and one-word answers. Life is not a multiple-choice test. P.S., re: post #9, why would an educated person refer to others as morons?

13. sfbrower - July 07, 2010 at 11:57 am

I agree with everything Mr. Carr states, until the very last comment about "smart classrooms." That term refers to a set of tools, and, like all tools, they are only useful if the user knows how to apply them well. An instructor can use an image, video, or a weblink to illustrate a point in a wonderful way, or he/she may distract the students. As we all know, one must use the right tool for the right job!

14. snackmeister - July 07, 2010 at 01:28 pm

What about the reality of the differences among each individual's preferred and most effective mode of learning? There will always be people whose strength is in scanning and browsing as opposed to deep analysis. But does the ubiquity of these types of media and information sources encourage too many people--even those who are good at analysis--to replace comprehension and deep processing with sloppiness? I think it can and it does. I am grateful to the many educators (like those who weighed in on this comment thread) who deploy educational technology in ways that are thoughtfully designed to enhance learning.

15. mbsss - July 07, 2010 at 04:28 pm

My own observations of Internet use, including my own use, suggest that we may not be reading as carefully as we might have prior to the technology. Many of the posts in this thread are illustrative of my point. Several authors demonstrate their rather incomplete reading and/or comprehension of the article. Nicholas Carr, it seems to me, worked hard in this interview to present a relatively balanced and respectful view of technology--admitting his own use of all of the tools about which he shows concern. His provocation is worth consideration, especially by anyone born after 1980.

16. saraid - July 07, 2010 at 06:45 pm

Carr is the kind of person who is worth disagreeing with: his arguments are useful as devil's advocacy, not to force an either-or decision, but to uncover, recognize, and account for the huge pitfalls that come with anything that hasn't been done to death.

In this case, the point is that we're treating technology with a religious veneration. Because it's technology, because it's new, and because it's trendy, we're making a huge assumption that it can be used to do exactly what we need it to do. That's an untested assumption, with no nuance or experience behind it.

Personally, I want to go even further with technology. I think Carr is wrong, but I don't mind, because I realize that when designing "smart classrooms" and the like, I need to devote some effort to architecting things to manage attention better. When developing a lecture or deciding on a participation policy, I need to keep in mind how individual students think. I need to experiment and figure out what works and what doesn't.

17. tachuris - July 09, 2010 at 08:21 am

Don't throw the baby out with the bathwater in dismissing Carr. Access to so much information is fantastic, and yet it does create a huge distraction as well. Furthermore, information sticks better if you have to be more involved in the process of obtaining it. I took an intensive French class last fall, and our professor told us that we needed to hand write all our workbook exercises because we would remember the information better that way than by typing it on the computer. Similarly, if we need to go to the library and actually handle our references, it's a very different process from sitting at a keyboard flying around the libraries of the world. That's not to say we should miss out on what is so easily accessible, but I do think we need to address the problem. I did not grow up with computers, and while I can't imagine doing research without the Internet, my attention span seems to get shorter as the speed of Internet access increases.

Regarding distraction in the classroom - yes, for gosh sakes, do we need so much PowerPoint? If the setting is a lecture, but there are also colorful slides to look at, one's eyes are drawn inexorably from the lecturer to the slides, reducing the amount of attention paid to the spoken word. I think I'd rather wait while the words are written on the blackboard, giving me time to absorb the information. And I loved when the equipment all stopped working when I had to give a presentation in Spanish class - I felt much more engaged with my listeners when they had nothing to look at but me :)

So, yeah, we're all flawed and we're not using the technology correctly. Will we ever? And how will we learn to use it correctly? Internet patches on our arms to wean us off of it are beside the point - we will never go back to not having the internet nor would we want to. Maybe this is just part of our cultural evolution and we're at the early stages. Curious to see what it will look like in a few decades (by which time we'll have new addictions).

18. emmadw - July 09, 2010 at 08:29 am

I agree with most of what tachuris has to say ... but I'm less sure about the comment he made:
"I took an intensive French class last fall and our professor told us that we needed to hand write all our workbook exercises because we would remember the information better if we did that rather than typing it on the computer."
I'd feel it more important to get students themselves to identify what helps them remember better. For some, it could be handwriting; for some, typing; for some, it could be reading it aloud afterwards that's the really important bit.
Given how often we use keyboards today, I know I & many of my students find it physically more demanding to handwrite - so you start to concentrate on the aching hand, rather than the content.
I always worry about telling students a single way to study - I encourage them to experiment (so, for example, I always do sessions on mindmapping, though personally it does nothing for me).
Just the same as some would argue that you learn best if you take notes by hand, others would say that it's best to take no notes at all & just listen ... & make notes afterwards. For others, creating a mindmap as they go might be best. We're all different & what's right for me isn't always right for you.

19. plummerj - July 09, 2010 at 08:39 am

Couldn't agree more--64-year-old college instructor (29 years in hospitals + 11 in public schools). Young people have real difficulty thinking and expressing themselves today.

20. gent258 - July 09, 2010 at 10:10 am

I often teach in classrooms in which every student sits in front of a PC. On the positive side, I can have students compose in class and I can look over their shoulders and make suggestions; however, on the negative side, too many students waste class time by using Facebook or by chatting with their friends when they should be listening to class discussions and my explanations of the material. In short, the computers are more of a distraction, and I would prefer to teach in a classroom without them.

21. lsmaterna - July 09, 2010 at 10:28 am

I have to agree with Carr's observation about his own distractedness. I need to take time-outs from technology, particularly the constant pressures of email, in order to "calm" my mind and restore some of what a decade ago was an easy-to-achieve state of attentiveness and concentration.

I'll skip commenting on the quality of our research today, but I would like to weigh in on another topic: the use of computers for formal writing. I now write articles, academic papers, etc. almost exclusively on the computer. Typically I do several drafts, if not more. Some months ago I came across a paper I had written by hand in graduate school and then typed on a typewriter (OK, I'm old) to turn in. I didn't have the typed copy because the grad professor never returned it. What shocked me was that the handwritten copy, which had a few but not many crossed-out sections and strips of paper attached where sentences or phrases were changed and added, was a much better first version than a typical computer one I come up with today (which I also rewrite multiple times before it becomes a draft version). Then too, I find more typos and a tendency to repeat myself because I've rewritten so many times, including moving pieces of text around. If I don't print often and reread and inevitably edit (and then rewrite again!), I can end up with unfortunate redundancies and those terrible typos that even find their way into published copy.

So, has the computer really helped us write better? I have evidence to the contrary, and if one takes into account that I was then a budding scholar rather than a seasoned one, the equal or even inferior quality of my first computer drafts compared with my handwritten ones raises questions about the computer's superiority as a writing tool.
I'd love to read some comments regarding this observation.

22. 11147066 - July 09, 2010 at 10:42 am

This is a complex issue. In response to post 19, I am a foreign language teacher, and I would maintain that writing, along with speaking, listening, and reading, is an important means through which to memorize and retain foreign language structures and vocabulary. Not every aspect of learning is a question of personal choice or "learning styles." There is actually evidence, as well as empirical experience, to support the claim that certain methods help most students learn in particular disciplines. We have become so reliant on giving individual students freedom to customize learning that we may forget there is such a thing as professional expertise.

23. jack_cade - July 09, 2010 at 11:54 am

The essential complication in all of this is that we're talking about individuals who manage their discrete encounters with technology in radically unique ways (at least potentially). Thus, broad statements about the nature of "human" interactions with x technology, while occasionally being valid, are also always incorrect. (Therein lies the kernel of what is wrong with theories by folks like Pinker, too, but that is another show.)
So, perhaps we all should take a break from the global information system, but perhaps also we shouldn't. Perhaps it depends on the day.
There is no right answer to anything.
There is only the answer we choose to believe.
@22: To wit, I write better on a computer than I do when I do the handwritten thing. I know many people who would say the same, but I also know folks who would agree with you.
As Curly (played by Jack Palance) said in _City Slickers_: "you've got to choose for yourself."

24. lfblocker - July 09, 2010 at 12:54 pm

I think the sensationalist tone of the introduction to the article would lead anyone who hasn't read the book "The Shallows" to draw false conclusions about it. Mr. Carr never says the Internet short-circuits our ability to "think." His main points are: 1) The Internet, cell phones, email, etc., being constantly "on," distract you when you're trying to be contemplative. I think this is obvious to us, but we're surrounded by young people, and some educators, who think young people are "different"--they can multitask and not be distracted because they're young and grew up with the distractions. Those people are wrong. What Mr. Carr is saying here (and he cites many studies to back him up) is that the parts of your brain that "surf" information get better at it, and the parts that think deeply about information get worse at it.
2) The act of moving information from your working memory to your long-term memory can be disrupted by distractions. So, while you're attending a webinar and everyone's making comments in the chat and you think it's such a great, collaborative learning experience, you sit down later and find that you don't remember as much of what the presenter said as you would if you'd ignored the chat and asked your questions/had the chat conversation at the end of the presentation. This happened to me just this week.
3) Reading hyperlinked text serves as a distraction even when you don't click on the links, because you have to decide whether or not to click on them, and that takes up some working memory.
4) One important part of the book not mentioned in this article is 'outsourcing' memory and the discounting of memory skills in some modern educational settings. This is the part that scared me the most when I read the book.

To the long-hand v. typing scholar, I type very quickly so I find using the computer is best for me IF I use it the way you used paper--to jot down this and that, to strikethrough text instead of deleting it because I may end up using it again, etc. But I still like a little notebook to jot down ideas. I don't like taking notes on my iPhone, and I still use a paper calendar. I find the 'devices' too clunky for this type of activity.

25. amy_l - July 09, 2010 at 12:54 pm

lsmaterna: I always write my papers by hand and then type them into the computer. I can't explain it, but I think more clearly when I'm writing by hand. I also print out any articles I need to read closely, and notate them by hand. Words on a computer screen just don't seem as "real" to me -- I don't absorb them as well. Probably just a symptom of my advanced age and decrepitude.

26. crankycat - July 09, 2010 at 01:08 pm

"Smart classrooms"... right. My experience with that technology is that what we are offered is not flexible enough to be unobtrusive, and it can end up turning every class into a 'distance learning' experience, as the instructor has to spend more time interacting with the technology than with the students. Not why I went into education.

I do have students who use laptops in class - sometimes it adds to the discussion, if they can quickly reference an article to answer a technical question. Other times they are messing about in email or social networks and that usually shows up as distractedness and poor participation.

I can't say my colleagues are any less challenged by the distractions - I can't tell you how many I've seen look at their phones or email devices during seminars. Or how few people take handwritten notes during talks any more. Take the time to be focused - it makes a difference in just the simple enjoyment of a well-delivered talk, let alone in the intellectual benefit you can derive.

27. beedhamm - July 09, 2010 at 01:36 pm

I started to read this interview, but then I lost focus.

28. rexheer - July 09, 2010 at 05:03 pm

Had I been sitting in a classroom with, for example, 100 other students--all with laptops open while listening (live) to Marc Parry's interview of Nicholas Carr--and I heard Carr say "there's a study called 'The Laptop and the Lecture,'" I might have Googled that source to discover that: a) the article was published in 2003; b) it contained an obvious error (see Fig. 2, p. 56); and c) it wouldn't seem to be generalizable to classrooms, students, and technologies in 2010. Thus, while I was "on-task" (much like the lowest-performing students in the 2003 study), I would likely have begun to question Carr's credibility. And that, perhaps, might have led me to score lower on tests of recall and recognition of whatever Carr thought was appropriate and easy to assess.

On the other hand, had I been able to bring that (my thoughts about the 2003 study) up for discussion, the whole class (not just the students) might have been able to have a more substantive, dynamic, and engaging learning experience. (For example, Parry could have used clickers to gauge students' thoughts regarding the distracting effects of technology on learning pre- and post-discussion of the 2003 study.) ;-)

I wonder if there have been any recent studies to see if students, who were actively participating in an engaging, interactive, collaborative online learning activity might be distracted by someone standing nearby trying to lecture [to] them.

29. profweigand - July 09, 2010 at 06:35 pm

The use of technology in the learning environment is like any other tool. Faculty who make a sustained commitment to blending technology with their teaching styles will probably have more satisfied students, while faculty who fail to put in the time and try to use technology merely as showy bells and whistles will be less successful. In my experience, the hybrid or web-supported instructional model can increase student engagement and achievement, but faculty must commit to mastering the technology, which often requires a considerable investment of time and energy. Bottom line is, if it's not fun and engaging for the faculty member, they are probably better off sticking to whatever feels most natural to them. Everyone should find their comfort zone with technology and work within their strengths and personal styles. More (or less) technology is never automatically better (or worse).

30. linpol811 - July 10, 2010 at 01:18 am

We should never dismiss technology. I have used it often in research. I find that there is so much accessible information that I have to take additional time to ensure that it is credible before using it. And having access to university libraries is priceless. When it is time to begin writing, I have a habit of using my laptop. It is very difficult for me to actually write with pen and paper. But the old ways should not be totally dismissed either. They have value.

31. amnirov - July 10, 2010 at 05:26 am

That article was so dumb it just lowered the ambient room temperature by at least 5 degrees.

32. raymond_j_ritchie - July 10, 2010 at 05:38 am

An effect of online technology that disturbs me is that I find first-year biology students show no interest in looking down a microscope or looking at actual specimens of larger plants and animals. When I was a student in the 1970s, it was hard to keep most of us away from a microscope. Now students bother to look at a slide for only a few seconds, and then only because you are watching them, and they refuse to learn how to use the microscope properly. Some will articulately tell you labs are a complete waste of time and we should give them all the visual material online and scrap lab classes.
Attitudes to experimental biochemistry, physiology, marine biology, plant physiology, and even ecology (I am not joking!) in later years are no better. Actually doing an experiment seems to be meaningless to them, and they will tell you it is a waste of time. After all, no experiments they did in high school ever worked.
I think it is detachment from physical reality, rather similar to how bemused country people used to be by urban children who thought milk and eggs came in cartons. Science has to be actually done.

33. kedves - July 10, 2010 at 12:28 pm

I agree that we are all challenged to focus and to manage the quantity of information in ways we never had to deal with before. But with regard specifically to the use of technology in teaching, I would like to know how much Carr understands the experience. He has an MA in language and literature; has he ever taught?

Obviously, students who surf the web, or text (or do a puzzle, or fall asleep), are going to miss a lot of lecture and discussion content. But that's a learning issue. What about the teaching issues? Other people here must remember what it used to be like; does he? Was the method I used in the old days to assemble news articles about course topics better--physically cutting them from a paper newspaper, gluing them on a page, assembling them in a bulkpack, writing down every article title/author/date/paper to get copyright permission, photocopying all that paper, and students' paying for it? Now I can put the link--from any source--to the article on Blackboard, tell students to read it and take notes, and display it on the screen in the classroom for reference if we argue about its details. It's a better use of my time, students' money, and natural resources (trees). Is there any loss of reading comprehension? I haven't noticed any.

People used media technology in classrooms before "smart" classrooms. Was it better to have to call the AV people to haul the equipment to your room and FF to the right spot on a videotape, or have a VCR ready to go at all times when the ad you wanted to tape came on, than to get the clip off YouTube or similar, turn on the equipment already in the room, and bring up your saved file? Does he not realize the huge time sink that went into the old ways? Or should we not be showing media clips to illustrate a point at all?

New technologies improve my ability to do what I want in the course and in the classroom. But whether that's important or not, the way we get information has changed for all of us, including our students. Isn't it our responsibility to show them how to focus, how to use time wisely, how to use good search terms, how to distinguish credible from questionable online sources (and scholarship from journalism), how much is enough? I could go back to the strictly book/lecture way, but should I?

In one area, new communication is unquestionably better for my teaching: email. Most of my email conversations are with students who would never come to office hours. Email is less intimidating for them, and when we do talk after class or in office hours, they feel comfortable because they know I'm going to help, or at least listen. That's part of teaching, too.

34. agpbloom - July 10, 2010 at 02:06 pm

robabel writes:

"Books like these and the pundits on either side of the argument, well, it's sort of like having a long debate about using a hammer. One side says, "The hammer is just wonderful - it really helps what I'm trying to do." The other side says, "Hammers are bad because they can be used to knock in screws and screws shouldn't be used like that!"

Well stated! Your instrumentalist analogy is effective because it exposes the way in which a prior commitment to technology, or opposition to it, becomes divorced from the actual uses of it. Often we argue over "schools" of thought like IT or "traditional" education, developing apologetics as if the debates were religious.

What really happens in classrooms and other learning environments? This is what is at issue, not who is "committed" to A particular FORM of education.

Taking robabel's analogy another step...Imagine how absurd it would be to ask, "Who's FOR hammers and who is AGAINST hammers? Hurry up and weigh in on the issue. It is time to be heard, time to take a stand!"

What an absurdity, yet folks often do this when technological innovations are examined in policy forums.

35. eelalien - July 11, 2010 at 06:34 pm

Regarding the "laptop and the Lecture" study cited to support Nick Carr's premise: Lecturing is simply NOT the most effective method of conveying ideas, information, etc. So, once again, I am reminded that most of the students' ability to absorb/internalize/digest/recall information and concepts is highly dependent on the quality of the instruction - regardless of technologies used, or not used.

36. jgrantprcc - July 12, 2010 at 10:03 am

The comment about distracting information is well taken. Years ago as an undergraduate I found that I just could not study well in the university library. There was too much information available that was way more interesting than what I needed to study at the time. I now have the same problem with the Internet!

37. lee77 - July 12, 2010 at 11:08 am

Thank you lfblocker (#25) - it didn't appear that too many of the posters had actually read The Shallows, basing their comments on the relatively shallow interview, thus validating observations that Carr made in his book. Which is the point - Carr isn't making an argument for or against anything - he shared research and observations which the reader can ponder. Some readers may choose to do some things somewhat differently based on what Carr points out (as did this reader).

38. cucmike - July 12, 2010 at 12:18 pm

Well-designed courses, where the SME (subject matter expert) and ID (instructional designer) take time to produce the best possible course for delivery, whether f2f, online, or hybrid, will always be better than the sage on the stage, and students learn!

39. ohreally - July 12, 2010 at 01:01 pm

Someone dares to challenge the juggernaut of technology, and the technophiles invoke ad hominem attacks. There are many studies of various kinds showing that introducing information technologies has counterproductive effects.

What I am frustrated by is that technology advocates want technology to be judged on what it *could* do, not what it actually does. The fact is the internet is a giant entertainment device; its primary use for most users is to buy and sell stuff; and, it is not a democratizing or leveling agent, notwithstanding possibilities. Could it be otherwise? Sure. But it's not. Could students learn "more" with information technology? Sure. Could they learn more in traditional courses? Sure. So, why aren't we advocating for more traditional types of instruction?

Technology is not the answer, not because it can't work, or that it "re-wires" brains. It's not the answer because, in practice, it has NOT shown itself to be the answer.

40. ellenhunt - July 12, 2010 at 04:50 pm

Great article. It totally parallels my own experience. Of course we humans adapt to our environment, and when the environment is easy-to-find "stuff" it teaches us to "find stuff". We are hunter-gatherers in the depths of our evolved psyches.

I was quite amused, following the Jensen links, to find fractured hypertext thoughts there rather than an argument in prose. He makes the point of the article by his very writing style, a style that few would want to read.

41. janyregina - July 12, 2010 at 05:48 pm

As a cognitive psychologist, I find it amusing that some don't realize that the brain is constantly making new neuronal pathways as it interacts with, and is affected by, the environment. Basic Neuropsychobiology 111 & Automaticity.

As an aside, I don't think it is in my students' best interests to highlight in a text as I lecture. I want them to write it down in their own words, and then maybe they will recognize the material no matter how it is presented. Old-fashioned of me?

42. tekton - July 14, 2010 at 03:19 pm

Ahh, but the new highlighters (#41) feature "Knowledge Nanoencapsulation Technology" (KNT), by which the highlighter incorporates words on a page into nanoparticles, which are then transmitted by osmosis through the highlighter, into the hand of the student, and then directly to the brain, where they are automatically assimilated as in-depth knowledge. At least this, I think, is what the furiously highlighting student hopes.

Highlighting is one of those study aids that is reminiscent of the Peanuts cartoon in which Charlie Brown hasn't studied much for a test the next day, and he explains as he goes to bed that he will put his textbook under his pillow and that the knowledge in it will seep through his pillow and into his brain as he sleeps. The last panel shows him laying his head on his pillow, atop the book, and saying, "I hope."

Technology creates many hopes, some false and some realizable. At the center of the teaching enterprise, though, is a teacher who through a variety of means communicates not only information but how to think about the material. A problem with technology in teaching is that the medium can become the message, and the teacher's role in properly communicating a larger way of thinking about the material can be diminished if not lost entirely.

43. albcamp - July 16, 2010 at 01:08 am

Having re-read the interview after reading all the responses, I have a suggestion for those who posted in the negative. Print a hard copy of the interview, find a place away from distractions such as computers and mobile phones, read the interview, think about it, and establish Carr's basic premise. Having done that, feel free to disagree but please, don't build arguments against a misinterpreted premise as it's a distracting waste of time.

44. raghuvansh1 - July 16, 2010 at 03:03 am

My experience is quite different. The brain is not changed by circumstance; rather, it develops and enlarges with new experiences. For references, the Internet is useful, but searching is troublesome: you have to delete many junk references. The big question is, how can you prevent the progress of the Internet? The only solution is to be aware of how to use the Internet wisely. I am living in India, where the condition of libraries is very poor; the Internet is a boon to me. In the last ten years I have learned more from the Internet than in my previous 40 years in libraries.

45. shalomfreedman - July 16, 2010 at 09:02 am

It is common sense, and we can all confirm it if we think about it a bit: when we truly concentrate on one thing or activity, we can do it in a far better and deeper way than when we try to do it while doing other things.

46. runbei - July 17, 2010 at 10:54 am

To understand the effect of divided attention on comprehension, close your eyes during a PowerPoint presentation, opening them only to briefly capture essential information from the screen. You'll retain more, and if the presenter tells a dynamic story, you'll be much more entertained.

47. technocrat - July 17, 2010 at 05:51 pm

Channel surfing was once only a mildly pernicious form of meta-boredom, when there were only a relative handful of TV channels, radio stations, daily newspapers, and a magazine or two from which to choose. You could blow off 20-30 minutes looking for something interesting and then call your buddy to see if he wanted to go hang out.

Today, it's possible to flip through the choices without ever coming to an end, always the possibility of something to staunch l'ennui just a click or two away. Telling yourself "I'll do it just as soon as I finish looking this up on the Internet" has become "never."

48. bast2894 - July 17, 2010 at 08:49 pm

The real danger, or one of them, is that the "bells and whistles" are often mistaken for instruction and even competence, and are increasingly seen as essential. For every argument about the multi-tasking and -modal mentality and expectations of the neo-millennial generation, there is a horror story like the one about the space shuttle Columbia (which was lost in part because essential safety information was tucked away in a corner of some PowerPoint slide, so the powers-that-be simply overlooked it). So what to do?
As has been suggested already, and not only here, use technology sensibly and constructively: we can't do away with it, pretend it doesn't exist or matter, or anything of the sort. Doing so would even be a disservice to our students, whom we need (as has also been pointed out) to prepare for the "real world," which will also require them to use technology competently and independently. I'd like to see some "happy medium" where educators use technology in a way appropriate to their subject matter and conducive to learning, their own and their students' -- but without, for example, feeling compelled by the administration or, heaven forbid, student boredom to resort to the aforementioned "bells and whistles." Part of such technological integration should also be teaching students to use these resources in academically/professionally appropriate and responsible ways -- for example, looking up relevant information during a lecture, but not personal web browsing.
To dream the impossible dream ...

49. abdufaith - July 19, 2010 at 08:37 am

I think there is something missing in this issue. Multimedia and social learning are not intended solely to enhance attention, increase knowledge, or help retain information longer. It could be argued that these technologies are not as beneficial as we thought and that they lead to distraction. I may agree.

However, there is another side that doesn't receive enough attention. Using multimedia and social tools in education reshapes the way students view and interact with the world. Multimedia and social tools are used extensively in many parts of our lives by billions of people and thousands of companies, institutions, government agencies, etc., and one job of any educational institution is to prepare students for the world. Another thing: these tools should be used so that students don't feel their institution or system is isolated from the technologies used by everyone.
Using any innovative idea in education may have rewarding impacts, but not necessarily by improving the passing of knowledge from lecturers to students or vice versa. Educational institutions are not only for teaching anymore.

50. optimysticynic - July 19, 2010 at 09:35 am

No one has addressed another issue with the push for using technology: the time it takes to come up to speed. Instead of reworking lectures and updating material, we are encouraged to, and rewarded for, simply transforming our old lectures into new, more entertaining formats. Contact with students via technology is more highly valued than f2f (office hours? forget them and offer podcasts). This is in part a function of being able to add something new-looking to your annual report, which enables the dean or other bean counter to add something new to his/her report, and so on. Then the entire university can publish new stats in its recruiting lit and--presto chango--educational value has been added (we call that marketing).

51. billguinee - July 19, 2010 at 07:36 pm

I am pretty sure that any kind of significant new technology is going to change our brains and our thinking. For example, the printing press and widespread literacy seem to have virtually destroyed much of our capacity for memory, such as the ability to memorize epic poems. From the point of view of a pre-literacy person, this could have seemed a disastrous consequence of the new technology. Right now, I feel, we are still largely in the pre-digital age and have no way to really evaluate what the digital brain will be like.
