At almost every conference or workshop I attend on teaching, and even in hallway conversations with my colleagues, the discussion veers toward the use of technology in the classroom -- how to do it, why to do it, who does it well, and who does not.
My thoughts have turned in that direction as well as I prepare to return to the classroom after a year on sabbatical. I plan to experiment with some new technologies for the first time, so I've been researching them and thinking about the debate over how thoroughly they should penetrate our classrooms.
One of the most provocative pieces I have encountered on the subject comes from Marc Prensky, whose essay "Digital Natives, Digital Immigrants" has generated much discussion. Prensky styles himself, on his Web page, as a "visionary," "inventor," "game designer," and "futurist," among other titles, and the argument of his widely read essay certainly falls in line with those grand appellations.
The title pretty much says it all: Our students are digital natives who have grown up in the land of technology and know no other way of operating in the world. Those of us who are a generation or two ahead of them are digital immigrants, who grew up in a different kind of world and now have to bumble our way around with our guidebooks. However comfortable we may eventually become with technology, we will remain immigrants, never as connected to the land as the natives.
That seems sensible, but Prensky draws some radical conclusions from it. "Digital Immigrant teachers," he explains, "assume that learners are the same as they have always been, and that the same methods that worked for the teachers when they were students will work for their students now. But that assumption is no longer valid" (Prensky's emphasis).
That means, he argues, we have to jettison our old assumptions about everything in education, including the very basic content and skills we teach. Prensky distinguishes between "legacy content," from the predigital era, and "future content," from the digital one.
"'Legacy' content," he writes, "includes reading, writing, arithmetic, logical thinking, understanding the writings and ideas of the past, etc. -- all of our 'traditional' curriculum. It is of course still important, but it is from a different era. Some of it (such as logical thinking) will continue to be important, but some (perhaps like Euclidean geometry) will become less so, as did Latin and Greek."
By contrast, according to Prensky, "'future content' is to a large extent, not surprisingly, digital and technological. But while it includes software, hardware, robotics, nanotechnology, genomics, etc. it also includes the ethics, politics, sociology, languages and other things that go with them" (Prensky's emphasis).
The precise meaning of his italicized phrases is not crystal clear to me, but his larger point about content shifting in higher education rings out pretty loudly. And, of course, if the content changes radically, so too, for Prensky, will the methods. Not surprisingly, he favors the use of computer games and simulations as the proper pedagogical method for digital natives.
To all of which, I confess, I can only say: Huh.
Not a sneering, dismissive huh, or an incredulous huh, or an "is-he-nuts?" huh, but the sort of huh that follows when I realize I need to think about something a little further before I decide.
A few decades ago, Walter J. Ong argued, in Orality and Literacy, that writing was a technology that changed thought; the shift from oral to literate cultures wrought a deep and permanent change in the way we reasoned as a species. Prensky seems to be making a similar argument about the role of digital technologies today; they have already changed the thinking of our students, it seems, so now we have to refine our teaching techniques in order to accommodate that transformation.
On the one hand, I think we would be foolish to dismiss Prensky's argument out of hand, because it seems impossible that the avalanche of new technologies in our lives and the lives of our students won't change just about everything we do. And I am certainly open to the possibility that we should do more to work with the tools and thinking strategies that our students are accustomed to.
On the other hand, I am very skeptical that the value of "legacy content" like logical thinking, careful reading, and clear writing will diminish in our democratic society any time soon. Maybe someday but not, I'm guessing, in my lifetime or the lifetime of my students. I'm not even convinced that technology can offer better ways to teach some of those fundamental skills to our students.
Certainly technology has improved our ability to teach many subjects -- students studying anatomy can now work on virtual human bodies instead of dead cats, and that seems to me like an improvement. And the ability of computer programs to simulate and model chemical processes or economic theories certainly surpasses what instructors used to be able to do with chalk, blackboards, and overhead projectors.
But can a computer program teach careful reading skills more effectively than a great teacher working with books, pencils, and a blackboard? Maybe a properly designed program could do it more effectively for some students, but probably not for all of them.
I find myself more in line with the thoughts of George Justice, assistant dean of the graduate school at the University of Missouri, who said to me recently that he still believes in the value of sitting around a table with a dozen students, all holding books, and talking about the meaning of the words on the page.
Justice's comment helped me see more clearly a problem with Prensky's argument. He assumes a uniformity among digital natives and their preferred learning methods that seems questionable to me. It reminds me of the false assumptions about uniformity in learning that faculty members have been making for a long time: I liked learning from lectures, and, therefore, all of my students must like learning from lectures.
Anyone who takes the briefest look at theories of how human beings learn will discover that our students learn in all kinds of ways, and that we should never assume that any one method is universal or even a majority preference. Likewise, to blanket our current students with the label "digital natives" doesn't take into account the wide range of experiences and preferences they have with technology.
Can a wealthy student from an elite prep school, outfitted with every available technology, really be considered a fellow digital native with a rural student whose exposure to technology might be limited to an hour or two a week on a library computer? What about a student living in low-income housing? Or an immigrant recently arrived from a rural part of India or Africa?
All of which is to say, as I begin to refine my "Huh," that the right attitude here might be: Live and let live.
Let's welcome the pedagogical innovations of Prensky and his collaborators, but let's give equal respect to George Justice and his class of students holding books and pens. Our students can learn equally well from both kinds of classrooms, and which one is used should depend upon the subject, the teacher, and the students.
And part of live and let live means that no teacher should feel compelled -- by colleagues, administrators, or current theories -- to digitize his or her classroom. If you want to let new technologies inform and transform your classrooms, by all means do so. But if you believe your classroom is better off without those trappings, then keep them out.
I would even argue that, in a digital age, the value of sitting with books might become more important than the value of learning experiences designed to mimic the technological habits of our students.
I am a not-very-hopeful environmentalist, and I believe one of the main reasons we find ourselves in our current dire predicament is that the digital generations are not slowing down enough to look at the world around them, or to think about how the choices they make affect the physical world in which we live.
Walk to the most beautiful place on your campus -- ringed in flowers, shaded by tall trees, adjacent to a pond or fountain -- and sit down and watch the students walk by. The number of students who aren't on their cellphones or plugged into iPods dwindles more precipitously each year. That seems to me like an unfortunate transformation. The student who understands that she walks through beauty seems less likely to deface that beauty -- dropping a soda can along the way -- than one who has all of her attention on a cellphone conversation.
That may seem to take my argument far afield from teaching if you believe that our responsibility lies in training students narrowly in the skills and content of our disciplines. I don't believe that. We teach the whole person, and we are all responsible for teaching students to be good citizens of our democracy and good inhabitants of the planet.
So let's make use of the technologies that seem appropriate and effective, but let's not neglect to remind students that, for their own good and that of the planet, sometimes they need to find a pocket of nature or an unplugged classroom somewhere, and sit there with nothing but a brain and a book.
Later this summer I will offer my second annual reading list of books and other resources on teaching in higher education. If you have a suggestion -- whether it's something you have written on the topic or read, something new or a classic -- send me an e-mail message, and I will consider it for the column.