November 22, 2014

On Stupidity, Part 2

Last month I reviewed a collection of recent books arguing that Americans, particularly those now entering college, have been rendered "stupid" by a convergence of factors including traditional anti-intellectualism, consumer culture, the entertainment industry, political correctness, religious fundamentalism, and postmodern relativism, just to name some of the usual suspects.

Of course the anticipated consequences of the "stupidity crisis" seem dire enough — the end of democracy, the economic decline of the United States, the extinction of humanity as we know it — that one feels compelled to register opposition to the "Age of Unreason" by buying a few books.

I bought seven of them. And I am convinced — as if I ever doubted it — that, over the past several decades, we have become less knowledgeable, more apathetic, more reliant on others to think for us, more susceptible to simple answers, and more easily exploited.

Nevertheless, I am still suspicious of studies that proclaim the inferiority of the rising generation. We've all been the young whippersnappers at some point, frightening our elders, and many of us are, no doubt, destined to become grumpy old nostalgics in turn. As a teacher, I would prefer to think my students are the ones with the most promise; they are attuned to what is happening in the culture, even if they still have much to learn.

I noted in last month's column that several of the books on stupidity blame the rise of digital technologies — video games, iPods, the Internet — for the intellectual degradation of the young. The culture of multitasking and Internet surfing has apparently damaged their ability to concentrate; nurtured superficiality, self-absorption, and social isolation; and created a generation of young people who are always plugged in, constantly busy, yet seem remarkably uninformed and unproductive.

I went on to affirm some of those claims with my own observations. My interests, as an English professor who grades at least 1,000 essays every year, tend to focus on the skills involved in that kind of work (I know relatively little about what's happening in math and science).

Essentially I see students having difficulty following or making extended analytical arguments. In particular, they tend to use easily obtained, superficial, and unreliable online sources as a way of satisfying minimal requirements for citations rather than seeking more authoritative sources in the library and online. Without much evidence at their disposal, they tend to fall back on their feelings, which are personal and, they think, beyond questioning.

In that context, professors are seen as peevish bureaucrats from whom students need to extract high grades on the road to a career in which problems with writing and critical analysis will somehow not matter.

Some observers, such as Marc Prensky, who wrote Don't Bother Me Mom, I'm Learning!, argue that students' brains have been physically "rewired" by digital technologies, and that our task is to teach students to work with that wiring rather than to continue traditional teaching methods that are no longer relevant (http://www.marcprensky.com/writing).

In some respects, I agree with Prensky: Teachers do need to be mindful of generational changes. But today's students must also learn — just as we all did — how to adapt to generations that came before them, since, except in school, there are usually more people outside of one's generation than in it. Age differences may be the most underrated form of diversity in education.

One of the consequences of K-12 schooling (and of college, to a lesser extent) is the creation of a narrow peer group. That segregation by age impairs the ability of young people to relate to anyone outside their cohort, as anyone with teenage children or first-year college students knows all too well.

One of the purposes of teaching, as I see it, is to negotiate the differences, real and imagined, between generations. At the moment, that means meeting the "digital natives" where they are, but it also means expecting them to meet the "digital immigrants" — the people who were not raised in front of personal computers — where we are.

As teachers, we need to build upon students' strengths, but we should also train them against the grain of their experiences.

Moreover, since the brains of our students are hardly identical (the notion of a unified generational culture is always oversimplified), it seems more effective to use a variety of teaching methods all at once — the same way it is better to eat a balanced diet than to subsist entirely on Grape-Nuts and bananas.

So what does that mean in practical terms? What does one do in the classroom?

For me, it still means embracing the traditional essay:

  • Expecting evidence and examples with correct citations.

  • Teaching academic honesty and enforcing the rules fairly and rigorously.

  • Getting students into the library and getting real books into their hands.

  • Teaching them how to evaluate the credibility of sources: why Wikipedia, though useful, is less reliable than, say, the Dictionary of American Biography.

  • Grading essays carefully, giving attention to the details of grammar and punctuation, as well as showing students when some rules can be artfully broken.

  • Emphasizing writing as a painstaking process that involves revision and re-evaluation in conversation with other people.

  • Making a case for why reading, writing, and the liberal arts are vital to success in every field.

Beyond writing exercises, effective teaching requires embodying the joy of learning — particularly through lectures and spirited discussions — that drew us to become professors in the first place. It's extremely hard, but teachers have been doing it for generations.

For example, I have continued to lecture in many of my courses, but I have gradually learned to make lectures more stimulating and interactive by weaving together multiple threads of analysis using images, video, audio, artifacts, and readings — and asking the students to perform those readings. The lectures are designed to make a sustained argument, but they also have multiple points of entry, so that students are not lost after a momentary lapse of attention. Added to that are intervals of rest — in which concentration can slacken for a few minutes, as concepts are considered and discussed — before the harder analysis is resumed.

Such lectures have to be carefully prepared, but they are also spontaneous, and always open to interaction, because that's what enables students to make connections on their own.

Sometimes a student will rush up to the computer terminal and make those connections using a video clip from YouTube or by instantly locating a quote from a relevant text. That's the kind of engagement and excitement that leads to good essays, in which they make their own discoveries rather than simply repeat what they think I want to hear.

It's a kind of magic act that works best when a student pulls the rabbit out of the hat. I know there are educators who strongly oppose the use of the lecture. "Chalk and talk," they call it, and they insist upon group work. And I respect that view, particularly since it has nearly banished the dry, droning professor reading from yellowed notes. But the taboo against lecturing sometimes impairs the freedom of teachers to experiment with a traditional method in a way that can both respond to the skills of the "digital natives" — such as interconnectivity and intuition — and train them in the use of evidence and rational argumentation.

One of the most effective things I've done is use course-management software, such as Moodle, to create courses that are no longer confined to four hours a week in a classroom. We use class time for the most intense, "live" interactions, but conversations and new materials appear continually between classes, keeping all of us engaged as much as possible for the duration of the semester.

Such hybridized courses, live and online, create the habit of thinking and making connections all the time rather than simply showing up and fulfilling requirements.

If digital technologies are a contributing factor to "stupidity," then they are also part of the solution. As Al Gore observes in his book The Assault on Reason, the future of communication and an informed citizenry will depend increasingly on the Internet rather than on television or the print media.

That doesn't mean we should stop teaching the traditional essay and research paper, but it does mean we need to teach students to work in other genres, such as writing for blogs and wikis, creating podcasts and PowerPoint presentations, and participating in social-networking sites. They need to be comfortable in a variety of online environments, understand Web etiquette, know how to protect their privacy and respect the privacy of others, and learn how to evaluate various sources of information.

But such teaching requires a lot of time. It means being constantly available, developing intricate presentations, coming to class early to set things up, and staying afterward for conversations. It requires giving students careful feedback on writing assignments and rarely using multiple-choice and short-answer exams. And it requires looking for new ways to enhance learning rather than relying exclusively on what we already know.

As a tenured professor — though a somewhat harried one, at times — I have the institutional support to respond to the changing needs of our students. I have the freedom to learn from my mistakes and adapt to the changing qualities of our students. I also have colleagues with whom I can discuss teaching strategies and the place of my courses in the overall curriculum. I can challenge students and enforce academic honesty because I will not be fired (or "not renewed") for displeasing a "customer" and using administrative time to enforce standards.

How much harder is it for the legions of dedicated but disposable part-timers with large classes on multiple campuses — the majority of college instructors today — who are charged with teaching the foundational skills that the "digital natives" are thought to be most conspicuously lacking?

If digital technologies are a cause of "stupidity," it is because we have spent freely on computers — among other things — without also giving comparable support to college teachers. The students have been left to negotiate a cultural paradigm shift, comparable to the print and industrial revolutions, with inadequate support from the institutions created to help them.

And that strikes me as unambiguously stupid.

 

Thomas H. Benton is the pen name of William Pannapacker, an associate professor of English at Hope College. He writes about academic culture and welcomes reader mail directed to his attention at careers@chronicle.com.
