
Divided Attention

In an age of classroom multitasking, scholars probe the nature of learning and memory

[Illustration: "Scholars Turn Their Attention to Attention," by Alex Williamson for The Chronicle Review]

Imagine that, driving across town, you've fallen into a reverie, meditating on lost loves or calculating your next tax payments. You're so distracted that you rear-end the car in front of you at 10 miles an hour. You probably think: Damn. My fault. My mind just wasn't there.

By contrast, imagine that you drive across town in a state of mild exhilaration, multitasking on your way to a sales meeting. You're drinking coffee and talking to your boss on a cellphone, practicing your pitch. You cause an identical accident. You've heard all the warnings about cellphones and driving—but on a gut level, this wreck might bewilder you in a way that the first scenario didn't. Wasn't I operating at peak alertness just then? Your brain had been aroused to perform several tasks, and you had an illusory sense that you must be performing them well.

That illusion of competence is one of the things that worry scholars who study attention, cognition, and the classroom. Students' minds have been wandering since the dawn of education. But until recently—so the worry goes—students at least knew when they had checked out. A student today who moves his attention rapid-fire from text-messaging to the lecture to Facebook to note-taking and back again may walk away from the class feeling buzzed and alert, with a sense that he has absorbed much more of the lesson than he actually has.

"Heavy multitaskers are often extremely confident in their abilities," says Clifford I. Nass, a professor of psychology at Stanford University. "But there's evidence that those people are actually worse at multitasking than most people."

Indeed, last summer Nass and two colleagues published a study that found that self-described multitaskers performed much worse on cognitive and memory tasks that involved distraction than did people who said they preferred to focus on single tasks. Nass says he was surprised at the result: He had expected the multitaskers to perform better on at least some elements of the test. But no. The study was yet another piece of evidence for the unwisdom of multitasking.

Experiments like that one have added fuel to the perpetual debate about whether laptops should be allowed in classrooms. But that is just one small, prosaic part of this terrain. Nass and other scholars of attention and alertness say their work has the potential to illuminate unsettled questions about the nature of learning, memory, and intelligence.

As far back as the 1890s, experimental psychologists were testing people's ability to direct their attention to multiple tasks. One early researcher asked her subjects to read aloud from a novel while simultaneously writing the letter A as many times as possible. Another had people sort cards of various shapes while counting aloud by threes.

Those early scholars were largely interested in whether attention is generated by conscious effort or is an unwilled effect of outside forces. The consensus today is that there are overlapping but neurologically distinct systems: one of controlled attention, which you use to push yourself to read another page of Faulkner, and one of stimulus-driven attention, which kicks in when someone shatters a glass behind you.

But those scholars also became intrigued by the range of individual variation they found. Some people seemed to be consistently better than others at concentrating amid distraction. At the same time, there were no superstars: Beyond a fairly low level of multitasking, everyone's performance breaks down. People can walk and chew gum at the same time, but not walk, chew gum, play Frisbee, and solve calculus problems.

In a famous paper in 1956, George A. Miller (then at Harvard, now at Princeton) suggested that humans' working-memory capacity—that is, their ability to juggle facts and perform mental operations—is limited to roughly seven units. When people are shown an image of circles for a quarter of a second and then asked to say how many circles they saw, they do fine if there were seven or fewer. (Sometimes people do well with as many as nine.) Beyond that point, they estimate. Likewise, when people are asked to repeat an unfamiliar sequence of numbers or musical tones, their limit on a first try is roughly seven.

And that is under optimal conditions. If a person is anxious or fatigued or in the presence of an attractive stranger, his working-memory capacity will probably degrade.
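To make that limit concrete, here is a toy Python sketch of the circle-counting task described above. The capacity of seven and the noisy estimates beyond it are illustrative assumptions for this sketch, not parameters from Miller's paper:

    import random

    CAPACITY = 7  # Miller's rough limit on exact enumeration

    def report_count(n_circles):
        """Count exactly up to capacity; beyond it, produce a noisy estimate."""
        if n_circles <= CAPACITY:
            return n_circles
        # Past the limit, people estimate; model that here as Gaussian noise.
        return max(CAPACITY + 1, round(random.gauss(n_circles, 0.15 * n_circles)))

    random.seed(1)
    for n in (5, 7, 12, 20):
        print(n, "circles -> reported", report_count(n))

Runs with five or seven circles come back exact; runs with 12 or 20 come back as plausible guesses, which is roughly the pattern Miller reported.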

What Miller called the informational bottleneck has been recognized as a profound constraint on human cognition. Crudely speaking, there are two ways to manage its effects. One is to "chunk" information so that you can, in effect, pack more material into one of those seven units. As Miller put it, "A man just beginning to learn radiotelegraphic code hears each dit and dash as a separate chunk. Soon he is able to organize these sounds into letters, and then he can deal with the letters as chunks. Then the letters organize themselves as words, which are still larger chunks, and he begins to hear whole phrases." That sort of process is obviously central to many kinds of learning.
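Miller's radiotelegraphy example can be sketched in a few lines of Python. The three-letter code table below is a toy stand-in rather than real Morse pedagogy; the point is only that re-encoding shrinks the number of working-memory units the same signal occupies:

    MORSE = {".-": "A", "-...": "B", "-.-.": "C"}  # toy code table

    signal = ".- -... -.-. .- -... -.-."  # six letters' worth of dits and dashes

    # A novice hears every dit and dash as its own chunk: 20 units here,
    # well past the bottleneck.
    novice_chunks = [mark for group in signal.split() for mark in group]

    # A practiced operator re-encodes the same signal as letters: 6 chunks,
    # comfortably inside it.
    expert_chunks = [MORSE[group] for group in signal.split()]

    print(len(novice_chunks), "novice chunks ->", len(expert_chunks), "expert chunks")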

The second method for managing the bottleneck—and the one that concerns us here—is to manage attention so that unwanted stimuli do not crowd the working memory. That might sound simple. But as the Swedish neuroscientist Torkel Klingberg explains in his recent book The Overflowing Brain: Information Overload and the Limits of Working Memory (Oxford University Press), scholars are far from agreement about how to describe the relationship between attention and working memory. Does a poor attention system cause poor working-memory performance, or does the causation sometimes work in the other direction?

One common metaphor is that controlled attention acts as a "nightclub bouncer," preventing irrelevant stuff from getting into working memory. A few years ago, Klingberg and a colleague conducted brain-imaging experiments that suggested that a region known as the globus pallidus seems to be highly active when people successfully fend off distraction.

"Why is it that some people seem to reason well and others don't?" asks Michael J. Kane, an associate professor of psychology at the University of North Carolina at Greensboro. "Variability in working-memory capacity accounts for about half the variability in novel reasoning and reading comprehension. There's disagreement about what to make of that relationship. But there are a number of mechanisms that seem to be candidates for part of the story."

One of those seems to be attentional, Kane says. "The view that my colleagues and I are putting forward is that part of the reason that people who differ in working-memory capacity differ in other things is that higher-working-memory-capacity people are simply better able to control their attention."

In other words—to borrow a metaphor from other scholars—people with strong working-memory capacities don't have a larger nightclub in their brains. They just have better bouncers working the velvet rope outside. Strong attentional abilities produce stronger fluid intelligence, Kane and others believe.

Attention and distraction are entangled not only in reasoning and working memory, but also in the encoding of information into long-term memory.

In 2006 a team of scholars led by Karin Foerde, who is now a postdoctoral fellow in psychology at Columbia University, reported on an experiment suggesting that distraction during learning can be harmful, even if the distraction doesn't seem to injure students' immediate performance on their tasks.

Foerde and her colleagues asked their subjects to "predict the weather" based on cues that they slowly learned over many computer trials. For example, seeing an octagon on the screen might mean that there was a 75-percent chance of rain on the next screen. The subjects would never be told the exact percentage, but gradually they would learn to infer that most of the time, an octagon meant rain.
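A single training trial of that kind can be sketched in Python. The cue names and probabilities below are assumptions in the spirit of the example above, not Foerde's actual stimuli:

    import random

    CUE_RAIN_PROB = {"octagon": 0.75, "triangle": 0.20}  # assumed P(rain | cue)

    def run_trial(cue):
        """Show a cue, then reveal the probabilistically determined weather."""
        return "rain" if random.random() < CUE_RAIN_PROB[cue] else "sun"

    # Learners never see the percentages, only trial-by-trial outcomes, yet
    # over many trials they come to predict rain whenever the octagon appears.
    random.seed(0)
    outcomes = [run_trial("octagon") for _ in range(200)]
    print("observed rain rate for octagon:", outcomes.count("rain") / len(outcomes))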

During one of their four training runs, the subjects were distracted by a task that asked them to count musical tones while they did the forecasting. At first glance, the distraction did not seem to harm the subjects' performance: Their "weather forecasts" under distraction were roughly as accurate as they were during the other three trials.

But when they were asked afterward to describe the general probabilistic rules for that trial (for example, a triangle meant sunshine 80 percent of the time), they did much worse than they did after the undistracted trials.

Foerde and her colleagues argue that when the subjects were distracted, they learned the weather rules through a half-conscious system of "habit memory," and that when they were undistracted, they encoded the weather rules through what is known as the declarative-memory system. (Indeed, brain imaging suggested that different areas of the subjects' brains were activated during the two conditions.)

That distinction is an important one for educators, Foerde says, because information that is encoded in declarative memory is more flexible—that is, people are more likely to be able to draw analogies and extrapolate from it.

"If you just look at performance on the main task, you might not see these differences," Foerde says. "But when you're teaching, you would like to see more than simple retention of the information that you're providing people. You'd like to see some evidence that they can use their information in new ways."

If single-minded attention is vital to learning, how far should college instructors go to protect their students from distraction? Should laptops be barred at the classroom door?

One prominent scholar of attention is prepared to go even further than that.

"I'm teaching a class of first-year students," says David E. Meyer, a professor of psychology at the University of Michigan at Ann Arbor. "This might well have been the very first class they walked into in their college careers. I handed out a sheet that said, 'Thou shalt have no electronic devices in the classroom.' ... I don't want to see students with their computers out, because you know they're surfing the Web. I don't want to see them taking notes. I want to see them paying attention to me."

Wait a minute. No notes? Does that include pen-and-paper note-taking?

"Yes, I don't want that going on either," Meyer says. "I think with the media that are now available, it makes more sense for the professor to distribute the material that seems absolutely crucial either after the fact or before the fact. Or you can record the lecture and make that available for the students to review. If you want to create the best environment for learning, I think it's best to have students listening to you and to each other in a rapt fashion. If they start taking notes, they're going to miss something you say."

Give Meyer his due. He has done as much as any scholar to explain how and why multitasking degrades performance. In a series of papers a decade ago, he and his colleagues determined that even under optimal conditions, it takes a significant amount of time for the brain to switch from one goal to another, and from one set of rules to another.

"I've done demonstrations in class," Meyer says, "whereby they can see the costs of multitasking as opposed to paying attention diligently to just one stream of input."

He might, for example, ask students to recite the letters A through J as fast as possible, and then the numbers 1 through 10. Each of those tasks typically takes around two seconds. Then he asks them to interweave the two recitations as fast as they can: "A, 1, B, 2," and so on. Does that take four seconds? No, it typically requires 15 to 20 seconds, and even then many students make mistakes.

"This is because there is a switching time cost whenever the subject shifts from the letter-recitation task to the number-recitation task, or vice versa," Meyer says. "And those extra time costs quickly add up."

Several other scholars of attention, however, concede that they haven't tried to set firm rules about laptops in class.

"I've thought about having a special laptop section in my lecture hall," says Kane, the psychologist at Greensboro. "That way students wouldn't have to be distracted by their neighbors' screens if they don't want to be." Beyond that, however, Kane is reluctant to move. Many students do legitimately take notes on laptops, and he doesn't want to prevent that.

Stanford's Nass, likewise, allows laptops in his classes, though he feels sheepish about that choice, given his research. "It would just seem too strange to ban laptops in a class on computers and society," he says.

Many other scholars say instructors should make peace with the new world of skimming and multitasking. N. Katherine Hayles, a professor emerita of English at the University of California at Los Angeles, has argued in a series of essays that the new, multimedia world generates "hyper attention"—which is different from, but not necessarily worse than, attention as traditionally understood. In a media-rich environment, she believes, young people's brains are getting better at making conceptual connections across a wide variety of domains.

"One of the basic tenets of good teaching is that you have to start where the students are," Hayles says. "And once you find out where they are, a good teacher can lead them almost anywhere. Students today don't start in deep attention. They start in hyper attention. And our pedagogical challenge will be to combine hyper attention with deep attention and to cultivate both. And we can't do that if we start by stigmatizing hyper attention as inferior thinking."

Nass is skeptical. In a recent unpublished study, he and his colleagues found that chronic media multitaskers—people who spent several hours a day juggling multiple screen tasks—performed worse than otherwise similar peers on analytic questions drawn from the LSAT. He isn't sure which way the causation runs here: It might be that media multitaskers are hyperdistractible people who always would have done poorly on LSAT questions, even in the pre-Internet era. But he worries that media multitasking might actually be destroying students' capacity for reasoning.

"One of the deepest questions in this field," Nass says, "is whether media multitasking is driven by a desire for new information or by an avoidance of existing information. Are people in these settings multitasking because the other media are alluring—that is, they're really dying to play Freecell or read Facebook or shop on eBay—or is it just an aversion to the task at hand?"

When Nass was a high-school student, decades ago, his parents were fond of an old quotation from Sir Joshua Reynolds: "There is no expedient to which man will not resort to avoid the real labor of thinking." That is the conundrum that has animated much of his career.

"I don't think that law students in classrooms are sitting there thinking, Boy, I'd rather play Freecell than learn the law," Nass says. "I don't think that's the case. What happens is that there's a moment that comes when you say, Boy, I can do something really easy, or I can do something really hard."

David Glenn is a senior reporter at The Chronicle.

Comments

1. abbiatti - February 01, 2010 at 09:47 am

I wonder what the research says about the attention span of the e-learner versus the traditional classroom learner. Are students more attentive when they have no recourse but to take the majority of the responsibility for learning? E-courses, when created appropriately, are very challenging and require a great deal of on-task time to complete. Does a lack of attention span and task orientation account for any portion of the course-completion problem we see in e-courses? Is this the same as maturity, etc.?

2. jsener - February 01, 2010 at 10:40 am

Can't help but wonder the extent to which the potential downsides of multitasking, and the accompanying "illusion of competence", also apply to instructors - in particular, how so many of them seem to believe that they can tell whether or not each of their students is "getting it". It's clear that there's some extent of "illusion" (or more accurately, delusion) there as well...

3. fluffysingler - February 01, 2010 at 12:16 pm

The studies seem to compare relatively similar tasks. What about pairing a kinesthetic task (like typing or taking notes) with a conceptual task like the recitation or the LSAT questions?

4. hnturnbow - February 01, 2010 at 10:23 pm

Regarding the instructor who forbids note-taking during class: While some students may benefit from this policy, I am almost certain that others will suffer.

Speaking as a person with ADD, I have extreme difficulty maintaining my focus on a lecture unless I am actively taking notes. I can also focus on reading or writing for longer periods of time when there is instrumental music playing in the background. Without these so-called "distractions" my brain will simply drift off into the sunset, and I will become anxious trying to force my attention back where it belongs.

As a teacher, I try to recognize that different people's minds work in different ways - I don't allow blatant alternate activities like cell phone or internet use, but note-taking or doodling during lectures is harmless and often helpful to some people. Just a thought:)

5. provostialdramaturg - February 02, 2010 at 09:47 am

I am eager to see more studies-- and hear more anecdotes-- around the notion that "media multitasking might actually be destroying students' capacity for reasoning." I share this view as a half-formed perception of a looming problem (see also Nicholas Carr's 2008 article in The Atlantic, "Is Google Making Us Stupid?") affecting not only the classroom but the professions. Meanwhile, I believe it is important to provide students with opportunities to "think longer thoughts" by posing problems and setting tasks that don't easily yield to the quick hit.

And by the way, I am an inveterate doodler, both in my student days and now in faculty and staff meetings.

6. dollardogs - February 02, 2010 at 04:53 pm

Somebody should pass this story along to the professor who, in an article here a couple months back, actually endorsed letting his students "twit" in his class.

7. timebandit - February 02, 2010 at 04:59 pm

I am also concerned about the use of laptops in class. From my own experience, I focus better when taking notes by hand, even if those notes are less searchable than typed ones. I think there are several reasons, one being that the act of writing makes me attend to key concepts, so even if I cannot write down all of the items, my recall of those key concepts will be better. (Plus, diagrams by hand are almost always better unless one works with a tablet-style laptop.)

We can also put forward a pragmatic reason or two. Just as not keeping cookies in the house keeps one from snacking, taking notes by hand allows me to avoid the temptation of checking email or looking up some reference on the web, and then becoming distracted from the flow of the lecture. (Interesting developments on willpower research also.)

I imagine both of these factors, among others, are at work, and the developing research on multitasking promises to be very interesting to watch. On the other hand, aside from the web surfers distracting others with flash graphics and animated kitties, is the role of a professor really to dictate that teenagers pay attention in class?

8. ianative - February 03, 2010 at 10:36 am

As an instructional designer, I've often warned the faculty I work with about the "attention assumption" -- the assumption that if students are paying attention (or appear to be paying attention), they are learning. Cognitive-processing research tells us that unless we're actively doing something with content, we're unlikely to move it into long-term storage. Taking notes well is one way to "do something" with content, while listening to the teacher may involve little actual thinking, so I'd discourage those bans on note-taking.

9. mawickline - February 03, 2010 at 10:37 am

I can easily see that computers offer a distraction (happens to me all the time), but I was surprised that Dr. Meyer also bans pencil-and-paper note-taking. I like to use a computer to take notes because I can type faster than I can write, but I often use pencil and paper in meetings to keep my attention focused on what is being discussed. If I don't do that, my mind is subject to wander. His approach seems to assume our minds won't wander without external "devices" and that we're all auditory learners. I think that first kind of unconscious drifting is much more likely to happen when I don't direct my attention through writing. So I join the consensus that note-taking may be as useful for some as it is potentially distracting for others. Thank you for a very interesting article.

10. eytanfichman - February 03, 2010 at 03:21 pm

I went back to school for a second master's degree at the age of 50, after teaching for about 20 years. During those mid-career studies I took extensive notes on my laptop and found myself writing notes to myself on my PDA at all sorts of times, including while walking outside from one place to another or while browsing in bookstores.

Taking notes in class did sometimes disturb my attention. For that reason I sometimes stopped taking notes and, at those times, only listened (or spoke, if I was part of class dialog). All this note-taking gave me resources that I value very highly; resources I still refer to and reflect on. I am glad none of my instructors discouraged my note-taking.

I do think telling students about the research on multi-tasking is a good idea - then students can take responsibility for their learning processes in a more informed way.

11. fergbutt - February 03, 2010 at 04:24 pm

An interview with Nass was shown last night on PBS Frontline. You can show the clip to your students by going to pbs.org and finding the segment. But I think the 1-10, A-J demonstration would work as well.

12. tekton - February 05, 2010 at 01:30 pm

Just the other day, I taught a lab section (no TAs in my program) in which I had handed out a map followed by an explanation of the steps the students needed to follow to complete the exercise. Predictably, students started to work on the map before I finished giving the instructions, so I told them flatly to stop working on the map so they could pay attention to what I was saying. Two girls continued to work on the map as I talked; the rest at least appeared to pay attention. After I was finished, the students all started to work. As I walked around the room, one of the two girls who had continued working on the map looked up and innocently asked me, "What are we supposed to do to finish the map?" I told her that I had instructed all of them to stop working and pay attention so they wouldn't miss out on the complete instructions. The girl confidently replied, "Oh, I'm good at multitasking. I can listen and work at the same time." I told her that research shows that multitasking does not work nearly as well for students as they think it does (I also told the two to read the instructions and figure it out for themselves).

As others have said, we deal continually with the age-old problem of people not paying attention. But now students are being told - in our case by our Administration among others - that they're "wired" differently because they've grown up in the age of interactive electronic media, and they believe it. This is both dismaying, because the research shows that students don't actually benefit from multitasking, and intriguing. As mentioned in the article, young people ARE learning a different way of processing information. As I understand it (I'm not a brain scientist), brains develop 'wiring' according to what is habitual. What I wonder is, will the "new" way of processing information suit young people better in the world of tomorrow, in which computers will in fact do more of our information processing on increasingly deeper levels? Or does the "new" way just make young people stupider, leading to the demise of our society and all that is good in the world? Is there an upside to multitasking that outweighs the downside?

Clearly the two girls in my lab experienced a downside; I tried to make a point of it to them. But, as Prof. Hayles suggests, multitasking seems unavoidable; it's part of a culture that young people feel entitled to. The more we oppose it, the more young people may see themselves as exceptions to the rule, the way young people always see their generation as exceptional. Multitasking is here, like it or not. Can we use it to our advantage as teachers?

13. vernrogers - February 05, 2010 at 03:27 pm

The fact that a person's attention is increasingly pulled in so many directions will surely be a point of interest for the foreseeable future. This article reminded me of a story on NPR entitled "Passions of the Brain." Here is the link: http://www.npr.org/templates/story/story.php?storyId=101334645

14. arrive2__net - February 06, 2010 at 12:50 am

This research is valuable because these are questions that can be answered by research, and the answers, if applied, can be powerful guides to what works. Use of the term multi-tasking implies there could be a problem with it ... because multi-tasking implies there is more than one complete task going on. What is the boundary between a single complex task and multi-tasking (where further synthesis is not attainable)? Sometimes what starts off as multi-tasking becomes tasking. For example, a soccer player dribbles the ball, looks for opportunities to pass, tracks the player guarding him or her, looks for opportunities to kick on goal, tracks foot position (so as to have the appropriate foot free to kick at the right time), etc. When the player is first learning, these are separate tasks to be learned, but then with proficiency the tasks seem to merge together. But if you add talking on a cell phone to the player's tasks, the merger is likely to be less complete and the player may be permanently "multi-tasking". Also, I think of note-taking in a lecture as something that starts as multi-tasking and merges into a single task ... in taking notes you bring to the learning your own prior associations, you cognitively create and record your way of conceptualizing the ideas, and you perform other details that would seem to me to facilitate the learning.
Bernard Schuster
Arrive2.net

15. davsch65 - February 08, 2010 at 09:40 am

The degree to which two or more tasks can be performed simultaneously depends largely on how similar the two tasks are. The more similar, either in terms of the material attended to, the mental computations carried out, or the actions executed, the more the tasks interfere with each other. It is very difficult to simultaneously carry on two different conversations but rather easy to carry on one conversation while preparing a meal or painting a room. Most of us can walk and chew gum at the same time. And even highly similar tasks can be performed in parallel with minimal cost if the person performing them is sufficiently well practiced, such as a chess grandmaster playing multiple games at the same time. We should be careful not to let the limitations of the college sophomores who provide the data for most cognitive psychology experiments or the artificiality of the tasks they are asked to perform distort our view of what people are capable of under favorable conditions.

16. slenjules - February 12, 2010 at 02:19 pm

As a current graduate student, I find the idea of not taking notes in class intriguing. I know I am not able to write quality notes and listen effectively at the same time, and therefore miss some of the statements made by the lecturing professor. That brings to mind the much-debated question about the effectiveness of the lecture for learners. (See Steve Jones's article "Reflections on the lecture: outmoded medium or instrument of inspiration?" in the Journal of Further and Higher Education (2007) for a cogent discussion.) I think lectures can be effective, but most professors do not give effective lectures. Incorporating discussions, cooperative learning, and active-learning strategies, for example, would greatly enhance the effectiveness of a lecture. But those seem to be rare occurrences. So maybe the first step is to be conscious of the students' need to process and write notes on the ideas presented before moving on to the next point in the lecture. Wouldn't most people agree that student learning is more important than lecturer expediency? Small measures may help facilitate that learning.

17. kubina - February 13, 2010 at 03:00 pm

Having a video recording of a lecture and the instructor's slides is something I would have loved to have had when going to college; I would have learned a lot more being able to review a lecture and annotate it for later review. I also think it would provide useful feedback to the instructor to see how well the lecture was learned and where improvements should be made. But I also know I retain more of a lecture if I write some notes during it.

I suspect my optimal retention from a lecture would be achieved by jotting short notes tagged with the time in the lecture, so that I could jump to those moments when reviewing its video. I learn best when I can interact with the instructor to clarify the points I don't understand. Usually this helps the other students, but sometimes it just slows down the rate of learning for everyone else if they already understood the point that I didn't.

18. matt_matson - February 24, 2010 at 10:32 pm

Eliminating laptops seems reasonable because computers are capable of receiving and displaying non-relevant, distracting information.

Pen-and-paper note-taking is quite different. The notepad is not in competition with the lesson, but an aid in digesting the information. A student must take a professor's lengthy communication and transform it into a shorter note. The process requires understanding the content.

Taking notes by any means requires constant attention. Without any note-taking, students are likely to simply stop listening for significant periods of time.

And there is science supporting having a note-pad available--even for doodling:

http://bps-research-digest.blogspot.com/2009/12/what-does-doodle-do-it-boosts-your.html


19. dubliner1 - March 03, 2010 at 04:00 pm

While the title of this article captured my attention, after scrolling down to see how long it was I decided that it probably would not hold my attention.

20. praymont - March 04, 2010 at 01:31 pm

The philosopher Wittgenstein used to ban note-taking in his classes. I could never understand why. For me, note-taking is part of genuine 'active learning' -- thinking about the lecturer's points, coming up with my own ways of phrasing/clarifying them, my own examples to illustrate them, questions that I'd like to pose, etc.

Laptops should be banned. To permit them is like inviting each student to bring a home-entertainment system to class.

21. annam78 - March 04, 2010 at 05:09 pm

All of this is interesting to debate, if only I had more time... And that may be the advantage of the multi-tasker. Many jobs require multi-tasking, even if it means that each task is completed with lower quality/accuracy than it would be if the other tasks were neglected. It all depends on priorities and goals. I'm amused that the LSAT is used for a study on multi-tasking. It is one of the few standardized tests not administered electronically. All it tells us is that people who will perform well in law school (since that is what the LSAT indicates) are not multi-taskers or that multi-taskers are less successful in law school. No surprise there - law school requires long hours of singular focus. Conversely, successful entrepreneurs are often amazing multi-taskers, but would never have succeeded in law school. Different work (and learning) requires different skills. Multi-tasking is useful in some contexts and not others. Is this post as perfectly thought through and eloquent as I would like it to be? No. But I could not post anything and still meet my other obligations, in work and life, if I made everything perfect.

22. 11890636 - March 05, 2010 at 10:55 am

Thanks for the timely article and thoughtful comments. Not discussed explicitly is the impact of multitasking/distracted students on the instructor (or conference presenter, for that matter) who is not only competing for attention against multimedia, Internet-connected devices but also deprived of eye contact and other forms of visual feedback from students/audiences. Consider four canonical classroom formats:

(1) Seminar: small group of students prepared for discussion, with instructor as facilitator
(2) Socratic: medium size group of students prepared to discuss in more structured fashion, when called upon
(3) Lecture/medium size: most of the class devoted to presentation, yet instructor prepared to respond to questions -- and, potentially, backtrack or deviate depending on questions and other forms of visual feedback
(4) Lecture/large: essentially all of the class devoted to presentation

I suspect that (3) is an important format to understand in this context, since medium-size lectures are both widespread and amenable to enhancement (i.e., made livelier, more interactive, and more responsive to students' questions) if two-way communication can be established. Students focused on laptops (whether note-taking or web surfing) are not providing the visual or auditory feedback that could lead the instructor to engage more fully and effectively, resulting in the "canned lecture" that students feel justified in "tuning out" (or in wondering why the audio and visuals can't simply be put on the web, since the "classroom experience" added nothing).
