I’m excited to be teaching linear algebra again next semester. Linear algebra has it all — there’s computation that you can do by hand if you like that sort of thing, but also a strong incentive to use computers regularly and prominently. (How big is an incidence matrix that represents, say, Facebook?) There’s theory that motivates the computation. There’s computation that uncovers the theory. There’s something for everybody, and in the words of one of my colleagues, if you don’t like linear algebra then you probably shouldn’t study math at all.
Linear algebra is also an excellent place to use Peer Instruction (PI), possibly more so than any other sophomore-level mathematics course. Linear algebra is loaded with big ideas that all connect around a central question (whether or not a matrix is invertible). The computation is not the hard part of linear algebra — it’s forging a real understanding of the ideas and concepts in the subject and coming to terms with how they relate. And this is where PI steps in, as a teaching method that focuses specifically on students constructing their own conceptual knowledge through engagement with important questions. The last time I taught linear algebra, I folded in PI as part of the spectrum of pedagogies used in the class, and it went really well. When I was assigned linear algebra to teach next semester, I knew that I wanted to use PI as much as possible.
However, one of the big roadblocks in using PI in linear algebra is the traditional, text-based multiple-choice formatting of the conceptual questions or “ConcepTests” that PI uses. Some questions just don’t work very well in that format, for example if you want students to draw something rather than pick from a menu, or if you want students to be able to select as many answers as they think apply from a list rather than just one. So when I first heard about a new web-based software venture called Learning Catalytics, created by some folks closely connected to PI godfather Eric Mazur and designed specifically with PI in mind, I took note. I test-drove the software last summer in my Calculus II course, and it was brilliant.
Learning Catalytics (LC) uses a “bring-your-own-device” classroom response system model where the questions given to students are created on the web, served out to students on the web, and then students access and respond to them using any device that has a web browser — laptop, tablet, smartphone, e-reader, you name it. Student responses are sent back to the instructor through the web and stored in the instructor’s account for viewing, archiving, and so forth. What is really great about LC, though, is that it lets me create more than just the usual multiple-choice, select-the-best-answer questions. I can create multiple selection questions (“choose all that apply”), questions requiring freeform text answers, questions where students click or tap on an image and the software creates a heat map, questions where students draw on an image, and so on. The playbook is wide open.
Better still, LC lets instructors tag their questions, and the questions are stored in a global repository. This essentially creates a huge user library of clicker questions that I can dip into if needed, and I get to contribute something to the user community by making up my own questions. Also, the questions allow LaTeX typesetting, and the developers are working on letting users export question modules to LaTeX as well.
My students in Calculus II liked LC so much that when the 30-day free trial ran out and we went back to ordinary Turning Point clickers, they vocally complained. There’s a lot to like from their perspective. A single-semester license of LC costs only $12, versus $35 and up for a clicker device, which these days is not likely to get used outside of a single course. LC uses the technology that students already have (almost all of them; more on that later) rather than forcing them to buy a device they may not use in the future.
I knew almost immediately after the Calculus II class was over that I wanted to use LC to implement peer instruction in linear algebra. The biggest potential issue with doing this is that one or more students might not have a device they can bring. This happened in the Calculus II course: one student didn’t have a computer or smartphone, couldn’t afford one, and didn’t know anybody well enough to borrow one. Fortunately my department has an iPad 1 that we use as a test device for instructors, so I just let her use it. But what if we’d had 5 students in her situation?
So part of my planning for the course involved writing a grant proposal to pay for 60 student licenses of Learning Catalytics (enough to cover both sections I am teaching) and for inexpensive handheld computers that could be loaned out to students who needed devices during class (for whatever reason). That grant was funded by our Faculty Teaching and Learning Center, so I’m currently getting all the technology set up.
In looking around for the handheld devices, I wanted to find the absolute cheapest thing money could buy that had a real web browser and didn’t suck. Originally these were going to be the iPod Touch ($199). This was back in October and November. But then, as the holiday season approached, there was a mass markdown in prices among several good alternatives like the Google Nexus 7 tablet, the Barnes & Noble Nook, and the Kindle Fire. All of these are now in the same price range as the iPod Touch and feature a nice roomy 7” screen. After doing research and asking around, I eventually went with the Nexus 7. I now have a fleet of nine of these (I’ll order the tenth one after we settle some tax-exemption business with Google) and they are set up and ready for students to get their hands on them.
One of the things I’m especially interested in seeing is whether questions that admit more constructive input correlate with improved student learning, compared to more static multiple-choice questions on the same concept. For example, given a couple of vectors, you could ask “Which of the following are linear combinations of these vectors?” A static multiple-choice question would give several possibilities, including ones like “Just (a) and (b)” or “None of the above”. A more constructive question would give a list of possibilities and ask “Check all that apply”, thereby putting students into a larger space of answers to consider before clicking. Does the kind of question matter in terms of student learning? Nobody really knows right now. As far as I can see, there’s not much in the literature on how BYOD clicker setups compare to traditional clicker devices in terms of effects on student learning.
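Incidentally, the linear-combination question above has a mechanical answer key: a vector w is a linear combination of some given vectors exactly when the linear system whose coefficient matrix has those vectors as columns is consistent. Here is a minimal sketch of that check in Python with NumPy — the specific vectors are made-up examples of my own, not from any actual ConcepTest:

```python
import numpy as np

# Hypothetical example vectors in R^3 (my own, for illustration).
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])

def is_linear_combination(w, vectors, tol=1e-10):
    """Return True if w is a linear combination of the given vectors.

    Builds the matrix A whose columns are the vectors, solves A x = w
    by least squares, and checks whether the residual is (numerically)
    zero -- i.e., whether the system is consistent.
    """
    A = np.column_stack(vectors)
    x, *_ = np.linalg.lstsq(A, w, rcond=None)
    return bool(np.allclose(A @ x, w, atol=tol))

print(is_linear_combination(np.array([2.0, 3.0, 7.0]), [v1, v2]))  # True: 2*v1 + 3*v2
print(is_linear_combination(np.array([0.0, 0.0, 1.0]), [v1, v2]))  # False
```

A check like this is handy for generating the “all that apply” answer pool: you can produce candidate vectors and know for certain which ones are genuine linear combinations before the question goes live.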
I’m really looking forward to all this. Any questions?