In this blog post from a couple of weeks ago, Marshall Thompson writes that he’s concerned about the tone of the recent and ongoing Khan Academy/#mtt2k debate and worried about the cost it incurs. It’s a good post, and in the process of commenting on it I realized a few things. Marshall writes:

I get the impression that KA has a goal of pedagogical soundness. Is this the best way to help them achieve that goal?

Sal Khan is not a dummy. He is clearly working through some of the same pedagogical misconceptions we all worked through (and continue to work through). How can we best help him through his personal journey without alienating him or causing him to be defensive?

I have tremendous respect for Sal Khan, but I have to admit that I’m not really concerned about his personal journey or his working through pedagogical misconceptions. It would be fantastic if he began…

This week I am adding to the playlist of screencasts for the inverted intro-to-proofs class I first mentioned here. There are seven chapters in the textbook we are using and my goal is to complete the screencasts for the first three of those chapters prior to the start of the semester (August 27). Yesterday I added four more videos and I am hoping to make four more tomorrow, which will get us through Chapter 1.

The four new ones focus on conditional (“if-then”) statements. I made this one the second in the series as a prelude to proofs, which are coming in Section 1.2 and which will remain the focus of the course throughout. Generally speaking, students coming into this course have had absolutely no exposure to proof, with the exception of geometry and maybe trigonometry, in which they hated proofs. Watch a part of this and see if you can figure out my …

Here’s the first (and so far, only) screencast that students will use in the inverted transition-to-proof class:

This one is a bit more lecture-oriented than I intend most of the rest to be, so it’s also a little longer than I expect most others will be. But I do break up the lecture a little bit with a “Concept Check”, which is the same thing as a ConcepTest except I’ve never warmed to that particular term (the word “test” puts students on edge, IMO).

If you have tried out any of Udacity’s courses or read my posts about taking Udacity courses, you will see some obvious inheritances here. I tried to keep the video short, provide simple but interesting examples, and give some measure of formative assessment in the video. I am exploring ways to make the Concept Check actually doable within YouTube — Camtasia 2 has an “interactive hotspot” feature I am trying to figure out — …

Let’s go back to the original Slate piece, which said:

Though no well-implemented study has ever found technology to be effective, many poorly designed studies have—and that questionable body of research is influencing decision-makers.

The Slate piece suggests that researcher bias, brought on by having a financial stake in…

At some point around the beginning of February 2012, David Coffey — a co-worker of mine in the math department at Grand Valley State University and my faculty mentor during my first year — mentioned something to me in our weekly mentoring meetings. We were talking about screencasting and the flipped classroom concept, and the conversation got around to Khan Academy. Being a screencaster and flipped-classroom person myself, I’d talked with him many times in the past about making screencasts more pedagogically sound.

That particular day, Dave mentioned this idea about projecting a Khan Academy video onto the screen in a classroom and having three of us sit in front of it, offering snarky critiques — but with a serious mathematical and pedagogical focus — in the style of Mystery Science Theater 3000. I told him to sign me up to help, but I got too busy to stay in the loop with it.

So, the six-week Calculus 2 class is over with — that didn’t take long — and there’s beginning to be enough distance between me and the course that I can begin to evaluate how it all went. Summer classes for me are a time when I like to experiment with things, and I wanted to comment on the outcomes of one experiment I tried this time, which is using a bring-your-own-device setup for clicker questions.

I’ve been using TurningPoint clickers ever since I started doing peer instruction, and I recommend these devices highly. They have a lot going for them as classroom technology: they are small and unobtrusive, relatively cheap ($35), exceedingly simple to use, require no pre-existing infrastructure (they work whether or not you have decent wifi in the room), and are nearly indestructible. They are about as simple, dependable, and inexpensive as a radio-operated garage door…

Slate magazine has been running several articles on education this week, including two today that are of interest. This one by Konstantin Kakaes is worth looking at more closely, if only because it somehow manages to gather almost every wrong idea about technology in education in existence into a single, compact article.

The piece proposes that the effort to increase the use of technology in education “is beginning to do to our educational system what the transformation to industrial agriculture has done to our food system over the past half century: efficiently produce a deluge of cheap, empty calories.” I’m not sure which “effort” Kakaes is referring to, since there is no single push being coordinated from a secret underground bunker that I know of, and some efforts are better-conceived than others. But never mind.

There are two overriding conceptual errors that drive this article…

The first speaker in the Model-Eliciting Activities (MEAs) session Monday morning said something that I’m still chewing on:

Misunderstanding is easier to correct than misconception.

She was referring to the results of her project, which took the usual framework for MEAs and added a confidence-level response item to student work. So students would work on their project, build their model, and when they were done, give a self-ranking of the confidence they had in their solution. When you find high confidence levels on wrong answers, the speaker noted, you’ve uncovered a deep-seated misconception.
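The speaker’s heuristic, as I understood it, could be sketched in a few lines of code. This is purely illustrative: the confidence scale, the threshold, and the field names are my assumptions, not details from the talk.

```python
# Illustrative sketch of the heuristic from the MEA talk: a wrong answer
# given with HIGH confidence suggests a misconception (deep-seated), while
# a wrong answer given with LOW confidence suggests a mere misunderstanding.
# The 1-5 confidence scale and the threshold of 4 are assumed for the sketch.

HIGH_CONFIDENCE = 4  # threshold on an assumed 1-5 self-ranking scale

def classify(response):
    """Classify one student response, given as a dict with
    'correct' (bool) and 'confidence' (int, 1-5) keys."""
    if response["correct"]:
        return "understands"
    if response["confidence"] >= HIGH_CONFIDENCE:
        return "misconception"    # confidently wrong: harder to correct
    return "misunderstanding"     # tentatively wrong: easier to correct

# Hypothetical responses from three students on one model-building task
responses = [
    {"student": "A", "correct": True,  "confidence": 5},
    {"student": "B", "correct": False, "confidence": 5},
    {"student": "C", "correct": False, "confidence": 2},
]

for r in responses:
    print(r["student"], classify(r))
```

The point of the confidence item is exactly this fork in the middle: without it, students B and C look identical in the gradebook, even though (per the speaker’s distinction) they need very different kinds of help.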

I didn’t have time, but I wanted to ask what she felt the difference was between a misunderstanding and a misconception. My own answer to that question, which seemed to fit what she was saying in the talk, is that a misunderstanding is something like an incorrect interpretation of an idea …

Online educational startup Udacity, with which I had a very positive experience while taking their CS 101 course, is taking things a bit further by partnering with Pearson. They’ll be using Pearson VUE testing centers worldwide to provide proctored final exams for some of their courses (presumably all of their courses will be included eventually), leading to an official credential and participation in a job placement service.

Before, students watched the videos and did homework assignments online and then took a final exam at the end of the semester. In the first offering of CS 101, the “grade” for the course (the kind of certificate you got from Udacity) depended on either an average of homework scores and the final exam or on the final exam alone. Most Udacity courses these days just use the final exam. But the exam is untimed and unproctored, and there’s absolutely nothing…

I blog a lot about peer instruction, but I think this screenshot from this morning’s Calculus 2 class is worth 1000 of my blog posts about just how effective a teaching technique PI can be. It’s from a question about the average value of a function. Just before this question was a short lecture about average value in which I derived the formula and did an example with a graph of data (not as geometrically regular as the one you see below). I used Learning Catalytics to set up the question as Numerical, which means that students see the text and the picture on their devices along with a text box in which to enter what they think is the right answer. (I.e. it’s not multiple choice.) Here are the results of two rounds of voting:

After the first round of voting, there were 12 different numerical answers for 23 students! (Some of these would be the same answer if students …
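For reference (the derivation from the lecture isn’t reproduced in this excerpt), the standard formula the question is testing is: for a function $f$ continuous on $[a,b]$, the average value of $f$ over the interval is

```latex
f_{\mathrm{avg}} = \frac{1}{b-a} \int_a^b f(x)\,dx
```

The spread of twelve distinct numerical answers is easy to explain against this formula: forgetting the $\frac{1}{b-a}$ factor, misreading the interval endpoints, or miscomputing the integral each produce a different wrong number, and a Numerical question (unlike multiple choice) surfaces every one of them.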

I am a mathematician and educator with interests in cryptology, computer science, and STEM education. I am affiliated with the Mathematics Department at Grand Valley State University in Allendale, Michigan. The views here are my own and are not necessarily shared by GVSU.

The Chronicle Blog Network, a digital salon sponsored by The Chronicle of Higher Education, features leading bloggers from all corners of academe. Content is not edited, solicited, or necessarily endorsed by The Chronicle.