Category Archives: Educational technology

July 18, 2012, 9:44 am

For proofs, just click “play”

This week I am adding to the playlist of screencasts for the inverted intro-to-proofs class I first mentioned here. There are seven chapters in the textbook we are using, and my goal is to complete the screencasts for the first three of those chapters before the start of the semester (August 27). Yesterday I added four more videos, and I am hoping to make four more tomorrow, which will get us through Chapter 1.

The four new ones focus on conditional (“if-then”) statements. I made this video as the second in the series, as a prelude to proofs, which are coming in Section 1.2 and which will remain the focus of the course throughout. Generally speaking, students coming into this course have had no exposure to proof apart from geometry and maybe trigonometry, where they hated proofs. Watch a part of this and see if you can figure out my …
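For anyone rusty on the terminology: a conditional statement has the form “if P, then Q.” A quick illustrative example, written in LaTeX (this particular example is mine, not one taken from the videos):

\[
\begin{aligned}
\text{Conditional } (P \Rightarrow Q):\quad & \text{if } n \text{ is divisible by } 4\text{, then } n \text{ is even.}\\
\text{Converse } (Q \Rightarrow P):\quad & \text{if } n \text{ is even, then } n \text{ is divisible by } 4.\\
\text{Contrapositive } (\neg Q \Rightarrow \neg P):\quad & \text{if } n \text{ is not even, then } n \text{ is not divisible by } 4.
\end{aligned}
\]

The conditional and its contrapositive are logically equivalent; the converse is a different statement (and in this example it is false, since 2 is even but not divisible by 4).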

Read More

July 13, 2012, 8:17 am

Screencasting for the inverted proofs class

Here’s the first (and so far, only) screencast that students will use in the inverted transition-to-proof class:

This one is a bit more lecture-oriented than I intend most of the rest to be, so it's also a little longer than I expect most of the others will be. But I do break up the lecture a bit with a “Concept Check”, which is the same thing as a ConcepTest except that I've never warmed to that particular term (the word “test” puts students on edge, IMO).

If you have tried out any of Udacity’s courses or read my posts about taking Udacity courses, you will see some obvious inheritances here. I tried to keep the video short, provide simple but interesting examples, and give some measure of formative assessment in the video. I am exploring ways to make the Concept Check actually doable within YouTube — Camtasia 2 has an “interactive hotspot” feature I am trying to figure out — …

Read More

July 5, 2012, 4:13 pm

Looking for evidence?

Amid all the shuffle of the #mtt2k phenomenon and my piece on Khan Academy this week — which is well on its way to being the most-read and -retweeted article I’ve ever done — Konstantin Kakaes put up a response to critiques of his Slate piece on educational technology. In it, he addresses both my critique and that of Paul Karafiol. I wanted to give just a few counter-critiques here. I haven’t had a chance to read Paul’s piece, so I’m just going to focus on the part of the response that referenced my post. (Here’s the full post I wrote about the Slate article.)

Let’s go back to the original Slate piece, which said:

Though no well-implemented study has ever found technology to be effective, many poorly designed studies have—and that questionable body of research is influencing decision-makers.

The Slate piece suggests that researcher bias, brought on by having a financial stake in…

Read More

July 3, 2012, 9:08 am

The trouble with Khan Academy

At some point around the beginning of February 2012, David Coffey — a co-worker of mine in the math department at Grand Valley State University and my faculty mentor during my first year — mentioned something in one of our weekly mentoring meetings. We were talking about screencasting and the flipped classroom concept, and the conversation got around to Khan Academy. Since I'm a screencaster and flipped-classroom person myself, we'd talked many times in the past about making screencasts more pedagogically sound.

That particular day, Dave mentioned this idea about projecting a Khan Academy video onto the screen in a classroom and having three of us sit in front of it, offering snarky critiques — but with a serious mathematical and pedagogical focus — in the style of Mystery Science Theater 3000. I told him to sign me up to help, but I got too busy to stay in the loop with it.

It…

Read More

June 29, 2012, 2:23 pm

The summer of BYOD

So, the six-week Calculus 2 class is over with — that didn't take long — and there's now enough distance between me and the course that I can begin to evaluate how it all went. Summer classes for me are a time to experiment, and I wanted to comment on the outcome of one experiment I tried this time: using a bring-your-own-device setup for clicker questions.

I’ve been using TurningPoint clickers ever since I started doing peer instruction, and I recommend these devices highly. They have a lot going for them as classroom technology: they are small and unobtrusive, relatively cheap ($35), exceedingly simple to use, and nearly indestructible, and they rely on no pre-existing infrastructure (it doesn't matter, for example, whether you have decent wifi in the room). They are about as simple, dependable, and inexpensive as a radio-operated garage door…

Read More

June 27, 2012, 11:52 am

Two big mistakes in thinking about technology in education

Slate magazine has been running several articles on education this week, including two today that are of interest. This one by Konstantin Kakaes is worth looking at more closely, if only because it somehow manages to gather almost every wrong idea about technology in education in existence into a single, compact article.

The piece proposes that the effort to increase the use of technology in education “is beginning to do to our educational system what the transformation to industrial agriculture has done to our food system over the past half century: efficiently produce a deluge of cheap, empty calories.” I’m not sure which “effort” Kakaes is referring to, since there is no single push being coordinated from a secret underground bunker that I know of, and some efforts are better conceived than others. But never mind.

There are two overriding conceptual errors that drive this article…

Read More

June 12, 2012, 7:00 am

Misunderstandings vs. misconceptions

The first speaker in the Model-Eliciting Activities (MEAs) session Monday morning said something that I’m still chewing on:

Misunderstanding is easier to correct than misconception.

She was referring to the results of her project, which took the usual framework for MEAs and added a confidence-level response item to student work. So students would work on their project, build their model, and, when they were done, give a self-ranking of the confidence they had in their solution. When you find high confidence levels on wrong answers, the speaker noted, you've uncovered a deep-seated misconception.

I didn’t have time to ask, but I wanted to know what she felt the difference was between a misunderstanding and a misconception. My own answer to that question, which seemed to fit what she was saying in the talk, is that a misunderstanding is something like an incorrect interpretation of an idea …

Read More

June 2, 2012, 10:27 am

Udacity to partner with Pearson for testing: What does this mean?

Online educational startup Udacity, with whom I had a very positive experience while taking their CS 101 course, is taking things a bit further by partnering with Pearson. They’ll be using Pearson VUE testing centers worldwide to provide proctored final exams for some of their courses (presumably all of their courses will be included eventually), leading to an official credential and participation in a job placement service.

Before, students watched the videos and did the homework assignments online, then took a final exam at the end of the semester. In the first offering of CS 101, the “grade” for the course (the kind of certificate you got from Udacity) depended either on an average of the homework scores and the final exam or on the final exam alone. Most Udacity courses these days just use the final exam. But the exam is untimed and unproctored, and there’s absolutely nothing…

Read More

May 8, 2012, 12:52 pm

A screenshot that illustrates what peer instruction can do

I blog a lot about peer instruction, but I think this screenshot from this morning’s Calculus 2 class is worth a thousand of my blog posts about just how effective a teaching technique PI can be. It’s from a question about the average value of a function. Just before this question was a short lecture about average value in which I derived the formula (restated below) and did an example with a graph of data (not as geometrically regular as the one you see below). I used Learning Catalytics to set up the question as Numerical, which means that students see the text and the picture on their devices along with a text box in which to enter what they think is the right answer. (I.e., it’s not multiple choice.) Here are the results of two rounds of voting:

After the first round of voting, there were 12 different numerical answers for 23 students! (Some of these would be the same answer if students …
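For reference, the formula derived in that mini-lecture is, assuming the standard Calculus 2 definition, the average value of a function f on an interval [a, b]:

\[
f_{\text{avg}} = \frac{1}{b-a}\int_a^b f(x)\,dx
\]

So for a graph built from simple geometric pieces, the integral can be computed as a signed area and then divided by the length of the interval.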

Read More

May 7, 2012, 8:58 pm

How the technology works in Calculus 2

Today we started the spring-term, six-week Calculus 2 class that I’ve been writing about for the last few days. We had a good time today, getting comfortable with each other and reviewing the basics of the definite integral. Before we get too far into the term, I wanted to outline the technology infrastructure of the course.

For a long time, I’d used my institution’s learning management system (LMS) as the basic technology for the course, and everything else kind of fit around the LMS. At GVSU the default LMS is Blackboard. But after using Blackboard this past year, I decided that we have irreconcilable differences. I don’t ask much from my LMS; I mainly use it to archive files, provide a link to a central calendar, post grades, and make announcements. I don’t need the dozens of other features Blackboard offers, and the profusion of features in Blackboard tends to…

Read More