About a year ago, I started partitioning my Calculus tests into three sections: Concepts, Mechanics, and Problem Solving, worth 25, 25, and 50 points respectively. The Concepts items are intended to be ones where no calculations are performed; instead students answer questions, interpret the meanings of results, and draw conclusions based only on graphs, tables, or verbal descriptions. The Mechanics items are straight-up calculations with no context, like “take the derivative of \(y = \sqrt{x^2 + 1}\)”. The Problem-Solving items are a mix of conceptual and mechanical tasks and can be either instances of things the students have seen before (e.g. optimization or related rates problems) or some novel situation that is related to, but not identical to, the things they’ve done on homework and so on.

I did this to stress to students that the main goal of taking …

The calculus class took their third (and last) hour-long assessment yesterday. In the spirit of data analytics à la the previous post here, I made boxplots for the different sections of the test (Conceptual Knowledge (CK), Computation (C), and Problem Solving (PS)) as well as for the overall scores. Here are the boxplots for this assessment, put side-by-side with the boxplots for the same sections on the previous assessments. “A2” and “A3” mean Assessments 2 and 3.
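For readers who want to try this kind of analysis on their own grade data, here is a minimal sketch of the five-number summary that a boxplot displays, using made-up section scores (not the actual class data) and only the Python standard library:

```python
from statistics import quantiles

# Made-up section scores (percentages) -- NOT the actual class data.
scores = {
    "CK": [45, 50, 60, 62, 66, 70, 72, 75, 80, 85],
    "C":  [49, 55, 58, 61, 64, 68, 70, 73, 77, 82],
    "PS": [30, 38, 42, 48, 52, 55, 58, 60, 65, 71],
}

def five_number_summary(data):
    """Return (min, Q1, median, Q3, max) -- the five values a boxplot draws."""
    q1, q2, q3 = quantiles(data, n=4)
    return (min(data), q1, q2, q3, max(data))

for section, vals in scores.items():
    print(section, five_number_summary(vals))
```

A plotting library such as matplotlib will turn the same lists directly into boxplots (e.g. `plt.boxplot(list(scores.values()))`), but the numbers above are all the information the plot encodes.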

Obviously there is still a great deal of improvement to be had here — the fact that the class average is still below passing remains unacceptable to me — but there have been some definite gains, particularly in the conceptual knowledge department.

What changed between Assessment 2 and Assessment 3? At least three things:

The content changed. Assessment 2 was over derivative rules and applications; Assessment 3 covered…

I just graded my second hour-long assessment for the Calculus class (yes, I do teach other courses besides MATLAB). I break these assessments up into three sections: Concept Knowledge, where students have to reason from verbal, graphical, or numerical information (24/100 points); Computations, where students do basic context-free symbol-crunching (26/100 points); and Problem Solving, consisting of problems that combine conceptual knowledge and computation (50/100 points). Here’s the Assessment itself. (There was a problem with the very last item — the function doesn’t have an inflection point — but we fixed it and students got extra time because of it.)

Unfortunately the students as a whole did quite poorly. The class average was around a 51%. As has been my practice this semester, I turn to data analysis whenever things go really badly to try to find out what might have happened. I …

Last week, I wrote about structuring class time to get students to self-verify their work. This means using tools, experiences, other people, and their own intelligence to gauge the validity of a solution or answer without uncritical reference to an external authority — and being deliberate about it while teaching, resisting the urge to answer the many “Is this right?” questions that students will ask.

Among the many tools available to students for this purpose is Wolfram|Alpha, which has been blogged about extensively. (See also my YouTube video, “Wolfram|Alpha for Calculus Students”.) W|A’s ability to accept natural-language queries and produce multiple representations of everything it knows related to the query — together with the fact that it’s free and readily accessible on the web — makes it perhaps the most powerful self-verification tool…

Dave Richeson at Division By Zero wrote recently about a “proof technique” for proving equalities or inequalities that is far too common: Starting with the equality to be proven and working backwards to end at a true statement. This is a technique that is almost a valid way to prove things, but it contains — and engenders — serious flaws in logic and the concept of proof that can really get students into trouble later on.

I left a comment there that spells out my feelings about why this technique is bad. What I wanted to focus on here is something I also mentioned in the comments, which was that it’s so easy to take a “backwards” proof and turn it into a “forwards” one that there’s no reason not to do it.
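To make the conversion concrete, here is a hypothetical illustration of my own (not the problem from Dave's post): a “backwards” argument and its “forwards” repair, for the claim that \(x^2 + 1 \ge 2x\) for all real \(x\). The backwards version starts from the claim itself and ends at \((x-1)^2 \ge 0\); the valid version simply runs the same chain in reverse, starting from the true statement:

```latex
% Backwards (flawed): begins by asserting the very thing to be proven.
%   x^2 + 1 \ge 2x  \implies  x^2 - 2x + 1 \ge 0  \implies  (x-1)^2 \ge 0. "True, so done."
%
% Forwards (valid): reverse the chain, starting from the known truth.
\begin{align*}
(x-1)^2 &\ge 0 && \text{true for every real } x\\
x^2 - 2x + 1 &\ge 0 && \text{expanding the square}\\
x^2 + 1 &\ge 2x && \text{adding } 2x \text{ to both sides}
\end{align*}
```

The repair works here because every step is reversible; the danger of the backwards habit is that students apply it even when a step is not.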

Take the following problem: Prove that, for all natural numbers \(n\),

When I am having students work on something, whether it’s homework or something done in class, I’ll get a stream of questions that are variations on:

Is this right?

Am I on the right track?

Can you tell me if I am doing this correctly?

And so on. They want verification. This is perfectly natural and, to some extent, conducive to learning. But I think that we math teachers acquiesce to these kinds of requests far too often, and we continue to verify when we ought to be teaching students how to self-verify.

In the early stages of learning a concept, students need what the machine learning people call training data. They need inputs paired with correct outputs. When asked to calculate the derivative of \(5x^4\), students need to know, having done what they feel is correct work, that the answer is \(20x^3\). This heads off any major misconception in the formation of the concept…
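This kind of check can even be automated. As a small sketch (my own illustration, not anything from a course), a central-difference quotient lets a student verify a claimed derivative numerically without any authority telling them the answer:

```python
def central_difference(f, x, h=1e-6):
    """Numerically estimate f'(x) via the symmetric difference quotient."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: 5 * x**4          # the function in question
claimed = lambda x: 20 * x**3   # the derivative the student computed

# Compare the claimed derivative against the numerical estimate at a few points.
for x in [0.5, 1.0, 2.0]:
    est = central_difference(f, x)
    assert abs(est - claimed(x)) < 1e-4, (x, est, claimed(x))
print("claimed derivative matches the numerical estimate")
```

If the student had instead claimed \(20x^4\), the assertion would fail immediately — the training-data pairing of input and correct output, with no teacher in the loop.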

The video post from the other day about handling ungraded homework assignments went so well that I thought I’d let you all have another crack at designing my courses for me! This time, I have a question about really bad mistakes that can be made in a proof.

One correction to the video — the rubric I am developing for proof grading gives scores of 0, 2, 4, 6, 8, or 10. A “0” is a proof that simply isn’t handed in at all. And any proof that shows serious effort and a modicum of correctness will get at least a 4. I am reserving the grade of “2” for proofs that commit any of the “fatal errors” I describe (and solicit) in the video.

I got an email this afternoon from a reader who is interested in learning mathematics — as an adult, post-college. The reader has an advanced degree in a humanities discipline and never studied mathematics, but recently he’s become interested in learning and is looking for a place to start.

I recommended The Mathematical Experience by Davis and Hersh, How to Solve It by Polya, and any good college-level textbook in geometry (like Greenberg, or for a humanities person perhaps Henderson). I felt like these three books give an ample and accessible start at — respectively — the big picture and history of the discipline, the methodology of mathematicians, and a first step into actual mathematical content.

But I thought this was an interesting question, and I wonder if the other readers out there would have similar suggestions for books, articles, movies or documentaries… anything…

Sorry for the slowdown in posting. It’s been tremendously busy here lately with hosting our annual high school math competition this past weekend and then digging out from midterms.

Today in Modern Algebra, we continued working on proving a theorem that says that if \(a\) is a group element and the order of \(a\) is \(n\), then \(a^i = a^j\) if and only if \(i \equiv j \pmod{n}\). In fact, this was the third day we’d spent on this theorem. So far, we had written down the hypothesis and several equivalent forms of the conclusion, and I had asked the students what they should do next. Silence. More silence. Finally, I told them to pair off and exit the room: find a quiet spot somewhere else in the building, tell me where you’ll be, work on the proof for ten minutes, and then come back.

As I wandered around from pair to pair I was very surprised to…
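Not that it replaces the proof, but the theorem is easy to sanity-check empirically in a concrete group. Here is a quick sketch (my own illustration, not something from the class) using the multiplicative group of integers mod 7, where the element 3 has order 6:

```python
def mult_order(a, p):
    """Multiplicative order of a modulo a prime p (assumes p does not divide a)."""
    k, x = 1, a % p
    while x != 1:
        x = (x * a) % p
        k += 1
    return k

p, a = 7, 3
n = mult_order(a, p)   # order of a in the group (Z/7Z)*; here n = 6

# Theorem: a^i = a^j in the group  <=>  i ≡ j (mod n). Check a range of exponents.
for i in range(3 * n):
    for j in range(3 * n):
        assert (pow(a, i, p) == pow(a, j, p)) == (i % n == j % n)
print(f"order of {a} mod {p} is {n}; the equivalence holds for all exponents checked")
```

Of course the check proves nothing in general — but it is exactly the kind of concrete experimentation I wish the students had reached for during those silences.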

I am a mathematician and educator with interests in cryptology, computer science, and STEM education. I am affiliated with the Mathematics Department at Grand Valley State University in Allendale, Michigan. The views here are my own and are not necessarily shared by GVSU.

The Chronicle Blog Network, a digital salon sponsored by The Chronicle of Higher Education, features leading bloggers from all corners of academe. Content is not edited, solicited, or necessarily endorsed by The Chronicle. More on the Network...