May 15, 2013, 6:53 am
Sebastian Thrun of Udacity today announced that Udacity, Georgia Tech, and AT&T are teaming up to offer an online Master’s degree in Computer Science. Here is Thrun’s official announcement. The details are slim at this point but Thrun states that the course materials will be entirely free, that there will be a tuition charge if you want to have the actual credit-bearing Master’s degree certification, and non-credit certificates will be offered at “a much reduced price point”.
Without details, there’s not much to say at this point about all this, other than that this is clearly a major advance in the reach of massive open online courses. Udacity was the first to partner with brick-and-mortar universities to offer academic credit for MOOCs, and just as others are beginning to follow suit, they have made the leap into graduate education.
What this means for traditional…
May 8, 2013, 8:00 am
I was really fortunate this past weekend to host Dana Ernst and T.J. Hitchman, two colleagues (from Northern Arizona University and the University of Northern Iowa, respectively) at the Michigan MAA section meeting. They did a discussion panel on Teaching to Improve Student Learning that I organized, and we ended up talking a lot about inquiry-based learning, which both of these guys practice. After Dana blogged about the session, he got this tweet:
Dana, Brandon, and I exchanged some tweets after that, and I think generally we’re on the same page, but here’s my reasoning about this question and, more generally, about what does or does not fall under the heading of “flipped classroom”.
The main thing to keep in mind is the distinction between an instructional practice and a course design principle. This was the gist of my post a…
May 7, 2013, 7:33 pm
I lost a bet with my friend and colleague Dave Coffey (remember him?) over the NCAA men’s basketball tournament, and as a result, I owed Dave a guest post on his blog DeltaScape. Risky move on his part, in my opinion, since his blog is a consistently excellent source of wisdom about math education and teacher training. I hope I didn’t mess it up too much — my guest post, entitled “Does this make sense?”, is about so-called sense-making activities and what they mean for math instruction. It can be found here. Enjoy!
May 6, 2013, 8:00 am
Let’s go back to the research paper on screencasting that I first blogged about here. In that post, we saw that students in the study generally watched the screencasts, even without explicit rewards like grades, and they tended to do so strategically. But what about student learning? Did it help?
To answer that question, we have to go back to a previous paper by the authors [PDF]. (That one is in the queue this week to read and blog about.) In that paper, the authors did find a positive correlation between screencast use (which they tracked using stats from the class’s course management system) and overall performance. But this correlation does not imply causation, and indeed when the data are sliced along various demographic lines, sometimes the students’ performance was better explained by GPA than by screencast use.
I haven’t gotten into that second paper yet, but what …
April 25, 2013, 2:39 pm
The semester just ended, and I’m now in full retrospect mode. This semester I was fortunate to have only one prep — two sections of Linear Algebra. Linear algebra, for me, is the cornerstone of a modern mathematics education precisely because its concepts and its mechanics lie at the heart of so much real-world stuff — from web search algorithms to scheduling problems to computer graphics and many other areas. And yet, in a typical one-semester course on linear algebra you only get to touch on a handful of applications, and those tend to be sort of domesticated. A few years ago, I decided I wanted students to explore more than just the stock examples in the textbook, and I wanted them to do so in an authentic way that reflects real-world mathematical practice.
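To make the web-search connection concrete, here is a minimal sketch (not from the course itself) of PageRank-style power iteration, using a hypothetical four-page web. The link matrix `L`, the damping factor, and the iteration count are all illustrative assumptions; the point is just that ranking web pages reduces to repeated matrix–vector multiplication, a core linear algebra operation.

```python
import numpy as np

# Column-stochastic link matrix for a hypothetical 4-page web:
# column j distributes page j's rank evenly among the pages it links to.
L = np.array([
    [0,   0,   1, 1/2],
    [1/3, 0,   0, 0  ],
    [1/3, 1/2, 0, 1/2],
    [1/3, 1/2, 0, 0  ],
])

d = 0.85                  # damping factor (standard PageRank choice)
n = L.shape[0]
G = d * L + (1 - d) / n   # "Google matrix": links plus uniform teleportation

r = np.full(n, 1 / n)     # start from a uniform rank vector
for _ in range(100):      # power iteration: repeated matrix-vector products
    r = G @ r             # converges to the dominant eigenvector of G

# r now holds the steady-state ranks; its entries sum to 1.
```

The punchline for students is that the ranking is an eigenvector computation: the loop converges because the dominant eigenvalue of a stochastic matrix is 1.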
About that time, Derek Bruff published this blog post about his use of Application Projects, and I gleefully…
April 22, 2013, 9:29 am
The Washington Post reports this morning (apologies if this is behind a paywall) about how some universities are (finally?) moving from in-class lecture as the basis for their “large lecture” courses to the flipped or inverted classroom. Says the article:
Colleges are absorbing lessons from the online education boom, including the growth of massive open online courses, or MOOCs. And some professors are “flipping” their classrooms to provide more content to students online and less through standard lectures.
William E. “Brit” Kirwan, chancellor of the University System of Maryland, said the system hopes the redesigned courses save money and boost performance.
“The passive, large lecture method of instruction is dead,” Kirwan said. “It’s just that some institutions don’t know it yet. We do.”
This is nice to hear, but watch out for that phrase, “saves money…
April 4, 2013, 4:48 pm
Screencasting is an integral part of the inverted classroom movement, and you can find screencasting even among courses that aren’t truly flipped. Using cheap, accessible tools for making and sharing video to clear out time for more student-active work during class makes screencasting very appealing. But does it work? Do screencasts actually help students learn?
We have lots of anecdotal evidence that suggests it does, but it turns out there are actually data as well that point in this direction. I’ve been reading an article by Katie Green, Tershia Pinder-Grover, and Joanna Mirecki Millunchick (of Michigan State University and the University of Michigan) from the October 2012 issue of the Journal of Engineering Education in which they studied 262 students enrolled in an engineering survey course that was augmented with screencasts. Here’s the PDF. This paper is full of interesting…
March 27, 2013, 8:00 am
In my series of posts on the flipped intro-to-proofs course, I’ve described the ins and outs of the design challenges of the course and how the course was run to address those challenges and the learning objectives. There’s really only one thing left to describe: How the course actually played out through the semester, and especially how the students responded.
I wasn’t sure how students in the course would respond to the inverted classroom structure. On the one hand, setting the course up so that students got time and support on the hardest tasks in the course, with the cognitive load outside of class optimized, was going to make a problematic course very doable for students. On the other hand, students might be so wedded to the traditional classroom setup that no amount of logic was going to prevail, and it would end up like my inverted MATLAB class did, where a
March 25, 2013, 8:00 am
I have a confession to make: At this point in the semester (week 11), there’s a question I get that nearly drives me to despair. That question is:
Can we see more examples in class?
Why does this question bug me so much? It’s not because examples are bad. On the contrary, the research shows (and this is surely backed up by experience) that studying worked examples can be a highly effective strategy for learning a concept. So I ought to be happy to hear it, right?
When people ask this question because they want to study an example, I’m happy. But studying an example and seeing an example are two radically different things. Studying an example means making conscious efforts to examine the example in depth: isolating the main idea or strategy, actively trying out modifications to the objects involved, making connections to previous examples and mathematical results, and – very …
March 20, 2013, 8:00 am
So, what about grading in that inverted transition-to-proofs course? Other than the midterm and final exams, which were graded pretty much as you might expect, we had four recurring assignments that required grading: Guided Practice, Quizzes, Classwork, and the Proof Portfolio. Let’s discuss the workflow and how it was all managed.
Let’s start with the easy stuff: Quizzes and Guided Practice. Quizzes were done using clickers, so the grading was trivial. Guided Practice was graded on the basis of completeness and effort only, on a scale of 0–2. So it was almost instantaneous to grade. Students would submit their work using a Google form that dumped their responses into a spreadsheet. I would just sort the spreadsheet in alphabetical order, look through for any glaring omissions or places where effort was lacking, and then put the grades right into Blackboard. A grade of “0”…
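The Guided Practice workflow described above can be sketched in a few lines of code. This is a hypothetical reconstruction, not my actual process: the file name, the “Last Name” column, and the `Q`-prefixed answer columns are stand-ins for whatever the Google form export actually contains, and the 0–2 rubric is the completeness-and-effort scale from the post.

```python
import csv

def grade_guided_practice(csv_path):
    """Read form responses, sort alphabetically, assign a 0-2 effort grade."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Sort by last name, mirroring "sort the spreadsheet in alphabetical order"
    rows.sort(key=lambda r: r["Last Name"].lower())
    grades = []
    for r in rows:
        # Hypothetical convention: answer columns are named Q1, Q2, ...
        answers = [v.strip() for k, v in r.items() if k.startswith("Q")]
        blanks = sum(1 for a in answers if not a)
        if blanks == len(answers):
            grade = 0   # nothing attempted
        elif blanks > 0:
            grade = 1   # glaring omissions or lacking effort
        else:
            grade = 2   # complete, good-faith effort
        grades.append((r["Last Name"], grade))
    return grades       # ready to transfer into the gradebook
```

In practice the last step (entering grades into Blackboard) stays manual, but having the responses pre-sorted and pre-screened is what makes the grading nearly instantaneous.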