Learning Catalytics and Data Mining Student Performance

December 13th, 2011

I previously blogged about the Khan Academy’s efforts to use data mining to find the most effective way to teach students. This article from The Chronicle of Higher Education discusses a cool new tool, Learning Catalytics, that takes a “Moneyball” approach to education.

This is absolutely amazing:

Classroom data-mining isn’t just taking off at rich universities like Harvard. In a community-college sector racked by budget cuts, one Arizona institution sifts through data on student behavior in online courses to figure out who is at risk of underperforming or dropping out—and how to help.

By the eighth day of class, Rio Salado College predicts with 70-percent accuracy whether a student will score a C or better in a course.

That’s possible because a Web course can be like a classroom with a camera rigged over every desk. The learning software logs students’ moves, leaving a rich “clickstream” for data sleuths to manipulate.

Running the algorithms, officials found clusters of behaviors that helped predict success. Did a student log in to the course homepage? View the syllabus? Open an assessment? When did she turn in an assignment? How does his behavior compare with that of previous students?

The college translated that analysis into a warning system that places color-coded icons beside students’ names in the course-management system. Shannon F. Corona, a seven-year online teaching veteran who is faculty chair of the physical-science department at Rio Salado, says the alerts improved her outreach. Before, she knew which students were doing great. She also knew which had tuned out. But she had a harder time pinpointing those in between, struggling yet still trying.

Now, when Ms. Corona logs in to her Chemistry 130 course, she takes students’ temperature with a glance. The software flags them as green (likely to complete the course with a C or better), yellow (at risk of not earning a C), and red (highly unlikely to earn a C). If she hovers her mouse over the color, she gets more details. For one student flagged as yellow, for example, the system reports that he is doing an excellent job logging in to the class and a good job engaging with lessons, but falling behind when it comes to the pace of assignment submissions.

That might be the online equivalent of a student who shows up to class but struggles with the content, she says. Ms. Corona e-mails yellow-tagged students asking if they’d like her help or a tutor’s.

“Especially for online students, they sometimes feel isolated,” she says. “And a lot of instructors, just because of how the system is set up, you might miss it. You don’t really know where they are, how they’re doing, because they haven’t asked you any questions.”

But can you change a student’s trajectory? The college has experimented with various intervention strategies, so far with mixed results. For example, early data showed students in general-education courses who log in on Day 1 of class succeed 21 percent more often than those who don’t. So Rio Salado blasted welcome e-mails to students the night before courses began, encouraging them to log in.

The next step is a widespread rollout of the color-coded alerts, one that will put the technology in the hands of many more professors and students. The hope, Ms. Corona says, is that a yellow signal might prompt students to say to themselves: “Gosh, I’m only spending five hours a week in this course. Obviously students who have taken this course before me and were successful were spending more time. So maybe I need to adjust my schedule.”
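The warning system described above can be sketched as a simple scoring model: day-8 clickstream behaviors go in, a probability of earning a C or better comes out, and thresholds map that probability onto the green/yellow/red flags. Everything below (the feature names, weights, and cutoffs) is invented for illustration; the article does not disclose Rio Salado's actual model.

```python
import math

def success_probability(features, weights, bias=0.0):
    """Logistic model: day-8 clickstream behaviors -> P(C or better)."""
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))

def flag(prob):
    """Map a predicted probability onto the color-coded alert."""
    if prob >= 0.75:
        return "green"   # likely to complete the course with a C or better
    if prob >= 0.50:
        return "yellow"  # at risk of not earning a C
    return "red"         # highly unlikely to earn a C

# Invented weights over the behaviors the article mentions.
weights = {"logged_in_day1": 0.9, "viewed_syllabus": 0.4,
           "opened_assessment": 0.6, "assignments_on_pace": 1.2}

# The article's example yellow student: engaged, but behind on assignments.
student = {"logged_in_day1": 1, "viewed_syllabus": 1,
           "opened_assessment": 1, "assignments_on_pace": 0}
prob = success_probability(student, weights, bias=-1.0)
print(round(prob, 2), flag(prob))  # roughly 0.71, flagged yellow
```

In a real system the weights would come from fitting the model to previous students' clickstreams and final grades, which is presumably where Rio Salado's 70-percent figure comes from.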

I would love to incorporate some of these technologies in class.

Another set of data-driven experiments involves how to teach students once they start taking college classes, such as the one at Harvard where the computer picks study partners.

That Learning Catalytics system grew out of technology developed in Eric Mazur’s physics class. It marks the latest effort in the Harvard professor’s long campaign to perfect the art of interactive teaching. Science instructors around the world have adopted his Peer Instruction method, and the technique helped popularize the classroom-response devices known as “clickers.”

Mr. Mazur argues that his new software solves at least three problems. One, it selects student discussion groups. Two, it helps instructors manage the pace of classes by automatically figuring out how long to leave questions open so the vast majority of students will have enough time. And three, it pushes beyond the multiple-choice problems typically used with clickers, inviting students to submit open-ended responses, like sketching a function with a mouse or with their finger on the screen of an iPad.

“This is grounded on pedagogy; it’s not just the technology,” says Mr. Mazur, a gadget skeptic who feels technology has done “incredibly little to improve education.”
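Two of those mechanics can be sketched in a few lines: pair up students who gave different clicker answers (the core move of Peer Instruction is to make them argue it out), and keep a question open until a target share of the class has responded. The pairing rule and the 90-percent cutoff below are my guesses; the article does not describe Mazur's actual algorithms.

```python
def form_discussion_pairs(responses):
    """Group students so that, where possible, each pair contains
    students who answered the question differently."""
    by_answer = {}
    for student, answer in responses.items():
        by_answer.setdefault(answer, []).append(student)
    # Round-robin across answer groups so adjacent students usually disagree.
    buckets = sorted(by_answer.values(), key=len, reverse=True)
    ordered = []
    while any(buckets):
        for bucket in buckets:
            if bucket:
                ordered.append(bucket.pop())
    return [tuple(ordered[i:i + 2]) for i in range(0, len(ordered), 2)]

def keep_question_open(answered, enrolled, target=0.9):
    """Leave the question open until the vast majority have responded."""
    return answered / enrolled < target

pairs = form_discussion_pairs(
    {"Ann": "A", "Ben": "B", "Cal": "A", "Dee": "B"})
print(pairs)  # two pairs, each mixing an "A" student with a "B" student
print(keep_question_open(7, 10), keep_question_open(9, 10))  # True False
```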

Students may not even need to pick their own colleges!

Similar ideas are flourishing in the world of admissions. One company getting buzz is ConnectEDU, sometimes described as an eHarmony for college matchmaking. Its founder, Craig Powell, dreams that students won’t even have to apply to college “because an algorithm will have already told them and the schools where they would fit best,” as The Atlantic reported recently.

Mr. Powell hopes to make that happen by plugging high schools and colleges in to an online platform that feels a lot like Facebook. And like Facebook, its news feed and customized recommendations hinge on vast amounts of information: over 250 data points for each student, including high-school academic records, standardized test scores, financial circumstances, career ambitions, and geographic locations. So far, 2.5 million high-school students have ConnectEDU profiles.
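A minimal sketch of what matchmaking by algorithm could look like: score each college by weighted agreement with a student's profile, then rank the results. The fields, weights, and scoring rule here are invented stand-ins; ConnectEDU's actual 250-plus-point model is not public.

```python
def fit_score(student, college, weights):
    """Weighted count of profile fields on which student and college agree."""
    return sum(w for field, w in weights.items()
               if student.get(field) == college.get(field))

def best_matches(student, colleges, weights, top=3):
    """Rank colleges by fit, so a student is shown where the
    algorithm says they would fit best."""
    ranked = sorted(colleges,
                    key=lambda c: fit_score(student, c, weights),
                    reverse=True)
    return [c["name"] for c in ranked[:top]]

# Hypothetical profile fields and weights.
weights = {"region": 1.0, "size": 0.5, "strong_in": 2.0, "aid_generous": 1.5}
student = {"region": "midwest", "size": "small",
           "strong_in": "physics", "aid_generous": True}
colleges = [
    {"name": "Alpha College", "region": "midwest", "size": "small",
     "strong_in": "physics", "aid_generous": True},
    {"name": "Beta University", "region": "south", "size": "large",
     "strong_in": "physics", "aid_generous": True},
]
print(best_matches(student, colleges, weights, top=2))
```

With many more fields per student, the same idea scales into the Facebook-style recommendation feed the article describes.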

H/T Gary Rosin