Colvin, K., Champaign, J., Liu, A., Zhou, Q., Fredericks, C., & Pritchard, D. (2014). Learning in an introductory physics MOOC: All cohorts learn equally, including an on-campus class. The International Review of Research in Open and Distance Learning, 15(4). ISSN 1492-3831. http://www.irrodl.org/index.php/irrodl/article/view/1902
We studied student learning in the MOOC 8.MReV Mechanics ReView, run on the edX.org open-source platform. We studied learning in two ways. We administered 13 conceptual questions both before and after instruction, analyzing the results using standard techniques for pre- and posttesting. We also analyzed each week’s homework and test questions in the MOOC, including the pre- and posttests, using item response theory (IRT). This determined both an average ability and a relative improvement in ability over the course. The pre- and posttesting showed substantial learning: The students had a normalized gain slightly higher than typical values for a traditional course, but significantly lower than typical values for courses using interactive engagement pedagogy. Importantly, both the normalized gain and the IRT analysis of pre- and posttests showed that learning was the same for different cohorts selected on various criteria: level of education, preparation in math and physics, and overall ability in the course. We found a small positive correlation between relative improvement and prior educational attainment. We also compared homework performance of MIT freshmen taking a reformed on-campus course with the 8.MReV students, finding them to be considerably less skillful than the 8.MReV students. (p. 1)
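The IRT analysis described above can be illustrated with a minimal sketch. The paper does not publish its model code, so the following is only an assumed example using the simplest IRT variant, the one-parameter (Rasch) model, with an illustrative ability estimator; the function names, item difficulties, and response vector are all hypothetical.

```python
import math

def rasch_prob(theta, b):
    # 1PL (Rasch) item response function: probability that a student
    # of ability theta answers an item of difficulty b correctly.
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iters=50):
    # Maximum-likelihood ability estimate via Newton-Raphson.
    # responses: list of 0/1 scored answers; difficulties: known item
    # parameters. The Rasch log-likelihood is concave in theta, so
    # Newton's method converges for any mixed (not all-0/all-1) pattern.
    theta = 0.0
    for _ in range(iters):
        p = [rasch_prob(theta, b) for b in difficulties]
        grad = sum(r - pi for r, pi in zip(responses, p))   # score function
        hess = -sum(pi * (1.0 - pi) for pi in p)            # observed information
        if abs(grad) < 1e-9:
            break
        theta -= grad / hess
    return theta
```

Fitting such a model separately to early-course and late-course items is one way an "average ability" and a "relative improvement" over the course, as described above, could be quantified.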
Summary and Conclusions (p. 9)
We have studied conceptual learning in a MOOC by analyzing the results of pre- and posttesting in two ways: normalized gain and item response theory (IRT). Both methods show unequivocal evidence of learning. The amount of learning (normalized gain, 0.31 ± 0.02) was higher than in any of the 14 traditional (i.e., lecture-based) courses studied by Hake (1998), but was in the lowest decile of courses whose classes included “interactive engagement” activities. (p. 9)
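The normalized gain reported above is Hake's standard measure: the fraction of the possible improvement that the class actually achieved. A minimal sketch, assuming class-average percentage scores on the same conceptual test (the pre/post values in the usage example are illustrative, not from the paper):

```python
def normalized_gain(pre_pct, post_pct):
    # Hake's normalized gain g = (post - pre) / (100 - pre),
    # where pre_pct and post_pct are class-average percentage scores
    # on the same conceptual test given before and after instruction.
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Illustrative only: a class averaging 40% before and 58.6% after
# instruction has g = 18.6 / 60 = 0.31, matching the value reported
# for 8.MReV (though the actual pre/post averages are not given here).
g = normalized_gain(40.0, 58.6)
```

Because g is normalized by the room left for improvement, it allows comparison across classes with different incoming preparation, which is why Hake (1998) used it to compare traditional and interactive-engagement courses.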
Indeed, we have made a preliminary investigation (Champaign et al., 2014) finding significant positive correlations with time spent on several different resources, but with little differentiation between them. A second factor that might affect learning is study patterns; for example, we found dramatically different patterns of resource use when students did homework versus exams (Seaton, Bergner, Chuang, Mitros, & Pritchard, 2014). This raises the question of whether students following these (or other) patterns will show more or less learning. (p. 9)
Given the different demographics of those registering, the different objectives of each course, and the significantly smaller percentage of certificate earners in these other MOOCs, direct comparisons will be challenging. (p. 10)