
References

Citekey: @ebben13unpacking

Ebben, M. and Murphy, J. S. (2014). Unpacking MOOC scholarly discourse: a review of nascent MOOC scholarship. Learning, Media and Technology, 39(3):328–345.

Notes

A review of MOOC research is useful for someone who is new to the topic (and its history). Based on a review of about two dozen journal articles, the authors identify two phases of MOOC scholarly discourse and characterize each phase by its key themes.

Frankly, I expected more from this article. The identification of these two phases does not stretch far beyond the distinction already made between cMOOCs and xMOOCs. Also, excluding conference proceedings could be a serious limitation of this review, because a lot of good work has not yet made it into journal articles, and such work is not necessarily of lower quality.

Highlights

Two key phases of scholarship about MOOCs are identified, each with associated research imperatives and themes. Phase One: Connectivist MOOCs, Engagement and Creativity 2009–2011/2012. (p. 2)

Phase Two: xMOOCs, Learning Analytics, Assessment, and Critical Discourses about MOOCs 2012–2013. (p. 2)

This paper identifies a solid body of empirically based MOOC research and analyses this area of inquiry, its authors, content, and direction. (p. 3)

Cite
Just a side note: The paper should cite the original distinction between cMOOCs and xMOOCs. (p. 3)

Data collection (p. 3)

Articles for analysis were drawn from a comprehensive search of nine leading academic databases starting in the year 2002 to 1 July 2013. (p. 3)

The databases searched were: Academic Search Complete, Communication and Mass Media Complete, Directory of Open Access Journals, Education Full Text, ERIC, Google Scholar, MathSciNet, Science Direct, and Web of Science. (p. 4)

Data collection efforts yielded 25 scholarly articles with a research focus on MOOCs published between 2009 and 2013. One article was published in 2009, no articles were published in 2010, 10 articles were published in 2011 and 2012, and 14 articles (56% of the total) were published in the first half of 2013 (Table 1). (p. 4)

Research methodologies used were diverse, employing both qualitative and quantitative approaches such as content analysis, surveys, interviews, self-reflection, data-mining techniques applied to MOOC courses, and social network analyses of MOOC participants’ interactions and patterns of communication. (p. 6)

Scholars are beginning to formulate conceptual frameworks to organize the nascent research literature about MOOCs. (p. 6)

Liyanagunawardena, Adams, and Williams (2013) identify eight themes apparent in MOOC scholarship. These include: (1) Introductory (explaining aspects of MOOCs), (2) Concept (discussion of threats and opportunities of MOOCs in higher education), (3) Case Studies, (4) Educational theory (pedagogical approaches used), (5) Technology (discussion of hardware and software used), (6) Participant focused (discussion of participants’ experiences), (7) Provider focused (discussion of course creators and leaders), and (8) Other. (p. 6)

Two key phases of MOOC scholarship are identified, each with associated themes. Phase One: Connectivist cMOOCs, Engagement and Creativity 2009–2011/2012. Themes of Phase One include: development of Connectivism as a learning theory, and technological experimentation and innovation with Connectivism in early cMOOCs. Phase Two: xMOOCs, Learning Analytics and Assessment 2012–2013. Themes of Phase Two include: the rise of xMOOCs, further development of MOOC pedagogy, growth of learning analytics and assessment, and the emergence of a critical discourse about MOOCs. (p. 6)

Phase one: Connectivist cMOOCs, engagement and creativity 2009–2012 (p. 7)

Early MOOC researchers sought to operationalize the learning theory of Connectivism by building and running cMOOCs with a few researchers (Kop 2011; DeWaard et al. 2011) doubling as both designers and facilitators of the MOOCs under study. (p. 7)

Nearly all of the cMOOCs described in the research of this period were affiliated with Canadian distance education university programs. These include: PLENK 2010 (Personal Learning Environments, Networks and Knowledge), CCK 08, 09, 11 (Connectivism and Connective Knowledge), CRITLIT 2010 (Critical Literacies), and MobiMOOC 2011. (p. 7)

Participant interaction and modular roles for educators and participants within an open network were emphasized in these early cMOOCs. None of the four cMOOCs analyzed in the research, however, had the goal of delivering standard university course content (e.g., Math 101). (p. 8)

Although the majority of the 1580 participants in this cMOOC were disengaged, a core group of participants, mostly educators, were highly engaged and attributed their engagement as producers of digital content to motivation and social support. (p. 9)

There is little critique of the limits of Connectivism in this first phase of MOOC research; however, Bell (2011) is the exception. He does not believe Connectivism can always be applied successfully to online teaching, and suggests that Connectivism fits some learning environments better than others. Bell rejects Connectivism as a theory, and argues that it is best understood as a phenomenon rather than as a full-fledged theory of learning. (p. 9)

Phase two: xMOOCs, learning analytics, assessment, and critical discourse 2012–2013 (p. 9)

Based on this analysis of differences, as well as drawing on the classifications proposed by Anderson and Dron (2011), Rodriguez argues that the Stanford MOOC is a different sort of massive online course, an xMOOC. (p. 10)

From an epistemological point of view, Rodriguez argues that cMOOCs are characterized by generative knowledge, whereas xMOOCs are characterized by declarative knowledge. (p. 10)

The concept and practice of ‘openness’ in MOOCS also gets differentiated between the two MOOC formats by Rodriguez (2013). (p. 10)

For xMOOCs, openness means open access to anyone. For cMOOCs, openness in a connected environment constitutes the locus and practice of knowledge acquisition and production. (p. 11)

Learning analytics (p. 11)

New note
I would say ‘learning analytics’ as discussed in this article represents a narrow conception of the field. (p. 11)

Predictive algorithms and adaptive feedback mechanisms hardwired into MOOCs track, record, and analyze every click a student makes to generate data aimed at understanding the ways in which students learn in MOOCs (McKay 2013). (p. 11)

Another aim of learning analytics is to correlate student characteristics (age, gender, nationality, etc.) with achievement in MOOC courses. (p. 12)

Learning analytics research also focuses on discerning micro-patterns of behavior about the ways in which students use course materials available in MOOCs. For example, analyses track the time students spend watching lecture videos, doing homework, completing labs, engaging in discussion boards, consulting the textbook, and other course resources. The concept is to identify high-performing students and map their activities and behaviors in the MOOC to determine the best presentation sequence of course materials. Researchers seek to identify when certain resources might be most useful and in what ways. For example, what resources do the best MOOC students use for different tasks? What behaviors do they perform when preparing for an exam? What behaviors do they engage in when working on homework? Learning analytics compares discussion board behavior of high achieving students with others. Who asked questions? Who answered questions or made a comment? Does discussion board communication correlate with course success? (p. 12)
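New note
As a reader’s aside, the kind of behavior-to-outcome analysis described above can be illustrated with a minimal sketch (pandas assumed). The data and column names ("questions_asked", "answers_posted", "video_minutes", "final_grade") are entirely hypothetical, not from the paper; this only illustrates the general analysis style, not any study’s actual method.

```python
import pandas as pd

# Hypothetical per-student activity and outcome data (not from the paper)
events = pd.DataFrame({
    "student_id":      [1, 2, 3, 4, 5],
    "questions_asked": [3, 0, 7, 1, 0],
    "answers_posted":  [5, 0, 12, 2, 1],
    "video_minutes":   [340, 45, 510, 120, 60],
    "final_grade":     [88, 42, 95, 61, 50],
})

behaviors = ["questions_asked", "answers_posted", "video_minutes"]

# Does discussion-board and video activity correlate with course success?
print(events[behaviors].corrwith(events["final_grade"]))

# Compare high achievers (top quartile by grade) with everyone else
top = events["final_grade"] >= events["final_grade"].quantile(0.75)
print(events.loc[top, behaviors].mean())
print(events.loc[~top, behaviors].mean())
```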

‘best’ is operationalized as learning, growth, and persistence. (p. 12)

According to Breslow et al. (2013), the next phase of data-mining research will focus on further development of models and analytics to offer understanding about the ways in which student background and other patterns of student engagement with course content either assist or hinder students’ ability to persist and complete the course, including attempting to understand the high dropout rates of MOOCs. (p. 12)
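New note
A minimal sketch of the kind of persistence model gestured at here, assuming scikit-learn and entirely hypothetical features (prior courses taken, forum posts, videos watched) and labels. This is not Breslow et al.’s actual model, just an illustration of predicting completion from background and engagement data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per student: [prior_courses, forum_posts, videos_watched]
X = np.array([
    [0,  1,  3],
    [2, 14, 25],
    [1,  0,  5],
    [3, 22, 40],
    [0,  2,  8],
    [2,  9, 30],
])
# 1 = completed the course, 0 = dropped out (hypothetical labels)
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Estimated probability of completion for a new, hypothetical student
print(model.predict_proba(np.array([[1, 5, 10]]))[0, 1])
```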

Assessment (p. 13)

To enrich the pedagogical repertoire, student writing is attempted in some MOOCs, but assessment remains an obstacle. (p. 14)

Two forms of essay assessment are currently described in research on MOOCs: Automated Essay Scoring (AES) software used by edX, and University of California – Los Angeles Calibrated Peer Review™ (CPR) used by Coursera. (p. 14)
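New note
The paper does not describe how either tool works internally. As a toy illustration of the general idea behind calibrated peer scoring (not UCLA’s actual CPR algorithm, and not edX’s AES), a reviewer’s scores can be weighted by how closely they matched instructor-graded calibration essays; all numbers below are made up.

```python
def calibration_weight(reviewer_scores, instructor_scores, max_score=10):
    """Weight in [0, 1]: 1 means perfect agreement on the calibration essays."""
    errors = [abs(r - i) for r, i in zip(reviewer_scores, instructor_scores)]
    return 1 - (sum(errors) / len(errors)) / max_score

def calibrated_score(peer_reviews):
    """peer_reviews: list of (score, weight) pairs; returns a weighted average."""
    total_weight = sum(w for _, w in peer_reviews)
    return sum(s * w for s, w in peer_reviews) / total_weight

# Hypothetical example: three peers rate the same calibration essays the
# instructor scored as [8, 6, 9], then grade one student essay
w1 = calibration_weight([8, 5, 9], [8, 6, 9])  # close to the instructor
w2 = calibration_weight([3, 9, 2], [8, 6, 9])  # far from the instructor
w3 = calibration_weight([7, 6, 8], [8, 6, 9])
print(calibrated_score([(7, w1), (4, w2), (8, w3)]))
```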

Critical discourse about MOOCs (p. 14)

Critiques of MOOCs are being articulated that identify problematics related to MOOC epistemology, pedagogy, and cultural hegemony (Rhoads et al. 2013). (p. 14)

The problem of epistemology concerns the narrow view of knowledge employed in some MOOCs that view knowledge as a product to be transmitted to anyone with an internet connection and a computer. (p. 14)

The problem of pedagogy refers to a limited understanding of what constitutes empowering teaching. (p. 15)
