References

Citekey: @siemens2013learning

Siemens, G. (2013). "Learning analytics: The emergence of a discipline." American Behavioral Scientist, 57(10), 1380–1400.

Notes

Highlights

The challenges facing LA as a field are also reviewed, particularly regarding the need to increase the scope of data capture so that the complexity of the learning process can be more accurately reflected in analysis. Privacy and data ownership will become increasingly important for all participants in analytics projects. (p. 1380)

When P. W. Anderson stated in 1972 that “more is different,” he argued that the quantity of an entity influences how researchers engage with it. (p. 1381)

The emphasis on large quantities of data for discovery has important implications for education. (p. 1381)

for researchers, learning sciences, and education in general, data trails offer an opportunity to explore learning from new and multiple angles. (p. 1381)

The view that data and analytics offer a new mode of thinking and a new model of discovery is at least partially rooted in the artificial intelligence and machine learning fields. (p. 1381)

This article reviews the historical developments of learning analytics as a field, tools and techniques used by practitioners and researchers, and challenges with broadening the scope of data capture, modeling knowledge domains, and building organizational capacity to use analytics. (p. 1382)

Defining Learning Analytics and Tracing Historical Roots (p. 1382)

the following definition offered at the 1st International Conference on Learning Analytics: Learning analytics is the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs. (p. 1382)

Other definitions are less involved and draw language from business intelligence: Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data. (Cooper, 2012b) (p. 1382)

Where LA is more concerned with sensemaking and action, educational data mining (EDM) is more focused toward developing methods for “exploring the unique types of data that come from educational settings”. Although the techniques used are similar in both fields, EDM has a more specific focus on reductionist analysis (Siemens & Baker, 2012). As LA draws from and extends EDM methodologies (Bienkowski, Feng, & Means, 2012, p. 14), it is a reasonable expectation that the future development of analytic techniques and tools from both communities will overlap. (p. 1382)

Analytics in education can also be viewed as existing at various levels, ranging from the individual classroom and department to the university, region, state/province, and international level. (p. 1382)

Historical Contributions to LA (p. 1383)

The following is a brief summary of the diversity of fields and research activities within education that have contributed to the development of learning analytics: (p. 1383)

LS
Surprised that the field of learning sciences did not get mentioned by Siemens. (p. 1384)

academic analytics involved the adaptation of business intelligence (BI) to the academic sector (Goldstein, 2005). While sometimes referred to as LA, the BI roots of academic analytics are more concerned with improving organizational processes, such as personnel management or resource allocation, and improving efficiency within the university. (p. 1384)

LA Tools, Techniques, and Applications (p. 1384)

Tools (p. 1385)

Rough classification
Dividing LA tools into commercial and free/research categories seems rough to me. Because so many tools could potentially be applied to LA, the boundary is not always clear-cut, and the dichotomy does not contribute much to our thinking about LA tools. (p. 1385)

Learning analytics tools can be broadly grouped into two categories: commercial and research. (p. 1385)

Several prominent analytics tools already rely on data captured in an LMS. For example, Purdue University’s Signals (Arnold, 2010) and University of Maryland–Baltimore County’s “Check My Activity” (Fritz, 2010) both rely on data generated in Blackboard. Recommender systems, such as Degree Compass (Denley, 2012), similarly draw on data captured in existing information technology systems in universities. (p. 1385)

Techniques and Applications (p. 1386)

LA has two overlapping components: techniques and applications. Techniques involve the specific algorithms and models for conducting analytics. Applications involve the ways in which techniques are used to impact and improve teaching and learning. (p. 1386)

The distinction between a technique and an application is not absolute but instead reflects the focus of researchers. (p. 1386)

Both, however, are important in advancing LA as a field. (p. 1386)

Baker and Yacef (2009) address the technique dimension of LA/EDM in listing five primary areas of analysis:

- Prediction
- Clustering
- Relationship mining
- Distillation of data for human judgment
- Discovery with models

Bienkowski, Feng, and Means (2012) offer five areas of LA/EDM application:

- Modeling user knowledge, behavior, and experience
- Creating profiles of users
- Modeling knowledge domains
- Trend analysis
- Personalization and adaptation

(p. 1386)
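To make the “prediction” category concrete for myself: below is a minimal sketch, with entirely synthetic data and invented feature names, of the kind of early-warning model that tools like Purdue’s Signals embody. It illustrates the technique only; it is not anyone’s actual implementation, and in practice the features would come from LMS logs and the labels from course outcomes.

```python
# Illustrative only: predict "at-risk" status from synthetic LMS activity
# features, in the spirit of Baker and Yacef's "prediction" category.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Synthetic features: logins/week, forum posts, resources viewed, quiz average
X = np.column_stack([
    rng.poisson(5, n),     # logins per week
    rng.poisson(2, n),     # forum posts
    rng.poisson(10, n),    # resources viewed
    rng.uniform(0, 1, n),  # quiz average
])

# Synthetic label: lower activity and quiz scores raise the at-risk odds
logit = 2.0 - 0.2 * X[:, 0] - 0.3 * X[:, 1] - 0.05 * X[:, 2] - 2.0 * X[:, 3]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```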

Baker and Yacef’s model details various types of data mining activity that the researcher conducts, whereas Bienkowski et al.’s model is focused on application. (p. 1386)

Figure 1. Historical influences in development of learning analytics. (p. 1387)

Scope of Data Capture (p. 1387)

To date, LA has relied heavily on two sources: student information systems (SIS; in generating learner profiles) and learning management systems (in tracking learner behavior and using it for prediction). (p. 1387)

“sensor-based modeling of human communication networks” (Choudhury & Pentland, 2003) (p. 1387)

Table 1. Learning Analytics (LA) Techniques and Applications. (p. 1388)

Other approaches include “passive acquisition” of “physical activity data” through “pedometers, heart rate monitors, accelerometers, and distance trackers” (Lee & Thomas, 2011, p. 867) (p. 1388)

Lecture hall data are limited to a few variables: who attended, seating patterns, student response system data, and observational data recorded by faculty or teaching assistants. By contrast, when learners watch a video lecture, data sources are richer, including frequency of access, playback, pauses, and so on. (p. 1389)
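A quick hypothetical sketch of what that richer video data can look like, and the kind of metrics it supports. The event format is invented, not taken from any particular platform:

```python
# Hypothetical playback event log for one learner watching one video.
# Each event: (action, video_time_seconds), in the order it occurred.
events = [
    ("play", 0), ("pause", 95), ("seek", 60),  # rewind to re-watch a segment
    ("play", 60), ("pause", 180), ("play", 180), ("pause", 300),
]

# Simple derived metrics of a kind unavailable in a lecture hall.
pauses = sum(1 for action, _ in events if action == "pause")
rewinds = sum(
    1 for (a1, t1), (a2, t2) in zip(events, events[1:])
    if a2 == "seek" and t2 < t1
)
print(f"pauses: {pauses}, backward seeks: {rewinds}")
```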

Knowledge Domain Modeling (p. 1389)

the authors argue for the need for “data structures and computational techniques” (Hendler & Berners-Lee, 2010, p. 158) to enable human-computer interactions that provide a new level of intelligence and problem solving. (p. 1389)

Once knowledge domains have been articulated or mapped, learner data, profile information, and curricular data can be brought together and analyzed to determine learner knowledge in relation to the knowledge structure of a discipline. (p. 1389)
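A minimal sketch of this idea, under the assumption of a simple Q-matrix-style mapping from assessment items to concepts; the items, concepts, and responses below are all made up:

```python
# Hypothetical Q-matrix: which concepts each assessment item touches.
q_matrix = {
    "item1": ["fractions"],
    "item2": ["fractions", "ratios"],
    "item3": ["ratios"],
    "item4": ["percentages"],
}

# One learner's graded responses (1 = correct, 0 = incorrect).
responses = {"item1": 1, "item2": 0, "item3": 1, "item4": 1}

# Naive per-concept mastery: proportion correct among items touching it.
from collections import defaultdict
hits, totals = defaultdict(int), defaultdict(int)
for item, concepts in q_matrix.items():
    for concept in concepts:
        totals[concept] += 1
        hits[concept] += responses[item]

for concept in totals:
    print(f"{concept}: {hits[concept] / totals[concept]:.2f}")
```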

Organizational Capacity (p. 1390)

Additionally, effective analytics practices require organizational support. If analytics is to have an impact on how a university supports its learners, great inter- and intrainstitutional collaborations are required. (p. 1390)

faculty support and navigating “the realities of university culture” (p. 160). The insights gained through analytics require broad support organizationally. Prior to launching a project, organizations will benefit from taking stock of their capacity for analytics and willingness to have analytics have an impact on existing processes. In this context, Greller and Drachsler (2012, p. 43) outline six dimensions that must be considered “to ensure appropriate exploitation of LA in an educationally beneficial way”:

- Stakeholders: Those who are interested in or impacted by analytics
- Objectives: Goal or intent of analytics
- Data: Data sets and sources
- Instruments: Tools and technologies
- External limitations: Ethical, legal, managerial/organizational
- Internal limitations: Acceptance of analytics and skill level or competencies to perform analytics within an organization

(p. 1391)

The effective process and operation of learning analytics require institutional change that does not just address the technical challenges linked to data mining, data models, server load, and computation but also addresses the social complexities of application, sensemaking, privacy, and ethics alongside the development of a shared organizational culture framed in analytics. (p. 1391)

Figure 2. Learning analytics model. (p. 1392)

The learning analytics model (LAM) includes seven components: collection, storage, data cleaning, integration, analysis, representation and visualization, and action. (p. 1392)
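The seven components read naturally as a pipeline. Here is my own sketch of that flow; every function is a hypothetical placeholder, not anything Siemens specifies:

```python
# A sketch of the seven LAM components as a linear pipeline.
# Every function below is a hypothetical placeholder.

def collect():            # 1. collection: pull raw events from LMS, SIS, etc.
    return [{"user": "s1", "clicks": "42"}, {"user": "s2", "clicks": None}]

def store(raw):           # 2. storage: persist raw records (here, in memory)
    return list(raw)

def clean(records):       # 3. data cleaning: fix types, drop incomplete rows
    return [{**r, "clicks": int(r["clicks"])}
            for r in records if r["clicks"] is not None]

def integrate(records):   # 4. integration: merge sources into learner views
    return {r["user"]: r for r in records}

def analyze(profiles):    # 5. analysis: e.g., flag low-activity learners
    return {u: p["clicks"] < 10 for u, p in profiles.items()}

def visualize(flags):     # 6. representation & visualization: report results
    for user, at_risk in flags.items():
        print(f"{user}: {'at risk' if at_risk else 'on track'}")

def act(flags):           # 7. action: pick learners to intervene with
    return [u for u, risk in flags.items() if risk]

flags = analyze(integrate(clean(store(collect()))))
visualize(flags)
print("intervene with:", act(flags))
```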

Challenges (p. 1392)

The most significant challenges facing analytics in education are not technical. Concerns about data quality, sufficient scope of the data captured to reflect accurately the learning experience, privacy, and ethics of analytics are among the most significant concerns (see Slade & Prinsloo, 2013). (p. 1392)

Data Quality and Scope (p. 1392)

Data interoperability “imposes a challenge to data mining and analytics that rely on diverse and distributed data” (Bienkowski et al., 2012, p. 38) (p. 1393)

Suthers and Rosen (2011) capture the challenge when stating “since interaction is distributed across space, time, and media, and the data comes in a variety of formats, there is no single transcript to inspect and share, and the available data representations may not make interaction and its consequences apparent” (p. 65). (p. 1393)
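The interoperability problem Suthers and Rosen describe shows up in miniature as soon as you try to merge even two log formats. The sketch below normalizes two invented event formats into one shared schema; real efforts in this space (e.g., standards like xAPI or IMS Caliper) tackle the same issue at scale:

```python
from datetime import datetime, timezone

# Two invented source formats for learner interaction events.
lms_event = {"uid": "s1", "obj": "video-7", "ts": "2013-05-01T10:00:00Z"}
forum_event = {"user_id": "s1", "thread": 42, "time": 1367402400}

def normalize_lms(e):
    return {
        "actor": e["uid"],
        "object": e["obj"],
        "timestamp": datetime.fromisoformat(e["ts"].replace("Z", "+00:00")),
    }

def normalize_forum(e):
    return {
        "actor": e["user_id"],
        "object": f"thread-{e['thread']}",
        "timestamp": datetime.fromtimestamp(e["time"], tz=timezone.utc),
    }

# One shared schema makes cross-system analysis possible.
events = [normalize_lms(lms_event), normalize_forum(forum_event)]
events.sort(key=lambda e: e["timestamp"])
for e in events:
    print(e["actor"], e["object"], e["timestamp"].isoformat())
```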

Privacy (p. 1394)

With interactions online reflecting a borderless and global world for information flow, any approach to data exchange and data privacy requires a global view (World Economic Forum, 2011, p. 33). (p. 1394)

ownership of, and access to, data is only one aspect that educators need to consider. (p. 1394)

The analysis of data presents a secondary concern. Who has access to analytics? Should a student be able to see what an institution sees? Given variations in privacy laws, should educators be able to see the analytics performed on students in different courses? On graduation, should analytics be made available to prospective employers? When a learner transfers to a different program or a different university, what happens to his or her data? How long does a university keep those data, and can they be shared with other universities? These and numerous equally intractable problems will need to be addressed. (p. 1394)

The Dark Side (p. 1395)

The potential of LA to provide educators with actionable insight into teaching and learning is clear. The implications of heavy reliance on analytics are less clear. Ellul (1964) stated that technique and technical processes strive for the “mechanization of everything it encounters” (p. 12). Ellul’s comments remind us of the need to keep human and social processes central in LA activities. (p. 1395)

The learning process is essentially social and cannot be completely reduced to algorithms. (p. 1395)

The learning process is creative, requiring the generation of new ideas, approaches, and concepts. Analytics, in contrast, is about identifying and revealing what already exists. (p. 1395)

Bodong Chen, University of Minnesota