read

References

Citekey: @Clow2013

Clow, D. (2013). An overview of learning analytics. Teaching in Higher Education, 18(6), 683–695. doi:10.1080/13562517.2013.827653

Notes

Highlights

It is rooted in the dramatic increase in the quantity of data about learners and linked to management approaches that focus on quantitative metrics, which are sometimes antithetical to an educational sense of teaching. (p. 1)

It argues that teachers can and should engage with learning analytics as a way of influencing the metrics agenda towards richer conceptions of learning and to improve their teaching. (p. 1)

There is a tension between the framing of education as an economic activity and conceptions of education and learning that are concerned with the development of meaning and the transformation of understanding. (p. 1)

The framing of education as an economic activity supports the view of educational institutions as businesses. Business Intelligence is increasingly applied not only in higher education, in areas such as outreach and advertising, enrolment, management and fund-raising, but also in more academic areas. ‘Dashboards’ showing performance metrics against targets are increasingly popular with senior managers and political pressures, such as the current focus on college completion in the USA, reinforce this direction. (p. 2)

These developments are not always welcomed by teachers. (p. 2)

Learning analytics is the application of these Big Data techniques to improve learning. Learning analytics is currently a fixture in educational horizon-scanning reports (see e.g. Johnson et al. 2011; Johnson, Adams, and Cummins 2012; Sharples et al. 2012) and in a raft of other publications aimed at practitioners and aspiring practitioners from organisations concerned with technology in education, such as Educause (http://www.educause.edu/library/analytics), JISC (http://jisc.cetis.ac.uk/topic/analytics) and SURF (http://www.surf.nl/en/themas/InnovationinEducation/learninganalytics/Pages/default.aspx). (p. 2)

big data?
Debatable. So is whether LA is about quantitative data. (p. 2)

This increasing activity has a range of drivers and facilitators. First, there is a pressure towards performance management, metrics and quantification. Second, there is an increasing volume of data available about learners and learning, particularly as more learning takes place online in Learning Management Systems or Virtual Learning Environments (LMS/VLEs). (p. 3)

Third, statistical and computational tools to manage large data-sets and to facilitate interpretation have become available as a result of the Big Data activity. (p. 3)

A key concern in learning analytics is the need to use the insights gathered from the data to make interventions to improve learning, to generate ‘actionable intelligence’ (Campbell, DeBlois, and Oblinger 2007) which informs appropriate interventions. (p. 3)

Learning analytics is not so much a solid academic discipline with established methodological approaches as it is a ‘jackdaw’ field of enquiry, picking up ‘shiny’ techniques, tools and methodologies. (p. 3)

This eclectic approach is both a strength and a weakness: it facilitates rapid development and the ability to build on established practice and findings, but to date it lacks a coherent, articulated epistemology of its own. (p. 4)

need?
There is no need to have a unified epistemology for learning analytics. (p. 4)

Having set out learning analytics and its context in broad terms, this paper presents a set of more concrete examples of learning analytics practice to provide a more grounded view of the field. (p. 4)

Predictive modelling (p. 4)

The basic concept of predictive modelling is fairly straightforward: a mathematical model is developed, which produces estimates of likely outcomes, which are then used to inform interventions designed to improve those outcomes. (p. 4)
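
A minimal sketch of that loop in Python, assuming invented LMS activity features (logins, forum posts, mean quiz score) and an arbitrary intervention threshold; this illustrates the general idea, not any particular system's model:

```python
# Toy predictive-modelling loop: fit a model on past students, estimate
# completion probabilities for a current student, flag low-probability
# students for intervention. All numbers are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

X_train = np.array([
    [25, 12, 0.85],   # logins, forum posts, mean quiz score
    [3, 0, 0.40],
    [18, 5, 0.70],
    [1, 1, 0.30],
])
y_train = np.array([1, 0, 1, 0])  # 1 = completed the course

model = LogisticRegression().fit(X_train, y_train)

# The output is an estimated probability, not a verdict.
p_complete = model.predict_proba([[4, 2, 0.55]])[0, 1]
if p_complete < 0.5:  # threshold chosen arbitrarily for the sketch
    print(f"Flag for intervention: P(complete) ~ {p_complete:.2f}")
```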

First, the output of predictive modelling is a set of estimated probabilities, and it is widely established that many people struggle to correctly understand probabilities and to make consistent decisions based on probabilistic information. Second, the output is not (typically) restricted to a student’s teacher: the information is readily made available to others beyond the immediate learning context. Third, the output can be used directly to trigger actions and interventions without involving a teacher at all. (p. 5)

The Course Signals project at Purdue University (http://www.itap.purdue.edu/studio/signals/) is the most prominent and arguably the most successful application of predictive modelling to student completion in higher education. (p. 5)

with an average 10-percentage-point increase in grades A and B and a 6-percentage-point decrease in grades D, F and withdrawals. Overall retention also increases: of the 2007 cohort, 69% of students with no exposure to Signals are retained, compared to 87% of students with exposure to at least one course using Signals. (p. 6)

Social network analysis (p. 6)

These diagrams can be interpreted simply by eye (for example, you can see whether a network has lots of links, or whether there are lots of nodes with few links). Alternatively, they can be interpreted with the aid of mathematical analysis of the network. (p. 6)
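
A small sketch of both modes of interpretation using networkx on an invented forum-reply network; the edges and the choice of metrics are illustrative only:

```python
import networkx as nx

# Invented forum-reply edges: (replier, original poster).
G = nx.DiGraph([
    ("ana", "ben"), ("ben", "ana"), ("cal", "ana"),
    ("dee", "ana"), ("eve", "cal"),
])

# Interpretation "by eye" would be a drawing, e.g.:
#   nx.draw(G, with_labels=True)   # requires matplotlib

# Mathematical interpretation: density and centrality.
print("density:", nx.density(G))
print("in-degree centrality:", nx.in_degree_centrality(G))
```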

SNAPP (http://www.snappvis.org/; Bakharia and Dawson 2011) is an SNA tool specifically developed for online learning contexts (Dawson 2010). (p. 7)

Suthers and Chu (2012) used SNA to explore the Tapped-In community for educational professionals (http://tappedin.org). Their approach, inspired by Actor-Network Theory, was much more detailed and rich, based on an ‘associogram’, rather than a simple social network diagram. An associogram is a complex multidirectional mapping of the participants, the artefacts they created (e.g. messages in chat rooms, postings in discussions and shared files) and the actions taken by the participants on those artefacts (e.g. writing/posting and reading). (p. 7)
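
A rough sketch of the associogram idea (not Suthers and Chu's actual method): a multi-relational graph in which participants are linked to artefacts by typed actions, so that ties between people are mediated by artefacts; all names are invented:

```python
import networkx as nx

# Participants and artefacts as nodes of different kinds.
G = nx.MultiDiGraph()
G.add_node("alice", kind="participant")
G.add_node("bob", kind="participant")
G.add_node("chat_msg_1", kind="artefact")

# Typed actions taken by participants on artefacts.
G.add_edge("alice", "chat_msg_1", action="wrote")
G.add_edge("bob", "chat_msg_1", action="read")

# Unlike a plain social network diagram, the tie between alice and bob
# is mediated by an artefact they both acted on.
for u, v, data in G.edges(data=True):
    print(u, data["action"], v)
```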

Usage tracking (p. 7)

Many tools exist to capture what a user does on a computer over time, and these can be used as a source of data about student activity when the learning task requires them to use something beyond the LMS/VLE. (p. 7)
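
A minimal sketch of what such usage tracking might look like, with an invented event schema: timestamped events per student and tool, aggregated afterwards into a simple activity summary:

```python
from collections import Counter
from datetime import datetime, timezone

event_log = []

def track(student: str, tool: str, action: str) -> None:
    """Append one timestamped usage event to the log."""
    event_log.append({
        "when": datetime.now(timezone.utc),
        "student": student,
        "tool": tool,
        "action": action,
    })

track("s01", "ide", "open_file")
track("s01", "ide", "run_tests")
track("s02", "browser", "view_docs")

# Aggregate events per (student, tool) pair.
print(Counter((e["student"], e["tool"]) for e in event_log))
```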

Santos et al. (2012) developed a dashboard for students on a software development course at the Katholieke Universiteit Leuven. (p. 7)

Content analysis and semantic analysis (p. 8)

Lárusson and White (2012) have developed the Point of Originality tool, which enables teachers to track how students develop originality in their use of key concepts over the course of a series of writing assignments. (p. 8)
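
One plausible way to approximate that idea (not the Point of Originality tool's actual algorithm) is to compare a student's writing against the source material using TF-IDF cosine similarity and read lower similarity as more original use of the concepts; the texts below are invented:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

course_material = ("Learning analytics applies data about learners "
                   "to understand and improve learning.")
essay_week1 = ("Learning analytics applies data about learners "
               "to improve learning outcomes.")
essay_week5 = ("By mining interaction traces, teachers can adapt "
               "their instruction to each learner.")

vec = TfidfVectorizer()
docs = vec.fit_transform([course_material, essay_week1, essay_week5])

# Lower similarity to the source text is read here as more original
# use of the concepts; this reading is illustrative only.
for label, i in [("week 1", 1), ("week 5", 2)]:
    sim = cosine_similarity(docs[0], docs[i])[0, 0]
    print(f"{label}: similarity {sim:.2f}, originality ~ {1 - sim:.2f}")
```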

A more speculative example is automated feedback to students about the nature of their online writing, with the aim of improving the quality of educational dialogue. (p. 8)

Recommendation engines (p. 9)

Discussion (p. 9)

The first and perhaps most obvious area is the ethics of personal data. Foucault (1991) uses Bentham’s Panopticon as a symbol of how institutions and power structures enforce self-surveillance and control through the belief that scrutiny may occur at any time. The nightmare vision of Big Data for individuals is that the system does not rely on self-surveillance to enforce a disciplinary regime. (p. 9)

Being open about learning analytics with students can improve their perceptions of the activity (as with Signals), but openness need not and arguably should not be complete in learning contexts. (p. 9)

Students typically know and care more about their own learning situation than even the most dedicated teacher. (p. 10)

Educators have a professional responsibility to use tools and methods that can improve students’ learning, and learning analytics offers potentially powerful ways of doing this. (p. 10)

As a field, learning analytics is data-driven and is often atheoretical, or more precisely, is not explicit about its theoretical basis. Several authors have sought to ground learning analytics in theory (e.g. Dawson 2008; Suthers et al. 2008; Atkisson and Wiley 2011; Clow 2012), but this is not universal, running the risk of treating the data that have been gathered as the data that matter. The choice of what is measured (in learning analytics terms, the selection of metrics) is critical. If an educational system is designed to optimise metrics that do not encompass learning, it is likely that learning will be optimised away. For those who care about learning, the choice is to attempt total resistance to the regime of metrics or to take a more pragmatic course and insist on the inclusion of appropriate metrics that do reflect learning. (p. 10)

well-said
