Citekey: @Shum2012b

Shum, S. B., & Ferguson, R. (2012). Social learning analytics. Educational Technology & Society, 15(3), 3–26.



Learning analytics has its roots in two computing endeavours not specifically concerned with learning, but rather with strong business imperatives to understand internal organisational data, and external consumer behaviour. (p. 1)

Educational Data Mining (EDM) is “an emerging discipline, concerned with developing methods for exploring the unique types of data that come from educational settings, and using those methods to better understand students, and the settings which they learn in” (Baker & Yacef, 2009). (p. 2)

Educational institutions have become increasingly interested in analysing the available datasets in order to support retention of students and to improve student results. This use of academic analytics stretches back for at least 50 years, but has become more significant in the last five years as datasets have grown larger and more easily available for analysis. (p. 2)

the first significant academic gathering of the learning analytics community was in 2011 at the 1st International Conference on Learning Analytics & Knowledge, doubling in size to 200 in 2012. The 2011 conference defined the term as follows: Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. (p. 2)

in contrast to more theoretical research or artificial experimentation which might be published in some of the above fields, there is an emphasis on impacting authentic learning from real-world contexts, through the use of practical tools. There is also a shift away from an institutional perspective towards a focus on the concerns of learners and teachers. The main beneficiaries are no longer considered to be administrators, funders, marketing departments and education authorities, but instead are learners, teachers and faculty members (Long & Siemens, 2011). (p. 3)

The challenge of social learning analytics (p. 3)

In a literature analysis of the field, we found that in the discourse of academic analytics there is little mention of pedagogy, theory, learning or teaching (Ferguson, 2012). This reflects the roots of these analytics in management information systems and business intelligence (p. 3)

Performance indicators in educational settings typically involve outcomes-centric analytics based on learners’ performance on predefined tasks. (p. 3)

Social Learning Analytics (SLA) are strongly grounded in learning theory and focus attention on elements of learning that are relevant when learning in a participatory online culture. They shift attention away from summative assessment of individuals’ past performance in order to render visible, and in some cases potentially actionable, behaviours and patterns in the learning environment that signify effective process. (p. 3)

Social Learning Analytics is, we propose, a distinctive subset of learning analytics that draws on the substantial body of work demonstrating that new skills and ideas are not solely individual achievements, but are developed, carried forward, and passed on through interaction and collaboration (p. 3)

A socio-cultural strand of educational research demonstrates that language is one of the primary tools through which learners construct meaning. Its use is influenced by their aims, feelings and relationships, all of which shift according to context (Wells & Claxton, 2002). Another socio-cultural strand of research emphasises that learning cannot be understood by focusing solely on the cognition, development or behaviour of individual learners; neither can it be understood without reference to its situated nature (Gee, 1997; Wertsch, 1991). (p. 3)

Social Learning Analytics should render learning processes visible and actionable at different scales: from national and international networks to small groups and individual learners. (p. 3)

The emergence of open, social learning (p. 4)

analytics focused on summative assessment of performance remain important but do not go far enough: we need to develop new sets of analytics that can be used to support learning and teaching in these new conditions. We summarise these phenomena as: technological drivers; the shift to ‘free’ and ‘open’; demand for knowledge-age skills; innovation requires social learning; and challenges to educational institutions. (p. 4)

Demand for knowledge-age skills

Technology is always appropriated to serve what people believe to be their needs and values. Since 1991, we have lived in the “knowledge age”—a period in which knowledge, rather than labour, land or capital, has been the key wealth-generating resource (Savage, 1996). (p. 5)

These changes have prompted an interest in “knowledge-age skills” that will allow learners to become both confident and competent designers of their own learning goals (Claxton, 2002). (p. 5)

Characterising online social learning (p. 6)

Social learning has been conceptualised as societal learning in general, as processes of interaction that lead to concerted action for change, as group learning, and as the learning of individuals within a social context (Blackmore, 2010). (p. 6)

While OERs greatly increase the amount of good quality material available online to learners, another consequence can be that individual learners find themselves adrift in an ocean of information, struggling to solve ill-structured problems, with little clear idea of how to solve them, or how to recognise when they have solved them. At the same time, distributed networks of learners are grappling with ‘wicked problems’ such as climate change, which offer the same challenges on a grander scale. Social learning infrastructure could have a key role to play in these situations, helping learners connect with others who can provide emotional and conceptual support for locating and engaging with resources, just as in our tree story at the start of this section. This forces us to ask whether our current educational and training regimes are fit for purpose in equipping our children, students and workforce with the dispositions and skills needed under conditions of growing uncertainty—a challenge explored in detail by many others, for example in the collection edited by Deakin Crick (2009). (p. 7)

Online social learning can take place when people are able to: clarify their intention (learning rather than browsing); ground their learning (by defining their question/problem, and experimenting); and engage in learning conversations (increasing their understanding). (p. 7)

Figure 1. Dimensions of the social learning design space (p. 8)

Inherently social learning analytics (p. 8)

Inherently social analytics—these only make sense in a collective context. Social Network Analytics: interpersonal relationships define social platforms and link learners to contacts, resources and ideas. Discourse Analytics: language is a primary tool for knowledge negotiation and construction. (p. 8)
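The paper does not prescribe any implementation, but a minimal sketch of the kind of computation Social Network Analytics involves—for instance, degree centrality over a learner interaction graph built from forum replies—might look like this (all names and data are illustrative, not from the paper):

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalised degree centrality for an undirected interaction graph.

    `edges` is a list of (learner_a, learner_b) pairs, e.g. one pair per
    reply in a discussion forum. Centrality of a node = degree / (n - 1),
    so a learner connected to everyone else scores 1.0.
    """
    neighbours = defaultdict(set)
    for a, b in edges:
        neighbours[a].add(b)
        neighbours[b].add(a)
    n = len(neighbours)
    return {node: len(nbrs) / (n - 1) for node, nbrs in neighbours.items()}

# Hypothetical forum-reply data: "ada" has replied to every other learner.
replies = [("ada", "ben"), ("ada", "cal"), ("ada", "dee"), ("ben", "cal")]
centrality = degree_centrality(replies)
```

An analytics dashboard could surface such scores to highlight well-connected learners or isolated ones who may need support—precisely the "collective context" the authors argue these analytics require.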

Socialised analytics—although these are relevant as personal analytics, they have important new attributes in a collective context. Content Analytics: user-generated content is one of the defining characteristics of Web 2.0. (p. 8)

Disposition Analytics: intrinsic motivation to learn lies at the heart of engaged learning and innovation. Context Analytics: mobile computing is transforming access to people, content and both formal and informal learning. (p. 9)

Zimmerman and his colleagues (2007) provide a definition that allows the context of an entity (for example, a learner) to be described along five distinct categories: Individuality context includes information about the entity itself (in the case of learners, this might include their language, their behaviour, their preferences and their goals); Time context includes points in time, ranges and histories, so it can take into account workflows, long-term courses and interaction histories; Location context can include absolute location, location in relation to people or resources, or virtual location (IP address); Activity context is concerned with goals, tasks and actions; Relations context captures the relations of an entity with other entities, for example with learners, teachers and resources. (p. 15)
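Zimmerman et al.'s five categories lend themselves naturally to a simple record structure. A hypothetical sketch (the field names and example values are my own illustration, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class LearnerContext:
    """One context record, with one field per Zimmerman et al. (2007) category.

    All field names and the example instance below are illustrative.
    """
    individuality: dict  # language, behaviour, preferences, goals
    time: dict           # points in time, ranges, interaction histories
    location: dict       # absolute, relative to people/resources, or virtual (IP)
    activity: dict       # goals, tasks, actions
    relations: list      # links to other entities: learners, teachers, resources

ctx = LearnerContext(
    individuality={"language": "en", "goal": "complete the course"},
    time={"enrolled": "2012-01-09"},
    location={"ip": "192.0.2.1"},
    activity={"task": "forum discussion"},
    relations=[("peer", "ada"), ("teacher", "dr_lee"), ("resource", "oer-42")],
)
```

Structuring context records this way would let a context-analytics system query across categories—for instance, joining Activity and Relations to see which peers a learner collaborates with on a given task.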

Early work in context-aware computing treated the environment as a shell encasing the user and focused on scalar properties such as current time and location, together with a list of available objects and services (see, for example, Abowd, Atkeson, Hong, Long, & Pinkerton, 1997; Want, Hopper, Falcao, & Gibbons, 1992). The focus was on the individual user receiving data from an environment rather than interacting with it. This model did not acknowledge the dynamics of interaction between people and the environment. When considered in the context of learning, it did not provide information that could help people to modify their environment in order to create supportive workspaces or form social networks with those around them or accessible online (Brown et al., 2010). (p. 15)

The challenge of powerful analytics (p. 16)

Bowker and Star (2000) demonstrate how these become the mechanisms by which we choose not only how to remember, but also systematically forget, what is known. If a phenomenon is not visible within a classification scheme, it is systematically erased. The issue of power is, therefore, a central one to confront. (p. 17)

The challenge for learning analytics is more complex still. As described above, at least some forms of learning analytics research have an interest in using data generated by users as a by-product of online activity (for example, asking/answering questions, or recommending resources), rather than as an intentional form of evidence of learning (such as taking a test or submitting an essay). (p. 17)

Important concerns (boyd & Crawford, 2011) are beginning to be expressed about learning analytics, such as the following variants on longstanding debates at the intersection of education, technology and artificial intelligence: Analytics are dependent on computational platforms that use, re-use and merge learner data, both public and private: institutions should steer clear of open data and minimise the merging of datasets of any sort until there are much clearer ethical and legal guidelines. Analytics could disempower learners, making them increasingly reliant on institutions providing them with continuous feedback, rather than developing meta-cognitive and learning-to-learn skills and dispositions. Analytics are a crude way to operationalise proxy measures of teacher effectiveness, and will be used to compare and contrast student outcomes, leading to the gaming of the system: “learning and teaching to the analytic” to maintain performance indicators that do not genuinely promote meaningful learning. (p. 17)

If SLA tools and data are placed in the hands of learners, the balance of power shifts significantly. (p. 17)

If analytics are drawing learners’ attention to their development as self-aware, intrinsically motivated learners, they are being moved in the opposite direction to becoming passively dependent on the institution or platform to tell them how they are doing and what to do next. (p. 17)

If analytics are focused on providing formative feedback to improve learning process, rather than making automated judgments about mastery levels in a given subject, there might be fewer concerns around the removal of human mentors from the feedback loop. (p. 17)

Crisscross Landscapes

Bodong Chen, University of Minnesota