Bodong Chen

Crisscross Landscapes

Notes: Lockyer et al. (2013). Informing Pedagogical Action



Citekey: @Lockyer2013

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439–1459. doi:10.1177/0002764213479367


Learning design is a new area for me, but one critical point made in this article is that learning analytics should be interpreted in the context of pedagogical design. This point is too often neglected in learning analytics research.


The article presents learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data. We investigate one example of learning design to explore how broad categories of analytics—which we call checkpoint and process analytics—can inform the interpretation of outcomes from a learning design and facilitate pedagogical action. (p. 1439)

This article examines two relatively new concepts within education, learning analytics, that is, the collection, analysis, and reporting of data associated with student learning behavior, and learning design, that is, the documented design and sequencing of teaching practice, and how together these may serve to improve understanding and evaluation of teaching intent and learner activity. (p. 1439)

Learning designs, which document pedagogical intent and plans, potentially provide the context to make sense of learning analytics data. Essentially, learning design establishes the objectives and pedagogical plans, which can then be evaluated against the outcomes captured through learning analytics. (p. 1440)

The high adoption of education technologies, such as learning management systems (LMS), has resulted in a vast set of alternate and accessible learning data (Greller & Drachsler, in press; Pardo & Kloos, 2012). Student interactions with the course activities via the LMS are captured and stored. The resulting digital footprints can be collected and analyzed to establish indicators of teaching quality and provide more proactive assessment of student learning and engagement. (pp. 1440–1441)

The emergent field is multidisciplinary and draws on methodologies related to educational data mining, social network analysis, artificial intelligence, psychology, and educational theory and practice. (p. 1441)

Here, we use the term learning design, but work in the same vein has been carried out under such names as pedagogical patterns, learning patterns, and pattern language. Learning design describes the sequence of learning tasks, resources, and supports that a teacher constructs for students over part of, or the entire, academic semester. A learning design captures the pedagogical intent of a unit of study. Learning designs provide a broad picture of a series of planned pedagogical actions rather than detailed accounts of a particular instructional event (as might be described in a traditional lesson plan). As such, learning designs provide a model for intentions in a particular learning context that can be used as a framework for design of analytics to support faculty in their learning and teaching decisions. (pp. 1441–1442)

Thus, the broad field of learning design was underpinned by two main aims: to promote teaching quality and to facilitate the integration of technology into teaching and learning. (p. 1442)

Research and development work in this area have included the creation of online repositories of learning designs that teachers could read, interpret, and adapt to their own practice (e.g., Agostinho, Harper, Oliver, Hedberg, & Wills, 2008; Conole & Culver, 2010) and the development of technical languages and tools designed to make learning designs machine readable and adaptable (Koper, 2006; Masterman, 2009). (p. 1442)

The learning designs come in many forms and levels of detail. Some draw on an architectural model to describe textually solutions to common educational problems (McAndrew & Goodyear, 2007). Some use common representations such as process diagrams, flowcharts, and tables (Falconer, Beetham, Oliver, Lockyer, & Littlejohn, 2007), and others combine text descriptions with graphical representations (Agostinho et al., 2008). (p. 1442)

Regardless of the format in which learning designs are documented, essential elements include identifying the key actors involved (teachers and students), what they are expected to do (teaching and learning tasks), what educational resources are used to support the activities, and the sequence in which the activities unfold. These essential elements may be presented with great detail and provide a highly contextualized description of a particular unit, covering specific topics. Or they may be presented more generically, free of the detail of any particular implementation of the design. Learning designs also range in granularity from presenting a teaching and learning process that might occur for an entire semester-long course to that which might occur in only one class. (pp. 1442–1443)

The most easily understood and adapted common elements within all learning designs include the following:

• A set of resources for the student to access, which could be considered to be prerequisites to the learning itself (these may be files, diagrams, questions, web links, prereadings, etc.)

• Tasks the learners are expected to carry out with the resources (prepare and present findings, negotiate understanding, etc.)

• Support mechanisms to assist in the provision of resources and the completion of the tasks; these supports indicate how the teacher, other experts, and peers might contribute to the learning process (e.g., moderation of a discussion or feedback on an assessment piece; Bennett et al., 2004) (p. 1443)
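The three common elements above (resources, tasks, supports) could be captured in a simple data structure. This is purely my own illustration of the idea, not a format proposed by the article; all field names are hypothetical.

```python
# Hypothetical sketch: the three common elements of a learning design
# (resources, tasks, supports) as a simple record. Field names are my own.
from dataclasses import dataclass, field

@dataclass
class LearningDesign:
    title: str
    resources: list = field(default_factory=list)  # prerequisites: files, links, prereadings
    tasks: list = field(default_factory=list)      # what learners do with the resources
    supports: list = field(default_factory=list)   # how teacher, experts, and peers assist

design = LearningDesign(
    title="Case-based project unit",
    resources=["case_1.pdf", "case_2.pdf"],
    tasks=["analyze assigned case", "discuss analysis in project group"],
    supports=["teacher moderates discussion", "feedback on proposal"],
)
print(design.title)
```

A machine-readable structure like this is in the spirit of the work on technical languages cited earlier (Koper, 2006), though far simpler.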

Although learning designs can provide a description of pedagogical intention, they do not identify how students are engaged in that design during or postimplementation. This is where learning analytics can provide information for a more holistic perspective of the impact of learning activities. (p. 1443)

However, at present, the predominance of learning analytics research centers on the types of data available in institutional LMS. (p. 1444)

Table 1. Examples of Learning Analytics Tools and Visualizations. (p. 1445)

Interpretation of the analysis thus requires alignment with the original teaching context if it is to be useful as feedback on whether the learning design has achieved its intent. Interpretation requires an understanding of the relationship among technology functionality, observed interaction behaviors, and educational theory (Heathcote, 2006). (p. 1446)

Learning Analytics to Evaluate Learning Design (p. 1447)

Although learning designs provide theoretical, practice-based, and/or evidence-based examples of sound educational design, learning analytics may allow us to test those assumptions with actual student interaction data in lieu of self-report measures such as post hoc surveys. In particular, learning analytics provides us with the necessary data, methodologies, and tools to support the quality and accountability that have been called for in higher education. (p. 1448)

Aligning Learning Analytics With Learning Design (p. 1448)

The first relates to what we term checkpoint analytics, that is, the snapshot data that indicate a student has met the prerequisites for learning by accessing the relevant resources of the learning design. (p. 1448)

The second type of learning analytics we term process analytics. These data and analyses provide direct insight into learner information processing and knowledge application (Elias, 2011) within the tasks that the student completes as part of a learning design. (p. 1448)

The articulation of the nature of support available within learning designs helps to interpret process learning analytics. (p. 1449)

How Analytics Can Support Implementation of a Learning Design (p. 1451)

Stage 1: Case Analysis Task—Checkpoint Analytics. Learning analytics can generate reports of student log-in behaviors and access to individual cases; these provide the teacher with indicators of when students have commenced the learning sequence. (p. 1451)
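A minimal sketch of what such checkpoint analytics could look like: given hypothetical LMS access-log records, report which students opened their assigned case file before a checkpoint date. The data, field layout, and function name are all illustrative assumptions, not the authors' implementation.

```python
# Checkpoint analytics sketch: who has accessed the required resource by the
# deadline? Log records and names are hypothetical LMS-export stand-ins.
from datetime import date

access_log = [  # (student, resource, access_date)
    ("alice", "case_1.pdf", date(2013, 3, 1)),
    ("bob",   "case_2.pdf", date(2013, 3, 4)),
    ("carol", "syllabus.pdf", date(2013, 3, 2)),
]

def checkpoint_report(log, resource_prefix, deadline, roster):
    """Split the roster into students who met the checkpoint and those who did not."""
    reached = {s for s, r, d in log
               if r.startswith(resource_prefix) and d <= deadline}
    return sorted(reached), sorted(set(roster) - reached)

done, pending = checkpoint_report(access_log, "case_", date(2013, 3, 5),
                                  roster=["alice", "bob", "carol"])
print(done)     # students who have commenced the learning sequence
print(pending)  # students the teacher may need to prompt
```

The point is the pedagogical framing: the same log data become actionable only because the learning design says case access is a prerequisite for the next task.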

Stage 2: Case Analysis Discussion Task—Process Analytics. Once students analyze their case individually, they then share their ideas with their project group members. They identify issues that arose in these cases and consider how they may be applicable to the project they are about to undertake. A network diagram … (p. 1451)

Stage 3: Whole-Class Discussion Task—Process Analytics. After the project groups discuss their case analyses, the learning design calls for the teacher to facilitate a whole-class discussion. If successful, the social network analysis of discussion forum posts should illustrate the teacher as central in the network. (p. 1452)
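The centrality check described here can be sketched with plain degree centrality (ties divided by n − 1). The reply data and names are hypothetical; a real analysis would export who-replied-to-whom edges from the forum.

```python
# Process analytics sketch: is the teacher central in the discussion network?
# Edges are hypothetical (who replied to whom), treated as undirected ties.
posts = [("teacher", "s1"), ("teacher", "s2"), ("s1", "s2"), ("teacher", "s3")]

people = {p for edge in posts for p in edge}
degree = {p: sum(p in edge for edge in posts) for p in people}

n = len(people)
centrality = {p: d / (n - 1) for p, d in degree.items()}  # degree centrality
most_central = max(centrality, key=centrality.get)
print(most_central)  # should be the teacher if facilitation succeeded
```

Against the learning design's intent, a teacher at the periphery of this network would signal that the facilitation role was not realized as planned.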

Stage 4: Project Proposal Task—Process Analytics. At this stage, students begin their project task. In the first part of the project task, students work in a small group to collaborate on their project proposal. If the task is completed within a discussion forum, a social network diagram could be used to indicate established density and connections of participation as well as outliers or disconnected students disengaged from the task. (p. 1452)
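The density and disconnection checks mentioned for Stage 4 can likewise be sketched in a few lines. Names and reply edges are hypothetical stand-ins for a forum export.

```python
# Process analytics sketch for Stage 4: network density of a group's forum
# ties, plus students with no ties at all (potentially disengaged).
from itertools import combinations

students = ["ana", "ben", "chi", "dev"]
replies = [("ana", "ben"), ("ben", "chi"), ("ana", "chi")]  # hypothetical

edges = {frozenset(e) for e in replies}          # undirected, deduplicated
possible = len(list(combinations(students, 2)))  # n*(n-1)/2 possible ties
density = len(edges) / possible                  # share of possible ties realized

connected = {s for e in edges for s in e}
isolated = [s for s in students if s not in connected]
print(round(density, 2))
print(isolated)  # outliers disconnected from the collaborative task
```

Here the learning design again supplies the interpretive frame: low density or isolated members matter because the design calls for small-group collaboration at this stage.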

Stage 6: Reflection Task—Checkpoint or Process. The final reflection task can be assessed using both checkpoint and process analytics. The checkpoint is to verify whether the self-reflection template has been accessed or uploaded with student changes. In addition, further content analysis can be undertaken to map student reflections and changes over an extended period. Self-reflection requires strong metacognitive capacities that have been demonstrated to be essential for developing the skills necessary for lifelong learning (Butler & Winne, 1995). (p. 1454)

How Analytics Support Implementation and Redesign (p. 1454)

Conclusion (p. 1455)

This article argued that the evaluative potential of learning analytics would be significantly enhanced by reference to the learning design that documents pedagogical intent. (p. 1455)