
References

Citekey: @Shum2012a

Shum SB (2012). “UNESCO Policy Brief: Learning Analytics.” Technical report, November 2012, UNESCO Institute for Information Technologies in Education. <URL: http://www.iite.unesco.org/publications/3214711/>.

Notes

Highlights

The emerging conversation goes far beyond technologists (academic and commercial), to include researchers in education, leaders and policymakers, educational practitioners, organisational administrators, instructional designers, product vendors, and critically, the learners themselves (who are often the first adopters of new cloud applications, many of which make data available, and who are the primary consumers of certain kinds of learning analytic). (p. 2)

The Convergence of Macro, Meso and Micro-level Analytics (p. 3)

Macro-level analytics seek to enable cross-institutional analytics, for instance, through ‘maturity’ surveys of current institutional practices or improving state-wide data access to standardized assessment data over students’ lifetimes. (p. 3)

Meso-level analytics operate at institutional level. To the extent that educational institutions share common business processes to sectors already benefiting from BI, they can be seen as a new BI market sector, who can usefully appropriate tools to integrate data silos in enterprise warehouses, optimize workflows, generate dashboards, mine unstructured data, better predict ‘customer churn’ and future markets, and so forth. (p. 3)

Micro-level analytics support the tracking and interpretation of process-level data for individual learners (and by extension, groups). (p. 3)

As the figure shows, what we now see taking place is the integration of, and mutual enrichment between, these layers. (p. 4)

EXAMPLES OF LEARNING ANALYTICS (p. 4)

The International Conference on Learning Analytics & Knowledge (LAK) has archived proceedings and replayable presentations which are the best snapshot of the emerging state of the art, while EDUCAUSE is building a valuable resource bank and training events for educators/leaders/IT-administrators closer to immediate deployment options. (p. 4)

educational startup companies are accelerating the pace at which learners will encounter micro-level analytics (e.g. the Educational Innovation Summit). (p. 4)

An EDUCAUSE synthesis of emerging trends in 2012 identifies three kinds of predictors and indicators (Dispositional, Activity & Performance, and Student Artifacts), the key role of Visualization to support educational sensemaking (e.g. debate over what the analytics appear to be evidencing), and two kinds of interventions (fully and semi-automated). (p. 4)

Analytics Dashboards (p. 4)

Predictive Analytics (p. 5)

Adaptive Learning Analytics (p. 6)

Concept. Adaptive learning platforms build a model of a learner’s understanding of a specific topic (e.g. algebra; photosynthesis; dental surgical procedures), sometimes in the context of standardised tests which dictate the curriculum and modes of testing. This enables fine-grained feedback (e.g. which concepts you have grasped and at what level), and adaptive presentation of content (e.g. not showing material that depends on having mastered concepts the learner has failed on). (p. 6)
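The mastery-gating idea quoted above can be made concrete with a small sketch. This is not from the brief: the concept names, mastery scores, and the 0.8 threshold are all invented for illustration, and real platforms use far richer learner models.

```python
# Minimal sketch of content gating in an adaptive learning platform:
# each content item declares prerequisite concepts, and an item is
# presented only once the learner's estimated mastery of every
# prerequisite clears a threshold. All names and values are illustrative.

MASTERY_THRESHOLD = 0.8  # assumed cut-off for a concept being "grasped"

# Hypothetical learner model: concept -> estimated mastery in [0, 1]
learner_model = {
    "linear_equations": 0.9,
    "factoring": 0.55,
    "quadratic_formula": 0.2,
}

# Content items with the concepts they depend on
content = [
    {"id": "intro_factoring", "requires": ["linear_equations"]},
    {"id": "solving_quadratics", "requires": ["factoring", "quadratic_formula"]},
]

def available_items(model, items, threshold=MASTERY_THRESHOLD):
    """Return items whose prerequisites the learner has all mastered."""
    return [
        item["id"]
        for item in items
        if all(model.get(c, 0.0) >= threshold for c in item["requires"])
    ]

print(available_items(learner_model, content))  # ['intro_factoring']
```

Here the quadratics lesson is withheld because the model estimates factoring at 0.55, below the assumed threshold, which is exactly the "not showing material that depends on unmastered concepts" behaviour the brief describes.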

Social Network Analytics (p. 6)

Social network analysis (sometimes called Organisational Network Analysis in corporate settings) makes visible the structures and dynamics of interpersonal networks, to understand how people develop and maintain these relations. (p. 6)
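As a toy illustration of "making structures visible" (again, not from the brief: the forum log and names are made up), one of the simplest social network analysis measures is degree centrality, computed here from a who-replied-to-whom interaction log:

```python
# Illustrative sketch: degree centrality from a hypothetical log of
# (sender, receiver) reply pairs in a course discussion forum.
from collections import Counter

interactions = [
    ("ana", "ben"), ("ben", "ana"), ("carl", "ana"),
    ("dana", "ana"), ("carl", "ben"),
]

def degree_centrality(edges):
    """Count each person's distinct ties (undirected degree)."""
    ties = {tuple(sorted(e)) for e in edges}  # collapse direction, dedupe
    degree = Counter()
    for a, b in ties:
        degree[a] += 1
        degree[b] += 1
    return dict(degree)

print(sorted(degree_centrality(interactions).items()))
# [('ana', 3), ('ben', 2), ('carl', 2), ('dana', 1)]
```

Even this crude count surfaces structure a teacher cannot see by scrolling a forum: ana is a hub connected to everyone, while dana participates only through ana.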

Discourse Analytics (p. 6)

IMPACT AT DIFFERENT LEVELS (p. 7)

Learning: Towards a Data-Driven Science? (p. 7)

LEARNING ANALYTICS DEBATES (p. 8)

Data Is Not Neutral (p. 8)

A recent critique of the rhetoric around Big Data reminds us to enter this field with caution:

- Automating Research Changes the Definition of Knowledge
- Claims to Objectivity and Accuracy are Misleading
- Bigger Data are Not Always Better Data
- Not All Data Are Equivalent
- Just Because it is Accessible Doesn’t Make it Ethical
- Limited Access to Big Data Creates New Digital Divides

(p. 8)

In the context of learning analytics, every step of the lifecycle — from data to analytics to insight to intervention — is infused with human judgment. In short, it is as naïve to believe that ‘data speaks for itself’ as it is to believe that a text has a single, objectively discernible meaning for all contexts. (p. 8)

Learning Analytics Perpetuate Assessment Regimes (p. 8)

Since assessment regimes are a hotly contested issue within educational research and policy, by extension, an intelligent approach to learning analytics must engage with this debate, making clear what assessment regimes and pedagogical commitments a given learning analytic promotes. (p. 8)

Due to the complexity of implementing good assessment for learning, designing tools of this sort remains a primary challenge for learning analytics researchers. The promise is that done well, analytics could be the key enabler for delivering formative assessment for learning at scale, placing new kinds of tools in the hands of learners. (p. 8)

The risk is that research and development focuses on the data which is simplest to log computationally, perpetuating the dominant pedagogies and learning outcomes from an industrial era, when most educational thought-leaders point to the additional dispositions and skills needed for lifelong, lifewide learning, and the capacity to thrive in a very turbulent world. (p. 8)

Ethics (p. 9)

the sharing and interpretation of personal data (p. 9)

Who decides which data are important to log, how it is ‘cleaned’ for aggregation with other datasets, and whether those datasets are compatible? Who decides how the data are rendered visually, and are those seeing them literate enough to interpret them? Should learners see analytics about themselves, or their peers? Are teachers skilled enough to devise appropriate interventions based on them? Can data be anonymised adequately, and can access be controlled appropriately? Are attempts to formalise educational theories to embed them in computational algorithms valid? (p. 9)

RECOMMENDATIONS (p. 9)

Governments and institutions can use the possible introduction of analytics to catalyse debate on their vision for teaching and learning for the 21st Century. (p. 9)

Institutions should train staff and researchers in the design and evaluation of learning analytics (p. 9)

They should invest in analytics infrastructures for two reasons: (1) to optimise student success, and (2) to enable their own researchers to ask foundational questions about learning and teaching in the 21st century. To research learning without an analytics infrastructure may soon become like a theoretical physicist with no access to CERN, or a geneticist without genome databases. (p. 9)


Bodong Chen, University of Minnesota
