Bodong Chen

Crisscross Landscapes

Notes: Barnett2004-vo: Inferring Learning from Big Data: The Importance of a Transdisciplinary and Multidimensional Approach

2018-04-17


References

Citekey: @Barnett2004-vo

Lodge, J. M., Alhadad, S. S. J., Lewis, M. J., & Gašević, D. (2017). Inferring Learning from Big Data: The Importance of a Transdisciplinary and Multidimensional Approach. Technology, Knowledge and Learning, 22(3), 385–400. https://doi.org/10.1007/s10758-017-9330-3

Notes

Summarize: Nice paper discussing the importance of engaging in transdisciplinary dialogues in learning analytics to expose the philosophical assumptions we make. Many great points are made through analogies with other transdisciplinary areas.

Assess: Two things could improve this paper: 1) articulating transdisciplinarity in relation to multidisciplinarity and interdisciplinarity (since both are mentioned in the intro); and 2) being more critical of ‘inferences’ as the main goal of learning analytics.

Highlights

Drawing on developments in cognate disciplines, we analyse the inherent difficulties in inferring the complex phenomenon that is learning from big datasets. This forms the basis of a discussion about the possibilities for systematic collaboration across different paradigms and disciplinary backgrounds in interpreting big data for enhancing learning. (p. 1)

Difficulties that have arisen in similar multidisciplinary fields dealing with big and complex datasets such as educational neuroscience (Palghat et al. 2017) highlight the critical necessity of monitoring and updating assumptions and conceptualisations that underpin the analysis and interpretation of big data. (p. 2)

We define transdisciplinarity, as per Nicolescu (2002), as a strategy for recognising the importance of what exists between and beyond disciplinary boundaries and for embracing the multiple levels of reality for a more holistic approach to understanding learning and teaching. (p. 2)

2 Inferring Learning from Big Data (p. 2)

While the availability of big data sets has created some opportunities for researchers and instructors that were not previously viable, that is not to say that there has not already been a concerted effort to use, particularly, behavioural, demographic and other data to understand the learning process. (p. 2)

Concerns related to sampling, generalisability, measurement error and power all speak to the underlying methodological and contextual factors that impact on learning and other psychological phenomena. (p. 3)

The power inherent in these enormous sample sizes and potentially substantial numbers of variables across time, location and settings carries an additional conceptual problem that cannot be solved by statistical methods alone. (p. 3)

Data-driven inference, therefore, does not appear to be sufficient to provide a clear picture of psychological and social phenomena such as learning. (p. 3)
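
This power problem is easy to see in simulation. The sketch below (Python, with invented numbers, not taken from the paper) shows how a practically negligible effect becomes highly ‘significant’ once samples reach clickstream scale, which is exactly the conceptual gap the authors say statistics alone cannot close.

```python
# A minimal, hypothetical illustration of the "big-N" problem: with enormous
# samples, even a practically negligible difference reaches statistical
# significance, so significance alone cannot tell us whether learning changed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 500_000  # per-group sample at clickstream scale (invented)

# True group difference of 0.01 SD: far below any educationally
# meaningful threshold, yet detectable at this sample size.
control = rng.normal(loc=0.00, scale=1.0, size=n)
treated = rng.normal(loc=0.01, scale=1.0, size=n)

t_stat, p_value = stats.ttest_ind(treated, control)
cohens_d = (treated.mean() - control.mean()) / np.sqrt(
    (treated.var(ddof=1) + control.var(ddof=1)) / 2
)

print(f"p = {p_value:.2e}, Cohen's d = {cohens_d:.4f}")
# Typically prints a p-value far below .05 alongside d of roughly 0.01:
# statistically 'significant', substantively trivial.
```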

2.1 Learning: Process or Outcome, or Both? (p. 4)

De Houwer et al. (2013) are critical of research examining learning, making the argument that it is often not even clear in research whether learning is being treated as a process or outcome. (p. 4)

While educationally, learning has been traditionally seen as a developmental process, in recent times, this idea has also been conflated with performance. (p. 4)

The implication of these trends is that there is an emphasis on a snapshot of performance as a means of determining student learning rather than seeing learning for what it is, a developmental process leading to an (ongoing) outcome. (p. 4)

This misguided focus on performance over learning therefore masks a deeper issue about what learning is. Jacobson et al. (2016) discuss a shift in the focus of educational research on learning towards more complex, systems-based conceptualisations. This change is an attempt to move the debate forward from the clear divide between cognitive and social/situated conceptions of learning. Their premise is that complex, systems-based models can incorporate both. (p. 4)

Attempting to model and make decisions based on binary outcome data like this has been most famously criticised as an example of the ‘McNamara fallacy’ (Basler 2009). This fallacy is named after a Vietnam War era US Secretary of Defence who over-relied on crude quantifications such as total numbers of casualties to determine progress in the war. (p. 5)

Similarly, learning is multidimensional, complex and requires the preparation of graduates for an unknowable future (Barnett 2004). (p. 5)

Simplifying complex outcomes into relatively straightforward measures also risks invoking Campbell’s law (Nichols and Berliner 2007). This law is captured most concisely in the saying ‘teaching to the test’ that was applied to standardised national testing in primary and secondary education. In other words, simplified output measures become the goal of education rather than the earlier focus on teaching for quality learning. (p. 5)

As Rowntree (1987) argues, inherent in both the McNamara fallacy and Campbell’s law is a bias towards quantification that implies that anything that cannot be measured is not worthwhile. (p. 5)

The measures we currently have do not capture the complexities and multidimensional nature of learning, despite the nascent potential of large datasets. (p. 5)

It is perhaps a truism to argue that learning, as a construct, ranges from repetition to higher order, complex phenomena, and must always be considered within the interplays of all historical, cultural, political, economic and social contexts that it occurs in. (p. 5)

2.2 Complex Phenomenon: Complex Model? (p. 6)

Jacobson et al. (2016) argue that to continue to make progress in understanding and enhancing learning, it is indeed in complexity science that answers are to be found. A complex phenomenon requires a paradigm that has complexity at its core. (p. 6)

Baker (2016) pointed out that ITS remain ‘stupid’ in comparison to the people they are designed to assist. Instead, Baker argues that systems should be built to augment, rather than supplant, human capabilities. (p. 6)

Although modelling approaches used to infer learning from big datasets use a combination of inductive and deductive logic, there is always, intrinsically, an inferential gap between human and machine. (p. 6)

Whether these sophisticated modelling approaches can seamlessly bridge the gap between data-driven inference and complex, messy reality is unclear. (p. 6)

Conversations and ongoing collaboration between the technical and theoretical communities will help to bridge the research and practice gap and lead to better inferences. (p. 6)

3 Inference and Translation of Big Data in Practice (p. 6)

Like the replication crisis, these will inevitably link to fundamental issues of rigour and validity due to the diversity of approaches different researchers bring to the field based on their academic and contextual backgrounds. Learning analytics, as a developing field, may not be well prepared to address these changes due to the emphasis to date on the practical aspects of learning analytics implementation, as has often been the case in similar interdisciplinary projects. (p. 7)

It’s not about being ‘not well prepared’. Instead, LA is not facing the same issue, or crisis, as psychology studies. We should focus on practical value and integrity instead of validity as such. (p. 7)

learning analytics as an educational science field (p. 7)

not sure about this. (p. 7)

Learning analytics is a transdisciplinary area of research by default, as it is grounded in transdisciplinary educational practices (p. 7)

distinguish trans-, inter-, and multi-. (p. 7)

learning analytics is not impervious to considerations of design rigour and relevance to facilitate causal inferences. (p. 7)

Rather, critical reflection in using big data in higher education practice contexts becomes even more necessary. (p. 7)

Learning analytics is conceptualised as a sense-making and actionable science in practice (p. 7)

While critical discussions of translational issues in inferences made from big data have emerged (Reimann 2016; Wise and Shaffer 2015), learning analytics as a practice-focussed field could further benefit from the lessons learnt in the health and clinical sciences, which have been able to achieve a significant level of transdisciplinarity. (p. 7)

In a genuinely transdisciplinary research and practice field such as learning analytics, two-way communication between researchers and practitioners is crucial to advance empirical and theoretical development in complex, collaborative research and practice contexts. (p. 8)

Another implication of inferential challenges for the use of big data in higher education is towards actionable science. (p. 8)

One thing is to say LA is a transdisciplinary field; another is to constantly draw back to ‘inferences’ and ‘science’ as the central concerns of this field. Seems contradictory. Just a footnote. (p. 8)

That is, in driving action, data is sometimes viewed as evidence in and of itself and thus may be perceived as sufficient in justifying directions for actions. This illusion of objectivity of data or analytics is a cognitive fallacy (Berger and Berry 1988), and can result in a false sense of rigour (Dover and Schultz 2016) and potential misapplication of sense-making and actions (Cohen 1994). This perception of data as unquestionably objective has been shown to be problematic in other domains. For example, the claimed objectivity of big data mining in drug safety has been demonstrated to be a barrier to effective pharmacological clinical judgement. Outcomes from different software systems and vendor tools were deemed to be discrepant simply through differential data transformation and computation methods (Hauben et al. 2007). This result, therefore, prompted the critical role of collaborative communication between the technical and functional subdomains, facilitating more realistic perceptions of the data. (p. 8)
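
The pharmacovigilance example maps directly onto learning analytics pipelines. The toy sketch below (hypothetical indicators and thresholds, not the systems Hauben et al. studied) shows how two defensible transformation pipelines applied to identical raw data can flag different students, illustrating the discrepancy-through-computation problem described above.

```python
# A toy sketch of how two equally defensible transformations can flag
# different cases from the same raw data: the 'illusion of objectivity'.
import numpy as np

rng = np.random.default_rng(7)

# Identical raw data for ten students: weekly logins and pages viewed
# (both invented for illustration).
logins = rng.poisson(lam=20, size=10)
pages = rng.poisson(lam=200, size=10)

# Pipeline A: flag the bottom quartile of raw login counts.
flag_a = logins <= np.quantile(logins, 0.25)

# Pipeline B: z-score a log-transformed composite of both signals,
# then flag the bottom quartile of that composite.
composite = np.log1p(logins) + np.log1p(pages)
z = (composite - composite.mean()) / composite.std()
flag_b = z <= np.quantile(z, 0.25)

print("Flagged by A only:", np.flatnonzero(flag_a & ~flag_b))
print("Flagged by B only:", np.flatnonzero(flag_b & ~flag_a))
# The two 'at-risk' lists generally differ even though both pipelines saw
# identical data: the discrepancy lives in the transformation choices,
# not in the students.
```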

Acknowledging the role of relative subjectivity and complexity in making inferences could not only open the way for more accurate and appropriate judgement and decision making but also recognise the realities of the complexities of measurement and meaning making. Thus, understandably, the focus on issues about ethics in using learning analytics is prevalent in affecting educational practice (e.g., Ferguson et al. 2016; Prinsloo and Slade 2016). (p. 8)

Practitioners have the added responsibility of critically considering the consequences of their actions beyond intended improvements in learning, teaching, or retention, to that of impact on individuals and groups, as well as the emergent ecological consequences. (p. 9)

4 A Transdisciplinary Approach to Interpreting Big Data (p. 9)

Careful consideration of disciplinary differences in ontology and epistemology has also been problematic in other cognate fields such as educational neuroscience. However, a concerted effort to bridge the laboratory and the classroom (see Horvath et al. 2017) is beginning to move educational neuroscience towards a transdisciplinary research agenda. (p. 9)

One difficulty in working across disciplines that is evident for example in educational neuroscience is that there are multiple layers of interpretation and analysis of learning from small parts of the brain over minute timeframes to the whole student over a lifetime (see Horvath and Lodge 2017). (p. 9)

By framing levels of analysis and action into abridged conceptual models, a possible way forward towards bridging these layers can be proposed. (p. 9)

to make explicit the assumptions underpinning the inferences being made about learning on the basis of the data. (p. 9)

So, a finding from a neuroimaging study can have impact on teaching practice in the classroom if it is adequately interpreted at the level of the brain and mind, through the behavioural level and again reinterpreted for teaching practice. Such a research agenda involves neuroscientists, cognitive psychologists, educational psychologists and teachers all working together across levels of data collection and interpretation. (p. 10)

it is possible to extend on these previous models and come up with a version for situating big data analysis in the context of higher education research. Such a hierarchy is presented in Fig. 3. Again, this depiction highlights the need to make meaning of data collected through research conducted at each level for the other levels. The aim is presented here as having an impact on students as individuals as they exist in the world. (p. 10)

The aim here is simply to suggest a possible linear translation and analysis approach for research on learning and how it can help interpretation of big datasets. (p. 10)

Translation of this sort means that the inferences being made about the patterns evident in large real-life datasets are grounded in empirical evidence from laboratory studies. Therefore, any inferential gaps emerging from the interpretation of big data alone can be explicitly tested under controlled conditions. (p. 11)

The words ‘translation’ and ‘inferences’ worry me. (p. 11)

This approach would involve a meta-model of learning where many inferences are made and triangulated. This model has been advocated in the emerging field of psychoinformatics (Yarkoni 2012). A feature of this approach is that, rather than treating different forms of information as hierarchical in time and space, all indicators and conceptualisations are treated as broadly equal and indicative of the central and holistic meta-construct. (p. 12)
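
One way to picture this meta-construct approach: put heterogeneous indicators on a common scale, treat them as broadly equal evidence, and track how much they disagree. This is a sketch of the idea only; the indicator names and data are invented and not drawn from Yarkoni (2012).

```python
# A minimal sketch of 'triangulation': heterogeneous indicators of learning
# treated as broadly equal evidence about one meta-construct, rather than
# ranked hierarchically. Names and values are invented for illustration.
import numpy as np

# Hypothetical indicators for four students, each on its own scale.
indicators = {
    "quiz_gain": np.array([0.2, 0.5, 0.1, 0.8]),
    "forum_posts": np.array([3.0, 12.0, 1.0, 7.0]),
    "self_report": np.array([2.5, 4.0, 3.0, 4.5]),  # 1-5 scale
}

# Standardise each indicator, then weight them equally: the 'broadly
# equal' treatment of evidence the meta-model calls for.
z_scores = np.vstack([(v - v.mean()) / v.std() for v in indicators.values()])
meta_estimate = z_scores.mean(axis=0)

# Disagreement between indicators is itself informative: a large spread
# means the sources do not triangulate cleanly for that student, so the
# inference deserves extra scrutiny.
disagreement = z_scores.std(axis=0)

for i, (m, d) in enumerate(zip(meta_estimate, disagreement)):
    print(f"student {i}: meta-estimate {m:+.2f}, indicator spread {d:.2f}")
```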

As we have alluded to thus far, the key here is that all members of a transdisciplinary project discuss and make explicit their conceptual resources: that is, their definition of learning, where they see learning occurring and the methods by which they infer learning from the outset. A hierarchical model belies an assumption that some forms of evidence are more rigorous or relevant than others. While this may be true to an extent, it does not help resolve fundamental differences in the approaches that researchers and practitioners from different disciplines take to make sense of data. Our ecological model is not designed to be an exact representation of reality; however, it does provide a way of thinking about evidence that moves beyond a notion that some forms of evidence are superior to others. (pp. 12–13)

Having a systematic way of fostering these conversations both builds on and generates the affordances of multiple pathways for understanding learning. We are therefore in agreement with Nathan and Alibali (2010) that any endeavour broadly characterised as the learning sciences should involve complementary systematic and elemental aspects (p. 13)

5 Conclusion (p. 14)

In the learning sciences, there is continuing debate about what learning is and how best to infer it. (p. 14)

The complexity of our task means that more efficient ways of coming together and facilitation that moves the conversation forward are needed. This initiative includes, and indeed is premised on, being mindful of the epistemological and ontological realities of collaborators, researchers and practitioners in the field. While each may not all see learning in the same way, and try to infer learning using different methodologies, we have attempted in this paper to provide necessary ground to move the transdisciplinary conversation forward and provide a basis for a multidimensional research agenda including deliberate sense-making and translation across levels of inference. (p. 14)

Barnett, R. (2004). Learning for an unknown future. Higher Education Research & Development, 23(3), 247–260. https://doi.org/10.1080/0729436042000235382 (p. 14)