Bodong Chen

Crisscross Landscapes

Notes: Martínez. (2016). Promoting student metacognition through the analysis of their own debates



Citekey: @Martinez2016-sz

Martínez, M. L., & Valdivia, I. M. Á. (2016). Promoting student metacognition through the analysis of their own debates: Is it better with text or with graphics? Educational Technology & Society, 19(4), 167–177.






ABSTRACT This paper presents a higher education experience aimed at explicitly promoting metacognitive processes in a social and collaborative context. Students carried out a debate on an e-forum, and were later asked to collaboratively analyse their own debates. The control group conducted this analysis using text-based tools; the experimental group analysed it with a graphical tool (“DebateGraph”). We examine the consequences of such experiences in promoting students’ metacognitive processes for argumentative competence, as well as its impact on content knowledge learning. The analysis yields different results depending on the perspective adopted: students’ self-assessment or instructor’s assessment. (p. 1)

Argumentation, developed either in monological or dialogical forms, is often related to the building of more functional and meaningful knowledge (Venville & Dawson, 2010). (p. 1)

With the emergence of the socio-cultural approach (Nussbaum, 2008; Vygotsky, 1978), social and dialogical forms of argumentation have gained importance. (p. 1)

Although debates are commonly developed using oral and written language, with the advent of Information and Communication Technologies (ICT) new forms of representing arguments are available. (p. 1)

In this paper, we present a study developed for higher education aimed at explicitly promoting metacognitive processes in a collaborative context. Teams of students collaboratively analysed their own previously held debates. They did these analyses using two different strategies: the control group had to perform the analysis using textual tools; the experimental group had to analyse it with a graphical tool (“DebateGraph”). (p. 1)

Argumentation and metacognition (p. 2)

Argumentation, conceived as a dialogic form of discussion, can be seen as a social activity that has two goals: first, to support one’s own position by providing evidence and favouring arguments; second, to undermine the opponent’s position by identifying and challenging weaknesses in their argument (Walton, 1998). (p. 2)

Metacognition can be defined as the cognition of cognition; it includes at least two main processes: knowledge of cognition and control over cognition (Flavell, 1979). (p. 2)

argumentation competence requiring three different “metaknowing” components (Rapanta et al., 2013):

- Metacognitive knowing: being aware of the appropriate knowledge to support and construct arguments (know-what); metacognitive knowing mainly refers to declarative knowledge (e.g., what concepts can I use to support my stance? What evidence is appropriate to support my argument?).
- Metastrategic knowing: knowing suitable strategies, in accordance with the task requirements, to construct arguments (know-how); metastrategic knowing refers to procedural knowledge, and involves understanding and awareness of the task requirements in order to select appropriate strategies (e.g., what procedures can I use to better fulfil the argument task? On what basis can I decide whether an idea is right or wrong?).
- Epistemic knowing: being aware of the consequences derived from the cognitive performance in an argumentative task (know-be); this involves knowing about knowledge, both in general and in relation to individuals (e.g., has the argument provoked any knowledge advance in participants? Is argumentation a good setting for solving mathematical problems in teams?). (p. 2)

Metacognition has mostly been examined as an individual process, rather than a social or distributed phenomenon among members of a group (Goos, Galbraith, & Renshaw, 2002). (p. 2)

Siegel (2012) broke group metacognition down into three components, with the group using these components to reduce the distance between their members’ positions:

- Metasocial awareness: the group identifies “who” has “what” expertise and resources;
- Monitoring understanding: the group manages to identify what they know on a public level; rendering “holes” in the group’s understanding visible is especially important;
- Monitoring process: the group manages to set goals and revise them according to the process they have followed. (p. 2)

Group metacognition is under-researched and its effects on teaching and learning are highly underappreciated (Anderson, Nashon, & Thomas, 2009). (p. 3)

useful for metadiscourse research and SLA research as well (p. 3)

Argumentation and knowledge representation (p. 3)

Most of these studies compared two conditions (i.e., using graphical representation of arguments versus textual representation), and generally yielded positive outcomes for the graphical condition. (p. 3)

Along similar lines, other studies have supported the use of visual technological features. Munneke, Andriessen, Kanselaar and Kirschner (2007) confirmed that students argue more thoroughly, both broadening the debate space and examining in greater depth, when they use a tool to represent arguments in a diagrammatic fashion. (p. 3)

Methods (p. 4)

This research used a case study approach. The case study is an appropriate method for researchers who want to attain a perceptive understanding of an instructional context, by seeking answers to descriptive and explicative questions (Yin, 2003). (p. 4)

Specifically, this study used a quasi-experimental design, with a multi-method approach to the data analysis. (p. 4)

The study took place on a Developmental Psychology course in a postgraduate programme (a Master’s degree for Second Language Education Teachers) in a public university in Barcelona, Spain. There were 56 students (43 female, 13 male, M = 25.6 years, SD = 5.9; range = 21-43). (p. 4)

Procedure, tasks, and instruments (p. 4)

In this study, we examined three different activities that occurred as part of the same unit: “Development of thinking in adolescence.” The first two activities were developed in collaborative teams of five or six members (n = 10); these teams were organised by the students themselves at the beginning of the course. The third activity was an individual exam. (p. 4)

In order to assess the virtual debate’s quality, the instructor took into account the quality of the ideas elaborated by the students, the argumentation and support of the ideas, and the accomplishment of the debate instructions. In accordance with those criteria, every debate was evaluated and scored using a three-point scale, where A was excellent, B was good, and C was acceptable/passing. (p. 4)

The second activity took place two months after the end of the virtual debate, in a two-hour face-to-face session. The aim of the second activity was to analyse the previous debate. Half of the groups (n = 5) were assigned to the experimental condition, where they used the tool DebateGraph. (p. 4)

In order to present their analysis, all the control groups used the text editor Microsoft Word. (p. 4)

The third activity was an individual exam, where the students were presented with an open question that assessed the conceptual knowledge discussed in the debate: “Explain the main features of adolescent thinking.” The answers were graded by the instructor on a 0-to-10-point scale. (p. 5)

Results (p. 7)

We then consider the cognitive consequences of such an experience, analysing whether there are differences between the experimental and control groups in terms of declarative knowledge content learning. To inform this, we consider the instructor’s assessment of the students’ exam answers. (p. 7)

We observe significant differences in the appraisal of the learning experience by the students. All the item scores show a statistically significant difference, with the exception of the item “awareness of development of new analytical skills”. (p. 7)

Analysis of the debate’s analysis products, as evaluated by the instructor, did not yield any significant differences. (p. 7)

Finally, ANOVA results do not show statistically significant difference between the groups in terms of their exam outcomes. (p. 7)

Discussion (p. 8)

Our study yielded some interesting results on how graphical and text-based tools promote the metacognitive skills involved in argumentation. With regard to the accounts of students themselves, gauged by our questionnaire, the students using the graphical tool reported a significantly worse appraisal of metacognitive abilities fostered in their debate analysis. (p. 8)

Very interesting! And meta – good to see negative results published by the journal. This increases my respect for this journal and my wish to send my work there. (p. 8)

Therefore, we may state that the students felt that their debate content is better appraised and acknowledged when using text-based tools to analyse it. (p. 8)

Know-how metacognition encompasses showing awareness of the debate task and assessing whether the procedures used for constructing arguments are valid or not. (p. 8)

Wondering whether the tool introduced extraneous cognitive load that led to unproductive argumentation. (p. 8)

Conclusions (p. 9)

In our study, students did not take advantage of using a graphical tool to enhance their learning while analysing their own previous debates. Students in the experimental group improved neither their metacognitive nor their cognitive processes in comparison with the text-based tool group. (p. 9)

A different way to think about the results? Were students not taking advantage, or was there not enough support to help them take advantage? Very interesting questions to raise. (p. 9)

The present study reminds us that while the tool (either digital or analogue) may be an important element of the educational activity, many other variables in the educational setting may interact with each other, and eventually play a role in learning processes. (p. 10)

an important point (p. 10)

Anderson, D., Nashon, S. M., & Thomas, G. P. (2009). Evolution of research methods for probing and understanding metacognition. Research in Science Education, 39(2), 181-195. (p. 10)

Chen, Y., Chen, N. S., & Tsai, C. C. (2009). The use of online synchronous discussion for web-based professional development for teachers. Computers & Education, 53(4), 1155-1166. (p. 10)

Nussbaum, E. M. (2008). Collaborative discourse, argumentation, and learning: Preface and literature review. Contemporary Educational Psychology, 33(3), 345-359. (p. 11)

Siegel, M. A. (2012). Filling in the distance between us: Group metacognition during problem solving in a secondary education course. Journal of Science Education and Technology, 21(3), 325-341. (p. 11)