References

Citekey: @Zhang2011c

Zhang, J. and Sun, Y. (2011). Quantified measures of online discourse as knowledge building indicators. In Spada, H., Stahl, G., Miyake, N., and Law, N., editors, Connecting computer-supported collaborative learning to policy and practice: CSCL2011 conference proceedings, volume 1, pages 72–79. ISLS, Hong Kong SAR, China.

Notes

Highlights

This secondary data analysis examined a set of social interaction measures (e.g., social network patterns), content measures (e.g., questions, ideas), and lexical measures (e.g., academic words, domain terms). (p. 1)

Choices of research measures for a given study are often made based on theoretical considerations of what such measures mean and imply. There is a need to examine and justify the importance of these measures to collaborative knowledge building through systematic empirical testing, which will further provide a stronger research base for initiatives to create automated analysis tools (e.g., Rosé et al., 2008). (p. 1)

Three types of quantified measures have been widely used: (a) Content analysis (Chi, 1997), using coding schemes to categorize the nature of responses, types of questions, depth of ideas, evidence use, argumentation patterns, and so forth (e.g., Baker et al., 2007; Hakkarainen, 2003; Hmelo-Silver, 2003; van Aalst & Chan, 2007; Weinberger & Fischer, 2006; Zhang et al., 2007); (b) Social interaction measures, focusing on contribution rate, reading rate, conversation threads (build-on trees), and social networks of who reads or responds to whose postings (Aviv, Erlich, Ravid, & Geva, 2003; de Laat, Lally, Lipponen, & Simons, 2007; Guzdial & Turns, 2000; Hewitt & Teplovs, 1999; Hewitt, Brett, & Peters, 2007; Howell-Richardson & Mellar, 1996; Zhang, Scardamalia, Reeve, & Richard, 2009); and (c) Linguistic markers of discourse, such as occurrences of epistemic words and domain-specific key terms in discourse (Hong & Scardamalia, 2008; Sun, Zhang, & Scardamalia, 2010). (p. 1)

Table 1: Measures of online knowledge building discourse. (p. 3)

- Social interaction measures: note contribution, note reading percentage, note reading network (in-degree and out-degree), note linking network (in-degree and out-degree), note linking network (cliques)

- Content measures: problems, personal ideas, information sources, evidence, inquiry threads

- Lexical measures: total words, 1st 1,000 words, academic words, domain-specific words

We used social network analysis to examine two types of social relations recorded by Knowledge Forum: (a) who read whose notes, with reading peers’ notes as a primary means of understanding knowledge advances and challenges of the community; and (b) who linked to whose notes (i.e., created build-ons, rise-aboves, or references), as an indicator of complementary and connected contributions. Three measures were included in this analysis: (a) in-degree, showing how many relational ties (e.g., reading, linking) a member received from peers, suggesting the level of his/her influence; (b) out-degree, measuring the number of relational ties one sent out to other peers as an indicator of his/her effort to understand and build on peer contributions; and (c) clique analysis, which identified sub-networks each member belonged to in the note linking network, as an indicator of community-wide dynamic collaboration. A clique is “a sub-set of a network in which the actors are more closely and intensively tied to one another than they are to other members of the network” (Hanneman, 2001, p. 79). (p. 3)
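
Not from the paper, but a minimal sketch of how these three measures (in-degree, out-degree, clique membership) could be computed from a Knowledge Forum note-linking log with networkx; the edge list and student IDs below are hypothetical stand-ins for the real log data.

```python
import networkx as nx

# Hypothetical "who linked to whose notes" log: an edge (a, b) means
# student a built on, rose above, or referenced a note by student b.
linking_edges = [("S1", "S2"), ("S1", "S3"), ("S2", "S1"),
                 ("S3", "S1"), ("S3", "S2"), ("S4", "S1")]

G = nx.DiGraph(linking_edges)

# (a) in-degree: ties received from peers, a proxy for influence
in_degree = dict(G.in_degree())
# (b) out-degree: ties sent to peers, a proxy for effort to build on others
out_degree = dict(G.out_degree())
# (c) cliques: densely tied sub-networks, found on the undirected version
#     of the linking network; count how many cliques each member belongs to
cliques = list(nx.find_cliques(G.to_undirected()))
memberships = {n: sum(n in c for c in cliques) for n in G}

print(in_degree)
print(out_degree)
print(memberships)
```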

Content analysis (Chi, 1997) was adopted to code: (a) questions identified by students in their notes (e.g., how do solar panels work?); (b) students’ personal ideas that presented their own theories and claims, often labeled as “My theory” (e.g., “If there is no light, there can’t be a shadow”); (c) information sources, introducing new information from readings, the Internet, the teacher, or parents (often labeled as “New information”) and using it to deepen understanding; (d) evidence, examining and deepening understanding using findings from experiments and observations; and (e) inquiry threads contributed to. (p. 4)

Increasing use of sophisticated, low-frequency words in free writing indicates growth of productive vocabulary and writing skills (Nation, 2001). Thus, lexical frequency analysis was employed to examine student use of three types of words in their online discourse: (a) The first 1,000 most frequent word families in English (West, 1953). Low-proficiency writers tend to rely more on these basic word families in writing; (b) A list of academic words, including 570 word families that are typical of academic discourse across disciplinary areas, enabling references to other authors and findings (e.g., assume, establish, conclude) and processing of data and ideas (e.g., analyze, assess, category) (Coxhead, 1998). Writers need to gain productive written control of the academic vocabulary in order to be recognized as a member of the academic discourse community (Corson, 1997); and (c) Domain-specific terms, which included 89 domain words related to light (e.g., names of optical concepts, devices, and phenomena) identified from the Ontario Curriculum (Sun et al., 2010). (p. 4)
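
A rough sketch of the token-counting step, not the authors' tooling: it assumes flat sets of word forms and exact matching, whereas the real analysis uses West's (1953) first 1,000 word families, Coxhead's (1998) 570 academic word families, and the 89 light-related domain terms, matched at the word-family level. The tiny word lists and the sample note are made up.

```python
import re

# Tiny placeholder word lists standing in for the three real lists.
FIRST_1000 = {"light", "see", "make", "water", "because", "there"}
ACADEMIC = {"assume", "establish", "conclude", "analyze", "assess", "category"}
DOMAIN = {"reflection", "refraction", "prism", "shadow", "lens"}

def lexical_profile(note_text: str) -> dict:
    """Count how many tokens in a note fall into each word list."""
    tokens = re.findall(r"[a-z']+", note_text.lower())
    return {
        "total_words": len(tokens),
        "first_1000": sum(t in FIRST_1000 for t in tokens),
        "academic": sum(t in ACADEMIC for t in tokens),
        "domain_total": sum(t in DOMAIN for t in tokens),
        "domain_unique": len(DOMAIN & set(tokens)),  # distinct domain terms used
    }

print(lexical_profile("I conclude there is refraction because light bends in the prism."))
```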

Several social interaction measures indicate productive discourse to achieve deep understanding, including the number of notes contributed, percentage of notes read, in-degree (being read by peers) and out-degree (reading peers’ work) in the note reading network, and in-degree (being built on by peers) and dynamic memberships in cliques (sub-networks) in the note linking network developed through build-ons, rise-aboves, and referencing citations of peer ideas. (p. 6)

Content-based discourse indicators associated with student deep understanding involve the number of notes contributing personal theories, identifying deepening problems, and incorporating new information sources, with student contributions to multiple inquiry threads strongly connected to the scope of their optical understanding. The number of notes reporting evidence is not significantly correlated to the depth of student understanding, possibly because this analysis only considered the frequency of evidence use. (p. 6)

Finally, all the lexical discourse measures have significant correlations with the depth of student understanding, including total words written, occurrences of academic words and domain-specific words (both total and unique words), and less frequent use of the 1st 1,000 English word families. Incorporating unique domain-specific words in the knowledge building discourse additionally suggests the expanding scope and breadth of inquiry. (p. 6)
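
For my own reference, a hedged sketch of the kind of correlational test behind findings like these; the per-student values are invented, not the study's data, and a Pearson correlation is assumed purely for illustration.

```python
from scipy.stats import pearsonr

# Invented per-student values, for illustration only.
unique_domain_words = [4, 7, 3, 9, 6, 11, 5, 8]   # e.g., unique domain terms used
understanding_depth = [2, 3, 1, 4, 3, 4, 2, 3]    # e.g., rated depth of understanding

# Correlate a discourse measure with the depth-of-understanding rating.
r, p = pearsonr(unique_domain_words, understanding_depth)
print(f"r = {r:.2f}, p = {p:.3f}")
```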
