read

References

Citekey: @Mazza2007a

Mazza, R., & Dimitrova, V. (2007). CourseVis: A graphical student monitoring tool for supporting instructors in web-based distance courses. International Journal of Human-Computer Studies, 65(2), 125–139. doi:10.1016/j.ijhcs.2006.08.008

Notes

This paper introduces CourseVis, a visual learning analytics system designed for WebCT. The design of CourseVis draws heavily on Information Visualization principles and maps (loosely) onto three identified aspects of learning: cognitive, social, and behavioural. A large portion of the paper reports on an evaluation study of the effectiveness, efficiency, and usefulness of CourseVis among instructors. The tool was shown to be useful and usable for instructors who want a bird's-eye view of their students and of the effectiveness of their course design.

(Visualizations in this tool focus on numbers and summary statistics, but less on nuanced aspects of learning, e.g., connecting social interactions with their temporal dimension.)

Highlights

CourseVis, a system that takes a novel approach of using Web log data generated by course management systems (CMSs) to help instructors become aware of what is happening in distance learning classes. (p. 1)

Several graphical representations are generated to help distance learning instructors get a better understanding of social, behavioural, and cognitive aspects related to learners. (p. 1)

The effective use of a CMS in distance education requires that instructors have a way to diagnose when a problem might arise or has arisen, so that they can take appropriate actions to prevent or overcome that problem. (p. 1)

A number of problems with using CMSs in distance learning have been reported, e.g. students may feel isolated due to the limited contact with the instructor and other students, can get disorientated in the course hyperspace, may lose their motivation, and often find it difficult to manage without appropriate institutional support (Galusha, 1997; Smith-Gratto, 1999; Hara and Kling, 2000; Rivera and Rice, 2002; Valentine, 2002). (p. 1)

However, tracking data is usually provided in a tabular format, is often incomprehensible, with a poor logical organization, and is difficult to follow. (p. 1)

Behavioural aspects: specific features of the students’ behaviour, for instance, course attendance, students performing very well or very badly, students who are progressing too fast or too slowly with the course schedule, etc. (p. 2)

Educational research shows that monitoring the students’ learning is an essential component of high-quality education, and is ‘‘one of the major factors differentiating effective schools and teachers from ineffective ones’’ (Cotton, 1988). (p. 2)

Constant feedback and guidance from the instructors is crucial in distance education (Ragan, 1998) where ‘‘the learner is impaired by the lack of casual contact with the teacher and other students’’ (Galusha, 1997). (p. 2)

true?
Wondering whether these claims are still true in the age of social media? (p. 2)

information visualization (IV) techniques (Tufte, 1990; Card et al., 1999; Spence, 2001) (p. 2)

The questionnaire included 17 questions, grouped into four categories: (p. 2)

Demographic questions addressed the type and duration of involvement in on-line courses, as well as the size and type of the classes. (p. 2)

Platform-related questions asked what technological platforms were used and what facilities were provided by these platforms. (p. 2)

Student and assessment-related questions asked what assessment techniques were utilized by the instructors, how assessment information was used in their teaching practice, and what information about students the instructors needed. (p. 2)

2. Gathering design requirements for CourseVis (p. 2)

The design of CourseVis was based on the results of a survey conducted to find out what information about distance students instructors may need when they run courses with a CMS, and to identify possible ways to help instructors acquire this information. (p. 2)

Feedback-related questions enabling the respondents to enter comments about the survey and to request a summary of the results. (p. 2)

Social aspects: interactions between students, interactions between students and the teacher, interactions between students and the system, and vicarious interactions. (p. 2)

alert
Social data may not be interpreted ‘socially’. A distinction made between socialized vs social learning. Something to keep in mind. (p. 2)

Cognitive aspects: the students’ overall course performance, their performance on selected quizzes or assignments, the performance on a specific topic of the course, etc. (p. 2)

2.2. Results (p. 2)

Social aspects of learners: The main tools used by the participants to engage students in on-line communicative activities were discussion forums (80%), e-mail (85%), and chat (56%). (p. 3)

Cognitive aspects of learners: The most popular assessment tools were quizzes, assignments, and group work, each used by between 65% and 76% of the participants. The respondents expressed a high interest in having information about the overall performance in the course (84%), as well as the level of knowledge achieved by each student for each domain concept of the course (60%). (p. 3)

Table 2 Design space for the discussion plot generated in CourseVis (p. 3)

Behavioural aspects of learners: Instructors use behavioural indicators to judge factors such as active learning, motivation, engagement, and, in general, to assess the success or failure of a distance course. (p. 3)

3.1. Visualizing social aspects of learners (p. 3)

2.3. Design requirements for CourseVis (p. 3)

The results of the survey led to key information requirements for CourseVis, shown in Table 1. (p. 3)

One representation produced in CourseVis to represent social aspects is a discussion plot in which the discussion board variables (originator, date, topic) are mapped onto the three dimensions of a 3D scatterplot. An additional dimension—the number of follow-ups in a discussion—is represented by the size of the sphere, as well as by using colour, see Table 2. (p. 3)
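
The mapping described here could be sketched roughly as follows. This is my own illustration, not the authors' code, and it assumes forum posts are available in a pandas DataFrame with hypothetical author, date, topic, and followups columns:

```python
# Sketch of a CourseVis-style discussion plot: originator, date, and topic on
# the three axes of a 3D scatterplot, with the number of follow-ups mapped to
# both marker size and colour. All column names and data are made up.
import pandas as pd
import matplotlib.pyplot as plt

posts = pd.DataFrame({
    "author":    ["s01", "s02", "s01", "s03"],
    "date":      pd.to_datetime(["2024-01-10", "2024-01-11", "2024-01-15", "2024-01-15"]),
    "topic":     ["arrays", "arrays", "loops", "recursion"],
    "followups": [4, 0, 2, 7],
})

# Encode originator and topic as categorical codes, and date as an ordinal number.
x = posts["author"].astype("category").cat.codes
y = posts["date"].map(pd.Timestamp.toordinal)
z = posts["topic"].astype("category").cat.codes

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
sc = ax.scatter(x, y, z,
                s=40 + 30 * posts["followups"],        # sphere size ~ number of follow-ups
                c=posts["followups"], cmap="viridis")  # colour ~ number of follow-ups
ax.set_xlabel("Originator")
ax.set_ylabel("Date")
ax.set_zlabel("Topic")
fig.colorbar(sc, label="Follow-ups")
plt.show()
```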

3. CourseVis (p. 3)

The instructor may perform operations, such as rotating and zooming, to manipulate the image (see Figs. 1 and 2). (p. 3)

To define the design spaces of the graphical representations in CourseVis we have followed a conventional method described in Card and Mackinlay (1997). CMS tracking data variables have been mapped to the following visual signals: space coordinates (X, Y, Z), a coordinate for time (T), retinal properties of colour (C), size (SZ), orientation (O), and shape (SH), and connection (–). (p. 3)

Another example representation generated in CourseVis to visualize social aspects is a discussion graph, which is a trivariate representation describing the number of threads started and the number of follow-ups received for each thread. (p. 4)

3.2. Visualizing cognitive aspects of learners (p. 4)

The students’ results from quizzes were identified in the design requirements as the main source used to assess the students’ understanding of the domain. (p. 4)

The domain can be seen as a collection of concepts {C1, …, Cn} where each concept Ci is composed of a list of pages {Pi1, …, Pih} belonging to the content material of the course and a set of questions {Qi1, …, Qik} selected among the quizzes in the course. (p. 5)

This simple approach allows CourseVis to link marks received on questions with concepts of the domain. (p. 5)
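
The linking idea can be illustrated with a minimal sketch under assumed data structures; the concept-to-question map and the marks below are made up, not taken from the paper:

```python
# Each concept is associated with a set of quiz questions, so a student's
# performance on a concept can be derived from the marks on those questions.
concept_questions = {
    "arrays":    ["q1", "q2"],
    "loops":     ["q3"],
    "recursion": ["q4", "q5"],
}

marks = {  # marks[student][question] in [0, 1]; illustrative values only
    "s01": {"q1": 1.0, "q2": 0.5, "q3": 0.0, "q4": 1.0, "q5": 0.5},
    "s02": {"q1": 0.0, "q2": 0.5, "q3": 1.0, "q4": 0.5, "q5": 0.0},
}

def concept_performance(student: str, concept: str) -> float:
    """Average mark of `student` over the questions linked to `concept`."""
    qs = concept_questions[concept]
    return sum(marks[student].get(q, 0.0) for q in qs) / len(qs)

for s in marks:
    print(s, {c: round(concept_performance(s, c), 2) for c in concept_questions})
```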

In the cognitive matrix, the students are mapped onto the x-axis and the concepts are mapped onto the y-axis of a matrix. The performance values are mapped onto the colour of the mark corresponding to a student and a concept, which is represented by a square. (p. 5)

The matrix enables global analysis of the overall student performance on the course topics and comparison between topics and individual students. This can promote the instructors’ reflection on their practice. (p. 5)
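
Continuing the hypothetical data from the sketch above, a cognitive-matrix-style view could be drawn as a simple heatmap (my own approximation, not CourseVis code):

```python
# Students on the x-axis, concepts on the y-axis, performance mapped to colour.
import numpy as np
import matplotlib.pyplot as plt

students = sorted(marks)                       # x-axis: students
concepts = list(concept_questions)             # y-axis: concepts
perf = np.array([[concept_performance(s, c) for s in students] for c in concepts])

fig, ax = plt.subplots()
im = ax.imshow(perf, cmap="RdYlGn", vmin=0, vmax=1, aspect="auto")
ax.set_xticks(range(len(students)))
ax.set_xticklabels(students)
ax.set_yticks(range(len(concepts)))
ax.set_yticklabels(concepts)
fig.colorbar(im, label="Performance")
# perf.mean(axis=1) would give the per-concept means plotted in the cognitive graph.
plt.show()
```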

For cognitive aspects of learners, CourseVis generates two representations—a cognitive matrix (see Fig. 4) and a cognitive graph (see Fig. 5). (p. 5)

3.3. Visualizing behavioural aspects of students (p. 6)

Every time a student accesses the course, CMSs register the date, time and the duration. We take advantage of this information to create the student accesses plot which summarizes the overall students’ attendance in the course. (p. 6)

The global accesses of all students on day D is calculated as the sum of the number of accesses of every student on that day. (p. 6)
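
The computation is straightforward; a small sketch with made-up log fields (student, timestamp) shows the idea:

```python
# Per-student daily access counts, then summed across students to get the
# global accesses on each day, as described above.
import pandas as pd

log = pd.DataFrame({
    "student": ["s01", "s02", "s01", "s03", "s02"],
    "timestamp": pd.to_datetime([
        "2024-01-10 09:12", "2024-01-10 14:03", "2024-01-11 10:30",
        "2024-01-11 11:45", "2024-01-12 08:20",
    ]),
})

log["day"] = log["timestamp"].dt.date
per_student_daily = log.groupby(["day", "student"]).size()   # accesses per student per day
global_daily = per_student_daily.groupby(level="day").sum()  # summed across students per day
print(global_daily)
```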

The list of variables related to the students’ performance on a concept and their mappings are given in Table 5. This bivariate data can be represented with a cognitive graph relating the two variables, as shown in Fig. 5. Course concepts are mapped onto the x-axis and the performance mean is mapped onto the y-axis. (p. 6)

the graph allows a quicker and more precise evaluation of problematic concepts. (p. 6)

Other behavioural aspects are covered in a student behaviour graph. (p. 6)

Accesses to content pages by topics (p. 7)

Global accesses to the course: (p. 7)

4. Evaluation with instructors (p. 8)

An empirical evaluation of CourseVis was conducted focusing on:
Effectiveness: Can the tool help instructors gain some understanding of what is happening in distance classes?
Efficiency: Can instructors infer required information quickly?
Usefulness: To what extent is the information provided useful for the instructors? (p. 8)

a focus group, an experimental study, and a semi-structured interview. (p. 8)

4.1. Focus group (p. 8)

Progress with the course schedule (p. 8)

Messages (p. 8)

Quiz and assignment submission (p. 8)

See Mazza (2004) for a detailed description of CourseVis and more examples of graphical representations. (p. 8)

The predominant criticism of the discussion plot (see Figs. 1 and 2) was the difficulty in locating the exact position of the spheres on the axes. (p. 9)

the rotation of the scatterplot was considered confusing (p. 9)

The participants were very positive about the discussion graph (see Fig. 3) due to its clarity and simplicity. (p. 9)

The participants commented that the cognitive matrix (see Fig. 4) could be used to monitor anomalous situations (p. 9)

The participants stressed that the cognitive graph (see Fig. 5) may help instructors to quantify the difference between concepts, but that the cognitive matrix was preferred, because it allowed ‘‘a quicker glance’’ to compare concepts and individual students’ performance at the same time. (p. 9)

All participants would use the student accesses plot (see Fig. 6) to monitor the students’ accesses and identify early problems. (p. 9)

The participants were asked to pretend that they were the instructors of the course and to get as much information as possible about the students. Both groups were provided with instructor access to the course. (p. 9)

Most of the participants found the student behaviour graph (see Fig. 7) confusing and difficult to follow. (p. 9)

However, the access to content pages by topics and the global access to the course were regarded as very useful. (p. 9)

4.2. Experimental study (p. 9)

4.3. Semi-structured interviews (p. 10)

Social aspects: Five of the six participants found it interesting to have the information on the number of threads opened and the follow-up received. (p. 10)

However, the discussion graph was preferred for its clarity and simplicity, while the discussion plot was judged helpful for the relationship of discussions with the dates. (p. 10)

Cognitive aspects: All participants considered the information provided by the graphical representations on cognitive aspects useful. (p. 11)

Behavioural aspects. The student accesses plot was considered useful by most participants (p. 11)

However, it was stressed that this representation did not express the real engagement of a student in the course, for example, a student may print all the course materials and study on his own without accessing the course for several days. (p. 11)

4.4. Discussion (p. 11)

Improvements have been suggested, such as providing flexible links between the graphical representations, developing easy connections between graphics and corresponding data from CMS (e.g. a link from a discussion graph to postings in the discussion forum), and enabling instructors to say what data should be included/excluded from graphics and what elements should be highlighted (e.g. important dates). (p. 12)

CourseVis would also be greatly improved by providing direct access from the visualizations to the details that make up a particular aspect of a plot, e.g. opening a discussion thread, bringing up all the assignments related to arrays, etc. (p. 12)

4.4.5. Problems with the graphical representations and limitations of the evaluation studies (p. 12)

5. Related work (p. 12)

The problems were related to missing values, confusions with rotation, and difficulty in examining too many variables at the same time. (p. 12)

For example, Reffay and Chanier (2002) utilize the Graphviz (2004) visualization tool to monitor group communications in order to help instructors detect collaboration problems. (p. 13)
