Bodong Chen

Crisscross Landscapes

Notes: Gravel2017-yf: Integrating Computational Artifacts into the Multi-representational Toolkit of Physics Education

2018-04-23


References

Citekey: @Gravel2017-yf

Gravel, B. E., & Wilkerson, M. H. (2017). Integrating Computational Artifacts into the Multi-representational Toolkit of Physics Education. In D. F. Treagust, R. Duit, & H. E. Fischer (Eds.), Multiple Representations in Physics Education (pp. 47–70). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-58914-5_3

Notes

Summarize:

A piece presenting two case studies of computational artifacts in two communities – one group of scientists, and one community of grade 5 students – and how each used computational representations in its own work. Three major themes, or discourse moves, were identified from analyses of their discourse.

The tool, SiMSAM, focuses more on representation than on simulation, in my opinion. This is quite smartly done. Of course, one could argue that simulation is covered by students' explanations of their representations.

Assess:

I would appreciate more elaboration of the term 'computational artifacts', because earlier literature would treat a computer itself as a computational artifact as well.

Reflect:

Highlights

Computational artifacts such as simulations and visualizations are important representational tools in physics and physics education. (p. 1)

Instead, computational artifacts are constructed, used, and adapted over time by particular learning communities for particular purposes. Community members must negotiate how such artifacts should be understood as representations that can describe and uncover particular aspects of scientific phenomena. (p. 1)

In this chapter, we address the question: How does a learning community integrate a particular computational artifact into the shared multi-representational toolkit they use for communicating and reasoning about scientific phenomena? (p. 1)

In both cases, development of a shared understanding of the computational artifact involved (1) Working to articulate the representational meaning(s) of the artifact and its connection to other more familiar representations; (2) Using shared language about the artifact to focus attention on the causal mechanisms describing the phenomenon of interest; and (3) Noting limitations of the representational artifact and its computational architecture. (p. 2)

Our findings suggest this process is nontrivial, and critical for computational models to be deeply and meaningfully integrated into classroom-level scientific activity. (p. 2)

3.1.1 Computational Representations in Scientific Practice (p. 2)

Representational tools, thus, play a central role in how knowledge is generated, expressed, and shared to construct the “language of communication” for the ideas relevant to that community (Noss et al. 2007, p. 381). The situation is no less true in science classrooms, where the growing use of multiple representations can fundamentally shape how learners interact with one another and co-construct knowledge (Jewitt et al. 2001; Prain and Waldrop 2010). All of this activity requires an understanding of how learners interpret, construct, and negotiate meaning across these various representational resources (Jewitt 2008). (p. 3)

Some of the most powerful and ubiquitous modes of representation in physics are computational (Thijssen 1999). (p. 3)

exactly how computational artifacts are meant to serve as representations in science practice is unclear, and varies from community to community (Grüne-Yanoff and Weirich 2010). (p. 3)

3.1.2 Computational Representations in Science Education (p. 3)

Stratford et al. (1998) documented what they called “Cognitive Strategies for Modeling”—analysis, relational reasoning, synthesis, testing and debugging, and making explanations—that they argue students engage in while building dynamic computational models. (p. 4)

However, less work examines how such artifacts might afterward be used at the classroom level to support collective argumentation and knowledge construction. (p. 4)

Berland and Reiser (2011) found that some middle school students blurred the distinction between inferences and evidence when engaged in scientific argumentation using a computer simulation of ecosystem dynamics. (p. 4)

Hmelo-Silver et al. (2015) described how two teachers engaged their students differently in simulation-mediated inquiry. They found that one teacher, Mr. Fine, encouraged students to explicitly reason through what particular features of the simulation were meant to represent. (p. 4)

3.1.3 Computational Representations as Distributed (p. 4)

We are interested in studying communities in which members construct their own computational artifacts, and in the ways those artifacts then become understood, shared, and integrated into the representational toolkit of the community as a whole. (p. 4)

Distributed representations are “… created and used in the cooperative practices of persons as they engage with natural objects, manufactured devices, and traditions, as they seek to understand and solve new problems” (Osbeck and Nersessian 2006, p. 144). (p. 5)

involving two notions they termed cognitive partnering and representational coupling. Cognitive partnering involves forming links across people and artifacts in order to allow or sustain sense-making practice. For example, researchers may note that they are building on colleagues’ prior ideas or work. (p. 5)

Representational coupling involves articulating relations across multiple representational resources, so that those resources form systems that can be used as models for reasoning. (p. 5)

a subset of members of a team create a computational artifact. The artifact is to be used meaningfully by the wider team, with the intention of moving forward the collective work. (p. 5)

developing shared epistemic and representational practices to move forward students’ work (Enyedy 2005; Greeno and Hall 1997). (p. 5)

3.2 Research Design (p. 5)

This study was conducted as part of the NSF-funded project entitled SiMSAM: Bridging Student, Scientific, and Mathematical Models with Expressive Technologies (henceforth the SiMSAM Project; IIS-12172100). (p. 5)

We did this by developing and researching how students use a simulation construction toolkit (henceforth SiMSAM for Simulation, Measurement, and Stop-Action Moviemaking; Wilkerson-Jerde et al. 2015), and by consulting with, and studying the behavior of, professional scientists who use computational modeling in their own work. (p. 5)

Our research design is most closely aligned with an interpretivist paradigm (p. 5)

3.2.2 Professional Scientists: The LCD Research Group (p. 6)

The group sought to model the behavior of liquid crystal structures, which could in turn inform the development of faster and more energy-efficient liquid crystal displays. Their work involved extending established 1-dimensional and 2-dimensional models of liquid crystal behavior to more complex, multi-dimensional cases. (p. 6)

We collected data during a meeting early in the collaboration. (p. 6)

3.2.3 5th Grade Science Class: The Evaporation and Condensation Lesson (p. 7)

During the activities, middle school students use computer-based animation and simulation tools to construct models of “experiential unseens” (Gravel et al. 2013, p. 165), such as smell diffusion or air pressure. (p. 7)

The two-week activity was designed to focus on evaporation and condensation as related to the particulate nature of matter. For the first week, students addressed the question “When you take a cold bottle of soda out of the fridge, why does it get wet after some time?” and for the second, “What happens to puddles on a hot day?”. During both weeks, students were first invited to discuss their theories as a class and (p. 7)

to create drawn models. Next, they worked in groups of 2–3 with SiMSAM and a desk-mounted camera to create stop-motion animations using common craft materials. Students could then crop images from their animations, which became programmable entities. Finally, they used these entities to create computational simulations representing the processes of condensation or evaporation using simple, programming-by-demonstration and menu-based programming options (Fig. 3.2). (p. 8)

Instead, it is students’ progress and ways of revising their generated models to become more coherent, explanatory, and predictive that are the focus of our work. (For a detailed account of the motivations for this sequence and ways in which learners iteratively present and revise ideas, see Wilkerson-Jerde et al. 2015.) (p. 8)

3.2.4 Analysis (p. 9)

Our guiding “quintain” (shared phenomenon of interest across instances; Stake 2006, p. 4) was the uptake of a particular computational artifact as a shared representational tool. (p. 9)

Next, we analyzed the video segments as holistic single cases (Yin 2009), working to understand each independently. We did this through a process of iterative, collaborative viewing (Jordan and Henderson 1995) during which we took notes and divided episodes into descriptive segments based on major themes. (p. 9)

Over the course of this repeated viewing, we identified three distinct phases that constituted each case’s development of a shared representational understanding. For both groups, three descriptive segments or “phases” emerged in which participants were: (1) developing a shared understanding of the computational artifact as a representational tool; (2) leveraging the artifact to focus attention on their respective goals; and (3) discussing strengths and limitations of the computational environment relative to those goals. (p. 9)

3.3 Case 1: Modeling Liquid Crystal Displays (p. 10)

3.3.1 Episode 1 – “It’s Just Gonna Lie Down?” (p. 10)

Fig. 3.3 Representational elements from Case 1, Episode 1. R1 shows the example that Ian first presents containing the particular parameters modeled and the 3D plots produced by the computational scripts in Mathematica. This is the central computational artifact for Episodes 1 and 2 of Case 1 (p. 10)

plots, generated by the script written in Mathematica, serve as the central computational artifact for the discussion. (p. 11)

Following Ian’s initial description of Example 1, Peter re-articulates what he understood the plot to represent. He used a similar gesture to Ian’s (p. 12)

Throughout this episode, Ian connected the computational model he was introducing to a large collection of representational forms—including gesture, sketches, and 2D plots. (p. 12)

3.3.2 Episode 2 – “There’s Kind of a Funny Bump” (p. 12)

The team’s rapid navigation and critique of this new plot reflects their increasing comfort with the computationally-generated plots as representations of the behavior of liquid crystals–another example of Phase 2 (p. 14)

3.4 Case 2: Modeling Condensation and Cloud Formation (p. 15)

Our second case study is drawn from the ninth day of the modeling activities done in one of Mr. Arbor’s 5th grade classrooms. (p. 15)

What happens to puddles on a hot day? (p. 15)

A simulation constructed by Sergio, Luis and Ryan was projected on a screen at the front of the room. The simulation featured puddles located at the bottom of the screen, small blue objects the group identified as “water droplets” positioned immediately above those puddles, and clouds at the top of the screen (Fig. 3.5A). The students programmed the simulation so that when run, the water particles moved upward in a somewhat random path toward the clouds. (p. 15)

3.4.1 Episode 1: “What Do We Think About This Representation?” (p. 15)

Fig. 3.5 The student-generated simulation projected on the Smartboard. In the simulation, “water droplet” objects positioned near puddle objects (A) move toward cloud objects. When a droplet intersects a cloud, it disappears and a new cloud appears near the site of intersection (B) (p. 15)

In this short exchange, students begin to make connections between the computational artifact—the SiMSAM simulation and its constituent symbols, behaviors, and interactions—and the phenomenon it is meant to represent. (p. 16)

Phase 2, and focus students’ attention on a key question about the phenomenon: “when there’s evaporation… does it form its own new clouds, or does it add on to the clouds that are already here?” (p. 16)

3.4.2 Episode 2 – “Maybe You Could Have a Color Option” (p. 17)

we redirected the conversation to see if there were other representational features or elements students wanted, but were unable to add to their simulations. This initiated Phase 3–an explicit conversation about the limitations of the modeling tool and whether it satisfactorily served the students’ representational goals. (p. 17)

Rather than producing a second cloud when a water droplet collides with the existing cloud, Kenny wanted to make clouds become darker in color. (p. 18)

The LCD Research Group and 5th Grade Science Classroom we report on in this chapter are quite different learning communities. (p. 18)

It is nice to present these two cases in one chapter. We can think further about bridging them, temporally or spatially: for example, connecting students with scientists now, or thinking about those students 20 years later. (p. 18)

In this section, we draw comparisons between the processes and practices we found in the two cases, and identify specific discursive moves that marked ways in which members of each community begin to treat their respective computational artifacts as representational tools in service of their different goals. (p. 18)

3.5.1 From Making Sense to Making Use of Computational Artifacts as Representations (p. 18)

Upon further analysis, we found these different phases involved three types of discursive moves practiced by both the professionals and the students. These are described in Table 3.1. (p. 19)

Meta-representational talk refers to instances where participants established explicit links between elements of the computational artifacts and aspects of the phenomenon that they are working to understand. (p. 19)

It was also the means by which they developed a shared language around that artifact, such that it could then become a tool for thinking and an object of critique. We view critique as a meta-representational tool (diSessa and Sherin 2000) used to position the artifacts as a useful contribution, but also incomplete, malleable, and fallible. (p. 19)

Building on these publicly-established and shared understandings and language, participants then focused their attention on more specific causal mechanisms related to the phenomenon under study. They began to articulate the mechanisms that (p. 19)

Table 3.1 Discursive moves practiced as professionals and students worked to make sense and make use of computational artifacts as representations of physical phenomena (p. 20)

linked cause and effect, and questioned how these mechanisms were represented within and extended beyond the artifact itself. (p. 20)

As they developed this understanding, they made suggestions for how to extend the architecture to accommodate their epistemic goals. The professionals recognized the need to extend or redevelop the numerical solvers needed to model liquid crystals (p. 20)

3.5.2 Understanding the Representational Toolkit of Physics and Physics Education (p. 21)

Computational tools are an integral part of the toolkit of physics and researchers are calling for increased integration of computational tools in physics education. (p. 21)

With both learning communities, the integration of these computational environments into the toolkit of physics is deliberate, explicit, and effortful. (p. 21)

Chandrasekharan, S., & Nersessian, N. J. (2014). Building cognition: The construction of computational representations for scientific discovery. Cognitive Science. Advance online publication. doi:10.1111/cogs.12203. (p. 22)

diSessa, A. A., & Sherin, B. L. (2000). Meta-representation: An introduction. The Journal of Mathematical Behavior, 19(4), 385–398. (p. 22)

Ergazaki, M., Zogza, V., & Komis, V. (2007). Analysing students’ shared activity while modeling a biological process in a computer-supported educational environment. Journal of Computer Assisted Learning, 23(2), 158–168. doi:10.1111/j.1365-2729.2006.00214.x. (p. 23)

Greca, I. M., Seoane, E., & Arriassecq, I. (2014). Epistemological issues concerning computer simulations in science and their implications for science education. Science & Education, 23(4), 897–921. doi:10.1007/s11191-013-9673-7. (p. 23)

Greeno, J. G., & Hall, R. P. (1997). Practicing representation: Learning with and about representational forms. Phi Delta Kappan, 78(5), 361–367. (p. 23)

Grüne-Yanoff, T., & Weirich, P. (2010). The philosophy and epistemology of simulation: A review. Simulation and Gaming, 41(1), 20–50. doi:10.1177/1046878109353470. (p. 23)

Hmelo-Silver, C. E., Liu, L., Grey, S., & Jordan, R. (2015). Using representational tools to learn about complex systems: A tale of two classrooms. Journal of Research in Science Teaching, 52(1), 6–35. doi:10.1002/tea.21187. (p. 23)

Lehrer, R., & Schauble, L. (2000). Developing model-based reasoning in mathematics and science. Journal of Applied Developmental Psychology, 21(1), 39–48. (p. 23)

Noss, R., Bakker, A., Hoyles, C., & Kent, P. (2007). Situating graphs as workplace knowledge. Educational Studies in Mathematics, 65(3), 367–384. (p. 23)

Osbeck, L. M., & Nersessian, N. J. (2006). The distribution of representation. Journal for the Theory of Social Behavior, 36(2), 141–160. doi:10.1111/j.1468-5914.2006.00301.x. (p. 24)

Stratford, S. J., Krajcik, J., & Soloway, E. (1998). Secondary students’ dynamic modeling processes: Analyzing, reasoning about, synthesizing, and testing models of stream ecosystems. Journal of Science Education and Technology, 7(3), 215–234. (p. 24)

White, B. Y., & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3–118. doi:10.1207/s1532690xci1601_2. (p. 24)

Wilkerson-Jerde, M. H., Gravel, B. E., & Macrander, C. A. (2015). Exploring shifts in middle school learners’ modeling activity while generating drawings, animations, and simulations of molecular diffusion. Journal of Science Education and Technology, 24(2-3), 204–251. doi:10.1007/s10956-014-9497-5. (p. 24)

Windschitl, M., Thompson, J., & Braaten, M. (2008). Beyond the scientific method: Model-based inquiry as a new paradigm of preference for school science investigations. Science Education, 92(5), 941–967. (p. 24)

Winsberg, E. (1999). Sanctioning models: The epistemology of simulation. Science in Context, 12(2), 275–292. (p. 24)