Going beyond what is visible: what multichannel data can reveal about interaction in the context of collaborative learning?


Abstract

Progress in the development of technology has provided data-capturing devices that make it possible to identify detailed processes in collaborative learning. This study utilized multichannel data, namely physiological data, video observations, and facial recognition data, to explore what they can reveal about types of interaction and the regulation of learning during different phases of the collaborative learning process. Participants were five groups of three members each, selected for further study from an initial set of 48 students. The collaborative task was to design a healthy breakfast for an athlete. Empatica sensors were used to capture episodes of simultaneous arousal, and video observations were used to contextualize working phases and types of interaction. Facial expression data were created by post-processing the video-recorded data. The results show that simultaneous arousal episodes occurred throughout the phases of collaborative learning, and learners displayed the most negative facial expressions during these episodes. Most of the collaborative interaction during simultaneous arousal was low-level, and regulated learning was not observable. However, when the interaction was high-level, markers of regulated learning were present; when the interaction was confused, it included monitoring activities. This study represents an advance in testing new methods for the objective measurement of social interaction and regulated learning in collaborative contexts.
