The evidence in favor of active student engagement is overwhelming: learning outcomes are higher in active than in passive learning environments. Many instructors use active learning techniques; however, the actual level of each student’s engagement throughout a class session isn’t readily apparent. The data collection required to verify student engagement has traditionally been time-consuming and poorly standardized: manual coding of videos, questionnaires or pre-/post-surveys, quizzes, and interviews.
Online learning represents a tremendous departure from some of these limitations: platforms can automatically record classes, generate transcripts, and log instances of students speaking, editing documents, and contributing to text chat. These new opportunities to quantify engagement challenge us to consider what we are measuring and how we should use this information.
At Minerva University, for example, classes on its digital platform, Forum™, support multifaceted engagement, with verbal, written, and visual elements in the main classroom and in breakouts. Active learning is further supported with interactive learning resources, including collaborative workbooks, whiteboards, and polls.
After a class session ends, Forum provides metrics for overall student-instructor talk time, reactions (emojis), hand-raises, and chats, as well as each student’s talk time and talk-time history in breakouts and the main classroom. A single class session yields hundreds of measurements of student engagement. While some platforms include similar metrics, many others focus on providing transcripts, poll responses, message boards, and click-through tracking of engagement with course materials. Minerva instructors use these in-class engagement metrics as a powerful tool to identify students who may be struggling to participate and to examine how well they are including all students in active learning.
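To make the idea concrete, here is a minimal sketch of how an instructor might post-process per-student metrics of this kind to flag low participation. The data structure, field names, and threshold are illustrative assumptions, not Forum's actual export format or API; the heuristic (talk time well below the class average, with no chat activity) is one plausible rule among many.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class StudentSession:
    """Hypothetical per-student engagement record for one class session."""
    name: str
    talk_seconds: float
    chat_messages: int
    hand_raises: int
    reactions: int

def flag_low_participation(sessions, talk_ratio=0.5):
    """Flag students whose talk time is below half the class average
    and who also posted no chat messages."""
    avg_talk = mean(s.talk_seconds for s in sessions)
    return [s.name for s in sessions
            if s.talk_seconds < talk_ratio * avg_talk
            and s.chat_messages == 0]

# Illustrative data, not real Forum output
roster = [
    StudentSession("Ana", 320, 4, 2, 5),
    StudentSession("Ben", 40, 0, 0, 1),
    StudentSession("Chen", 280, 2, 1, 3),
]
print(flag_low_participation(roster))  # → ['Ben']
```

A rule like this is a starting point for conversation, not a verdict: a quiet student may be engaging through the workbook or whiteboard, which is why combining multiple metrics matters.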