The evidence in favor of active student engagement is overwhelming: learning outcomes are higher in active than in passive learning environments. Many instructors use active learning techniques; however, the actual level of each student’s engagement throughout a class session isn’t readily apparent. The data collection required to verify student engagement has traditionally been time-consuming and poorly standardized: manual coding of videos, questionnaires or pre- and post-surveys, quizzes, and interviews.
Online learning removes some of these limitations: platforms can automatically record classes, generate transcripts, and log instances of students speaking, editing documents, and contributing to text chat. These new opportunities to quantify engagement challenge us to consider what we are measuring and how we should use this information.
At Minerva University, for example, a classroom on its digital platform, Forum™, supports multifaceted engagement, with verbal, written, and visual elements in both the main classroom and breakout rooms. Active learning is further supported by interactive learning resources, including collaborative workbooks, whiteboards, and polls.
After a class session ends, Forum provides metrics for overall student and instructor talk time, reactions (emojis), hand raises, and chats, as well as each student's talk time and talk-time history in breakouts and the main classroom. A single class session yields hundreds of measurements of student engagement. While some platforms provide similar metrics, many others focus on transcripts, poll responses, message boards, and click-through tracking of engagement with course materials. Minerva instructors use these in-class engagement metrics as a powerful tool to identify students who may be struggling to participate and to examine how well we include all our students in active learning.
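To illustrate how event-level logs like these could be turned into something actionable, here is a minimal sketch in Python. The event format, field names, and the flagging threshold are all illustrative assumptions, not Forum's actual data model or algorithm; the idea is simply to aggregate per-student talk time and surface students whose share falls well below an even split.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class EngagementEvent:
    """Hypothetical engagement record; field names are illustrative,
    not the schema of Forum or any other specific platform."""
    student_id: str
    event_type: str                # e.g., "spoke", "chat_message", "hand_raise"
    duration_seconds: float = 0.0  # nonzero only for timed events such as speaking

def flag_low_participation(events, threshold_ratio=0.5):
    """Flag students whose total talk time is below threshold_ratio of an
    even split of the class's talk time. A simple heuristic for illustration."""
    talk_time = defaultdict(float)
    for e in events:
        if e.event_type == "spoke":
            talk_time[e.student_id] += e.duration_seconds

    total = sum(talk_time.values())
    if not talk_time or total == 0:
        return []

    even_share = total / len(talk_time)
    return [s for s, t in talk_time.items() if t < threshold_ratio * even_share]

# Toy data: three students, one of whom barely spoke.
session = [
    EngagementEvent("s01", "spoke", 120.0),
    EngagementEvent("s02", "spoke", 15.0),
    EngagementEvent("s02", "chat_message"),
    EngagementEvent("s03", "spoke", 95.0),
]
print(flag_low_participation(session))  # ['s02']
```

In practice, an instructor would pair a heuristic like this with the qualitative context of the session, since low talk time in a single class meeting is not by itself evidence of disengagement.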