
Conducting Research on the Digital Learning Platforms

Introduction

SEERNet includes five digital learning platforms (DLPs) that connect researchers to millions of teachers and learners across US K-12 schools and post-secondary institutions. Each DLP has its own users and capabilities for conducting research; this site is a central resource for researchers looking to understand what each platform offers.

Guiding Questions

This guide aims to answer the following set of questions that researchers might ask when considering each platform:

  • What types of research questions can the platform be used to answer? 
  • What are examples of experimental contrasts?
  • What types of study designs are supported? For experimental platforms, what levels of random assignment are supported?
  • Are there processes for vetting researchers or research studies? What researcher prerequisites are needed before beginning a study?
  • Does the platform require pre-registration, and what/whose IRB approvals are necessary?
  • Who is the potential user base of the platform? What data is available on potential participants?
  • Does the platform assist with recruitment? 
  • What types of data are routinely collected by the platform? Can these be linked to external data? Are there internal measures of prior achievement? Does the platform accommodate external measures? What are the potential outcome measures?
  • Does the platform have analytic capabilities?
  • How might pre-registration/IRB requirements affect analysis, reporting, or dissemination?

Recommended Use

This guide highlights the capabilities of each digital learning platform for external researchers interested in conducting experimental studies. The table below summarizes the similarities and differences between the DLPs at each stage of the research process; more detailed information on each platform, as well as possibilities for non-experimental studies or other collaborations, can be obtained from the platforms themselves. As network lead, SEERNet welcomes questions and feedback from interested partners; stay tuned for more content and future opportunities!

Comprehensive View

| | E-Trials (ASSISTments) | UpGrade (MATHia) | Terracotta (Canvas) | Kinetic (OpenStax) | Learning at Scale (ASU) |
| --- | --- | --- | --- | --- | --- |
| User Population | K-12 math students using an OER math curriculum | Grade 6-12 math students and teachers using MATHia | Students in grades 6-16 using Canvas | Post-secondary students, preferably using OpenStax textbooks | Post-secondary online ASU students |
| Research Questions | Effectiveness of student supports for math learning | Improvements to student learning from alternative presentations of material; also motivational and related improvements due to design, messaging, etc. | Impact of learning activities and assignment context within Canvas on student mindset and performance | Learner characteristics and their influence on behavior, performance, and psychosocial constructs | A wide range of questions about learning in credit-bearing courses, using long- and short-term student performance data and various student demographics |
| Pre-Registration/Vetting | Pre-register on OSF.io | Verify feasibility of the intervention with the Carnegie Learning design team and an interested district, including completing a pre-registration form | No formal vetting process | Pre-registration on OSF.io recommended | Vetted by the ASU Provost's Office |
| IRB Requirements | Normal educational practice is covered by the existing ASSISTments IRB; external researchers obtain their own IRB approval to receive data | Researchers use their own IRB (if needed) | Researchers use their own IRB | Researchers submit to the Rice IRB | ASU IRB; researchers use their own IRB |
| Recruitment | No recruitment necessary; all users are eligible. Study timing depends on when teachers assign the problems, as determined by curriculum order and time of year | Researchers recruit school(s)/district(s) from MATHia's customer base, and may need to initiate data-sharing agreements with those districts | Teachers (at institutions where Terracotta has been integrated) recruit students to participate; if the researcher is not a teacher, the researcher recruits teachers to participate | Students opt in; incentivized; institutional partnerships | Depends on whether the study uses existing data or implements new interventions/surveys |
| Randomization | Student-level random assignment | Individual or group random assignment (class, teacher, school, district) | Student-level random assignment and student/assignment-level randomization (within-subject crossovers) | Student-level random assignment | Individual- or group-level randomization, depending on the research question |
| Intervention | Set of student supports for one or more problems | Alternate unit of instruction/activity in MATHia; messaging, hints, presentation, and design features | Assignments | Open-ended, based on the capabilities of Qualtrics | Interventions/surveys at the individual or group level, depending on the research question |
| Prior Achievement/Demographic Data | Class/group membership, school/class-level contextual data, prior ASSISTments achievement | Class/group membership, prior MATHia achievement | Existing data within the Canvas course site (gradebook, activity, assignments), and any student-level data added by the teacher | Learner characteristics collected by Kinetic across studies | Data warehouse will contain demographic, achievement, and course activity data |
| Outcome Measures | Performance on Similar-but-Not-the-Same (SNS) problems | MATHia process measures, performance, and survey measures | Canvas gradebook, activity, assignments, data added by teacher | Researcher-administered measures in Qualtrics; in future versions, connections to institutional data (course grades, etc.) | Course activity, grades, persistence/graduation |
| Analysis | Data export, posted to OSF.io | Data export | Data export; possible analysis tools | Secure data enclave allows researchers to analyze the full dataset without access to PII | Data warehouse |
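The distinction between individual and group random assignment in the table above can be sketched in a few lines of Python. This is only an illustration under assumed names (the roster, student IDs, and function names are hypothetical); each platform handles actual assignment internally.

```python
import random

# Hypothetical roster: each student belongs to one class (the cluster).
students = {"s1": "classA", "s2": "classA", "s3": "classB", "s4": "classB"}

def assign_individual(student_ids, seed=0):
    """Student-level assignment: each student is randomized independently."""
    rng = random.Random(seed)
    return {s: rng.choice(["treatment", "control"]) for s in student_ids}

def assign_cluster(roster, seed=0):
    """Group-level assignment: every student in a class gets the class's condition."""
    rng = random.Random(seed)
    class_arm = {c: rng.choice(["treatment", "control"])
                 for c in sorted(set(roster.values()))}
    return {s: class_arm[c] for s, c in roster.items()}

individual = assign_individual(students.keys())
cluster = assign_cluster(students)
```

Under group (cluster) assignment, classmates always share a condition, which limits contamination between peers but requires analyses that account for clustering; student-level assignment maximizes statistical power when spillover is not a concern.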
 