
Research on the Digital Learning Platforms

SEERNet includes five digital learning platforms (DLPs) that connect researchers to millions of teachers and learners across US K-12 schools and post-secondary institutions. Each DLP has its own users and capabilities for conducting research – this site is a central resource for researchers looking to understand what each platform offers.

Digital Learning Platforms at a Glance

These five digital learning platforms (DLPs) have the power of scale: ASSISTments’ E-TRIALS, MATHia/UpGrade, Canvas + Terracotta, Kinetic by OpenStax, and ASU Learning @ Scale each have at least 100,000 users. Read on for an overview of each DLP:

E-TRIALS (ASSISTments) includes 6,000 problems, each with more than one support, and each has been randomized to over 300 students. Educational data scientists might ask questions about the features of these supports (e.g., are shorter hints correlated with better student learning?). Learning scientists might ask questions that contrast specific features of interventions they are interested in exploring.
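
A question like the hint-length one above could be explored on an exported dataset. Below is a minimal sketch, assuming a hypothetical CSV export with columns hint_word_count and next_problem_correct; the file name and schema are illustrative, not E-TRIALS' actual export format:

```python
import pandas as pd
from scipy import stats

# Hypothetical export: one row per student-hint pairing, with the hint's
# length in words and whether the student solved the next problem (0/1).
df = pd.read_csv("etrials_export.csv")  # illustrative file name

# Point-biserial correlation between a binary outcome and a continuous feature
r, p = stats.pointbiserialr(df["next_problem_correct"], df["hint_word_count"])
print(f"correlation between hint length and next-problem success: r={r:.3f} (p={p:.4f})")
```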

UpGrade is an open-source A/B testing platform that facilitates randomized experiments on digital learning experiences. Currently, it allows experiments that contrast the type, sequence, and timing of secondary math content within MATHia, but by the end of the grant period it will enable connections to other software applications.
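
To illustrate what an A/B testing platform does at its core, here is a minimal, hypothetical sketch of deterministic condition assignment. It is not UpGrade's actual code or API, just the general technique of hashing a user and experiment identifier so each user consistently sees the same variant:

```python
import hashlib

def assign_condition(user_id: str, experiment_id: str, conditions: list[str]) -> str:
    """Hash the (experiment, user) pair so the same student always
    lands in the same condition across sessions."""
    digest = hashlib.sha256(f"{experiment_id}:{user_id}".encode()).hexdigest()
    return conditions[int(digest, 16) % len(conditions)]

# Hypothetical experiment contrasting two sequences of math content
print(assign_condition("student-42", "unit-sequence-study", ["sequence_a", "sequence_b"]))
```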

Terracotta is an open-source research platform that facilitates randomized experiments on learning activities within Canvas. It allows researchers to evaluate the content, context, timing, and mode of learning activities. Learning activities manipulated in Terracotta can also be designed and implemented differently at the class level.

Kinetic will enable research on a wide spectrum of postsecondary learner outcomes related to behavior, performance, and psychosocial constructs. Depending upon the version of Kinetic and the exact outcomes of interest, researchers might design their studies using pre- and post-intervention assessments, collection of self-report or external data from students, and/or longitudinal analyses over multiple time scales using student records in OpenStax products. The alpha version will allow researchers to administer any measure that can be delivered via Qualtrics, while the beta version will allow linking of these researcher-administered measures to existing learner outcomes in OpenStax materials.
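
As an illustration of the pre/post design mentioned above, the following sketch compares score gains between two conditions. The file name and column names (condition, pre_score, post_score) are hypothetical, not Kinetic's actual export schema:

```python
import pandas as pd
from scipy import stats

# Hypothetical export with one row per student
df = pd.read_csv("kinetic_study_export.csv")
df["gain"] = df["post_score"] - df["pre_score"]

# Welch's t-test comparing mean gains between conditions
treat = df.loc[df["condition"] == "treatment", "gain"]
ctrl = df.loc[df["condition"] == "control", "gain"]
t, p = stats.ttest_ind(treat, ctrl, equal_var=False)
print(f"difference in pre/post gains: t={t:.2f}, p={p:.4f}")
```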

The L@S data warehouse will allow researchers to conduct several types of exploratory analysis, and future designs may allow experimental interventions.

Recommended Use

This guide highlights the capabilities of each digital learning platform for external researchers interested in conducting experimental studies. The comparison below highlights the similarities and differences between the DLPs at varying stages of the research process – more detailed information on each platform, as well as on other possibilities for non-experimental studies or alternative collaborations, can be obtained from the platforms themselves. As network lead, SEERNet welcomes questions and feedback from interested partners – stay tuned for more content and future opportunities!

Guiding Questions

This guide aims to answer the following set of questions that researchers might ask when considering each platform:

  • What types of research questions can the platform be used to answer? 
  • What are examples of experimental contrasts?
  • What types of study designs are supported? For experimental platforms, what levels of random assignment are supported?
  • Are there processes for vetting researchers or research studies? What researcher prerequisites are needed before beginning a study?
  • Does the platform require pre-registration, and what/whose IRB approvals are necessary?
  • Who is the potential user base of the platform? What data is available on potential participants?
  • Does the platform assist with recruitment? 
  • What types of data are routinely collected by the platform? Can these be linked to external data? Are there internal measures of prior achievement? Does the platform accommodate external measures? What are the potential outcome measures?
  • Does the platform have analytic capabilities?
  • How might pre-registration/IRB requirements affect analysis, reporting, or dissemination?

Comprehensive View

Platforms compared: E-TRIALS (ASSISTments), UpGrade (MATHia), Terracotta (Canvas), OpenStax Kinetic, and Learning at Scale (ASU, “L@S”).

User Population

  • E-TRIALS: K-12 math students using an OER math curriculum
  • UpGrade: Grade 6-12 math students using MATHia; teachers using MATHia
  • Terracotta: Grade 6-16 students using Canvas
  • Kinetic: Post-secondary students, preferably using OpenStax textbooks
  • L@S: Post-secondary online ASU students

Research Questions

  • E-TRIALS: Effectiveness of student supports for math learning
  • UpGrade: Improvements to student learning based on alternative presentations of material; also motivational and related improvements due to design, messaging, etc.
  • Terracotta: Students’ behavior in learning activities, and the effects of learning activities on student performance (or any outcome score in Canvas, or manually added by the teacher)
  • Kinetic: Learner characteristics and their influence on behavior, performance, and psychosocial constructs. Three key guiding questions: who is the learner? (individual differences); what are they learning? (contextual information); how are they learning? (learning strategies)
  • L@S: A wide range of questions regarding learning in credit-bearing courses, drawing on long-term and short-term student performance data and various student demographics

Pre-Registration/Vetting

  • E-TRIALS: Pre-register on OSF.io
  • UpGrade: Verify feasibility of the intervention with the Carnegie Learning design team and an interested district, including completing a pre-registration form
  • Terracotta: No formal vetting process; teachers are in control
  • Kinetic: Pre-registration on OSF.io recommended
  • L@S: Vetted through the ASU Provost’s Office

IRB Requirements

  • E-TRIALS: Normal educational practice is covered by the existing ASSISTments IRB; external researchers obtain IRB approval to receive data
  • UpGrade: Researchers use their own IRB (if needed)
  • Terracotta: Researchers use their own IRB protocol
  • Kinetic: Researchers submit to the Rice IRB
  • L@S: ASU IRB; researchers use their own IRB

Recruitment

  • E-TRIALS: No recruitment necessary – all users are eligible. Study timing depends on when teachers assign the problems, as determined by curriculum order and time of year
  • UpGrade: Carnegie Learning will assist researchers in recruiting school(s)/district(s) from MATHia’s customer base, and will assist with data-sharing agreements with these districts
  • Terracotta: Teachers (at institutions where Terracotta has been integrated) recruit students to participate in the study; if the researcher is not a teacher, the researcher recruits teachers to participate
  • Kinetic: Students opt in, with incentives and institutional partnerships
  • L@S: Depends on existing data or on implementing interventions/surveys

Randomization (see the sketch following this overview)

  • E-TRIALS: Student-level random assignment
  • UpGrade: Individual or group random assignment (class, teacher, school, district)
  • Terracotta: Student-level random assignment and student/assignment-level randomization (within-subject crossovers)
  • Kinetic: Student-level random assignment
  • L@S: Randomization at the individual or group level, depending on the research question

Intervention

  • E-TRIALS: A set of student supports for one or more problems
  • UpGrade: An alternate unit of instruction/activity in MATHia; messaging, hints, presentation, and design features
  • Terracotta: Assignments
  • Kinetic: Open-ended, based on the capabilities of Qualtrics
  • L@S: At the individual or group level, depending on the research question

Prior Achievement/Demographic Data

  • E-TRIALS: Class/group membership, school/class-level contextual data, prior ASSISTments achievement
  • UpGrade: Class/group membership, prior MATHia achievement
  • Terracotta: Existing data within the Canvas course site (gradebook, activity, assignments), plus any student-level data added by the teacher
  • Kinetic: Learner characteristics collected by Kinetic across studies
  • L@S: The data warehouse will contain demographic, achievement, and course activity data

Outcome Measures

  • E-TRIALS: Performance on Similar-but-Not-the-Same (SNS) problems
  • UpGrade: MATHia process measures, performance, and survey measures
  • Terracotta: Canvas gradebook, activity, assignments, data added by the teacher
  • Kinetic: Researcher-administered measures in Qualtrics; in future versions, connections to institutional data (course grades, etc.)
  • L@S: Course activity, grades, persistence/graduation

Analysis

  • E-TRIALS: Data export, posted to OSF.io
  • UpGrade: Data export
  • Terracotta: Data export, possible analysis tools
  • Kinetic: A secure data enclave allows researchers to run analyses on the full dataset without access to PII
  • L@S: Data warehouse
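
Several of the platforms above support group-level (cluster) random assignment in addition to student-level assignment. As a minimal, hypothetical sketch (not any platform's actual implementation), whole classes can be shuffled and alternated between conditions so that every student in a class shares a condition:

```python
import random

def assign_clusters(cluster_ids: list[str], conditions: list[str], seed: int = 2024) -> dict[str, str]:
    """Shuffle whole clusters (classes, schools, districts) and alternate
    them across conditions, so every student in a cluster shares a condition."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = cluster_ids[:]
    rng.shuffle(shuffled)
    return {cid: conditions[i % len(conditions)] for i, cid in enumerate(shuffled)}

# Hypothetical class identifiers
print(assign_clusters(["class-01", "class-02", "class-03", "class-04"],
                      ["treatment", "control"]))
```

Group-level designs like this are typically paired with analyses that account for clustering, since students within the same class are not independent observations.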
