Terracotta is an open-source research platform that facilitates randomized experiments on learning activities within Canvas. It allows researchers to evaluate the content, context, timing, and mode of learning activities. The learning activities manipulated in Terracotta can be designed and implemented differentially at the class level. This paper showcases some of the platform’s features and gives one example of a study run using Terracotta.
Primary, secondary, and postsecondary teachers who use Canvas at an institution that has opted to integrate Terracotta into its Canvas instance, along with researchers who collaborate with such teachers.
While there is no formal vetting process, researchers must partner with teachers at an institution using Canvas, and these institutions should seek appropriate permissions to install the Terracotta plug-in. In the coming years, Terracotta will be integrated with OSF to enable streamlined preregistration from within Terracotta. Data exported from Terracotta are deidentified, and non-consenting participants are filtered out, so there are no restrictions on data sharing.
Researchers use their own IRB protocols. For multi-institutional research, the lead researcher hosts the IRB protocol, and participating institutions serve as data collection sites under that single IRB. When multiple researchers at different institutions are involved, they should follow the guidelines of the funder or of the lead PI’s institution.
A primary user of Terracotta is a teacher. The teacher sets up an experiment in their class site and recruits students to participate, effectively by assigning students to (a) submit work for an assignment that has been experimentally manipulated, or (b) submit an informed consent response granting permission to participate. Researchers (who are not teachers) can be added to a Canvas site in the Teacher role, which enables the researcher to create the experiment on the teacher’s behalf. A note on privacy: Terracotta provides privacy protections for student participants, including informed consent that is hidden from the teacher, filtering of non-consenting participants from result summaries and data exports, and removal of student identifiers from these exports. In the case of multisite research, where an experiment is deployed across districts or institutions, Terracotta will export deidentified research data that can be shared publicly across institutional boundaries, avoiding the need for complex multisite protocols.
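As a rough sketch of what these protections imply for an export pipeline (not Terracotta’s actual implementation; all field names are hypothetical), the function below drops non-consenting participants and replaces Canvas identifiers with deidentified research IDs:

```python
def build_export(records, consented_ids):
    """Illustrative export filter (hypothetical field names).

    Non-consenting participants never appear in the export, and Canvas
    identifiers are replaced by a stable, deidentified research ID.
    """
    exported = []
    for record in records:
        if record["canvas_user_id"] not in consented_ids:
            continue  # drop non-consenting participants entirely
        exported.append({
            "participant_id": record["research_id"],  # deidentified ID
            "condition": record["condition"],
            "response": record["response"],
        })
    return exported

rows = [
    {"canvas_user_id": 1, "research_id": "p001", "condition": "A", "response": "..."},
    {"canvas_user_id": 2, "research_id": "p002", "condition": "B", "response": "..."},
]
print(build_export(rows, consented_ids={1}))  # only p001 is exported
```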
Terracotta supports simple A/B tests, but it also supports within-subject crossover designs (AB/BA) with pretest/posttest. The platform automates the otherwise difficult work of counterbalancing across experimental treatments, and it resolves concerns about bias due to experimental treatment (in within-subject crossovers, all students receive all treatments, just in different orders). Moreover, Terracotta flexibly supports multiple assignments in each exposure period (AABB/BBAA), multiple crossovers (AB/BA/AB/BA), and more than two treatment conditions (ABC/CBA/BCA, where crossovers are randomized). Teachers can determine whether (and how) experimentally manipulated assignments contribute weight to course scores.
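To make the counterbalancing concrete, here is a minimal sketch of one way students might be randomized to counterbalanced condition orders; this is an illustration, not Terracotta’s actual assignment algorithm:

```python
import itertools
import random

def assign_condition_orders(students, conditions=("A", "B"), seed=0):
    """Assign each student a counterbalanced order of conditions.

    In an AB/BA crossover, every student experiences every treatment,
    just in a different order; cycling through the possible orders
    keeps the sequence groups roughly equal in size.
    """
    rng = random.Random(seed)
    orders = list(itertools.permutations(conditions))  # e.g., AB and BA
    shuffled = list(students)
    rng.shuffle(shuffled)
    return {s: orders[i % len(orders)] for i, s in enumerate(shuffled)}

# Example: four students in a two-condition crossover.
print(assign_condition_orders(["s1", "s2", "s3", "s4"]))
```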
Terracotta is an open-source experimental research platform that facilitates randomized experiments on learning activities within Canvas. It allows researchers to evaluate the content, mode, timing, and context of learning activities. Terracotta enables manipulation of assignments in Canvas and accommodates multiple choice, short answer, and file upload question types. An “assignment” in Canvas can be remarkably broad: it could include instructions to do things outside the LMS, involve video playback, serve as a vehicle for mindset interventions, and so on.
Terracotta collects item-level data about students’ responses to questions, timestamped clickstream events of students’ activity in Terracotta, grades on activities, and a complete set of contextual data about each experiment, embedded in the platform. In this regard, Terracotta data are multilevel: student-level data and class-level data (about the experiment context). There is no built-in way to link student data from Terracotta to outside data sources, and the system has no access to demographic data beyond input provided by the instructor. Terracotta does have access to Canvas-internal identifiers, so it might be technically possible to link to other sources if appropriate data-sharing agreements are in place. It is also possible to create a survey assignment within Terracotta, allowing students’ responses to survey questions to be included in an experiment.
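To illustrate this multilevel shape, the sketch below separates class-level experiment context from student-level records. The field names are illustrative assumptions, not Terracotta’s actual export schema:

```python
from dataclasses import dataclass

@dataclass
class ExperimentContext:
    """Class-level data: the design and settings of one experiment."""
    experiment_id: str
    design: str              # e.g., "AB/BA crossover"
    conditions: list[str]

@dataclass
class StudentRecord:
    """Student-level data: responses, events, and grades, deidentified."""
    participant_id: str                # research ID, not a Canvas ID
    condition_order: list[str]         # order of treatments received
    item_responses: dict[str, str]     # question ID -> response
    events: list[tuple[str, str]]      # (timestamp, event type) clickstream
    grade: float | None = None
```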
Terracotta allows a researcher or teacher to collect outcome measures from the Canvas gradebook (from the LMS course site where the experiment is running), or to enter outcome scores manually. Because Terracotta is integrated with a class’s gradebook, any gradebook item can be selected as a research outcome (or as a pretest measure). These could be existing scores on target assignments, quizzes, or exams. If the target outcome is not already in the gradebook (such as scores on statewide leveling assessments), the platform allows manual entry or CSV upload. A researcher could also implement a custom posttest assignment within Terracotta, for example, if the outcome were measured by survey responses or student work artifacts.
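As an illustration, a manual outcome upload might be prepared along these lines; the file layout and column names are hypothetical, since Terracotta’s actual CSV format is not specified here:

```python
import csv

# Hypothetical outcome scores (e.g., a statewide leveling assessment),
# keyed by deidentified participant ID.
outcomes = {"p001": 84.0, "p002": 71.5, "p003": 90.0}

with open("manual_outcomes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["participant_id", "outcome_score"])  # illustrative headers
    for participant_id, score in outcomes.items():
        writer.writerow([participant_id, score])
```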
Terracotta does not currently support analysis within the platform, but it will eventually have the capability to summarize data, contrast conditions, and display summary results. This capability is not intended to replace the export of raw data at the conclusion of an experiment, but rather to summarize experimental contrasts for monitoring and consistency checks. For an example analysis from an experiment run in Terracotta (a preregistered hierarchical Bayesian model), see https://osf.io/jm5r4.
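As a toy illustration of such a contrast (far simpler than the preregistered hierarchical Bayesian model linked above, and using made-up numbers), a within-subject comparison on exported crossover data could look like this:

```python
from statistics import mean

# Made-up exported data: each row pairs one student's outcome under
# condition A and condition B of a within-subject crossover.
scores = [
    {"participant_id": "p001", "A": 84.0, "B": 78.0},
    {"participant_id": "p002", "A": 71.5, "B": 74.0},
    {"participant_id": "p003", "A": 90.0, "B": 85.5},
]

diffs = [row["A"] - row["B"] for row in scores]
print(f"Mean within-student difference (A - B): {mean(diffs):.2f}")
```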