Contemporary educational technologies are recording student learning data, and these
data should ideally be put to good use: building actionable insights, driving evidence-based
innovations, and fostering student success. These are the same goals historically pursued by
education researchers, and yet, education researchers rarely have easy access to these emerging
data resources. What happens when academic researchers try to gain access to student learning
data, what challenges do they encounter, and what are the opportunities for collaboratively
advancing student success if we can overcome these challenges?
Bright and early on June 3, 2025, I moderated a panel discussion about researchers’ access to the
rich data collected on digital learning platforms (DLPs, e.g., online homework systems, learning
management systems) at 1EdTech's Learning Impact Conference, in Indianapolis, Indiana.
1EdTech (formerly IMS Global) is one of the organizations building standards for how these
platforms should be interoperable, and how digital services should work together to create (in the
1EdTech parlance) a digital ecosystem. A hallmark of such an ecosystem is data (lots of it!) from a
variety of sources. In some ways, these data have fueled research movements such as Learning
Analytics, but in other ways, the data have been out of reach for researchers, under the careful
oversight of the technical personnel at the DLP or the educational institution. Protecting the
security of these data is paramount for the technical personnel, and therefore, opening the data
requires a high level of scrutiny focused on the researchers. I was joined by three of my peers to
discuss how we can collectively work with the DLPs and the educational institutions to enable safe
use of the data for research and possibilities for evidence-based instructional practices.
The panelists, with different backgrounds and areas of research, shared more about their
experience working with DLP-based research. Dr. Anne Cleary from Colorado State University
described her project, which used smartwatches to nudge students toward spaced-retrieval
practice. Dr. Cameron Hecht from the University of Rochester shared his work on help-seeking
behavior. Both lines of research had significant applied value for instructional practice and,
therefore, aligned with institutional needs. My third panelist, Dr. Debshila Basu Mallick, is working
on building SafeInsights, an infrastructure that brings together researchers, edtech, and
educational institutions. With privacy-preserving technology, SafeInsights flips the traditional
research workflow: instead of taking the data out of the DLP, it brings the researchers’ query to the
data, a complement to bringing learner data into the lab.
The discussion surfaced the needs and barriers typically experienced by researchers, including but
not limited to data access, the fact that DLPs are not natively designed to support research,
overreach from IRBs, and multi-institutional data sharing. We discussed the researchers'
enthusiasm for engaging in DLP-based research and new approaches, implemented in
SafeInsights, that address established pain points for researchers and DLPs alike, and we
fielded questions from the audience on cultivating mutually beneficial collaborations.
The audience for this panel included DLP representatives who are actively building bridges,
much like the years of bridge-building work coming out of Arizona State University under the
stewardship of Dr. Danielle McNamara. For example, Unizin, a consortium of leading academic and research institutions committed to optimizing digital transformation in higher education, has been doing this
for 10 years now, and Unizin has a robust solution to many of the challenges (albeit only for their
member institutions). From Unizin’s perspective, and from the perspective of other audience
members (based on the kinds of questions they asked the panel), the issue is getting more
researchers interested and moving from traditional research methods to DLP-based research,
which inevitably requires them to acquire new technical skills and to manage new procedural
burdens. Even when a technical process exists for research access to platform data, the “if you
build it, they will come” scenario isn’t happening.
SEERNet is an Institute of Education Sciences (IES)-funded group of five DLPs instrumenting their
platforms for research, along with subsequently awarded research teams, that come together to share
knowledge and collectively address pain points for the research community engaging in platform-based
education research. As members of SEERNet, we frequently discuss how to encourage
researchers to come and play ball with digital learning platforms. How do we communicate the
value of platform-enabled research, when platform-enabled research will entail new challenges? It
might be disingenuous to describe this research as more efficient. More important, perhaps,
is that platform-enabled research has much greater relevance to the digital “ecosystem” — the
large-scale real-world context of contemporary education. Anne Cleary’s work on spaced retrieval
using smartwatches and Cameron Hecht’s work on interventions to improve help-seeking have
practical relevance. Practical relevance is a virtue that everyone in education research can agree
on. To demonstrate that practical relevance, the work should be done within the platforms that
comprise the firmament of current educational practice. That might entail challenges, but the
challenges are worth enduring for researchers who want to prioritize practical relevance.
The turnabout here is not lost on me. I organized a panel, aimed at the edtech community, focused
on the challenges of bringing learner data into the research lab. For those in the edtech developer
community who attended our panel, their response might be paraphrased as, “We’re trying! We
can’t solve everything! We’d love to see more interest among researchers!” – and this is a sentiment
shared within SEERNet. There will be challenges. Even the most streamlined platform-enabled
research solutions (like those we've been building) will present difficulties. SEERNet members are
clear-eyed about them, and we’re working hard to minimize them. In the meantime, the upside is
research with practical relevance, which is something worth a little struggle on all fronts.