Implementing Evidence-based Decision-making in the Edtech Industry

A closer look at ratings, reviews—and the reach for sound evidence.   

GUEST COLUMN | by Sarah Collings

The numbers are extraordinary: every month, each school district in the USA accesses an average of 1,403 edtech solutions. When facing such an overwhelming number of product options, school districts need to know how to make better purchase decisions. Can “evidence” help administrators distinguish between products and find interventions that actually improve student outcomes?

‘When facing such an overwhelming number of product options, school districts need to know how to make better purchase decisions.’

Showing Academic Progress

When spending federal funds, school district administrators know that they will be held accountable for showing academic progress. Tarro Funches is the English Learner Coordinator at Canton Public School District (MS). Like many of her peers, she is concerned about spending money on products that make a tangible difference. With a background in monitoring federal funding at the Mississippi Department of Education, and experience sitting on national boards for WIDA, Funches strongly believes in using data. Speaking at a recent webinar, she made her position clear:

“Evidence is what we need,” said Funches. “We have to get a product that shows us they can help us in the deficiency areas.”

For many purchases, this evidence is not simply a “nice-to-have.” To make use of much federal and stimulus funding, districts are required to choose “evidence-based interventions,” as outlined in the Every Student Succeeds Act (ESSA).

Based On Sound Evidence

However, progress towards this goal has been slow, as Daniel Stanhope, a researcher at LearnPlatform, an edtech effectiveness tracking platform, noted.

“Historically, too many decisions around selection and implementation have been made based on recommendations from peers or compelling marketing and sales offers,” he said.

The result is that decisions are often based on “ratings and reviews,” rather than sound evidence.

The problem with recommendations is that interventions can perform very differently when implemented in different contexts. Funches encouraged educators to “visit other districts…sit and watch and see what is going on.”

However, she was quick to criticize a one-size-fits-all approach, where decision-makers choose a product solely because it has worked well elsewhere. “It’s not as easy to say that because this product worked for ABC school district, that it would work for Canton School District,” she says. “You have to look at the qualitative and quantitative data and compare the demographics with your school district.”

What It All Means—And Challenges

Relying on untested edtech interventions means students may not be receiving the support they deserve. Stanhope points to a common comparison between health and education: “You wouldn’t give your child medicine that has no basis in research or proof of its efficacy,” he says. “So why are we acquiescing to this practice in education?” 

One issue is the challenge of gathering high-quality data about edtech products, particularly at start-up or small companies. Nathan Martin has been working in edtech research since 2011 and is now the Head of Marketing at Off2Class, an edtech toolkit for teachers of English as a Second Language. For a start-up company, he found, compiling evidence was a challenge. Martin suggests the comparison between selling untested medicines and untested edtech tools can only go so far.

“Education is different than medicine,” said Martin. “The answer is not just to have a bunch of randomized controlled trials (RCTs).”

In education, an RCT study may involve assigning different edtech tools to randomly selected students. This process can take several years, and the result could leave some students at a disadvantage. As Stanhope points out, “you don’t want to deprive the control group of students of a potential source of growth and learning advancement.” This is particularly true for student groups that are already under-served in the school system, such as English Language Learners. 

Another challenge with testing edtech products is that developers can tweak the technology on an ongoing basis. This constant iteration is a strength of edtech products, allowing for more testing and data, but RCTs lack the flexibility to evaluate updated versions.

Rapid Cycle Evaluation

Stanhope champions an alternative approach to evidence and proposes shorter studies of varying rigor.

“You can’t just sit back and wait three years before you try something,” said Stanhope. “We need a rapid cycle evaluation approach.” This approach would shorten the time between testing a product and getting it into classrooms, an absolute must in the post-COVID era, where educators need to respond to ever-changing circumstances.

The evidence requirements for ESSA funding allow for this kind of research. The four-tier ESSA system offers an accessible way for edtech providers to start progressing towards rigorous evidence, acknowledging that advanced levels of rigor might not be appropriate for new tools being implemented in trial environments. In addition, this framework helps administrators differentiate between products. The most accessible starting point for edtech companies is Tier 4, “demonstrates a rationale”; Tiers 3 through 1 then require progressively more rigorous evidence.

To be successful, the evidence must go hand-in-hand with faithful implementation. After all, an intervention is worth nothing if teachers do not use it. LearnPlatform refers to this factor as “fidelity of implementation,” and it is something that districts can consider when measuring the efficacy of purchases. “A product is only as good as you use it,” says Funches. “The implementation part of it has to be done.” To achieve this implementation, districts need to maintain contact with the edtech provider and offer teachers training and support.

Collaboration and Open Communication

Ultimately, the panel agreed collaboration and open communication are key to ensuring that evidence can become embedded across the edtech landscape. Relying on evidence calls for more communication, not less. “We need more collaboration between the researching teams for companies and school districts,” said Stanhope. “Let’s do more.”

Sarah Collings works at Off2Class and has worked internationally in the charity and education sectors since 2010. As a Modern Foreign Languages graduate, and a qualified Teacher of English as a Foreign Language, Sarah is passionate about language learning and the power of languages to change people’s lives. Sarah works closely with the Teacher Community and is always keen to hear from educators about their challenges, goals, and successes.

