
Ed tech companies promise results, but their claims are often based on shoddy research

The Hechinger Report

Examples from The Hechinger Report’s collection of misleading research claims touted by ed tech companies. All three of these companies try to hook prospective users with claims on their websites about their products’ effectiveness. Some companies are trying to gain a foothold in a crowded market. Video: Sarah Butrymowicz.


With Budget Cuts Looming, Here’s How Districts Will Decide What to Keep or Cut

Edsurge

Districts that embed practical, evidence-based analysis into their program evaluation and budget cycles include Jefferson County Public Schools, with its Cycle-based Budgeting program, and the Schenectady City School District, with its return-on-investment review. The resulting data is then fed into program and budget review cycles.


K-12 Educators and Administrators: Share Your Ed-tech Pilot Approach

Digital Promise

Digital Promise, in partnership with the Education Technology Industry Network, will award a $1,000 stipend to up to ten school or district leaders who submit a response by March 27, supporting a trip to San Francisco that includes workshops with leading software companies.


Debunking the ‘Gold Standard’ Myths in Edtech Efficacy

Edsurge

But when it comes to demonstrating that products “work,” too many companies fall back on testimonials. Unfortunately, over the past decade or two, educational research has gotten tangled up in how the medical industry defines and measures efficacy, standards that are as inappropriate as evaluating a headache only with an MRI machine.


More than a Checkmark

MIND Research Institute

If you’re merely looking to check a box saying “It has evidence” before you commit to a program, you’re asking for something that can be generated by any program. It is time to leave behind that old checkbox mentality and adopt a more comprehensive and useful lens through which to evaluate programs.


ST Math Results & Impact

MIND Research Institute

Understanding MIND's Research Paradigm and Ways to Evaluate Effective Education Research. Academic conversations around education research can often be daunting, difficult to follow, and raise more questions than they answer when considering education programs. The longer the program has been implemented with consistency and fidelity…


IES Selects ST Math for Replication Studies

MIND Research Institute

The U.S. Department of Education (DOE) Institute of Education Sciences (IES) has selected ST Math as one of six math programs to be further investigated under what IES terms “replication studies.” “We look forward to using ST Math to help IES realize its vision of a healthier and better-informed market in education program evaluations.”
