
Ed tech companies promise results, but their claims are often based on shoddy research

The Hechinger Report

Examples from The Hechinger Report’s collection of misleading research claims touted by ed tech companies. All three of these companies try to hook prospective users with claims on their websites about their products’ effectiveness. Some companies are trying to gain a foothold in a crowded market. Video: Sarah Butrymowicz.


With Budget Cuts Looming, Here’s How Districts Will Decide What to Keep or Cut

Edsurge

Districts that embed practical, evidence-based analysis into their program evaluation and budget cycles include Jefferson County Public Schools, with its Cycle-based Budgeting program, and the Schenectady City School District, with its return-on-investment review. The resulting data is then fed back into each program and budget review cycle.



More than a Checkmark

MIND Research Institute

If you’re merely looking to check a box saying “It has evidence” before you commit to a program, you’re asking for something any program can generate. It is time to leave that old checkbox mentality behind and adopt a more comprehensive and useful lens through which to evaluate programs.


Debunking the ‘Gold Standard’ Myths in Edtech Efficacy

Edsurge

But when it comes to demonstrating that products “work,” too many companies fall back on testimonials. Few can offer buyers independent assessments of the value of their products, even if those same entrepreneurs have sweated and toiled to build great wares.


ST Math Results & Impact

MIND Research Institute

Understanding MIND's Research Paradigm and Ways to Evaluate Effective Education Research: Academic conversations around education research can often be daunting, difficult to follow, and raise more questions than answers when considering education programs. The longer the program has been implemented with consistency and fidelity (i.e., …


5 Principles for Evaluating State Test Scores

edWeb.net

But as Mitch Slater, Co-Founder and CEO of Levered Learning, pointed out in his edWebinar “A Little Data is a Dangerous Thing: What State Test Score Summaries Do and Don’t Say About Student Learning,” looking at data from one set of assessment scores without context is virtually meaningless.


IES Selects ST Math for Replication Studies

MIND Research Institute

“We look forward to using ST Math to help IES realize its vision of a healthier and better-informed market in education program evaluations.” But it was a one-off on our program three generations ago, with pre-Common Core assessments and one type of school profile. How meaningful is the difference between outcomes?
