"Debate continues to rage over the effectiveness of technology in learning, and how best to measure it. But it is difficult to tell that from technology companies’ promotional materials."
"School officials, confronted with a morass of complicated and sometimes conflicting research, often buy products based on personal impressions, marketing hype or faith in technology for its own sake."
"Amid a classroom-based software boom estimated at $2.2 billion a year, debate continues to rage over the effectiveness of technology on learning and how best to measure it. But it is hard to tell that from technology companies’ promotional materials.
Many companies ignore well-regarded independent studies that test their products’ effectiveness. Carnegie’s Web site, for example, makes no mention of the 2010 review, by the Education Department’s What Works Clearinghouse, which analyzed 24 studies of Cognitive Tutor’s effectiveness but found that only four of those met high research standards. Some firms misrepresent research by cherry-picking results and promote surveys or limited case studies that lack the scientific rigor required by the clearinghouse and other authorities.
“The advertising from the companies is tremendous oversell compared to what they can actually demonstrate,” said Grover J. Whitehurst, a former director of the Institute of Education Sciences, the federal agency that includes What Works."
Government Official Nixes Research
"Karen Cator, a former Apple executive who directs the Office of Educational Technology at the Department of Education, said the clearinghouse reports on software should be “taken with a grain of salt” because they rely on standardized test scores. Those tests, Ms. Cator said, cannot gauge some skills that technology teaches, like collaboration, multimedia and research.
Ms. Cator’s office is developing a new framework to measure the educational value of technology, but she advised schools and districts not to wait to invest in software like Cognitive Tutor.
“They know what their students need to know and what they need to be able to do,” she said."
Comment: Right or wrong, school officials bought the software to improve student performance, primarily on state assessments. Cator telling school leaders to ignore her own Education Department's research is eye-opening.
The Bottom Line
Officials spend hundreds of thousands of dollars on software "silver bullets" but fail to do two simple things before and after making the purchases:
1. Implement the programs with fidelity. School officials mistakenly wait until state test scores are released to judge the efficacy of a software program, and then wrongly attribute causality to the software.
Most schools and school systems never check to see whether the programs are being implemented at all, and very few check on the fidelity of the implementation. Are these programs being implemented by teachers as designed, and are they being used consistently as prescribed?
2. Check the available research. The What Works Clearinghouse is only a free click away: http://ies.ed.gov/ncee/wwc/.
Even "though the clearinghouse is intended to help school leaders choose proven curriculum, a 2010 Government Accountability Office survey of district officials found that 58 percent of them had never heard of What Works, never mind consulted its reviews.
“Decisions are made on marketing, on politics, on personal preference,” said Robert A. Slavin, director of the Center for Research and Reform in Education at Johns Hopkins University. “An intelligent, caring principal [or district official] who’d never buy a car without looking at Consumer Reports, when they plunk down serious money to buy a curriculum, they don’t even look at the evidence.”