The New York Times reports that the Institute of Education Sciences, an office of the U.S. Department of Education that was created in 2002, has been working to improve the quality of education research by increasing the use of randomized trials.
What works in science and math education? Until recently, there had been few solid answers — just guesses and hunches, marketing hype and extrapolations from small pilot studies.
But now, a little-known office in the Education Department is starting to get some real data, using a method that has transformed medicine: the randomized clinical trial, in which groups of subjects are randomly assigned to get either an experimental therapy, the standard therapy, a placebo or nothing.
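The mechanics of random assignment are simple to illustrate. Here is a minimal, hypothetical sketch in Python (the student roster and group sizes are invented for the example; they are not from the article):

```python
import random

def randomize(subjects, seed=0):
    """Randomly split a list of subjects into two equal-sized groups.

    A fixed seed is used here only so the example is reproducible;
    a real trial would use a fresh random draw.
    """
    rng = random.Random(seed)
    shuffled = subjects[:]          # copy so the input list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Hypothetical roster of 20 students.
students = [f"student_{i}" for i in range(20)]
treatment, control = randomize(students)

# Because assignment is random, the groups are comparable on average,
# so a later difference in outcomes can be attributed to the program
# rather than to pre-existing differences between the groups.
print(len(treatment), len(control))  # 10 10
```

The point of the shuffle is exactly what the trial design requires: every student has the same chance of landing in either group, so motivation, prior achievement, and everything else balance out in expectation.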
At long last, education is following the lead of medicine in its approach to research.
The institute gives schools the data they need to start using methods that can improve learning. It has a What Works Clearinghouse — something like a mini Food and Drug Administration, but without enforcement power — that rates evidence behind various programs and textbooks, using the same sort of criteria researchers use to assess effectiveness of medical treatments.
Grover J. Whitehurst, first director of the IES, found that rigorous quantitative education research was almost non-existent.
“I found on arriving that the status of education research was poor,” Dr. Whitehurst said. “It was more humanistic and qualitative than crunching numbers and evaluating the impact.
“You could pick up an education journal,” he went on, “and read pieces that reflected on the human condition and that involved interpretations by the authors on what was going on in schools. It was more like the work a historian might do than what a social scientist might do.”…
So Dr. Whitehurst brought in new people who had been trained in more rigorous fields, and invested in doctoral training programs to nurture a new generation of more scientific education researchers. He faced heated opposition from some people in schools of education, he said, but he prevailed.
Most educational programs lack solid evidence about their effectiveness.
As the Education Department’s efforts got going over the past decade, a pattern became clear, said Robert Boruch, a professor of education and statistics at the University of Pennsylvania. Most programs that had been sold as effective had no good evidence behind them. And when rigorous studies were done, as many as 90 percent of programs that seemed promising in small, unscientific studies had no effect on achievement or actually made achievement scores worse….
“Most programs claim to be evidence-based,” he said, but most have no good evidence that they work.
Among many educators, ‘personal anecdote trumps data’.
… Too often, they are swayed by marketing or anecdotes or the latest fad. And “invariably,” he added, “folks trying to sell a program will say there is evidence behind it,” even though that evidence is far from rigorous.
Dr. Merlino agreed. “A lot of districts go by the herd mentality,” he said, citing the example of a Singapore-based math program now in vogue that has never been rigorously compared with other programs and found to be better. “Personal anecdote trumps data.”
Last year Daniel Willingham, a long-time critic of education research who writes the Science and Education blog, came out with his book, When Can You Trust the Experts? How to Tell Good Science from Bad in Education.
I wrote the book out of frustration with a particular problem: the word “research” has become meaningless in education. Every product is claimed to be research-based. But we all know that can’t be the case. How are teachers and administrators supposed to know which claims are valid?
Willingham found that the IES What Works Clearinghouse “has not been a success, mostly because the criteria it used to evaluate products was so stringent, and also because (at least initially) it was dogged by suspicions that its conclusions would not be politically neutral.”
As mentioned in the NYT article, strong opposition to rigorous research comes from schools of education, criticized for their “focus on fads, not knowledge and skills.”