Education research is beginning to rely on data instead of on ‘guesses and hype’

by Grace

The New York Times reports that the Institute of Education Sciences, an office of the U.S. Department of Education that was created in 2002, has been working to improve the quality of education research by increasing the use of randomized trials.

What works in science and math education? Until recently, there had been few solid answers — just guesses and hunches, marketing hype and extrapolations from small pilot studies.

But now, a little-known office in the Education Department is starting to get some real data, using a method that has transformed medicine: the randomized clinical trial, in which groups of subjects are randomly assigned to get either an experimental therapy, the standard therapy, a placebo or nothing.
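The logic of a randomized trial is simple enough to sketch in a few lines. Below is a minimal simulation, assuming hypothetical student test scores and an illustrative three-point program effect; the group sizes and numbers are made up for demonstration, not drawn from any study in the article.

```python
import random
import statistics

random.seed(42)

# Random assignment: shuffle the pool, then split it. Because assignment
# is random, the two groups differ only by chance, not by selection.
students = list(range(200))
random.shuffle(students)
treatment = set(students[:100])   # gets the experimental program
control = set(students[100:])     # gets the standard program

# Simulated outcomes: baseline score around 70, plus a hypothetical
# +3-point effect for students in the treatment group.
scores = {s: random.gauss(70, 10) + (3 if s in treatment else 0)
          for s in students}

# The estimated effect is the difference in group means.
effect = (statistics.mean(scores[s] for s in treatment)
          - statistics.mean(scores[s] for s in control))
print(f"estimated effect: {effect:.1f} points")
```

With random assignment, the mean difference is an unbiased estimate of the program's effect; a small pilot without randomization has no comparable guarantee, which is the gap the article describes.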

At long last, education is following the lead of medicine in its approach to research.

The institute gives schools the data they need to start using methods that can improve learning. It has a What Works Clearinghouse — something like a mini Food and Drug Administration, but without enforcement power — that rates evidence behind various programs and textbooks, using the same sort of criteria researchers use to assess effectiveness of medical treatments.

Grover J. Whitehurst, first director of the IES, found that rigorous quantitative education research was almost non-existent.

“I found on arriving that the status of education research was poor,” Dr. Whitehurst said. “It was more humanistic and qualitative than crunching numbers and evaluating the impact.

“You could pick up an education journal,” he went on, “and read pieces that reflected on the human condition and that involved interpretations by the authors on what was going on in schools. It was more like the work a historian might do than what a social scientist might do.”…

So Dr. Whitehurst brought in new people who had been trained in more rigorous fields, and invested in doctoral training programs to nurture a new generation of more scientific education researchers. He faced heated opposition from some people in schools of education, he said, but he prevailed.

Most educational programs lack solid evidence about their effectiveness.

As the Education Department’s efforts got going over the past decade, a pattern became clear, said Robert Boruch, a professor of education and statistics at the University of Pennsylvania. Most programs that had been sold as effective had no good evidence behind them. And when rigorous studies were done, as many as 90 percent of programs that seemed promising in small, unscientific studies had no effect on achievement or actually made achievement scores worse….

“Most programs claim to be evidence-based,” he said, but most have no good evidence that they work.

Among many educators, ‘personal anecdote trumps data’.

… Too often, they are swayed by marketing or anecdotes or the latest fad. And “invariably,” he added, “folks trying to sell a program will say there is evidence behind it,” even though that evidence is far from rigorous.

Dr. Merlino agreed. “A lot of districts go by the herd mentality,” he said, citing the example of a Singapore-based math program now in vogue that has never been rigorously compared with other programs and found to be better. “Personal anecdote trumps data.”

Last year Daniel Willingham, a long-time critic of education research who writes the Science and Education blog, came out with his book, When Can You Trust the Experts? How to Tell Good Science from Bad in Education.

I wrote the book out of frustration with a particular problem: the word “research” has become meaningless in education. Every product is claimed to be research-based. But we all know that can’t be the case. How are teachers and administrators supposed to know which claims are valid?

Willingham found that the IES What Works Clearinghouse “has not been a success, mostly because the criteria it used to evaluate products was so stringent, and also because (at least initially) it was dogged by suspicions that its conclusions would not be politically neutral.”

As mentioned in the NYT article, strong opposition to rigorous research comes from schools of education, criticized for their “focus on fads, not knowledge and skills”.

Related: Problems with study that claims groups are better than lectures for learning (Cost of College)


4 Comments to “Education research is beginning to rely on data instead of on ‘guesses and hype’”

  1. Yes, the NSF has been looking for more rigorous studies when it funds science education research, for the past few years. Generally it is a good thing, but in higher ed it means only the big universities have the numbers to run the studies, which I think can skew the results – what works in big lecture settings of top-notch, committed students at, say, U of Michigan, may not work at the CCs or regional state u’s. But the numbers aren’t there to test the methods in those smaller settings, unless a coalition of schools is formed. The NSF has been trying to encourage that, but it is almost impossible given school politics.


  2. There appear to be all sorts of obstacles to improving the quality of education research, but I think a culture that doesn’t value rigorous research may be at the root of the problem. I don’t hear about similar problems in the field of medicine.


  3. I think educators in the sciences do value rigorous research in educational methods. But numbers are a problem. And medicine has lots of similar issues – there is very little rigorous evidence supporting many practices in obstetrics, for example, and no one wants to run good studies. In oncology, it can be very hard to accrue enough numbers to run good studies for some cancers, which is a similar problem to that of running studies in smaller school settings.


  4. Yes, I suspect educators in the sciences value rigorous research. But I was thinking more of schools of education, where most professors come from non-science backgrounds.

