And yet, my study has one thing I think is essential for validity. I began this study with no hypothesis. I had no idea, no theory, no assumptions in mind when I began visiting these schools. Thus, unlike almost every study of education in the United States I have read in the past five years, there is no "confirmation bias" in my study, because there was nothing to confirm.
In my first semester of graduate school, in my first research methods class, we read a study that Dr. Robert Slavin of Johns Hopkins University had done on the reading program he sells. In fact, almost everyone connected with the study had a very strong financial incentive to make the product, Success for All, look good.
My professors thought this was a great study. It followed "all the rules" in Scientific Research in Education. (1)
But I, and a number of other students, found the study, as research, worthless. First, as I noted, it was conducted entirely by people with a financial interest in proving the program's success. Second, it combined dozens of changes in school operations with the introduction of Slavin's product, including things like free food and massive tutoring efforts - how could anyone possibly say which part of those changes produced any effect? Third, it refused to look at any "side effects" - what happened to students in other areas when most of the school day became devoted to chanting sentences?
But the more I thought about it, the more I saw that the biggest problem was the hypothesis. Slavin, of course, set out to "prove" that his product worked. That was his hypothesis. And, shock of shocks, he confirmed his belief.
We constantly teach hypothesis in our schools. It is as much the lifeblood of almost every Science Fair as the fact that parents do the project. But does hypothesis help or hurt our science?
Let's go way back to a very famous example. Galileo hypothesized, following Copernicus, that the planets circled the sun. And, yes, his "experiments" (observations) confirmed this. But when Jesuit researchers looked up, they noted that what Galileo was saying could not possibly fit with their observations.
The problem is, of course, that the planets orbit the sun, but not in circles. At Galileo's trial both sides were "right" about celestial mechanics and both sides were quite wrong. Galileo had the idea but not the details, the Jesuits had the details but not the idea. Both had gotten to the wrong place through what we now call "confirmation bias." And confirmation bias is a direct result of our commitment to hypothesis.
Hypothesis should not be first. The first question when we study something needs to be "what is happening?" The problem - a problem most social science research "leaders" gloss over - is that when someone like Slavin goes out to "prove" something "works," every question he asks, every bit of data he collects, every measure he uses is, at least in part, designed to prove his hypothesis. Slavin measures short-term reading test gains, not, for example, student interest in literature. The tests he uses measure components of reading, not gains in subject interest. He does not compare "his" schools to others with extensive tutoring. He uses statistical models which presume that the human experience can be "averaged." It is not that he is lying. He is not. It is not that he falsely manipulates data. He does not. But his studies are fatally flawed from the inception because the intent is not to observe but to confirm.
It would be no different if I ran a study to prove that Success for All was a ridiculous waste of school money. My questions, my measures, my statistical analysis would be selected to confirm that.
Is there another way?
When people ask my hypothesis I tend to say, "I have no idea." This didn't go over big on my "practicum" project with the faculty, but I tried hard to stick to it, even when measurement structures were imposed on me by the federal grant. On my dissertation, I began with the question, "how did this system develop?" and then, "why did this system develop?" If I had begun with "America's schools were designed to limit opportunity," I would, for example, have read both Horace Mann and William Shearer in very different ways. But by beginning with non-leading questions, I was free to hear these men as they wished to be heard, not through the results of fifty years or a century later.
So, could you, a researcher, walk into a school without a hypothesis? Could you clear your mind of your years of training and just "see"?
How might that change educational research?
- Ira Socol
Oh, the results of that "informal" study?
My top five "best," "healthiest" environments - corridors seemed under control, kids were polite, there was little or no bullying, kids and adults interacted well, and kids seemed to arrive in class, after passing times, not unduly stressed...
1. Black River Public School (a small, urban, fairly diverse, non-profit charter)
2. Godfrey-Lee Middle School and Lee High School (impoverished urban schools)
3. Holland Christian High School (a small Christian Reformed Church school)
4. Hamilton High School (a large rural/suburban public school)
5. Reeths-Puffer High School (a large suburban/rural public school)
My bottom five (opposite of above)
5. Zeeland West High School (a large suburban/rural public school)
4. North Muskegon High/Middle School (a small wealthy suburban public school)
3. Holland High School (a large urban public school)
2. Zeeland East High School (a large suburban public school)
1. West Ottawa High School (a large suburban/rural public school)
What's important here is not the ratings, which are informal and, as with all research, highly subjective. What's important is that this reveals that, at least to me, the questions we ask about size, income level, and management might be the wrong questions. If I had gone out to compare large to small, urban to suburban, non-public to public, I suspect what I would have "seen" would have changed, and my very hypothesis would have altered my results.
1. I have called Scientific Research in Education "the most destructive book of the last decade," so there is some bias there.