
12 April 2011

The Problem with Hypotheses

In what I would call "an informal study" I have ranked the "atmosphere" in various high school hallways around the area in which I live. Yes, it is a "convenience sample," consisting of schools I happened to be visiting. No, I had no particular method of recording observations. No, I will not provide a long bibliography of sources cited. Yes, I have developed my own way of coding the "data" and my own way of ranking the "school atmosphere."

And yet, my study has one thing I think is essential for validity. I began this study with no hypothesis. I had no idea, no theory, no assumptions in mind when I began visiting these schools. Thus, unlike almost every study of education in the United States I have read in the past five years, there is no "confirmation bias" in my study, because there was nothing to confirm.

In my first semester of graduate school, in my first research methods class, we read a study that Dr. Robert Slavin of Johns Hopkins University had done on the reading project he sells. In fact, almost everyone connected with the study had a very strong financial incentive to make the product, Success for All, look good.

My professors thought this was a great study. It followed "all the rules" in Scientific Research in Education [1], the "bible" of "gold standard" educational research created by top AERA professors for the Bush Administration.

But I, and a number of other students, found the study to be, as research, worthless. First, as I noted, it was conducted entirely by people with a financial interest in proving the program's success. Second, it combined dozens of changes in school operations with the introduction of Slavin's product, including things like free food and massive tutoring efforts - how could anyone possibly suggest which part of those changes produced any effect? Third, it refused to look at any "side effects" - what happened to students in other areas when most of the school day became devoted to chanting sentences?

But, the more I thought about it, the biggest problem was the hypothesis. Slavin, of course, set out to "prove" that his product worked. That was his hypothesis. And, shock of shocks! he confirmed his belief.

We constantly teach hypothesis in our schools. It is as much the lifeblood of almost every Science Fair as the fact that parents do the project. But does hypothesis help or hurt our science?

Let's go way back to a very famous example. Galileo hypothesized, via Copernicus, that the planets circled the sun. And, yes, his "experiments" (observations) confirmed this. But when Jesuit researchers looked up, they noted that what Galileo was saying could not possibly fit with their observations.

The problem is, of course, that the planets orbit the sun, but not in circles. At Galileo's trial both sides were "right" about celestial mechanics and both sides were quite wrong. Galileo had the idea but not the details, the Jesuits had the details but not the idea. Both had gotten to the wrong place through what we now call "confirmation bias." And confirmation bias is a direct result of our commitment to hypothesis.

Hypothesis should not be first. The first question when we study something needs to be "what is happening?" The problem - a problem most social science research "leaders" gloss over - is that when someone like Slavin goes out to "prove" something "works," every question he asks, every bit of data he collects, every measure he uses, is, at least in part, designed to prove his hypothesis. Slavin measures short term reading test gains, not, for example, student interest in literature. The tests he uses measure components of reading, not gains in subject interest. He does not compare "his" schools to others with extensive tutoring. He uses statistical models which presume that the human experience can be "averaged." It is not that he is lying. He is not. It is not that he falsely manipulates data. He does not. But his studies are fatally flawed from their inception because the intent is not to observe but to confirm.

It would be no different if I ran a study to prove that Success for All was a ridiculous waste of school money. My questions, my measures, my statistical analysis would be selected to confirm that.

Is there another way? 

When people ask my hypothesis I tend to say, "I have no idea." This didn't go over big on my "practicum" project with the faculty, but I tried hard to stick to it, even when measurement structures were imposed on me by the federal grant. On my dissertation, I began with the question, "how did this system develop?" and then, "why did this system develop?" If I had begun with "America's schools were designed to limit opportunity," I would, for example, have read both Horace Mann and William Shearer in very different ways. But by beginning with non-leading questions, I was free to hear these men as they wished to be heard, not through the lens of results fifty years or a century later.

So, could you, a researcher, walk into a school without a hypothesis? Could you clear your mind of your years of training and just "see"?

How might that change educational research?

- Ira Socol

Oh, the results of that "informal" study?


My top five "best," "healthiest" environments - corridors seemed under control, kids were polite, there was little or no bullying, kids and adults interacted well, kids seemed to arrive in class after passing times not unduly stressed...

1. Black River Public School (a small, urban, fairly diverse, non-profit charter)
2. Godfrey-Lee Middle School and Lee High School (an impoverished urban school)
3. Holland Christian High School (a small Christian Reformed Church school)
4. Hamilton High School (a large rural/suburban public school)
5. Reeths-Puffer High School (a large suburban/rural public school)

My bottom five (opposite of above)
5. Zeeland West High School (a large suburban/rural public school)
4. North Muskegon High/Middle School (a small wealthy suburban public school)
3. Holland High School (a large urban public school)
2. Zeeland East High School (a large suburban public school)
1. West Ottawa High School (a large suburban/rural public school)

What's important here is not the ratings, which are informal and, as with all research, highly subjective. What's important is that they reveal that, at least to me, the questions we ask about size, income level, and management might be the wrong questions. If I had gone out to compare large to small, urban to suburban, non-public to public, I suspect what I would have "seen" would have changed, and my hypothesis-driven question would have altered my results.

[1] I have called Scientific Research in Education "the most destructive book of the last decade," so there is some bias there.

7 comments:

  1. I really like that you are writing about this. I had the support of one grad advisor during my thesis process who let me run with the question, "what happens when teachers talk about authority and mathematics." I have a feeling that most PhD folk wouldn't have let that fly. However, the study was personally inspiring and motivated a whole bunch of teachers to look at how they approach content and domination. The study left a few things in the air (I am still a noob). After my presentation, I had someone come up to me and ask if I knew anything about Grounded Theory. I checked it out and it looks like it may provide a structure for this kind of inquiry. I tried to follow Dr. Sameshima's work and apply a version of poetic parallax. Thanks for the post.

  2. Ira--you have touched upon some very important issues in research. There are ways to avoid these problems but, as I'm sure you know, they involve responsibility and usually someone forfeiting potential income or recalibrating too much of their life to bother.

    Ecological validity is very popular in current edu research circles. How are we to measure important constructs without getting in the way of what we want to measure?! We come up with all these novel ways of measuring something, only to find out that they make no sense in an authentic context. There are ways to do this if you focus attention on the right people: teachers and students.

    The elephant in many rooms is $$$$ and stake-holding. Money is easier from the standpoint of a research USER--avoid thinktanks and research that doesn't report funding sources. Avoid research that has major or unreported stakes in success. Avoid journalists who wear the garb of researchers but are not held to (even questionable) reporting standards. How much of this can be avoided? What remains?

    Most important of all is this: What will schools do if it is found that their approach, beliefs, strategies, and methods are found wanting? Will they change? Will schools change? Will they continue to mask financial concerns with 'progressive' practices?

    There are many "edu" researchers who have never set foot in a classroom without a stopwatch, clicker, or checklist. They have neither held chalk nor the rapt attention of students in their palms. Needless to say, their ability to create meaning of all of this is lacking.

    To whom would you listen?
    To whom should you listen?

    The answer to this question has never changed.

    Thank you for continuing the challenge.

  3. Ira,
    Thank you for this post. You have put into words what I have long felt about my own experience with (and failure at) research. You have described precisely the reason I was never able to complete my own dissertation - a study of faculty reward systems in higher education. The tradition of research dictated (I'll leave out all of the methodological squabbles with well-meaning advisors, etc)...that I hypothesize and make comparisons....when what I REALLY wanted was to observe and describe a reality - I wanted to see. Hence, I am and will forever be ABD.
    Thank you for writing this into being for me. It IS validating to see someone else express it.

  4. Maybe some PhD folks could chime in here, but it seems like there is more room for alternate representations of data and differing research methodologies. Google art-based research and Dr. Pauline Sameshima...she is doing some really great work around poetic parallax (and, to name one more, Dr. Carl Leggo from UBC). Dr. Sameshima's dissertation, Seeing Red, is beautiful and looks creatively at the relationship between learning, love, and relationships. I would like to be like her when I grow up :)

  5. We seem to teach research in the order in which published articles are printed, rather than in the order in which the thinking has to occur. Yes, published scientific papers begin with an abstract and a 'purpose', but those are actually best left until last in the writing process. A hint of data that sparks curiosity is the real beginning of research, followed by gathering more data, trying to eliminate biases, and drawing conclusions. Only after we've thoroughly explored an idea can we hope to make a sensible "If ..., then ..., because ..." statement and challenge others to find gaps and exceptions to our hypothesis.

  6. You wrote everything very well and precisely. I like your efforts. Thanks for your post.

  7. Hello!

    My name is Miranda Tidikis and I am a student in the College of Education at the University of South Alabama. I am in Dr. Strange's EDM310 class. I will be following your blog for the next two weeks as part of an assignment and I will be posting my reactions on my blog at the end of the two weeks.

    I really enjoyed this post about hypotheses. I have, many times, actually started a research project or paper by doing the research and then basing my hypothesis on my findings. I try to keep an open mind and to see all sides of the research before deciding which direction to take it; most of the time it takes its own direction and I just follow.

    Researchers are great at manipulating findings and data to fit their hypothesis. They are out to "prove" their hypothesis so they get whatever desired outcome they wish. All findings can be turned one way or another depending on the use and wording.

    This was a great post. Thank you for sharing!

    Our class blog is: edm310.blogspot.com
    My blog is: TidikisMirandaedm310.blogspot.com
