Wednesday, December 15, 2010

Disagreements with study interpretation . . .

Well, there has certainly been follow-up in the blog world over the release of the Gates-funded study on value added and student survey responses that I blogged about on Sunday. In this Education Next post, Jay P. Greene disputes the claim made in the New York Times article about the correlation between teachers who drill for standardized tests and their value-added scores.


One notable early finding, Ms. Phillips said, is that teachers who incessantly drill their students to prepare for standardized tests tend to have lower value-added learning gains than those who simply work their way methodically through the key concepts of literacy and mathematics. (emphasis added)


I looked through the report for evidence that supported this claim and could not find it. Instead, the report actually shows a positive correlation between student reports of “test prep” and value added on standardized tests, not a negative correlation as the statement above suggests. (See for example Appendix 1 on p. 34.)


The statement “We spend a lot of time in this class practicing for [the state test]” has a correlation of 0.195 with the value added math results. That is about the same relationship as “My teacher asks questions to be sure we are following along when s/he is teaching,” which is 0.198. And both are positive.


It’s true that the correlation for "Getting ready for [the state test] takes a lot of time in our class" is weaker (0.103) than other items, but it is still positive. That just means that test prep may contribute less to value added than other practices, but it does not support the claim that “teachers who incessantly drill their students to prepare for standardized tests tend to have lower value-added learning gains…”

I won’t copy the whole article, but I wanted you to see what he found in his analysis. Silly me, I took what the newspaper reported as factual, though I was more interested in how the students answered the survey questions than in how drilling influenced the value-added scores. He also claims that the same information was misinterpreted in the LA Times article that came out over the weekend.
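
For anyone curious how a comparison like the ones Greene cites is actually made, here is a minimal sketch in Python of computing a Pearson correlation between classroom survey averages and value-added scores. The numbers below are made up purely for illustration; they are not data from the MET report.

```python
# Hypothetical illustration: the classroom values below are invented,
# not taken from the MET study.

from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Class-average agreement (1-5 scale) with a test-prep survey item,
# one value per classroom (made-up numbers).
test_prep = [3.1, 4.2, 2.8, 3.9, 3.5, 4.4, 2.5, 3.0]

# Value-added math gain for the same classrooms (made-up numbers).
value_added = [0.02, 0.15, -0.05, 0.10, 0.04, 0.12, -0.08, 0.01]

r = pearson(test_prep, value_added)
print(f"correlation = {r:.3f}")  # a positive r means more reported test prep
                                 # goes with higher value added, on average
```

A positive coefficient like the 0.195 Greene points to simply says the two measures tend to move together; it says nothing about drilling causing lower gains.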

Alexander Russo, in a This Week in Education post, also takes exception to the LA Times report, but he does agree with the New York Times’s focus on the influence of student answers on learning gains. If you have the time and are interested, you can read the initial findings of the report here.

What did I learn? To be more skeptical about how reporters interpret the findings of education studies. To wonder what communication between the Gates Foundation’s people and the newspaper people led to conclusions that may not be supported by the study’s findings. To wait for reactions to the study from those who actually read it before sharing it in a post, or at least to acknowledge that there will be reactions. It will be interesting to see how the reporters and Foundation staff respond to these critiques of the articles, or whether they respond at all.

There are too many people looking for easy answers to what is wrong in our profession. The issue is complex, and the answers will not all be found in our classrooms. Yes, we can and must do better at ensuring that all students experience success. And yes, there will be information from this study that we can learn from. We can do this, however, without the misinterpretation and finger-pointing that normally accompany a $45 million study.
