
Thompson: Foolish To Hope Data Can Be Used For Multiple Purposes

The Center for American Progress's "Movin' It and Improvin' It" report by Craig Jerald makes the claim that it is possible for school systems to use data for both "movin' it" (removing poor teachers) and "improvin' it" (making good teachers better). But Jerald should read Larry Cuban's latest piece in the Washington Post about the large body of social science showing that, in the real world, when data-driven evaluations are done by management alone, systems have to choose between one policy or its opposite. Or Jerald could just reread his own paper. He quotes a blogger who claims that "the real point of this reform (data-driven evaluations) is not punitive, i.e., firing bad teachers." Jerald replies, "If so, that point seems to have been lost on state legislators." I would add that the point has also been lost on districts, principals, and teachers. What did Jerald think we would hear? Are we all suffering from a mass hallucination? -JT (@drjohnthompson)

Comments


Craig Jerald here: John, I'm not sure I entirely understand your point because it's encased in all that snark. But I'm guessing you're simply trying to argue that districts shouldn't make personnel decisions based on performance data because teachers will always try to game the measures. If so, you seem to have an extremely cynical view of teachers, especially in light of the argument I made in the report about providing teachers with real opportunities to improve on the measures of effectiveness districts are adopting. You seem to be arguing that, given a choice between taking advantage of the "improvin' it" opportunities I'm arguing for (PD, coaching, peer-to-peer support) or gaming/cheating, most teachers would rather game/cheat. I couldn't disagree more. I think all but a very tiny number of teachers deplore gaming/cheating and, if given a real choice, would choose to do the right thing, both by themselves as professionals and by their students.

Craig - I *THINK* what John may be saying is that it is important to have data and feedback loops back to teachers outside of a formal evaluation system. That is certainly something we at New Teacher Center have argued for as part of a comprehensive teacher induction program. A formative assessment (FA) system provides an opportunity for teachers to receive such non-summative feedback from a mentor and to collaboratively identify strong/weak teaching practices along a standards-based continuum. Such an FA system is central to California's BTSA program and is a strong feature in state policies in Delaware, Idaho, North Carolina, South Carolina and Utah -- and in some induction programs in Hawaii and Oregon. NTC employs such a system in our program work across the country. http://www.newteachercenter.org/induction-programs/new-teacher-induction

Liam, I suspect you're being very generous in (re?)interpreting John's point. But, look, if that's the point he really was trying to make, then I agree wholeheartedly. I've never argued against purely formative feedback as a complement to evaluative measures and the feedback those measures generate. We should provide teachers with all the opportunities for growth that budgets allow, increase budgets for such opportunities if possible, and figure out innovative ways to stretch dollars to provide even more. Nothing I wrote in the paper argues against that, and some sections could be interpreted as arguing for it. (Though I do think it's important to provide teachers with feedback from evaluative measures as well, including individual classroom observations and student surveys, and to help them put that feedback to good use.)

Why would you think I was saying that teachers would be corrupted? Personally, I've never seen teachers cheat on standardized tests, but I've read about it. What I've seen, and read about, is a steady increase in teacher-proof, rote instruction imposed on teachers, the narrowing of the curriculum, excessive test prep, the pushing out of lower-performing students, and the standard array of counter-productive practices incentivized by data-driven accountability.

Click the link to Cuban's article and read another outstanding account of Campbell's Law and the 40 to 50 years of experience with data-driven policies corrupting professions ranging from medicine to television. Cuban began with the words:

"Numbers glued to high stakes consequences, however, corrupt performance. Since the mid-1970s, social scientists have documented the untoward results of attaching high stakes to quantitative indicators not only for education but also across numerous institutions ..."

He ended with Glazerman's 2011 Mathematica study.

When I read your old Ed Trust work, back in the day, I was saddened that you did not anticipate the unintended consequences of NCLB-type accountability. Well, the jury is in. Now NCLB-type accountability on steroids is being imposed on teachers. Why is it supposed to turn out differently this time?

And please recall my actual words: data-driven evaluations, IN THE HANDS OF MANAGEMENT ALONE, will result in punitive outcomes in many or most cases. I don't know how many real-world examples you want before you admit that systems of featherless bipeds have a long history of being corrupted by power, and of responding to coercion in unattractive ways.

And please remember that the prime victims are poor children being subjected to soul-killing rote instruction. Haven't you ever watched inner-city kids taking these punitive tests, or listened to them explain why they know their noses are being rubbed in it? You don't think poor teens realize how and why this indignity is being dumped on them?

John, did you bother to read more than a few pages into the paper?

1. The CAP paper was not about accountability. If you want to (mis?)read a recent paper I've written about accountability, here's one where I argued that states should consider an inspection system that considers multiple measures: http://www.educationsector.org/publications/her-majestys-school-inspection-service. Seriously, if you really want to have a discussion about school accountability policies, read that paper, and let's talk.

2. Speaking of which, yes, I did read the whole of Cuban's article, all the way to his policy conclusion that "The best policymakers, not merely good ones, know that multiple measures for a worthy goal reduce the possibility of reporting false performance." In my CAP paper, I specified that I was talking about teacher evaluation systems that consider multiple measures.

3. I'm mystified how you could have so completely missed the main point of the CAP paper, which was that as states and districts implement these new teacher evaluation systems, they are focusing disproportionately on policies like hiring, retention, and firing as mechanisms to increase "teaching effectiveness," and not nearly enough on teacher support and professional development. THE VAST MAJORITY of the paper addresses the topic of professional development! I specifically refute the argument made by Erik Hanushek and other economists that teachers can't improve and that professional development is a waste of money. What do you think about that?

I'm excited by the prospect of vigorous debate about that argument I made in the CAP paper. But your post and subsequent comment are so frustratingly off topic that it's difficult to understand your critique.

These proposals are marketing schemes for data processing and consulting providers, and have nothing to do with either "formative" or "summative" teacher evaluation.

As a chemistry teacher, I studied MCAS math scores for students I knew and taught, along with their growth scores, looking for useful insights. Even if you test them to exhaustion, a standardized test is like a shotgun. It's a bunch of static; a bubble test doesn't contain information about any specific teacher's teaching or student's learning.

I assure you, teachers don't need professional development in using your bogus data. "Data driven education" is a hoax, in anybody's "hands". Quit going along with it, colleagues.



Disclaimer: The opinions expressed in This Week In Education are strictly those of the author and do not reflect the opinions or endorsement of Scholastic, Inc.