
Media: When Journalists Do Their Own Data Analysis

Behind the scenes, Green Dot charter schools and the LA Times continue to debate the merits of the paper's story from last week (LAUSD bests reform groups in most cases), which purported to show that low-performing district schools were making more progress than turnarounds operated by Green Dot or by the Mayor's cluster of schools, which are run by outside nonprofits. Read on if you're interested in what Green Dot is saying, how the newspaper has responded, and what the implications are for other newsrooms interested in conducting their own research.

What Green Dot disputes (see email below) is whether it's fair to compare district schools in the bottom 20 percent with schools where outside management has been brought in, which are in the bottom 5-10 percent. The charter network also argues that the LAT's analysis omits retention effects (i.e., the impact on scores of lower dropout rates). And they don't like the headline.

According to the paper, it had no idea which way the data would point before running the analysis, there was no realistic way to construct a more tailored comparison group than the bottom 20 percent, retention data aren't available for all schools, and proficiency rates are key, if incomplete, indicators of school success. Obviously, the district was until recently responsible for the schools Green Dot and the Mayor are now trying to rescue. The paper has so far declined to run a correction, since no mathematical mistake is involved.

This is interesting to me not only because I wrote a book about the Green Dot turnaround effort at Locke (which included early debate about the retention/proficiency issue) but also because it's an example of (a) a newspaper doing its own analysis in-house rather than relying on outside experts (as the LAT did with the value-added story last year) and (b) the strong reaction that newspapers and others generate when they report that much-publicized reform efforts may not be performing as well as expected. (Think Urban Prep in Chicago, the Obama success stories, Harlem Children's Zone, etc.)

For better or worse, papers seem to be doing their own research and analysis, which is faster and more tailored to journalistic needs but also leaves experienced independent experts out of the process. For better or worse, some reformers have decided to respond aggressively when newspapers do their own analysis and report less-than-stellar findings rather than letting critical reporting stand.  I'm still checking to see if anyone has re-run the data or examined the assumptions built into the analysis.  Who knows what would be revealed, but I'd love to see what an outside expert thinks of the decisions and data the paper used.

GREEN DOT EMAIL

Firstly, the headline doesn’t match the story.  If you read the article and review the data, a more appropriate headline (or at least a caveat!) would be, “Green Dot’s Locke High School shows a faster pace of improvement in student proficiency than other high schools in the analysis.”

Secondly, the methodology/analysis that the Times uses to draw its conclusions is flawed, to the point that it punishes schools that reduce the dropout rate. In order to measure the success of a turnaround you must also focus on retention and rigor, in addition to results.  At Locke High School not only have our test scores increased, but we have managed to retain more students and double our proficiency levels, while offering students a more challenging college-prep A-G curriculum. Retention, Rigor, and Results is the trifecta for school turnarounds.  (See attached for our more detailed analysis of the issues, and the appendix for a more thorough explanation of why this is and how these comparisons should fairly be made.)

And thirdly, the Times does not use the correct comparison set.  It compares reform-led schools such as Locke, which started out in the bottom 1%, with LAUSD schools in the bottom 20% range. A better comparison set would consist of schools that had similar performance levels in 2007-08, the starting point of the analysis.

We will agree with the LA Times on one thing: we have a long way to go to eliminate the achievement gap and to create “turnaround miracles” from chronically underperforming schools.  But let’s start with a fair and comprehensive analysis that accurately portrays the important work being done by all parties to turn around failing schools and the progress being achieved. We invite the LA Times and other education analysts to re-examine the analysis with the right numbers and compare schools that were at the same starting point at the beginning of the period under consideration.  Then, let’s have a frank conversation about what is and isn’t working.
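
Green Dot's third point is easy to make concrete. Below is a minimal sketch, in Python with pandas, of the kind of baseline-matched comparison the charter network is asking for. To be clear, this is not the Times' or Green Dot's actual code; the school names, proficiency numbers, and the three-point matching band are all invented for illustration. The idea is simply that a turnaround school gets compared only to schools that started at roughly the same proficiency level, rather than to the whole bottom quintile.

import pandas as pd

# Hypothetical data: percent-proficient in the 2007-08 baseline year
# and in the latest year. All names and numbers are invented.
schools = pd.DataFrame({
    "school":    ["Locke", "A", "B", "C", "D", "E"],
    "prof_2008": [5.0, 4.0, 6.5, 14.0, 18.0, 19.5],
    "prof_2011": [15.0, 7.0, 9.0, 19.0, 22.0, 24.0],
})
schools["gain"] = schools["prof_2011"] - schools["prof_2008"]

target = schools.loc[schools["school"] == "Locke"].iloc[0]

# Broad pool: every other school in the table, the rough analogue
# of comparing against the whole bottom quintile.
broad = schools[schools["school"] != "Locke"]

# Matched pool: only schools whose baseline was within 3 points of
# the turnaround school's. The 3-point band is an arbitrary choice.
matched = broad[(broad["prof_2008"] - target["prof_2008"]).abs() <= 3.0]

print(f"Turnaround gain:   {target['gain']:.1f} points")
print(f"Broad-pool mean:   {broad['gain'].mean():.1f} points")
print(f"Matched-pool mean: {matched['gain'].mean():.1f} points")

In this invented example the turnaround school outpaces both pools, but the two pools' average gains differ, and where the baseline and the band are set can change how flattering the comparison looks. That is exactly the kind of assumption I'd want an outside expert to kick the tires on.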

Comments

The L.A. Times made some pretty arbitrary decisions in preparing this story (for example, why is the bottom quintile the "litmus test" for school reform, rather than the next quintile up, or the quintile above that? Why is any one subsection of the population more important than another? And why 20%? Why not 10%, 5%, or 25%? Why is moving from basic to proficient more important than moving from far below basic to below basic, or from proficient to advanced?).

Green Dot does make one valid, important point in its criticism: the Times' decision to ignore retention and focus only on test scores. Every educator knows the easiest way to raise test scores is simply to eliminate low scorers from the system. The L.A. Times blew it when it decided to use a single measuring system for both primary and secondary schools, which are quite different institutions and serve different purposes. A consistent mistake made by educational assessors of many stripes is to base the assessment on data that are easy to find, rather than on what is important to the students and their lives.

No excuses, Green Dot.

from rishawn biddle on the EWA listserv, used with permission:

Honestly, even in the education policy world, you don't have folks refereeing data and research disputes. Either a piece is peer-reviewed before publication or peer-reviewed after. There are no referees.

Let's face facts: There will always be disputes about data, largely because all sides have interests to protect, whether ideological, financial, reputational, or otherwise. And when a newspaper publishes its own analysis, some outfit or another will take issue with the results because the results may not favor it. The best an outlet can do is let the aggrieved party write an op-ed or letter contesting the findings. Given that organizations have ample media capacity of their own (or can build it up if they so choose), there's no reason for media outlets to start bringing in third parties, who will often have their own stake in these matters and aren't likely to be neutral anyway. And if the data is absolutely inaccurate, the media outlet will be forced to admit as much because, well, it is in its best interest to do so.

This isn’t to say that media outlets should behave recklessly when handling data. Media outlets should always be thoughtful about interpretation and admit the limits of what data can reveal. It is also a good idea to talk to researchers about the quality of the data and how to proceed with an analysis. At the same time, there is nothing wrong with organizations challenging those interpretations. Interpretations of data, and challenges to them, are healthy aspects of every debate. Ultimately, this is good for everyone involved, especially for families and taxpayers who deserve to know more about what happens in American public education.

yesterday's LA Times editorial page praised a new law that will include student retention alongside proficiency rates -- just what Green Dot says the LAT should have done (though it is not yet current law):

http://www.latimes.com/news/opinion/opinionla/la-ed-test-20110824,0,7688499.story

from Joe Tyrrell of New Jersey Newsroom:

Not quite sure what "professional limitations" is intended to imply, but insofar as media organizations are able to do independent data analysis without the usual babble of talking, or screaming, heads, that's a plus for journalism and its consumers.

Given the limitations inherent in the data as opposed to the professionals, this seems an exemplary if short story. It includes responses from the relevant parties and makes the argument that privately administered Locke showed large improvement relative to where it started, just not as much as the average school.

As usual, those with the greatest financial stake are the most alarmed that this particular set of numbers does not completely flatter them. But not every story or data run needs to address every possible argument or counter-argument.

Of course, the newspaper should provide space for Green Dot and others to make their points, which might prompt follow-up coverage. If that produces community-wide discussion, as opposed to comments from the mayor and the commercial interests, then it would be a job well done.

The idea that data needs to be vetted prior to publication by "experts" other than statisticians or IT staffers is one of the pernicious myths of the current education scene. That's a recipe for producing the same coverage over and over, mediated among the loudest interests in the sector: commentators, consultants, politicians, and school-related businesses.

from matt tabor of ednews.org, with permission:

Why not send it to all concerned parties? Seems that there's a place at the table for the ombudsmen, the outside experts, all of them. If you're going to work something up, not only should everyone relevant weigh in, but originating journalists should *want* them to weigh in to confirm, dispute, etc.

There's lots of expertise out there and multiple perspectives, and I don't see any reason the folks in the newsroom shouldn't take advantage of that even after their analysis goes out (I'd prefer before, but reality and professional limitations usually get in the way). But if that were common practice -- a sort of 'sector-wide review' that newsroom data types actively seek -- then we'd expect the *next* LA Times article to contain a bit more context and nuance.


Disclaimer: The opinions expressed in This Week In Education are strictly those of the author and do not reflect the opinions or endorsement of Scholastic, Inc.