Muddying the Data

Districts with scores at subterranean levels have been locked into required assemblies, forming multiple committees to solve the problem of resistant test scores. Especially in financially disadvantaged districts, resources commonly end up being redeployed repeatedly, since money to add new resources can seldom be found. Instead of checking for lead in the water fountains, maybe the district buys Chromebooks. Some districts in America are starving for funds.

In exceptional and unpredictable cases, a district may receive a grant. For example, Neal Math Science Academy received a three-year federal School Improvement Grant (SIG) a few years back. That grant does not seem to have radically improved performance. It's intriguingly hard to tell, though. The shift to the new PARCC test makes comparisons with the past tough. In cynical moments, I wonder if the push to adopt the PARCC test was aided and abetted by this useful blurring of past and present.

Is Neal improving? A few years ago, we could have compared Illinois State Achievement Test (ISAT) results over time to get a fairly clear picture. But comparing PARCC to the ISAT is like comparing apples to pomegranates. They are not remotely the same test, and the two instruments do not necessarily even measure the same attributes. The shift from pencils to keyboards alone was enough to affect results, without even considering the effect of content changes and the introduction of multiple right answers.

Eduhonesty: Nice job of obscuring the data, guys, in the name of improved data. Was that intentional?