Delayed for months by debate, discussion and dissection, ISTEP scores were finally released to the public yesterday. The question we have is whether the results should have been released at all, or whether, given the problems experienced this spring, it would have been better just to drag the results onto the desktop trash icon and hit delete.
We just aren't convinced that these results, whether good or bad, are all that valid.
Yes, the state brought in an "expert" to review whether this spring's problems had any measurable impact on the results, and he ultimately determined that, outside of a few exceptions, they didn't. But among those in public education with whom we've spoken, and whose views we trust, there's a difference of opinion.
“I just can’t imagine that there was no impact at all,” said Tim Grove, superintendent at South Knox — which, overall, again had the best scores in the county. “I struggle with the idea that 1,400 (of the test results) should be thrown out but the others are OK.”
Richard Hill, chairman of the National Center for the Improvement of Educational Assessment, was hired to investigate whether the crashing of CTB/McGraw-Hill's computer servers during the testing had any effect on the results. He recommended that about 1,400 results be thrown out.
All the other results, Hill assured state officials, were OK.
But even Glenda Ritz, the state superintendent for public education, has her doubts about that.
“We will never, ever know how our students would have performed had we not had the interruptions,” she said.
If, as Mr. Hill asserts, only those few students (less than 0.3 percent of all those who took ISTEP) were affected, then the statewide results were certainly disappointing.
After all the focus on improvement and implementation of the "reforms" of the Daniels/Bennett years, the results showed only slight improvement in scores: up 1.5 percentage points in math from the year before, with hardly any change at all in language arts.
If, however, as some educators suspect, far more students were affected by McGraw-Hill's problems, then the statewide results — as well as some of the wide swings up and down seen in the results of individual schools — back up those suspicions.
Much depends on the outcome of the ISTEP exam — teachers' pay, administrators' jobs, a community's self-esteem — too much, really.
Like that questionable A-F grading scale, the test is supposed to assure school accountability, with parents able to compare the scores of different schools and decide the one in which to enroll their children.
But ISTEP is a one-off event, whereas education is, well, it's like the baseball season: a long, drawn-out affair.
It's no coincidence that both have their grinds.
Evaluating a student's academic progress based solely on the ISTEP is like judging Joe DiMaggio based solely on his performance on July 17, 1941, against the Cleveland Indians, ignoring what he'd done in the previous 56 games of the season and what he would do in the next 17 games.
And to judge a student's academic achievement based on this year's ISTEP? That would be like expecting Don Larsen to throw a perfect game every time he started.