Superintendent knows the score

Tests: In delaying the announcement of questionable MSPAP results, Nancy S. Grasmick has opted for a little furor now in order to avoid a major crisis later.

November 11, 2001|By Howard Libit | Howard Libit,SUN STAFF

Experts say there's one rule that stands above all others when it comes to state testing programs: Never, ever publish results if you aren't sure they're correct.

"There has to be confidence in the quality of the results," says Walter M. Haney, Boston College professor and senior research associate in the school's Center for the Study of Testing, Evaluation and Educational Policy. "People have to believe that the results they're seeing are accurate."

In states where mistaken scores have been published - and there have been many - the effects have been widespread: hundreds of schools wrongly praised or scorned, thousands of dollars in bonuses given to teachers who didn't deserve them, and even the scrapping of entire testing programs.

So when Maryland School Superintendent Nancy S. Grasmick announced this week a delay in releasing the results of the state's annual exams, she opted for a little short-term furor in hopes of avoiding a larger crisis later on.

"We need to ensure the integrity of the exams," Grasmick says. "I don't want to release scores that I don't have confidence in."

During the delay, state education officials and some outside researchers will look at the 2001 results of the Maryland School Performance Assessment Program (MSPAP) exams and try to explain what they describe as "wild swings" in the scores, both up and down.

While many schools have posted large annual gains and losses since the MSPAP scores were first publicly reported in 1993, officials have generally been able to find reasons for the changes - often new principals or programs.

This year, state officials and educators in the 24 local systems could find no such explanations for more than 100 schools, prompting internal questions about whether something was amiss with the scoring.

Though state officials say they're just as likely to release the same scores in January that they had intended to release later this month, the delay will give them more time to check out the results, just to be sure.

"You can't take them back once they're out there," says Richard Hill, executive director of the National Center for the Improvement of Educational Assessment Inc.

Hill's New Hampshire-based nonprofit group - which has never previously worked on the MSPAP exams - was hired by Maryland officials to review the tests, and Hill says he sympathizes with the problem.

In 1997, Hill's former company was working with Kentucky when scoring problems were discovered in that state's tests. A small miscalculation led to results being misstated, and the state paid some bonuses to teachers at schools that hadn't earned them.

Within a year, amid growing public criticism, Kentucky's legislature voted to dump the state's testing program and replace it with a new one.

"No one wants the scores of schools to be based on capricious data," says William Schafer, a retired statistics professor at the University of Maryland, College Park who is involved in analyzing the MSPAP exam results.

In 2000, thousands of students in Minnesota were mistakenly told they had failed state exams required for graduation, and 54 seniors were denied diplomas and barred from graduation ceremonies.

Also last year, officials in New Mexico revealed they had miscalculated their list of most-improved schools. They ended up replacing the 94 schools initially on the list with a different set of 101 schools and reassigning the $1.8 million in rewards.

And this fall, California officials are trying to figure out how a mistake in the state's scores prompted more than $750,000 in bonuses to go to the wrong teachers.

"When mistakes have happened, there have been legislative investigations, and legislatures start passing laws mandating changes in the tests," says John F. Jennings, director of the Center on Education Policy in Washington, D.C. "As usual, [Grasmick] has shown herself to be very sure-footed in the face of a potential crisis.

"It's embarrassing to do what she had to do, but it's far less of a problem than publishing the wrong results and then having to take them back," says Jennings, who is co-chairman of a task force examining Maryland school reform.

While Maryland doesn't use the MSPAP exams to give individual scores, it does use the results to reward improving schools and take over those that are failing.

"Delaying the results and taking the time to investigate shows guts and integrity and superb judgment," says Steven Ferrara, an analyst with the American Institutes for Research who helped develop the MSPAP exams. "These scores help influence so many important decisions, you have to be sure the scores are correct."

And with the MSPAP exams under scrutiny from some parents and teachers who question their influence on classrooms, it becomes even more important that the 2001 results are correct.

"When you're being held up to such a microscope as the MSPAP, you wouldn't want to come out with a set of scores and then come back three weeks later and say, `Oops, we were wrong,'" says Patricia A. Foerster, president of the Maryland State Teachers Association. "It's the quality that's the important thing."

Howard Libit is The Sun's state education reporter and has been covering K-12 education for seven years.
