Wide review of tests begins

Focus is on scoring of MSPAP

State hires consultant

Results puzzle educators

November 08, 2001 | By Howard Libit, SUN STAFF

In their search for an explanation for puzzling state test scores, Maryland education officials say they plan to focus much of their attention on the complicated statistical analyses used each year to calculate the results.

But officials said yesterday that no part of Maryland's testing program will escape their emergency six-week probe - including classroom instructional techniques, the way the tests are given and the summertime grading by teachers.

"We would like to be able to figure out what's going on and have some results for 2001," said Assistant State Superintendent Ronald A. Peiffer. "We think the issue is in what happened after the tests are graded, but we're not sure."

The extraordinary review of scores from last spring's Maryland School Performance Assessment Program exams was ordered this week by state schools Superintendent Nancy S. Grasmick, who was concerned about unexplained rises and drops in the scores of more than 100 schools across the state.

The scores - which typically are released just after Thanksgiving each year - will not be announced until January.

Officials say they don't know whether the review will change the results, but at the very least they want to better understand the large fluctuations before publishing scores that are closely scrutinized by the state's principals, parents, teachers and pupils.

Grasmick's decision has been strongly supported by the state's 24 local superintendents. The superintendents had flooded Grasmick's office with phone calls after privately seeing their systems' results and questioning the scores of many of their schools.

During the six-week delay, state education officials will conduct an in-depth review of the scoring process.

They've also asked a New Hampshire-based testing organization that's never done work on the MSPAP exams to look at the state's program and scoring process, agreeing to a $40,000 contract with the National Center for the Improvement of Educational Assessment Inc.

"We're willing to look at anything," said Richard Hill, executive director of the nonprofit organization that has consulted with at least nine other states on testing. "We'll step back and help them resolve whatever issues might be found."

Yesterday, state education officials said the strange scores were found across all three grade levels that take the exams - third, fifth and eighth - and across all six subject areas in the tests.

Unlike traditional, multiple-choice exams, the MSPAP tests include open-ended questions that require pupils to write short answers and essays.

The tests are scored each summer by certified Maryland teachers at several sites across the state, supervised by a North Carolina testing company.

State education officials say the test administration in the spring and the summer grading appeared to proceed as smoothly as in past years.

While they will re-examine those aspects of the 2001 exams, much of the review will focus on what occurs later in scoring the tests - including statistical steps to ensure that this year's exams were as difficult as last year's.

"The object is to make sure the scores this year are worth the same as the scores last year," Peiffer said.

Testing experts say states across the country have experienced problems with the post-grading statistical analyses - known as scaling and equating.
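The article does not describe Maryland's specific equating method, but one common textbook approach to the problem it names is mean-sigma linear equating: adjusting this year's scores so their mean and spread match a reference year's scale. The sketch below is purely illustrative, with made-up score data; the function name and method are assumptions, not the state's actual procedure.

```python
import statistics

def linear_equate(new_form_scores, ref_form_scores):
    """Place scores from a new test form onto a reference form's scale
    using mean-sigma linear equating: shift and stretch the new form's
    distribution to match the reference form's mean and std deviation."""
    mu_new = statistics.mean(new_form_scores)
    sd_new = statistics.stdev(new_form_scores)
    mu_ref = statistics.mean(ref_form_scores)
    sd_ref = statistics.stdev(ref_form_scores)
    slope = sd_ref / sd_new
    return [mu_ref + slope * (x - mu_new) for x in new_form_scores]

# Hypothetical example: this year's form came out slightly harder,
# so raw scores run 5 points lower than last year's.
last_year = [40, 50, 60, 70, 80]
this_year = [35, 45, 55, 65, 75]
print(linear_equate(this_year, last_year))  # → [40.0, 50.0, 60.0, 70.0, 80.0]
```

A systematic error at this step, such as a miscalibrated slope or mean, would shift every school's result at once, which is why the review focuses on it.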

"It seems to be increasingly common," said Walter M. Haney, Boston College professor and senior research associate in the school's Center for the Study of Testing, Evaluation and Educational Policy. "As more and more states are expanding their testing, this is an issue where they're finding problems."

If Maryland were to change how it processes the data, testing experts suggest it's possible the state would have to go back and restate the results of exams dating to 1993 - a possibility state officials call very unlikely.

Though other states have had scoring problems, researchers said they were not familiar with states having such wild swings both up and down as Maryland officials say they have found.

"There have been states where the scores have been high, and states where the scores have been low, but I'm not aware of other examples where there were questions about large increases and decreases," said Monty Neill, executive director of the National Center for Fair & Open Testing in Cambridge, Mass.

Those who work on Maryland's exams say they don't believe substantial problems will be found in the MSPAP's scoring and statistical procedures.

Even in a recent critical report of the MSPAP exams by the Abell Foundation, the "psychometric review" section generally praised the validity and reliability of the tests.

"It's probably a good thing to have someone look over their shoulder and make sure everything is being done correctly," said William D. Schafer, a retired University of Maryland, College Park statistics professor who continues to consult with the state on the MSPAP exams.

"But I believe that what's going to happen is that the scores will not change. I believe that the scores we have now are accurate," he said.
