Test jargon belies thoughtful scoring

The Education Beat

MSPAP: Teachers spending the summer reviewing exams say pupils' gains in reading are evident.

August 01, 1999 | By Mike Bowler, Sun Staff

SCHOOL TESTING officials don't talk like the rest of us. They speak not of roses, crabs and beer, but of norms, cluster equating, rubrics and coefficient alphas.

So when I set out last month to watch the scoring of the 1999 Maryland School Performance Assessment Program reading tests, I expected incoherence.

It wasn't that bad. The MSPAP scorers are Maryland teachers who speak plain English, and several of them spoke of their joy -- even after years of scoring -- when they come across a creative response from a totally anonymous Maryland third-grader.

But I didn't get near the actual scoring at Chesapeake High School in eastern Baltimore County. These are high-stakes tests, earning rewards and punishments for schools across the state. Even my knowledge of the theme of test items would have jeopardized the program, and the arrival of a Sun photographer nearly caused a nervous breakdown among state testing officials and those from Measurement Inc., a Durham, N.C., test consulting company under contract at $1.9 million a year.

The shorts-clad teacher-scorers, about 740 at four sites, work for an average of $10.50 an hour. About half are repeaters, and some have scored since MSPAP's beginnings in 1991. By Friday, when this year's six-week task will be complete, they'll have looked at 190,000 booklets filled out in the spring by Maryland third-, fifth- and eighth-graders.

While I was kept far away from any test booklets, I was welcome to talk to a dozen or so scorers, and all testified that the slow but steady rise of MSPAP reading scores is for real, especially in the third grade. "Most of the students are finishing [the tests] now," said Anne Park, a teacher at Shrine of the Sacred Heart School in Baltimore. "There used to be lots of blank booklets and blank pages."

I'd thought the reading part of MSPAP stood alone, so I was surprised to learn that reading is often integrated with other subjects.

"That's a logical and natural way to judge reading," said Nancy S. Grasmick, state schools superintendent, with whom I visited the scoring center. "Reading is a part of everything else. You can't do social studies if you don't know how to read. You can't do science or math, either."

A third-grade "exemplar" on one of the Education Department's Web sites (mdk12.org), for example, asks pupils to follow written directions and do a graph about recycling. That's part of the mathematics test, and it's scored separately. Then pupils write a persuasive article about recycling and shorter passages about what they learned from the graph -- and that's part of the reading test. (When they're to be judged on language usage -- grammar, punctuation and the like -- MSPAP alerts them with an icon, a symbol telling them it's time to get serious about their writing.)

It's clear from this "public release task" -- one of a very few examples security-conscious state officials have made public -- that MSPAP reading is at least as much about writing, and that's one criticism of the test. Pupils summarize passages they've read. They respond to tasks that begin, "If you were the editor of a newspaper ..." or "If you were a character in the story you've just read ..."

About a third of MSPAP's reading exercises are for "literary experience" -- poems, stories and the like. Two-thirds reflect "real-world" situations, such as following directions. All require complex thinking, and all require writing. They may be a better indicator of the quality of the second "R" than of the first. But there's no way in a mass testing program to hear 190,000 kids reading. In testing, reading and writing are joined at the hip and probably always will be.

Last week, the State Board of Education added a test to the array. In the second, fourth and sixth grades, Maryland kids will take a nationally "normed" commercial test, the Comprehensive Test of Basic Skills (CTBS). It measures individual pupils against their peers across the country.

I asked Gary Heath, chief of the state arts and sciences branch, how CTBS and MSPAP differ in reading testing.

It's in the response, Heath said. The two tests have similar questions, although the passages in the Maryland test are longer and require a broader range of reading. But CTBS has multiple-choice responses that can be read (much more cheaply) by a computer.

In defense of MSPAP, Heath used test language, but he made a good point: "If I'm a businessman, I'm probably not given the privilege of writing my boss in a multiple-choice response mode."

On the other hand, Heath allowed that a close correlation exists between performance on the tests. In other words, the teachers scoring MSPAP "by hand" last week in Essex would get roughly the same results as CTBS's computer in California.

Perhaps that's comforting to Maryland school officials, but it raises an important question: If they come to the same conclusions, why give both?
