THIS IS the story of the making of a credibility gap.
Last month, the city school system launched still another testing program, this one to be administered quarterly to all students in grades two through eight. Called the Baltimore Quarterly Assessment, the program is designed to "assess student learning in a format that is similar to the Maryland School Performance Assessment Program (MSPAP)," according to a memo from Superintendent Walter G. Amprey to his management staff.
"Administering and scoring the tasks in BQA," Amprey wrote, "will enable us to grow and develop in our understanding of the Maryland Learning Outcomes."
Numerous programs have been ordered from the top in Baltimore schools with little prior notice and virtually no training of the employees assigned to carry them out. But this was an extreme case. Many of the 4,000 or so elementary and middle school teachers who would have to administer and score the new tests protested. A few simply refused to score the tests (a task that is supposed to be completed by next Monday), opening themselves to charges of insubordination.
Amprey, who became a lame-duck superintendent with yesterday's signing of the city-state agreement resolving federal and state lawsuits, backed off a square or two. Several schools working on an innovative curriculum project were exempted from giving the new tests, and there will be no testing in the fourth quarter next spring, when the schools will be in the middle of MSPAP testing.
Education Beat set out to understand the dramatic disparity between the North Avenue view of the testing and that of a great many teachers. We talked to Amprey and to his assistant in charge of the BQA, Clarissa B. Evans, and we listened to a group of angry teachers one afternoon at the Waverly Library. Here we construct a dialogue from real quotations:
Teachers: They're assessing these kids to death, and this test is taking a bunch of time away from instruction. When will we have time to teach?
North Avenue: Many of these tests can be substituted for regular classroom testing and writing assignments. The tests are based on the city curriculum, so they should be a great help to teachers. Testing and teaching aren't two different animals.
Teachers: This isn't fill-in-the-bubble scoring. It's entirely subjective. The scores will be meaningless, and you watch, they'll use them against us. They're setting us up so that 6,000 teachers can be blamed for their mismanagement.
N.A.: There's still residual reaction to the shift from norm-referenced [fill-in-the-bubble] testing to criterion-referenced [more subjective, using a "rubric" of right answers] testing. Eventually, they'll get over it.
Teachers: Since the scoring is entirely subjective and none of us has been trained to do the scoring, how could anyone tell whether there's improvement from quarter to quarter?
N.A.: The main purpose really isn't to tell whether there's improvement from quarter to quarter. It's to make sure that teachers and students are familiar with the approach to learning on which the state is increasingly judging us.
Teachers: The test itself is a joke; it was slapped together, is full of errors and wasn't field-tested. Some of us couldn't understand it, and we're adults. A lot of the kids are leaving most of it blank. They don't understand the vocabulary. Where they do fill out everything, we have to grade 34 pages of answers for each student.
N.A.: Sure, there will be some problems, but we'll work the bugs out. The initial administration of BQA should be viewed as a learning experience.
Teachers: They'll eventually use these tests to evaluate teachers and weed out those they don't like. So what teacher in his right mind is going to give out bad grades on these tests?
N.A.: Student performance on BQA should not in any way be factored into teacher evaluation. Eventually, we might use the tests to identify specific professional development needs and to figure out which parts of the curriculum need to be revised.
Teachers: This isn't bad planning. It's no planning.
N.A.: It's kind of like a fire drill. People will go through the drill and become familiar with this kind of curriculum, this kind of testing. We'll have to work out the kinks, but one of the options that isn't open is abandoning the tests.
The six-year college degree
Factoid of the week: Only 25 percent of the freshmen who entered Maryland public colleges and universities in 1989 graduated four years later. After five years, half had earned a bachelor's degree. After six years, 58 percent had. Of the students who transferred to a four-year school from a community college in 1990-1991, only 10 percent graduated two years later.
Students are staying in college longer, doing more course "dabbling" and "stopping out" for a year or more. It's a major problem for the schools because it costs them money. Look for Maryland colleges and universities to take measures to move 'em through and move 'em out.
Pub Date: 11/27/96