Kids' test same as the drill

January 14, 1994 | By Suzanne Loudermilk, Staff Writer. Staff writer Angela Winter Ney contributed to this article.

As they work on the Maryland Functional Writing Tests this week, Harford County ninth-graders are taking the identical test they have been using for months as a practice lesson.

Every fall, in preparation for the test, English teachers work with students to help them improve their writing skills. Each county develops its own guidelines, often using old topics and sometimes developing new ones.

The test consists of two sections -- a narrative part and an explanatory section -- which are given on different days.

The topics of each section, or prompts as they are called by school personnel, draw on a student's personal experience.

For example, students might be asked to write a letter to someone about a particular incident.

In Harford, local school administrators put together a "help packet," using test topics from previous years, and then distribute it to teachers.

This year, one of the practice topics, which was used in 1988, was also the one that students were asked to write about Tuesday in the first part of the current test, which is traditionally given during January.

"It was word for word," said a surprised Harford County teacher, who asked not to be identified. Original topics, not recycled ones, have been used since the writing test's inception in 1983, she said.

Today, 2,500 Harford students will take the second part of the test, again writing about a topic they have seen before, the teacher said. She also said that the students had worked extensively on this year's two topics.

"I helped them in organization; they finished the assignments; I corrected their errors; and they worked on them again," she said. "I'd say they had a bit of an edge."

The state Department of Education, however, has a different opinion on the issue of taking the same test that was used as a practice exercise.

"The prompts are a template for prompting students to write," said Ron A. Peiffer, spokesman for the state education department. "The likelihood that [using a familiar prompt] would increase a student passing is very, very small," he said.

The scores will eventually tell whether that's true.

The tests are sent out of state for grading, and it will be several months before school systems learn the results.

"In the past if there have been any design flaws [in other tests] that have disadvantaged the students, we verify that fact through scoring," Mr. Peiffer said.

"If we saw bizarre things with results, we would need to make changes."

The grades are also important, not only because they are a graduation requirement for students but because they are used as a criterion in the annual Maryland School Performance Report, which was established in 1990 to monitor the success of the state's schools.

In most school systems, at least 90 percent of students passed the writing test in 1993; only Baltimore City and Talbot and Caroline counties fell short of the satisfactory rating of 90 percent.

In last year's performance report, students in Harford schools averaged 96.6 percent on the writing test, above the 96 percent the state has established for an excellent rating.

Steven Ferrara, state director of student assessment, doesn't agree that the recycled prompts give Harford students an edge.

"It's not like Harford is the only system that has access to prompts," he said, although he did add that a newspaper article about the recycled tests could "make a bad situation worse."

While no similar incidents were reported from other counties yesterday, Dennis Younger, director of curriculum for Anne Arundel County public schools, said it could easily have happened, because teachers expected new prompts to be used.

"As you work with students to help them understand what the test will be like, you use examples of the kinds of things they'll be asked to do. Local prompts might have been used that are similar to the ones used today. It could very well happen," he said.

However, he wasn't concerned that the test results in Harford would be skewed.

"I think the writing assignment is demanding enough that you would still get a very good measure of a student's ability to respond to a writing prompt," Mr. Younger said.

Students still must "organize their writing, defend a position and summarize an argument in a reasonably complete fashion," even if they had practiced using the same prompts, he said.

Harford County administrators also weren't upset that their students had seen the test beforehand. "As we see it, we don't have an edge since all counties receive the same information we did," said Al Seymour, assistant superintendent of Harford schools.

Harford teachers are still wondering why no one told them that the topics weren't going to be new this year. "They are not supposed to have been seen before," said a teacher who has been involved with the tests since their beginning.

According to Mr. Ferrara, "We made a conscious decision to put in a plan and work at building a bag of prompts we could draw on."

There are now 15 to 16 pairs of topics the state could use, he said.

Asked why the teachers weren't told, since this is the first year the topics were reused, he had two responses: "Why would anyone announce this?" and "You want me to divulge test security?"

A Harford teacher said that before this year the tests were handled with utmost confidentiality and always arrived at the schools sealed, leading her to believe that the secrecy of the topics was important.

She also pointed out that the student materials handbook on the tests says that students should be prepared for the test "by giving them practice writing assignments which simulate the requirements" of the writing test.

"Simulate doesn't mean using the same [test] to me," she said.
