Writing to Read in Baltimore: Expensive Lesson?

August 18, 1991|By M. WILLIAM SALGANIK

Mayor Schmoke needed a success story from the city schools before the September election. And he got one recently, or the appearance of one, with the release of an evaluation that showed the Writing to Read program had good results.

The mayor had pushed the program, marketed by IBM, which uses personal computers to improve the reading skills of kindergarten and first grade students. A study by researchers at the University of Maryland Baltimore County found that students in the 38 schools using the program had shown more improvement in reading than students at other schools who had started with similar reading scores.

That sounds like good news. Good news that the city schools, with the lowest reading scores in the state, had found a program that raises those scores. And good news that the city schools, with a history of trying programs willy-nilly, often with haphazard evaluation, seemed to be proceeding with more rigor. Buoyed by the results of the evaluation, school officials prepared a plan to install Writing to Read centers in all 122 city elementary schools and to add computer labs for other uses.

But the success is not so clear. And in the complexities of the Writing to Read story is a lesson in how complicated it is to "prove" what works in schools -- and how political leaders can run into trouble when they function as curriculum directors.

As an educational program, Writing to Read is expensive. Each lab, serving about 150 students, costs $30,000 to set up and another $30,000 a year to operate, according to school system figures. The city will pay IBM more than $2.1 million for last year's program. The plan to put Writing to Read centers and other computer labs in all elementary schools would cost more than $38 million over four years ($13 million to buy equipment and the rest in operating costs, maintenance and so on).

No one knows where that money would come from. The school department is hoping to interest a foundation or other outside source in paying for the computers.

If Writing to Read really helps city kids read -- and other programs don't -- it is easily worth the money. But there are reasons to question whether the city should press ahead with Writing to Read:

* Robert Slavin, a researcher at the Johns Hopkins University, reviewed 29 evaluations of Writing to Read, most in large, urban school districts, and concluded the program generally had little or no effect.

* According to the Slavin review, published in the journal Educational Evaluation and Policy Analysis, other programs have shown equal or better improvement in reading scores for kindergarten students -- and those programs cost a lot less.

* Even where evaluations concluded that a Writing to Read program had an effect in the short term, the few that conducted follow-ups found participants did not maintain their edge over other students a year after the program ended. An expensive kindergarten/first grade program is worth the money if it means students can then do better throughout their school careers. If they do better in first grade, but perform like other students in second grade, the program is not worthwhile.

* The largest evaluation of Writing to Read was commissioned by IBM itself and conducted by the Educational Testing Service. It found kindergarten students benefited from the program (compared to a control group), but first graders did not. Baltimore is using the program in both kindergarten and first grade.

* Writing to Read was tried in Baltimore in 1987, and a school department evaluation concluded: "The Writing to Read program did not demonstrate improvements for pupils. We do not think that further piloting is necessary."

The recent Baltimore evaluation was conducted on a much larger group of students (1,231 program participants this year, compared with just two classes in 1987), which makes it more persuasive. But the other evaluations should give officials pause.

But how can this be? Doesn't a program either work or not work? Don't educational researchers follow some sort of standards?

In an "experiment" conducted with real people in real schools, there are lots of complications that don't occur under laboratory conditions. And even if a program does "work," we may not be sure why it works. For example:

* In the Baltimore Writing to Read program, each student received an hour a day of work in the Writing to Read lab, in addition to regular reading and language arts instruction. This extra hour of daily reading instruction may account for the advantage over the control group.

* Some programs use learning activities similar to those of Writing to Read -- kids write stories to develop a better feel for language -- without the expensive computers. The computers may help, or similar gains might come from a writing-based curriculum without the hardware.
