Evidence of effectiveness proves elusive

Educators looking for software to help their students encounter lots of claims but scant scientifically sound research.

September 20, 2004|By Alec MacGillis | Alec MacGillis,SUN STAFF

TALLADEGA, Ala. -- Graham Elementary School has no shortage of problems, so it might seem surprising that when it was awarded a $100,000-per-year federal grant earlier this year for "comprehensive school reform," the school used most of the money to buy educational video games.

The school, like 16 others in Alabama, purchased dozens of Sony PlayStations and, to go with them, reading and math programs from an education software company called Lightspan. Designed like video games, the programs came with colorful graphics, point totals and names like "Kazmania" and "The Three Decoders."

Graham Elementary Principal Frank Buck said he was persuaded partly by Lightspan's success claims at schools such as Lansdowne Elementary in Baltimore County, where 34 percent of kindergarten-to-second-graders supposedly posted major test score gains with the software.

Buck never called Lansdowne to check on this figure, but if he had, he would have learned that the company's claim was a bit of a stretch. Lansdowne Elementary administrators say that while the games were of some use, the school's gains had more to do with improved teaching, a partnership with the Baltimore Symphony Orchestra, and after-school clubs and tutoring.

"Like any company, they're going to make it look like the test scores were all because of them," said Lansdowne Principal Anne Gold. "There were many things happening here, and they were one piece. It wasn't all Lightspan."

Under pressure from the 2001 No Child Left Behind law, and from vendors using the law in their pitches, struggling schools across the country are spending heavily on education software programs that promise to raise their test scores.

But in many cases the products are unproven and the glowing claims behind them highly dubious - often based on flawed studies or on data compiled by the companies themselves.

While federally funded studies on the impact of education technology are under way, most of the results won't be available for two years. By then schools will already have spent millions on software in their attempt to comply with the law's tough performance standards.

"We need to get beyond platitudes. We need proof," acknowledged Susan D. Patrick, the U.S. Department of Education's director of education technology. "That's one of the big issues I hear from districts and states: `We need help with this, knowing what works.' ... There isn't anything out there now, and we realize that's a challenge."

Education software executives say they welcome objective research, but they deny making misleading claims. Mark Schneiderman, the industry's chief Washington lobbyist with the Software & Information Industry Association, said that No Child Left Behind's emphasis on accountability has made schools more discriminating in their software purchases.

"Schools are asking for evidence of products' effectiveness," he said. "They're going through a very thoughtful process when they make decisions."

Easier to take advantage

But many experts in education technology say that the 2001 law has made it easier for companies to take advantage of credulous administrators. Fast action is demanded: Schools must test all students annually from grades three through eight and show "adequate yearly progress" on scores not just school-wide, but also among minorities, special education students and other groups. Schools that fall short for several years running face funding cuts and possible state takeover.

Faced with so many requirements, few administrators spend the hours needed to sift carefully through vendor claims or hunt for better research.

"We're being bombarded; everyone says they have the turnkey solutions, but you don't have the time to review them," said Jayne E. Moore, the instructional technology director for the Maryland Department of Education. "It's overwhelming to try and discriminate. They'll provide their documentation, but how can you tell?"

The success stories are brandished with all the breathless fervor of weight-loss claims, but often they don't hold up under close inspection. Some examples:

Renaissance Learning, a Wisconsin-based software company, leads off a "scientific research case study" on its Web site by reporting that students at Sheridan Elementary in St. Paul, Minn., saw their scores on statewide math and reading tests increase an average of 43.5 percentage points over the four years since the school started working with Renaissance in 2000. Buried in the study is the fact that the school began using Renaissance's math software in only the last of the four years of steady increases.

Pearson Digital Learning trumpets, among many others, an elementary school in Hialeah, Fla., that, by using the Mesa, Ariz., company's SuccessMaker software, vaulted from a "D" state ranking to an "A" in the 1999-2000 school year. One has to look closer to discover that the school had been using SuccessMaker since 1994, long before the turnaround, which occurred in the same year as a major curriculum overhaul.
