State officials fail science test


July 25, 2008|By John Monahan

One of the toughest things I have to do as a Baltimore biology teacher is to teach my students about the scientific method. That is, basically, the set of rules under which science operates.

Every year, when my kids take the High School Assessment, they have a lot of difficulty with that section of the test. They don't quite understand variables or how to run a controlled study. I always worried that this would hinder them if they went into a scientific profession. Now, however, I can take comfort in the fact that it prepares them for jobs with the Maryland State Department of Education.

For the benefit of my students, let me explain. When a researcher designs a study, it's important to take controlled variables into account. For instance, let's suppose that I want to test whether zebra mussels, an invasive species, are better than native mussels at removing polychlorinated biphenyls, or PCBs, from the water. (This is from an actual question on a past HSA.)

As the researcher, I put 20 zebra mussels in a tank of water, with a fixed amount of PCBs added. I set up another tank of water, with 20 native mussels, and add the same amount of PCBs to that tank. After a week, I can then measure the levels of PCBs in both tanks. It is essential, if I'm going to have a valid study, to keep the conditions in the tanks as close to identical as possible. I can't put PCBs into one tank, and then add an equivalent chemical to the other, even if the other chemical looks like PCBs, smells like PCBs and is just as toxic as PCBs. The fact that it is a different chemical invalidates the test.

The same holds true for the MSA (Maryland School Assessment) and the HSA. If you are going to compare test scores from one year with test scores from another year, which many Maryland education officials seem to be gleefully doing, it is essential that you are comparing the same test. If you use different tests, even if they are of equal difficulty, it invalidates the comparison.

It is quite possible that this year's drastic increase in MSA scores is the result of a better curriculum or better instruction or improved textbooks. It could also be because this year's test was shorter than last year's, or because the instructions on this year's test were clearer, or because the questions were more closely aligned than last year's with the curriculum students were taught. Once you change the test, it becomes impossible to tell.

No wonder our students don't know how a proper scientific study should be conducted.

If we want to tell how well our kids are being educated, we should rely on something that's not as easily manipulated. We could look at the percentage of students who graduate, or the percentage accepted into college, or better yet, the percentage who graduate from college. Try comparing the average income of students five years after they graduate. Comparing those figures from year to year might give us some useful data.

Instead, with the No Child Left Behind sword hanging over our heads, we design and administer tests that are tailor-made to produce increasing scores. Politicians and administrators take credit, and anyone who questions the validity of the tests and the supposed increases is dismissed for questioning the achievements of black and Latino children.

What a fine example this sets for our kids. I wonder what lesson they will learn from it.

About the author

John Monahan teaches science at Patterson High School in Baltimore.


