
VENTURA COUNTY PERSPECTIVE

Things to Consider When Trying to Make Sense of the Stanford 9 Scores

The number of kids evaluated, preparation efforts and progress by cohort should all be taken into account when viewing the results.

September 3, 2000 | RICHARD FERRIER | Richard Ferrier lives in Santa Paula and teaches at Thomas Aquinas College. He serves on the board of the California Assn. of Scholars and writes on education issues for the California Political Review.

When California public schools chose to use the Stanford 9 to test our students, the idea was to get an honest, nationally norm-referenced test to let us know how well or badly our kids and schools were doing compared to other kids across the country. A big part of the idea was that educators, parents and taxpayers would look at the scores with some care and praise real progress--or put the educators' feet to the fire if the numbers were bad.

We now have three years of those scores, and the other day I tried, in my amateur way, to make some sense of them. I then went to the Santa Paula High School board meeting and did my civic duty. I told board members that we are not making much progress.

First, here is how to read the numbers, which The Times published in July. Oxnard High, to pick a school at random, got a 36% rating in reading in the 11th grade. That means that 36% of the kids in the school scored in the top half of the national sample, and 64% in the bottom half.

Eleventh-graders at Santa Paula High School, where my son David recently graduated, scored 30% in reading. That school's 11th-graders scored 16% the previous year.

Progress, right? Well, maybe--and maybe not.

Consider these caveats.

You should pay close attention to the number of kids tested. If the school tests nearly all of its students, then a shrinking number of students reflects the dropout rate, about 6% per year statewide. About 12% fewer 11th-graders took the test at Santa Paula High than 10th-graders, and the 10th-grade number was in turn about 12% lower than the 9th-grade number. In other words, a lot of kids were dropping out, and surely not, for the most part, the academic all-stars. That should actually help lift the average scores at the school.

Also, the intention of the state in mandating the test was not to see if schools could jack up scores by artificial preparation for this test. It was meant to be diagnostic. So if your school has made a major effort in test-taking strategy or in practicing for the test (both were done at Santa Paula), discount the scores a bit.

Moreover, as I learned from the testing experts at Mathematically Correct, a San Diego-based parents and teachers' group that advocates a back-to-basics approach to math, a small upward drift is to be expected simply from growing familiarity with the test's style. The same thing may happen to you or your son or daughter in taking the driver's test. You often do a bit better the second or third time out. This upward drift means nothing and should be discounted. It is surely still taking place after the third year of the Stanford 9, but should diminish in the next few years.

But most important, I suggest looking at progress by cohort, not by grade. That is, compare the scores of the class of 2001 as it moves through the school; don't compare the class of 2001 to the class of 2000 to the class of 1999.

At Santa Paula High, the 11th-grade scores in reading from 1998 to 2000 went from the 25th percentile to the 16th to the 30th. In math, they went from the 27th to the 22nd to the 39th. Did the school lose ground in 1999 and gain in 2000? And is it getting better?

Now look at the class of 2001 over these three years. This is a fairly stable group of students, although with a high dropout rate. The reading scores of this one group of students now run 24%, 19%, 30% while the math scores in these same three years run 42%, 31%, 39%. In three years, this group of kids dropped three notches in math and gained six in reading, compared to other students nationally. Given the caveats, this is an essentially flat performance, or even a slight drop.

The class of 2002 showed a drop of 10 percentile ranks in math and no change in reading, while the class of 2000 gained exactly 1% in each area, posting dismal scores of 16 in reading and 22 in math in the last year it was tested. I don't call this progress, do you?

We want to know, all of us--teachers, students, parents and administrators--how good, or how bad, the results are. That should lead us to look for oddities in the scores. At Santa Paula High, there is a consistent drop in each cohort from 9th to 10th grade, ranging up to 14 percentile ranks and averaging 8, and a gain of almost 12 percentile ranks from 10th to 11th grade.

*

What does this mean? Are the classes, texts and teachers worse in 10th grade and better in the other two years? How does this 10th-grade slump, and 11th-grade recovery, compare to county or state averages? Or to schools with comparable demographics? Oxnard High, for example, does not have such a profound 10th-grade slump and 11th-grade recovery. Is that school doing something different, and if so, what? I don't have details on curriculum to push this inquiry further, but I do hope the school board and the administrators and teachers look into this oddity.
