
Interpretation of CAP Scores Goes Beyond Raw Numbers

November 12, 1989|JIM NEWTON | TIMES STAFF WRITER

California Assessment Program tests, a staple of the state's educational efforts for more than a decade, offer telling nuggets of information about schools and their progress. But it takes a practiced eye to pick out the significance in the results, and experts warn that scores can be easily misinterpreted.

To help avoid that, educators caution against attaching too much significance to small yearly shifts in a school's averages.

"It doesn't tell you much to see that a school or a district has dropped a couple of points in any given year," said Fred Lange, director of instructional services for the Orange County Department of Education. "You really need to look at two or three years to find the trend."

That's because a certain instability is built into the test. Different groups of students are tested each year, so this year's third-, sixth- or eighth-graders are compared with last year's. Thousands of new students enter and leave the educational system every year, and one year's group of third-graders may be very different from the next year's.

Looking at results for several years gives a more accurate sense of where a school actually stands and helps mitigate factors that can skew results in a single test. A three-year sample gives a reasonably clear picture of a school, experts say, and though it does not measure the entirety of an academic program, it gives analysts plenty to go on. In the Anaheim Union High School District, for instance, officials have eyed stubbornly low reading and writing scores for years. Eighth-graders were not tested in writing last spring, but reading scores are reflected in the statistics.

Eighth-grade statewide averages in reading have steadily risen for the past four years, going from 243 in 1986 to 256 this year. Anaheim's scores have remained almost steady at about 260.

Those scores are not alarmingly low, but they show so little improvement that James Cox, the district's director of research and development, said officials are embarking on a revamped reading curriculum this year.

"We're turning to a more literature-based program," Cox said.

While three years of a school's raw CAP scores provide a better sample, they do not tell the whole story. Socioeconomic factors have a strong influence on test results, so schools in poor neighborhoods, even if they are doing commendable work, will often test lower than their wealthier counterparts.

"Kids who live in affluent areas are more educationally advantaged," Cox said. Affluent children often have access to books, parents who encourage them to read and quiet places to do homework. All of those can help their test scores regardless of what is done at their schools.

To take into account those socioeconomic factors, the CAP test assigns schools a rating that reflects the educational and professional level of students' parents.

Third- and sixth-grade tests determine socioeconomic status based on parents' occupational attainment. A rating of 1 indicates parents who are unskilled workers; a 2 indicates semiskilled laborers; and a 3 means parents are employed as skilled workers.

A five-point scale ranks the educational background of parents with children in the eighth and 12th grades. A rating of 1 stands for parents who did not graduate from high school; a 5 indicates parents with advanced degrees.

The socioeconomic ranking helps educators compare their schools to others of comparable means, a comparison generally considered more meaningful than one that looks at all schools regardless of their socioeconomic background.
