California and the West

State Scraps Flawed Data That Compared Similar Schools

Education: Raw scores are unaffected, officials say, but ethnic, economic classifications were skewed.

March 12, 2000 | MARTHA GROVES | TIMES EDUCATION WRITER

The California Department of Education has yanked information from its Internet site that ranked schools against others with similar characteristics, acknowledging that data used to compute the rankings were more extensively flawed than the agency had realized.

Doug Stone, an agency spokesman, emphasized Friday that schools' raw scores on the state's first Academic Performance Index and the 1-to-10 statewide rankings of schools were not affected by the data problems.

Rather, the inaccuracies skewed rankings of schools against counterparts with similar ethnic and economic traits.

About 600 school districts with a total of 4,300 schools, nearly two-thirds of the 6,700 schools rated in the Academic Performance Index, have told the department that they inadvertently supplied faulty data--or no data at all--on the percentages of students qualifying for free or reduced-price lunches under federal guidelines. The federal lunch program is used as the measure of poverty among schoolchildren.

The department gathered the data from a form on last spring's Stanford 9 standardized achievement test. Some districts apparently paid little attention to whether those forms were filled in properly.

"We suspect that many of the districts didn't realize this data would be used in a high-stakes accountability measure," Stone said.

The index, released with great fanfare Jan. 25, is the cornerstone of an ambitious state program to gauge the academic improvement of schools. It will serve as a baseline for determining which schools will eventually qualify for financial rewards and which might come in for sanctions.

The statewide rankings, based on last year's scores on the Stanford 9, attracted intense interest from schools, parents and even real estate agents eager to promote the benefits of living in neighborhoods where schools achieved 9s or 10s.

Many schools with rankings of 1s or 2s seized on their better showings against comparable schools as a sign of hope. It remains to be seen how such schools will fare when the rankings are revised.

Soon after the listings were unveiled, about 400 schools notified the department that their data were erroneous. Many school officials realized that because they had underreported the percentage of low-income students, their campuses were grouped with 100 counterparts in more affluent areas, where achievement tends to be higher.

In mid-February, Delaine Eastin, state superintendent of public instruction, wrote to superintendents and gave them three options: They could stick with the numbers originally reported on the Stanford 9; they could use a different percentage calculated by the state office that runs the school lunch program; or they could submit a percentage calculated by the district.

The department says it now has the corrected data from districts and plans to revise the rankings for posting in mid-April on its Web site (http://www.cde.ca.gov). For the 2000 index, the department advised districts, only data from the Stanford 9 information form will be used.

Among Southland districts that replied, the giant Los Angeles Unified School District requested changes for only 11 schools.

Hacienda La Puente Unified, by contrast, asked for changes, many of them dramatic, for all its schools.

In Montebello Unified, two-thirds of schools needed revisions. The percentage of students qualifying for free or reduced-price lunches at Bell Gardens Elementary School, for example, will jump to 100% from 66%.

At Cesar Chavez Elementary, the figure will rise to 94% from 61%.

Ann V. Rich, director of educational measurement for Montebello, said the district was hampered in its reporting by confidentiality rules that precluded the inclusion of the lunch data in the main database. As a result, the information had to be filled in by hand on the test forms. In many cases, it was not.

Now that schools know the significance of the data, said Pat McCabe, an administrator in the office of policy and evaluation for the state Department of Education, "we're quite confident that next year the data will be more accurate."
