Superintendent John Deasy takes over the second-largest school system in America, LAUSD, just as the district releases its own calculations of student performance based on standardized test scores. From the LA Times:

The Los Angeles Unified School District’s new school performance measure is likely to surprise many parents, who have traditionally compared schools — and at times purchased homes — based on the state’s Academic Performance Index, which rates schools on a 1,000-point index based mainly on their students’ abilities on standardized tests.

The value-added approach [instead] focuses on how much progress students make year to year rather than measuring solely their achievement level, like the API, which is heavily influenced by factors outside a school’s control, including poverty and parental involvement. Value-added analysis compares a student with his or her own prior performance, largely controlling for outside-of-school influences.

The district’s ratings, dubbed “Academic Growth Over Time,” can send parents a very different signal about a school’s performance. Take, for example, 3rd Street Elementary School in the Hancock Park neighborhood of L.A., which has an API score of 938, putting it among the highest-scoring schools in the district. Under the new growth measure, 3rd Street is one of the lowest-performing elementary schools in the district.
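The level-versus-growth distinction the Times describes can be sketched in a few lines of code. This is illustrative only: real value-added models such as VARC's use regression with demographic covariates, not a raw gain score. Everything here (school names, scores, the `school_growth` function) is hypothetical.

```python
# A drastically simplified "growth" measure: compare each student to his or
# her own prior score, then average the gains by school. Real value-added
# models are regression-based; this only illustrates the level/growth gap.
from statistics import mean

def school_growth(records):
    """Average year-over-year score gain per school.

    records: list of dicts with keys 'school', 'prior', 'current'.
    Returns {school: mean gain}.
    """
    gains = {}
    for r in records:
        gains.setdefault(r["school"], []).append(r["current"] - r["prior"])
    return {school: mean(g) for school, g in gains.items()}

# Hypothetical students: school A scores high but barely moves;
# school B scores low but improves sharply.
records = [
    {"school": "A", "prior": 920, "current": 925},
    {"school": "A", "prior": 940, "current": 938},
    {"school": "B", "prior": 610, "current": 660},
    {"school": "B", "prior": 600, "current": 640},
]
print(school_growth(records))  # {'A': 1.5, 'B': 45}
```

On a level measure like the API, school A looks far better; on a growth measure, school B does, which is how a 938-API school like 3rd Street can land near the bottom of a growth ranking.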

What’s troubling is that the rhetoric surrounding the use of student standardized test scores (even when year-over-year comparisons are meant to control for external influences) appears to promise other measures of teacher performance, yet those other measures have never been identified. What are they? Teacher peer evaluation? Portfolio review of instructional materials? In practice the focus has been almost exclusively on standardized test scores and “value-added” as applied to teacher performance.

Furthermore, how reliable is the data analysis conducted by the Wisconsin non-profit? The LA Times hardly mentions it, and never names it:

The district scores are based on an analysis conducted by a nonprofit research group affiliated with the University of Wisconsin, which has a three-year, $1.5-million contract with L.A. Unified. The group has also worked with public school districts in New York City, Chicago and Milwaukee.

An interview with Pat Morrison of KPCC-FM, the Los Angeles-area NPR affiliate, reveals the Wisconsin non-profit as the University of Wisconsin Value Added Research Center (VARC). Professor Rob Meyer worked on the development of AGT and is also affiliated with the Wisconsin Center for Education Research (WCER). VARC has also collaborated with Edvance, a San Antonio-based education data analysis company that has worked with the Bush Institute. An “About” page acknowledging funders lists the institutional affiliations as well:

Housed in the Wisconsin Center for Education Research at the University of Wisconsin-Madison, the Value-Added Research Center is home to multiple on-going research projects, and is funded by grants from sources such as U.S. Department of Education IES/NCES, NSF, and the Joyce Foundation. Research partners include the Milwaukee Public School System, the Wisconsin Department of Public Instruction, Chicago Public Schools, and Teacher Incentive Fund grant recipients.

The Joyce Foundation and other funders of value-added research at VARC are hardly the same ideological stripe as the Walton (Wal-Mart) Foundation, the Broad Foundation, or many of the other markedly conservative philanthropies that fund “school choice” programs and think tanks. (The Joyce Foundation funds gun control measures and other programs with an explicit social justice bent.) So the question here is, why are typically “non-partisan” foundations so wedded to the “value-added” approach?

Why have local education policy researchers based in Los Angeles, who have made area school districts the subject of longitudinal study for decades, never come out in favor of a “value-added” approach? Why is “value-added” embraced when leading economists within California deride much of the Gates Foundation-funded research on Measures of Effective Teaching, saying that it’s “flawed” and “misinterprets its own results”?

This effort is troubling in its effects, wasteful, and focused on the wrong approach.



  1. Randall Traweek 9 years ago

    Teachers are now going to be evaluated and even fired because they cannot demonstrate growth using standardized test scores which largely do not even address what they purport to measure. Districts like Los Angeles will do this using sophisticated "value added" computer algorithms. There never has been and never will be a computer algorithm that can defy a fundamental law of computer science. Garbage in, garbage out.

    Supt. Deasy claims LAUSD's algorithm is MUCH better than the LA Times', which has now largely been discredited. Of course he does! But read the FAQs on the District's own website.

    Does their algorithm account for class size? NO. (NO? But isn't class size important when it comes to "raising scores"?)

    Does it account for attendance? NO. (NO? So a teacher is branded a failure for a student or group of students who only show up once a week?)

    What about difficult students who are moved to a different teacher midyear, or even a month or two before the tests? Does it account for that? NO.

    Does it account for teachers who team teach? NO. And it doesn't stop there.

    Does it account for socioeconomic status? Yes (AND NO). It only factors in whether students are in a federal lunch program or not, and we have no idea how much weight this carries, as the algorithms are as secret as they are magical. A student could be homeless, or barely qualify for meals, and will count the same. A dollar or two a year separates qualifiers from nonqualifiers, so we just draw (this one) line in the sand? Students (and their teachers) will not be credited if their dirt-poor families are too proud to apply for the meals program. Many at our school are not only too proud, they must face the stigma of having lunch tickets. We fight this, but it's like people years ago making fun of welfare in the ghetto. They do it when we are not watching.

    Does it account for Special Ed? Yes (AND NO). There is no accounting for the level of disability other than extremely severe. You either have an IEP or not. No accounting for the level or frequency of services you receive, or should receive but don't. No accounting for a child on the waiting list who desperately needs an IEP but hasn't been tested while waiting for years. What about a kid with a serious learning disability who just missed qualifying (perhaps improperly)? What about kids heavily medicated at home because the parents don't want the stigma of an IEP wrapped around the neck of their child? Is any of this accounted for in the District's algorithm? NO, NO, NO, NO, NO.

    Garbage in, worse garbage out. We are going to be judged failures based largely on something worse than garbage. Madness. Madness worthy of Kafka.

  2. […] And the situation is only going to get worse. [LAUSD Superintendent John] Deasy and [Secretary of Education Arne] Duncan both are pushing value-added standardized testing measures to evaluate teachers. The LA Times slanders all of us on a daily basis with its value-added measure on its website. Deasy calls his AGT. […]

  3. […] LAUSD Releases Its Own “Value-Added” Rankings of Schools […]
