Under a new teacher evaluation system being developed, Oklahoma teachers may be categorized based on whether students in their classroom perform better or worse than expected on standardized tests.
Known as value-added evaluation, the model gained notoriety in 2010 when the Los Angeles Times used it to rate about 11,500 elementary school teachers in the Los Angeles Unified public schools.
Teachers were ranked on a scale from least effective to most effective based on how their students performed relative to expectations. The results were published on the newspaper's website.
Any data used in teacher evaluations in Oklahoma likely would be considered a personnel matter and kept out of the public eye, but the model is essentially the same.
Florida just developed and will implement a value-added evaluation system for teachers using almost $4 million in Race to the Top grant money, according to the Florida Department of Education.
Oklahoma didn't win any Race to the Top funds, but lawmakers mandated the state still develop an evaluation system that uses student growth as measured by state exams to evaluate teachers.
Under the law, student growth will account for 35 percent of a teacher's evaluation and 15 percent of the evaluation will be based on another yet-to-be identified quantitative measure. The other half of a teacher's score will come from qualitative measures such as classroom management.
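The weighting described in the law amounts to a simple weighted average. A minimal sketch, assuming each component is scored on a common 0-100 scale (the scale and function name here are hypothetical, not part of the Oklahoma law):

```python
# Illustrative weighting from the Oklahoma law: 35% student growth,
# 15% another quantitative measure, 50% qualitative measures.
# The 0-100 component scale is an assumption for illustration.

def overall_score(student_growth, other_quantitative, qualitative):
    """Combine component scores (each 0-100) into an overall evaluation."""
    return (0.35 * student_growth
            + 0.15 * other_quantitative
            + 0.50 * qualitative)

# A teacher scoring 80 on growth, 70 on the other quantitative
# measure, and 90 on qualitative measures:
print(overall_score(80, 70, 90))  # 83.5
```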
An 18-person commission — the Teacher and Leader Effectiveness Commission — is tasked with developing that complex system by Dec. 15 and then recommending it to the state Education Board.
Sam Foerster, chairman of the Florida value-added committee, said picking a student-growth evaluation system is about finding the best way possible to measure a teacher's impact on a student over the course of a school year.
“If you believe that there are other factors about a student that might help a student to grow, you want to try to take those into account so that we're not falsely attributing those to a teacher,” Foerster said. “You're trying to level the field based on the kids that come into the classroom. You want to understand what impact the teacher has had, that the teacher is responsible for. Not the extraneous factors, the socio-economic factor or any other factor.”
Florida is doing that by using a statistical analysis of each student that takes into account “everything and the kitchen sink” to project what that student's “expected” academic growth in a school year will be.
Foerster said the equation accounts for performance on test scores in previous years, the age of the student, attendance, class size and all sorts of other variables, everything except socio-economic status, which the state expressly prohibited the committee from taking into account.
The formula then projects what a student is expected to score on the exam. If every student in a teacher's class meets the “expected” score on a standardized test, the teacher would be considered effective.
If a majority of students exceed their expected scores, based on each individual student's set of extraneous variables, the teacher would be considered highly effective.
And if students do not grow as much as expected, that teacher would be considered ineffective.
That's an extremely oversimplified version of a complicated statistical analysis of how students are growing, but Foerster said it gets to the heart of how the system works.
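The classification Foerster describes can be sketched in a few lines. This is a deliberately oversimplified illustration, matching the article's summary rather than Florida's actual statistical model; the function name and the exact threshold rules are assumptions:

```python
# Oversimplified sketch of the effectiveness ratings described above:
# every student meets expectations -> effective; a majority exceed
# them -> highly effective; growth falls short -> ineffective.
# Thresholds here are illustrative, not Florida's real rules.

def rate_teacher(actual_scores, expected_scores):
    """Compare each student's actual score with the model's expected score."""
    pairs = list(zip(actual_scores, expected_scores))
    exceeded = sum(a > e for a, e in pairs)
    met_or_exceeded = sum(a >= e for a, e in pairs)
    n = len(pairs)
    if exceeded > n / 2:          # majority beat their expected scores
        return "highly effective"
    if met_or_exceeded == n:      # every student met expectations
        return "effective"
    return "ineffective"          # growth fell short of expectations

# Three students who all beat their projected scores:
print(rate_teacher([72, 85, 64], [70, 80, 60]))  # highly effective
```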
The Tulsa School District is collecting value-added data about its teachers.
“Value-added has been around for 20 years, as far as our research can tell,” Tulsa Public Schools Superintendent Keith Ballard said. “We're accepting that a lot of this methodology is reliable.”
Community donors have made it possible for Tulsa Public Schools to fund the data collection and analysis required.
Jana Burk said one of the most costly and time-consuming steps of value-added data collection is linking students to teachers. It may sound simple, but in an urban district like Tulsa, which has great mobility among students, Burk said the district must calculate what percentage of a student's growth is attributable to which teachers.
It's a concept called dosage. If a student was in a class for half a year, that student got a 50 percent dose of one teacher's influence. That teacher would get credit for 50 percent of the student's growth.
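The dosage idea described above is a proportional split. A minimal sketch, with hypothetical teacher names and numbers:

```python
# Sketch of "dosage": each teacher is credited with a share of a
# student's growth proportional to the fraction of the year the
# student spent in that teacher's class. Names and numbers are
# hypothetical illustrations.

def attribute_growth(total_growth, enrollment_fractions):
    """Split one student's growth among teachers by time enrolled.

    enrollment_fractions maps teacher -> fraction of the school year
    the student spent in that class (fractions should sum to 1.0).
    """
    return {teacher: total_growth * fraction
            for teacher, fraction in enrollment_fractions.items()}

# A student who grew 10 points and spent half the year with each
# of two teachers: each teacher is credited with 5 points of growth.
print(attribute_growth(10, {"Teacher A": 0.5, "Teacher B": 0.5}))
```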
Foerster emphasized that the purpose of collecting this data is to help teachers become better.
He said a teacher in a rich suburban neighborhood may get students every year who are above grade level and who leave the class above grade level, but the students don't actually meet their potential because the teacher is ineffective.
“That teacher looks like a hero,” Foerster said. “It goes completely unnoticed that the growth rates in those classrooms are actually below expectations.”
Meanwhile, a teacher in a poor inner-city community may get students who start every year below grade level and leave below grade level, despite having grown far more than expected.
“That teacher looks like the villain,” Foerster said. “They're not.”
The Oklahoma commission is accepting comments on its website about the qualitative side of evaluations and soon will make a recommendation for the value-added portion of the evaluations.