ALBANY — A dozen new factors could be tossed into the state’s formula for measuring how much teachers have boosted their students’ state scores, according to a proposal that is dividing state education policy makers.

The state’s teacher evaluation law, passed in 2010, requires student performance to count in teacher ratings. Currently, the state calculates “growth scores” that count for a fifth of teachers’ overall ratings. But the law allows the state to increase the weight of its score to a quarter of teachers’ ratings once officials adopt a more complex “value-added” model for assessing teacher impact.

Both models are based on the principle that comparing students’ actual test scores with their predicted scores can show the impact their teachers had on their learning. The question is what variables to use when predicting scores so that teachers whose students have greater needs are not at a disadvantage.

The current growth model considers and controls for four basic characteristics: previous test scores, students with disabilities, students living in poverty, and students who are still learning English.
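The core idea behind both models can be sketched in a few lines of code: predict each student's score from prior performance and need indicators, then credit the teacher with the average gap between actual and predicted scores. The sketch below is a loose illustration only; the coefficients, data, and function names are invented for this example and do not reflect the state's actual formula.

```python
# Simplified sketch of the growth-model idea (invented numbers throughout).
# Predicted score = baseline + weight * prior score, minus penalties for
# need factors, so teachers of higher-need students are not disadvantaged.

def predict(prior, disability, poverty, ell, coef):
    # coef: (intercept, b_prior, b_disability, b_poverty, b_ell), all made up
    b0, b1, b2, b3, b4 = coef
    return b0 + b1 * prior + b2 * disability + b3 * poverty + b4 * ell

def teacher_growth(students, coef):
    """Average residual (actual minus predicted) across a teacher's students."""
    residuals = [
        s["actual"] - predict(s["prior"], s["disability"], s["poverty"], s["ell"], coef)
        for s in students
    ]
    return sum(residuals) / len(residuals)

# Hypothetical coefficients and a two-student roster, purely illustrative.
coef = (10.0, 0.9, -5.0, -3.0, -4.0)
roster = [
    {"prior": 300, "actual": 285, "disability": 1, "poverty": 1, "ell": 0},
    {"prior": 320, "actual": 305, "disability": 0, "poverty": 0, "ell": 0},
]
print(round(teacher_growth(roster, coef), 1))  # -> 10.0
```

A positive average residual means the teacher's students beat their predicted scores; the proposed value-added formula would refine the prediction side with 12 additional variables.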

At Monday’s Board of Regents meeting, state education officials said they had come up with a value-added formula that includes 12 other factors, including students’ previous test scores in other subjects and whether they are overage for their grade. The officials want to adopt the new formula for use starting this year in districts that have teacher evaluation agreements. (New York City is one of a few districts that do not yet have one.)

The proposed value-added formula would also control for the first time for classroom variables, such as class size and the proportion of students with high needs.

The added factors represent an attempt to distinguish more precisely between students who share the same characteristics that are controlled for in the current growth model. Right now, the state’s formula treats students with special needs as all presenting the same challenge to teachers, for example, but the proposed formula would give a different weight to the scores of students with severe disabilities.

“All those variables are there to further refine similar students,” said Amy McIntosh, a senior fellow for the Regents Research Fund who oversees teacher evaluations, referring to the value-added formula.

The Regents must approve the proposed formula before the State Education Department can use it to calculate this year’s ratings for math and English teachers in grades 4-8, who make up about 17 percent of teachers statewide.

At this week’s meeting, discussion of the value-added formula was unusually contentious. It came halfway through the state’s first round of tougher tests, which have drawn fire from those who think the scores should not be used in high-stakes decisions such as teacher evaluations.

The plan drew swift resistance from several Regents, including two from New York City, who have been critical of using student test scores in teacher evaluations in the first place. They said making the state’s measurement of teacher impact on student test scores more complex would not address some of the issues surrounding the value-added approach, which include year-to-year instability and a built-in requirement that some teachers receive low scores.

“I have no confidence that this is going to give us what we’re looking for and that teachers will be evaluated fairly,” said Westchester County’s Harry Phillips.

Phillips and Roger Tilles, of Long Island, both urged the state to shelve the growth models or treat them as a low-stakes pilot for now.

A vote is scheduled at next month’s Regents meeting, at which Commissioner John King and Chancellor Merryl Tisch have promised to bring an updated proposal that would at least partly address the concerns raised. But King reiterated his support for using the value-added model, citing research from the Gates Foundation’s Measures of Effective Teaching study that compared several approaches to factoring student performance into teacher ratings.

“This approach has been used all throughout the country, quite extensively,” King said. “The methodology that we’re using is very similar to what’s been used and studied extensively.”

King said the state convened its own advisory group that included teachers, principals, and district officials from across the state. On technical matters, it also consulted an advisory group of eight researchers from around the country, including Jonah Rockoff and Douglas Staiger, two economists who studied New York City’s value-added data.

At times, the advisory groups’ recommendations were at odds, state officials said. The state’s Regents advisory group recommended weighing students’ gender when calculating teachers’ impact, King said, but the technical advisors recommended against doing so. Gender is not a factor in the proposed value-added formula.

The composition of the technical advisory group drew criticism from several Regents. The Bronx’s Betty Rosa pointed out that all eight researchers in the group are men. And Kathy Cashin of Brooklyn said the group lacked a different kind of diversity.

“How do you have a panel with one point of view?” Cashin said, referring to the fact that the researchers’ work begins with an assumption that it is possible to isolate teachers’ contributions to student learning.

In addition to changing the formula for measuring student growth in teachers’ ratings, the State Education Department has also proposed introducing a growth measurement for high school principals, based on their students’ Regents exam pass rates.

And the department also wants to change the way that transient and chronically absent students are counted, an issue that has divided some school districts as they craft their evaluation systems.

Last year, 16 percent of students statewide weren’t counted in growth models because they were not enrolled in a single school for the entire year. This year, students who enter just before the halfway point will be counted, meaning that the scores of an additional 150,000 students, who often have high needs, will count in teachers’ ratings.

Students with low attendance will also be counted, but their scores will be weighted by the rate at which they attended school, according to the proposal.
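The attendance rule amounts to a weighted average: a student who attended 60 percent of days would contribute 0.6 of a full weight. The snippet below is a hypothetical sketch with invented numbers, not the department's actual calculation.

```python
# Hypothetical attendance-weighted average of growth residuals.
# Each pair is (residual, attendance rate); numbers are invented.

def weighted_growth(pairs):
    total_weight = sum(att for _, att in pairs)
    return sum(r * att for r, att in pairs) / total_weight

data = [(8.0, 1.0), (4.0, 0.5)]  # one full-year student, one half-attender
print(round(weighted_growth(data), 2))  # (8*1 + 4*0.5) / 1.5 -> 6.67
```

The half-attending student pulls the average down less than a fully enrolled student with the same residual would, which is the point of weighting by attendance.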

The description of the State Education Department’s advisory groups has been clarified since this story was originally published.

Here’s the slide from the State Education Department’s PowerPoint presentation that shows the new factors that could become part of the state’s value-added teacher evaluation formula. The full presentation is below.