State releases teacher rating data that most districts won't use

As of today, school districts across New York State have in hand the first piece of data they would need to calculate some teachers’ ratings: their “growth scores” for last year.

The State Education Department today distributed scores to districts for 36,685 educators who teach reading and math in grades 4-8 or supervise those teachers. The scores — which measure students’ growth on state math and reading tests, adjusting for the students’ past performance, the performance of similar students, and the reliability of the exams — would count for 20 percent of educators’ ratings under the state’s evaluation law.

Two consecutive “ineffective” ratings could trigger termination proceedings under the law. But the data released today suggest that the state’s current formula for measuring student growth would be unlikely to place many teachers’ jobs at risk.

Nearly 85 percent of the 36,685 educators who received a score fell into the “highly effective” or “effective” ranges. Just 6 percent of them had scores in the “ineffective” range.

Few of the scores issued today will actually be used to evaluate teachers. Most of the state’s 715 school districts, including New York City, have not yet adopted evaluation systems that comply with the state’s evaluation law, and many that have adopted new evaluations won’t use them until next year.

“The evaluation law won’t be fully implemented until the coming school year, so most districts won’t be using the scores we’re releasing today for evaluations,” State Education Commissioner John King said in a statement. “But they can use the scores to help improve instruction.”

The state temporarily removed the scores from its internal data system this afternoon after districts alerted it to inaccuracies in some of the reports, according to an email from Jeff Baker, SED’s data chief, that GothamSchools obtained. Baker said the reports would be corrected and reposted on Friday.

The state’s scores are similar in theory but not in algorithm to the city’s defunct Teacher Data Reports, which were produced from 2008 to 2010 for reading and math teachers in grades 3 to 8. The TDRs were “value-added” scores that adjusted students’ test score growth for many more factors than the state’s growth scores do. The state’s evaluation law mandated a growth score measure for last year and a value-added measure for the future, possibly starting this year.

The state is working with the American Institutes for Research to build its model; the city’s model was designed by a research center at the University of Wisconsin.

Both strategies aim to upend the way teachers traditionally have been judged. In the past, assessments of teacher quality tended to look only at students’ test scores: A teacher whose students scored higher was deemed stronger. But that design stacked the deck against teachers whose students started the school year with greater needs and lower scores.

The idea behind value-added and growth measurements is that they look instead at how much improvement students make in a year. Teachers are rewarded not when their students score highest, but when the students’ performance gains exceed the average gains made by similar students.
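To make that distinction concrete, here is a minimal, hypothetical sketch in Python of how a growth-style comparison differs from a raw-score comparison. The numbers, the two teachers, and the single "expected gain for similar students" figure are all invented for illustration; the state's actual model adjusts for many more factors.

```python
# Hypothetical illustration only; not the state's actual growth model.
# Each student has a prior-year score and a current-year score.
prior = {"Teacher A": [85, 88, 90], "Teacher B": [55, 58, 60]}
current = {"Teacher A": [86, 89, 91], "Teacher B": [63, 67, 68]}

# Assume similar students statewide gained 2 points on average (invented figure).
expected_gain = 2

for teacher in prior:
    gains = [c - p for p, c in zip(prior[teacher], current[teacher])]
    avg_gain = sum(gains) / len(gains)
    # Raw-score view: whose students score highest?
    avg_score = sum(current[teacher]) / len(current[teacher])
    # Growth view: did students gain more or less than similar students did?
    growth_vs_expected = avg_gain - expected_gain
    print(teacher, "| avg score:", round(avg_score, 1),
          "| gain vs. similar students:", round(growth_vs_expected, 1))

# Teacher A's students post higher scores, but Teacher B's students outgrew
# their peers, so a growth measure credits Teacher B more.
```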

Several news organizations published the city’s TDR scores in February, leading to a public reckoning about the value of rating systems that emphasize progress on test scores. The publication also exposed some low-rated teachers to sharp criticism and led some principals to receive requests to move students into higher-rated teachers’ classes.

This fall, educators and district leaders will get access to the growth scores, but they will only be able to see their own scores and scores of teachers they supervise. In December, the public will get a window into the scores, but in accordance with a law passed in June, only aggregate data will be available and parents will have to ask their principal for access. The state is supposed to shield all information that could allow people to link ratings with individual teachers.

City Department of Education officials said they were reviewing the state’s growth scores for accuracy and would make them available to schools in the future. “We intend to prepare materials for schools to help principals share and discuss the results with individual teachers in the fall,” said Connie Pankratz, a department spokeswoman, in a statement.

But Pankratz said the city would not share the state’s scores with parents until a new evaluation system is in place in New York City, explaining that the department is “legally prohibited” from sharing the scores until then. “This is just one of the many reasons why it is imperative that we come to an agreement on teacher evaluations,” she said.

In addition to the 20 percent that the state calculates, evaluations under the state law require another 20 percent to come from locally approved growth measures and the final 60 percent to come from subjective measures, at least half based on principal observations.
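As a rough arithmetic sketch of how those pieces combine, the snippet below composes a 100-point rating from the 20/20/60 split described above. Only that split comes from the law as reported; the 0-100 subcomponent scales and the example numbers are hypothetical.

```python
# Hypothetical weighting sketch: point scales and example values are invented;
# only the 20/20/60 split reflects the state law as described in the article.
def composite_rating(state_growth, local_growth, subjective):
    """Each input is a 0-100 subcomponent score; returns a 0-100 composite."""
    return 0.20 * state_growth + 0.20 * local_growth + 0.60 * subjective

# Example: strong observation-based scores can outweigh a weak state growth score.
print(composite_rating(state_growth=40, local_growth=70, subjective=85))  # 73.0
```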

The law mandates that districts make publicly available both the subcomponent scores and the overall ratings. But if the state’s growth score is not yet part of the city’s annual evaluations, it would not necessarily fall under the transparency law.

New York City was supposed to have evaluations online last year in 33 schools that had been receiving federal funds known as School Improvement Grants. But the city and its teachers union could not reach an agreement about particulars of the evaluation system, so neither new evaluations nor the federal funds are flowing to the schools. Nine other districts across the state did reach evaluation deals for their SIG-eligible schools, and they make up the majority of schools where the growth scores for last year will influence teachers’ ratings.

Gov. Andrew Cuomo has set a Jan. 16, 2013, deadline for all districts to comply with the teacher evaluation law or risk forgoing increases in their state school aid.