Next year, the teacher data reports that sparked a battle between the city and the teachers’ union could find a much warmer reception.
The new firm hired to produce the Teacher Data Initiative is reaching out to the teachers’ unions that bitterly opposed the program, and the firm’s researchers say they are committed to producing tools to help teachers learn, not to rank them.
The Value-Added Research Center at the Wisconsin Center for Education Research, the firm hired last month to produce the reports, held a summer workshop on its research methods for officials from school districts around the country. Two researchers from the United Federation of Teachers also attended.
Chris Thorn, associate director of the center, said that this school year’s round of teacher assessment reports will likely look much the same as last year’s.
But the long-term goal for the three-year, $840,000 program, he said, is to refine the way data is collected so that it tells the most accurate story about what’s going on in classrooms.
“You can’t connect students to teachers without data so clean you can eat off of it,” Thorn said.
Last school year, the program’s first, the city used a different vendor, which issued the reports to about 12,000 fourth- through eighth-grade English and math teachers. The reports grade teachers using the so-called “value-added” model, which judges teachers by the progress their students made on tests from one year to the next. The reports also factor in obstacles to progress, such as large class sizes and students’ poverty levels.
The reports have been controversial since their inception. The teachers’ union responded to news of the program by fighting for a state law that banned using student test scores as a factor when making decisions about teacher tenure.
City education officials have been adamant that the data reports are confidential and will be used only for giving teachers feedback on their performance, not for any high-stakes personnel decisions. But some teachers and other observers have publicly worried that the reports may be “a Trojan horse” ushering in other forms of high-stakes teacher evaluations.
Thorn played down those concerns. He said that the center will pay close attention to elements that can unfairly influence the way a teacher’s “value” is measured. For example, he said that the center has teachers compare their students’ attendance records to the rosters of students sitting for tests. This ensures that teachers are measured by the performance of students they actually taught, he said.
The center also uses a 38-step diagnostic measure of state exams to make sure that it is fair to compare test results from year to year. Researchers plan to spend a significant amount of time learning about New York’s exams and data-collection systems, and Thorn stressed that the center is taking a cautious approach to the complex New York system.
The project in New York will be the center’s first work with such a large urban district, Thorn said, and one of its first forays into value-added research on individual teachers rather than at the school level.
“One of the issues is that there isn’t another district that compares to New York,” Thorn said. “Even Chicago is less than half the size.”
The fundamentals of the New York project, however, will maintain the same approach that the center has been developing over more than 15 years of work in other school districts, including Minneapolis, Milwaukee and Chicago. In those cities, the program has researchers working directly with schools and districts to improve the way they gauge teachers’ performance. The center has posted a two-hour-long presentation by its director, Rob Meyer, outlining its approach to value-added evaluation based on its work in Madison and Milwaukee.
Thorn also emphasized that the process the center uses to produce the reports is one thing; how the education department chooses to use the reports is another.
Education department spokeswoman Melody Meyer stressed that the teacher data reports are informational and will not be used to formally evaluate teachers. She also said that the department is still determining what kinds of improvements to seek in the reports.
“We’re in the process of defining ‘improvement,’” Meyer said. Some changes will likely be made to the methodology of the reports, Meyer said, and others might be as simple as cosmetic improvements to the reports’ look.
“We want to make them as informative as possible for people who don’t have a statistical background,” she said.