A study examining whether getting poor grades on city progress reports prompted schools to improve their students’ test scores found little evidence of such a boost.

The study, released today by the conservative-leaning Manhattan Institute, asked the question by comparing schools with progress report raw scores that were roughly the same, but just different enough to get different letter grades.

In fact, the two groups showed about the same amount of progress, with one exception: in fifth-grade math, students in failing schools made “significant and substantial improvement” compared with their peers at schools that had been assigned a grade of D, according to the study.

The progress reports assign letter grades to schools based primarily on improvements in students’ test scores. Since the first reports were released a year ago, the program has been the subject of sustained criticism: Parents and teachers have complained about unfair stigmatization of good schools, and statisticians have charged that the reports are driven as much by error as by actual school improvement.

The study’s architect, Manhattan Institute senior fellow Marcus Winters, called his findings “mixed-positive” in favor of the progress reports. Those findings were the subject this morning of a panel discussion sponsored by the Manhattan Institute featuring Winters, Columbia University economist Jonah Rockoff, and two officials from the Department of Education’s accountability office, including its CEO, James Liebman.

Liebman said he was “very, very pleased” by positive results so soon after the progress report program launched. “We want a system that enables people to react to it and respond to it,” Liebman said.

Winters struck a less positive tone. He repeatedly expressed relief that his research didn’t bear out the “dire predictions” made by some critics about the negative effect a failing grade could have on a school. “At least we’re not seeing a negative impact,” Winters said. Later, he repeated that sentiment: “There’s nothing inherently destructive” in F grades, he said.

And on the question of whether higher grades earned by previously failing schools reflected real improvements, Winters said, “The fact that a lot of them were becoming A’s makes me worry.”

Winters and Rockoff, whose own study released earlier this fall used schools’ aggregate test scores and found results similar to Winters’, emphasized that their research doesn’t answer some major questions about the progress reports. Winters said his study doesn’t indicate whether the progress report formula “accurately identifies” good and bad schools. And neither study examined whether the gains that did register were caused by the stigma of a low grade or by the consequences attached to it, Rockoff said.