10 takeaways we noticed in this year’s state test scores

Despite our ongoing attempt to streamline the mountain of information that came with the state’s release of the 2010-2011 test scores, there are still plenty of takeaways that haven’t made it into a press release or a press conference. After taking a slightly deeper look at the data, here are 10 worth considering:

  • Some of the neediest students took a step back; others showed progress. Students identified as English Language Learners, or ELLs, improved slightly in math but slipped further from the gains they made on the English exam (ELA) earlier in the decade. While nearly half of the city’s non-ELL students met the state’s ELA standards, just 12 percent of ELL students did so. That’s down from 34 percent two years ago, when the standards were easier, and a 1-point drop from a year ago. Special education students improved in both ELA and math.
  • The achievement gap remains vast. Schools in poor neighborhoods still struggle the most. In the South Bronx, one of the nation’s poorest congressional districts, and in central Brooklyn, average proficiency rates were below 30 percent in ELA and below 40 percent in math. (Citywide rates were 57 percent in math and 44 percent in ELA.) In the city’s more affluent neighborhoods, like Bayside, the Upper West Side and lower Manhattan, proficiency rates were significantly higher. District 26 in Queens topped the city in both subjects, with 74 percent proficiency in reading and 88 percent in math.
  • New doesn’t always mean better. More than a dozen schools in their first year of testing spanned both extremes of the performance spectrum. Half of them, including The Active Learning Elementary School, whose entire 20-student third-grade class scored proficient, significantly outperformed other schools in their districts. But many others struggled just as much as the closed schools that they were supposed to replace. In four such schools, fewer than a quarter of students met reading standards. Just 5.8 percent of students at one school, Urban Scholars Community School, were proficient in reading.
  • Charter schools outperformed their neighbors, mostly. Citywide, 69 percent of students in charter schools met standards in math, up from 63 percent last year. In ELA, 45 percent were proficient, up from 43 percent last year. Both beat citywide averages. Nearly 75 percent of the charter school classes that took a state exam scored better than their districts, on average.
  • But some charter schools didn’t fit this trend. A notable exception was Opportunity Charter School, the lowest-scoring charter school in the city. Just 3 percent of its eighth graders were proficient in reading, and just 23 percent met standards in math. By design, the school enrolls a high number of special education students, who typically score much lower on state tests. But this year, as we reported, Opportunity also experienced significant internal turmoil, including a unionization bid by its teachers and the subsequent firing of at least 14 of them. Other charter schools with scores worth noting? The school operated by the United Federation of Teachers posted proficiency rates well below the charter school median in nearly every grade. And The Equity Project, a charter school in Harlem that pays first-year teachers $125,000, had its second straight year of mediocre marks. TEP’s average scores hovered close to, or fell below, the district average.
  • Unsurprisingly, the city’s highest-scoring schools are also its most selective. The citywide gifted and talented Anderson School, which rarely admits anyone with a score below the 99th percentile on a nationally normed aptitude assessment, had the highest average scores in the city. Many other high-performing schools also contain gifted programs, which limit admission to students scoring in the 90th percentile or higher on the same tests.
  • Few schools experienced big score gains. There were exceptions, such as the Brighter Choice Community School in Brooklyn profiled in the Daily News, where the principal attributed a 55-point gain in ELA scores to a bad year last year, plus a smaller class and new resources this year. But most schools saw changes within a point or two of the citywide average. One explanation could be that the new test standards, put in place to curb score inflation, actually worked. A more cynical take would be that principals changed their schools’ testing conditions and scoring policies in response to increased scrutiny by the city. In February, city officials announced that they would begin auditing 60 high schools whose Regents scores and graduation rates showed suspicious patterns, including large score gains in a short time. While the schools that received scores this week weren’t part of those audits, principals might have worried that the department’s auditors would turn to elementary and middle schools next. An even more cynical read on the flatter scores? That students simply didn’t learn more than they did last year.
  • A handful of schools did see big drops in their scores. Reading scores at P.S. 197 John B. Russwurm in Manhattan dropped 24 points, although the school improved slightly in math.
  • The state doesn’t think the tests these scores came from were very good. The state completely discredited exams given before last year, saying their results frequently called students proficient when they weren’t. The state has since made some incremental changes, such as adding more questions and keeping old questions under wraps to make coaching harder. But it’s still planning to join many other states in adopting completely new tests in three years. Those tests will be based on the new Common Core standards and are likely to prompt a massive drop-off in scores; testing experts have documented that this typically happens when accountability measures suddenly change.
  • And yet the scores really matter. The city’s argument that it doesn’t judge schools “solely” according to test scores is a little disingenuous. It’s true that decisions about whether to close schools are based on a number of data points, including their progress reports, school surveys, and quality reviews. But 85 percent of a school’s letter grade on its progress report comes from students’ test scores arrayed in various ways.