Numbers game

Colorado is about to release a torrent of test results. Here are four storylines worth watching.

Sheridan School District sixth grader Monica Dinh takes part in a practice session last year. (Photo by Craig F. Walker / The Denver Post)

The state education department is scheduled Thursday to publicly release a mammoth amount of data detailing how Colorado students performed on last spring’s standardized tests.

We’ll get to dive into state, district and school results from English, math, science and social studies tests, the PSAT and SAT, and student academic growth, which tracks how much students learn each year compared to their academic peers.

The data — beloved or loathed depending on which educator you ask — is supposed to gauge how well students grasp the state’s academic standards that are designed to prepare them for either college or a career.

The state also uses the results, along with other factors such as graduation rates, to issue quality ratings for schools and districts. And in some instances, teachers are rated based on the data.

Here is background and some storylines to keep in mind in advance of the release:

First a reminder of where we stand:

Three years ago, the state made a monumental shift in its testing system. Colorado was one of about a dozen states to drop paper-and-pencil standardized tests in favor of a new multi-state computer-based test.

The PARCC tests would measure critical thinking, a major component of the state’s new academic standards, which devalued rote memorization.

Prior to the first release, school officials in Colorado and across the nation warned that test scores would likely be low considering the newness of the academic standards and tests.

Indeed, they were.

In 2015, only 43 percent of fourth graders met the state’s expectations on the English test. Math was worse: Only 37 percent of third graders met grade-level expectations.

In 2016, the state saw a slight uptick in scores, mirroring national trends.

However, state officials worried about how far behind students with learning disabilities were compared to their peers.

Here’s a look at the changes in test scores in English and math:

[Charts: English and math test score trends]

With three years of data from PARCC, we can — finally — talk about trends. But what are we going to learn that we didn’t already know?

For the last two years, state and school district officials have warned about two things: First, don’t compare the results of PARCC to those of previous standardized tests. Second, they said we needed three years of data to pinpoint trends in student performance.

Why three years?

Derek Briggs, a professor at the School of Education at the University of Colorado Boulder who also sits on the technical advisory board for PARCC, said one reason we might need three years of data is the exaggerated bump sometimes found in the second year of a new standardized test.

“One explanation for this sort of trend was that it would take teachers/schools a year to figure out the emphasis on the new assessment, so in the first year, the alignment between teaching and instruction isn’t optimal, so student performance in the first year is depressed,” he said in an email. “Then in the second year, it snaps back up once instruction and assessment are better aligned.”

Briggs added that so far, no state that, like Colorado, updated its tests to align with the Common Core State Standards has seen a second-year bump.

So, now we have three years of data: What can we say?

It’s difficult to make sweeping declarations about state trends — especially in a local control state where so many decisions about what students learn are made at the school and district level.

But Juan D’Brot, a senior associate at the Delaware-based Center for Assessment, said that at the three-year mark, school officials and parents alike can start to better understand what’s working or not at individual schools.

“It can serve as a gut check about a school’s general performance over time,” D’Brot said. “If you have three points that are moving upward or constantly moving downward, we can quickly create a story around that.”

It’s more difficult to draw conclusions if a school’s results are less consistent, he said.

And there are some state-level benefits.

“This trend data can help the state evaluate their own efforts to work with districts and schools,” he said. This is especially valuable when school leaders use a variety of data points including patterns of student growth.

The state is suppressing data in an effort to “protect student privacy.” How much will be redacted?

Colorado was once considered one of the most education data-friendly states. But beginning with the first release of PARCC data in 2015, the state began blacking out more school-level data than it had in the past.

The effects of the new so-called “suppression rules” were even more pronounced in the state’s 2016 release. The state shielded roughly 4,000 data points that year, frustrating education reform advocates who say this data helps parents make better decisions about schools.

Stay tuned to see what we won’t learn about school performance due to these rules after Thursday’s release.

After two years of delayed and drawn-out data releases, the state is giving us everything on time and all at once. But the promise of getting data back more quickly is still elusive.

In 2015 and 2016, testing data dribbled out of the state education department over several months — state-level results first, then school level, then student growth data. This was a departure from a decades-long routine of releasing test score data in August.

On Thursday, the state will release almost everything all at once. (District and school performance data disaggregated by different student groups is expected within a month.) This is a major victory for the state and the makers of PARCC because one of the longest-running criticisms of the test was how long it took to get data back to schools.

Schools received their results in June, the earliest results have been returned to schools since the state switched to PARCC.

But the timeline still falls short of one of the promises of new tests and the demands of the State Board of Education, which going forward wants data back to schools within 30 days.

Is the state’s gradual move away from PARCC at the high school level working to curb the opt out movement?

In 2015, Colorado became one of the nation’s epicenters for the testing opt out movement. Thousands of high schoolers, backed by their parents, refused to take the PARCC exams, claiming they served no educational purpose.

In some cases, entire schools sat empty during the state’s testing window.

In response, lawmakers eliminated some high school tests and changed others. In 2016, more high school sophomores took the state’s tests than the year before. Policymakers hope additional changes at the ninth grade level, set to take effect next spring, will move even more families back to the state’s testing system.

Will the trend continue? We’ll find out on Thursday.

And finally, here’s a roundup of previous coverage you might find helpful:

In Tennessee’s turnaround district, 9 in 10 young students fall short on their first TNReady exams

Nine out of 10 elementary and middle school students in Tennessee’s turnaround district aren’t scoring on grade level in English and math, according to test score data released Thursday.

The news is unsurprising: The Achievement School District oversees 32 of the state’s lowest-performing schools. But it offers yet another piece of evidence that the turnaround initiative has fallen far short of its ambitious original goal of vaulting struggling schools to success.

Around 5,300 students in grades 3-8 in ASD schools took the new, harder state exam, TNReady, last spring. Here’s how many scored “below” or “approaching,” meaning they did not meet the state’s standards:

  • 91.8 percent of students in English language arts;
  • 91.5 percent in math;
  • 77.9 percent in science.

View scores for all ASD schools in our spreadsheet

In all cases, ASD schools’ scores fell short of state averages, which were all lower than in the past because of the new exam’s higher standards. About 66 percent of students statewide weren’t on grade level in English language arts, 62 percent weren’t on grade level in math, and 41 percent fell short in science.

ASD schools also performed slightly worse, on average, than the 15 elementary and middle schools in Shelby County Schools’ Innovation Zone, the district’s own initiative for low-performing schools. On average, about 89 percent of iZone students in grades 3-8 weren’t on grade level in English; 84 percent fell short of the state’s standards in math.

The last time that elementary and middle schools across the state received test scores, in 2015, ASD schools posted scores showing faster-than-average improvement. (Last year’s tests for grades 3-8 were canceled because of technical problems.)

The low scores released today suggest that the ASD’s successes with TCAP, the 2015 exam, did not carry over to the higher standards of TNReady.

But Verna Ruffin, the district’s new chief of academics, said the scores set a new bar for future growth and warned against comparing them to previous results.

“TNReady has more challenging questions and is based on a different, more rigorous set of expectations developed by Tennessee educators,” Ruffin said in a statement. “For the Achievement School District, this means that we will use this new baseline data to inform instructional practices and strategically meet the needs of our students and staff as we acknowledge the areas of strength and those areas for improvement.”

Some ASD schools broke the mold and posted strong results. Humes Preparatory Middle School, for example, had nearly half of its students meet or exceed the state’s standards in science, although only 7 percent of students in math and 12 percent in reading were on grade level.

Thursday’s score release also included individual high school level scores. View scores for individual schools throughout the state as part of our spreadsheet here.

School-by-school TNReady scores for 2017 are out now. See how your school performed

Students at Wells Station Elementary School in Memphis hold a pep rally before the launch of state tests, which took place between April 17 and May 5 across Tennessee. (Photo: Zondra Williams/Shelby County Schools)

Nearly six months after Tennessee students sat down for their end-of-year exams, all of the scores are now out. State officials released the final installment Thursday, offering up detailed information about scores for each school in the state.

Only about a third of students met the state’s English standards, and performance in math was not much better, according to scores released in August.

The new data illuminates how each school fared in the ongoing shift to higher standards. Statewide, scores for students in grades 3-8, the first since last year’s TNReady exam was canceled amid technical difficulties, were lower than in the past. Scores also remained low in the second year of high school tests.

“These results show us both where we can learn from schools that are excelling and where we have specific schools or student groups that need better support to help them achieve success – so they graduate from high school with the ability to choose their path in life,” Education Commissioner Candice McQueen said in a statement.

Did some schools prepare teachers and students better for the new state standards, which are similar to the Common Core? Was Memphis’s score drop distributed evenly across the city’s schools? We’ll be looking at the data today to try to answer those questions.

Check out all of the scores in our spreadsheet or on the state website and add your questions and insights in the comments.