Yesterday I was told which of my students passed the New York State English Language Arts and Math exams, and which will be attending summer school for a second try. Unfortunately (or perhaps fortunately), we didn’t get exact scores that would let me see precisely how well my students did, and therefore how well I did as a teacher. Instead, I got two copies of my class roster with each student labeled either “Met Promotion Criteria” or “Did Not Meet Promotion Criteria.” Somewhat unsatisfying, no?
The Good News
While 10 of my students scored 1’s in the fall ELA simulation, all but four of them met the criteria, even in a year when the criteria were supposedly tougher. In math, all but two met the promotional criteria. Considering my students were acknowledged to be the lowest-performing class in the school, I’m pretty proud of the class as a whole.
The Bad News
Of course the other side of this story is that four of my students didn’t pass the test. That’s more than any other class, and while I know my students and their limitations, that won’t matter when I’m judged against my peers. Worse still, the results included one big surprise: a student who is currently reading at a late second- or early third-grade level, meaning he could, and arguably should, have passed the test. He passed every simulation we took. But he never showed any true interest or effort, and I blame myself for never finding a way to spark that in him.
The Slightly Reassuring News
While I’m not happy that any of my students failed, I wasn’t surprised by the results (except for the one student mentioned above). In a strange way the results were slightly reassuring, a sign that the test isn’t completely invalid. Three of the students who didn’t pass the ELA are reading at level E or lower (that’s kindergarten). Two of those students are the two who didn’t pass the math, largely, I believe, because they can’t read. Given where they are, in spite of the growth they have made, they should not have been able to pass the test. Their failure represents a glimmer of hope for the legitimacy of these assessments.
In addition, the two lowest-performing students, who failed both exams, are SIFE (Students with Interrupted Formal Education). One of them attended kindergarten at my school and then left, only to return in May of last year. The other left midway through the year, every year, until this year. So it’s not totally unexpected that they aren’t performing at grade level. In the end it might be good for them to be held back, or at the very least to attend summer school. They certainly won’t catch up otherwise.
I understand why the official performance scale scores need to be delayed, and I’m hesitant to attribute the delay to any sort of conspiracy. The tests were given later this year, so it makes sense that the scores will be released later. Still, the State Education Department and everyone involved with the exams at the city and state level are doing a disservice to teachers, students, and themselves by allowing even a hint of wrongdoing to hang over the tests and their scores. They should be bending over backward to make the process as transparent as possible.
I imagine the process of translating the raw scores into performance scores is pretty benign and boring. Allowing the media or some third-party organization to monitor the process would instantly erase any shadow of malpractice from the state exams. As far as I can tell no such monitoring is taking place. Instead, while we all wait for the binary “met promotional criteria”/“did not meet promotional criteria” to be magically transformed into familiar numbers, we have to wonder how exactly it’s getting done, and why it’s being done without any openness.
About our First Person series:
First Person is where Chalkbeat features personal essays by educators, students, parents, and others trying to improve public education. Read our submission guidelines here.