This past spring, Connecticut students took the Smarter Balanced Assessments for the first time. The exam, aligned with the controversial Common Core State Standards, is an online-only exam that is “adaptive” — meaning questions get more difficult if you get the previous one correct and easier if you get it wrong.
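To make the "adaptive" idea concrete, here is a minimal sketch of how such a test loop could work. It is purely illustrative, not the actual Smarter Balanced engine: the question bank, difficulty scale and the coin-flip stand-in for a student's answer are all hypothetical.

```python
import random

def run_adaptive_quiz(num_questions=10, min_level=1, max_level=10):
    """Toy adaptive loop: raise difficulty after a correct answer,
    lower it after an incorrect one (hypothetical, for illustration)."""
    level = 5  # start at a middle difficulty
    history = []
    for _ in range(num_questions):
        # Stand-in for presenting a question at this difficulty;
        # a coin flip decides whether the student answers correctly.
        correct = random.random() < 0.5
        history.append((level, correct))
        if correct:
            level = min(max_level, level + 1)  # next question gets harder
        else:
            level = max(min_level, level - 1)  # next question gets easier
    return history

if __name__ == "__main__":
    for lvl, ok in run_adaptive_quiz():
        print(f"difficulty {lvl}: {'correct' if ok else 'incorrect'}")
```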
The results of that exam were released on Friday, and TrendCT parsed the data to understand what it tells us.
1. It’s hard to do an apples-to-apples comparison
Usually we can see how students improve from year to year because they take the same exam. But because of the transition to a new exam, we can't measure how much students have grown since the previous year. To gauge the effectiveness of a teacher or school, you want to measure how much they've helped a student improve from one year to the next, not just a raw score. So a second year of exams should make this data much more useful.
In two years, the test results are scheduled to become part of teacher ratings.
2. This is a harder test — or the benchmarks are higher
If we look at the percentage of students who “passed” — meaning they were judged proficient in a tested subject — it is significantly lower with the SBAC exam compared to the state exam students took in 2013.
| Subject | SBAC 2015 | State's exam 2013 |
| --- | --- | --- |
| English | 55% | 81% reading / 84% writing |
But because the test has changed, it's only accurate to say test scores "dropped" in the narrow sense that fewer students passed each test. There's really no meaningful conclusion to draw from comparing the two exams other than that the benchmarks are higher than before. As the Mirror's Jacqueline Rabe Thomas points out, other states are seeing the same thing.
3. Students still do worse in math
In the previous test, students didn’t score as well in math as they did in writing or reading. But in this new test, the gap is even wider.
With the state exams in 2013, 83 percent of students passed the math section; with the new exam, just 39 percent did. That pass rate fell by more than half, a much steeper decline than on the English section.
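For readers who want to check the arithmetic, here is a quick calculation of those relative declines, using the pass rates reported in this article (the 81 percent figure is the 2013 reading rate):

```python
# Back-of-the-envelope check of the relative declines in pass rates.
math_2013, math_2015 = 0.83, 0.39
eng_2013, eng_2015 = 0.81, 0.55

math_drop = 1 - math_2015 / math_2013  # ~0.53: more than half
eng_drop = 1 - eng_2015 / eng_2013     # ~0.32: about a third

print(f"Math pass rate fell by {math_drop:.0%}")     # 53%
print(f"English pass rate fell by {eng_drop:.0%}")   # 32%
```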
The new test focuses more on creative problem-solving than memorization.
4. Urban schools in low-income areas did significantly worse
Districts are separated into groups called “District Reference Groups,” or DRGs, based on their socioeconomic status. The districts with the greatest need include Bridgeport, Hartford, New Britain, New Haven, New London, Waterbury and Windham.
On average, the pass rate for those districts was 27 percent for English and 15 percent for math, significantly lower than in any other group of districts.
| Group | Pct. passing English | Pct. passing math |
| --- | --- | --- |
| DRG I schools | 27% | 15% |
| All other schools | 61% | 44% |
But interestingly, about the same percentage of students in both groups fall into “Level 2” — the proficiency level that is closest to passing, but not actually passing. It’s “Level 1” — the lowest level — where the largest difference is. In English, the average percentage of students at “Level 1” among high-need schools is 46 percent; for math, it’s 59 percent.
5. One in 10 black and Latino students in high-need schools passed math
At these high-need schools, the average math pass rate was 11 percent among black students (versus 22 percent at all other schools), 9 percent among Hispanic or Latino students (versus 30 percent) and 30 percent among white students (versus 48 percent).
Now, this isn’t necessarily an indictment of these schools, because we don’t know how much growth these students exhibited over time. But it does show a large achievement gap between students in high-need areas and others.
| Race/ethnicity | DRG I schools (math pass rate) | All other schools (math pass rate) |
| --- | --- | --- |
| American Indian or Alaska Native | 15% | 17% |
| Black or African American | 11% | 22% |
| Hispanic/Latino of any race | 9% | 30% |
| Two or More Races | 18% | 43% |
| White | 30% | 48% |