Five things the SBAC scores show us


This past spring, Connecticut students took the Smarter Balanced Assessments for the first time. The exam, aligned with the controversial Common Core State Standards, is an online-only exam that is “adaptive” — meaning questions get more difficult if you get the previous one correct and easier if you get it wrong.
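To make the "adaptive" idea concrete, here is a minimal sketch of how such an exam might pick its next question. It is a toy difficulty ladder, not SBAC's actual engine (the real one is built on item response theory); the item pool and answer function below are hypothetical.

```python
# Toy model of an adaptive exam: step difficulty up after a correct
# answer and down after a miss. Illustration only -- not SBAC's method.

def run_adaptive_exam(items_by_difficulty, answer_fn, start=3, num_questions=10):
    """Serve questions, adjusting difficulty based on each response."""
    lowest, highest = min(items_by_difficulty), max(items_by_difficulty)
    difficulty = start
    history = []
    for _ in range(num_questions):
        question = items_by_difficulty[difficulty]
        correct = answer_fn(question)
        history.append((difficulty, correct))
        # Harder question after a correct answer, easier after a miss,
        # staying inside the available difficulty range.
        difficulty = min(difficulty + 1, highest) if correct else max(difficulty - 1, lowest)
    return history

# Hypothetical five-item pool; a student who answers everything
# correctly climbs to the hardest items and stays there.
pool = {1: "2 + 2", 2: "13 x 3", 3: "1/4 + 1/3", 4: "solve 3x - 7 = 8", 5: "factor x^2 - 5x + 6"}
print(run_adaptive_exam(pool, answer_fn=lambda q: True))
```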

The results of that exam were released on Friday, and TrendCT parsed the data to understand what it tells us.

1. It’s hard to do an apples-to-apples comparison

Usually we can see how students improve from year to year because they take the same exam. But because of the transition to a new exam, we can’t measure how much students have grown since the previous year. If you want to measure the effectiveness of a teacher or school, you want to measure how much they’ve helped a student improve from one year to the next, not just look at a raw score. So a second year of exams should make this data much more useful.

In two years, the test results are scheduled to become part of teacher ratings.
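As a sketch of why that second year matters: with two administrations of the same exam, the unit of analysis can shift from a raw pass rate to per-student growth. The field names below are hypothetical, not the state's actual data layout.

```python
# Hypothetical growth calculation: compare each student's scale score
# across two years of the same exam.

def growth_scores(prior_year, current_year):
    """Return each student's score change, for students tested both years."""
    return {
        student: current_year[student] - prior_year[student]
        for student in prior_year
        if student in current_year  # only students with scores in both years
    }

scores_2015 = {"student_a": 2430, "student_b": 2510}
scores_2016 = {"student_a": 2490, "student_b": 2525}  # hypothetical second year
print(growth_scores(scores_2015, scores_2016))
# {'student_a': 60, 'student_b': 15}
```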

2. This is a harder test — or the benchmarks are higher

If we look at the percentage of students who “passed” — meaning they were judged proficient in a tested subject — it is significantly lower with the SBAC exam compared to the state exam students took in 2013.

Percentage of students ‘passing’ exam, 2013 vs. 2015
The number of students passing plummeted — but the test and benchmarks changed.

Subject | SBAC 2015 | State’s exam 2013
Math | 39% | 83%
English | 55% | 81% reading / 84% writing

But because the test has changed, it’s only fair to say that test scores dropped if we mean that the number of students who passed each test dropped. There’s really no meaningful conclusion to draw from comparing the two tests other than that the benchmarks are higher than before. As the Mirror’s Jacqueline Rabe Thomas points out, it’s something other states are also seeing.

3. Students are still worse in math

In the previous test, students didn’t score as well in math as they did in writing or reading. But in this new test, the gap is even wider.

With the state exams in 2013, 83 percent passed the math section — but with the new exam, just 39 percent did. The percentage of students passing dropped by more than half, which didn’t happen on the English section.
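The arithmetic behind that claim, using the figures from the table above:

```python
# Relative drop in pass rates from the 2013 state exam to the 2015 SBAC.
math_drop = (83 - 39) / 83       # ~0.53: more than half of the old pass rate
english_drop = (81 - 55) / 81    # ~0.32 against the 2013 reading figure
print(round(math_drop, 2), round(english_drop, 2))  # 0.53 0.32
```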

The new test focuses more on creative problem-solving than memorization.

4. Urban schools in low-income areas did significantly worse

Districts are separated into groups called “District Reference Groups,” or DRGs, based on their socioeconomic status. The districts with the greatest need include Bridgeport, Hartford, New Britain, New Haven, New London, Waterbury and Windham.

On average, the pass rate for those districts was 27 percent for English and 15 percent for math — significantly lower than the average for all other districts.

Average pass rate for SBAC, by type of school
Note: DRG I schools include those in the following districts: Bridgeport, Hartford, New Britain, New Haven, New London, Waterbury and Windham.

Group | Pct. passing English | Pct. passing Math
DRG I schools | 27% | 15%
All other schools | 61% | 44%
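For readers curious how a table like this is built, here is a rough sketch of the aggregation in pandas. The file and column names are hypothetical stand-ins for the state's released district-level results, not its actual schema.

```python
# Sketch: average district pass rates, grouped by DRG I membership.
import pandas as pd

DRG_I = {"Bridgeport", "Hartford", "New Britain", "New Haven",
         "New London", "Waterbury", "Windham"}

df = pd.read_csv("sbac_district_results.csv")  # hypothetical export
df["group"] = df["district"].map(
    lambda d: "DRG I schools" if d in DRG_I else "All other schools")

# Unweighted mean of each district's pass rate, by group and subject.
print(df.groupby("group")[["pct_pass_english", "pct_pass_math"]].mean())
```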

But interestingly, about the same percentage of students in both groups fall into “Level 2” — the proficiency level that is closest to passing, but not actually passing. It’s “Level 1” — the lowest level — where the largest difference is. In English, the average percentage of students at “Level 1” among high-need schools is 46 percent; for math, it’s 59 percent.

5. About one in 10 black and Latino students in high-need schools passed math

At these high-need schools, the average pass rate among black students was 11 percent, versus 22 percent at all other schools; among Hispanic or Latino students, it was 9 percent, versus 30 percent for all other schools; among white students, it was 30 percent versus 48 percent at all other schools.

Now, this isn’t necessarily an indictment of these schools, because we don’t know how much growth these students exhibited over time. But it does show a large achievement gap between students in high-need areas and others.

Average math pass rate among high-need districts, by race
Note: This isn’t looking at the average pass rate for all students in high-need schools. Rather, it is averaging each district-wide rate for minority groups. It is a small difference, but an important one.

Race/ethnicity | DRG I schools | All other schools
American Indian or Alaska Native | 15% | 17%
Asian | 41% | 65%
Black or African American | 11% | 22%
Hispanic/Latino of any race | 9% | 30%
Two or more races | 18% | 43%
White | 30% | 48%
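The note above is worth unpacking: averaging each district's rate weights every district equally, while a pooled, student-level rate weights large districts more. A small sketch with invented numbers shows how the two can diverge.

```python
# Unweighted mean of district rates (the table's method) vs. a pooled
# student-level rate. All numbers here are made up for illustration.

def unweighted_mean(rates):
    """Average of district-level pass rates; every district counts equally."""
    return sum(rates) / len(rates)

def pooled_rate(passed, tested):
    """Pass rate across all students; larger districts count for more."""
    return sum(passed) / sum(tested)

rates = [0.30, 0.10]    # two districts' pass rates
passed = [30, 100]      # students passing in each district
tested = [100, 1000]    # students tested in each district

print(unweighted_mean(rates))       # 0.2
print(pooled_rate(passed, tested))  # ~0.118
```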

What do you think?

  • Mary Burnham

    Thanks to Alvin Chang for breaking through the myriad of statistics to glean
    “five major things the SBAC scores show us” that were already well-established and
    known to educators, politicians, and the public alike. His first two points raise the necessity for
    cautious consideration of these released preliminary SBAC test results. Even though the SBAC
    may or may not be a harder test, it had been purposefully designed to claim increased “rigor” and
    sustain the misleading narrative of failing schools and inadequate teachers. Mr. Chang’s last
    three points also reflect no new information. Squandering millions of dollars on an unfair and
    unproven assessment system in order to gather data that is already well known simply demonstrates
    how wasteful our government has been while trying to band-aid serious issues like poverty,
    segregation, and inequitable funding in our already-identified high-needs and under-performing
    school districts. For our political leaders and education officials to continue to stand by this
    psychometrically flawed, invalid, unreliable, and biased assessment protocol is unconscionable.

  • Jenn

    #1-“In two years, the test results are scheduled to become part of teacher ratings.” Worst idea ever. Besides the obvious reasons, there is no statistical validity to using SBAC test results in teacher evaluations, and it ultimately puts children in the ill-suited position of being responsible for 22.5% of their teachers’ effectiveness ratings. They have no business, at 8 to 16 years old, being put in that position. Thank D.C. for that one.

    #2-The scores dropped, and fewer students were deemed proficient, for one reason only: the cut scores, set using the field test results, were set to fail the number who did. There is no magic wand to the results.

    #3-Now this is interesting. Were they worse in Math? It was almost 10% harder to pass the Math portion than the ELA, again due to how the SBAC member states voted to set the cut scores. Here they are:

    Specifically, for the Math achievement levels:
    grade 3: 61% to fail, 39% to pass (the greatest # of students would pass Math in grade 3)
    grade 4: 63% to fail, 37% to pass
    grade 5: 67% to fail, 33% to pass
    grade 6: 67% to fail, 33% to pass
    grade 7: 67% to fail, 33% to pass
    grade 8: 68% to fail, 32% to pass (the greatest # of students would fail Math in grade 8)
    grade 11: 67% to fail, 33% to pass
    National Overall Average Expected to Pass Math: 34.29%
    State of CT Overall Math Pass Rate: 39.1%

    ELA achievement levels:
    grade 3: 62% to fail, 38% to pass (the greatest # of students would fail ELA in grades 3 and 7)
    grade 4: 59% to fail, 41% to pass
    grade 5: 56% to fail, 44% to pass
    grade 6: 59% to fail, 41% to pass
    grade 7: 62% to fail, 38% to pass
    grade 8: 59% to fail, 41% to pass
    grade 11: 59% to fail, 41% to pass
    National Overall Average Expected to Pass ELA: 40.57%
    CT Overall ELA Pass Rate: 55.4%
    So they were worse in Math because it was set up to be that way. Using the low and high points as the range: grade 8 Math was set to have the lowest pass point, at only 32%, whereas on average 44% would pass grade 5 ELA. That is a 12-point difference in pass rates between subject and grade. And P.S.: why would there be different “pass” percentages by grade and also by subject? When looking for a standard measure, all cut scores should have been equal across grades and subjects. It was simply harder or easier to pass based on the child’s grade and the subject.
    #4-The same information we have had since No Child Left Behind mandated annual testing in 2001, so what’s new?
    #5-See #4.
    The atrocity that is SBAC simply robbed low-income schools and students in CT and the other member states of a grotesque amount of taxpayer money and resources that could actually have done something about #4 and #5.
    To use an instrument that has no validity or reliability for decision-making purposes on human beings or systems, and that has no statistical value whatsoever, is akin to malpractice.

    • alvinschang

      To play devil’s advocate, and not get too far into No. 5, I’m really curious to see more data in the coming years. To see growth data from other states taking the exam is interesting to me. I guess, right now, I’m not all that interested in the number of students scoring proficient or not, because as this year’s results show, that is often about where the goal line is — albeit the goal line is set methodically. But all this to say: I’m curious to see more data.

  • Julie Nevers

    My Fairfield Warde High School children were not interested in taking these tests. Bright children. Each averaging close to a 5 in their 9+ AP courses (over 20 total). They played games on the test: what is the most absurd answer I can put? I am sure they failed miserably. Wonder how many other students did that.

    • Jenn

      I am guessing quite a few, given the discrepancy between the two high schools. That is why opting out is the better choice going forward, as the commissioner will use test results to rank schools and they will negatively impact teacher effectiveness ratings, also thanks to D.C., our commissioner, and our state ESEA waiver. Using children and a tool that has no validity for such uses must come to an end.

  • Joseph Brzezinski

    Regardless of how SDE uses SBAC to evaluate school districts, schools, and/or teachers, a growing number of colleges, universities, vocational schools, etc. will incorporate individual scores or school results along with other factors to develop selection “profiles” for acceptance and scholarship or financial awards.

    In private businesses, analytics has been one of the fastest-growing activities for several years, and it can be expected to catch on in both private and public higher education.

    Consequently, playing games while taking the tests, or even opting out, may exact a penalty for individual students and, more generally, for the community of students within particular schools.

  • Norma

    How was the comparison for schools that piloted SBAC in the 2013-2014 school year?