March 12, 2015 12:01AM

Did Common Core Do That? We Don’t Actually Know

Common Core supporters love to accuse opponents of peddling misinformation, and sometimes opponents do. On the flip side, Core supporters are frequently guilty not only of peddling deceptive information of their own, but of promising the world without sufficient evidence to justify it. A new report from Harvard’s Paul Peterson – generally a pretty sober analyst – comes a bit too close to making such a leap, strongly suggesting that the Common Core has caused appreciable improvement in the rigor of state standards.

Based on a rough trend of decreasing differences between the percentage of students scoring “proficient” on state tests and on the National Assessment of Educational Progress, Peterson and co-author Matthew Ackerman report that state standards are rising. In other words, “proficient” on state tests is looking more like presumably high-flying “proficient” on the “Nation’s Report Card.”

Between 2011 and 2013, “20 states strengthened their standards, while just 8 loosened them,” Peterson and Ackerman report. To what do they attribute this? “A key objective of the CCSS [Common Core] consortium – the raising of proficiency standards – has begun to happen.” In case the text of the report didn’t make the attribution of success to the Core clear, the report’s subhead intoned that “commitments to the Common Core may be driving the proficiency bar upward.”

At the very least, there should be a huge emphasis on “may,” and the Core probably shouldn’t be mentioned at all.

Indeed, Peterson and Ackerman’s results could suggest that the Common Core actually dampened rigor. According to the report, of the four states that never adopted the Core, Texas and Virginia raised their standards while Alaska and Nebraska stood pat. That means 50 percent of non-adopters lifted their standards and 50 percent stood their ground. None went backward. Among Core adopters, in contrast, eight states, or 18 percent, lowered their standards; 19, or 42 percent, stood still; and only 18, or 40 percent, raised their bars. (I exclude Minnesota, which adopted the English standards but not the math, and West Virginia, for which data were unavailable. Among adopters I include Indiana and Oklahoma, which eventually dropped out but were Core states as of 2013.)
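The tallies above are easy to verify. A quick back-of-the-envelope script (using only the counts cited in this post; the underlying state-by-state data are in the Peterson and Ackerman report, not reproduced here) confirms the percentages:

```python
# Sanity-check the adopter vs. non-adopter percentages cited above.
# Counts come from this post's summary of the Peterson/Ackerman report:
# 45 Core adopters (excluding Minnesota and West Virginia), 4 non-adopters.

def pct(part: int, whole: int) -> int:
    """Share of a group, as a percentage rounded to the nearest whole number."""
    return round(100 * part / whole)

adopters = {"lowered": 8, "stood still": 19, "raised": 18}
total_adopters = sum(adopters.values())  # 45 states

for outcome, count in adopters.items():
    print(f"Adopters who {outcome}: {count}/{total_adopters} = "
          f"{pct(count, total_adopters)}%")

# Non-adopters: Texas and Virginia raised standards; Alaska and Nebraska
# stood pat; none lowered them.
print(f"Non-adopters who raised: 2/4 = {pct(2, 4)}%")
```

Running it reproduces the figures in the paragraph above: 18 percent of adopters lowered their standards, 42 percent stood still, 40 percent raised them, versus 50 percent of non-adopters raising theirs.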

That said, the bigger point is that you can’t reasonably reach any conclusion about the Core’s effect from Peterson and Ackerman’s analysis. For one thing, only Kentucky had fully implemented the Core as of the 2013 NAEP administration, which took place at the beginning of the year, and the vast majority of states weren’t even close. Indeed, most states are just now – 2015 – starting to give the tests associated with the Core, which also happen to define “proficiency.”

There is also a huge problem of controlling for the effects of numerous variables besides the Core, including other federal, state, and local policies; changes in socio-economic conditions; etc. Indeed, toward the end of the report Peterson and Ackerman note the possibility that waivers from No Child Left Behind encouraged some states to raise their standards. Unfortunately, they don’t mention that possibility without also crediting the Core: “Indeed, the waivers—as well as CCSS expectations—may help to account for the increasing rigor of state standards since 2011.”

There’s one last, interesting bit of information in the report: Between 2005 and 2013 – basically, the No Child Left Behind era – almost the same number of states appeared to lower their standards as raise them. Couple that with the possible effect of waivers and the superior performance of non-Core states, and this report could easily be used to say centralized policy interventions don’t work, be they No Child Left Behind or the Common Core. Of course, the report doesn’t justify that conclusion, either.