
Did Common Core Do That? We Don’t Actually Know

Common Core supporters love to accuse opponents of peddling misinformation, and sometimes opponents do. On the flip side, Core supporters are frequently guilty not only of peddling deceptive information of their own, but also of promising the world without sufficient evidence to justify it. A new report from Harvard’s Paul Peterson – generally a pretty sober analyst – comes a bit too close to making such a leap, strongly suggesting that the Common Core has caused appreciable improvement in the rigor of state standards.

Based on a rough trend of shrinking differences between the percentage of students scoring “proficient” on state tests and the percentage doing so on the National Assessment of Educational Progress (NAEP), Peterson and co-author Matthew Ackerman report that state standards are rising. In other words, “proficient” on state tests is looking more like the presumably high-flying “proficient” on the “Nation’s Report Card.”
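To make the comparison concrete, here is a minimal sketch of the gap measure as I read the report’s approach; the function name and all the numbers are hypothetical placeholders, not figures from Peterson and Ackerman.

```python
# Hypothetical illustration of the proficiency-gap comparison: the smaller the
# gap between a state's own "proficient" rate and its NAEP "proficient" rate,
# the closer that state's bar is to NAEP's. All numbers below are made up.

def proficiency_gap(state_pct_proficient, naep_pct_proficient):
    """Percentage-point gap between state-test and NAEP proficiency rates."""
    return state_pct_proficient - naep_pct_proficient

# A state calling 80% of students proficient while NAEP finds 35% has a much
# lower bar than one reporting 40% against that same 35%.
lenient_state = proficiency_gap(80, 35)   # 45-point gap: low bar
stricter_state = proficiency_gap(40, 35)  #  5-point gap: bar near NAEP's

print(lenient_state, stricter_state)
```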

Between 2011 and 2013, “20 states strengthened their standards, while just 8 loosened them,” Peterson and Ackerman report. To what do they attribute this? “A key objective of the CCSS [Common Core] consortium – the raising of proficiency standards – has begun to happen.” In case the text of the report didn’t make the attribution of success to the Core clear, the report’s subhead intoned that “commitments to the Common Core may be driving the proficiency bar upward.”

At the very least, there should be a huge emphasis on “may,” and the Core probably shouldn’t be mentioned at all.  

Indeed, Peterson and Ackerman’s results could suggest that the Common Core actually dampened rigor. According to the report, of the four states that never adopted the Core, Texas and Virginia raised their standards while Alaska and Nebraska stood pat. That means 50 percent of non-adopters lifted their standards and 50 percent stood their ground. None went backward. Among Core adopters, in contrast, eight states, or 18 percent, lowered their standards; 19, or 42 percent, stood still; and only 18, or 40 percent, raised their bars. (I exclude Minnesota, which adopted the English standards but not the math, and West Virginia, for which data were unavailable. Among adopters I include Indiana and Oklahoma, which eventually dropped out but were Core states as of 2013.)
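For anyone who wants to check that arithmetic, here is a quick back-of-the-envelope tally using the state counts cited above; it is my own illustration, not anything from the report.

```python
# Tally of standards changes, 2011-2013, using the counts cited above.
# Adopter totals exclude Minnesota and West Virginia, as noted in the text.

non_adopters = {"raised": 2, "stood pat": 2, "lowered": 0}   # TX, VA / AK, NE
adopters     = {"raised": 18, "stood pat": 19, "lowered": 8}

for label, counts in [("Non-adopters", non_adopters), ("Adopters", adopters)]:
    total = sum(counts.values())
    shares = {k: round(100 * v / total) for k, v in counts.items()}
    print(label, total, shares)

# Non-adopters: 50% raised, 50% stood pat,  0% lowered (of 4 states)
# Adopters:     40% raised, 42% stood pat, 18% lowered (of 45 states)
```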

Bush or Obama: Can We Tell Who Shuffles the Edu-Chairs Better?

I’m a Paul Peterson fan, and I sure don’t think President Obama’s education grade should be very high, but I’m afraid Peterson is offering some pretty weak stuff in this op-ed hoisting President George W. Bush above the current POTUS in education policy.

The main problem is that Peterson uses broad NAEP data as his main evidence of Bush’s success and Obama’s failure. Not only are these data far too blunt to tell us much about a single administration’s policies (myriad forces are at work in education beyond federal rules and regulations), but it is also a serious stretch to suggest that we should expect to see big testing gains from any policy within a year or two of its enactment. Peterson even hints as much late in his treatment of Obama, noting that “NAEP data are available for just the first two years of his administration, [but] the early returns are not pretty.”

“Early returns” is right, considering that President Obama only took office in 2009, the first winners of Race to the Top—Obama’s main “reform” driver—weren’t declared until late August 2010, and the NAEP exams were administered between January and March of 2011.

More troubling, though, is the praise Peterson heaps on President Bush and No Child Left Behind. I’ve broken down NAEP scores six ways from Sunday and won’t rehash it all again, but based on improvement rates the NCLB era hasn’t been all that special. More important for this discussion, again considering policy implementation lags, it is a big leap to look at NAEP scores and crown Bush the edu-winner.

Let’s break down Peterson’s biggest advantage-Bush claim: “Overall, the annual growth rate in fourth- and eighth-grade math was twice as rapid under the Bush administration as under his successor’s.” (Actually, his biggest claim is that fourth-grade reading performance under Bush was “infinitely” better than under Obama, but that is only because there has been no gain at all under Obama, and any gain divided by zero is infinite, not because Bush-era scores were numerically much better.)

By far the biggest increase in fourth-grade math scores in a span that included Bush presidency years came between 2000 and 2003, when the average score rose three points per year. Pretty impressive, but Bush didn’t become President until 2001, and NCLB wasn’t enacted until January 2002. With NAEP administered between January and March 2003, NCLB had been law for only about a year of that span, and it took much of that year for people just to figure out what NCLB was all about. In other words, it is unlikely, though admittedly not impossible, that the improvement in that period was due primarily to NCLB. Pull that span out of the Bush years, though, and growth was just 0.83 points (out of 500) per year. The years 2009 to 2011 saw a 0.50-point-per-year uptick, so neither president saw growth that was especially impressive.

For eighth-grade math, again excluding 2000 to 2003, the Bush years registered the same 0.83-point-per-year growth rate, and 2009 to 2011 again came in at 0.50 points per year.
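The per-year rates are just point gains divided by elapsed years. Here is a quick sketch of that arithmetic, using the gains implied by the rates quoted above rather than a fresh pull from the NAEP tables.

```python
# Annualized NAEP growth: point gain divided by elapsed years.
# Gains below are the ones implied by the rates quoted in the text
# (e.g., 9 points over 2000-2003 in fourth-grade math), not new data.

def annual_growth(gain_points, start_year, end_year):
    return gain_points / (end_year - start_year)

print(annual_growth(9, 2000, 2003))  # 3.0  points/year, mostly pre-NCLB
print(annual_growth(5, 2003, 2009))  # 0.83 points/year, rest of Bush era
print(annual_growth(1, 2009, 2011))  # 0.5  points/year, early Obama era
```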

The Bush years look a bit better than Obama’s just on NAEP, which alone tells us little, but neither president’s tenure is all that great, especially considering that, as noted, several pre-NCLB periods saw faster growth. And then there’s the big picture: when you get to 17-year-olds, our schools’ “final products,” NAEP scores have been utterly stagnant for decades despite per-pupil expenditures roughly tripling and Washington getting ever more involved.

Even if you could tell who nudged an Adirondack chair a millimeter farther from the abyss, arguing about which president is the better chair-shuffler is really missing the sinking ship.