Yesterday, the left‐wing Center for Tax and Budget Accountability (CTBA) released a misleading report on school choice programs in Indiana and elsewhere. Its key findings include the following claims:
None of the independent studies performed of the most lauded and long standing voucher programs extant in the U.S.—Milwaukee, Wisconsin; Cleveland, Ohio; and Washington, D.C.—found any statistical evidence that children who utilized vouchers performed better than children who did not and remained in public schools.
According to the annual financial report of the Indiana Department of Education, Indiana spent $115 million on its voucher program in the 2014–2015 school year. In context, that means over $115 million of public, taxpayer money annually will be diverted from … the state’s public school system, and instead used to subsidize students attending private schools.
Both claims, while they contain elements of truth, are highly misleading.
Evidence for the Effectiveness of School Choice
To support its claim regarding the supposed lack of evidence for the success of school choice programs, CTBA points to a few studies of school voucher programs.
First, CTBA cites a longitudinal study of Milwaukee’s voucher program by researchers at the University of Arkansas, claiming that voucher students in grades 3–8 “performed statistically similar” to a matched group of district‐school peers on standardized tests. Oddly, CTBA relies on the 2008–2009 findings, published in 2010, rather than the most recent 2012 report. In fact, as the study’s coauthor, Dr. Patrick Wolf, explains, the study found “school choice in Milwaukee has had a modest but clearly positive effect on student outcomes.”
First, students participating in the Milwaukee Parental Choice (“voucher”) Program graduated from high school and both enrolled and persisted in four‐year colleges at rates that were four to seven percentage points higher than a carefully matched set of students in Milwaukee Public Schools. Using the most conservative 4% voucher advantage from our study, that means that the 801 students in ninth grade in the voucher program in 2006 included 32 extra graduates who wouldn’t have completed high school and gone to college if they had instead been required to attend MPS.
Second, the addition of a high‐stakes accountability testing requirement to the voucher program in 2010 resulted in a solid increase in voucher student test scores, leaving the voucher students with significantly higher achievement gains in reading than their matched MPS peers.
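The "extra graduates" figure in the quoted passage is straightforward arithmetic; a quick sketch, using only the 801-student cohort and the conservative 4-percentage-point advantage Wolf cites, confirms it:

```python
# Rough check of the "extra graduates" arithmetic quoted above.
# Figures from the quote: a 2006 ninth-grade voucher cohort of 801
# students and a conservative 4-percentage-point attainment advantage.
cohort_size = 801
advantage = 0.04  # 4 percentage points, as a proportion

extra_graduates = cohort_size * advantage
print(round(extra_graduates))  # 32, matching the quoted figure
```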
In the final year of the study, voucher students in grades 3–9 performed about 15 percent of a standard deviation higher on standardized reading tests, “a modest but meaningful educational difference.” The achievement growth in math was not statistically significant relative to the achievement growth of the matched district‐school students, but the study concluded that Milwaukee district‐school students were “performing at somewhat higher levels as a result of competitive pressure from the school voucher program.” And because the vouchers were worth about half of the per‐pupil cost at the district schools, the study found that the voucher program saved the state nearly $52 million in fiscal year 2011.
The CTBA report entirely ignores previous research from the Brookings Institution, a random‐assignment study (the gold standard of social science research) that found voucher students in Milwaukee scored six Normal Curve Equivalent points higher than the control group in reading and 11 points higher in math.
CTBA also claims that the U.S. Department of Education’s study of Washington D.C.’s school voucher program “found no significant difference between the performance of voucher and non‐voucher students in reading and math.” That is technically true at the 95 percent confidence level, though there were positive findings for reading scores at the 90 percent confidence level. Moreover, students offered vouchers graduated at a rate 12 percentage points higher than the control group, 82 percent versus 70 percent.
To reach its conclusion, the CTBA report completely ignores numerous other gold standard studies from respected researchers at Harvard, Princeton, the University of Chicago, the Brookings Institution, and elsewhere. For CTBA’s benefit, here is a sampling:
- Joshua M. Cowen, “School Choice as a Latent Variable: Estimating ‘Complier Average Causal Effect’ of Vouchers in Charlotte,” Policy Studies Journal, May 2008. — After one year, voucher students had reading scores 8 percentile points higher than the control group and math scores 7 points higher.
- William G. Howell and Paul E. Peterson, The Education Gap: Vouchers and Urban Schools, Brookings Institution, 2002, revised 2006. — After two years, African‐American voucher students had combined reading and math scores 6.5 percentile points higher than the control group.
- Jay P. Greene, “Vouchers in Charlotte,” Education Next, Summer 2001. — After one year, voucher students had combined reading and math scores 6 percentile points higher than the control group.
- Cecilia E. Rouse, “Private School Vouchers and Student Achievement: An Evaluation of the Milwaukee Parental Choice Program,” Quarterly Journal of Economics, May 1998. — After four years, voucher students had math scores 8 NCE points higher than the control group. NCE points are similar to percentile points.
As I noted recently, none of these findings is earth‐shattering, but each study found a statistically significant positive outcome overall or for certain subgroups, particularly low‐income African‐Americans, who are currently the most choice‐deprived.
Evidence of Savings from School Choice
By claiming that Indiana’s school voucher program cost the state $115 million in 2014–15, CTBA completely ignores the savings generated from reduced expenditures on district schools. The average per‐pupil cost in Indiana’s district schools is north of $10,400, whereas the average voucher is worth less than $4,000. Indiana’s base per‐pupil funding amount is about $4,600, and the state gives district schools more than $6,000 for low‐income students, as well as additional funds for other categories of students (e.g., students in special needs programs, academic honors programs, and career and technical training programs), so the state saves money each time a student switches out of a district school to accept a voucher. CTBA does not even acknowledge this fact when discussing the fiscal impact on the state, let alone estimate the true fiscal impact.
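As a back-of-the-envelope illustration, using only the approximate figures cited above (actual per-student funding varies by category and district, so this is a sketch, not a fiscal estimate), the state's net position when a student takes a voucher looks like this:

```python
# Illustrative estimate of state savings per voucher student, using
# the approximate figures cited in the text. Actual amounts vary by
# student category and district; this is a sketch only.
base_per_pupil = 4_600         # approximate state base funding per pupil
avg_voucher = 4_000            # average voucher is worth less than this
low_income_supplement = 6_000  # additional state funds for a low-income student

# A student receiving base funding alone:
savings_base_only = base_per_pupil - avg_voucher
print(savings_base_only)       # 600: savings even in the leanest case

# A low-income student, for whom the state also pays a supplement:
savings_low_income = (base_per_pupil + low_income_supplement) - avg_voucher
print(savings_low_income)      # 6600: savings grow with categorical funding
```

Because the average voucher is worth less than the $4,000 used here, these figures understate rather than overstate the per-student savings.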
Likewise, in its discussion of the fiscal impact of Indiana’s scholarship tax credit law, CTBA focuses solely on the reduction in revenue with nary a mention of the corresponding reduction in expenses. The average scholarship is worth barely $1,000, so every student who switches out of a district school to accept a scholarship yields substantial net savings for the state. In a forthcoming report for the Friedman Foundation for Educational Choice, using highly conservative assumptions, I calculated that the Indiana School Scholarship Tax Credit saved the state approximately $23.2 million in 2014–15.
However, CTBA does try to have its cake and eat it too. While the CTBA report never mentions the reduction in state allocations to district schools resulting from vouchers or scholarships in its discussion of the impact on state spending, it does include a section lamenting the loss of revenue to the district schools. Unsurprisingly, there is no discussion of the reduced costs associated with the reduction in student enrollment.
The CTBA report’s central claims are highly misleading. Policymakers should bury them under a truckload of New England road salt.