Is there a connection between education spending and student achievement?
This is the newest installment in Dr. Pearlstein’s year-long review of how Minnesota students are doing compared to students in other states, and more broadly, how American students are doing compared to counterparts elsewhere in the world. Like the last installment, which looked at how boys in the United States are faring compared with girls (not well, was the answer), this one asks an important contextual question: What are the connections, if any, between how much money Americans have spent on education over the last four decades and how well American students have performed over that period?
The Minnesota Legislature won’t deal in any comprehensive way with education spending until next year, but any legislative session is a good time to take a look at the connections, if any, between how much money Minnesotans and other Americans spend on public education and how much students actually learn.
The best study I’ve seen recently was by Andrew Coulson of the Cato Institute in Washington, who died of cancer just last month. As with John Chubb, another invaluable education scholar who died of cancer a few months ago, Coulson was methodologically brilliant, making him particularly compelling in his research-based advocacy of greater parental choice and real educational freedom.
Coulson’s quantitative skills, as well as clear-writing talents, were on full display in his just-cited 2014 study, State Education Trends: Academic Performance and Spending over the Past 40 Years, as when he wrote: “[T]he overall picture can be summarized in a single value: 0.075.”
That is the correlation between the spending and academic performance changes over the past 40 years, for all 50 states. Correlation coefficients run from -1 to 1, where 0 represents no relationship at all between two data series and 1 (or -1) represents a perfect positive (or negative) relationship. Anything with a magnitude below 0.3 or 0.4 is considered a weak correlation. The 0.075 figure reported here suggests that there is essentially no link between state education spending (which has exploded) and the performance of students at the end of high school (which has generally stagnated or declined).
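For readers who want to see concretely what a correlation coefficient measures, here is a minimal sketch. The two data series are made-up illustrative numbers, not Coulson’s actual state-level data:

```python
# Pearson correlation between two series -- a toy illustration,
# not Coulson's actual data.
import statistics

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical states: percent spending growth vs. point change in scores.
spending_growth = [120, 80, 150, 95, 200, 110, 75, 160]
score_change    = [  1,  -2,   0,   3,  -1,   2,  -3,   1]
r = pearson(spending_growth, score_change)
print(round(r, 3))  # a small magnitude means little linear relationship
```

A value like Coulson’s 0.075 sits so close to zero that spending growth tells you almost nothing about score changes.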
Please note a key phrase at the end of this passage: “. . . the performance of students at the end of high school (which has generally stagnated or declined).” Haven’t younger students frequently progressed better than this? Yes, they have, on average. But one of the more distressing dynamics in U.S. public education is that the older American kids get, the less well they do, generally speaking, in comparison with students around the world.
Coulson also wrote: “The performance of 17-year-olds has been essentially stagnant across all subjects since the federal government began collecting trend data around 1970, despite a near tripling of the inflation-adjusted cost of putting a child through the K-12 system.” The full amount was $56,903 in 1970. It was $164,426 in 2010.
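The “near tripling” is easy to verify from the two inflation-adjusted figures Coulson reports:

```python
# Checking the "near tripling" claim from Coulson's two reported figures.
cost_1970 = 56_903   # inflation-adjusted cost of K-12, per child, 1970
cost_2010 = 164_426  # same measure, 2010
ratio = cost_2010 / cost_1970
print(round(ratio, 2))  # 2.89 -- just shy of a full tripling
```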
What about Minnesota? According to my reading of Coulson’s graphs, per-pupil percentage increases in Minnesota education spending, relative to 1972, were smaller than in most other states, roughly akin to the percentage increases in Alaska, Arizona, Idaho, Michigan, Oklahoma, Oregon, and Utah. Even so, per-pupil, inflation-adjusted K-12 education spending in Minnesota still increased by more than 80 percent between 1972 and about 2010. Which is to say, it came close to doubling in real terms. Not skimpy.
As for how Minnesota students fared in terms of achievement over the 38-year period, while there were small changes up and down, “SAT scores adjusted for participation and demographics” wound up exactly where they started. (Building on the work of economists Mark Dynarski and Philip Gleason, Coulson ingeniously developed a way of adjusting state-average SAT scores to compensate for varying rates of test taking as well as disparate demographic characteristics. To see how he did so, please take a look at pp. 2-3 of his paper.)
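Coulson’s paper (pp. 2-3) gives the details of his adjustment; the generic idea behind any such correction is to regress scores on the confounding variable and keep the residual. The sketch below illustrates that general idea with a single hypothetical confounder (participation rate), and is not Coulson’s actual procedure:

```python
# Generic "adjust for participation" idea: regress scores on the
# confounder and keep the residual. A sketch of the general technique,
# NOT Coulson's actual method (see pp. 2-3 of his paper for that).
import statistics

def adjusted_scores(scores, participation_rates):
    # One-variable least-squares regression: score ~ participation rate.
    mx = statistics.mean(participation_rates)
    my = statistics.mean(scores)
    slope = (sum((x - mx) * (y - my)
                 for x, y in zip(participation_rates, scores))
             / sum((x - mx) ** 2 for x in participation_rates))
    intercept = my - slope * mx
    # Residual: how far each state sits above or below the score its
    # participation rate alone would predict.
    return [y - (intercept + slope * x)
            for x, y in zip(participation_rates, scores)]

# Hypothetical states: higher participation tends to drag raw averages down.
raw_sat = [1050, 980, 1100, 1010]
participation = [0.20, 0.70, 0.10, 0.55]
print([round(a, 1) for a in adjusted_scores(raw_sat, participation)])
```

The residuals are what is left of each state’s average after the predictable effect of the confounder is stripped out, which is why they are a fairer basis for cross-state comparison than raw averages.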
But what about actual dollars spent per student in Minnesota, not just rates of change?
For fiscal year 2013, the Census Bureau reports that average per-pupil spending on public school students in the 50 states plus the District of Columbia was $10,700. It was modestly higher in Minnesota, at $11,089. Notable states in which per-pupil spending was lower than in Minnesota included Arizona ($7,028), California ($9,220), Colorado ($8,647), Missouri ($9,597), Oregon ($9,543), Texas ($8,299), and Washington ($9,672).
Everyone should have a favorite education economist (econometrician, actually), and Stanford’s Eric Hanushek is mine. This is how, in a comprehensive literature review in 2006, he distilled scholarly research on the ties between how much money is spent on education and how much students learn:
[T]he research indicates little consistent relationship between resources to schools and student achievement. Much of the research considers how resources affect student achievement as measured by standardized test scores. These scores are strongly related to [future] individual incomes and to national economic performance, making them a good proxy for longer run economic impacts. But, the evidence – whether from aggregate school outcomes, econometric investigations, or a variety of experimental or quasi-experimental approaches – suggests that pure resource policies that do not change incentives are unlikely to be effective.
I interpret Hanushek’s most recent writing as affirming that nothing over the last decade has caused him to change this professorially couched but critically instructive conclusion.