The Dumb Math Error Heard ‘Round the World

Rarely has any one study had such an immediate impact on global policy. Usually, folks wait until a study is replicated and put through robust follow-up before anyone takes research to heart. Reinhart & Rogoff (2010) basically played into the narrative of the plutocracy and what the ‘very serious people’ around the world wanted, so its significant findings were taken very seriously before any of those secondary tests of robustness were undertaken. Well, now we find out that the wunderkind study used to justify a lot of unnecessary austerity has some serious math mistakes. I’m still reading through all the criticisms, but as Krugman puts it, “Holy Coding Error, Batman!” Let’s just call this some serious MATH FAIL!

The intellectual edifice of austerity economics rests largely on two academic papers that were seized on by policy makers, without ever having been properly vetted, because they said what the Very Serious People wanted to hear. One was Alesina/Ardagna on the macroeconomic effects of austerity, which immediately became exhibit A for those who wanted to believe in expansionary austerity. Unfortunately, even aside from the paper’s failure to distinguish between episodes in which monetary policy was available and those in which it wasn’t, it turned out that their approach to measuring austerity was all wrong; when the IMF used a measure that tracked actual policy, it turned out that contractionary policy was contractionary.

The other paper, which has had immense influence — largely because in the VSP world it is taken to have established a definitive result — was Reinhart/Rogoff on the negative effects of debt on growth. Very quickly, everyone “knew” that terrible things happen when debt passes 90 percent of GDP.

Some of us never bought it, arguing that the observed correlation between debt and growth probably reflected reverse causation. But even I never dreamed that a large part of the alleged result might reflect nothing more profound than bad arithmetic.

The best explanation of the problem that I’ve seen comes from Mike Konczal at RortyBomb.

In 2010, economists Carmen Reinhart and Kenneth Rogoff released a paper, “Growth in a Time of Debt.” Their “main result is that…median growth rates for countries with public debt over 90 percent of GDP are roughly one percent lower than otherwise; average (mean) growth rates are several percent lower.” Countries with debt-to-GDP ratios above 90 percent have a slightly negative average growth rate, in fact.

This has been one of the most cited stats in the public debate during the Great Recession. Paul Ryan’s Path to Prosperity budget states their study “found conclusive empirical evidence that [debt] exceeding 90 percent of the economy has a significant negative effect on economic growth.” The Washington Post editorial board takes it as an economic consensus view, stating that “debt-to-GDP could keep rising — and stick dangerously near the 90 percent mark that economists regard as a threat to sustainable economic growth.”

Is it conclusive? One response has been to argue that the causation is backwards, or that slower growth leads to higher debt-to-GDP ratios. Josh Bivens and John Irons made this case at the Economic Policy Institute. But this assumes that the data is correct. From the beginning there have been complaints that Reinhart and Rogoff weren’t releasing the data for their results (e.g. Dean Baker). I knew of several people trying to replicate the results who were bumping into walls left and right – it couldn’t be done.

In a new paper, “Does High Public Debt Consistently Stifle Economic Growth? A Critique of Reinhart and Rogoff,” Thomas Herndon, Michael Ash, and Robert Pollin of the University of Massachusetts, Amherst successfully replicate the results. After trying to replicate the Reinhart-Rogoff results and failing, they reached out to Reinhart and Rogoff, who were willing to share their data spreadsheet. This allowed Herndon et al. to see how Reinhart and Rogoff’s data was constructed.

They find that three main issues stand out. First, Reinhart and Rogoff selectively exclude years of high debt and average growth. Second, they use a debatable method to weight the countries. Third, there also appears to be a coding error that excludes high-debt and average-growth countries. All three bias in favor of their result, and without them you don’t get their controversial result.
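To make those mechanics concrete, here is a minimal sketch with invented numbers (not the actual Reinhart-Rogoff dataset). It shows how an averaging range that stops a few rows short, combined with weighting every country equally so that one awful year counts as much as nineteen ordinary ones, can drag a roughly 2 percent growth average below zero.

```python
# Hypothetical country-level mean growth rates (%) during years with debt/GDP
# above 90 percent, and the number of years each country spent in that bucket.
# These numbers are made up purely to illustrate the mechanics.
episodes = {
    "Country A": (2.6, 10),
    "Country B": (2.5, 19),
    "Country C": (1.9, 4),
    "Country D": (-7.6, 1),   # a single terrible year
    "Country E": (2.4, 5),
    "Country F": (1.0, 6),
}

def year_weighted_mean(data):
    """Weight every country-year equally."""
    total_years = sum(n for _, n in data.values())
    return sum(g * n for g, n in data.values()) / total_years

def country_weighted_mean(data, drop=()):
    """Weight every country equally, optionally dropping rows --
    mimicking an averaging formula whose range stops short."""
    kept = [g for country, (g, _) in data.items() if country not in drop]
    return sum(kept) / len(kept)

drop = {"Country A", "Country B", "Country E"}
print(f"year-weighted mean growth:        {year_weighted_mean(episodes):5.2f}%")
print(f"country-weighted mean growth:     {country_weighted_mean(episodes):5.2f}%")
print(f"country-weighted, rows excluded:  {country_weighted_mean(episodes, drop):5.2f}%")
```

With these invented inputs the three numbers come out around 2.0 percent, 0.5 percent, and -1.6 percent. The point isn’t the particular figures; it’s that seemingly small spreadsheet choices can flip the sign of the headline result.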

Whenever you model a system, you have to make some assumptions going in.  These assumptions–coupled with the coding error–basically show the statistician’s sleight of hand.  You can prove just about anything and everything with numbers if you manipulate the data enough. The UMass-Amherst professors have really pulled the curtain away from the big green talking head this time.  Here’s the abstract.

Herndon, Ash and Pollin replicate Reinhart and Rogoff and find that coding errors, selective exclusion of available data, and unconventional weighting of summary statistics lead to serious errors that inaccurately represent the relationship between public debt and GDP growth among 20 advanced economies in the post-war period. They find that when properly calculated, the average real GDP growth rate for countries carrying a public-debt-to-GDP ratio of over 90 percent is actually 2.2 percent, not -0.1 percent as published in Reinhart and Rogoff. That is, contrary to RR, average GDP growth at public debt/GDP ratios over 90 percent is not dramatically different than when debt/GDP ratios are lower.

The authors also show how the relationship between public debt and GDP growth varies significantly by time period and country. Overall, the evidence we review contradicts Reinhart and Rogoff’s claim to have identified an important stylized fact, that public debt loads greater than 90 percent of GDP consistently reduce GDP growth.

Shorter Abstract:  They REALLY screwed the pooch.

So, there’s now a bigger problem out there: the very serious people who are crashing economies based on a set of very biased assumptions and a very serious coding error.  To put that in a more politically correct way: How Much Unemployment Was Caused by Reinhart and Rogoff’s Arithmetic Mistake? Followed by: will the very serious people correct their very serious policy errors now?

This is a big deal because politicians around the world have used this finding from R&R to justify austerity measures that have slowed growth and raised unemployment. In the United States many politicians have pointed to R&R’s work as justification for deficit reduction even though the economy is far below full employment by any reasonable measure. In Europe, R&R’s work and its derivatives have been used to justify austerity policies that have pushed the unemployment rate over 10 percent for the euro zone as a whole and above 20 percent in Greece and Spain. In other words, this is a mistake that has had enormous consequences.

In fairness, there has been other research that makes similar claims, including more recent work by Reinhart and Rogoff. But it was the initial R&R papers that created the framework for most of the subsequent policy debate. And HAP has shown that the key finding that debt slows growth was driven overwhelmingly by the exclusion of 4 years of data from New Zealand.

If facts mattered in economic policy debates, this should be the cause for a major reassessment of the deficit reduction policies being pursued in the United States and elsewhere. It should also cause reporters to be a bit slower to accept such sweeping claims at face value.

I spent most of Monday’s Morning Reads on current US economic data showing that the deficit and debt of the US are melting like the wicked witch under that bucket of water.  I worry that any more deficit reduction will throw our economy into recession and pave the way for a Republican takeover of the Senate.  However, the big green talking heads have been ignoring the data and just about everything else a legion of economists have said, citing this one–now clearly flawed–study instead.  As Krugman mentioned in his blogs, its results were counterintuitive and controversial among economists from day one of publication.  Policymakers throughout Europe and the US gratuitously ignored all that because the questions did not fit their plan to push the mistakes of banks onto a lot of hapless citizenry.

I have to give Matthew Yglesias his due for this insight.  The damage is done, continues to be done, and more is planned, and I believe they will just ignore this very serious error.

This is literally the most influential article cited in public and policy debates about the importance of debt stabilization, so naturally this is going to change everything.

Or, rather, it will change nothing. As I’ve said many times, citations of the Reinhart/Rogoff result in a policy context obviously appeal to a fallacious form of causal inference. There is an overwhelming theoretical argument that slow real growth will lead to a high debt:GDP ratio, and thus whether or not you can construct a dataset showing a correlation between the two tells us absolutely nothing about whether high debt loads lead to slow growth. The correct causal inference doesn’t rule out causation in the direction Reinhart and Rogoff believe in, but the kind of empirical study they’ve conducted couldn’t possibly establish it. To give an example from another domain, you might genuinely wonder if short kids are more likely to end up malnourished because they’re not good at fighting for food or something. A study where you conclude that short stature and malnourishment are correlated would give us zero information about this hypothesis, since everyone already knows that malnourishment leads to stunted growth. There might be causation in the other direction as well, but a correlation study wouldn’t tell you.

The fact that Reinhart/Rogoff was widely cited despite its huge obvious theoretical problems leads me to confidently predict that the existence of equally huge, albeit more subtle, empirical problems won’t change anything either. As of 2007 there was a widespread belief among elites in the United States and Europe that reductions in retirement benefits were desirable, and subsequent events regarding economic crisis and debt have simply been subsumed into that longstanding view.
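To see why that reverse-causation point bites, here is a hedged little simulation, with entirely synthetic data and made-up parameters: growth follows its own persistent ups and downs, the debt/GDP ratio merely responds to growth, and yet the high-debt years still show lower average growth. A correlation like R&R’s simply cannot tell you which way the arrow runs.

```python
# Synthetic illustration of reverse causation: growth is generated on its own
# (a persistent AR(1) process) and debt/GDP only *responds* to growth.
# By construction, debt has zero effect on growth -- yet high-debt years
# still come out with lower average growth. All parameters are invented.
import random

random.seed(0)

def simulate_country(years=150):
    g = 0.03   # real growth rate, starting at 3%
    b = 0.65   # debt/GDP ratio, starting at 65%
    path = []
    for _ in range(years):
        # Persistent growth process: slumps and booms last several years.
        g = 0.7 * g + 0.3 * 0.03 + random.gauss(0, 0.02)
        # Debt dynamics: a steady 2%-of-GDP deficit, and the ratio's
        # denominator moves with output. Note b never feeds back into g.
        b = (b + 0.02) / (1 + g)
        path.append((b, g))
    return path

observations = [obs for _ in range(20) for obs in simulate_country()]

def mean(xs):
    return sum(xs) / len(xs) if xs else float("nan")

high = [g for b, g in observations if b > 0.90]
low = [g for b, g in observations if b <= 0.90]

print(f"mean growth when debt/GDP >  90%: {100 * mean(high):.2f}%  ({len(high)} country-years)")
print(f"mean growth when debt/GDP <= 90%: {100 * mean(low):.2f}%  ({len(low)} country-years)")
```

In runs like this the high-debt bucket typically comes out around a percentage point lower, even though the only causal channel in the code runs from growth to debt.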

The very serious policymakers were looking for any justification for their austerity pogrom.  This is mainly because German taxpayers and pols don’t want to be on the hook for what German and US bankers did around the Eurozone. It is also because Republican lawmakers and their plutocratic overlords–like the Dr. Strangelove of Wall Street, Pete Peterson–don’t want any funds floating around anywhere that could possibly find residency in their fee-churning Ponzi schemes of investment funds.

It is not unusual, unfortunately, for some academics to neatly choose assumptions that drive results toward their hypothesis.  That is why peer review is extremely important.  Nearly every major study done using empirical data should be easily replicable.  It is usual for authors to share their data, and R&R eventually obliged on this matter.  But this emphasizes why major studies with major findings that don’t fit snugly with the current body of theory should undergo robust challenge.  Many economists challenged the findings back in 2010, and the fact that some felt compelled to try to replicate the research indicates a healthy level of skepticism, which is the hallmark of good research and researchers.

What is most disturbing about this is that agendas serving the interests of a few start to rest on these theories-in-process.  R&R 2010 fit the gross ambitions of people who were less concerned with truth than with philosophy and the ability to drive policy that is basically at odds with everything we’ve known about fiscal policy.  So, this takes us back to Matt’s question. There is incredible discussion of this in nearly all economics and finance blogs and circles.  Will these findings engender the same discussion and any course correction by the very serious people who used this very serious paper to do some very serious damage around the world?

I know it’s too much to hope to hear those wonderful words “We were wrong” on top of some course corrections.  But hey, it’s not too late for our President to give up the debt-and-deficit hysteria. Or is it?


47 Comments on “The Dumb Math Error Heard ‘Round the World”

  1. bostonboomer says:

    Reinhart and Rogoff have responded.

    Here are their main points:

    We were only arguing association, not causality.

    Herndon, Ash et al. (the critique’s authors) still ostensibly got the same growth results for given levels of debt.

    The critique’s authors argue that a 1% growth differential between countries with high and low debt levels is small; R&R call that characterization “utterly misleading.”

    They have published a separate paper, with Vincent Reinhart, that addresses some of the data the authors accuse Rogoff and Reinhart of leaving out.

    The statement does not explicitly take on a charge in Herndon, Ash that they mis-entered some of their data into Excel.

    Full statement at the link.

    Thanks for writing this, Dak. Too bad the Very Serious People will pretend it never happened.

    • dakinikat says:

      I’ve been trying to read more about it … but have to wait until I’m done with students. Thanks for the heads up about it today.

      • bostonboomer says:

        You’re welcome. But were they really using Excel?!

      • dakinikat says:

        Yes. Everyone–including the sources of the original data, like the Fed or the World Bank–puts the data into Excel spreadsheet form for upload to the econometrics package. You can get the Excel files for all of Eugene Fama’s seminal papers on the random walk on his website. I had to replicate his studies for my seminar in investments class for my doctoral program. That’s where I learned he conveniently leaves out all the Great Depression data.

        They are just big old flat files of data.

        It’s also easier to do simple math manipulation of a variable, like taking a nominal value and using a price index to make it ‘real’. I put all my exchange rate data into US$ format, as an example. It’s easier to do the math that way and you can keep track of it. You just use the econometrics package for its statistical purpose, like regression or cointegration analysis.
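        For what it’s worth, here’s a toy version of that nominal-to-real step, with invented numbers, just to show the arithmetic that happens in the spreadsheet before anything reaches the stats package:

```python
# Deflating a nominal series with a price index (all numbers invented).
# real = nominal / (price_index / 100), so every year is expressed in
# base-year dollars before estimation.
nominal_gdp = [14_720, 15_518, 16_155]   # billions of current dollars
cpi = [100.0, 103.2, 106.5]              # price index, base year = 100

real_gdp = [n / (p / 100) for n, p in zip(nominal_gdp, cpi)]
print([round(x, 1) for x in real_gdp])   # each year in base-year dollars
```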

      • bostonboomer says:

        Why don’t economists use SPSS?

        • dakinikat says:

          It’s not mathy enough for econometrics. It works well for panel data like that used by sociology or poly sci people, but it’s missing a lot of the analysis tools. Most social scientists don’t use very sophisticated analysis and aren’t very comfortable with high-level statistics and math, and SPSS isn’t complete enough for economic or financial analysis. I use STATA, which reads Excel files. Sometimes I have to use more difficult programs where I have to code the formulas into the analysis myself. I do stuff with Markov chains, cointegration analysis, and statistics that a poly sci person just wouldn’t do. A lot of our math is closer to what physics researchers or extremely theoretical engineers use. SPSS is just way too weak-ass for that.

      • bostonboomer says:

        I see. I’m surprised there aren’t other stats packages for math then. BTW, neither sociology nor political science is science/statistics-based like psychology. I’m frankly insulted by the comparison. A lot of psychologists use SAS, but it’s too expensive for anyone who isn’t at a major institution.

        It sounds like you’re saying the data is entered in excel files and then analyzed by more powerful software. That’s the same thing we do. I like to enter things in Excel first because it’s so easy to use.

        • dakinikat says:

          There are, but a bunch of universities developed TSP, STATA, and other packages to make up for the lame math ability of SAS and SPSS. So there are specialty programs for econometrics now, and you don’t spend so much time programming things. SPSS and SAS are fine for folks running surveys or trying to look at populations of people. Most of their applications appeal to folks who run panel data and need limited functionality.

        • dakinikat says:

          It’s driven by the data you need to run … like surveys or typical panel data, which is simple correlation of characteristics. That kind of data tends to run in the social sciences, but marketing people and management people use it too.

      • bostonboomer says:

        Some areas of psychology use pretty sophisticated analysis. I guess I shouldn’t take it personally, but I can’t help it. Psychology is not analogous with sociology or political science in terms of quantitative analysis.

        • dakinikat says:

          I’m not saying that … if you use panel data, you have a limited number of things you can do with it … most econometrics studies are time series, and the numbers reflect different things. Of course, survey analysis is unsophisticated in terms of math, but it’s the nature of the beast, and marketing and poly sci people thrive on it.

      • bostonboomer says:

        Dak,

        Surveys are not at all representative of the kinds of data psychologists use! But I’ve had a tough couple of days. I can’t argue with you anymore.

        Thanks for a good post.

        • dakinikat says:

          I’m not saying surveys are representative of the data psychologists use, because I don’t know. I am saying SAS and SPSS are set up for panel data, and the statistics you need for that are really limited. The type of data you use drives which math tools you need; I’m just giving example uses for other fields. SPSS doesn’t cut it for econometrics because our data is different and requires math more akin to what engineers and physicists use, because we use dynamic models that involve rates of change and time. Panel data analysis doesn’t work for that at all. I must not be explaining this well. It’s like the difference between algebra and calculus. You need them for different things.

          • Well, there’s also ANOVA and other stuff in the social sciences, but anyway. I agree with BB, SPSS can get pretty sophisticated in psychology, especially for longitudinal studies and meta-analyses. But I can also see how it’s a different type of data and analysis. Interesting; I’m glad to hear the Excel was just for data entry, and that there’s an econometrics package.

      • NW Luna says:

        That’s interesting about SPSS. It’s used in a number of other healthcare disciplines besides psychology — physiology, physical therapy, nursing — studies looking at metabolic values, anatomical measurements, electrolyte levels, medication levels in blood, rate of cardiac events, and other quantitative data.

        But regardless of the program, if inappropriate or incomplete data are entered in …. garbage out!

    • dakinikat says:

      and wow, what a lame response …

      • dakinikat says:

        and from Dean Baker:

        http://www.cepr.net/index.php/beat-the-press/

        The gist of the response is that HAP also find that high debt is associated with slower growth, and that other studies (including one of theirs) found the same result anyhow.

        The first point is highly misleading. It is true that in most of their specifications HAP found growth was slower in periods with debt levels above 90 percent of GDP than below, but the gap was relatively small and nowhere close to statistically significant. Furthermore, they found a much bigger gap in growth rates around debt-to-GDP ratios of 30 percent. If we think that R&R’s methodology is telling us something important about the world, then the takeaway should be that we want to keep debt-to-GDP ratios below 30 percent.

        If R&R had produced the correct table in their initial paper no one would have taken seriously their claim that the 90 percent debt-to-GDP ratio presents some sort of cliff. The corrected table in no way supports that view.

        yeah, it was a teeny tiny error (see graph in post)

    • RalphB says:

      They should commit ritual suicide, though I don’t know a traditional method for economists. Perhaps death by millions of paper cuts, one for each life they helped ruin.

      • bostonboomer says:

        LOL

      • dakinikat says:

        I would say they’ve just seen their research creds go down the crapper, and no more “serious” book deals for them except from the AEI. This is basically the new Laffer curve. I am pretty sure their careers are cooked. Now they get the academic version of the ritual suicide affair: no more grad students, and guess what, you get to teach the worst section of the lowest freshman class on campus … really, though, they should do a lot of volunteer work in Greece instead. Good thing Iceland blew them off.

      • NW Luna says:

        Heh. Who’s going to sign up for a class from someone with a whoppingly huge “Excel error” publication? None of their students will believe a thing they say in lecture!

  2. bostonboomer says:

    The third person killed in the bombings was a grad student at BU, where I got my Masters and Ph.D. One person from my town and one from my university. I don’t think I can take any more.

    • bostonboomer says:

      Boston University student among the dead from Monday’s Marathon explosion

      One of the people killed in Monday’s Boston Marathon explosion was a Boston University graduate student, the university said this afternoon.

      Colin Riley, a BU spokesman, declined to release the student’s name, pending discussion with the family.

      According to BU Today news site, the student was one of three friends who had been watching the Marathon near the race finish line. One of the three was injured and is in stable condition at Boston Medical Center, according to the university.

      “She is doing well,” Robert Hill, dean of Marsh Chapel, who visited the surviving student, is quoted as saying in the BU Today site. “She has her friends around her, and she will soon have family around her.”

      The third BU student was unharmed.

      Earlier today, the Chinese consulate reported that one Chinese student from Boston University was “badly hurt” in Monday’s bombings.

      The Chinese consulate in New York identified that student as BU student Zhou Danling. Chinese media said Zhou is a graduate of Wuhan University in central China, and is a student at BU in actuarial science.

      “She cannot talk now but can communicate with pen and paper,” the consulate said in an e-mailed statement.

      The consulate said that one Chinese student was initially missing, and later found. Another Chinese student, identified by the consulate as Ms. Lv Lingzi, is still missing. Her name has also been rendered as Lingzi Lu. No further information is available.

      • bostonboomer says:

        I just got this e-mail from the BU Alumni Assoc.

        Dear Alumni and Friends:

        Our hearts and prayers go out to those impacted by the Boston Marathon bombings. We just learned that among those injured were two BU students, including one who was killed. The names of these members of our BU family are being withheld, but we can relay that the severely injured student is in stable condition. Words cannot express the sense of loss felt by the entire BU community.

        We know that all BU alumni around the world are deeply saddened by this news. There has been an enormous tide of support for Boston, the University and those whose lives have been changed forever.

        You can find up-to-date news at BU Today along with a statement from Boston University President Robert Brown. We also encourage you to continue to support one another by sharing experiences, words of encouragement and offering support to those in need on Facebook and Twitter.

        Thank you again for your continued support as our University and our BU family unite, recover and grow stronger during this time.

        Your BU Alumni Association

        I am devastated.

      • Fannie says:

        Am so so sorry…………

      • janicen says:

        I read that earlier. I didn’t say anything to you because there have been so many false reports and I didn’t want to be wrong. I’m so very sorry.

      • bostonboomer says:

        It just hits so close to home…

      • Beata says:

        I am feeling like JJ – helpless. I don’t know what to say. Words seem so trite and useless.

      • NW Luna says:

        Oh, this is so sad.

    • Beata says:

      Peace be with you, BB. Please take care of yourself.

  3. janicen says:

    Well it’s been a couple decades since I’ve cracked an Econ book but back in the day, I used to gather, analyze and present data for the chief corporate muckety mucks to use. This was before there were nifty software programs that do it all by themselves. But I have to say that if I ever produced a chart with the huge disparity showing between the 60-90 percent column and the 90 percent column, I’d have immediately gone back and rechecked my data and I can guarantee I would have been questioned about it. Seriously, just a cursory glance at that thing would lead me to believe there was an error.

  4. fiscalliberal says:

    Thank you for the analysis / comment – have you read Stockman’s book yet? Just got it and it is big.