Minnesota Macro: The Real Villains

The Krugosphere is hostile to the macroeconomics of the University of Minnesota. I understand that. Krugman has used the term “Dark Age Macroeconomics” to describe what took place between the late 1970s and today. I understand that, too.

But what happened in Minnesota could have stayed in Minnesota. Instead, Stan Fischer and Olivier Blanchard gave MIT’s blessing to DSGE models and vector autoregressions. To me, those two are the real villains.

Had Fischer taken his cues from, say, Clower and Leijonhufvud, rather than from Sidrauski and his ilk, macroeconomists might have spent the last 30 years working on interesting issues and gaining some better understanding of the economy. Instead, they spent the last thirty years diddling with fancy unverifiable equations and pouring a few globs of macro data into the VAR immersion blender.

The CBO on the Budget Outlook

Director Elmendorf warns,

CBO estimates that federal debt held by the public will equal 74 percent of GDP at the end of this year and 79 percent in 2024 (the end of the current 10-year projection period). Such large and growing federal debt could have serious negative consequences, including restraining economic growth in the long term, giving policymakers less flexibility to respond to unexpected challenges, and eventually increasing the risk of a fiscal crisis (in which investors would demand high interest rates to buy the government’s debt).

Some possibilities:

1. The CBO are Koch-funded austerians.

2. Just like the reduction in hours worked that the CBO forecasts for Obamacare, eventually increasing the risk of a fiscal crisis is actually a good thing.

3. More government debt gives the Fed more debt to buy, which in turn makes the stock market happy.

Paragraphs to Ponder

Two pointers from Reihan Salam.

Michael Schrage wrote,

America doesn’t have a jobless recovery; it has a hireless recovery. Don’t confuse them. After all, you first have to get hired to have a job. Organizations may be desperate to grow, but they overwhelmingly lack the desire to hire. Fewer people are working longer, harder and (presumably) smarter hours. So many firms have proven so productive even after several rounds of layoffs, that serious economists wonder if, in fact, large slices of the workforce actually offer ZMP — Zero Marginal Productivity — to their enterprise. In other words, the Great Recession reveals many employees not just to be worth less but economically worthless. Ouch.

For most organizations, people are a means and medium to an end. They’re not hiring employees, they’re hiring value creation.

One point I am starting to harp on is that many workers are not concurrently productive. That is, their work does not add to output today; it helps the firm be more productive in the future. That means that when firms think about hiring they have a lot of discretion (we can meet the demand for widgets today without adding new people) and they face a lot of uncertainty (will these social media marketers really deliver us new customers?).

Schrage again:

What’s structurally changed is not the job but why people get hired. In other words, is hiring someone really essential to getting the job done? Just as important, as we look at employment costs, risks and uncertainties over the next five years, is hiring someone the most cost-effective way to get the job done?

David Levinson wrote,

[in the year 2030] Firms also are not interested in paying for training, so most people now go through a 10-year unpaid internship while simultaneously attending school online and engaging other pursuits on a more or less random schedule.

John Cochrane on Online Teaching

He writes,

Don’t dream of doing a mooc on your own. You need video and IT help. Most of all, you need pedagogical help, people who keep up with the fast-evolving art of how to successfully port classes on moocs. I had that help at the University of Chicago, and it saved me from horrible beginner blunders. Example: I wanted to tape my live classes. No, Emily, who was in charge of my class, insisted that we do it months ahead of time in 5-8 minute segments.

In fact, it takes considerable time and effort to come up with an effective, compelling short video. A typical lecture is way too long and way too boring to translate into the online world. One of the leading MOOC suppliers offered a statistics course in which the professor opened up with a 20-minute lecture on histograms. I have to assume that the course was a total failure. As Cochrane puts it,

no question about it, the deadly boring hour and a half lecture in a hall with 100 people by a mediocre professor teaching utterly standard material is just dead, RIP. And universities and classes which offer nothing more to their campus students will indeed be pressed.

One of Cochrane’s main points is that online education really underscores the fixed cost in lesson preparation. Consider that it probably takes much more work to create an effective online lesson than it does to put a lesson in the form of a textbook. Yet anyone who has ever written a textbook can tell you that it is difficult and painstaking, so imagine what it would take to do an entire online course as well as you possibly could.

I believe that it is unlikely that any one person can create an entire course as a MOOC using today’s tools and make anywhere close to the best use of the online medium. Perhaps the tools will get much better. But meanwhile, I would recommend that would-be online instructors focus on producing really good lessons, as opposed to entire courses.

Suppose you can produce ten high-quality lessons of 8 minutes or less. This may take hours and hours of planning, scripting, editing, and so on. It will not cover an entire course. But if you then combine it with other lessons that are available on line, you can cobble together a high-caliber course. That is one scenario for how online education might develop over the next few years.

The Case Against VARs

In a comment on this post, Noah Smith commended to me the work of George-Marios Angeletos of MIT. Unfortunately, Angeletos is fond of vector autoregressions (VARs), which I detest.

I got my start in macro working on structural macroeconometric models. I saw them close up, and I am keenly aware of the problems with them. Hence, I wrote Macroeconometrics: The Science of Hubris.

However, I will give the old-fashioned macroeconometricians credit for at least worrying about the details of the data they are using. If there are structural factors that are changing over time, such as trend productivity growth or labor force participation, the macroeconometrician will keep track of these trends. If there are special factors that change quarterly patterns, such as the “cash-for-clunkers” program that shifted automobile purchases around, the macroeconometrician will take these into account.

The VAR crowd cheerfully ignores all the details in macro data. The economist with a computer program that will churn out VARs is like a 25-year-old with a new immersion blender. He does not want to spend time cooking carefully-selected ingredients. He just wants to throw whatever is in the pantry into the blender to make a smoothie or soup. (Note that I am being unfair to people with immersion blenders. I am not being unfair to people who use VARs.)
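To make the "immersion blender" point concrete, here is a minimal sketch (in Python, with made-up data) of how little the mechanical part demands of the analyst: estimating a two-variable VAR(1) is just a least-squares regression of each series on the lagged values of all series, with nothing in the procedure that forces you to confront trends, structural shifts, or special factors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "macro" series: two variables, 200 quarters, with no
# attention paid to any special factors -- exactly the practice
# criticized above. A_true is a hypothetical lag-1 coefficient matrix.
T = 200
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# "Fit" the VAR(1) by ordinary least squares: regress y_t on y_{t-1}.
X = y[:-1]   # lagged values (regressors)
Y = y[1:]    # current values (dependent)
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat = coef.T   # estimated coefficient matrix, comparable to A_true

print(np.round(A_hat, 2))
```

The computer will spit out a coefficient matrix for whatever series you feed it, whether or not the data have been cleaned of the cash-for-clunkers-style quirks an old-fashioned macroeconometrician would have flagged.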

The VAR appeared because economists became convinced that structural macroeconometric models are subject to the Lucas Critique, which says that as monetary policy attempts to manipulate demand, people will adjust their expectations. My reaction to this is

(a) The Lucas Critique is a minor theoretical curiosity. There are much worse problems with macroeconometrics in practice.

(b) How the heck does running a VAR exempt you from the Lucas Critique? A VAR is no less subject to breakdown than is a structural model.

The macroeconometric project that I first worked with is doomed to fail. Implicitly, you are trying to make 1988 Q1 identical to 2006 Q3 except for the one causal factor with which you are concerned. This cannot be done. There is too much Manzian causal density.

The VAR just takes this doomed macroeconometric project and cavalierly ignores details. It is not an improvement over the macroeconometrics that I learned in the 1970s. On the contrary, it is inferior. And if the big names in modern macro all use it, that does not say that there is something right about VAR. It says that there is something wrong with all the big names in modern macro. On this point, Robert Solow and I still agree.

Trends in Faculty and Administration

Timothy Taylor comments on a recent report.

When it comes to employment, colleges and universities have tried to hold down faculty costs in dealing with the expanding numbers of students by the use of time-contract faculty and part-timers. The nonprofessional staff are dealing with the increased number of students by using improved information technology and other capital investments, without a need for a higher total number of staff. But the number of professional staff is rising, both in absolute terms and relative to the number of students…

I’ll only add that institutions are defined by their people. As the full-time and tenured faculty become a smaller share of the employees of the institution and the professional administrators become a larger share, the nature and character of the institution inevitably changes. In this case, colleges and universities have become less about faculty, teaching, and research, and more about the provision of professional services to students and faculty. As far as I know, this shift was not planned or chosen, and the costs and benefits of such a shift were not analyzed in advance. It just happened.

My comments:

1. Perhaps this parallels shifts in other sectors of the economy. That is, we have fewer front-line production workers and more people working on building organizational capital.

2. The value of the organizational capital provided by non-teaching staff in education seems particularly nebulous because the measure of value in education is particularly nebulous.

3. In other sectors, the number of production workers per unit of output probably is falling faster than in higher education.

4. In other sectors, information technology has had more profound effects on the process of providing goods and services. People suspect that bigger changes are in store in education, once people figure out the best ways to apply information technology. I offered my guesses here. Some of these possibilities could lead to a dramatic reduction in the number of professors per student and also in the number of professors per organizational-capital builder in education.

What I’m Reading

I finished Gregory Clark’s new book. I put it in the must-read category. I hope to publish a review on line in the next few months.

I am now reading Fragile by Design by Charles Calomiris and Stephen Haber. I posted a few months ago on an essay they wrote based on the book. Yesterday I also attended an “econtalk live,” where Russ Roberts interviewed the authors in front of a live audience for a forthcoming podcast. You might look forward to listening; the authors are very articulate and they speak colorfully, e.g. describing the United States as being “founded by troublemakers” who achieved independence through violence, as opposed to the more boring Canadians.

I think it is an outstanding book, although in my opinion it is marred by their focus on CRA lending as a cause of the recent financial crisis. This is a flaw because (a) they might be wrong and (b) even if they are right, they will turn off many potential readers who might otherwise find much to appreciate in the book. Everyone, regardless of ideology, should read the book. It offers a lot of food for thought.

I am only part-way through it. The story as far as I can tell is this:

1. There is a lot of overlap between government and banking. Governments, particularly as territories coalesced into nation states, needed to raise funds for speculative enterprises, such as wars and trading empires. Banks need to enforce contracts, e.g., by taking possession of collateral in the case of a defaulted loan. Government needs the banks, and the banks need government.

2. If the rulers are too powerful, they may not be able to credibly commit to leaving banks’ assets alone, so it may be hard for banks to form. But if the government is not powerful enough, it cannot credibly commit to enforcing debt contracts, so again it may be hard for banks to form.

3. Think of democracies as leaning either toward liberal or populist. By liberal, the authors mean Madisonian in design, to curb power in all forms. By populist, the authors mean responsive to the will of popular coalitions of what Madison called factions.

4. If you are lucky (as in Canada), your banking policies are grounded in a liberal version of democracy, meaning that the popular will is checked, and regulation serves to implement a stable banking system. If you are unlucky (as in the U.S.), your banking policies are grounded in the populist version of democracy. Banking policy reflects a combination of debtor-friendly interventionism and regulations that favor rent-seeking coalitions who shift burdens to taxpayers. The result is an unstable system.

I may not be stating point 4 in the most persuasive way. I am not yet persuaded by it. In fact, I think libertarians will be at least as troubled as progressives are by some of the theses that the authors promulgate.

Motivated Reasoning

Dan M. Kahan and colleagues write,

If high Numeracy subjects use their special cognitive advantage selectively—only when doing so generates an ideologically congenial answer but not otherwise—they will end up even more polarized than their low numeracy counterparts. Such a result, while highly counterintuitive from the perspective of SCT [“science comprehension thesis”], would be consistent with the view of a smaller group of scholars who take the view that identity-protective cognition operates on both heuristic and systematic—System 1 and System 2—forms of information processing (Cohen 2003; Giner-Sorolla & Chaiken 1997; Chen, Duckworth & Chaiken 1999; Kahan 2013). It would also be consistent with, and help to explain, results from observational studies showing that the most science comprehending citizens are the most polarized on issues like climate change and nuclear power.

Read the draft paper. I got to the link from Jonathan Haidt, whose pointer I in turn got from Tyler Cowen.

I think that there is a general moral here, but I am not sure how to phrase it. Maybe something along the lines of, “Try to find the holes in your own most strongly-held beliefs.”

What Should Austrian Macroeconomics Resemble?

Noah Smith writes,

So it basically seems to me that the New Classicals captured and improved on the basic ideas of the Austrians in almost all of the ways that matter, while vastly improving on the presentation. New Classical concepts of rationality, distrust of empiricism, and distrust of government intervention are more moderate and nuanced than those of the Austrians, and their mathematical style is simply much more appealing to modern academics than the dense, turgid prose of von Mises or Hayek. Thus, if you were a smart young macroeconomist in 1980 who believed that people were both rational and smart, that government intervention was a bad idea, and that theory was the best way to investigate human behavior, you did not become an Austrian; you became a New Classical.

Not me!!

What I don’t like about Austrian macro is the focus on the central bank as the sole source of distortions. But I see New Classical as horrible along almost every dimension. I do not like the math. I do not like rational expectations (it is a very anti-Hayekian notion, that we all have the same information). I do not like the representative-agent formulation, because it rules out important co-ordination problems. I do not like the production function, which ignores the roundaboutness of production and what Fischer Black emphasizes, which is that people invest in all sorts of physical and human capital under conditions of uncertainty, and sometimes their capital ends up not so valuable.

I do not like AS and AD, which I think channel people’s thinking narrowly. AS and AD are like a pair of glasses that make you see the world only in black and white and in two dimensions. Sometimes, simplification is good, but not when you miss the color and the depth.

I do not like a priorism, but I see macro as only faux-empirical. As Noah pointed out in a previous post, the macro equations are not verified (or even verifiable) in the real world.

Moreover, the data with which macroeconomists work is very problematic. Who can be happy with how the money supply is measured? Or prices? Or even GDP? What is the GDP of Google? Its measured economic activity consists of advertisements sold. Do we think that measures Google’s output?

Maybe the worst-measured variable of all is productivity. How many workers in the U.S. are in large organizations where they spend time reading email, producing reports, and going to meetings? I am going to go out on a limb here and say that these are, on average, productive activities. They produce some sort of organizational capital. But they do not produce output in the here and now. So if you divide this month’s output by this month’s hours spent on the job, that is inaccurate, because a lot of this month’s work is about output in later periods.
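A toy calculation, using entirely hypothetical numbers, shows how large the mismeasurement can be: if a share of this month’s hours goes to building organizational capital that pays off in later periods, then dividing this month’s output by all hours understates the productivity of the hours actually devoted to current output.

```python
hours = 160.0           # hours worked this month (hypothetical)
current_output = 100.0  # units of output shipped this month (hypothetical)
future_share = 0.3      # assumed share of hours spent on email, reports,
                        # and meetings that build organizational capital

# Conventional measure: all hours go in the denominator.
measured = current_output / hours

# Productivity of the hours actually spent producing current output.
concurrent = current_output / (hours * (1 - future_share))

print(round(measured, 3), round(concurrent, 3))  # → 0.625 0.893
```

Under these made-up numbers, measured productivity understates concurrent productivity by roughly 30 percent, and the gap grows with the share of work aimed at later periods.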

Finally, my first forays into macroeconomics (see my book) were when macroeconometricians tried to pay attention to special factors that messed with their data–steel strikes, tax-law changes, automobile sales-incentive programs, and so on. Now, it’s like “We don’t care. Stick everything into a vector-autoregression and accept whatever the computer spits out.”

The bottom line: Austrian economics ought to resemble PSST, not New Classical.