How is 2021 inflation like 1946 inflation?

Economic advisers Cecilia Rouse, Jeffery Zhang, and Ernie Tedeschi write,

The period right after World War II potentially provides the most relevant case study, as the rapid post-war inflationary episode was caused by the elimination of price controls, supply shortages, and pent-up demand.

But in 1946 government spending was headed down. In the 1950s, the government was running large primary surpluses (tax revenues were higher than non-interest government spending). Today, the fiscal picture is completely different.

[UPDATE: I listened to Niall Ferguson, and near the end of the interview he says that the best historical analogy is with the 1960s, when inflation started out low and ended up at 6 percent. Then came the 1970s.]

Pent-up inflation

Scott Sumner writes,

There are many reasons why higher wages might not be the optimal way to address a labor shortage. One problem is downward wage stickiness after the economy returns to normal

There is an assumption that “normal” means that you can hire all the workers you want at the same wage rate you were used to paying.

My view: the government has showered the economy with paper wealth, raising the ratio of dollar wealth to wages and prices. The way this will resolve itself is for wages and prices to rise. Firms are in denial about this for now. So inflation remains pent up. I think that prices will rise first, as firms realize that they can make price increases stick and in fact they have to raise prices to ration demand, because they cannot hire enough workers to meet demand at current price levels. Then wages will rise as workers realize they have to seek higher pay in order to keep up with the cost of living. Workers are certainly in a frisky mood, with many quitting and others contemplating doing so.
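
To make the arithmetic concrete, here is a stylized sketch of that mechanism; the symbols W (nominal paper wealth), P (the price level), Y (real income), and k (the desired ratio of wealth to nominal income) are just for illustration.

```latex
% A stylized sketch of the "ratio of dollar wealth to wages and prices" mechanism.
% W, P, Y, and k are illustrative symbols introduced here for exposition.
\frac{W}{PY} = k
\quad\Longrightarrow\quad
P = \frac{W}{kY}
\quad\Longrightarrow\quad
\frac{\Delta P}{P} \;\approx\; \frac{\Delta W}{W} - \frac{\Delta Y}{Y}
```

On this reading, if paper wealth has been pushed up much faster than real income, then the price level (and eventually wages) has to rise to restore the ratio. That is the sense in which the inflation is pent up.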

As I see it, the economy is groping toward a new higher equilibrium price level. The financial markets, and Scott, disagree with me on that.

The industrial revolution: how did workers eat?

Davis Kedrofsky writes,

There was an Industrial Revolution, and it was slow.

The debate’s been over for two decades. The gradualists, armed with new growth accounting techniques, have won, exorcising forever the Ashtonian vision of an eighteenth-century leap into exponential growth. Where historians once thought the spinning jenny, steam engine, and factory system the immediate preconditions of England’s dramatic transformation, they now accept that these advances emerged in a trickle in a few tiny sectors.

Pointer from Tyler Cowen.

I believe that the gradualists may have it wrong. Here is a clue, from later in Kedrofsky’s essay.

And a growing share of the population moved out of agriculture and into that increasingly-recognizable branch of manufacturing—from 33.9 percent in 1759 to 45.6 percent in 1851.

How did these manufacturing workers eat? There had to be a significant increase in agricultural productivity. But I am guessing that a lot of the increase did not take place on farms. Here is Wikipedia.

The British Agricultural Revolution, or Second Agricultural Revolution, was an unprecedented increase in agricultural production in Britain arising from increases in labour and land productivity between the mid-17th and late 19th centuries. Agricultural output grew faster than the population over the century to 1770, and thereafter productivity remained among the highest in the world. This increase in the food supply contributed to the rapid growth of population in England and Wales, from 5.5 million in 1700 to over 9 million by 1801, though domestic production gave way increasingly to food imports in the nineteenth century as the population more than tripled to over 35 million

I want to focus on the last point, about food imports. Borrowing a trope from David Friedman, I would say that British manufacturing workers were very productive at growing foodstuffs. They produced manufactured goods, put them on ships, and the ships came back with foodstuffs.

Suppose that productivity did not change in either agriculture or manufacturing between 1800 and 1850. A worker could produce a bushel of grain per day in either year, and a worker could produce a bolt of cloth per day in either year. But if a bolt of cloth trades for two bushels of grain, then moving a worker out of agriculture and into manufacturing means that worker now effectively produces twice as much grain.
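
Here is a minimal sketch of that arithmetic, using the illustrative numbers above (one bushel or one bolt per worker-day, a 2:1 terms of trade) and shares that roughly echo the 1759 and 1851 manufacturing shares quoted earlier; none of this is historical data.

```python
# Illustrative only: per-worker outputs and the 2:1 terms of trade come from the
# hypothetical in the paragraph above; the 34% and 46% shares roughly echo the
# 1759 and 1851 manufacturing shares quoted from Kedrofsky.

BUSHELS_PER_FARM_WORKER = 1.0   # grain a worker grows directly, per day
BOLTS_PER_MILL_WORKER = 1.0     # cloth a worker weaves, per day
BUSHELS_PER_BOLT = 2.0          # grain obtained abroad for each bolt exported

def grain_per_day(workers: float, manufacturing_share: float) -> float:
    """Grain available per day: grain grown at home plus grain imported for cloth."""
    mill_workers = workers * manufacturing_share
    farm_workers = workers - mill_workers
    grown = farm_workers * BUSHELS_PER_FARM_WORKER
    imported = mill_workers * BOLTS_PER_MILL_WORKER * BUSHELS_PER_BOLT
    return grown + imported

for share in (0.0, 0.34, 0.46):
    print(f"manufacturing share {share:.0%}: {grain_per_day(100, share):.0f} bushels/day")
# 0%: 100, 34%: 134, 46%: 146 bushels/day
```

The point of the toy example is that measured productivity within each sector can be flat while trade raises the amount of food the same workforce commands.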

Adam Smith and David Ricardo knew more about how living standards improve than do modern productivity historians. And my guess is that the transformation that people who were alive around 1800 were seeing with their own eyes was real, today’s gradualists notwithstanding.

UPDATE: A commenter points to Anton Howes.

The push thesis implies agricultural productivity was an original cause of England’s structural transformation; the pull thesis that it was a result. The evidence, I think, is in favour of a pull — specifically one caused by the dramatic growth of London’s trade.

On the larger question, Howes is a gradualist.

Culture changes one funeral at a time

Tanner Greer writes,

Cultures do not change when people replace old ideas with new ones; cultures change when people with new ideas replace the people with old ones.

With Wokism in ascendance among the young, it is hard to see how I am going to live to see its recession.

Or, as Greer puts it,

The millennials are a lost generation; they will persist in their errors to the end of their days. Theirs is a doomed cohort—and for most of the next two decades, this doomed cohort will be in charge.

Have a nice day.

The virus, public health, and science

In an interview, epidemiology professor Vinay Prasad says,

Another great failure is that we didn’t learn a lot. We did so many different interventions, but we didn’t actually study many of them. For example, there are still questions about how much to wear masks, and under what circumstances. We don’t know much more about that than when the pandemic began.

If I had been in charge, this would have been different. You will recall that I clamored for rigorous testing very early on. He also says,

Zoom allowed a lot of upper-middle-class white-collar people the ability to work and make money and not lose their jobs, and to exclude themselves from society. That fundamentally changed the pandemic. If you went back 15 years ago, and you didn’t have Zoom, you would be facing unprecedented layoffs of wealthy, upper-middle-class people. I think a lot of businesses would have had staggered schedules and improved ventilation. Schools would have pushed to reopen. Amazon Prime and Zoom and all these things in our lives allowed a certain class of people to be spared the pains of COVID-19, taking them out of the game, and making them silent on many of the issues that affected other communities.

Central bank digital currency

Timothy Taylor has a helpful post. He cites the potential to lower the cost of making payments compared with using bank deposits. But then

At least to me, other advantages sometimes cited for a central bank digital currency often miss the point. For example, one will sometimes hear claims that the Fed needs a digital currency to compete with the cryptocurrencies like Bitcoin and Ethereum. But it’s not at all clear to me that these cryptocurrencies are anywhere near unseating the US dollar as a mechanism for payments, and it’s quite clear to me that competing with Bitcoin is not the Fed’s job. Or one will hear that because other central banks are trying digital currencies, the Fed needs to do so, also. My own sense is that it’s great for some other central banks to try it out, and for the Fed to wait and see what happens. There is a hope that zero-cost bank accounts at the Federal Reserve might help the unbanked to get bank accounts, but it’s not clear that this is an effective way to reach the unbanked (who are often disconnected from the financial sector and even the formal economy in many ways), and there are a number of policy tools to encourage banks to offer cheap or even zero-cost no-frills bank accounts that don’t involve creating a central bank digital currency.

For now, I file CBDC under “solutions in search of a problem.”

Martin Gurri watch (from Matt Taibbi)

Matt Taibbi writes,

Just as the Internet allows ordinary people to DIY their way through everything from stock trading to home repair, they now have access to tools to act as their own doctors, from caches of medical papers at sites like pubmed.gov to symptom-checkers to portals giving them instant looks at their own test results — everything they need, except of course the years and years of training, experience, and practice, and therein lies the rub.

. . .Doctors around the world have expressed frustration at the “populist treatment,” as it’s become common in Central and South America in particular for poor people to defy authorities and self-medicate with ivermectin. Experts frequently associate the drug with “pharmaceutical messianism,” i.e. politicians promising panacea cures, often in conjunction with rhetoric bashing experts and credentialed authorities.

Thus the revolt of the public plays out in medicine.

The root of the problem

I write,

Higher education has turned into a self-licking ice cream cone, meaning an institution that has lost its sense of purpose and instead is focused on self-perpetuation. For many reasons, we need to do away with college as we know it.

Broadly speaking, we need to replace two aspects of college. One is the process of obtaining knowledge and demonstrating what one has obtained. The other is the rest of the college experience—its extracurricular aspects.

Read the rest.

Some of you are disappointed that I am not all in on the fight against CRT in K-12. My thinking is that even winning that fight (in some jurisdictions) won’t change college campuses. Maybe K-12 is a battle worth fighting, but it won’t win the war. K-12 is only a branch of the problem, and cutting it off will not take care of the root of the problem. The root of the problem can be found on the college campus.

There are organized efforts to try to save college as an institution. In the essay, I instead suggest that we do away with it.

All about progressivism

Michael Lind thoroughly dissects the progressive movement.

whenever you read the phrase “public interest group” or “social justice organization,” you should substitute it with “billionaire-or-corporate-funded social engineering bureaucracy.”

That is one tiny slice. Read the whole thing.

On somewhat similar lines, Musa Al-Gharbi writes,

Since the publication of Anand Giridharadas’ best-selling Winners Take All (Knopf 2018), there has been a good deal of attention to how the super-rich use philanthropy as a means of shaping society in accordance with their own tastes and interests under the auspices of helping others – often exacerbating the very problems they claim to be trying to solve. However, millionaires and billionaires are not capable of creating, enforcing, managing and perpetuating society and culture all on their own.

More realistically, to understand whose interests are being served by a social order – to see how it is formed, reproduced and sustained – we should look at the upper quintile of society, the top 20%.

This is from the introduction to his own forthcoming book. One more excerpt:

Symbolic analysts’ dominance over knowledge production, culture, institutional bureaucracies, etc. often affords us even more clout than our (relatively high) incomes would suggest. And no less than the super-rich, (we) symbolic analysts attempt to shape ‘the system’ in accordance with our own will and priorities. We facilitate the operation of the prevailing order, ensure its continued viability, and implement reforms.

Pointer from Tyler Cowen.

Price discrimination explains everything

Tyler Cowen writes about what should be taught more in econ grad school,

Price discrimination. They do it to you more and more! Or perhaps you are striving to do it to others. This is typically covered in a first-year sequence, but how many second-year students really have mastered when it is welfare-improving or not? How it relates to product tying? When it is sustainable against entry or not?

If I were in charge of undergraduate economics, no one would come to graduate school needing to learn about price discrimination. When I taught AP economics in high school, I taught that price discrimination explains everything. That is, most real-world business practices that might seem odd can be explained as attempts to charge more to the consumers with the least elastic demand. An intermediate undergraduate microeconomics course ought to spend a lot of time on the topic of price discrimination.
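
For readers who want the mechanics, here is a stylized sketch of the textbook inverse-elasticity rule, with made-up segments, elasticities, and marginal cost: a firm that can keep markets separate prices each segment at P = MC / (1 - 1/|e|), so the segment with the least elastic demand pays the most.

```python
# Textbook inverse-elasticity (Lerner) pricing with two hypothetical segments.
# The segment names, elasticities, and marginal cost are illustrative assumptions.

def segment_price(marginal_cost: float, elasticity: float) -> float:
    """Profit-maximizing price for a segment with constant price elasticity |e| > 1."""
    e = abs(elasticity)
    if e <= 1:
        raise ValueError("the rule applies only when demand is elastic (|e| > 1)")
    return marginal_cost / (1 - 1 / e)

MC = 10.0  # hypothetical marginal cost, the same for both segments
segments = {
    "students (elastic, e = -4)": -4.0,
    "business travelers (inelastic, e = -2)": -2.0,
}

for name, elasticity in segments.items():
    print(f"{name}: price = {segment_price(MC, elasticity):.2f}")
# students: 13.33, business travelers: 20.00 -- the less elastic segment pays more.
```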

The fundamental question that economics grad schools face is whether to teach math or economics. When I was in grad school, the answer was to teach math. The case for teaching math is that to succeed in the profession you need to be able to “use the tools.”

I think that, at the margin, economics graduate students should study more economics. Economic history is very much worth studying. Financial institutions are worth studying. Calomiris and Haber invite the reader to contemplate financial institutions, history, and public choice.

Intangible factors in the economy are worth studying. You could fill at least a semester with a course on organizational capital, institutions, innovation, trust, etc.