Modeling the passage of time is notoriously difficult in economics. Living the passage of time is much easier. Each year’s Nobel Prize turns up as a subject of discussion fifteen months later on the program of the meetings of the American Economic Association. Since one of the main functions of the meeting is the continuing education of the professoriate, it’s a highly desirable progression.
So to celebrate the 2010 award to Peter Diamond, Dale Mortensen and Christopher Pissarides, for what is known as the DMP model of labor markets and unemployment levels, there were many sessions on joblessness and a grand plenary luncheon talk by Robert Hall, of Stanford University.
Hall sought to show how the DMP model – the most realistic account available, he said, based on a complete statement of underlying principles of turnover, job-filling and -finding rates and wage determination – explains current high rates and the lingering of joblessness.
A dramatic increase in unemployment benefits couldn’t explain the problem, he said – because no such increase had taken place. Neither could the trend in productivity – it fell, as usual, in the recession, then rebounded sharply, while unemployment remained at 10 percent. Perhaps diminished inflation was responsible, he said, thanks to a certain form of wage stickiness as described recently in a modification of the standard model, in which employers are mindful of the amount of inflation that has occurred since the last time wages were set; more inflation plus stale wages mean more hiring.
With lower inflation as the result of slack conditions that have prevailed since 2007, real wages paid to new hires are elevated. The payoff from hiring new workers is correspondingly lower. It takes a higher job-filling rate to justify new hires [but]… the job-finding rate is lower and unemployment is higher. Sticky wages are not just something Keynes thought up: they make economic sense.
To many in an audience of several hundred, composed mainly of those who were not members of the macro-labor research community, the intensely technical nature of the talk, complete with diagrams, was a sign of a profession still deeply at odds with itself. For many economists, the well-known empirical finding by Carmen Reinhart and Kenneth Rogoff, that recovery from a banking crisis ordinarily requires five or six years, carries more weight.
Diamond, of the Massachusetts Institute of Technology, who laid the groundwork in the 1970s and ’80s with models of search processes in markets with various frictions, noted that, until the news of his award, he hadn’t looked at the search literature for nearly twenty years, and was therefore reacquainting himself with what had gone on in the interim. Pissarides, of the London School of Economics, stayed home to attend the birth of a new baby. His friend Yannis Ioannides, of Tufts University, gazed down from the platform in his stead.
So it was left to Mortensen, of Northwestern University, to savor triumph in his home town, and to bask in the glow of a series of panels over three days devoted mainly to one of the most vigorous empirical programs in all of economics, the result of a patient research program he devised forty years ago that has finally paid off. Economists may not yet understand very well why the most dangerous crisis in 75 years happened, but they know a great deal more than they used to know about what happened as a result.
The big news of the meetings, therefore, was surely Bengt Holmstrom’s presidential address to the Econometric Society. The profession has been struggling to understand how a relatively small shock in housing markets could bring international trade to a grinding halt for two months and push the world economy to the edge of a global depression. Holmstrom, of MIT, contributed a vital piece of the puzzle.
Sometimes, Holmstrom said, ignorance is bliss.
The common view of the crisis, he noted, is that it had been caused by Wall Street greed and bad incentives. Banks, through the newly invented process of securitization, had created financial instruments of baroque complexity that nobody really understood. The originate-and-distribute banking model had caused reckless lending. Credit-rating agencies had depended on the mechanical application of inappropriate formulae to evaluate risk. This was pretty much the problem, he said, as described by author Michael Lewis in The Big Short.
But what if a certain kind of desirable opacity, suddenly lost, was the heart of the matter? What if liquidity ultimately depends on a regime of “no questions asked”?
After all, Holmstrom said, there are plenty of examples of purposeful opacity in the everyday world. The South African De Beers syndicate sells uncut diamonds only in bags containing hundreds of gems, graded to certain standards. The well-functioning market for diamonds depends on continuing trust in De Beers.
Fractional-reserve banking depends on the assumption that all banks are equally safe. When one bank or another came under suspicion in the nineteenth century as liable to fail, bank clearinghouses “circled the wagons”: they ceased publishing audited data for individual banks and offered only aggregate figures, insisting instead on the solvency of the system as a whole.
Cash money is the most opaque asset of all, Holmstrom noted. It is backed by nothing more than faith in the stability of the government. Yet when questions arise about, say, counterfeiting, many establishments refuse to accept $100 bills.
Something of the sort happened to money markets in the crisis, Holmstrom said – not in the familiar retail markets for bank deposits, but in the enormous and for the most part unregulated lending among investment banks, money market funds, corporations and other institutions, collectively known as the shadow banking system. There, $1 trillion in “repurchase agreements,” in effect overnight demand deposits among giant institutions, was regularly rolled over every morning – until mutual fears among counterparties began to spread, especially after September 2008, when Fannie Mae and Freddie Mac, the two government-sponsored entities that made the market in mortgage-backed securities, were placed in conservatorship.
It is these markets, which depend on debt-like instruments about which no questions are (ordinarily) asked – in which guarantees take the place of transparency (as a rule) – that began to shut down rapidly in the autumn of 2008, precisely because questions were being asked about their backing. Suddenly private information had value. When the collapse of Lehman Brothers eroded the overall systemic guarantee, Holmstrom said, the result was a classic banking panic – but out of sight of all but those most intimately involved.
A largely opaque system is favored by private vendors, Holmstrom noted, because it effortlessly expands liquidity before the fact and facilitates economic growth. In another session, as if to buttress the point, Gary Gorton, Stefan Lewellen and Andrew Metrick, all of Yale University, showed that the “safe asset share” of information-insensitive debt – government bonds, demand deposits, money market funds and collateralized repurchase agreements – had remained remarkably stable at around 33 percent of GNP since 1952. When government provision of debt declined, private production took up the slack, and vice versa. This surprising fact had previously gone unnoticed, presumably because theory often determines what is observed.
Opacity poses two kinds of problems, Holmstrom told his audience of perhaps five hundred persons. One is that it is vulnerable to a discontinuous transition from the no-questions-asked state to another, in which it pays to create private information. If that happens, panic can easily be the result. The other problem is that opacity hides systemic risk.
The former problem can be addressed by two kinds of regulation: more transparency in the normal state, when a little more information won’t hurt (publishing net asset value for money market funds daily, for instance, instead of with the current two-month delay); less transparency in the crisis state (putting toxic assets in bigger, recapitalized bags, for instance, as banking authorities quickly did during the Scandinavian banking crisis in 1991-92, or, somewhat more slowly, as US authorities did in 2008-09).
The latter problem – the accumulation of systemic risk – means that outside monitoring will be required. (As if on cue, the newly established US Office of Financial Research last week released its first official working paper – A Survey of Financial Risk Analytics, by Mark Flood, of the OFR; and Dimitrios Bisias, Andrew Lo and Stavros Valavanis, all of MIT.)
Holmstrom’s presidential address isn’t written yet. For his Chicago presentation, he again relied on slides (he has given the same talk before many different audiences). In due course the talk will appear as an essay in Econometrica, the journal of the Econometric Society.
But the new view it represents is slowly making its way through the profession. A formal model, with co-authors Gorton, of Yale, and Tri Vi Dang, of the University of Mannheim, will appear eventually as well. The nature of the rude surprise that overtook banking authorities around the world in the summer of 2007 is slowly being explicated.
Meanwhile, Laurence Kotlikoff, of Boston University, announced plans to run for president of the United States, at least under the banner of the third-party movement headquartered at AmericansElect.com. Kotlikoff, a specialist in public finance, is a natural comedian, with good timing. But that may not be enough to justify his entry into an already crowded field.
3 responses to “Continuing Education”
I am puzzled by Hall’s comment that unemployment benefits have not increased. I may be reading the wrong sources, but it is my impression that such benefits have been increased substantially in the past three years, not in the distribution per person per week, but in the number of weeks over which the distribution is to be made. Is not this distribution, in fact, lauded by the Administration as part of one or another of its economic stimuli?
Why was Holmstrom’s address such ‘big news’? The story he tells, as related in this piece, can be found in Gary Gorton’s 2010 book titled ‘Slapped by the Invisible Hand,’ the core parts of which are papers he wrote/presented in August 2008 and spring 2009.
My enthusiasm for Gorton’s work is great, but there is a big difference between a narrative book and a model.