A Few Words about the Vladimir Chavrid Award


Remember Rosalind Franklin, “the dark lady of DNA”? She was the physical chemist whose artful X-ray photographs enabled James Watson and Francis Crick to deduce the double-helical structure of the DNA molecule, just ahead of Linus Pauling. Franklin died of cancer in 1958. Maurice Wilkins, her colleague at King’s College London, shared the 1962 Nobel Prize in Physiology or Medicine with Watson and Crick.

Only after Watson disparaged Franklin in his idiosyncratic 1968 best-seller, The Double Helix (“the best home for a feminist was in another person’s lab”), was she canonized as an icon of uncompromising empiricism, “the Sylvia Plath of molecular biology, whose gifts were sacrificed to the greater glory of the male,” as her even-handed biographer, Brenda Maddox, described the process.

Now meet Julia Lane.

Lane, 54, director of the Science of Science and Innovation Policy program of the National Science Foundation, spearheaded the creation of the Longitudinal Employer-Household Dynamics (LEHD) program of the US Census Bureau, an enormously innovative database – a “frame” of jobs over time – that permits the real world of the US economy to be interrogated by the models of unemployment dynamics for which Peter Diamond, Dale Mortensen and Christopher Pissarides shared the Nobel Prize in economics last week.

Instead of a Mention in Dispatches from Stockholm, what Lane got was the Vladimir Chavrid Memorial Award.

Don’t feel sorry for her, though.  For one thing, she’s very much alive.  For another, the ebullient New Zealander much more nearly resembles another Julia, Julia Child, than the somewhat dour Rosalind Franklin. And of course there’s that Chavrid Award.

Because I knew her to be immersed in the practical details of unemployment dynamics, I called Lane first after the Nobel prizes were announced last October. We hadn’t talked for long before I began to realize that her story was as interesting as the winners’.

It began in 1994, when Lane read an article by Simon Burgess, of the University of Bristol, “The Flow of Unemployment in Britain.” “I had been looking at the flow of workers through firms, and I knew that even firms with no change in employment across quarters both hired and fired workers simultaneously. But in his model firms had a desired level of employment; they hired until they reached it, and after that they didn’t do anything more. So I called him up and I said, you don’t know me from a bar of soap, but that model is just dead wrong. What my data show is that even when firms don’t change their employment levels there’s this huge churn through the workforce. Even firms that are laying workers off are still hiring.”

A long-distance collaboration began that led, six years later, to “Job Flows, Worker Flows and Churning,” a paper in the Journal of Labor Economics. “We find churning flows (the difference between worker and job flows at the level of the employer) to be high, pervasive and highly persistent within employers…” the authors wrote (joined now by David Stevens, of the University of Baltimore, who had contributed a large employer-employee data set from Maryland).
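The arithmetic behind that finding is simple enough to put in a few lines of code. Here is a minimal sketch in Python, with invented numbers rather than anything from the Burgess-Lane-Stevens data, of how churning falls out of hires and separations at the level of a single employer:

```python
# Churning as defined in "Job Flows, Worker Flows and Churning":
# the difference between worker flows and job flows at the employer level.
# The numbers below are illustrative, not drawn from the paper's data.

def flows(hires, separations):
    """Return (job_flow, worker_flow, churn) for one employer-quarter."""
    job_flow = abs(hires - separations)   # net jobs created or destroyed
    worker_flow = hires + separations     # gross movement of people
    churn = worker_flow - job_flow        # reallocation beyond the net change
    return job_flow, worker_flow, churn

# A firm with flat headcount can still hire 40 people and lose 40:
# zero job flow, heavy churn.
print(flows(hires=40, separations=40))   # (0, 80, 80)

# Even a firm shedding jobs on net is typically still hiring.
print(flows(hires=10, separations=30))   # (20, 40, 20)
```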

“In fairness,” Burgess recalls, “the idea was in the air that the search and matching literature needed to marry up with the new interest in job creation and destruction. Our comparative advantage was that we had the data.”

This was the ’90s, remember. Turbulence was everywhere you looked. Integrated steel mills were closing but mini-mills were opening. The minicomputer industry was shrinking but the software industry was growing by leaps and bounds. Downtowns were collapsing and big box stores sprouting like mushrooms along the highways. Lane had dinner one night with a friend, Nancy Gordon, then the Associate Director of the Demographic Directorate at the Census Bureau.

“Census leadership was concerned about the gap in data integration between two of their major directorates: Demographics, which collects detailed data on households and workers, and Economic, which collects detailed data on firms but little about the workers in those firms. There was no way in which the two data-collection activities were synchronized. The household employment survey measured employment one way, and the establishment data measured it another. Census knew that the workforce affected firm outcomes and that firms affected worker outcomes, but there was no way of linking the two. A survey would be incredibly expensive – about $1,000 a record. I had the opposite problem with my unemployment insurance wage-record data: I could see the links between firms and workers, but I had no information about the characteristics of either. So I said, I could figure out how to create the link.”
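The mechanics of the link are easy to caricature. Here is a toy sketch in Python with pandas – every file, identifier and column name below is hypothetical – of the idea Lane is describing: the unemployment-insurance wage record carries both a worker ID and an employer ID, so it can serve as the crosswalk between household-side and firm-side data.

```python
import pandas as pd

# One row per worker-employer pairing: the UI wage record is the bridge.
wage_records = pd.DataFrame({
    "worker_id":   ["w1", "w2", "w3"],
    "employer_id": ["f1", "f1", "f2"],
    "earnings":    [9000, 7200, 11000],
})

# Establishment-side data (the "Economic" side).
firms = pd.DataFrame({
    "employer_id": ["f1", "f2"],
    "industry":    ["steel", "software"],
    "employment":  [250, 40],
})

# Household-side data (the "Demographics" side).
workers = pd.DataFrame({
    "worker_id": ["w1", "w2", "w3"],
    "age":       [34, 51, 28],
})

# Join firm traits and worker traits onto the same employment relationship.
linked = (wage_records
          .merge(firms, on="employer_id")
          .merge(workers, on="worker_id"))
print(linked)
```

The hard parts of the real program – scale, messy identifiers, and the confidentiality protections the team had to build – are exactly what this toy leaves out.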

David Stevens eventually put together a timeline. In 1997 Lane took an American Statistical Association fellowship, and she and Gordon planned a pilot project. Lane and James Spletzer, from the Bureau of Labor Statistics, would examine the potential for matching unemployment insurance wage data, the Census Bureau business register, the Current Population Survey and the Survey of Income and Program Participation. John Haltiwanger, then chief economist of the Census Bureau, gave the project a green light. The next year John Abowd, of Cornell University, joined the project, and he and Lane hired and led the team that developed the data infrastructure and addressed confidentiality concerns.

In 1999 Lane pitched the project to the states (where much employment data were collected) and formulated a business model (the program would have to pay for itself). The first data, from Florida and Illinois, arrived the year after that. In 2001 the first “products” were introduced, in 2002 the program went national, and by 2003 the LEHD had put up a website and expanded its reach. By 2004 it was sufficiently well established that the National Association of State Workforce Agencies gave Lane its award for contributions to labor market understanding.

To get an idea of the diverse landscape of labor market information, take a look at the program of the NASWA annual meeting that year, and scroll down to the photo of Lane. “She is high-powered, it’s true, like a cup of caffeine,” says Richard Freeman, of Harvard University. “But she is also an awfully nice person. She is strong-willed, but she lets you work it out for yourself.”

The eventual payoff was a 2006 book by Clair Brown, John Haltiwanger and Lane, Economic Turbulence: Is a Volatile Economy Good for America?, which offered a clear definition right from the start: economic turbulence is the entire process of economic change – workers changing jobs; firms starting up, expanding, contracting and shutting down. The sheer amount was staggering, they wrote: in any given quarter, one job in four begins or ends, one in thirteen jobs is created or destroyed, and one in twenty establishments opens up or closes down.

It was the clear picture of a dynamic economy developed by the LEHD that allowed Peter Diamond, of the Massachusetts Institute of Technology, in his Nobel lecture Friday, to chivvy the president of the Federal Reserve Bank of Minneapolis for being out of touch with current developments: “What he [Narayana Kocherlakota] says is that firms have jobs and can’t find workers, workers want to work but can’t find jobs, and he goes on to say, ‘It’s hard to see how the Fed can do much to cure this problem.’ This is clearly a static picture of the world that makes no sense, given the kinds of flows we have been talking about.” Unassailable data should make for interesting discussions of the theory behind them once Diamond joins the Board of Governors of the Fed.

It is true, therefore, as Bertil Holmlund, of the University of Uppsala, told the laureates, that their long reconnaissance “had inspired a large amount of empirical work.” Yet when Marcus Storch, chairman of the board of trustees of the Nobel Foundation, told the assembled throng in his opening address, “No simple linear model, from basic research to applied research to technical development, is a viable explanation for more complex relationships and connections,” it was hard to resist the thought that he didn’t know the half of it.

Today, Lane is among those riding the crest of a wave generated by the advent of virtually limitless computing power. A movement is underway to expand researchers’ access to the vast trove of administrative data compiled by US government agencies – everything from tax records to the Social Security and Medicare systems – so long as confidentiality can be maintained.

A recent open letter signed by eleven barons of applied economics – David Card and Emmanuel Saez, of  the University of California at Berkeley;  Raj Chetty, David Cutler, Martin Feldstein and Lawrence Katz, of Harvard; Steven Davis, of the University of Chicago Booth School of Business; William Gale, of the Brookings Institution;  Jonathan Gruber and Michael Greenstone, of MIT; and Caroline Hoxby, of Stanford University – delineates three approaches.

They include direct access by researchers, with only aggregate results taken out (an approach that has generally worked well in Europe); synthetic data, simulated to match a limited set of statistics from the actual data (less satisfactory, because researchers rarely know beforehand exactly what to look for); and researcher-created programs that are handed over to agency employees to run (the most cumbersome of all). “We strongly support efforts at various agencies to enhance and promote such access,” the letter concludes.
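The synthetic-data option, the least familiar of the three, is easy to illustrate in miniature. The sketch below, in Python with entirely fabricated numbers, shows the general idea rather than any agency’s actual procedure: release draws that reproduce a limited set of statistics (here, means and covariances) from the confidential file.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a confidential two-column microdata file
# (say, log earnings and log firm size); fabricated for illustration.
confidential = rng.multivariate_normal(
    mean=[10.5, 4.0], cov=[[0.6, 0.2], [0.2, 1.1]], size=5000)

# The agency keeps the records and releases only summary statistics...
mu = confidential.mean(axis=0)
sigma = np.cov(confidential, rowvar=False)

# ...and researchers work with synthetic draws that reproduce them.
synthetic = rng.multivariate_normal(mean=mu, cov=sigma, size=5000)
print(np.allclose(mu, synthetic.mean(axis=0), atol=0.05))  # True

# The catch the economists' letter points to: any feature not captured
# by the released statistics (fat tails, rare subgroups) is simply gone,
# and researchers rarely know beforehand which features they will need.
```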

Even within the community, Lane’s contribution is not well understood. Kaye Husbands Fealing, of the University of Minnesota, recalls the scene at a meeting of an NSF advisory committee last year when chairman Robert Groves, of the University of Michigan, praised “the Census program started by Haltiwanger and Abowd.” Heads abruptly turned.  “A number of people looked at Julia,” she says, “because everyone knew she was the one who thought it up and did the work. It was a dramatic moment.”

Afterwards several colleagues sought Groves out. “He was really surprised. He really didn’t know.” Even in the Nobel lectures, it was “Haltiwanger’s data” that was cited.

In other words, the Vladimir Chavrid Award will take you only so far ($100 and a brass plaque, according to the website). “Listen,” Lane says, bristling: “I am deeply grateful for the Chavrid Award. I came to really love the people in the state agency shops – and they gave me the top award in their field. It came from their hearts, and I value it enormously.” Still, it is time to add another prize or two for contributions to empirical economics, somewhere between the empyrean realms of the Nobel and the Chavrid Award.

Indeed, more attention to measurement in economics in general wouldn’t hurt. It has been more than 25 years since the last Nobel prize went to a measurement economist – to Richard Stone, of Cambridge University, in 1984 (and just two others before that, to Simon Kuznets and Wassily Leontief, both of Harvard, in 1971 and 1973, in the early days of the prize). Economics needs to become much more empirical.


