“It’s an urban legend that the government launched the Internet.” That was former Wall Street Journal publisher L. Gordon Crovitz, starting an interesting discussion last month on the paper’s editorial pages. (He now writes the “Information Age” column for the paper – subscription required.) He was, of course, riffing on President Obama’s “you didn’t build that” remark. In fact, he wrote, it was Xerox Corp. that came up with the idea of linking different computer networks together.
Crovitz buttressed his opinion with a quotation from blogger Brian Carnell, which he misattributed in US print editions to economist Tyler Cowen, of George Mason University (later quietly corrected in the WSJ digital archives). “The Internet… reaffirms the basic free market critique of big government,” Carnell wrote in 1999. “Here for thirty years the government has an immensely useful protocol for transferring information, TCP/IP, but it languished… In less than a decade, private concerns have taken that protocol and created one of the most important technological revolutions of the decade.”
That brought a strong letter of dissent from Vinton Cerf and Stephen Wolff, each of whom made key contributions, while working for the US Defense Department in the early 1970s, to the architecture of the astonishing new technology. (Cerf was co-author, with Robert Kahn, of the Transmission Control Protocol/Internet Protocol suite (TCP/IP), the foundational standards that permit networks of computer networks to communicate with one another.) The Internet’s development had been a model of collaboration among government agencies, universities and the private sector, they wrote. “Focusing on one element to the exclusion or disparagement of other elements is simplistic, misleading and wrong.”
Crovitz chose to read that as a rebuke to the president. But Cerf and Wolff made it clear that the open architecture they devised was the government’s idea.
It was US government policy that TCP/IP would be an open protocol. Many private sector corporations vigorously resisted it, preferring instead to support their own proprietary approaches. Yet without open protocols, the private sector could not possibly have moved so quickly to commercialize and deploy networks on such a large scale.
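The point about openness can be made concrete. Because TCP/IP is a published, royalty-free standard, two independently written programs, on any vendor’s hardware, can exchange bytes using nothing more than the standard socket interface. Here is a minimal sketch (my own illustration, not anything from the column): a loopback TCP echo exchange written entirely with the Python standard library.

```python
# Minimal illustration of an open protocol at work: a TCP echo exchange
# over the loopback interface, using only the standard library. Any two
# programs that speak TCP can do this, regardless of who built them.
import socket
import threading

def echo_once(listener: socket.socket) -> None:
    """Accept one connection and echo its bytes back unchanged."""
    conn, _ = listener.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

server = threading.Thread(target=echo_once, args=(listener,))
server.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello, open protocols")
reply = client.recv(1024)
client.close()
server.join()
listener.close()

print(reply.decode())
```

Nothing in this exchange depends on a proprietary stack; that neutrality, mandated as policy, is what let commercial networks interoperate so quickly.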
But even after a second Crovitz column, headlined “WeHelpedBuildThat.com,” his readers had no better idea of how the Internet actually happened. Books such as The Dream Machine: J.C.R. Licklider and the Revolution that Made Computing Personal, by M. Mitchell Waldrop; and Inventing the Internet, by Janet Abbate, make abundantly clear to anyone who takes the time to read them the centrality of government-funded research to the task of getting computers to communicate with one another. Computing: A Concise History, by Paul Ceruzzi, provides a fine overview of the first seventy-five years of the digital age in fewer than 200 pages.
But the organization that, more than any other, deserves credit for swiftly bringing the technological possibilities to commercial fruition, permitting millions of jobs to be created around the world (and others to be extinguished!), is the Internet Engineering Task Force, the planning body that developed various standards for the Internet as it evolved.
The IETF hasn’t gotten the ink that it deserves. Its website is a trove of information for technical readers, and proper narratives probably are on the way. But the only straightforward account I know in the lingua franca of book publishing is Scott Bradner’s essay in Open Sources: Voices from the Open Source Revolution.
The IETF started in 1986 as a quarterly meeting of US government-funded researchers, Bradner says (he was one of its founding members). Non-government vendors were invited starting later that year; after that, anybody who wanted to attend could. Barely 35 people came to the first few meetings. By the late ’90s, more than 2,000 would show up (the meeting frequency dropped to three a year). For the meeting in Vancouver last week, more than 1,300 people registered, organized into various working groups.
People, not organizations, are members; anyone who wants to can join, and all its internal documents are freely available on the Web. The trick, since the beginning, has been to be on the cutting edge of developments. This is the organization that, back in the early 1990s, drew up plans for the commercialization of what to that point was a government-operated network. The creation of the standards opening up the World Wide Web followed in a few short years. Early members note that never before in history has a new technology rolled out so quickly and exuberantly, with so little waste.
The IETF is remarkable for its bottom-up nature; most standards organizations are run from the top down. Take the British Bankers’ Association, the organization that, more than twenty years ago, created the standard known as the London Interbank Offered Rate (Libor). All standards are “recipes for reality”: that is the subtitle that Lawrence Busch, of Michigan State and Lancaster universities, gave to Standards, his excellent (and timely) book on the ubiquity of these procedures, and their multi-faceted significance for our lives. From weights and measures to legal contracts and software protocols to the customs and expectations that form the ethical infrastructure of human communities, standards are “a means of partially ordering people and things so as to produce outcomes desired by someone else,” he says.
Some standards are better constructed than others. The standards that brought the Internet into existence were exemplary. The standard that is Libor, written into trillions of dollars of contracts around the world, is something else again. A fair amount of squabbling lies ahead over who was best served by the overall rate-setting procedure, and over whatever possibilities existed for its abuse: borrowers? lenders? or, perhaps, the banks themselves?
These are very complicated issues, but they are precisely the ones that standards organizations exist to debate and resolve. Eric von Hippel, of the Massachusetts Institute of Technology, has studied more closely than anyone else I know the means by which industries, firms, and their customers band together to improve the standards by which products are produced. But improving products is one thing. Creating whole new industries is another.
There are certain tasks so inherently risky that only government can be expected to undertake them. Getting the computer industry started was one: governments built the first computers during World War II; and, in the early days of the Cold War, the US Air Force financed IBM Corp.’s entry into the business. Next came the first ventures in time-sharing and packet-switching – the technologies on which the Internet is based.
Meanwhile the National Institutes of Health were financing several different revolutions in health care: molecular biology, imaging techniques, genomics. Today government is funding research in gas, solar and wind technologies that will be necessary in order to cope with climate change. That’s why it’s so important to understand the beginnings of the Internet. Mighty oaks from little government acorns sometimes grow.
There’s even a moral here for corporate statesmen. As WSJ publisher, Crovitz built a successful paywall around the paper’s website, and pointed the way to successful newspapering in the future. But if he (and other Dow Jones executives) had paid more attention to where the Internet came from, and how it really worked, they might still be able to call that newspaper their own. Instead they are working for Rupert Murdoch.