pagre@ucla.edu
http://polaris.gseis.ucla.edu/pagre/
Revised version of a paper presented at the Telecommunications Policy Research Conference, Alexandria, Virginia, September 1999.
This is a draft. Please do not quote from it.
Version of 29 September 1999.
3000 words.
I want to build a bridge between familiar technical and economic stories and social theoretic analysis of information and its place in institutional change. My point of departure is the difficulty of upgrading Internet standards, for example the transition to a 128-bit IP address space, the slow progress in retrofitting mailers to suppress spam, and the failure of many architectural and language proposals. Although the Internet's diffusion remains impressive in quantitative terms, in its qualitative organization the Internet resembles a Boeing 747 that takes off from the runway but cannot ascend beyond 5000 feet. Each case has its details, consideration of which would of course be required before any specific conclusions could be drawn. My own purpose, however, is analytical. I am after questions, not answers, and I want particularly to consider the relationship among various ways of framing the problem.
It is important to distinguish various cases. Some extensions to existing standards are straightforward, especially when network effects are weak and the underlying standard has been designed for extensibility (Hanseth, Monteiro, and Hatling 1996), for example through the philosophy of layering. In such cases switching costs are not high; one is not really switching at all but just adding.
Transitions to new standards can also fail for external reasons. Many business models require a critical mass of users, especially because of economies of scale, for example in the production of content, but the pool of potential users might be limited to those who have and use certain complementary technologies. The world's best-designed tourist guides for GPS-enabled PDAs will fail unless a sufficient number of potential travelers actually own GPS-enabled PDAs. Likewise a computation-intensive standard like VRML will fail until enough people have computers with powerful processors, and broadband applications will fail until enough people have broadband. Of course these platforms may well not be adopted until complementary applications are available, but this chicken-and-egg problem can be solved easily enough if all of the other necessary conditions obtain.
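The chicken-and-egg logic can be made concrete with a toy simulation, sketched here in Python; the growth rate and the critical-mass threshold are illustrative assumptions of mine, not estimates of anything. Each side of the market grows only when the other side has passed the critical mass it needs:

    def simulate(seed, steps=200, threshold=0.10, rate=0.8):
        users = content = seed    # fractions of their potential markets
        for _ in range(steps):
            # Each side grows when the other side is above the critical
            # mass it needs, and shrinks when the other side is below it.
            users += rate * users * (content - threshold)
            content += rate * content * (users - threshold)
            users = min(max(users, 0.0), 1.0)
            content = min(max(content, 0.0), 1.0)
        return users, content

    for seed in (0.05, 0.15):
        users, content = simulate(seed)
        print(f"seed {seed:.2f} -> users {users:.2f}, content {content:.2f}")

Below the threshold both sides wither together; above it, mutual reinforcement carries both to saturation. That is the sense in which the problem is easily solved once all of the other necessary conditions obtain: any seed of adoption past the critical mass does the job.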
An important complement that behaves differently is skill. To the extent that users learn useful things by using a technology, they will need a strong incentive to switch to any other technology to which their learning does not transfer. The same applies to sellers' learning (Antonelli 1994: 210). So long as this learning and its value continue to increase, positive feedback sets in -- so that, for example, fragmented standards may well stay that way until something radically better comes along.
Finally, standards can become stuck because of coordination problems. Compatibility standards are valuable as a coordination mechanism, among other reasons, because they are mutually reinforcing: each user continues to follow them because other users do. Even without switching costs or learning effects, users lack perfect information about one another's intentions (David 1995: 25), and so a critical mass of users cannot switch to a new standard in a sufficiently coordinated way without tremendous effort. If that effort is a public good then the market may not provide enough of it.
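The coordination trap can be illustrated with a Schelling-style threshold model -- my sketch, not anything drawn from the sources cited here. Each user will switch to a new standard only once the adoption level they observe reaches their personal threshold, so expectations are anchored in what others have already done:

    import random
    random.seed(0)

    N = 1000
    # Each user's threshold: the adoption share they must observe before
    # switching. Even the most eager users here require some evidence.
    thresholds = [random.uniform(0.05, 0.60) for _ in range(N)]

    def final_adoption(seed_share):
        adopted = seed_share    # share switched so far (a sponsor's seed)
        while True:
            willing = sum(t <= adopted for t in thresholds) / N
            if willing <= adopted:    # no one else is willing to move
                return adopted
            adopted = willing

    for seed in (0.0, 0.05, 0.15):
        print(f"seed {seed:.0%} -> final adoption {final_adoption(seed):.0%}")

With no seed, adoption stays at zero no matter how superior the new standard is; a small seed stalls; and, with these illustrative numbers, a seed past the tipping point cascades through the whole population. Someone, in other words, must pay for the seed.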
Markets have a solution to this problem, which is sponsorship. If an economic mechanism such as property rights exists to reward some party downstream for coordination efforts and early-adopter subsidies upstream, then someone will step forward to play that role (Shapiro and Varian 1998).
I want to examine what physicists would call the interference pattern between this economic framing of the issues and a more sociological, institutional framing. First a few simple observations. The Internet community is guided by an ideology of decentralization (cf David 1995: 28), both of architecture and administration, and of the supposed impacts of the Internet on institutions and society. Of course, this ideology must be qualified in some important ways: Cisco controls nearly all of the router market, Microsoft is the 500-pound gorilla at the W3C, mature software markets such as those for Internet applications tend to be monopolies, the top few percent of Web sites carry a large percentage of the traffic, as do the largest half-dozen backbone providers, economies of scale tend to concentrate the content markets that operate on the net, the Internet is still overwhelmingly American and in English, and Internet architecture development must be coordinated through a single body. Nonetheless, if you ignore all of that, then the Internet is a beacon of decentralization. In particular the Internet is, to build on an observation of Andrew Odlyzko (1998), the phone company turned inside out. When it was a regulated monopoly, AT&T routinely executed complex upgrades to its system, for the most part transparently to users, because it had internalized the whole coordination problem. It was, in economic terms, a sponsor of its own technologies. The decentralized organization of the Internet, however, pushes as hard as it can in the opposite direction. The Internet standards process is justly celebrated for the principles of rough consensus and running code that solve the coordination problems of an important inner circle (Lehr 1995). But the Internet philosophy provides no special solution to the coordination problem involved in upgrading its own standards, except in the important cases, mostly at the applications level, that I delineated at the outset.
The Internet, in other words, has no sponsor. But it could acquire one. A private firm that developed a backward-compatible IP protocol with a large address space, quality of service guarantees, a reservations protocol, pricing mechanisms, scalability, and other attributes required for advanced broadband applications could interconnect it with the existing Internet and then sponsor its diffusion until network effects took hold, whereupon it would recover its investment by raising prices to match the value of its network of users. Perhaps this is what companies such as At Home are trying to do. To really succeed, they probably need to give away some of their proprietary functionality, thereby establishing network effects around a migration path for users that can only proceed through the areas where they make their money. But this is a well-understood process and should not be hard, assuming that they can solve the very hard technical problems that are involved. Of course, this whole scenario blasphemes against the philosophy of the Internet, and indeed the material force of that philosophy may slow it down. But if no other coordination mechanism for upgrading the Internet can be found, then the sponsorship solution may be inevitable, with the result being a proprietary Internet.
It follows that the Internet philosophy of decentralization is misleading, or at best incomplete. An excessive focus on the decentralized nature of Internet administration, remarkably, shares the same weaknesses as neoclassical economic analysis: they are both static. Thus Paul David (1995) calls for a dynamic analysis that goes beyond a simplistic opposition between freedom and order to consider the flux back and forth between relatively stable periods of coordinated administration and relatively unstable periods of coordinated transition.
The Internet case also calls for a revision of some by-now conventional economic vocabulary. In the economic analysis of technology, particularly in those schools that are set polemically against the neoclassical welfare-maximizing-equilibrium model, many authors speak in terms of positive feedback effects that tend to lock in a particular technology while locking its competitors out. But in the context of the S-shaped adoption curve that one sees with technologies that exhibit significant network effects, positive feedback is only an apt description of the middle phase of the curve, once a critical mass has been achieved and before market saturation sets in. At the beginning and end of the curve, what is happening is really negative feedback: the homeostasis that first prevents the standard from gathering steam for lack of a critical mass, and then the homeostasis that prevents any other standard from doing the same.
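A minimal numerical sketch may help here. The standard logistic curve captures only the upper homeostasis, so this version adds a critical-mass (Allee) term; the parameters are arbitrary illustrations, not measurements:

    def adoption(a0, r=1.5, c=0.2, steps=25):
        # Growth requires both a surplus over critical mass (a - c)
        # and a remaining market (1 - a); r scales the feedback.
        a, series = a0, [a0]
        for _ in range(steps):
            a = min(1.0, a + r * a * (a - c) * (1 - a))
            series.append(a)
        return series

    below = adoption(0.15)    # starts below critical mass
    above = adoption(0.25)    # starts above critical mass
    print("below:", " ".join(f"{a:.2f}" for a in below[::5]))
    print("above:", " ".join(f"{a:.2f}" for a in above[::5]))

The first run decays homeostatically toward zero; the second exhibits positive feedback only in the middle of the curve and then saturates homeostatically -- exactly the three-phase pattern just described.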
What is more, to the degree that the standard in question is not easily extensible, the same positive feedback that promotes the technology also feeds a negative feedback that prevents it from changing: the more users, the harder the coordination problem. In fact, a standard that is easily extensible in ways that do not themselves exhibit strong network effects will be subject to that same negative feedback even more strongly, inasmuch as any coordinated change would have to contend with either the constraints of preserving those local extensions or the costs of discarding them.
All of these considerations point to the need for a deeper analysis, the general theme of which will be the Internet's embedding in the institutions around it. The term embedding is perhaps best known from Mark Granovetter's (1995) analysis, following Polanyi (1957), of the embedding of job markets in social networks. To the extent that employers and employees find one another through people they know, the employment market will be heavily shaped by any social factor, racial discrimination for example, that shapes social networks.
This analysis can be applied directly to the coordination problems that afflict the Internet. Because it is defined in terms of the abstract and universal category of information, information technology is applicable in nearly every sphere of human life, and economies of scale make it likely that the technology will be largely the same in most spheres, even if the spheres themselves are not socially networked to one another. In fact, one important social impact of information technology is the conversation topics that it provides for people who come from otherwise distinct social worlds. Thus the role of academia, nonprofit organizations, and government in bringing wildly diverse parties around the table for discussion of IT-related issues that by their abstraction promise to affect them all. Thus too the role of research projects that survey the demands of applications in diverse spheres and attempt to abstract common functionalities for network service layers and other intendedly general computing facilities. The labor involved in building and maintaining these social networks is considerable, as is the effort of getting the parties' attention, negotiating common meanings, finding consensus, and all of the other phases of the coordinated change process.
More generally, the dynamics of Internet standards are a special case of the dynamics of social institutions. Institutions are the persistent structures of social relationships (Commons 1970 [1950], Goodin 1996, March and Olsen 1989, North 1990, Powell and DiMaggio 1991). Institutions define roles and the rules that govern them, on all scales and in every part of society. Thus we speak of educational, commercial, religious, political, and legal institutions, but languages, holiday customs, public greetings, contracts, and historical forms of the family are institutions as well. Institutions are partly designed and partly evolved, smaller institutions are nested inside of larger ones, and it is useful to think of institutions taking form through a long-term process of collective bargaining among the groups of people who occupy the various social roles that they define. Individual occupants of those roles may come and go, but the qualitative forms of the roles themselves and the relations among them can remain relatively stable, or at least continuous, across centuries.
Institutions can persist for many reasons, and it is useful to distinguish among them. A common element is some notion of dispersal: institutional arrangements persist because commitments to them are dispersed through a population. One mechanism is closely related to that which provides for the stability of compatibility standards: the institution structures relationships in a certain way, and that creates an incentive to learn and use it. Another mechanism relates to skill and identity: institutions define social roles, social roles define skills and identities, acquiring those skills and identities requires great effort, not least the effort of positioning oneself in a social network, and this gives everyone an interest in reproducing the institution, not least by internalizing its norms. If everyone has an interest in reproducing the institution, then almost everyone will uphold the verdicts of formal or informal tribunals that sanction divergence from the norms.
Institutions also persist because they incorporate a considerable network of interlocking ideas and practices. To the extent that the underlying unity of these ideas and practices -- a generative metaphor, perhaps -- is forgotten or hidden, participants in the institution will be cognitively incapable of escaping it. Any attempt to comprehend or reject any single one of the ideas or practices will fail because it is so strongly determined, either logically or practically or both, by all of the others. In other words, institutions can persist because their participants embody them without being fully conscious of them. One can be socialized into a dispersed network of ideas and practices without ever consciously grasping their unity.
From this perspective, it is a wonder that institutions ever change. They do change, of course, for example when they become economically nonviable and lose their legitimacy, but even then the forces of persistence are great (Offe 1996). Yet these same forces apply to the supposedly dynamic and revolutionary technology of information that the institutions produce and depend on. Indeed, the intimate intertwining between information technology and institutions is emerging as a primary reason for the persistence of both (Kling and Iacono 1989).
Information technology encourages the persistence of institutions in other ways. Hayek teaches that one principle encouraging the dynamism of institutions is evolution. Evolution requires alternations between separate development and natural selection, but separate development has its conditions. In the absence of efficient transportation and communication technologies, different regions, cultures, and jurisdictions can develop separately, and the useful results obtained in each locality can be imported by arbitrage and imitation into the others. But the Internet enables global integration, global standards, globally coordinated firms, and globally familiar ideas. The conditions of institutional evolution and thus dynamism are thereby undermined.
To the contemporary mind, this will seem like a bad thing. Dynamism is supposed to be good, and stasis is supposed to be bad. But it is more complicated than that. Institutions rightly cause ambivalence because they both enable and constrain, and because they enable precisely by constraining. If the institutions of money and contracts were not static then investment would not be possible. If the institutions of democracy were not static then no government would give up power willingly (Offe 1996). If the basic structures of social life and social activity were not stable then life in a complex modern society would be cognitively impossible, and the effort invested in learning would not pay off.
So the Internet is related to institutions in several ways: it is an institution itself, its dynamics are analogous to those of other institutions, it is surrounded by and interlocked with institutions, and every aspect of the social life of the Internet is embedded in institutions. Internet change is institutional change. This is true for fundamental Internet architecture, and it is especially true for institutions whose everyday workings are heavily supported by institution-specific Internet applications. The Internet may be static to the extent that it is decentralized; institutions may persist to the extent that their practical arrangements are dispersed. Stasis and persistence can be good when they make the world predictable; by staying the same, institutions and technologies make change possible in other areas. Layered technologies resemble nested institutions, and layered technologies are embedded in institutions with which they must necessarily be articulated. In principle the more fundamental network layers and institutional nestings should be the most stable, because they asymmetrically provide conditions for the effective working of the others. Fundamental technical change resembles fundamental social change, in that both require broad and deep coordination: social movements that are mediated by some combination of communications technology, ideologies, and formal organizations. Partly because of this analogy, the Internet is reputed to be a revolutionary technology, and indeed it is already the occasion for a renegotiation of the working rules of every institution in society. But the Internet is revolutionary in another sense: it is exceedingly hard to make a revolution because of the coordination that is required to overcome the persistence of both the technology itself and the institutions with which it is intertwined. The Internet is not an independent variable; negative feedback loops couple the technology to the institutions and vice versa.
But the Internet case also suggests a new take on the institutional problem. The decentralized Internet might get moving again if a sponsor emerges who can establish a centralized point of coordination and a concomitant position of formal control. This is conceivable precisely because of the distance-collapsing potential of the Internet, which makes possible, for example, a single, universally accessible point from which to download new code.
Perhaps something similar happens in the institutional realm as well. The effect is very clear in the case of institutions that are completely subserved by a single proprietary Internet application. An example is eBay, a wholly online auction house -- an auction institution in the full sense of the word -- that can contemplate changes in its trading rules without much coordination effort. Although this is qualitatively no different from a geographically localized physical auction house changing its own rules, the possibility that eBay is an Internet-enabled global natural monopoly should encourage us to investigate the workings of the market in marketplaces, and more generally of the market in privately organized social institutions. The same dynamic that may lead to a proprietary Internet may well lead to a proprietary higher education system, or even a proprietary political or legal system. Most social institutions exhibit effects strongly analogous to the network effects that operate on compatibility standards, and in an Internet-saturated institutional world the same dynamics that encourage a single dominant standard may also encourage a broad range of natural monopolies -- not simply in markets for specific commodities, but also in the market for much more fundamental and systemic social arrangements. A global higher education monopoly would be a matter of grave social concern, not to mention a global legal system monopoly. Further research should urgently evaluate particular cases in more detail.
References
Cristiano Antonelli, Localized technological change and the evolution of standards as economic institutions, Information Economics and Policy 6(4), 1994, pages 195-216.
John R. Commons, The Economics of Collective Action, Madison: University of Wisconsin Press, 1970. Originally published in 1950.
Paul A. David, Standardization policies for network technologies: The flux between freedom and order revisited, in Richard Hawkins, Robin Mansell, and Jim Skea, eds, Standards, Innovation and Competitiveness: The Politics and Economics of Standards in Natural and Technical Environments, Edward Elgar, 1995.
Robert E. Goodin, ed, The Theory of Institutional Design, Cambridge: Cambridge University Press, 1996.
Mark Granovetter, Getting a Job: A Study of Contacts and Careers, second edition, Chicago: University of Chicago Press, 1995. First edition 1974.
Ole Hanseth, Eric Monteiro, and Morten Hatling, Developing information infrastructure: The tension between standardization and flexibility, Science, Technology, and Human Values 21(4), 1996, pages 407-426.
Rob Kling and Suzanne Iacono, The institutional character of computerized information systems, Office: Technology and People 5(1), 1989, pages 7-28.
William Lehr, Compatibility standards and interoperability: Lessons from the Internet, in Brian Kahin and Janet Abbate, eds, Standards Policy for Information Infrastructure, Cambridge: MIT Press, 1995.
James G. March and Johan P. Olsen, Rediscovering Institutions: The Organizational Basis of Politics, New York: Free Press, 1989.
Douglass C. North, Institutions, Institutional Change, and Economic Performance, Cambridge: Cambridge University Press, 1990.
Andrew M. Odlyzko, Smart and stupid networks: Why the Internet is like Microsoft, ACM netWorker, December 1998, pages 38-46.
Claus Offe, Designing institutions in East European transitions, in Robert E. Goodin, ed, The Theory of Institutional Design, Cambridge: Cambridge University Press, 1996.
Karl Polanyi, The economy as an instituted process, in Karl Polanyi, Conrad M. Arensberg, and Harry W. Pearson, eds, Trade and Market in the Early Empires: Economies in History and Theory, Glencoe, IL: Free Press, 1957.
Walter W. Powell and Paul J. DiMaggio, eds, The New Institutionalism in Organizational Analysis, Chicago: University of Chicago Press, 1991.
Carl Shapiro and Hal Varian, Information Rules: A Strategic Guide to the Network Economy, Boston: Harvard Business School Press, 1998.