Some notes on millennial enthusiasms come and gone, the economics of broadband in the wake of the AOL / Time Warner merger, the Internet and career preparation, Hayek's history of scientism, and the fate of diversity in a digital world, plus books about the Internet in politics and even more URL's.

Thanks as usual to the many people who send me URL's. If you find a URL that people should know about, don't assume that I've seen it. I'm deliberately not on many mailing lists, so I don't see things unless RRE readers tell me about them.

As a periodic reminder, you can cancel your subscription to RRE by sending a message that looks like this:

  To: requests@lists.gseis.ucla.edu
  Subject: unsubscribe rre

If this method doesn't work, send me a message and I'll try to figure it out. In such cases it's important to send me the full headers of an RRE message that you've received, so I can figure out how it is getting to you.

In response to your comments, I have revised the new sections of "Networking on the Network" about thesis-writing and job-hunting. Can I ask others to have a look, especially professors who supervise PhD students? Here is the URL:

  http://dlis.gseis.ucla.edu/people/pagre/network.html

After another round of comments and revisions I will send out an advertisement for it.

Nonacademics keep telling me that they find my vocabulary obscure. Which words or phrases have you found obscure? If I knew then I could clean up my writing.

Are there any Deadheads on the list? All of my Grateful Dead concert tapes were stolen a long time ago, and American popular music has been so awful for the past couple of years that I am starting to miss them. Do you suppose you could send me a couple of shows' worth? I suppose that shows from the 1970s would be ideal. I'd surely appreciate it.
If my calculations are correct, audio files for the complete catalog of Grateful Dead shows would occupy something on the general order of 100 gigabytes, which is fast becoming a cheap amount of mass storage. I want a device the size of a ham sandwich with 100GB of storage -- presumably as an array of small 10GB drives -- and a single-chip Web server in it. You plug it into any Ethernet, and it automatically becomes a mirror site for whatever files are in it. The mirror sites all talk to each other, and people who want those files are transparently connected to whichever server is going to be the most reliable. If I'm not getting Grateful Dead shows fast enough on my machine, I just plug in one of these ham sandwiches and automatically solve the problem for everyone in the neighborhood. Nothing about this idea is hard -- except getting all the necessary standards broadly accepted.

People are still day-trading. It's appalling. It was sometime in 1998 that I first encountered people exchanging "insider" stock tips for online trading who should not have been trading stocks at all. I thought surely they'd get burned and quit. Well, they got burned alright, but they didn't quit. You probably know such people too. Here's a good way to explain the problem to them: when you trade in the stock market, you are competing with every other trader in the world. You are competing on information, and whoever has the best information wins. If there is some specific area where you have the best information in the world, do your homework and go for it. If not, bag it. At best you're just riding the general level of the market, and you can do that much more efficiently in a mutual fund. Meanwhile, let's come up with some way to imprison all of the online trading company executives who run those misleading advertisements suggesting that normal people can expect to get rich through online trading, and everyone at their ad agencies as well.
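The 100-gigabyte figure above is easy to sanity-check. Here is a quick back-of-the-envelope calculation; the show count, average show length, and audio bitrate are all illustrative assumptions on my part, not figures from the text:

```python
# Rough storage estimate for a complete Grateful Dead concert archive.
# Every input below is an assumption chosen for illustration only.
SHOWS = 2300            # approximate number of concerts the band played
HOURS_PER_SHOW = 2.5    # assumed average show length
BITRATE_BPS = 64_000    # assumed compressed-audio bitrate (64 kbps)

total_seconds = SHOWS * HOURS_PER_SHOW * 3600
total_bytes = total_seconds * BITRATE_BPS / 8   # 8 bits per byte
total_gb = total_bytes / 1e9

print(f"roughly {total_gb:.0f} GB")  # prints: roughly 166 GB
```

With these assumptions the archive comes out around 166 GB, the same general order of magnitude as the estimate above; halving the bitrate halves the requirement, and a higher-fidelity encoding would push it toward several hundred gigabytes.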
Here is a list of the reasonably worthwhile recent books about the uses of information technology in the political process that I have heard about:

  Cynthia J. Alexander and Leslie A. Pal, eds, Digital Democracy: Policy and Politics in the Wired World, Oxford University Press, 1998.

  Christine Bellamy and John A. Taylor, Governing in the Information Age, Open University Press, 1998.

  Daniel Bennett and Pam Fielding, The Net Effect: How Cyberadvocacy is Changing the Political Landscape, Capitol Advantage, 1999.

  Graeme Browning, Electronic Democracy: Using the Internet to Influence American Politics, edited by Daniel J. Weitzner, Pemberton Press, 1996.

  Ian Budge, The New Challenge of Direct Democracy, Polity Press, 1996.

  Chris Casey, The Hill on the Net: Congress Enters the Information Age, AP Professional, 1996.

  Anthony Corrado and Charles M. Firestone, eds, Elections in Cyberspace: Toward a New Era in American Politics, Aspen Institute, 1996.

  Richard Davis, The Web of Politics: The Internet's Impact on the American Political System, Oxford University Press, 1999.

  Lawrence K. Grossman, The Electronic Republic: Reshaping Democracy in the Information Age, Viking, 1995.

  Barry N. Hague and Brian Loader, eds, Digital Democracy: Discourse and Decision Making in the Information Age, Routledge, 1999.

  Kevin A. Hill and John E. Hughes, Cyberpolitics: Citizen Activism in the Age of the Internet, Rowman and Littlefield, 1998.

  John Kurt Jacobsen, Dead Reckonings: Ideas, Interests, and Politics in the "Information Age", Humanities Press, 1997.

  Wayne Rash, Jr., Politics on the Nets: Wiring the Political Process, Freeman, 1997.

  Susan M. Ryan, Downloading Democracy: Government Information in an Electronic Age, Hampton Press, 1996.

  Douglas Schuler, New Community Networks: Wired for Change, Addison-Wesley, 1996.

  Edward Schwartz, Netactivism: How Citizens Use the Internet, Songline Studios, 1996.

  Gary W. Selnow, Electronic Whistle-Stops: The Impact of the Internet on American Politics, Praeger, 1998.

  Roza Tsagarousianou, Damian Tambini, and Cathy Bryan, eds, Cyberdemocracy: Technology, Cities and Civic Networks, Routledge, 1998.

I haven't included books on the much broader topic of political issues relating to information technology.

Although Ed Schwartz's outstanding book, "Netactivism", is officially out of print, Ed is still selling it for only $10 plus $3 shipping and handling. Send your check, payable to the Institute for the Study of Civic Values, attn: Deborah, 1218 Chestnut St., Rm. 702, Philadelphia, PA 19107, USA. Phone: +1 (215) 238-1434.

I should mention that the idea about masks as a surveillance-jamming technology came from Terry Kuny.

In his 1990 book, "Life After Television", George Gilder quoted with seeming approval a prediction that televisions would not even exist three years hence. It didn't happen, of course, but this did not deter him from making the following statements in his contribution to a book ("Going Digital: How New Technology Is Changing Our Lives") that The Economist published in 1996:

  "Revenues from telephones and televisions are currently at an all-time peak. But the industries organized around these two machines will not survive the century" (page 334).

  "Just as the 1980s brought the collapse of the centralised scheme of a few thousand mainframes and millions of dumb terminals, the 1990s will see the collapse of similar structures in television and telephony" (page 337).

I suppose you could say that he sort of got these predictions right, given the merger between AOL and Time Warner just after the decade's clock ran out. If so then Nostradamus was sort of right as well, given that the Spice Girls broke up.
For me, the real question is why so few people are bothered by this wild-eyed stuff, and why it has such a vast audience among members of the business community whose reputations and profits depend on predicting the future well enough to profitably invest money. Many reasons are offered, but none are persuasive:

It is argued that at least people like Gilder are trying to figure out where things are going. But lots of other people are offering predictions, and some of them -- the Institute for the Future, for example -- have serious concepts and defensible methodologies.

It is argued that people who cast doubt on Gilder-like predictions are Luddite wackos who are stuck in second-wave thinking. The ad hominem nature and simple falsehood of this assertion aside, shouldn't forms of thinking that fail dramatically to predict the future trade at some kind of discount?

Finally, it is argued that Gilder may have exaggerated the timetable for rhetorical effect, but that he at least has the directions of change right. I think he (like many others) was certainly right in his technical premise that numerous media would move to digital, but his whole story about the institutional changes that would supposedly result thus far bears no relationship to the directions of change that normal people can discern.

Rather than beat those issues any further, though, let's concentrate on the first half -- the attempt to brush off these accelerated timetables. Who first said that people tend to overestimate how much will change in two years and underestimate how much will change in ten years? I thought it was Bill Gates, who certainly said it in a book, but others have claimed that Gates was recycling it from someone else. In any case, the heightened emotional atmosphere of the computer industry has led to a kind of compressed imagination, as if it were heretical or even insane to imagine that any projected change in the world, no matter how enormous, will take any longer than a few years.
You can see the effects of this kind of thinking in Silicon Valley, which is being swept by an astonishing wave of greed by people who want to be billionaires RIGHT NOW and are trying to get their IPO launched and their Ferrari on the road as soon as they possibly can. The old school of technical people who loved the technology for itself and cared about its integrity is just about gone, and those few who think that it'll take ten years to get things right are having a hard time finding a paying audience. This effect feeds on itself: everybody in the world keeps hearing that infinitely dramatic technology-driven transformations are right around the corner, right around the corner, and so long-term planning becomes hard even to conceive. How many tremendous things will never happen because someone believed in all of these short-term predictions that, if they had come true, would have made the tremendous things moot?

Last point. The millennialism of the Next Big Thing is confined pretty much to the United States, and I wonder if this is why the momentum in research on the slightly longer-range world of ubiquitous, handheld, and invisible computing, which started with people like the late Mark Weiser at Xerox, has shifted to Europe.

Some people, I gather, are not clear why AOL belongs to the camp of "old media" rather than "new media". I've already indicated some of the reasons: it's a proprietary environment, and it is largely run by a guy from MTV. But it runs deeper than that. The precedent for the AOL / Time Warner merger that comes most easily to mind is eBay's acquisition of the "bricks-and-mortar" auction house, Butterfield & Butterfield. But an equally valid precedent, in my view, is Rupert Murdoch's purchase of the Dodgers. It's not quite the same, since both AOL and Time Warner comprise both "content" and "carrier", whereas the Dodgers (in the eyes of the old media) are just "content". But the point is, a war is intensifying between two models of the media.
On one side is the Internet model: public networks operating more or less as common carriers and serving as a shared substrate for a wide variety of media service layers and business models. On the other side is the old, proprietary model: proprietary networks owned by firms that integrate vertically into the production of content to deliver on them. The difference between these two models is very clear indeed: if you want to start Brand X Television Network in the public-network world, you just do it; but if you want to start Brand X Television Network in the proprietary world, you have to do a deal with Rupert Murdoch or Steve Case. This is the distinction that EFF articulated in its early days, and it is just as valid today.

The economic case for the closed, proprietary model was clear enough: so long as distribution systems were large, cumbersome, and scarce, every content producer needed guaranteed access to its own system, and the only way to guarantee access was to build or buy a system of one's own. Having established a distribution system, one then had to fill it with a guaranteed supply of content. Companies thus found a natural size, and that size was very large. This is the same model of industry structure that Alfred Chandler applied to the whole growth of the modern firm in "The Visible Hand: The Managerial Revolution in American Business" (Harvard University Press, 1977). Now we have a deeply institutionalized business culture that works as hard as it can to replicate that model, and the question is whether new technologies undermine it.

The economic case for the open, public model would seem clear enough: moving bits from point A to point B is a generic enough functionality that a huge variety of business models can be built on top of it, and each of those models benefits by sharing facilities with all of the others.
And since there is no longer any danger of losing access to distribution channels for one's expensively produced content, it is no longer necessary to own one's distribution system.

The glaring problem with this argument, I'm sad to say, is that we do not have any public broadband networks, and we are not likely to get them soon. It's still a hard problem, and while we are hoping that the problem can be solved by simply scaling up the model we have now, nobody honestly knows whether that's going to work. We'll see action in music distribution soon, blessed be, but George Gilder's vision of public digital networks replacing television is a whole other story.

In the meantime, the major broadband action is happening fully within the old-media model on proprietary cable systems. WebTV (Microsoft) and Excite@Home (controlled by AT&T) are real, and if the AOL / Time Warner merger goes through (far from a sure thing) then that will make at least three. (I insist on calling it a "merger", by the way, because I see no evidence that AOL has really "acquired" Time Warner in any practical sense. For one thing, the AOL people wouldn't have any idea how to run Time Warner.)

The open, public model has another strike against it as well. When a single firm owns both the pipe and the content, the temptation to bias things toward one's own content is overwhelming. This is why there is (or was, before AOL signed up with Time Warner) a fight over "open access": a firm that owns a broadband distribution system wants to compete on the "value-added" services that the system supports, not just on the commoditized bits that it carries. And it's not just a temptation, oftentimes, but a competitive necessity: the other guys are doing it too, and their financials look all the better for it, even if their souls do not. In fact the point can be generalized.
Think of the "portal" model: having gotten a critical mass of users flowing through one's Web site, the route to increased revenues is to keep them on your site, either by integrating into the services that the people use your site to find or by collecting a linking fee from the providers of those services. In each case, the owner of the underlying platform (the cable system or the portal) is acting like a zebra mussel: affixing itself to the inside of the pipe and sucking out all of the nutrients that go by.

Open, public networks can only be established once this force is overcome. That might happen through old-media companies relaxing and seeing the light, but more likely it will happen through the maturation of the open-platform model in other sectors. What we'll see in practice, I think, is continued competition between the two models. Open and proprietary networks can be interconnected, and so services will arise that combine the two. A complicated dance will ensue, with the Internet developing most intensively in business applications while mass-market business models continue to develop most intensively in proprietary networks.

One question is the extent to which so-called portable, handheld, invisible, and ubiquitous applications will develop over IP. Right now the signs are good -- after all, those applications (as I pointed out) are happening mainly in the European public-network environment of the (cellular) telephone. Nonetheless, in the long run the biggest action is in large-scale video distribution, and despite what the enthusiasts have long assumed, I think it's clear that in that area the old-media, proprietary, vertically integrated model is going to have a major head start in the digital world.

When I sent out the latest version of "Advice for Undergraduates Considering Graduate School", you were kind enough to forward it to half the planet.
I got quite a few comments back from the article's target audience (the most frequently asked questions, "should I apply to graduate school even though I don't have any research experience?" and "how do I get into graduate school even though I have been out of school for a while?", are good ones and will be answered in the next version). But the best comment blew my mind. It was from a professor in Arizona who said that he liked my advice, but that his advice was completely different. The key to getting into graduate school, he said, was to decide who your thesis advisor should be and then form a relationship with that person on the Internet.

Oh my. I was stunned. I first wrote "Advice for Undergraduates ..." back around 1992, and even though I have revised and extended it several times, I have never taken a moment to rethink it for the wired world. I still think that its main message is at least half of the answer. Being involved in real research activities as an undergraduate is important, especially in scientific and technical fields, and one strength of a place like MIT or Caltech is the informal meritocracy of scientists that says that anybody who can hack the concepts can play, whether they have a degree yet or not. Many people are shocked at the idea that undergraduates are being "pushed" into research so early in their lives, but those people are wrong. If they ran the world then I wouldn't have been able to get away from my evil family by going to college when I was 15. Get those people outta here.

Nonetheless, my colleague in Arizona was darn right. The world has changed since 1992. Institutions have changed. Relationships have changed. When I was looking for an academic position in 1990 and 1991, it took superhuman effort to obtain a simple list of the faculty in a department that had invited me for an interview. You actually had to call some office on the telephone and see if, please?, they could send you a paper brochure or something.
("Oh yeah, lemme see, I think they ran outta those.") Now we take for granted that all sorts of information about both individuals and departments is available on the Web. I think that most of those Web sites are an aggravating mess compared to what they could be, but as with cell phones and Post-Its, it's hard to remember what life was like without them.

It's not just the Web, and it's not just e-mail. It's the culture. Just because the Internet exists, it doesn't follow that any given undergraduate will use it effectively to get into graduate school. Prodigies aside, the Internet only gets used effectively if new cultural forms exist, and if the people who need them find out about them and get socialized into using them. My colleague in Arizona is socializing his students into the new culture, the one in which the people you want to connect with are right there, provided you know how to do your homework and what to say next. It's hard to explain just how big a deal this is for the average undergraduate -- the one who didn't grow up in a household where it's second nature to just drop a line to a powerful professor in Chicago or Berkeley.

The deep fact here is that people need a clear line of sight to the place they want to go. If you want to be a doctor, you need a clear line of sight to the real life of the medical world, and you're not going to get it by watching ER. Likewise for any other profession. Take Our Daughters To Work Day, at least when it's done well, is an example of giving kids a clear line of sight to the real world of work, although the theory that boys already had that clear line of sight is screwy in my opinion. So are internships, the good ones anyway, which students just love.

Career counselors tell me that the Internet is killing the campus career center because students only willingly go to the career center to find campus jobs, the listings for which are now on the Web.
I think that reflects badly on the people who run career centers, given that most of the undergraduates I knew before I moved to a professional school were just burning to get a career, and regarded me as the last obstacle they had to overcome before people started treating them as grownups. (Of course I treated them as grownups as best I could, but it's hard when they've spent the last fifteen years learning to play mind games with teachers.) What those students wanted was the identity that comes with a job: doing real work, getting paid real money, having someone else count on you for something that matters to them, and generally being an autonomous adult. It's not even that they idealized work, although they were certainly too susceptible to (ahem) friendly advice such as, "the way to succeed is to work long hours and never compare your salary with anyone". They wanted real, and they weren't getting it at school. The secret of MIT is the aura that everything that happens there is real already, not just an infantilizing preparation or simulation or substitute. We need to spread that aura around.

And that's what we in higher education need to be doing: using the technology to redesign the institution and its culture in a way that gives students a sense of reality, in the form of a clear line of sight to the world that they want to be part of. The students need to try the reality of their planned future identities on for size, they need to talk to real doctors or astronauts or whatever it is they are planning to be, and they need to develop the realistic imagination that then confers meaning on their learning activities in the present day. We do a terrible job of this now. In fact we've got it totally backwards: we have students persuaded that learning is something they are doing instead of doing something real. Now sometimes that is partly because the students have come to research universities looking for vocational training in lines of work other than research.
But even when that's happening it's still more complicated. So what to do? I don't know for sure.

For younger students I think it would be great if we had career rock stars. Once we have infinite amounts of bandwidth, we should get a consortium of schools together to hire Hollywood celebrity promoters. Then we should get ourselves some photogenic professionals and pay them half-time to be idols. Get a doctor, a lawyer, an astronaut, a reporter, a biologist, and a librarian, take some glossy pictures of them, and make them into stars. Constantly measure their popularity, and keep finding new ones whom the kids like. Then use the full power of Hollywood image-making to show the kids their work and lives and the knowledge they needed to get where they are. Have them go on tour with slick multimedia shows, and have them answer the kids' questions. Keep track of the kids' most frequently asked questions and make professional 15-second commercials where the stars answer one of the questions. Have videos where they explain real math and science and social studies concepts and how they use them in their work.

We're so used to these things being done in a totally boring and fake way that we can hardly imagine what it would be like to do them well. Sell the concepts. That's what I do in my lectures: a high-energy sales pitch for the importance of the concepts I'm lecturing about. Make it real, and make it real precisely because kids are starving for real and know fake when they see it. Forget this whole business of making education into a game -- making it fun. That's exactly the wrong approach. And certainly don't bring back the old conservative world of education as a soul-crushing punishment for having been born. ("Building character" is back! Run screaming.) Instead, let's position education, and in fine detail, as the way to get yourself the life you want, that is, the life these charismatic professionals have.

That's for the younger kids.
Starting in high school, the problem starts to change. Younger kids can play at being a doctor, but the older kids can start to imagine their way into the social role of being a doctor. Every kind of job has a social structure, and every social structure has its day-to-day reality: who does what, what you say, the work routines, the power stuff, rewards and frustrations, and all of that. Show the world through the professionals' eyes. And teach advanced social skills -- not etiquette lessons, which are arbitrary games in gathering cultural capital, but lessons in building social networks, organizing things, scoping out political situations, and understanding how institutional structures generate incentives and then how those incentives explain the way people act.

The point is this: if someone says they want to occupy a given social role, show them how the world looks from within that role, and ask them if they like it. "If you want to be a scientist, that's great: scientists have to raise money and publish, but that's the price they don't mind paying for the sheer coolness of figuring out how diseases work, and furthermore here's how one goes about raising money, and here's a video in which a real scientist explains what a publication is and what thinking goes into writing it." The key is to parse the whole lifeworld of the scientist (or the lawyer or the reporter) structurally, looking at the relationships and the genres and the networks and the work patterns, and then show that lifeworld to the students in those analytical terms using whatever production methods and vocabulary the students find comprehensible and credible.

Then bring that stuff back into the classroom. Use it to resolve the false tension between the theoretical and vocational approaches to education: theory is reflection on practice, and to the greatest extent possible it should be reflection in practice.
Vocational training, meanwhile, should get away from the disastrous idea that a job is just the skills in a narrow functional sense, and instead redefine the job in terms of the full set of social relations and strategies that it comprises in the real world. This is hard, of course, for political reasons. Some people want to emphasize the bad stuff so as to persuade students that the world is basically evil, and other people want to emphasize the good stuff so as to whitewash certain social arrangements that could perfectly well be rearranged more rationally. But if we can tune responsibly into the students' hunger for reality, if we can somehow make ourselves accountable to that hunger, then maybe it is possible.

Identity is central: learning is not just learning to do something, it's learning to be someone. You don't just learn law; you become a lawyer. The schools we have now, unfortunately, insulate students from the processes of identity-formation. It's hard to become someone at a university unless there is a place within the university where you can somehow join into the real practice of being that someone. But if we can use new information and communication technologies to connect the places of learning in the university to the places of doing in the world, then we can start to overcome the dichotomy between learning and doing, and then we can start to connect learning to becoming.

Recommended: Friedrich A. Hayek, The Counter-Revolution of Science: Studies on the Abuse of Reason, Indianapolis: Liberty Fund, 1979. Originally published in 1952.

Friedrich Hayek was the foremost libertarian opponent of Marxism, and in this unfinished book he argues that Marx, and the general totalitarian trend of which he was part, owes less to Hegel than is generally imagined and more to a couple of characters named Henri de Saint-Simon and Auguste Comte, French forerunners of modern sociology who wrote in the first half of the 19th century.
Because he never finished the book, the case for these authors' influence on Marx remains circumstantial. In any event much of the influence would have been indirect, since Saint-Simon and Comte had a pervasive influence on European social thought but then fell drastically out of fashion as individuals due to their extreme personal idiosyncrasies. Saint-Simon and Comte originated the idea, which they called positivism, that society develops through a predictable series of stages leading to a complete reorganization along scientifically rational lines. This idea takes many forms, both in the socialist tradition and elsewhere, and it is entirely plausible that Marxism originates in the broader trend (e.g., in Feuerbach) of combining positivism with Hegel.

"The Counter-Revolution of Science" makes interesting reading now for several reasons. One is that the chain of intellectual associations that positivism first established, from technology to rationalization to central economic planning to totalitarianism, has been broken. This is perhaps the major role of the Internet in intellectual history: it happened along at the right moment to symbolize a new set of associations, from technology to decentralized economics to (what the Europeans call) liberalism. I believe that both chains of associations are totally misguided, but if you want to understand the intense fervor with which many people uphold the new chain, it helps to read enough intellectual history -- or simply to be old enough -- to recover a sense of the iron grip that the old chain once held.

An intellectual development from the French Revolution, positivism was the religion of reason. (Saint-Simon and Comte both opposed religion at first, but then both of them largely discredited themselves by inventing formal religions of reason at the end of their lives.)
Hayek, of course, emphasizes that neither author understood science; he refers to their views as scientism, and it's too bad that he did not write the subsequent history of scientism down to the present day. Hayek, writing as he did in the middle of the 20th century, consciously intended to produce an antidote to the mechanistic and scientistic Marxism that then ruled in the Soviet Union, and that consequently influenced many communist parties in the West. For all its flaws, his book "The Road to Serfdom" paints a compelling picture of the deep influence of this kind of thinking in 1940s Britain.

So it is all the more striking to see the remarkable resonance between Hayek's analysis and the critique of technology that subsequently descended both from the New Left and from a much wider range of authors, for example across the spectrum from Heidegger to Habermas in Germany. Drawing on Weber and other sources, this latter trend was equally opposed to the chain of associations between technology, rationalization, and centralized institutions. Whereas Hayek tended to identify these things with the totalitarian regimes, these other authors saw the same associations in the West, for example in the extreme scientism of systems analysis as applied to everything from urban planning to corporate finance to the planning of bombing missions in the Vietnam war. For these people, a society directed by centrally organized institutions whose claims to authority rest in scientific reason was bad regardless of whether it was called communism, capitalism, Nazism, or something else. One might well debate matters of proportionality, but the point is that much the same analysis applied in each case.
Yet Hayek, perhaps because he was an economist, focused not on scientistic abuses of reason by centralized institutions generally, but specifically on the malignant hubris of state institutions that thought they could bring into one place all of the knowledge that was required to run a modern economy. It was a subtle distinction: whereas the other critical tradition focused on representation (e.g., through the concept of reification, or through Heidegger's notion of enframing, or through the critiques of formalism), Hayek focused on knowledge (a concept which he did not feel any great need to analyze). For someone following Hayek, it would be a surprise that the New Left opposed centralism as much as Hayek did, opposing it not to the market but to (an idealized and unrealistic conception of) democracy. It would also be a surprise that the work of 1980s Eastern European dissidents such as Vaclav Havel and Adam Michnik was first brought to the West by people such as British socialist John Keane. Or that the foremost social democratic theorist of democracy, Jurgen Habermas, should organize his theory around a (hopelessly artificial) invidious contrast between technological rationalization and (what he calls) the lifeworld. Or that it should cause little comment that the collapse of totalitarianism is celebrated by a leftist anthropologist of popular resistance such as James C. Scott (see his Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, Yale University Press, 1998). Or that the main tradition of liberal critique of organizational computing (that deriving from the UC Irvine school that produced people like Rob Kling) should be organized around an opposition between rationalism and democracy. This is the context in which the Internet took form as an object of political theorizing.
The libertarian left and the libertarian right both invest hopes for decentralization in the technology, one group identifying the Internet with democracy and the other group identifying it with markets. There remains a non-libertarian left, of course, for whom the Internet is purely an instrument of capitalist domination, just as there remains a non-libertarian right, for whom the Internet is purely a vector of moral decay. This political matrix, together with the happily protean nature of the Internet as a public network, may explain why it is so hard to conceptualize what I regard as the most important social change in which the Internet participates, namely the rapid emergence of globally centralized mechanisms (institutions implemented on digital networks) -- I call them "switchboards" -- that both mediate and structure human relationships. I take this trend to be completely obvious, and yet it is not something that any of technology's conventional intellectual associations prepares us to understand. It is certainly a kind of centralization, but it is a quite different kind from that envisioned by the positivists: it has nothing to do with rationalization, and it assigns no particular political role to technologists. It includes elements of state centralization, to the extent that state mechanisms like global treaty organizations provide institutional underpinnings for it. It is primarily a phenomenon of the market, except that the market-making switchboards are more important than the market players themselves, and that the most important market players are not Adam Smith's artisans but rather globalized oligopolies that enjoy massive technology-amplified economies of scale. Of course, this new picture may prove to be just as mythical a set of associations as the others. But we won't be able to evaluate it until we get out from under the influence of those other myths, including the myth of the positivists and the myth that Hayek invented to replace it.
Hayek's myth-making is most visible in his impenetrable theoretical first section, in which he upholds what economists would call a "subjectivist" conception of knowledge. This is the idea that the social sciences take their data from subjective perceptions of the world, and that the theorist's only way of gathering data about the social world is to notice that one's own perceptions of the world relate in one way or another to those reported by a specific person. The point is to promote methodological individualism, whose opposite is the idea that collective categories such as social classes are available for observation. When Hayek argues this case in opposition to someone like Saint-Simon, it is hard to disagree. The positivists thought that they could "see" collective phenomena in a direct way, and indeed they thought that the phenomena of individual life were to be derived from these abstract observations. But like much of Hayek's theorizing, the argument only works against loons like Saint-Simon. Hayek himself employs collective categories such as "institutions" all the time -- in fact he prefers to call them "formations", just like the Marxists and for the same reason. He also views the individual as having been formed in large part by these collective phenomena, this being his alternative to the idea that we individuals can completely know and create ourselves by the exercise of our reason. And Hayek's accounts of the history of ideas likewise rely upon an only partly articulated metatheoretical framework of collective concepts. So the point of Hayek's subjectivism is not to rule out collective categories but simply to argue that they must be inferred from data that are grounded in human experience. This is hardly an unusual idea in the sociological tradition, and one could point to many versions of it that do not share his simplistic story about discrete perceptions and the other-minds problem that they raise.
Hayek's stature has been rising ever since Margaret Thatcher first waved around her copy of "The Road to Serfdom", and Hayek studies are a major fashion among those conservative intellectuals who pretend that they don't exist, the better to portray academia as having been taken over by the left. He's well worth reading. But I hope in doing so we will not remain stuck in the conceptual oppositions of another time, but instead that we can open our eyes and fashion concepts that are adequate to our own.

I had to rebuild the operating system on my Powerbook the other day (argh!), so I figured I would spend five minutes looking for useful free software to download. All I found was a program called NetCD that makes me laugh every time I use it. It's a CD player (that is, a program for playing music CD's in one's CD-ROM drive) that uses the Internet to solve the annoying problem of having to type in the song titles. If you play a CD whose song titles aren't stored on your computer, it opens an Internet connection to a database. If someone else has typed in the titles, it downloads them. If the titles aren't in the database, it prompts you to type them in and then it uploads them. Here is the URL for it, broken onto two lines:

http://hotfiles.zdnet.com/cgi-bin/texis/swlib/hotfiles/info.html
?fcode=MC17508&b=mac

What makes me laugh is partly that such programs are still so unusual. If you had asked me in 1994 how many distributed applications the average person would be running on their computer by the year 2000, I (like most people who follow this stuff) probably would have said dozens. But no: we've moved everything onto the Web, which works on a different model. We interact with Internet-based services through an application, the Web browser, that is not well integrated with everything else, and that provides no feasible way to leave a program running in the background.
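The lookup-or-contribute flow that makes NetCD work can be sketched in a few lines. This is a hypothetical illustration, not NetCD's actual protocol or wire format: I am assuming a disc is identified by hashing its track layout, the function names are my own, and a plain dictionary stands in for the shared network database.

```python
# Hypothetical sketch of a NetCD-style title lookup (not the real
# NetCD protocol). A disc is identified by a hash of its track
# offsets; shared_db stands in for the database on the network.

import hashlib

def disc_id(track_offsets):
    # Derive a stable ID from the CD's track layout, which identifies
    # a pressing well enough for a title lookup.
    raw = ",".join(str(o) for o in track_offsets)
    return hashlib.sha1(raw.encode()).hexdigest()[:8]

def lookup_titles(track_offsets, local_cache, shared_db, prompt_user):
    # 1. Titles already stored on this computer? Use them.
    # 2. Someone else typed them in? Download them and cache locally.
    # 3. Otherwise prompt the user, then upload the answer for everyone.
    key = disc_id(track_offsets)
    if key in local_cache:
        return local_cache[key]
    if key in shared_db:
        local_cache[key] = shared_db[key]
        return shared_db[key]
    titles = prompt_user()
    shared_db[key] = titles
    local_cache[key] = titles
    return titles
```

The appealing property is exactly the one described above: the first user to type in a disc's titles does the work once, and everyone who plays that disc afterward gets the titles without being asked.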
Java was supposed to provide that way, but it didn't happen because of the standards war that broke out along the boundary between the Java interpreter and the operating system. We are truly still in the Dark Ages here, and I don't see how it's getting any better.

Among the many dubious claims of the cyberspace ideology is the idea that a thoroughly wired world would be more diverse than the world we have now. After all, the proponents of this idea often say, look at all those varieties of toothpaste (or whatever) on the supermarket shelves. I disagree. Like most aspects of the cyberspace ideology, this prediction of network-amplified diversity is true in a misleading and superficial way and wildly false underneath. To see why, it helps to distinguish between two kinds of diversity, deep and shallow. Deep diversity is diversity among things that have arisen independently of one another in different cultural and institutional settings. Shallow diversity is diversity that is generated from a common framework, such as modular components that can be assembled in different combinations or parameters that can be set to different values within one device. These are obviously ideal types -- extreme cases -- and in practice they combine in various proportions. Human languages, for example, exhibit shallow diversity to the extent that they are constrained by the innate structures of the brain, and to the extent that they still bear the marks of their descent from a single language millennia ago. And they exhibit deep diversity to the extent that they have evolved in largely independent societies ever since. Deeply diverse things are incommensurable: it is hard to compare and contrast them because they are defined in qualitatively different terms and because they are bound up with everything else in their environments in qualitatively different ways. And shallowly diverse things are, by definition, commensurable: they are, in a profound sense, made of the same stuff.
In our brave new wired world, we are rapidly replacing deep diversity with shallow diversity. Economies of scale encourage the adoption of globally uniform standards in every area of life, and the demands of compatibility in a networked economy do the same. Here are some examples:

* Technical standards are not necessarily the enemy of diversity: when you move from a deeply diverse maze of incompatible special-purpose networks to a globally standardized network such as the Internet, you create the conditions for an explosion of diversity: the shallow diversity of applications that can be built on top of the Internet.

* As people stop speaking local languages and start speaking Spanish, French, Chinese, or (especially) English, the resulting mass death of deeply diverse human languages creates larger audiences in the global languages, which then provide the conditions for a greater variety of cultural expression in those languages -- shallow diversity.

* As industrial distribution systems become more integrated and information-intensive, it becomes possible to pump a greater variety of goods through them, provided that they are packaged and labeled and tracked in uniform ways; this is shallow diversity on another level.

* Mass customization, though slower in arriving than its proponents had predicted, is the pure essence of shallow diversity.

* Treaty organizations such as the European Union and the World Trade Organization encourage nations to "harmonize" their institutional systems. This creates economies of scale for global production and increases the compatibility that encourages trade. Deep diversity is reduced, thus creating the conditions -- a larger overall market that can be subdivided into niches -- for more shallow diversity.

* The central problem faced by global companies is where to draw the line between global standardization (which increases efficiency) and local diversity (which aids competition in particular markets).
It's a complicated problem, but one broad pattern is that it's easier to standardize the things that people cannot see. Thus the famous case of the new Volkswagen Beetle appealing more to Americans than to Germans. This is not surprising in retrospect, given the different cultural meanings that the old Beetle had acquired in the two countries. But it's part of a pattern: have the industrial designers build a shallow diversity of cars on the same underlying chassis and drive train. McDonald's might add seaweed to the menu in Japan and eliminate beef in India, but they standardize the framework as much as they can.

So here's the hard idea: the wired world can bring us a cornucopia of diversity, but it's mostly shallow diversity, and we get this shallow diversity only by obliterating the deep diversity that makes life most worth living. As with diversity in ecosystems and organisms, genuine diversity in institutions, cultures, and technologies is an important resource. We cannot know what social and technical problems will need solving a hundred years from now, and deep diversity is a reservoir of potential starting-points from which solutions to those problems can grow. If we lose that diversity then we will drown in the white noise of superficial differences. Our society may become more efficient in a shallow sense, but it will also become more brittle. And more boring. We will prosper on the outside, for a while anyway, but on the inside we will die.

Some URL's.
Digital Libraries 2000
http://www.dl00.org/

i3 Annual Conference
http://www.i3net.org/btt/

Workshop on Software Engineering for Wearable and Pervasive Computing
http://www.cs.washington.edu/sewpc/

CoDesigning 2000
http://dougal.derby.ac.uk/drc/co-design/

Patterns of Disappearance Workshop
http://ecate.itc.it:1024/SpringDays00-WP12/

workshop on mobile ad-hoc networking
http://www.xrce.xerox.com/research/ct/invisible/PBN-workshop-CFP-i3springdays2000.html

Research Directions in Situated Computing
http://www.daimi.au.dk/~mbl/chi2000-sitcomp/

Blocks Protocol for XML meta-data on the Web
http://mappa.mundi.net/cartography/EDGAR/

Norwegian teenager who hacked DVD is arrested
http://www.aftenposten.no/english/local/d121152.htm
http://www.wired.com/news/business/0,1367,33889,00.html
http://cnn.com/2000/TECH/ptech/01/25/dvd.charge/

EFF press release on the case
http://www.eff.org/IP/Video/DeCSS_prosecutions/Johansen_DeCSS_case/20000125_eff_johansen_case_pressrel.html

DVD copy control fight
http://slashdot.org/article.pl?sid=00/01/24/118240&mode=thread
http://cryptome.org/dvd-hoy-reply.htm
http://www.eff.org/IP/Video/DVDCCA_case/20000114_gilc_dvdcca_statement.html

good article about alt.tv.simpsons
http://www.salon.com/ent/tv/feature/2000/01/24/simpsons/

Rethinking Public Key Infrastructures and Digital Certificates
http://www.xs4all.nl/~brands/

A Piece of Blue Sky: Scientology, Dianetics, and L. Ron Hubbard Exposed
http://www.cs.cmu.edu/~dst/Library/Shelf/atack/

Steve Bell's Deep Sheep
http://www.newsunlimited.co.uk/flash/0,5860,116054,00.html

Eric Nee predicted the AOL / Time Warner merger back in April
http://www.pathfinder.com/fortune/technology/daily/0,3467,617990430,00.html

Enron creating a market for IP bandwidth
http://www.pathfinder.com/fortune/technology/2000/01/24/eco.html

Microsoft's Proposed Conclusions of Law
http://www.microsoft.com/presspass/trial/p-col/col.asp

Survey: US Leans on Foreign Government for Trade
http://www.bouldernews.com/news/worldnation/21atrad.html

Inter Press Service
http://www.ips.org/World/

Presence Forum
http://www.presenceweb.org/frameset.php3?_hoofdmenu=discussion+forum

A Behavioral Approach to Law and Economics
see index entry number 55
http://www.law.uchicago.edu/Publications/Working/

Cyberterrorism Hype
http://jir.janes.com/sample/jir0525.html

Stolen car brought to a halt by satellite
http://www.thestar.com/thestar/back_issues/ED20000119/news/20000119NEW01d_CI-STOP.html

Brown and Williams Consider Alliance With New Online-Education Company
http://chronicle.com/free/2000/01/2000011901u.htm

people organizing against US Forest Service access fees
http://www.wildwilderness.org/

Denise Caruso's electronic commerce columns
http://www.nytimes.com/library/tech/reference/indexdigicom.html

Open Directory Project (open-source competitor to Yahoo)
http://dirt.dmoz.org/

end