Some quick notes, including a great deal of thoughtcrime that should be
kept away from young children.  Is this a great mailing list or what?


As a reminder, answers to frequently asked questions about RRE can be
found through the RRE web page, http://communication.ucsd.edu/pagre/rre.html


Remember "push" technology?  Gosh, it seems like two months ago already.
Seriously, why have 1000 times more people heard of "push" technology than
IP Multicast?


Slowly but surely, the Internet world is talking itself into a regulatory
framework to govern Internet life.  The turning-point, I think, has been
the fight at the FCC over flat-rate telephone service and other issues
relating to the local phone companies' complaints about Internet users.
Although I support flat-rate local phone service, I have to say that I'm
a little disappointed with certain Internet people who envision all sorts
of futuristic electronic commerce scenarios in which everyone pays for
everything incrementally using micropayment systems -- what Vinny Mosco
called "the pay-per society" -- but who then turn around and resist that
same principle when it applies to their own use of the Internet.  These
folks want a la carte for everyone else, but the buffet for themselves.

Along the way, we're also starting to see some realism poke its nose out
from under the ideology that has long obscured the Internet community's
discussions of regulatory issues.  For example, in Declan McCullagh's
column in the June 1997 issue of Wired (page 183), we read:

  the digital nation has misidentified its foe.  As a rule, Washington's
  bureaucrats are not power-crazed authoritarians; most are reactive
  creatures who simply respond to demonstrations of influence and power.
  Bell Atlantic, PacBell, Nynex, et alia leaned hard on the FCC for access
  fees, and the agency reacted in its own instinctively bureaucratic way.
  The high tech community responded by forming its own ad hoc coalition to
  pressure the FCC, and thousands of Internet users chimed in to express
  their collective dismay.  ... the real threat to netizens has come from
  complacent telcos and their legions of starched-collar lobbyists, not
  the FCC.  The distinction is important, because the old rule of thumb
  holds true: The enemy of our enemy may occasionally prove to be our
  friend.

This would be the most unremarkable common sense in any sane decade.  Of
course, the passage that I removed with an ellipsis does read as follows:

  Of course, the best way to win not just the battle but the war may be
  to remove the commission's power to regulate the net altogether.  Still,

The old fantasies are still there.  The fact is, however, that now that
Uunet and its brethren have begun in earnest to behave like an oligopoly,
and now that the Internet's peering system is starting to collapse, we're
beginning to hear large ISPs call for regulation.  The FCC, having
heretofore made extra sure that nobody thinks it wants to regulate the
Internet, is suddenly starting to stroke its bureaucratic beard and allow
that maybe some type of regulation is required after all (New York Times,
5/12/97).

What's the next step?  It seems like the large ISPs -- the operators of
large national and global backbones -- are moving to treat the small ISPs
as customers rather than peers.  Some say that these organizations are
just passing along their costs in the normal market fashion, and that the
protesters might as well oppose the law of gravity.  Others point out that
these companies compete with the smaller ISPs for customers; they describe
the situation in terms of restraint of trade.  In a sense it doesn't
matter who's right, since even the large ISPs' defenders are effectively
arguing that the Internet is inherently centralized.  This argument has
a lot of economics going for it, given that the Internet is pretty much
the textbook definition of a natural monopoly.  (Don't blame me.  I didn't
write those textbooks.)

Once the oligopoly is in place, how long until the large carriers start
merging?  Maybe they'll have to, for several reasons.  For example, if
one carrier becomes a dominant player in the market to provide Internet
service to large businesses then it can start offering quality-of-service
guarantees within its own system, thus creating network externalities
for its customers.  Can someone explain why this won't happen?  I've sat
through a lot of lectures about the inherently decentralized nature of the
Internet and all the geodesic this-and-that, including lectures by people
who claim to be authorities on telecom economics, but I still haven't
gotten anything like an explanation of why the Internet is any different
in its fundamental economics from the phone system.

The fight over domain names illustrates what I mean.  Domain names are a
lot like telephone numbers, and the fight over domain names is a Keystone
Kops parody of the longstanding dispute about phone numbers in analyses
of telecom competition.  The level of ideology here is amazing.  First the
government, seized by the private-good-public-bad logic that substitutes
for thought these days, decided to privatize the issuance of domain names
-- by simply handing over the function to a private firm.  This isn't just
happening with domain names, of course.  It's also happening with water
utilities and all sorts of other essential services worldwide.  But let's
focus on domain names.

Government services might be clunky and drab, but at least they are
subject to some kind of democratic accountability, for example through
being hassled by members of Congress when they make people mad.  Private
monopolies operate under no such constraints, and in any normal world we
would be unmystified by the dramatic deterioration in domain-name service
since the government stopped doing it.  If you're going to privatize
domain name service, or anything else, you need to create the conditions
for a fair market.  That's what telecom policy is all about.  If you
want a fair market in phone service, people need to be able to move their
phone numbers from one carrier to another without significant expense.
Likewise, if you want a fair market in domain service, people need to be
able to move their domain names from one service to another.  Doing this
neatly and cleanly is a very technical matter, and it turns out that small
details of the rules can make a big practical difference in the fairness
of the marketplace.  And so why is this knowledge not being
applied to the Internet?  Because the myth of the "vast and unregulated
Internet" keeps short-circuiting our brains.

So what's the deal here?  Do I favor centralization?  Is that why I keep
making these inconvenient arguments?  No, I believe in a decentralized
distribution of power in society.  What I disagree with is arguments that
technological developments will, all by themselves, bring us that kind
of society.  Those arguments are, so far as I can tell, pure stipulation.
And what's this stuff about regulation?  Am I in favor of regulation?
Regulation isn't inherently good; I do not have, as one Internet rhetor
put it, a psychotic need to control people.  I want a telecommunications
infrastructure that can support a democratic society.  In pursuit of that
goal, I think hard about the interactions between technology, economics,
and policy, I write down whatever conclusions I come to, and I hope that
democratic societies care enough to take the effort to stay democratic.


I was dismayed a while back, you may recall, to find myself on a Web
page of "celebrity e-mail addresses".  Since then, I have been through a
whole elaborate drama.  The maintainer of this page refused to remove my
name from it, and has failed to respond to numerous additional requests.
Meanwhile, new such lists keep popping up, mostly copied verbatim from
existing lists, and I continue to receive a steady stream of requests for
autographs from people who are obviously spamming every celebrity address
they can get hold of.  I've developed a routine: when I get an e-mailed
request for an autograph, I respond by explaining that I am not any kind
of celebrity, do not respond to autograph requests, do not even own a
photograph of myself, and do not have a secretary or publicist to filter
my mail for me.  I also ask for the URL of the Web page from which the
requester got my address.  Sometimes the person responds with a URL, and
in such cases I track down the maintainer of that page and request to be
taken off it, and sometimes the maintainer actually replies and does the
right thing.  Mostly, though, I get no response at all.

Although it would be wrong, I feel a thirst for revenge.  Consider, for
example, http://www.ingress.com/~wrp/email.htm , a Web page maintained
by someone whose e-mail address is wrp@ingress.com.  This is the one who
explicitly refused to remove me from his page.  If I were a bad person
then I would ask you to send this guy a note and ask for his autograph.
You might tell him, for example, that you're going to auction autographed
pictures to raise funds for your school.  Or you might tell him that he's
your biggest hero in the whole world.  (There would be no need to mention
his name or exhibit any knowledge of who he is.)  You might even attach
a photograph of yourself and implore him to get you a role in a movie.
You would spread these messages out, so that he received a few of them
every day.  That way he would know what it's like to be listed on his Web
page.  He's making a business out of this service -- his page includes
advertising, or at least he's soliciting advertisements -- and it's not
reasonable for him to profit by creating nuisances for innocent people.
But his offense somehow doesn't seem like it's in the same category as
spamming, so I'll refrain from further increasing the general level of
unpleasantness on the net.


In my message about the FTC privacy hearings, I did not mean to imply that
no privacy lobby has offered comments.  The comments from the Electronic
Privacy Information Center are a good source of arguments for privacy
warriors: http://www.epic.org/privacy/internet/ftc/epic_comments_497.html


My most recent rant about Microsoft was certainly over the top in its
rhetoric, but long-time readers of this list are aware of the substantive
analysis that backs it up.  My message was apparently posted to some
other lists, whose members wrote me all sorts of poignant messages of
the sort that result from lack of understanding of economics.  Some of
these messages included detailed catalogues of every mistake that MS's
competitors have ever made.  These analyses miss the point, however, until
they explain in structural terms why such mistakes could lead to such
catastrophic and irreversible defeats.

Particularly poignant were the messages from the good and talented people
who work in the trenches of MS, talking with users and making deadlines
without seeming to understand their own company's strategy or the economic
dysfunctions that allow that strategy to work.  If all you've heard is the
usual stuff about markets -- the quaint 18th century theory of supply and
demand that has little to do with the high-technology markets of the 21st
century -- then any claim that inferior products dominate a market will
inevitably seem to imply that customers are stupid, that employees are
stupid, or that big bad government is meddling.  It is not so, and none of
these implications has any part of my argument.

Nor are MS's products uniformly bad.  MS's componentization and object
strategies are aligned with broad trends in the industry, and are just as
competent as the competition's -- just more proprietary.  The stuff about
market dysfunctions is just a broad generalization; each market, and each
technical feature, has to be analyzed in its own terms.  And, like most
people, I expect big things from the world-class people that MS has been
hiring in the last year or so, particularly in computer graphics.  Dollars
to donuts says those big things will employ proprietary standards whenever
possible, but they will be big nonetheless.  The point, then, is not that
MS produces bad software through any kind of character failing.  Nor is
the point that the market is specifically selecting for poor quality.  The
point, rather, is that the dysfunctions of high-technology markets often
reward other things besides quality, for example the highly developed
strategies by which proprietary standards come to be entrenched in markets
due to the obscure wonders of network externalities.  This is not the only
force in the market, but neither is it a marginal force.

Recent reporting on Microsoft: Ken Auletta has a longish article on Nathan
Myhrvold's legendary memoranda in the 5/12/97 New Yorker (they really
didn't see the Internet coming), and the 5/26/97 issue of Fortune includes
a good article on Microsoft's NT strategy.  Look at MS's strategy with
both Windows and Office: start with a system that enjoys overwhelming
market share and whose position is reinforced by network effects, then
successively add components to that core system that compete with
applications companies, compensating if necessary for the relatively poor
quality of those components by drawing on immense cash reserves and a vast
distribution system to cut prices until network effects take hold.  Next,
imagine what would happen if MS could generalize that strategy to NT and
BackOffice.  If you can't get the word "scalability" out of your head then
you get the picture.


A directory of anti-Microsoft Web sites can be found at
http://www.geocities.com/SiliconValley/Pines/3334/super.html
I can't say that I was enlightened by the few that I clicked on at random,
but maybe others will find something of value.


I recommend an interesting online newsletter called "Above the Crowd",
edited by J. William Gurley of Deutsche Morgan Grenfell and described as
"a bi-weekly publication focusing on the evolution and economics of the
Internet".  To subscribe, send a message to atc-request@abovethecrowd.com
with the word "subscribe" in the body.  I would forward an issue to RRE,
but it's copyrighted and I haven't gotten any response to my request for
reprint permission.


Peter Neumann's Congressional testimony on the issues raised by the flap
about the Social Security Administration's Internet system can be found at
http://www.csl.sri.com/neumann/ssa.html


Some people are complacent about spam; their argument is that if nobody
replies to the spammers' advertisements then the spammers will get tired
and go away.  I disagree with this argument, for two reasons.  The first
is that it fails to distinguish between spam services and their clients.
A spam service might sell spam software or lists of e-mail addresses, or
it might send out its clients' messages to an agreed number of addresses,
or it might perform some combination of these chores.  Anybody who's low
enough to send spam is low enough to mislead potential clients with tales
of instant riches over the glamorous Internet, and there are lots of naive
people out there with dollar signs in their eyes.  Those naive people
aren't talking to one another, and I doubt if they'll spend much effort
publicizing the ignominy that resulted from their adventures in spam.

The second and more surprising reason is that some people do actually buy
stuff from spammers.  Really.  I've corresponded with them myself.  I ask
them why, and they shrug and say, roughly, "it was something I wanted, the
price was right, and the offer was right in front of me, so why not?".  I
guess that some people's mamas never taught them anything, so here's why
not.  One reason is practical: when you get an offer through spam, you can
have little confidence that it's legitimate.  Evidently big bad government
has been cleaning up the marketplace for so long that some people think
nothing of writing a check to a company they've never heard of and mailing
it to a PO Box in another state.

But the most important reason not to buy from spammers is that it's wrong.
That's right -- it's immoral to buy anything from a spammer.  Why?  Paper
mail advertising, while annoying, has a built-in reality check: it costs
something like $0.75 to send a direct-mail advertisement, so that response
rates of a few percent are needed to recover the costs.  Think about it:
When you buy something in response to a junk-mail letter, you have to pay
for about thirty other people's letters before you get anything of value.
The economics of spam are fatally different.  Spam costs something like
$20 per 100,000 messages to send, and probably much less for the big operators.
(I haven't seen any formal studies of the spam market; that's just my
informal impression.)  If those numbers are anywhere near right, a spammer
can profit with a response rate many thousands of times lower than a paper
letter.  The reason that spam is offensive, of course, is that it imposes
a cost on the people who receive it.  This cost includes the time and
effort of skimming and deleting the messages, and for many people it
includes actual money costs for connect time and storage.  Thus, when you
buy something from a spammer, you are effectively receiving stolen goods:
the time and effort you saved were stolen from the thousands of other
people who received that same spam message involuntarily and received no
benefit from it.
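
For the arithmetic-minded, here is that comparison as a back-of-the-
envelope sketch in a few lines of Python.  The cost figures come from the
paragraphs above; the profit-per-sale number is purely an assumption of
mine, so pick your own.

  # Back-of-the-envelope break-even response rates for paper mail versus
  # spam, using the cost figures from the text above.  The profit-per-sale
  # figure is an arbitrary assumption, chosen only for illustration.

  COST_PER_LETTER = 0.75             # dollars per direct-mail letter
  COST_PER_SPAM = 20.0 / 100000      # dollars per spam message
  PROFIT_PER_SALE = 25.0             # assumed gross profit per response

  def break_even(cost_per_message):
      # Response rate at which revenue just covers mailing costs.
      return cost_per_message / PROFIT_PER_SALE

  paper = break_even(COST_PER_LETTER)
  spam = break_even(COST_PER_SPAM)
  print("paper mail breaks even at %.2f%% response" % (paper * 100))
  print("spam breaks even at %.5f%% response" % (spam * 100))
  print("ratio: %d" % (paper / spam))

Under these made-up numbers the paper mailer needs a 3% response rate and
the spammer needs well under a thousandth of a percent -- several thousand
times lower, which is the whole point.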

Of course, this argument has limits.  I don't think people who respond to
paper junk mail are thieves in any sense worth worrying about, since the
scale of wrongdoing is orders of magnitude less.  Besides, contrary to
what most people say, my experience has been that I know exactly where the
junk mailers have gotten my name, and I find that I can control my junk
mail quite well by registering with the DMA's "mail preference service"
and consistently instructing magazine publishers, charities, and catalog
merchants to keep me off mailing lists.  I still think self-regulation in
the direct mail industry has been a failure, if only because of the amount
of effort needed to stay off the lists.  And my life is uncomplicated
compared to many people's: I have neither property nor kids.  In any case,
the capital required to send direct paper mail does encourage the direct
mailers to maintain, on the broad average, a certain level of legitimacy.
Spamming, on the other hand, has extremely low barriers to entry, which is
presumably why spammers seem to consist predominantly of the scum of the
earth.

Nonetheless, the spammers have grown sufficiently numerous and organized
that they are actually trying to justify themselves.  You've probably
seen some of these arguments.  What strikes me is their similarity to the
arguments offered by organized invaders of privacy: high-minded appeals
to freedom of speech, attempts to blur the meanings of words, promises
of self-regulation, and suggestions that the only real trouble is caused
by those rude people who object to their practices.  We need to put the
spammers permanently out of business before they get big enough to hire
"public affairs" departments that mysteriously manage to persuade elected
officials that these arguments make sense.  Once we see the first annual
Cyber Promotions Public Opinion Survey on Commercial Communications in
Cyberspace, or the first corporate lobbying group composed of former civil
liberties activists offering "compromise" legislation in the matter, it's
all over.


I had an especially bad run of spam the other day.  First it was the
get-rich-quick scheme, then it was the pornographer, and then it was -- no
kidding -- amazon.com books.  Yes, amazon.com books spamming its customers
with the news of its now having 2.4 million or 3.1 million or 1.8 million
books, or something.  Well, forget that.  I've cancelled my amazon.com
account, including a $100+ order I had placed last week, and moved to
alt.bookstore (http://www.altbookstore.com).  It has lower prices -- about
7% lower on the order I cancelled at amazon.com.  Its interface isn't as
refined, and it doesn't seem to list as many books that haven't yet been
published.  But it hasn't sent me any spam.


Upon arising one morning and settling at the computer terminal in my
kitchen, I had a curious urge to look at the Christian Coalition Web site.
Sure enough, the press release announcing Ralph Reed's resignation had
been posted hours earlier.
http://www.cc.org/publications/ccnews/ccnews97.html#resign


You may recall my recommendation of the February 1997 issue of Internet
World.  I had never paid any attention to IW before, but that particular
issue presented such a useful summary of ongoing Internet standards
battles that I actually subscribed.  Unfortunately, that issue turns out
to have been a fluke.  The product evaluations possibly aside, the level
of analysis in IW has been uniformly poor.  They seem to have no awareness
of the institutional dimensions of Internet work, for example the need
to integrate Web page design with an organizational communication strategy.
They also keep repeating the half-truths and outright falsehoods that the
Lexis-Nexis people spread around back when their P-TRAK system was being
roasted on the net.  The last straw, though, was their interview with Jim
Manzi in the current issue.  Jim basically asserts that he's the only guy
who has thought of building infrastructure to support business-to-business
commerce on the Internet, and the IW people didn't even begin to call him
on this.  He has the concept at a very abstract level, but he never begins
to articulate a strategy that would get him there.  Well, as you
probably read, it turns out Jim's strategy had already been crashing while
IW was in press, so that Nets Inc -- which Jim had been funding out of his
pocket for a while there -- declared bankruptcy shortly after IW came out.

Can we have some actual reporting in this industry?  It's hard, I realize,
when the whole modus operandi of the IPO artist and would-be definer of de
facto standards is -- with perfect economic rationality -- to pump the old
hype-o-meter.  But the analytical tools do exist now, and organizations
with large sums of both money and expertise (not just one or the other)
do get better analyses than the readers of IW.  If you should lack for
either money or expertise, the good news is that you can now almost do
it for yourself.  From what I can tell, high technology industries can
be comprehended with great sophistication through constant study and the
application of precisely two sets of ideas.  The first set consists of
the economics of information and standards (the qualitative parts, not
the analytics), and particularly the strategic consequences of network
effects.  I've reviewed this literature here on several occasions, and it
really does explain a great deal.

The second set of necessary ideas can be found in the earth-shattering
books of Geoffrey Moore, "Crossing the Chasm" and "Inside the Tornado"
(both HarperCollins, 1991 and 1995 respectively).  I reviewed "Chasm"
with bemused respect in TNO 3(5), but only in the last few months have
I come to a full appreciation of it.  Even forgetting the parts he got
from Everett Rogers, it's still an extraordinarily original piece of work.
It explains, among other things, part of what happened to Nets Inc, which
just couldn't cross the chasm between the easy money of the early adopters
and the hard money of the companies that want real, complete solutions
to their problems.  Not having taken the effort to establish themselves
in strategically chosen niche markets, they had neither the positioning,
the knowledge, nor the cash flow to ready themselves for the transition
-- a transition whose time, in the case of Nets Inc's market, has not yet
come -- to the period of explosive growth that Moore calls "the tornado".
(Denise Caruso's column in the 5/19/97 New York Times includes a good
analysis of this, including a review of NI's competitors' strategies.)
Everyone whose life is affected by high technology should read these
books before it's too late.  If necessary, I will personally come to your
office and pound my fists on your desk until you do.  When you get done
with them, you will realize that you had formerly known nothing about the
computer industry.  Either you or Jim Manzi.


John Ousterhout has an interesting rant against Java at
http://www.sunlabs.com/people/john.ousterhout/scripting.html


The Usability Professionals' Association is at  http://www.upassoc.org/


As a frequent user of the Lynx Web browser for plain-ASCII terminals,
I've found the Web becoming less and less Lynx-friendly, and in more and
more gratuitous ways.  Now, thanks to http://world.std.com/~adamg/we.html ,
I have a name for this syndrome: these sites have been dehanced for Lynx.
Check out their links to particularly dehanced sites, as well as their
style tips for Lynx-compatibility.


It has also been pointed out to me that every child on the planet has
already heard my panda joke.  It has been pointed out to me that pandas
are neither arboreal nor marsupial.


Recommended books.

Isaac Kramnick and R. Laurence Moore, The Godless Constitution: The Case
Against Religious Correctness, New York: Norton, 1996.  In an era when
some religious conservatives argue that the United States Constitution
was designed to embody Christian principles, it is refreshing to read
the words of the 18th century religious conservatives who argued bitterly
against the Constitution on the grounds that it did no such thing.

Uwe Poerksen, Plastic Words: The Tyranny of a Modular Language, translated
by Jutta Mason and David Cayley, University Park: Pennsylvania State
University Press, 1995.  A wonderful rant against the evolution of
language toward bloodless abstractions like "system" and "communication".

John J. Gumperz and Stephen C. Levinson, eds, Rethinking Linguistic
Relativity, Cambridge: Cambridge University Press, 1996.  A smart and
learned (but horribly copyedited) collection of articles that brings
evidence and theory to bear in evaluating the linguistic relativity
hypothesis, the controversial idea that the grammatical forms of
particular languages influence perception and thought.

Peggy V. Beck and A. L. Walters, The Sacred: Ways of Knowledge, Sources of
Life, Navajo Community College Press (Tsaile RPO, Navajo Nation, Arizona
86556), 1977.  A sophisticated introductory textbook on Native American
religion, placing traditional shamanism, the Ghost Dance, and the Peyote
Religion (among other things) in historical, cultural, ecological, and
religious context.


Recommended records, all of which can be described as "folk".

Dick Gaughan, Handful of Earth (Green Linnet, 1991).  A powerful set of
Scottish folk songs invoking historical memory to uphold the dignity of
working people.

Maddy Prior, Year (Park, 1993).  A wise and gentle reworking of tradition
by the leader of the modern English folk revival.

Steve Earle, I Feel Alright (Warner, 1996).  Excellently produced electric
white-boy blues by a guy who's wrestling full-time with the devil himself.

Butch Hancock, Own the Way Over Here (Sugar Hill, 1993).  An excellent
songwriter from Texas whose great intelligence sits in the shade of a
good-ol'-boy persona.


Not recommended: dc Talk, Jesus Freak (Virgin, 1995).  You'll remember my
recommendations of various exponents of Christian rock.  Well, dc Talk's
excellent single, "Just Between You and Me", has been inescapable for
some time now, so I followed my politico-religious curiosity and bought
the record.  It turns out dc Talk is actually a bad soft metal band whose
other songs, evidently recorded on a four-track in someone's basement,
do not remotely resemble the single.  I guess that even Christian bands
sometimes feel compelled to sell out to record-label marketing tactics.


The rest of this message consists of a draft note that I wrote at the
beginning of the year but never finished.  It's not quite right, and the
writing isn't so great either, but maybe it's good enough to circulate
in unfinished form...

In school I learned about this cool thing called "science".  Here's how
it works.  If someone has a theory and wants everyone else to believe it,
they draw out some consequences of the theory, called "predictions", and
if the predictions don't come true then the theory is wrong.  New Year's
Day is a convenient time for rounding up the predictions from the previous
year, so let's consider one of those predictions now.

You may recall that Bob Metcalfe, godlike inventor of Ethernet, predicted
that the Internet would collapse during 1996.  If you're reading this
message then we can cautiously conclude that it didn't come true.  Bob's
theory concerned the economics of Internet pricing: flat-rate pricing, he
suggested, would lead people to overconsume Internet bandwidth, producing
shortages and collapse; it follows that the Internet needs a pricing
scheme (per-packet or congestion-based or whatever -- my arguments won't
require me to distinguish between them) to allocate its capacity by market
principles.  So what's wrong with Bob's theory?

Well, some would argue that his prediction did come true.  Bob himself
is quoted discussing the topic in a Reuters article today.  He suggests,
for example, that the long delays people experience on the "World Wide
Wait" are due to Internet congestion.  But those delays are often due
to queueing delays in individual servers (including domain name servers),
not router congestion.  They are also due to the bandwidth of the user's
"last mile" connection, often through a telephone line, which limits how
quickly a large file can be downloaded, quite independently of throughput
on the Internet proper.  Imposing prices on individual servers can, and
presumably already does in some cases, regulate demand for their use, but
that's different from imposing prices on routed packets.  He also points
out that outages can be caused by wire breaks and bugs in routers, but
that's also irrelevant to the issue of capacity allocation and pricing.
The Reuters article then refers to the AOL shutdown over the summer and
Stanford being cut off from the Internet in October, but neither of those
problems had anything to do with congestion either.  Furthermore, packet
pricing would not eliminate congestion; it would only reduce congestion
to the level that people are willing to pay for, and it would presumably
make possible the creation of different levels of service quality with
different prices.

The article reports that "experts such as Metcalfe predict similar
types of outages next year as telecommunications companies and Internet
providers struggle to keep up with demand".  But this struggle would
still be taking place if users paid for Internet routing by the packet,
given the rapid increase in demand.  Is that increased demand a result
of economic distortions that underprice the service?  If it were, then the
companies wouldn't be investing all that money to provide the services.
Will these experts eventually suffer the same mockery as the scientists
who have been predicting for however many decades that the whole global
ecosystem will collapse real soon now?  I hope not.  What I hope is that
everyone recognizes that modern information and communications services
operate by very different rules than the commodities that motivated the
simple classical supply-and-demand stories in the 18th century.  Many
economists already recognize this, and I hope it's their thinking that
is reflected in coming generations of the Internet architecture and not
the ideology of those who simply cannot stand to see anybody eat a
free lunch.  Of course, I'm arguing with a newspaper article here.  Maybe
the liberal media have tried to make Bob's argument sound stronger than
it is.  [And, I should note here in May, Bob did eat his words, just
as he promised.]  My point is simply that we shouldn't slip into an easy
equation between technical difficulties and delays on the Internet and the
need to erode the practice of flat-rate pricing.

The issue of flat-rate pricing has arisen on several fronts this year.
Some of the regional phone companies, for example, have been claiming
that flat-rate service is allowing Internet users to cause uneconomic
congestion in local-loop phone systems.  Computer industry types dispute
their numbers.  An instructive text here is Milton L. Mueller and Jorge
Reina Schement, Universal service from the bottom up: A study of telephone
penetration in Camden, New Jersey, The Information Society 12(3), 1996,
pages 273-292.  Bell Atlantic hyped this study to the skies when its
results first became public a couple of years ago, presumably because
it would seem to undermine one of the most common arguments for universal
service cross-subsidies, namely that the poor could not otherwise afford
telephone service.  Mueller and Schement actually went and talked to some
poor people about telephone service, and they found that these folks had
some quite rational reasons not to want a telephone.  The relevant reason
for present purposes is that it is hard to control long-distance charges.
If you're poor then you don't have any money in the bank, so that any
unexpectedly large bills can cause calamities.  For example, if you're
poor then you probably have some relatives whose troubles are worse than
yours.  You're likely to feel an obligation to put these relatives up
occasionally, whereupon they can perhaps stick you with a large phone
bill, whereupon you can miss your rent payment and end up on the street.
Much better to put your money into something whose costs you can predict,
like cable TV.

Okay, so much for universal service subsidies.  (Maybe.)  But what about
flat-rate pricing?  It is very commonly argued that emerging information
and communications technologies dramatically lower transaction costs
and thus make it possible for numerous markets to move from flat-rate
to per-unit pricing, or from a free public service to a paid private
service.  Maybe one side-effect of this shift is that households' expenses
become ever harder to predict; the poor will suffer either by having to
forgo those services or through the increased risk of sudden unexpected
insolvency.  Issues of "haves" and "have-nots", in other words, are not
just matters of distribution; they may also be affected to a significant
degree by pricing structure.

As serious telecommunications policy people are well aware, the issues
are more complicated than simple supply-and-demand stories can comprehend.
Many people wonder why subscribers choose flat-rate deals that can often
result in larger bills than under measured service, and one reason is that
the flat-rate premium is a kind of risk hedging not dissimilar to larger
organizations' use of options to hedge commodities contracts.  It is also
argued that flat-rate pricing effectively shifts costs from heavy users
to light users, and that might actually become true if one segment begins
using huge amounts of local-loop capacity to maintain Internet connections.
But given the distributional virtues of flat-rate service, we should be
very sure that this is happening, and that no alternatives are available,
before giving in to the phone companies' decades-long campaign to get rid
of it.
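
To make the risk-hedging point concrete, here is a small simulation
sketch.  Every number in it is invented -- the flat rate, the per-minute
price, the usage distribution, the occasional houseguest -- but the shape
of the result is the point.

  # A minimal simulation of flat-rate versus measured phone bills.
  # All parameters are hypothetical, chosen only for illustration.
  import random

  FLAT_RATE = 20.00            # dollars per month, hypothetical
  PER_MINUTE = 0.04            # dollars per minute, hypothetical
  MONTHS = 1000

  random.seed(0)
  measured = []
  for _ in range(MONTHS):
      minutes = max(random.gauss(400, 80), 0)   # a typical month
      if random.random() < 0.05:                # the houseguest month
          minutes += 1500
      measured.append(PER_MINUTE * minutes)

  mean = sum(measured) / MONTHS
  print("measured: mean $%.2f, worst month $%.2f" % (mean, max(measured)))
  print("flat:     every month $%.2f" % FLAT_RATE)

In this toy world the measured bills average a dollar or so less than the
flat rate, but the worst month costs several times more.  The flat-rate
premium buys away exactly the calamity that a household with no money in
the bank cannot absorb.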

Fortunately for most of us, and unfortunately for the ideologists, I
note that the predominant direction of movement in new information and
communications services themselves is away from per-unit and toward
flat-rate pricing.  (I don't know of any Internet server that uses congestion
pricing.  It would be quite practical for the Wall Street Journal site,
for example, to lower its prices at night, or to post prices dynamically
according to its current load.)  Long-distance telephone has been moving
toward distance-invariant pricing, for example, which is a good first
step.  AOL has also shifted to flat-rate pricing.  Much was made of the
first-day congestion problems that resulted, but AOL could hardly have
been surprised by those problems, and must have made the decision despite
knowing that they would arise.
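
Returning to that parenthetical for a moment: the kind of load-based
price-posting I have in mind would be nothing fancy.  A sketch, with a
base price and a load curve I have simply invented:

  # A sketch of posting prices dynamically according to server load.
  # The base price and the shape of the curve are invented.

  BASE_PRICE = 0.05                    # dollars per article, off-peak

  def posted_price(load):
      # 'load' is current server utilization, from 0.0 to 1.0.
      return round(BASE_PRICE * (1.0 + 3.0 * load * load), 2)

  for load in (0.1, 0.5, 0.9):
      print("at %.0f%% load: $%.2f per article"
            % (load * 100, posted_price(load)))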

The ISP market also shows few signs of moving away from flat-rate pricing,
although the fundamentals in that sector are much less stable.  [And
since I wrote this draft in January, Uunet has moved back toward measured
rates.]  I realize that this could be a bad thing: if everyone has
flat-rate service then everyone has to pay the costs incurred by the
average user; less-than-average users will be reluctant to sign on, thus
pushing up the average; this effect will reinforce itself to reduce the
total number of users below the levels that would be reached otherwise,
and everybody, the poor included, will suffer by losing the positive
externalities (economies of scale and network effects) associated with
a larger user community, including the large market that is required to
make advanced applications feasible economically.  On the other hand, the
transaction costs involved in administering measured or congestion-based
pricing raise the average bill and thus depress positive externalities as
well.  These are all theories too, and we need much more evidence about
which theory is correct before we do anything rash.
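
The self-reinforcing effect is easy to see in a toy simulation.  The
numbers below are made up from whole cloth: users' monthly costs are
spread evenly from $1 to $40, each user is assumed to value the service
at 20% above his or her own cost, and the flat rate is set each round to
the average cost of whoever remains.

  # A toy simulation of the flat-rate spiral: light users drop out,
  # the average cost rises, more users drop out.  All numbers invented.

  costs = [1.0 + 39.0 * i / 999.0 for i in range(1000)]
  subscribers = costs[:]

  for round_no in range(1, 20):
      rate = sum(subscribers) / len(subscribers)
      stayers = [c for c in subscribers if 1.2 * c >= rate]
      print("round %2d: rate $%5.2f, %4d users remain"
            % (round_no, rate, len(stayers)))
      if len(stayers) == len(subscribers):
          break                        # the spiral has bottomed out
      subscribers = stayers

In this toy world the rate climbs from about $20 to about $34, and
roughly 70% of the users are gone before things stabilize -- which is
just the theory stated above, not evidence that it describes the real
ISP market.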

None of this answers the question of why Bob's theory is wrong.  People
give varying answers; my own answer (which I am sure is not original with
me) has two parts.  The first part concerns substitution.  If bread were
free then people would instantly start consuming vast amounts of it, since
they can substitute bread for several other kinds of food.  Consumption
might be bounded in urban areas, since there's a limit to how much bread
any given household is able to store and eat, but in rural areas people
would feed it to animals and maybe even use it as fertilizer.  Internet
services, however, are not tremendously substitutable for other things.
College students substitute e-mail for telephone conversations with family
and friends back home, of course.  But this brings me to the second part
of the answer, which is that current Internet services are all structured
in ways that effectively create proxy prices for Internet bandwidth.
To send lots of e-mail, for example, you have to sit there and actually
type it all in.  Even when your e-mail is sent to many people, the amount
of e-mail that you send will be regulated by the amount that people are
willing to receive.  This latter principle has been sorely tested lately
with the spam wars, but the last I saw the online service users were
winning and the spammers were losing.  People's use of the Web, likewise,
is heavily regulated by their willingness to sit and wait for files to be
downloaded.  Corporate users with T-1 lines to their desks can consume a
whole lot more bandwidth than home users with 28K phone lines, of course,
but the fact remains that every second they wait for a download carries an
opportunity cost that somebody (either the user or the boss) cares about.
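
The e-mail case is easy to quantify, at least roughly.  The typing speed
and word length below are ordinary assumptions, not measurements:

  # Typing speed as a proxy price on e-mail bandwidth.  The typing
  # speed and word length are assumptions chosen for illustration.

  WORDS_PER_MINUTE = 40.0
  BYTES_PER_WORD = 6.0                 # five letters plus a space

  typist = WORDS_PER_MINUTE * BYTES_PER_WORD / 60.0   # bytes per second
  modem = 28800.0 / 8.0                # a 28.8K modem, bytes per second

  print("typist: %5.1f bytes/sec" % typist)
  print("modem:  %5.1f bytes/sec" % modem)
  print("the typist fills about 1/%d of the modem link" % (modem / typist))

A hand-typed message, in other words, occupies a few hundredths of a
percent of even a dial-up line; the proxy price does the rationing.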

Now this may all change when new applications become available.  I oppose
Internet telephone, for example, because it makes it easy for everyone
to consume vast amounts of Internet bandwidth.  Maybe when IPv6 arrives
we'll be able to segment different uses of the Internet so that people who
want to mess with bandwidth-intensive streaming data types can pay for it
and leave the rest of us alone.  I honestly don't understand the stories
people tell about Internet telephone being vastly cheaper than regular
telephone; whenever I try to understand the details, it turns out
that they're talking about a lower grade of service.  Maybe the new "push"
services and order-of-magnitude-cheaper hard drives will allow people to
download huge amounts of stuff easily, routinely, and automatically, just
in case they want to look at it later.  Maybe lots of things.  My point is
simply that economic stories depend in great detail on the workings of the
technology, and on how the technology fits into people's lives.


end