--------------------------------------------------------------------


                T H E  N E T W O R K  O B S E R V E R

  VOLUME 1, NUMBER 10                                 OCTOBER 1994

--------------------------------------------------------------------

  This month: The gender politics of "exploring" the net
              Strange ideas about privacy
              Pre-employment background checks

--------------------------------------------------------------------

  Welcome to TNO 1(10).

  This month's issue includes two articles by the editor.  The
  first one explores the metaphor of "exploring" the Internet,
  suggesting that the gross disorganization of the net promotes
  a social construction of the net as a masculine place and a
  neglect of the historically feminine activity of librarianship.

  The second article lists a batch of unfortunate arguments about
  informational privacy that I have encountered in my reading
  and travel over the last year, together with my own rebuttals
  to them.  May these rebuttals serve you in your own privacy
  activism.

--------------------------------------------------------------------

  Is the net a wilderness or a library?

  At the CPSR Annual Meeting earlier this month, Karen Coyle gave a
  rip-roaring speech that set me thinking about metaphors for using
  the net.  Karen is a library automation specialist and a CPSR
  activist who works to get women and girls involved in
  computing.  In her speech she pointed out that, as a technology
  for making information available to people, compared to any
  real library, the Internet is an amateur job.  Sure, there's a
  reasonable amount of information, but it has been haphazardly
  collected, is almost completely disorganized, has no standard
  cataloguing system, and offers only the beginnings of a decent,
  uniform interface.

  Discussing her speech with another CPSR activist, Jim Davis,
  later that evening, I suddenly connected several things that
  had been bothering me about the language and practice of the
  Internet.  The result was a partial answer to the difficult
  question, in what sense is the net "gendered"?  The reason this
  question is difficult is that we don't want to be reductionist
  about it.  It's clearly not true that only men use the net,
  or that only men find the net worthwhile, or that all women
  encounter more obstacles to net usage than any men do.  Our
  analysis needs to be more subtle than that.  I don't claim to
  have a finished answer, but I do think I have one piece of it.

  That piece starts with the metaphor of "exploring".  I've talked
  with several people who have tried to teach Internet usage to
  people who aren't computer professionals, and there's one thing
  they all tell me: many students give up in frustration after
  repeatedly getting lost using "browsing" tools like Gopher and
  Mosaic.  Even a decent "history" menu doesn't seem to suffice.
  They find themselves "somewhere" in the net, don't know where
  they are, don't know how to find their way back, and see no real
  logical connection between any one place and any other.

  Note the curious collision of metaphors here.  The most common
  use of "browsing" is in regard to libraries: wandering down
  the aisles in known sections, seeing what books might be on the
  shelf, just in case something interesting comes up.  The word
  is also often applied to the analogous activity in bookstores.
  Applied to a tool like Mosaic or Gopher, though, the metaphor is
  precisely backward: libraries and bookstores have clear ordering
  systems that are visible in the spatial layout of the building,
  and "browsing" suggests that you haven't got any very specific
  goal in your looking-around.  But no such visible ordering system
  is found in gopherspace or the World Wide Web, and people often
  need to use those tools to actually find something that has
  certain properties.

  But "browsing" isn't really the generative metaphor that's at
  work in systems like Gopher or the Web.  The generative metaphor
  -- the metaphor that generates new meanings and new language
  for the activity of using the tools -- is "exploring".  One uses
  these tools to "explore" the net.  Think what *this* metaphor
  entails.  One normally explores alone, or with a small "party"
  with a definite organization.  One has a location at any given
  time, yet one does not normally know with any precision what that
  location is.  One is in strange territory, far from home, and the
  assumption is that few others like oneself have been there before
  -- at least, any markings on rocks or trees that are recognizable
  as being from one's own kind are rare and important signs.  One
  is normally in danger, or at least in grave uncertainty, and one
  must learn to tolerate continual fear.  (See my discussion of the
  related metaphor of the "electronic frontier", employed by the
  otherwise laudable Electronic Frontier Foundation, in TNO 1(5).)

  All of this does, in fact, describe the experience of many new
  users of tools like Gopher and Mosaic -- and many other such
  tools as well.  Maybe they get used to it, and maybe they don't.
  The real question, though, is: should they *have* to get used
  to it?  Clearly not.  Yet for many people, "exploring" is close
  to defining the experience of the net.  It is clearly a gendered
  metaphor: it has historically been a male activity, and it
  comes down to us saturated with a long list of meanings related
  to things like colonial expansion, experiences of otherness, and
  scientific discovery.  Explorers often die, and often fail, and
  the ones that do neither are heroes and role models.  This whole
  complex of meanings and feelings and strivings is going to appeal
  to those who have been acculturated into a particular male-marked
  system of meanings, and it is not going to offer a great deal of
  meaning to anyone who has not.  The use of prestigious artifacts
  like computers is inevitably tied up with the construction of
  personal identity, and "exploration" tools offer a great deal
  more traction in this process to historically male cultural
  norms than to female ones.

  This sort of thing cuts particularly hard in middle schools,
  when kids between 10 and 15 establish both their gendered adult
  social identities and their formative skills and relationships
  with technology.  Once the computer room gets defined as a "boys'
  place", it's just about all over for girls.  Teachers abet this
  process when they reinforce the pointlessly masculine metaphors
  of exploration through their lessons in the computer lab.

  If the net necessarily worked this way, or if it worked this
  way for a good reason, then we'd have a real problem here.  But,
  as Karen Coyle points out, that's not the case.  The net right
  now really is an amateur job, and perhaps it's not surprising
  that the missing element is something that historically has
  been strongly coded as a female activity, namely librarianship:
  ordering, marking, and cataloguing information so that people can
  actually find it and use it, and staffing the desk where people
  go for help with this process.

  Are the net's dysfunctionalities actually central to the gendered
  experience of using the net?  If they are, then maybe those heroic
  browsing tools should be left for the second course on using the
  Internet, and maybe the far more useful skills of communicating on
  the net should occupy the first course.  I
  don't just mean technical skills here -- I also mean the skills
  of composing clear texts, reading with an awareness of different
  possible interpretations, recognizing and resolving conflicts,
  asking for help without feeling powerless, organizing people to
  get things done, and embracing the diversity of the backgrounds
  and experiences of others.  Just sending and receiving messages
  on the computer is of little use in itself if these deeper human
  lessons are not taught and learned as well.  Electronic mail
  interaction is a good place to learn these skills because the
  e-mail texts can be saved, inspected, discussed, thought over,
  revised, presented as models, collaborated upon, and so forth.
  What's more, the motions of typing slow the process down enough
  that impulsive reaction becomes more difficult and thoughtful
  reflection becomes more likely.

  But I think that an even deeper question lurks behind the issue
  of network metaphors.  Where is the reference desk on the net?
  Some systems (like the Well) have schemes where you can "shout"
  for help to the other users and get technical assistance, but
  these schemes are few, not standardized, often unreliable, and
  usually limited to technical matters.  Why is the net developing
  without leaving a place for the important role played by real,
  live human librarians in libraries?  The librarian is the person
  who knows what information is out there, where and how to find
  it, which tools work best for searching, which reference works
  are best for what purposes, how the special collections are
  organized, who is the expert on what, and so forth.  A library
  isn't just a bunch of books: it's a human system that's set up to
  help connect people to information.

  Why isn't the net like this?  Why does the space you "explore" in
  Gopher or Mosaic look empty even when it's full of other people?
  Why isn't there a mechanism for asking for help?  It wouldn't be
  hard to organize.  Just as libraries use networks now to share
  the costs of cataloguing books through organizations like the
  Cooperative Cataloguing Council, they could also share the costs
  of on-line professional librarianship assistance in order to
  provide 24-hour coverage to participating institutions.  This
  process would need software support as well: when you need help,
  you'd type in a text message (or just enter a voice recording)
  explaining what you're trying to do.  You would be automatically
  connected to a helper, and a snapshot of your "browser" session
  would appear on their screen.  A simple expert system would guess
  at which helper would be best suited to your question, based on
  their areas of expertise.
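
  To make the matching step concrete, here is a minimal sketch, in
  modern Python and purely illustrative: the helper names, their
  subject areas, and the route_request function are all invented
  for this example rather than drawn from any existing system.  It
  routes a typed help request to the on-duty librarian whose
  declared areas of expertise share the most words with the request.

    # Hypothetical sketch: send a help request to the librarian
    # whose declared expertise overlaps most with the request.
    HELPERS = {
        "ada":   {"government documents", "state law", "census data"},
        "belen": {"medicine", "public health", "toxicology"},
        "chen":  {"engineering standards", "patents", "physics"},
    }

    def route_request(request: str) -> str:
        """Name the helper whose expertise shares the most words
        with the request (ties broken arbitrarily)."""
        words = set(request.lower().split())
        def overlap(name: str) -> int:
            expertise = set()
            for topic in HELPERS[name]:
                expertise |= set(topic.split())
            return len(words & expertise)
        return max(HELPERS, key=overlap)

    # "state" and "law" match ada's topics, so the request (and a
    # snapshot of the user's session) would be directed to ada.
    print(route_request("Where can I find state law on licenses?"))

  A real service would need a far richer matching scheme and a way
  to hand off to a live conversation, but even crude keyword overlap
  illustrates the kind of routing imagined above.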

  It's important to do this on the library model and not just on a
  commercial model.  Librarians have no conflicts of interest that
  might influence them to steer you toward particular databases,
  and they're not paid by the hour so they have no interest in
  prolonging the interaction unnecessarily.  They do, however,
  have an organizational interest in customers being happy with
  the library.  They also have long experience in balancing the
  interests of the organization they're paid to serve (for example,
  a particular university) with the larger public interest.

  What would the Internet's tools be like if their designers
  routinely thought about the social relationships of their use?
  It's a hard question, precisely because of the one-user-one-tool
  model of lonely exploration that still routinely goes into
  the design of such systems.  The net opens up a whole world
  of possible new ways of connecting people together, but we'll
  squander its potential until we appreciate the role of the
  helping professions, and more generally the thoroughly social
  nature of the activities that are, we are told, rapidly migrating
  into the chilly nighttime of cyberspace.

--------------------------------------------------------------------

  Some strange ideas about privacy.

  The emergence of new digital technologies is opening up a new
  world of privacy issues.  Along the way, our ideas about what
  privacy even *is* will presumably be rethought and refought
  in a variety of ways.  TNO 1(7) has already looked at some new
  concepts of privacy that might be required to understand the use
  of computers to track human activities, and TNO 1(6) has taken a
  quick look at one attempt to define "privacy" in such a way that
  companies protect your privacy by accumulating massive amounts of
  information on you.

  Here I collect some strange ideas about privacy that I have
  encountered in my reading and traveling over the last year.
  I have not tried to document any of them in a scholarly way,
  and the bulleted quotations represent composite or abbreviated
  versions of the lines I have heard.  My purpose is not to make
  accusations against the people who use such lines.  Many people
  honestly believe them, and many others are just passing along
  half-thought-out ideas that they've heard elsewhere.  Instead, I
  want to help you to recognize these lines when you encounter them
  -- and equip you to argue back against them when the situation
  calls for it.

   * "We've lost so much of our privacy anyway."

  This line plays upon the dire rhetoric of privacy campaigners
  and somehow turns it on its head: we've already lost our privacy,
  so further steps to protect it are futile.  I hear this a lot
  from technical people when I recommend that they employ privacy
  protections in their newly designed systems.  It's important
  to spread the word about the routine invasions of our privacy,
  but it's also important to remind everyone of how much privacy
  we have left to lose.  You can still drive pretty much anywhere
  you like without leaving records behind.  You can still pay for
  most things in cash.  Hardly anyone has to report their sexual
  activities to anyone else -- or whether they eat fattening foods,
  or who their friends are, or what their religion is.  You don't
  need an internal passport to travel in most countries, and
  so you don't have to register your movements.  If you live in
  the United States then you enjoy a fair amount of protection
  under legislation such as the Fair Credit Reporting Act and
  the Electronic Communications Privacy Act.  We can lose these
  things, and we *will* lose them, unless we ensure that each new
  generation of technology has the privacy protections it needs.

   * "Privacy is an obsolete Victorian hang-up."

  The basic idea is that we'll soon lose all control over our
  personal information, and after some hand-wringing we'll just
  get used to it.  Protecting our personal information is equated
  with prudishness, obsessional modesty, cultural embarrassment,
  and unliberated secrecy.  People who believe such things are, in
  my experience, invariably either ignorant of or in denial about
  the realities of social oppression.  Let's send them to live for
  a while in a place where everybody knows everything about them.
  There's a world of difference between being voluntarily "open", on
  one's own terms, about one's liberated sexuality and experiencing
  mandatory invasion and publicity of the less happy details of
  one's sexual life.  The same thing goes for your phone records,
  where you've been driving, what you ate for dinner, and a great
  deal else.

   * "Ideas about privacy are culturally specific and it is thus
      impossible to define privacy in the law without bias."

  This argument is found often in the American legal literature,
  principally among people whose political commitments would not
  otherwise dispose them to heights of cultural sensitivity.  It
  is true that certain ideas about privacy are culturally specific
  -- Oscar Gandy, for example, reports that African-Americans find
  unsolicited telemarketing calls to be less invasive than do their
  fellow citizens of European descent.  But this sort of argument
  quickly turns obnoxious as the issues become more serious.
  Amnesty International is not based on any sort of relativism
  about torture, and neither should Privacy International be
  overly impressed by governments claiming that their culture
  is compatible with the universal tracking of citizens, or that
  objections to such things represent cultural bias.  The argument
  is especially specious in relation to tort law, the area where
  it is most commonly made, since tort law arises in large part
  through the rational reconstruction of the decisions of juries in
  particular cases.  If you throw out concepts of privacy on such
  grounds then you must also throw out concepts like contract as
  well.

   * "We have strong security on our data."

  In my experience, this argument is common even among people
  who regard themselves as privacy activists.  It arises through
  a widespread confusion between privacy and security.  Privacy
  and security are very different things.  Informational privacy
  means that I get to control my personal information.  Data
  security means that *someone else* in an organization somewhere
  gets to control my personal information by, among other things,
  withholding access from those outside the organization.  Of
  course, this organization may have my best interests in mind,
  and may even seek my approval before doing anything unusual with
  my information.  The problem arises when the organization itself
  wants to invade my privacy, for example by making secondary uses
  of information about its transactions with me.  Those secondary
  uses of the data can be as secure as you like, but they are still
  invasions of my privacy.

   * "National identity cards protect privacy by improving
      authentication and data security."

  It might indeed be argued that my privacy is not protected
  if individuals in a society don't have enough of a standardized
  institutional identity to authenticate themselves when they make
  claims on organizations (for example, when buying on credit).
  But the holes in current mechanisms for officially conferring
  identity can be patched to a major extent without resorting
  to universal identification cards.  State Departments of Motor
  Vehicles in the United States, for example, need to institute
  much better policies at one of the notorious weak points in the
  system, namely the issuance of replacement drivers' licenses.
  It would accomplish a lot, I think, simply to mail out a letter
  about the new license to all known addresses of the legitimate
  license holder.

   * "Informational privacy can be protected by converting it into
      a property right."

  This one has suddenly become extremely common, as articulated
  for example by Anne Branscomb in her book "Who Owns Information?".
  Additionally, many people have begun to spin elaborate scenarios
  about the future market in personal information, in which I can
  withhold my personal information unless the price is right.  These
  scenarios might hold some value for certain purposes, but they have
  little to do with protecting informational privacy.  The crucial
  issue is bargaining power.  The organizations that gobble your
  personal information today have computer systems that, by their very
  design, profoundly presuppose that the organization will capture
  information about you and store it under a unique identifier.  They
  mostly capture this information with impunity because you can do
  little to stop them.  If your personal information were suddenly
  redefined by the law as personal property tomorrow, assuming
  that the lawyers figured out what this idea even *means*, then
  I predict that, the day after tomorrow, every adherence contract
  (that's legalese for "take it or leave it", the prototype being
  those preprinted contracts for credit cards and rental cars and
  mortgages that are covered with fine print that the firm's local
  representative has no authority to modify or delete) in the affected
  jurisdiction would suddenly sprout a new clause issuing to the
  organization an unrestricted license (or some such legal instrument)
  over the use of your personal information.  You can refuse, of
  course, but you'll be in precisely the same position that you are
  today: take it or leave it.  The widespread belief to the contrary
  reflects a downright magical belief in the efficacy of property
  rights.  Establishing property rights in your personal information
  might actually be a good idea, but it's not nearly sufficient.
  What's really needed is machinery that establishes parity of
  bargaining power between individuals and organizations -- the
  informational equivalent of unions or cooperatives that can bargain
  as a unit for better terms with large organizations.  That machinery
  most likely doesn't need property rights to be defined over personal
  information, though perhaps defining them would make things clearer.
  That's the
  only real argument I can find for the idea, and it's not a very
  strong one.

   * "We have to balance privacy against industry concerns."

  This is probably the weakest of these arguments.  It is
  also probably the most common in administrative hearings at the
  Federal Communications Commission and the like.  It reflects a
  situation in which a bureaucrat is faced with privacy activists
  on one side and industry lobbyists on the other side, and so they
  are forced to construct the notion of a "balance" between the
  two sides' arguments.  The bureaucrats will profess themselves
  impressed by the economic benefits of the large new industry
  said to be in the offing.  These benefits are often framed in
  terms of "wealth creation", without much consideration of whether
  this wealth will be delivered to the people from whom it was
  extracted.  But the arguments just don't compare.  Privacy is an
  individual right, not an abstract social good.  Balancing privacy
  against profit is like balancing the admitted evils of murder
  against the creation of wealth through the trade in body parts
  for transplants.  It simply does not work that way.

   * "Privacy paranoids want to turn back the technological clock."

  Beware any attempt to identify privacy invasion with technical
  progress.  It is true and important that routine and rapidly
  expanding privacy invasion is implicit in traditional methods
  of computer system design, but plenty of technical design
  methods exist to protect privacy, especially using cryptography.
  This kind of argument has been used with particular force in
  the case of Caller Number ID (aka Caller ID, or CNID).  It is
  well known by now that CNID promises a thousand applications at
  the intersection between the world of telephones and the world
  of computers.  Privacy advocates are upset about CNID because
  industry keeps promoting rules that make it difficult for people
  to "block" their lines -- that is, to prevent their phone number
  from being sent out digitally except when they explicitly ask for
  it to be sent.  Proponents of industry's view have gone to great
  lengths, though, to define things in terms of "pro-CNID" versus
  "anti-CNID" camps, and I have found myself that it takes great
  determination to stay away from this terminology.  As soon as
  any kind of technological debate gets defined as "pro-" versus
  "anti-", whole layers of rhetoric start cutting in: they're
  Luddites!  But it doesn't work that way.  Most technologies worth
  having can be designed to provide inherent privacy protections
  -- not just data security (see above), but convenient, iron-clad
  mechanisms for opting out or for participating without having
  one's information captured and cross-indexed by a universal
  identifier.  I'm not normally inclined to advocate technical
  fixes, but when it comes to information technology and privacy,
  I actually do think that they're the only answer that can stick.
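
  To give one small example of what I mean by a technical design
  method (a minimal sketch in modern Python, under my own invented
  setup -- the pseudonym function and the service names are not
  part of CNID or of any real system): a user's own software can
  derive a different, unlinkable identifier for each organization
  from a single secret, so that each organization can recognize a
  returning customer but no two of them can cross-index their
  records.

    # Hypothetical sketch: derive a distinct pseudonym per service
    # from one user-held secret, so that records held by different
    # organizations cannot be linked to one another.
    import hashlib
    import hmac

    def pseudonym(secret: bytes, service: str) -> str:
        """Stable identifier for one service; under standard HMAC
        assumptions it reveals nothing about the secret and cannot
        be matched against another service's identifier."""
        digest = hmac.new(secret, service.encode("utf-8"),
                          hashlib.sha256).hexdigest()
        return digest[:16]

    secret = b"long random value kept only on the user's machine"
    print(pseudonym(secret, "video store"))    # one identifier here
    print(pseudonym(secret, "phone company"))  # a different one here

  The point is not this particular construction but the general
  principle: the protection can be built into the technology itself
  rather than bolted on afterward as policy.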

--------------------------------------------------------------------

  This month's recommendations.

  Jack A. Gottschalk, Crisis Response: Inside Stories on Managing
  Image Under Siege, Detroit: Visible Ink, 1993.  A book by and
  for PR people, a couple dozen case studies of crisis management
  written by the PR people who were on the front lines.  Some of
  them derive from cases, like the Tylenol poisoning, where a more
  or less faultless company did more or less the right thing, and
  others record the good clean fun of corporations fighting with
  one another over billion-dollar court cases.  But many others
  are represented as well, all written by people whose profession
  is the rationalization of egregious conduct.  The chapter about
  the isocyanate leak at Union Carbide's Bhopal plant is in this
  category -- a cornucopia of special pleading that is worth the
  price of the book.

  Colin J. Bennett, Regulating Privacy: Data Protection and
  Public Policy in Europe and the United States, Ithaca: Cornell
  University Press, 1992.  An intelligent study of the politics of
  privacy in several countries.  Bennett is a political scientist
  who uses privacy as the occasion for investigating general
  questions of how issues get defined and negotiated within
  societies.  His book sets standards for intellectual seriousness
  and scholarly rigor in research on privacy policy.

  Democratic Culture is the newsletter of Teachers for a Democratic
  Culture, PO Box 6405, Evanston IL 60204, jkw@midway.uchicago.edu.
  TDC started out as a liberal academics' answer to the "political
  correctness" craze started by the likes of Dinesh D'Souza.  Its
  newsletter, though, has grown into an interesting, politically
  diverse, and unusually high-quality discussion of the complex
  realities of intellectual freedom.  You can sign up for $50 if
  you can afford it, $25 if you can't, and $5 if you're a student
  or have a low income.

--------------------------------------------------------------------

  Company of the month.

  This month's company is:

  CDB Infotek
  Six Hutton Centre Drive
  Santa Ana, California  92707

  (800) 427-3747
  (714) 708-2000

  CDB Infotek is one of those companies you keep hearing about that
  will look up all kinds of personal information about you for a
  fee.  One large market for this information is in pre-employment
  background checks.  Their price list is fascinating.  It may or
  may not be reassuring that a nationwide felony search is $1500.
  Consumer credit reports are $25, FAA aircraft ownership searches
  are $20, and registered voter profiles are $25.  Motor vehicle
  ownership searches by name vary between $8 and $20 by state.
  Real property ownership searches run between $15 and $30 per
  state.  All manner of superior court records are available,
  usually for $8 to $10 per court (e.g., San Diego County Divorce
  Court searches are $7.75).  The new subscriber fee is $199 plus
  $25 per
  month.  I find it comforting that these prices are all so high.
  Just think what the world will be like when they drop by a factor
  of twenty or fifty.

--------------------------------------------------------------------

  Follow-up.

  The New South Polar Times, an amusing diary of life among
  the scientists at the South Pole, is available on the Web at
  http://139.132.40.31/NSPT/NSPThomePage.html

  Chris Mays <cmays@mercury.sfsu.edu> has issued a new edition
  of his Frequently Asked Questions on California Electronic
  Government Information.  The URL is
  http://www.cpsr.org/cpsr/states/california/cal_gov_info_FAQ.html
  You can also view the ASCII text by gopher at CPSR.
  Host=gopher.cpsr.org
  Port=70
  Path=0/cpsr/states/california/941101.cal_gov_info_FAQ

--------------------------------------------------------------------
  Phil Agre, editor                                pagre@ucla.edu
  Department of Information Studies
  University of California, Los Angeles         +1 (310) 825-7154
  Los Angeles, California  90095-1520                FAX 206-4460
  USA
--------------------------------------------------------------------
  Copyright 1994 by the editor.  You may forward this issue of The
  Network Observer electronically to anyone for any non-commercial
  purpose.  Comments and suggestions are always appreciated.
--------------------------------------------------------------------
