Philip E. Agre, University of California, San Diego
Marc Rotenberg, Electronic Privacy Information Center
MIT Press, 1997
Privacy is the capacity to negotiate social relationships by controlling
access to personal information. As laws, policies, and technological design
increasingly structure people's relationships with social institutions,
individual privacy faces new threats and new opportunities. Over the last
several years, the realm of technology and privacy has been transformed,
creating a landscape that is both dangerous and encouraging. Significant
changes include large increases in communications bandwidths; the widespread
adoption of computer networking and public-key cryptography; mathematical
innovations that promise a vast family of protocols for protecting identity
in complex transactions; new digital media that support a wide range of
social relationships; a new generation of technologically sophisticated
privacy activists; a massive body of practical experience in the development
and application of data-protection laws; and the rapid globalization of
manufacturing, culture, and policy making.
The essays in this book provide a new conceptual framework for the analysis
and debate of privacy policy and for the design and development of information
systems. The authors are international experts in the technical, economic,
and political aspects of privacy; the book's strength is its synthesis of the
three. The book provides equally strong analyses of privacy issues in the
United States, Canada, and Europe.
Contributors:
Victoria Bellotti, Design for privacy in multimedia computing and
communications environments
Colin J. Bennett, Convergence revisited: Towards a global policy for personal
data protection
Herbert Burkert, Privacy enhancing technologies: Typology, vision, critique
Simon G. Davies, Re-engineering the privacy right: How privacy has been
transformed from a right to a commodity
David H. Flaherty, Controlling surveillance: Can privacy protection be made
effective?
Robert Gellman, Does privacy law work?
Viktor Mayer-Schoenberger, Generational development of data protection in
Europe
David J. Phillips, Cryptography, secrets, and the structuring of trust
Rohan Samarajiva, Interactivity as though privacy mattered
Introduction (8800 words)
(Please do not quote from this version, which changed slightly in proof.)
1. Introduction
Our premise in organizing this volume is that, since the 1980s, the
policy debate around technology and privacy has been transformed.
Tectonic shifts in the technical, economic, and policy domains have
brought us to a new landscape that is more variegated, more
dangerous, and more hopeful than before. These shifts include the
emergence of digital communications networks on a global scale;
emerging technologies for protecting communications and personal
identity; new digital media that support a wide range of social
relationships; a generation of technologically sophisticated privacy
activists; a growing body of practical experience in developing and
applying data protection laws; and the rapid globalization of
manufacturing, culture, and the policy process. The goal of this
volume is to describe this emerging landscape. By bringing together
perspectives from political science, law, sociology, communications,
and human-computer interaction, we hope to offer conceptual
frameworks whose usefulness may outlive the frenetically changing
details of particular cases. We believe that in the years ahead the public
will increasingly confront important choices about law, technology,
and institutional practice. This volume offers a starting point for
analysis of these choices.
The purpose of this introduction is to summarize and synthesize
the picture of this new landscape that the contributors have drawn.
First, however, I should make clear what we have not done. We have
not attempted to replace the foundational analysis of privacy that has
already been admirably undertaken by Allen (1988), Schoeman (1984),
and Westin (1967). We have not replicated the fine investigative work
of Burnham (1983) and Smith (1979). Nor, unlike Lyon and Zureik
(1996), have we tried to place the issues of privacy and surveillance in
their broadest sociological context. Our work is organized conceptually
and not by area of concern (medical, financial, marketing, workplace,
political repression, and so on). Although our case studies are drawn
from several countries, our method is not systematically comparative
(see Bennett 1992, Flaherty 1989, and Nugter 1990). We have not
attempted a complete survey of the issues that fall in the broad
intersection of "technology" and "privacy." By "technology," for
example, we mean information and communications technology; we
do not address the concerns raised by biological technologies such as
genetic analysis (Gostin 1995). Our concern with the interactions
among technology, economics, and policy complements Smith's (1994)
study of organizational issues and Regan's (1995) more detailed analysis
of the legislative process. Nor, finally, do we provide a general theory
of privacy or detailed policy proposals. We hope that our work will be
helpful in framing the new policy debate, and we have analyzed
several aspects of the development of privacy policy to date.
2. The New Landscape
Mayer-Schoenberger's chapter describes the configuration of technology
and privacy issues in the late 1960s and the early 1970s. In that period,
privacy concerns focused on a small number of large centralized
databases; although instrumental to the construction of the modern
welfare state, these databases also recalled the role of centralized files in
the fascist era. In the United States, concern about privacy arose
through popular works by Ernst and Schwartz (1962), Brenton (1964),
and Packard (1964), as well as a detailed scholarly treatment by Westin
(1967). In each case, though, the general form of the response was the
same -- an enforceable code of practice that came to be known as data
protection in Europe and privacy protection in the United States. The
premise underlying the Code of Fair Information Practices was the
same in both places: organizations that collected personal information
about individuals had certain responsibilities, and individuals had
rights against organizations in possession of personal information. In
some instances, these practices were codified by professions or industry
associations. In other instances they were reduced to law. As a general
matter, the focus was the centralized collection of data, specified in
place and time, and under the specific responsibility of a known
individual or organization. (These principles and their
implementation are described by Gellman. I will use the term "data
protection" here.) Data protection does not seek to influence the basic
architecture of computer systems. Instead, it abstracts from that
architecture to specify a series of policies about the creation, handling,
and disposition of personal data. Mayer-Schoenberger, Bennett, and
Flaherty describe the subsequent evolution of the data protection
model. This model is by no means obsolete, but the world to which it
originally responded has changed enormously.
Some of these changes are technical. Databases of personal
information have grown exponentially in number and in variety. The
techniques for constructing these databases have not changed in any
fundamental way, but the techniques for using them have multiplied.
Data-mining algorithms, for example, can extract commercially
meaningful patterns from extremely large amounts of information.
Market-segmentation methods permit organizations to target their
attention to precisely defined subgroups (Gandy 1993). Contests, mass
mailings, and other promotions are routinely organized for the sole
purpose of gathering lists of individuals with defined interests. More
data is gathered surreptitiously from individuals or sold by third
parties.
The pervasive spread of computer networking has had numerous
effects. It is now easier to merge databases. As Bennett observes,
personal information now routinely flows across jurisdictional
boundaries. Computer networking also provides an infrastructure for a
wide variety of technologies that track the movements of people and
things (Agre 1994). Many of these technologies depend on digital
wireless communications and advanced sensors. Intelligent
Transportation Systems, for example, presuppose the capacity to
monitor traffic patterns across a broad geographic area (Branscomb and
Keller 1996). These systems also exemplify the spread of databases
whose contents maintain a real-time correspondence to the real-world
circumstances that they represent. These computerized mediations of
personal identity have become so extensive that some authors speak of
the emergence of a "digital persona" that is integral to the construction
of the social individual (Clarke 1994).
Computer networking also provides the basis for a new generation
of advanced communications media. In the context of the analog
telephone system, privacy concerns (e.g., wiretapping and the abuse of
records of subscribers' calls) were largely circumscribed by the system's
architecture. Newer media, such as the Internet and the online services
discussed by Samarajiva, capture more detailed information about
their users in digital form. Moreover, the media spaces that Bellotti
describes are woven into their users' lives in a more intimate way than
older communications technologies. Digital technology also increases
both the capacity of law-enforcement authorities to monitor
communications and the capacity of subscribers to protect them.
At the same time, the new media have provided the technical
foundation for a new public sphere. Privacy activists and concerned
technologists have used the Internet to organize themselves, broadcast
information, and circulate software instantaneously without regard to
jurisdictional boundaries. Low-cost electronic-mail alerts have been
used in campaigns against consumer databases, expanded wiretapping
capabilities, and government initiatives to regulate access to strong
cryptography. Public-policy issues that would previously have been
confined to a small community of specialists are now contested by tens
of thousands of individuals. Although the success of these tactics in
affecting policy decisions has not yet been evaluated, the trend toward
greater public involvement has given the technology a powerful
symbolic value.
Potentially the most significant technical innovation, though, is a
class of privacy-enhancing technologies (PETs). Beginning with the
publication of the first public-key cryptographic methods in the 1970s,
mathematicians have constructed a formidable array of protocols for
communicating and conducting transactions while controlling access
to sensitive information. These techniques have become practical
enough to be used in mass-market products, and Phillips analyzes
some of the sharp conflicts that have been provoked by attempts to
propagate them. PETs also mark a significant philosophical shift. By
applying advanced mathematics to the protection of privacy, they
disrupt the conventional pessimistic association between technology
and social control. No longer are privacy advocates in the position of
resisting technology as such, and no longer (as Burkert observes) can
objectives of social control (if there are any) be hidden beneath the
mask of technical necessity. As a result, policy debates have been
opened where many had assumed that none would exist, and the
simple trade-off between privacy and functionality has given way to a
more complex trade-off among potentially numerous combinations of
architecture and policy choices.
Other significant changes are political and economic. The data
protection model has matured. Privacy commissioners such as Flaherty
have rendered hundreds of decisions in particular cases, and the nature
and the limitations of the privacy commissioner's role have been
clarified. It has become possible to ask how the effectiveness of privacy
policies might be evaluated, although (as both Flaherty and Bennett
observe) few useful methods have emerged for doing so. Pressures
have arisen to tailor data protection laws to the myriad circumstances
in which they are applied, with the result that sectoral regulation has
spread. In the United States, as Gellman observes, the sectoral approach
has been the norm by default, with little uniformity in regulatory
conception or method across the various industries. In most other
industrial countries, by contrast, sectoral regulation has arisen through
the adaptation and tailoring of a uniform regulatory philosophy.
This contrast reflects another, deeper divide. Bennett describes the
powerful forces working toward a global convergence of the conceptual
content and the legal instruments of privacy policy. These forces
include commonalities of technology, a well-networked global policy
community, and the strictures on cross-border flows of personal data in
the European Union's Data Protection Directive. While the United
States has moved slowly to establish formal privacy mechanisms and
standardize privacy practices over the last two decades, it now appears
that the globalization of markets, the growing pervasiveness of the
Internet, and the implementation of the Data Protection Directive will
bring new pressures to bear on the American privacy regime.
The evolution of privacy policy, meanwhile, has interacted with
individual nations' political philosophies. Mayer-Schoenberger argues
that this interaction should be viewed not on a nation-by-nation basis
but rather as the expression of a series of partial accommodations
between the uniform regulation of data handling and liberal political
values that tend to define privacy issues in terms of localized
interactions among individuals. (This tension runs throughout the
contemporary debate and will recur in various guises throughout this
introduction.)
One constant across this history is the notorious difficulty of
defining the concept of privacy. The lack of satisfactory definitions has
obstructed public debate by making it hard to support detailed policy
prescriptions with logical arguments from accepted moral premises.
Attempts to ground privacy rights in first principles have foundered,
suggesting their inherent complexity as social goods. Bennett points
out that privacy is more difficult to measure than other objects of
public concern, such as environmental pollution. The extreme lack of
transparency in societal transfers of personal data, moreover, gives the
issue a nebulous character. Citizens may be aware that they suffer harm
from the circulation of computerized information about them, but they
usually cannot reconstruct the connections between cause and effect.
This may account in part for the striking mismatch between public
expression of concern in opinion polls and the almost complete
absence of popular mobilization in support of privacy rights.
One result of this unsatisfactory situation is that the debate has
often returned to the basics. Mayer-Schoenberger and Davies both
remark on the gap between the technical concept of data protection and
the legal and moral concept of privacy, but they assign different
significance to it. For Mayer-Schoenberger, the concept of data protection
is well fitted to the values of the welfare state. Davies, however, focuses
on the range of issues that data protection appears to leave out, and he
regards the narrowly technical discourse of data protection as ill suited
to the robust popular debate that the issues deserve.
The basic picture, then, is as follows: Privacy issues have begun to
arise in more various and more intimate ways, a greater range of
design and policy options are available, and some decisions must
therefore be made that are both fundamental and extraordinarily
complicated. Perhaps the most basic of these strategic decisions
concerns the direction of technical means for protecting privacy. One
approach, exemplified by Bellotti's study, is to provide individuals with
a range of means by which to control access to information about
themselves. Another approach, discussed in detail by Phillips and by
Burkert, is to prevent the abuse of personal information from the start
through the application of privacy-enhancing technologies that
prevent sensitive data from being personally identifiable. PETs may
also be appealing because, unlike privacy codes, they are to some extent
self-enforcing. Although PETs may not eliminate the need for
ongoing regulatory intervention, they seem likely to reduce it.
3. Negotiated Relationships
Ideas about privacy have often been challenged by new technologies.
The existing ideas arose as culturally specific ways of articulating the
interests that have been wronged in particular situations. As a result,
cultural ideas about privacy will always tacitly presuppose a certain
social and technological environment -- an environment in which
those kinds of wrongs can occur and in which other kinds of wrongs
are either impossible, impractical, or incapable of yielding any benefit
to a wrongdoer. As new technologies are adopted and incorporated into
the routines of daily life, new wrongs can occur, and these wrongs are
often found to invalidate the tacit presuppositions on which ideas
about privacy had formerly been based. The development of law
illustrates this. In a landmark 1967 case, Katz v United States (389 US
347), the US Supreme Court found that a warrantless police recording
device attached to the outside of a telephone booth violated the Fourth
Amendment's protection against unreasonable searches and seizures.
This protection had formerly been construed primarily in cases
involving intrusion into a physical place, but the justices famously
held that the Fourth Amendment "protects people, not places" (at 351).
The moral interest at stake in data protection regulation has seemed
unclear to many. Turkington (1990), among others, has suggested
identifying this interest as "informational privacy." Another
complementary approach can be understood by returning to Clarke's
(1994) notion of the "digital persona" that has increasingly become part
of an individual's social identity. From this perspective, control over
personal information is control over an aspect of the identity one
projects to the world, and the right to privacy is the freedom from
unreasonable constraints on the construction of one's own identity.
This idea is appealing for several reasons: it goes well beyond the static
conception of privacy as a right to seclusion or secrecy, it explains why
people wish to control personal information, and it promises detailed
guidance about what kinds of control they might wish to have.
Bellotti and Samarajiva develop this line of thinking further. Their
point of departure is the sociologist Erving Goffman's finely detailed
description of the methods by which people project their personae.
Goffman (1957) argued that personal identity is not a static collection of
attributes but a dynamic, relational process. People construct their
identities, he suggested, through a negotiation of boundaries in which
the parties reveal personal information selectively according to a tacit
moral code that he called the "right and duty of partial display."
Goffman developed this theory in settings (e.g., public places) where
participants could see one another face-to-face, but it has obvious
implications for technology-mediated interactions. In particular, to the
extent that a technology shapes individuals' abilities to negotiate their
identities, Goffman's theories have implications for that technology's
design.
Bellotti and Samarajiva attempt to draw out these implications in
different contexts, Bellotti in her experiments with media spaces that
interconnect users' workspaces with video and data links and
Samarajiva in his study of an online platform being developed in
Quebec. These authors explore the conditions under which individuals
can exert control over, and receive feedback about, the release of personal
information, Bellotti emphasizing technical conditions and
Samarajiva emphasizing institutional ones. Both authors
describe conditions under which pathological relationships might
arise. Goffman's theory, by helping articulate the nature of the wrongs
in such cases, also helps specify how technology might help us avoid
them. Technology cannot, of course, guarantee fairness in human
relationships, but it can create conditions under which fairness is at
least possible; it can also undermine the conditions of fairness.
CNID (also known, somewhat misleadingly, as Caller ID) illustrates
the point. CNID is a mechanism by which a switching system can
transmit a caller's telephone number to the telephone being called. The
recipient's telephone might display the number or use it to index a
database. CNID seems to raise conflicting privacy interests: the caller's
right to avoid disclosing personal information (her telephone number)
and the recipient's right to reduce unwanted intrusions by declining to
answer calls from certain numbers. To reconcile these interests, CNID
systems generally come equipped with "blocking" options. Callers who
do not wish to transmit their telephone number can block CNID;
recipients may decline to answer calls that are not accompanied by
CNID information. In effect, the negotiation of personal identity that
once began with conventional telephone greetings ("Hello?"; "Hello,
this is Carey; is Antonia there?"; "Oh, hi Carey, it's Antonia; how are
you?") now begins with the decision whether to provide CNID
information and the decision whether to answer the phone. Sharp
conflict, however, arises in regard to the details of the blocking
interface. If CNID is blocked by default then most subscribers may
never turn it on, thus lessening the value of CNID capture systems to
marketing organizations; if CNID is unblocked by default and the
blocking option is inconvenient or little-known, callers' privacy may
not be adequately protected. In 1995 these considerations motivated a
remarkable campaign to inform telephone subscribers in California of
their CNID options. The themes of default settings and user education
recur in Bellotti's study, in which one system failed to achieve a critical
mass of users because so many users had unintentionally kept it turned
off.
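The interplay of defaults and screening rules can be made concrete with a small sketch. The following Python fragment is purely illustrative; the class names and fields are invented for this example rather than drawn from any actual switching system. It shows how a per-line blocking default, a caller's per-call override, and a recipient's screening rule jointly determine whether identity information is exchanged before a call is even answered.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CallerLine:
    number: str
    block_by_default: bool = True             # default chosen by carrier or regulator
    per_call_override: Optional[bool] = None  # caller's explicit choice, if any

    def sends_cnid(self) -> bool:
        # An explicit per-call choice wins; otherwise the default applies.
        if self.per_call_override is not None:
            return self.per_call_override
        return not self.block_by_default

@dataclass
class RecipientLine:
    reject_anonymous: bool = False            # decline calls that arrive without CNID

    def answers(self, cnid: Optional[str]) -> bool:
        return cnid is not None or not self.reject_anonymous

def place_call(caller: CallerLine, recipient: RecipientLine) -> Tuple[Optional[str], bool]:
    """Return the transmitted CNID (if any) and whether the call is answered."""
    cnid = caller.number if caller.sends_cnid() else None
    return cnid, recipient.answers(cnid)

# With blocking on by default and no override, a screening recipient simply
# refuses the call: the negotiation of identity is settled before anyone
# says "Hello?".
print(place_call(CallerLine("555-0100"), RecipientLine(reject_anonymous=True)))
# -> (None, False)
```

In these terms, the policy dispute over defaults reduces to the question of which values of block_by_default and reject_anonymous most subscribers will never bother to change.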
It is useful to compare this approach to privacy with the approach
taken by many proponents of PETs. Representative studies from this
volume might be arranged in two dimensions, as shown in figure 1.
The horizontal axis distinguishes cases according to the structure of the
interaction -- between individual subjects or between a subject and an
institution such as a bank. This distinction is obviously artificial, since
it excludes intermediate cases, additional relevant parties, and other
elements of context. The vertical axis pertains to the conceptualization
of privacy, as the negotiation of personal boundaries or as the regulated
and conditional access of some authority to sensitive personal
information. (A third axis might contrast the normative standpoint of
the designer or policymaker -- see Bellotti and Burkert -- with the
empirical standpoint of the sociologist -- see Phillips and Samarajiva.)
In Phillips's study of public contests over cryptography, the relevant
authority is the government and the sensitive information is the
cryptographic keys that can reveal the content of personal
communications; in Burkert's analysis of technologies for conducting
anonymous transactions, the relevant authorities are numerous and
the sensitive information is personal identity. These authors' approach
is labeled "binary access" because the issue at stake is whether the
authority can gain access to the information; this access might be
conditional (for example, upon a court order), but once granted it opens
up a whole realm of personal information, often covertly, to a broad
range of unwelcome uses.
The two rows in figure 1 contrast in instructive ways. Personal
boundary negotiation emphasizes personal choice, reciprocity, and
fine-grained control over information in the construction of personal
identity. Binary access emphasizes individual rights, the defense of
interests, and coarse-grained control over information in the
protection of personal identity. This distinction partly reflects real
differences in the social relationships that the two rows describe, but for
some authors it also reflects different underlying approaches to privacy
problems. Burkert points out that PETs are not inherently confined to
the maintenance of anonymity; they could be used as tools for the
finer-grained negotiation of identity. Mayer-Schoenberger associates
technologies for local choice with erosion of the values of social
democracy and emphasizes that individuals have found themselves
confused and overwhelmed by a proliferation of mechanisms that
require them to make choices about esoteric matters that are more
suited to technical analysis and legislation. Other authors, undaunted
by this potential, have embraced technologies for local choice as
expressions of market freedom. In each case, the actual conditions of
fairness and the effective protection of interests in concrete situations
are largely unknown.
The new technologies also have implications for conceptions of
relationship, trust, and public space. Samarajiva observes that
technology and codes of practice determine whether data-based
"relationships" between organizations and individuals are fair, or
whether they provoke anxiety. These concerns are a traditional
motivation for data protection regulation, but they are amplified by
technologies that permit organizations to maintain highly customized
"relationships" by projecting different organizational personae to
different individuals. Such "relationships" easily become asymmetric,
with the organization having the greater power to control what
information about itself is released while simultaneously obscuring the
nature and scope of the information it has obtained about individuals.
Phillips describes a controversy over the conditions under which
individuals can establish private zones that restrict access by outsiders.
A secure telephone line is arguably a precondition for the
establishment of an intimate relationship, an interest which has long
been regarded as a defining feature of human dignity (e.g., Fried 1968).
This concern with the boundaries that are established around a
relationship complements Bellotti's concern with the boundaries that
are negotiated within a relationship. It also draws attention to the
contested nature of those boundaries.
Beneficial relationships are generally held to require trust. As the
information infrastructure supports relationships in more complex
ways, it also creates the conditions for the construction of trust. Trust
has an obvious moral significance, and it is economically significant
when sustained business relationships cannot be reduced to periodic
zero-sum exchange or specified in advance by contract. Phillips and
Burkert emphasize the connection between trust and uncertainty, but
they evaluate it differently. For Phillips, trust and uncertainty are
complementary; cryptography establishes the boundaries of trust by
keeping secrets. Burkert, however, is concerned that this approach
reduces trustworthiness to simple reliability, thereby introducing tacit
norms against trusting behavior. Just as technology provides the
conditions for negotiating relationships, it also provides the conditions
for creating trust. Samarajiva points to the institutional conditions that
determine whether a technical architecture comes to support the construction
of trust or instead evolves toward a regime of coercive surveillance.
Public spaces have traditionally been understood as primary sites for
the conduct of politically significant activities, and systematic
surveillance of those spaces may threaten the freedom of association.
Davies describes the shifting ways in which public discourse in the
United Kingdom since the 1980s has constructed the issue of
surveillance in public spaces. The introduction of inexpensive video
cameras has brought a tension between privacy and personal security,
generally to the detriment of the political value of privacy. Bellotti
points out that the new technologies are being used in ways that erode
the distinction between public and private space and in ways that
problematize the very idea of private space by establishing long-lived
interconnections among formerly separate spaces. Although
occasioned by technological innovations, these observations converge
with recent attempts to renegotiate the concepts of public and private
and the conceptions of intimate relationship and political discourse
that have traditionally gone with them.
Taken together, these considerations describe a complex new terrain
around the central issues of voluntariness and coercion. Davies
observes that organizations often identify disclosures of personal
information as voluntary, even when the consequences of disclosure
are unclear, when alternative courses of action are unavailable, or
when failures to disclose are accompanied by unreasonable costs. This
contested terrain was first mapped by the data protection model, with
its concepts of notification and transparency. The purpose of these
concepts was to enable individuals to bargain more effectively over the
disclosure and use of personal information. The emerging
technological options, however, create a more complicated range of
alternatives for policy and design. A fundamental division of labor
emerges: some decisions about privacy are made collectively at the
level of system architecture, and others are made individually through
local bargaining. System architecture necessarily embodies social choices
about privacy; these choices can make abuses more difficult, but they can
also prevent individuals from tailoring their technology-mediated
relationships to their particular needs. System architecture, however,
rarely suffices to shape social outcomes, and architectural choices must
always be complemented by policy measures such as regulations and sectoral
codes. The data protection model analyzed the relationship between
architecture and policy in a simple, powerful way. Now, however, other
possible analyses are becoming perceptible on the horizon.
4. Economic and Technical Scenarios
Privacy issues, then, pertain to the mechanisms through which people
define themselves and conduct their relationships with one another.
These mechanisms comprise technologies, customs, and laws, and
their reliability and fairness are necessary conditions of a just social
order. Those who raise concerns about privacy propose, in effect, to
challenge the workings of institutions. Disputes about privacy are,
among other things, contests to influence the historical evolution of
these institutions. Before considering these contests in detail, though,
let us consider the economic and technical logics that, some have held,
are truly driving the changes that are now under way. My purpose is
not to present these scenarios as adequate theories of the phenomena,
but rather to make them available for critical examination.
A useful point of departure is Casson's (1994) analysis of the role of
business information in the evolution of social institutions. Casson
observes that information and markets have a reciprocal relationship:
perfect markets require perfect information, but information is usually
not free. To the contrary, information is one more good that is traded
in the market. As information becomes cheaper (for example through
new technology) transaction costs are reduced accordingly and the
conditions for perfect markets are better approximated (Coase 1937,
Williamson 1975). This theory makes the remarkable prediction that
many social institutions, having originated to economize on
information costs, will break down, to be replaced by institutions that
more closely resemble the market ideal of individually negotiated
transactions. Advance ticket sales through networked booking systems,
to take a simple example, now supplement box-office queues as a
mechanism for allocating movie tickets. On a larger scale, the gradual
breakdown of fixed social roles and of the customary responsibilities
that go with them is facilitated by technological changes, particularly in
communications and in record keeping, that make it easier to establish
and evaluate individual reputations.
This theory suggests two consequences for privacy. The first is that
individualized transactions must be monitored more closely than
custom-bound transactions. By way of illustration, Casson offers a
science fiction story about the future of road transportation in a world
of very low information costs. Nowadays most roads are provided
collectively, and their use is governed by customary mechanisms (such
as traffic lights and right-of-way rules) that permit drivers to move
toward their destinations without colliding very often. From an
economic standpoint, this scheme has obvious inefficiencies: road
utilization is uneven, congestion is common, and drivers spend much
time waiting. With lower information costs, however, roads could
operate more like railroads. Drivers wishing to go from point A to
point B would call up a reservation system and bid for available
itineraries, each precisely specifying the places and times of driving.
Drivers' movements would still be regulated by traffic signals, but the
purpose of the signals would now be to keep the drivers within the
space-time bounds of the journey they had purchased. Lower-paying
drivers would be assigned slower and less scenic routes, other things
being equal, than higher-paying drivers. Overall efficiency would be
maximized by the market mechanisms embodied in the reservation
system. Issues of technical workability aside, Casson points out that
such a scheme would raise concerns through its reliance on detailed
monitoring of drivers' movements. Nor are these concerns entirely
hypothetical, in view of the automatic toll-collection technologies now
being deployed as part of the Intelligent Transportation Systems
program mentioned above (Agre 1995). Decreasing information costs
make toll roads cheaper to operate, thus contributing to their spread.
Information costs alone, of course, do not explain the full political and
institutional dynamics of these developments, but they do lower one
barrier to them.
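As a concrete illustration of why such a scheme implies monitoring, here is a minimal sketch in Python of the allocation step; the data structures and field names are invented for this example, not taken from Casson. Allocating itineraries to the highest bidders necessarily produces a record that ties an identified driver to a place and a time.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Itinerary:
    route: str          # e.g. "A->B via expressway"
    depart: str         # reserved departure window
    capacity: int       # how many vehicles this slot can carry

@dataclass
class Bid:
    driver_id: str
    amount: float

def allocate(slot: Itinerary, bids: List[Bid]) -> List[Tuple[str, str, str]]:
    """Assign the slot to the highest bidders, producing a movement log."""
    winners = sorted(bids, key=lambda b: b.amount, reverse=True)[: slot.capacity]
    # Each entry ties an identified driver to a place and a time.
    return [(b.driver_id, slot.route, slot.depart) for b in winners]

log = allocate(
    Itinerary("A->B via expressway", "08:00-08:15", capacity=2),
    [Bid("driver-17", 4.00), Bid("driver-02", 2.50), Bid("driver-31", 3.25)],
)
print(log)
# [('driver-17', 'A->B via expressway', '08:00-08:15'),
#  ('driver-31', 'A->B via expressway', '08:00-08:15')]
```

The allocation rule is the market mechanism; the returned log is the monitoring that comes with it.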
The second consequence of Casson's theory relates to the evolution
of privacy regulation itself. Data protection regulation, on this theory,
is essentially a set of customary responsibilities imposed on
organizations that gather personal information. The theory further
suggests that these responsibilities are economically inefficient,
inasmuch as they preclude individualized negotiations between
organizations and individuals about the handling of each individual's
information. The "one size fits all" regulations are efficient, however,
if the costs of individualized negotiation are high. Data protection
regulation is thus similar to the phenomenon of standardized
contracts, which also economize on transaction costs, and regulation
(as opposed to market competition between different sets of
standardized contract terms) is required because the parties to a
standardized contract hold asymmetrically incomplete information
about the real costs and benefits of various information-handling
policies. New information technologies reopen these questions in two
ways: by making it economically feasible in some cases for
organizations to negotiate the handling of customers' information on a
more individualized basis, and by providing policymakers with a
broader range of possible information-handling policies to impose on
organizations that transact business with the public. Casson's analysis
focuses on the first of these points, predicting a transition from
generalized regulation to localized negotiation of privacy matters.
These two implications of Casson's theory may seem contradictory:
more individualized market transactions require greater monitoring
and therefore less privacy, and decreases in transaction costs permit a
transition toward local negotiation and thus more efficient allocation
of rights to personal information. Both of these contradictory
movements are found in reality, however, and both movements are
likely to continue. An economic optimist would suggest that the
necessary outcome is a high level of potential monitoring whose actual
level is regulated by the same allocative mechanisms that regulate
everything else in the market. This scenario, however, makes
numerous assumptions. It assumes, for example, that market forces
will cause economic institutions to become perfectly transparent. This
seems unlikely. After all, as Casson points out, information about
information is inherently expensive because it is hard to evaluate the
quality of information without consuming it. Economic efficiency can
even be reduced by advertising and public-relations practices that
frustrate consumers' attempts to distinguish between valid and bogus
reputations. (Davies's chapter bears on this point in reference to the
public construction of privacy issues by interested parties.) Most
important, the optimistic economic scenario applies only in the longest
possible term, after all transaction costs have been reduced to zero and
the whole technical infrastructure of society has been revised to
implement the efficient economic regime that results. It says little
about the fairness of the constantly shifting and inevitably contested
series of institutional arrangements that will govern social life between
now and then.
Even if it is accepted, Casson's theory predicts only incremental
shifts and should not be interpreted as arguing for a wholesale
abandonment of regulation. Indeed, to the extent that transaction costs
remain high, the theory argues that traditional protections should be
retained. Advances in technology do not necessarily reduce transaction
costs, and many such costs may have no relationship to technology. In
some cases, as Rotenberg (1996) has pointed out, reductions in
transaction costs should actually cause traditional protections to be
strengthened. Posner (1981: 256, 264), for example, argues that high
transaction costs should relieve magazine publishers and the Bureau of
the Census of the obligation to obtain individuals' permission before
making certain secondary uses of information about them; as
technology lowers transaction costs, this argument is steadily
weakened.
Assuming, however, that something approximating the economic
scenario comes true, how might the necessary technologies of localized
negotiation be implemented? Data protection regulation, as we have
seen, was originally motivated by fears about a single centralized
government database, and it was subsequently forced to adjust its
imagination to accommodate a world of wildly fragmented databases in
both the public and the private sector. Immense incentives exist to
merge these various databases, however, and many have predicted that
the spread of computer networking will finally bring about the original
dystopian vision. This vision has not come about on a large scale because
of the great difficulty of maintaining the databases that already exist
and because of the equally great difficulty of reconciling databases that
were created using incompatible data models or different data-collection
procedures (Brackett 1994).
It will always be difficult to reconcile existing databases. But
newly created databases may be another story. Whereas the old dystopian
scenario focused its attention on a single centralized computer, a new
scenario focuses on standardized data models -- data structures and
identification and categorization schemes that emerge as standards,
either globally or sectorally. Standards have contributed to the rise of
institutions because they permit reliable knowledge of distant
circumstances (Bowker 1994; Porter 1994). Many people once thought
that it was impossible, for example, to categorize and intermingle grain
from different farms (Cronon 1991). Once this was finally achieved,
after several false starts, it became possible to trade commodities in
Chicago and New York. Those cities became "centers of calculation"
(Latour 1987) that gathered and used information from geographically
dispersed sources. Information from multiple locations was
commensurable, to within some controllable degree of uncertainty,
because of the standardization of categories and measurement schemes.
As a result, those with access to this combined information possessed a
"view from nowhere" (Porter 1994) that permitted them to enter a
tightly bound relationship with numerous places they had never
visited.
Standardized data models may produce a similar effect, and for
precisely the same reasons: standardized information is more valuable
when it crosses organizational boundaries, and standardized real-world
circumstances are more easily administered from a distance. A firm
that plans to exchange information with other organizations can
facilitate those transactions by adhering to standards, and a firm that
expects to consolidate with other firms in its industry can increase its
own value by standardizing its information assets.
It now becomes possible to employ distributed object database
technology (Bertino and Ozsu 1994) to share information in an
extremely flexible fashion. To be truly dystopian, let us imagine a
single, globally distributed object database. Every entity instance in the
world, such as a human being, would be represented by a single object,
and all information about that person would be stored as attributes of
this object. Individual organizations would still retain control over
their proprietary information, but they would do so using
cryptographic security mechanisms. At any given time, each
organization would have access to a certain subspace of the whole vast
data universe, and it would be able to search the visible portion of that
data space with a single query. Organizations wishing to make certain
parts of their database public (for example, firms wishing to make
product information available to customers) would lift all security
restrictions on the relevant attributes. Organizations with data-sharing
agreements would selectively provide one another with keys to the
relevant attributes in their respective segments of the data space.
Real-time markets in data access would immediately arise. Lowered
transaction costs would permit access to individual data items to be
priced and sold. Data providers would compete on data quality,
response time, and various contract terms (such as restrictions on
reuse). Once the global database was well established, it would exhibit
network externalities, and whole categories of organizations would
benefit from joining it for the same reasons they benefit from joining
interoperable networks such as the Internet. (Database researchers such
as Sciore, Siegel, and Rosenthal (1994) speak of "semantic
interoperability," but they are usually referring to systems that render
existing heterogeneous databases interoperable by translating each
database's terms in ways that render them commensurable with one
another.)
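A deliberately simplified sketch may make the scenario more concrete. In the following Python fragment (all names are invented for the example, and a key string stands in for real cryptographic protection), each real-world entity is one object, each attribute carries its own access restriction, and an organization "sees" only the subspace of attributes for which it holds keys.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Set, Tuple

@dataclass
class EntityObject:
    """One object per real-world entity; each attribute carries its own restriction."""
    entity_id: str
    # attribute name -> (key required for access, value); a plain string key
    # stands in here for real cryptographic security mechanisms
    attributes: Dict[str, Tuple[Optional[str], str]] = field(default_factory=dict)

    def set(self, name: str, value: str, key: Optional[str] = None) -> None:
        # key=None marks the attribute as public (no access restriction).
        self.attributes[name] = (key, value)

    def visible_to(self, keys: Set[str]) -> Dict[str, str]:
        """The subspace of this object that a holder of the given keys can see."""
        return {
            name: value
            for name, (required, value) in self.attributes.items()
            if required is None or required in keys
        }

# A single (toy) entry in the global data space.
person = EntityObject("person:4711")
person.set("product_preferences", "hiking gear")               # public
person.set("credit_score", "644", key="bank-consortium-key")   # restricted
person.set("diagnosis", "asthma", key="health-insurer-key")    # restricted

# Keys obtained, say, under a data-sharing agreement define what is visible.
retailer_keys = {"bank-consortium-key"}
print(person.visible_to(retailer_keys))
# -> {'product_preferences': 'hiking gear', 'credit_score': '644'}
```

In this picture, the real-time markets in data access mentioned above would amount to buying and selling such keys.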
Such a database would, of course, be extraordinarily difficult to get
started. Bowker and Star (1994) illustrate some of the reasons in their
study of an ambitious global project to standardize disease
classifications. But the standardization of grain was difficult too.
Standardization is a material process, not just a conceptual exercise,
and standardized databases cannot emerge until the practices of
identification and capture are standardized. The point is that
considerable incentives exist to perform this work. On this view,
privacy is not simply a matter of control over data; it also pertains to
the regimentation of diverse aspects of everyday life through the
sociotechnical mechanisms by which data are produced (Agre 1994).
5. Constructing Technology and Policy
Although powerful as stimuli to the imagination, the scenarios in the
previous section are too coarse to account for the social contests
through which privacy issues evolve. The authors in this volume
paint a coherent and powerful picture of these contests -- a political
economy of privacy that complements and extends the work of Gandy
(1993). This section summarizes their contributions, focusing on the
interactions among technology, economics, and policy. This theory
starts with observations about the somewhat separable logics of
technology, economics, and policy.
* Technological logic. Among the many factors influencing the
evolution of technical systems, some are internal to the engineering
disciplines that design them. Bijker (1987) refers to the "technological
frames" that shape engineers' understandings of the problems they
face and subsequently shape outsiders' understandings of the
technologies themselves. Agre describes the metaphors -- processing
and mirroring -- that help define a technological frame for the design
of a computer system. Another aspect of this frame is the historically
distant relationship between the designers of a system and the people
whose lives the system's data structures represent. Although system
designers' technological frame has changed slowly over the history of
the computer, Bennett observes that the technology has improved
rapidly in speed and connectivity. As a result, the underlying
representational project of computing -- creating data structures that
mirror the whole world -- has found ever-more-sophisticated means
of expression in actual practices. Other technological systems -- for
example, the infrastructures of transportation, communications, and
finance -- embed their own disciplinary logic that shapes the privacy
issues that arise within them.
* Economic logic. Privacy issues have evolved in the context of several
trends in the global economy. Samarajiva points to the decline of the
mass market and the proliferation of "compacks" -- packages of
products and services that are, to one degree or another, adapted to
the needs of increasingly segmented markets and even particular
customers. The flexible production of compacks presupposes a
decrease in the costs of coordinating dispersed manufacturing and
service activities, and the marketing of compacks presupposes a
decrease in the costs of tracking the market and maintaining tailored
relationships with numerous customers. Samarajiva also points to
the significance of network externalities (which strongly condition
attempts to establish new information and communications
networks) and the peculiar economics of information (which can be
dramatically less expensive to distribute than to produce). These
effects imply that classical economic models may be a poor guide to
the success of new social institutions.
* Policy logic. The authors in this volume are particularly concerned
with the dynamics of the policy process. Moving beyond a normative
consideration of privacy policies, they reconstruct the evolution and
the implementation of these policies. Their central claims have
already been sketched. Bennett observes that privacy policy has
emerged from a global network of scholars -- an "epistemic
community" (Drake and Nicolaidis 1992) whose thinking develops in
a coordinated fashion. Conflicts arise, as might be expected, through
organizations' attempts to quiet privacy concern in the public sphere
(Davies) and the countervailing initiatives of other policy
communities (Phillips). Privacy issues are distinctive, though, in
their rapid globalization and in the oft-remarked mismatch between
their high level of abstract concern and their low level of concrete
mobilization.
Expressed in this way, these points seem to hover outside history. Yet
they are necessary preliminaries to consideration of the numerous
modes of interaction among the respective logics. Perhaps the most
striking aspect of this interaction is the recurring sense, remarked by
Flaherty and Davies, that privacy is a residual category -- something left
over after other issues have staked their claims. Privacy often emerges
as a "barrier" to a powerful technical and economic logic, and privacy
policy, in its actual implementation, often seems to ratify and
rationalize new technical and institutional mechanisms rather than
derailing them or significantly influencing their character. Part of the
problem, of course, is a mismatch of political power. But it is also a
problem of imagination: when privacy concerns arise in response to
particular proposals, they can appear as manifestations of a perverse
desire to resist trends that seem utterly inevitable to their proponents
and to broader institutional constituencies.
With a view to understanding this phenomenon, let me sketch
some of the reciprocal influences among the technological, economic,
and policy logics of privacy.
Technological -> economic Technology influences economic
institutions in several ways. Perhaps most obviously, technology
largely determines the practicalities of production, and classical
economics holds technical factors constant when exploring the
conditions of market equilibrium. In a more subtle way, technology
serves a rule-setting function. The computers and networks of a stock
trading system, for example, embody and enforce the rules of trading.
The rule-setting function is also familiar from the previous section's
analysis of technology's role in setting ground rules for the
negotiation of individual relationships. Finally, technology affects
economic institutions through its effect on information costs and
other transaction costs. As Casson points out, these effects can be
far-reaching and qualitative and can provide a major impetus for the
economic monitoring that raises privacy concerns.
Technological -> policy Technology influences policy formation,
Phillips observes, through the obduracy of its artifacts: once
implemented, they are hard to change. Technological practices are
obdurate as well; PETs represent a rare occasion on which technical
practitioners have revealed fault lines within an existing body of
technical practices, so that formerly inevitable technical choices begin
to admit alternatives. Technology thus influences policy in a second
way: by determining the range of technological options available for
addressing public concerns. Flaherty's tale of medical prescription
systems represents a rare occasion on which a data protection
commissioner has been able to affect the architecture of a proposed
system, but at least the option was technically available. The
dynamics of privacy policy, as Bennett points out, have also been
shaped by the global nature of computer networking, which is even
more indifferent to jurisdictional boundaries than the telephone
system. Technology also influences policy, finally, by providing the
infrastructure for contesting policy, as in the case of privacy activists'
use of the Internet.
Economic -> technological Technology acquires its obduracy, in part,
through the economic dynamics of standards. Once established in the
market as the basis for compatibility among large numbers of
otherwise uncoordinated actors, standards tend to be reproduced
regardless of their consequences for concerns such as privacy, or even
regardless of their efficiency in a changed technological environment
(David 1985). The standards themselves, meanwhile, arise through
practices of computer system design that have been shaped across
generations through the demands of business applications. The full
extent of this influence is not generally visible to the practitioners,
who inherit a seemingly rational body of practices and who
encounter "problems" in the bounded, conventionalized forms that
these practices define. Yet the influence becomes visible suddenly as
technical practices are moved from applications that structure
relationships within the scope of a business to applications that
structure relationships with other parties. A current example:
Client-server systems arose on the premise that users do not wish to "see"
the boundary between the client and the server. This is a reasonable
commitment when that boundary has no legal or moral significance.
It is not at all reasonable, however, for consumer applications -- such
as commerce on the Internet -- in which that boundary corresponds
precisely to the sphere over which individual users wish to maintain
informational control. In such cases, it suddenly becomes important
for the boundary between client and server to become visible, and the
invisibility of this boundary becomes the raw material for rumors
and scams.
Economic -> policy Flaherty notes several possible effects of
economic phenomena on the policy process. Perceptions of the
overall health of the economy may, other things being equal,
influence the attention devoted to privacy issues. The organization of
an industry affects its capacity to mobilize politically to define issues
and shape regulatory regimes. Budgetary pressures on government
agencies create organizational incentives for privacy-threatening
initiatives that might not otherwise have materialized. Other
economic phenomena affect the substance of policy issues and the
institutional realities with which any policy will interact in its
implementation. For example, Bennett points out that the economic
properties of personal information have contributed to the global
nature of privacy problems and to the global harmonization of
privacy policy. Samarajiva's chapter is a concrete study in the
complex arrangement of economic incentives that motivated one
enterprise to establish relatively strong privacy policies. Because the
system was only going to be viable if 80 percent of the potential
members of the audience took specific actions to sign up, it became
necessary to appease privacy fundamentalists by providing customers
with a high degree of transparency and control over personal
information. These policies were largely private rather than public,
but they were congruent with the strong regulatory regime in Quebec.
Similar considerations may explain the willingness of industries to
submit to particular regulatory systems.
Policy -> economic Little work has been done to evaluate the
economic impact of privacy policy. Reporting requirements impose
costs, and rules about the secondary use of personal information
affect business models and may sometimes determine their viability.
On the other hand, when organizations review their
information-handling practices with a view to implementing new privacy
policies, they frequently discover opportunities for improvements in
other areas. Privacy-protection measures may also reduce the economic
costs associated with identity theft and other fraudulent uses of
personal information. Policy may also influence economic
institutions by shaping technology, though any massive effects of this
type still lie in the future.
Policy -> technological Historically, privacy policy has not attempted
to influence the basic architecture of information systems. Burkert
and Agre, however, point to a joint report of Ontario's Information
and Privacy Commissioner and the Netherlands' Registratiekamer
(1995) that explores technological means for enhancing privacy. The
privacy commissioners have focused on a subtle aspect of modern
databases: although these databases have historically been designed
on the assumption that data records can be traced back to the subjects
they represent, such tracing is often unnecessary. Digital cash (Chaum
1992) is probably the best-known of the many schemes by which the
tracing can be prevented or regulated to the subject's advantage by
technical means while still providing organizations with the
guarantees they need to conduct business. Even when the state does
not mandate specific technical systems, Flaherty observes, data
commissioners have an advocacy role of raising privacy issues early
enough in the life cycle of an emerging infrastructure to raise
consciousness about the implications of various technical options. At
the same time, the US government has been attempting to regulate
the spread of strong cryptography. Its sequence of "Clipper" proposals,
described in part by Phillips, shows the government learning how to
intervene in the increasingly globalized dynamics of technical
standard setting. If successful, these proposals would lead to a
significant amount of new global infrastructure for the
implementation of key escrow.
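Chaum-style digital cash rests on blind signatures, which let a bank certify a token it never actually sees and therefore cannot later trace. The following toy RSA blind signature in Python, with tiny hard-coded numbers chosen only to make the arithmetic visible, is a sketch of the idea rather than anything resembling a usable scheme.

```python
# Toy RSA blind signature in the spirit of Chaum's digital cash (parameters
# are tiny, insecure, and chosen only for illustration).

n, e, d = 3233, 17, 2753   # bank's RSA modulus, public exponent, private exponent
coin = 1234                # serial number the customer wants certified
r = 7                      # customer's secret blinding factor, coprime to n

# 1. Customer blinds the coin before sending it to the bank.
blinded = (coin * pow(r, e, n)) % n

# 2. Bank signs what it receives; it never sees the underlying serial number.
signed_blinded = pow(blinded, d, n)

# 3. Customer removes the blinding factor, leaving a valid signature on the coin.
signature = (signed_blinded * pow(r, -1, n)) % n

# 4. Anyone holding the bank's public key can verify the signature later,
#    but the bank cannot link the spent coin to the blinded message it signed.
assert pow(signature, e, n) == coin
print("coin", coin, "carries valid signature", signature)
```

The bank thus obtains the guarantee it needs (only tokens it signed will verify) without acquiring the ability to trace who spent which token.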
These considerations, drawn from several disciplines, begin to fill
in the ambitious but oversimple picture that emerges from economic
analysis alone. Privacy issues emerge as indicators of larger contests
over the development of institutions, and the new landscape around
technology and privacy emerges as a metaphorical Los Angeles:
diverse, crowded, plagued by great inequalities and poor visibility, and
subject to frequent earthquakes. Some hold that the Big One is upon us;
others are more impressed with the damping effects of institutional
inertia.
6. Policy Issues
The new landscape obviously poses significant challenges to privacy
policy. Perhaps the greatest of these challenges derives from the expanded
range of available policy instruments. These instruments include the
following:
* Privacy rules administered by a data protection commissioner. How
can privacy officials enforce codes of fair information practices in
times of rapid technological change? A movement toward policy
convergence -- across borders, technologies, and categories of
information -- would make privacy commissioners' work easier;
increased fragmentation of policy or technology would make
regulators' lives more complicated.
* Encouraging or requiring the use of privacy-enhancing technologies
such as digital cash. Even though technological innovations hold
considerable promise as a means of privacy protection, legislatures
and privacy commissioners may have a limited ability to influence
the fundamental architectures of technical systems. Nonetheless,
they can raise consciousness, support research and standard-setting,
participate in the development of demonstration systems, prepare
model codes for those systems that do employ privacy-enhancing
technologies, influence government systems procurement, and
encourage academics to include privacy-enhancing technologies in
textbooks and training curricula.
* Protocols for individualized negotiation of personal data handling.
As opportunities emerge for individuals to customize privacy
preferences, research should be conducted to evaluate alternative
arrangements. These evaluations should employ a broad range of
criteria, including ease of understanding, adequacy of notification,
compliance with standards, contractual fairness and enforceability,
appropriate choice of defaults, efficiency relative to the potential
benefits, and integration with other means of privacy protection.
Particular attention should be paid to uniformity of protocols across
different industries and applications, so that consumers are not
overwhelmed by a pointless diversity of interfaces and contracts. (A
schematic sketch of such a negotiation follows this list.)
* Outlawing certain technologies altogether, or limiting their use. To a
certain extent, technical architectures encourage particular forms of
interaction and particular types of relationship. Just as certain types
of contracts are considered inherently unconscionable, it is possible
that particular technologies may be found intrinsically unfair in their
practical effects.
* Sectoral codes of practice. Sectoral codes were once viewed as
alternatives to regulation. Now they are increasingly viewed as
complements to an established legal framework. Sectoral codes
provide opportunities to tailor codes of fair information practice to
particular services. Nonetheless, it remains to be seen whether
sectoral approaches to privacy protection will survive the rapid
convergence of information flows in global computer networks.
* Establishing a broader legal basis for torts of privacy invasion.
Although a vast amount has been written about the philosophical
basis for various torts of privacy invasion (e.g., Schoeman 1984), these
torts have remained narrow in scope and clearly inadequate to the
technical and economic realities of the modern trade in personal
information. As controversies arise in the new environment,
common law countries may begin to explore new causes of action
that formalize popular intuitions about the moral aspects of
information exchange.
* Privacy-protection standards. Bennett describes an effort by the
Canadian Standards Association to develop an auditable set of
privacy standards along lines similar to the ISO 9000 standards.
Although developed and administered by independent
organizations, these standards can be referenced in legislation (for
example, as a condition of government contracts).
* "Bottom-up" education and training among system developers,
employees with data handling responsibilities, and the general public.
No matter how well-crafted a privacy code might be, privacy will only
be protected if the necessary information practices are actually
followed. Policy-makers need to understand how privacy issues
actually arise in the daily activities of information workers, and
organizational cultures need to incorporate practicable norms of
privacy protection. Once established, these norms will only be
sustained if the general public understands the issues well enough to
make informed choices and to assert their rights when necessary.
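The negotiation protocols mentioned above can be made less abstract
with a small, purely hypothetical sketch. The data model and
vocabulary below are invented for illustration and do not correspond
to any existing standard; the point is only that, in its simplest
form, such a negotiation reduces to comparing a service's proposed
data practice against an individual's stated defaults.

    # Hypothetical sketch of machine-readable privacy preferences and a
    # compliance check; the field names and vocabulary are invented for
    # illustration and do not correspond to any existing standard.
    from dataclasses import dataclass, field

    @dataclass
    class DataPractice:
        # What a service proposes to do with one category of personal data.
        category: str                    # e.g. "purchase-history"
        purpose: str                     # e.g. "billing", "marketing"
        retention_days: int
        shared_with_third_parties: bool

    @dataclass
    class PrivacyPreference:
        # One individual's default stance toward a category of data.
        category: str
        allowed_purposes: set = field(default_factory=set)
        max_retention_days: int = 0
        allow_third_party_sharing: bool = False

    def acceptable(practice, pref):
        # True if the proposed practice falls within the individual's preference.
        return (practice.category == pref.category
                and practice.purpose in pref.allowed_purposes
                and practice.retention_days <= pref.max_retention_days
                and (pref.allow_third_party_sharing
                     or not practice.shared_with_third_parties))

    pref = PrivacyPreference("purchase-history", {"billing"}, 90, False)
    print(acceptable(DataPractice("purchase-history", "marketing", 365, True), pref))   # False
    print(acceptable(DataPractice("purchase-history", "billing", 30, False), pref))     # True

Even this toy version makes the importance of defaults visible:
whatever stance the software assumes when the individual has expressed
no preference will, in practice, govern most transactions, which is
one reason the choice of defaults appears among the evaluation
criteria listed above.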
These technical and policy solutions are potentially complementary;
they comprise what Cavoukian and Tapscott (1997: 197-198) have called
a "mosaic of solutions." It will require great effort to determine the
appropriate combination of means to protect privacy in particular
settings. Privacy-enhancing technologies (PETs) in particular must travel a long road from theory to
practice, and it will be important to document and analyze their first
applications.
The new technologies' great promise will also require theoretical
innovation. As relationships are mediated by technology in more
sophisticated ways, designers and policymakers will need more
complex theories of agency and trust in technological environments.
Perhaps most important, future research should clarify the relationship
among technology, privacy, and association. Technologies for tracking
people and conducting surveillance of public space risk chilling the
freedom of association on which any possibility of democratic
community is based. Yet the nature of that risk is still obscure. So long
as privacy issues are located solely in the lives and relationships of
individuals, it will be impossible to conceptualize either the potentials
or the dangers of new media technologies for a democratic society.
* Acknowledgements
I greatly appreciate the contributions of Marc Rotenberg, as well as the
comments and suggestions of Colin Bennett, Trotter Hardy, and Robert
Horwitz.
* References
Agre, Philip E. 1994. Surveillance and capture: Two models of privacy.
The Information Society 10, no. 2: 101-127.
Agre, Philip E. 1995. Reasoning about the future: The technology and
institutions of Intelligent Transportation Systems. Santa Clara
Computer and High Technology Law Journal 11, no. 1: 129-136.
Allen, Anita L. 1995. Uneasy Access: Privacy for Women in a Free
Society. Rowman and Littlefield.
Bennett, Colin. 1992. Regulating Privacy: Data Protection and Public
Policy in Europe and the United States. Cornell University Press.
Bertino, Elisa, and M. Tamer Ozsu, eds. 1994. Distributed and Parallel
Database Object Management. Kluwer.
Bijker, Wiebe E. 1987. The social construction of Bakelite: Toward a
theory of invention. In The Social Construction of Technological
Systems, ed. W. Bijker et al. MIT Press.
Bowker, Geoffrey. 1994. Information mythology: The world of/as
information. In Information Acumen: The Understanding and Use of
Knowledge in Modern Business, ed. L. Bud-Frierman. Routledge.
Bowker, Geoffrey, and Susan Leigh Star. 1994. Knowledge and
infrastructure in international information management: Problems of
classification and coding. In Information Acumen: The Understanding
and Use of Knowledge in Modern Business, ed. L. Bud-Frierman.
Routledge.
Brackett, Michael H. 1994. Data Sharing: Using a Common Data
Architecture. Wiley.
Branscomb, Lewis M., and James H. Keller, eds. 1996. Converging
Infrastructures: Intelligent Transportation and the National
Information Infrastructure. MIT Press.
Brenton, Myron. 1964. The Privacy Invaders. Coward-McCann.
Burnham, David. 1983. The Rise of the Computer State. Random
House.
Casson, Mark. 1994. Economic perspectives on business information. In
Information Acumen: The Understanding and Use of Knowledge in
Modern Business, ed. L. Bud-Frierman. Routledge.
Cavoukian, Ann, and Don Tapscott. 1997. Who Knows: Safeguarding
Your Privacy in a Networked World. McGraw-Hill.
Chaum, David. 1992. Achieving electronic privacy. Scientific American
267, no. 2: 96-101.
Clarke, Roger. 1994. The digital persona and its application to data
surveillance. The Information Society 10, no. 2: 77-92.
Coase, Ronald H. 1937. The nature of the firm. Economica NS 4:
385-405.
Cronon, William. 1991. Nature's Metropolis: Chicago and the Great
West. Norton.
David, Paul A. 1985. Clio and the economics of QWERTY. American
Economic Review 75, no. 2: 332-337.
Drake, William J., and Kalypso Nicolaidis. 1992. Ideas, interests, and
institutionalization: 'Trade in services' and the Uruguay Round.
International Organization 46, no. 1: 37-100.
Ernst, Morris L., and Alan U. Schwartz. 1962. Privacy: The Right to Be
Let Alone. Macmillan.
Flaherty, David H. 1989. Protecting Privacy in Surveillance Societies:
The Federal Republic of Germany, Sweden, France, Canada, and the
United States. University of North Carolina Press.
Fried, Charles. 1968. Privacy (A moral analysis). Yale Law Journal 77,
no. 1: 475-493.
Gandy, Oscar H. Jr. 1993. The Panoptic Sort: A Political Economy of
Personal Information. Westview.
Goffman, Erving. 1957. The Presentation of Self in Everyday Life.
Anchor.
Gostin, Lawrence O. 1995. Genetic privacy. Journal of Law, Medicine,
and Ethics 23, no. 4: 320-330.
Information and Privacy Commissioner (Ontario) and
Registratiekamer (The Netherlands). 1995. Privacy-Enhancing
Technologies: The Path to Anonymity (two volumes). Information and
Privacy Commissioner (Toronto) and Registratiekamer (Rijswijk).
Latour, Bruno. 1987. Science in Action: How to Follow Scientists and
Engineers Through Society. Harvard University Press.
Lyon, David, and Elia Zureik, eds. 1996. Computers, Surveillance, and
Privacy. University of Minnesota Press.
Nugter, A. C. M. 1990. Transborder Flow of Personal Data within the
EC: A Comparative Analysis of the Privacy Statutes of the Federal
Republic of Germany, France, the United Kingdom, and the
Netherlands. Kluwer.
Packard, Vance. 1964. The Naked Society. McKay.
Porter, Theodore M. 1994. Information, power and the view from
nowhere. In Information Acumen: The Understanding and Use of
Knowledge in Modern Business, ed. L. Bud-Frierman. Routledge.
Regan, Priscilla M. 1995. Legislating Privacy: Technology, Social Values,
and Public Policy. University of North Carolina Press.
Registratiekamer. See Information and Privacy Commissioner.
Rotenberg, Marc. 1996. Personal communication.
Schoeman, Ferdinand David, ed. 1984. Philosophical Dimensions of
Privacy: An Anthology. Cambridge University Press.
Sciore, Edward, Michael Siegel, and Arnon Rosenthal. 1994. Using
semantic values to facilitate interoperability among heterogeneous
information systems. ACM Transactions on Database Systems 19, no. 2:
254-290.
Smith, H. Jeff. 1994. Managing Privacy: Information Technology and
Corporate America. University of North Carolina Press.
Smith, Robert Ellis. 1979. Privacy: How to Protect What's Left of It.
Anchor Press.
Turkington, Richard C. 1990. Legacy of the Warren and Brandeis article:
The emerging unencumbered Constitutional right to informational
privacy. Northern Illinois University Law Review 10: 479-520.
Westin, Alan F. 1967. Privacy and Freedom. Atheneum.
Williamson, Oliver E. 1975. Markets and Hierarchies: Analysis and
Antitrust Implications. Free Press.
(c) Copyright 1997 by Philip E. Agre. All rights reserved.