Privacy and Free Speech Issues on the Internet
Introduction
Many of the
characteristics that make digital technologies and the
Internet so appealing -- ease of transmission, perfect
reproduction, ease of amassing data, etc. -- also make these technologies and their use on the Internet problematic. We are all at least somewhat aware, for example,
of the difficulties in enforcing intellectual property
rights in cyberspace, which is purportedly not bound by the
legal regime of any country in particular. If cyberspace
does not exist within traditional geographical boundaries,
in which jurisdiction does one hear claims arising from
conduct within its sphere? And whose laws, language or codes
of conduct should we apply? In addition to these significant
issues are questions about rights of privacy and freedom of
expression on the Internet, and it is these topics that
primarily concern us here.
In the following pages we identify
some of the statutory and common law provisions in U.S. law
that deal with questions of privacy and free speech on the
Internet. Some of the statutes do not specifically apply to
digital technology, but may be relevant to the development of Internet regulation in the future. Our discussion of the
current state of the law in these areas is followed by a
hypothetical fact pattern that we analyze in terms of
existing legal and technical parameters. We have attached to
the end of this discussion a list of sources we consulted --
and from which we freely borrowed -- in drafting this paper.
Privacy
Although the
Constitution is invariably invoked in any discussion of
privacy rights, there is no clear statement of such a right
in the Constitution itself. Privacy rights are inferred from
the Bill of Rights, and specifically the Fourth
Amendment -- added in 1791
for the ostensible purpose of protecting citizens from
unreasonable searches and seizures by the government.
Privacy, the Supreme Court has found, falls under the
penumbra of the Fourth Amendment -- i.e. a right that could
be logically inferred from the existing language.
Reflecting their origins in the
Fourth Amendment, however, Constitutionally based privacy
rights have been interpreted mainly as a means of protecting
individuals against intrusion by the government -- not by
private parties. Some of the best-known privacy cases in
recent decades have involved government regulation of
contraceptives and abortion. The emotional freight attached
to these issues has fueled claims of those who do not
believe that the Bill of Rights contains privacy provisions.
Robert Bork, you'll remember, outraged his confirmation
committee in part because he claimed to be a strict
interpreter of the Constitution -- i.e. loath to identify
rights not clearly stated therein.
In the aftermath of Watergate,
Congress enacted statutory protection to prevent privacy
invasions by the government through wiretapping. In 1972,
when this law (Statute Governing Wire Interception and
Interception of Oral Communications) was reviewed in U.S. v.
Baldassari, a federal district court in Pennsylvania found that Congress intended to prohibit all wiretapping and
electronic surveillance except by law enforcement officials
investigating certain enumerated crimes under circumscribed
and strictly controlled procedures.
Since 1972, other federal statutes have been enacted that regulate the privacy of
individuals in connection with the use of communications,
computer, and video equipment and networks:
- Privacy Act (1974). Attempts to
strike a balance between the government's need to gather
and use personal information and individuals' privacy
interest in controlling such information. The Privacy Act
imposes strict rules on the government's use of records
collected about individuals, requiring government
agencies to: permit individuals to control disclosure of
information in their records; retain records of
information that is disclosed; permit individuals to
review and have a copy of information in their records;
and allow individuals to request amendment of
information in their records.
- Cable Communications Policy
Act (1984). Requires
cable television companies to notify subscribers annually about how their personal information is used and disclosed, and about the purposes for which it is
gathered. Cable operators may not collect or disclose
personal information about subscribers without their
consent.
- Video Privacy Protection Act (1988). A criminal law that prohibits disclosure of information about the video tapes individuals
have rented. No information about films rented or the
identity of the renter can be released without the
written consent of the consumer.
- Telephone Consumer
Protection Act (1991).
Regulates unsolicited phone calls. Directs the FCC to
prescribe regulations to limit pre-recorded voice calls
and automatic dialing systems that produce voluminous and
unwanted messages. May have applicability to e-mail spamming.
- Fair Credit Reporting Act
(1970). Governs the
disclosure by consumer credit reporting agencies of
credit reports containing personal information. This Act
specifically identifies permissible purposes for which
personal information about a consumer may be disclosed
without consent, and provides mechanisms for consumers to
check on the accuracy of the information reported about
them. As more commerce takes place on the Internet, it will be used more frequently for credit history checks and for the gathering and transmission of associated data; hence this statute may have increasing significance in cyberspace.
Most important, however, is the
Electronic Communications Privacy Act (ECPA) of 1986. The
ECPA amends the Wire Interception and Interception of Oral
Communications Statute to accommodate digital
communications, including data transmissions between
computers, paging devices, e-mail, and video transmissions.
Significantly, the ECPA expands the statute's scope to cover not only the
actions of government agencies, but also those of private
parties. It specifically prohibits:
- unauthorized eavesdropping by
persons and businesses
- unauthorized access to messages
stored on computer
- unauthorized interception of
electronic messages in transmission
The statute does, however, permit system operators to reveal users' private messages to legal authorities, but only when the messages have been obtained accidentally and the system operator believes questionable activities are taking place.
Tort Common Law and Privacy
In a widely cited 1890 Harvard Law Review article, Samuel Warren and Louis Brandeis wrote about a common law right to be left alone. They spoke of a zone of privacy that would protect one from the unauthorized public disclosure of private facts. Over the years courts have developed these ideas into four principal areas of privacy protection:
- Right to be free of intrusion upon one's seclusion.
- Right to be free of public disclosure of private facts.
- Right to be free of being placed in a false light.
- Right to prevent the misappropriation of one's name and likeness.
Many believe that existing legal
provisions do not provide sufficient safeguards against
privacy invasion on the Internet. In 1995 the Clinton
Administration's National Information Infrastructure Task
Force published recommendations for correcting this
insufficiency at law in their Principles
for Providing and Using Personal
Information that would
require merchants to inform customers about what personal information they intend to collect and how they intend to use it. Before using sensitive personal information about a customer, the merchant would have to obtain the customer's consent.
The government also has encouraged
industry to devise recommendations for suitable privacy
regulation on the net. Under opt-out schemes, the customer
is presumed to have consented to use of personal information
unless he has specifically stated an objection thereto.
Opt-in schemes require affirmative consent by the consumer
prior to the merchant's use of personal information gathered
on the Internet. Consumer groups like the Electronic Privacy
Information Center (EPIC) recommend an opt-in approach. The
Electronic
Frontier Foundation (EFF)
has recommended TRUSTe, which would require companies doing
Internet commerce to state on their web pages whether and
how they plan to use information gathered about visitors to
their sites. Under the TRUSTe system merchants would identify their uses as one of the following categories (a short sketch follows the list):
- Anonymous: no data
collected
- One-to-one exchange: data
collected only for web site owner use
- Third party exchange: data
collected and provided to others
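To make the scheme concrete, here is a minimal sketch in Python of how a merchant site might declare one of these categories and how a visitor's software could react to the declaration. The category names follow the list above; the declaration format, field names, and warning logic are hypothetical illustrations, not the actual TRUSTe specification.

    # Hypothetical sketch of a TRUSTe-style data-use declaration.
    # Category names follow the list above; everything else is assumed.
    from enum import Enum

    class DataUse(Enum):
        ANONYMOUS = "anonymous"        # no data collected
        ONE_TO_ONE = "one-to-one"      # data collected only for the site owner
        THIRD_PARTY = "third-party"    # data collected and provided to others

    SITE_POLICY = {"site": "shop.example.com", "data_use": DataUse.THIRD_PARTY}

    def warn_if_shared(policy: dict) -> None:
        """Client-side check: flag sites that pass visitor data to third parties."""
        if policy["data_use"] is DataUse.THIRD_PARTY:
            print(f"{policy['site']} declares third-party data sharing; "
                  "consider withholding personal information.")

    warn_if_shared(SITE_POLICY)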
Free Speech
Freedom of speech is,
of course, sacrosanct in America. A surprising number of
people, however, are unaware of the limitations placed on
this First Amendment right as a means of balancing the value
of free expression against concerns for maintaining a safe
and civil society. Here are a few of the limitations on free
speech that certainly apply in the digital sphere,
especially to commercial speech.
Under the First Amendment one is
free to discuss criminal or violent topics, but when the
discussion becomes part of a plan to implement such behavior
it is no longer protected. The publisher of Soldier of Fortune, for instance, was unsuccessful in claiming
freedom of speech when sued by the family of an individual
murdered by someone who had placed an assassin-for-hire ad
in Soldier of Fortune.
Defamatory speech is not protected under the First Amendment unless the target is a public figure; even then, defamatory statements made with actual malice lose their protection. Because it
is easy to publish retractions on the Internet, it has been
argued that the only consequence of defamatory speech on the
Internet should be the requirement to post a retraction. Of
course, this would also weaken the incentive to post fair and accurate speech in the first place.
Speech that is clearly directed
towards an adult audience -- sexual subject matter, in
particular -- is protected. Obscene speech, however, is not,
and the Child Pornography Statute of 1991 specifically
prohibits trafficking in this material. There are also FCC
Restrictions on Obscene and Indecent Telephone Transmissions
(1989) that have implications for digital communications.
The age-old question, of course, is what counts as obscene. The Supreme Court, in
Miller
v. California (1973)
posited three tests that must be affirmatively met before
something can be considered obscene:
- Whether the average person, applying contemporary community standards, would find the work, taken as a whole, to appeal primarily to prurient interest.
- Whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law.
- Whether the work taken as a
whole lacks serious literary, artistic, political, or
scientific value.
Opponents of pornography might take
heart from the fact that pornography on the Internet, unlike
pornography in the physical world, does not advertise itself
with lurid posters and screaming signs in shops in the
seedier parts of town. On the other hand, because of the
ease of distribution on the Internet, people have argued
that pornography is now difficult to avoid, even in one's
own home.
First Amendment protection is
particularly strong for printed works that are not obscene.
Because shops that sell pornography typically sell non-obscene printed materials too -- which makes it difficult to shut them down -- they are usually controlled instead by being zoned into red-light districts.
Accordingly, Internet pornography sellers who deal
exclusively in obscene visual materials may find less First
Amendment protection than their real-world counterparts. In
fact, the thinking is that federal and state legislatures are still learning about digital technologies; once they are up to date in these areas, they will regulate sales of obscene materials on the Internet just as they do in the traditional marketplace.
The first attempt at federal
regulation of obscene material on the Internet, however --
the Communications Decency Act -- failed completely. In
Reno
v. ACLU (1997) the
Supreme Court found the Communications Decency Act to be
unconstitutional because the expressions "indecent
transmissions" and "patently offensive display" were so
vague as to abridge First Amendment freedom of speech
guarantees. Striking about the decision is that the entire Court found the statute unconstitutional; only O'Connor and Rehnquist did not join the majority opinion, and they wrote a separate concurring opinion. The Court did not agree with
the Justice Department's argument that the Communications
Decency Act was needed in order to attract more people to
the Internet (and thereby enhance Internet commerce) because
there is no indication that pornography on the Internet is
driving away potential users. The Court also suggested that transmitters of pornographic materials could tag their communications so that recipients could block reception using filtering software.
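As a rough illustration of that tag-and-block approach, the following Python sketch shows recipient-side software refusing messages whose sender-applied label falls into a blocked category. The header name and category list are assumptions for the example, not part of any actual labeling standard.

    # Hypothetical recipient-side label filtering, in the spirit of the
    # tagging scheme the Court alluded to. "X-Content-Rating" and the
    # blocked categories are illustrative assumptions.
    BLOCKED_CATEGORIES = {"adult", "explicit"}

    def should_block(message_headers: dict) -> bool:
        """Return True if the sender's self-applied label matches a blocked category."""
        label = message_headers.get("X-Content-Rating", "").lower()
        return any(category in label for category in BLOCKED_CATEGORIES)

    if __name__ == "__main__":
        tagged = {"From": "seller@example.com", "X-Content-Rating": "adult"}
        untagged = {"From": "colleague@example.com"}
        print(should_block(tagged))    # True  -- recipient software drops it
        print(should_block(untagged))  # False -- passes through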
The Real World: Hypotheticals, Legal and Technical Analysis, and Possible Solutions
Scenario
In 1998 Stanford
University opens a School
of Knowledge Management with a fully operational flex-lab
(things can happen quickly at private universities).
Stanford provides all entering students with an advisory that the University claims the right to review incoming and outgoing messages over its servers for limited purposes, e.g.
maintenance and security.
Hugh, a graduate student, having
scoped out the nifty computer resources at SKM, decides to
earn some money by peddling his collection of child
pornography on the WWW. Hugh serves pages from a computer in
an office on campus. He often encrypts the materials he
sells online, and frequently receives encrypted messages to
his SKM account.
One day, while updating Hugh's account, SKM's system manager stumbles upon some unsavory messages that Hugh neglected to encrypt and forwards them to University authorities. Eventually these are given to public law enforcement agents. The system manager also identifies a service provider, CyberJunk, that has been sending unwanted solicitations to SKM students; he prevents these solicitations from entering the SKM server and sends them back as e-mail bombs that tie up the CyberJunk servers. He also prevents all encrypted messages from entering or leaving via the SKM server.
Issues
Stanford needs to be
careful about defining "security" too broadly. Are they just
talking about maintaining the campus firewall (if they have
one) against "pings of death" and traffic of such a high
volume that it impacts other people's use of the system? If
that's the case, they don't have to look at people's
messages. They would be
within their rights to see if one particular login is
generating a lot of traffic.
If Hugh is publicly making this
material available on a campus server, then the material is
public to Stanford as well -- and they have every right to
boot him off and press charges.
It would be easy to monitor this
kind of thing. The server software could be set up to notify
the administrator when someone has exceeded their bandwidth
or time limit.
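A minimal Python sketch of that kind of monitoring follows; the quota value, log format, and notification mechanism are hypothetical placeholders rather than any particular server's API.

    # Hypothetical per-login usage monitoring: tally bytes per login and
    # alert the administrator when a quota is exceeded.
    from collections import defaultdict

    BYTES_QUOTA = 500_000_000  # assumed per-login limit (500 MB)

    def notify(admin: str, login: str, used: int) -> None:
        # Stand-in for e-mailing or paging the administrator.
        print(f"ALERT to {admin}: login '{login}' has transferred {used} bytes")

    def check_usage(transfer_log: list, admin: str = "sysadmin") -> None:
        """transfer_log is a list of (login, bytes_transferred) records."""
        totals = defaultdict(int)
        for login, nbytes in transfer_log:
            totals[login] += nbytes
            if totals[login] > BYTES_QUOTA:
                notify(admin, login, totals[login])

    if __name__ == "__main__":
        log = [("hugh", 400_000_000), ("alice", 20_000_000), ("hugh", 150_000_000)]
        check_usage(log)  # alerts once hugh's total crosses the quota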
Scenario
Hugh serves pages
from his home computer through his campus dial-in account.
Issues
Again, Stanford needs
to be careful about defining "security" too broadly.
Stanford would be within its rights to limit the duration of
session connections (which would discourage the use of
dial-in accounts for on-line commerce), or prohibit the use
of University subsidized connections for personal gain.
Stanford could also do what U.C. Berkeley has done, and
impose a two-hour limit on connections. If Stanford has
defined an appropriate use policy that addresses the duration
of use, and traffic of such a high volume that it impacts
other people's use of the system, then they don't have to
look at people's messages. They would be
within their rights to see if one particular login is
generating a lot of traffic.
Realistically, would all of Hugh's
materials be encrypted? The web site must give some clue to
what he's really doing. Also, what would stop someone from
posing as a paying customer and then turning in the
evidence?
Scenario
Hugh publishes a
link to his business page (on a commercial ISP) on his class
home page.
Issues
Once again, Stanford
needs to be careful about its definition of appropriate use.
Stanford would be within its rights to limit the duration of
session connections (which would discourage the use of
dial-in accounts for on-line commerce), or prohibit the use
of University subsidized connections for personal gain.
Scenario
Hugh guesses that
people like him, who buy child pornography, tend also to be
interested in firearms. He collects data about his customers
-- names, addresses, phone numbers, credit cards, credit
histories, frequency of purchasing, number of visits to the
site etc. -- that he sells to the publishers of gun
magazines. Because most of his customers are deadbeats, Hugh
threatens to publish their dismal credit histories on his
website if they don't pay him. Hugh also publishes an
informal on-line magazine that includes advertisements for
hired assassins, and a weekly column featuring false and
derogatory remarks about Bill Gates.
Issues
Hugh has chosen to do what a million other companies that collect consumer profile information in the course of their business transactions do, namely sell this information to other interested parties such as the publishers of gun magazines. This activity is not
against U.S. law although more and more customers are
requesting or insisting that their names not be sold to
third parties. The advent of agencies such as TRUSTe that
rate and certify what web sites do with the customer
information they collect could help reduce the proliferation
of unbridled commercial exploitation of what many if not
most consumers consider personal and private information.
Hugh's threat to disclose the names of his non-paying
customers by publishing them on his web site is, in and of
itself, not illegal or unethical. However, were he to follow
through on his threat it would seem to directly contravene
the Fair Credit Reporting Act (1970).
If this scenario were treated in
line with precedents in other areas where the government
demands access to information for criminal investigation,
the DOJ would have to subpoena the digital customer list to
avoid unreasonable search and seizure and privacy concerns.
Scenario
Hugh's remarks about
Bill Gates are general in nature.
Issues
The more public the
figure the more difficult it is for that figure to establish
defamation of character. In this case, Bill Gates probably
has a degree of fame such that his suit against Hugh would
be fruitless.
Scenario
Hugh's remarks could
be considered by a reasonable person to be malicious and
negatively impact Gates' business.
Issues
A defamation of
character lawsuit would most likely be fruitless, especially
if Hugh's commentary were general in nature. However, even
if Hugh's comments were malicious a lawsuit against him
would probably be to no avail because Gates is such a public
figure. With regard to the impact on Microsoft, defamation
of character shouldn't be mixed in with the effects of
Hugh's commentary on Gates' business unless it is somehow
proved that the libel campaign is part of an effort directed
at undermining the business.
Scenario
Hugh also publishes
an informal on-line magazine that includes advertisements
for hired assassins.
Issues
More and more U.S.
courts are holding that publishers are liable for the
content of the books they publish. Recently, a publisher was
held accountable for a triple murder committed by a man who
bought an assassin's how-to handbook it published. A
weapons magazine was held accountable for the murders
committed by a man who advertised himself as an assassin for
hire. This interpretation is more rigorous than in the past, in that publishers are now held liable for the material they publish. Should they also be held accountable for other things, such as accuracy?
Scenario
Hugh uses a key
escrow encryption system (e.g. "Clipper Chip").
Issues
As soon as Stanford
determines that the content being served by Hugh has
implications beyond the scope of their concerns, the
appropriate campus authorities contact local, state, or federal agencies to assume the investigation. If Hugh is
using a key escrow system for encryption, then these
authorities can obtain a court order for the escrowed keys that unlock Hugh's messages and then take the appropriate
actions.
Scenario
In class, Hugh has
learned that key escrow systems are subject to court
authority and therefore his client lists and content are
subject to seizure and examination. Hugh uses public key
encryption to encrypt files served from campus. Hugh
encrypts his content using the public keys of users who have paid a fee to access the material.
Issues
If Hugh has done a
thorough job of encrypting his content with his own public
key, then only he can decrypt it using his private key. If
the authorities have not managed to purchase indecent
material from him, then they may not be able to extract any
evidence for use against Hugh. Hugh may be vulnerable to
prosecution if he has accepted credit cards from agents
posing as consumers, since the trail of evidence from
purchase and distribution to deposit of the funds in Hugh's
account would be complete.
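For readers unfamiliar with the mechanics, the following Python sketch illustrates the pattern at issue in this scenario: content encrypted with a public key can be decrypted only with the matching private key. It assumes the third-party "cryptography" package; the key size, padding choices, and sample message are illustrative, and a real system would normally encrypt a symmetric content key this way rather than the content itself.

    # Illustrative public key encryption with the "cryptography" package
    # (pip install cryptography). Parameters are assumptions for the sketch.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # The paying customer generates a key pair and gives the seller only the public key.
    customer_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    customer_public = customer_private.public_key()

    # Content encrypted with the customer's public key can be read only by the
    # holder of the matching private key.
    ciphertext = customer_public.encrypt(b"members-only content", oaep)
    plaintext = customer_private.decrypt(ciphertext, oaep)
    assert plaintext == b"members-only content"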
Scenario
A watchdog group
discovers the content of Hugh's server and sends it to
authorities in Tennessee. A watchdog group in Germany
discovers the content of Hugh's server and seeks his extradition for prosecution (e.g. Gary Rex Lauck).
Issues
Hugh may be subject to
extradition to another jurisdiction, but it may not be
possible to decrypt Hugh's content, even with his private
key, since it may be illegal to export the encryption
software to the prosecuting country.
Scenario
Hugh uses anonymous
digital cash in addition to public key encryption of his
content.
Issues
It may not be possible
for legal authorities to prosecute Hugh if he is maintaining
completely encrypted files and he has only been accepting
transactions using anonymous digital cash. Hugh may be able
to completely repudiate his connection to the content and
the transactions.
Can Technology Solve These Problems?
A possible solution to
many of the issues addressed is the Good
Neighbor Key Escrow System (GoNeKES).
Recognizing the needs for encryption (for personal privacy), a worldwide standardized key escrow system (for consumer protection against illegal transactions), and a decentralized storage vault system (to prevent government incursions into personal or corporate/national privacy), the major financial institutions and governments create a chain-of-trust based escrow system. Individuals create public/private key pairs that are stored with either national authorities or, if users prefer, with trusted friends, in such a way that accessing messages encrypted by an individual requires the cooperation of two agencies or individuals. The system would also provide a central public key distribution site, and everyone who sought to use public key encryption would be required to use it.
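The essential property of GoNeKES -- that no single escrow agent can unlock a user's messages -- can be illustrated with a short Python sketch that splits a private key into two shares. The XOR split shown here is only a toy stand-in for the threshold cryptography a real escrow system would use, and GoNeKES itself is, of course, a hypothetical proposal.

    # Toy two-party escrow: the key is split so that neither agent alone
    # can reconstruct it; both shares are needed (e.g. under court order).
    import secrets

    def split_key(private_key: bytes) -> tuple:
        """Split a key into two shares; both are required to recover it."""
        share_a = secrets.token_bytes(len(private_key))
        share_b = bytes(k ^ a for k, a in zip(private_key, share_a))
        return share_a, share_b

    def recover_key(share_a: bytes, share_b: bytes) -> bytes:
        """Recombine the two escrowed shares."""
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    if __name__ == "__main__":
        key = secrets.token_bytes(32)        # stand-in for a private key
        agent_1, agent_2 = split_key(key)    # one share per escrow agent
        assert recover_key(agent_1, agent_2) == key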
References and Resources
Electronic Frontier Foundation, http://www.eff.org/

Electronic Privacy Information Center

Jeffrey Faucette, "The Freedom of Speech at Risk in Cyberspace: Obscenity Doctrine and a Frightened University's Censorship of Sex on the Internet," Duke Law Journal, Vol. 44, p. 1155 (1995).

Jonathan Wallace and Mark Mangan, Sex, Laws & Cyberspace (New York: Henry Holt, 1996).

Lance Rose, Netlaw (Berkeley: McGraw-Hill, 1995).

Larry Gantt, "An Affront to Human Dignity: Electronic Mail Monitoring in the Private Sector Workplace," Harvard Journal of Law and Technology, Vol. 8, p. 345 (1995).

Maureen Dorney, "Privacy and the Internet," Hastings Communications and Entertainment Law Journal, Vol. 19, p. 638 (1997).

NII Task Force, "Recommendations on Privacy and the National Information Infrastructure: Principles for Providing and Using Personal Information" (Final Version, June 6, 1995).

Richard Raysman and Peter Brown, "Policies for Use of the Internet," New York University Law Journal, Vol. 213, p. 3 (1995).