Imagine that you operate an Internet site where users can offer personal items for sale to the entire Internet community. You operate your site from the United States, attracting millions of users from around the world who buy and sell millions of items. The sheer number of users and items makes the transactions impossible to monitor. The site is divided into numerous categories to make it easier for users to post items for sale and to find items to purchase. Users of your system buy and sell everything from cars to computer supplies to books to jewelry and everything in between. Sites such as yours are often thought of as online auctions. Your site differs from the traditional notion of an auctioneer, however: once the site is in place, it is up to the users to do all the work. The users post their items for sale, select items to purchase, and are responsible for completing each transaction on their own.
Your business
is operating smoothly, facilitating millions of transactions, when you are
surprised to find that you have been charged with a crime in a foreign country.
The government of that country claims that you have broken its laws, even though
you have never even set foot there! Nobody from that country has even
used your site, although anybody with an Internet connection could have found
items for sale that are illegal to buy and sell in that country, even though
they are completely legal in the U.S. The foreign government claims that by operating
your Internet site, you are committing a crime. It gives you an ultimatum:
either monitor the goods offered on your system and filter out the
offending items from users in that country, or face harsh fines.
The Internet company Yahoo! faced just such a problem in 2000, when the
French government charged it with violating a French law prohibiting the
advertisement, exhibition, or sale of any objects likely to incite racial
hatred. The charges came after Jewish and anti-racist groups in France expressed
outrage that Nazi memorabilia were among the items available on Yahoo!'s auction
pages. Although Yahoo!'s French-language portal, Yahoo.fr, did not contain any
of the offensive material, French users were able to access it
through Yahoo!'s English-language site. There were no documented
instances of any French citizens ever purchasing or making available the
offensive material.[i]
This example illustrates the problem that the developing uses of the
Internet present when they encounter the numerous conflicting cultural and
political differences that exist in the world. Traditionally, territorial
borders determine the domain of governments and thus those governments' power to
control the people within their borders. The laws those governments implement
reflect the moral and religious beliefs of the people within their boundaries.
Often laws prohibit certain activities that violate those beliefs. The Internet,
in contrast, has no territorial boundaries, because the cost and speed of
message transmission on the Net is almost entirely independent of physical
location.[ii] Is it possible for a
particular government to protect its people from material on the Internet that
offends its moral and religious beliefs? Can one particular government legislate
standards that are applicable to the whole of the Internet? Does Yahoo! have an
obligation to protect individuals from accessing material on its system that
might offend a person in the Middle East?
This paper will discuss these questions and what solutions exist when
conflicting cultures meet the global entity known as the Internet. First, I will
examine the various forms this conflict takes and how different countries have
dealt with the problem. Second, I will examine the technological capabilities
currently in place to allow governments and Internet providers to customize the
content available to different countries. Third, I will examine possible
solutions to the problem and offer a recommendation to maximize the potential of
the Internet while respecting the interests of all parties.
Cultural conflict and the Internet
The Internet is a widespread series of interconnected computer networks,
often referred to as a “network of networks.”[iii]
The owners of the computers and networks that make up the Internet include
government and public institutions, non-profit organizations, and corporations.[iv]
One court described this new technology as follows: “The resulting whole is a
decentralized, global medium of communications - or ‘cyberspace’ - that
links people, institutions, corporations, and governments around the world.”[v]
One of the defining characteristics of the Internet is its durability. As
envisioned by its original creators, the Internet can still function if portions
of it do not work, by finding alternate routes for information to travel through
the system. Another defining characteristic of the Internet is the absence of
any geographic boundaries: once a user is connected to the Internet anywhere in
the world, he or she is able to have the same access to information as any other
user in the world.
New issues in cultural conflicts are arising as the Internet continues to
develop in new parts of the world. In 1995, over two-thirds of Internet users
were located in the United States. By the year 2005, the United States is
projected to have less than one-third of all Internet users. The fastest growth
will be in regions such as the Middle East, South and Central America,
and Asia.[vi]
These parts of the world have cultures dramatically different from those of the
United States and Western Europe, which up to this point have dominated the
Internet in terms of numbers of users. As Internet users from diverse cultures
become more and more prevalent, solutions will be needed to deal with the
inevitable growth in cultural conflicts that arise.
In the case of U.S. v. Thomas,[vii] the court of appeals upheld the Tennessee
obscenity convictions of a California couple who operated a "for fee" bulletin
board service (BBS) that allowed members to download pornographic materials. The
court rejected the defendants’ claim that, under the First Amendment, the
"community standards" by which the "obscene" nature of the materials should
have been measured were those of the "cyberspace community" rather than those
of Memphis, Tennessee. The Thomases were indicted by a grand jury in the Western
District of Tennessee and were convicted under, among other federal statutes, 18
U.S.C. § 1465, for knowingly using a facility and means of interstate commerce
(in this case, the combined computer and telephone system) for the purpose of
transporting obscene, computer-generated materials in interstate commerce. The
United States District Court for the Western District of Tennessee rejected the
defendants’ First Amendment arguments. One prong of the First Amendment obscenity analysis is whether the average person,
applying contemporary community standards, would find that the work in question,
taken as a whole, appeals to the prurient interest. The defendants argued that
the relevant community standards were not those of Memphis, Tennessee, where
they had been prosecuted, but rather that a new definition of community was
needed, one based on connections among people in cyberspace rather than on
geography. The court rejected this argument, first
holding that "obscenity is determined by the standards of the
community where the trial takes place" and that "it is not
unconstitutional to subject interstate distributors of obscenity to varying
community standards." The court acknowledged the concern expressed by the
defendants that such a ruling might lead to an impermissible chill on protected
speech because BBS operators cannot select who gets the materials they make
available on their bulletin boards and would be forced to censor their materials
so as not to run afoul of the standards of the community in America with the
most restrictive standards. However, the court believed that the First Amendment
was not implicated by this prosecution because, unlike the hypothetical bulletin
board operator who has no knowledge or control over the jurisdictions where
materials are being distributed for downloading or printing, access to the
defendants’ BBS could only be obtained by revealing one’s geographic
location. Because on the facts of this case the defendants "had in place
methods to limit user access in jurisdictions where the risk of a finding of
obscenity was greater than in California" and could avoid liability in
jurisdictions with less tolerant obscenity standards by refusing membership to
persons in those communities, the court held that it did not need to adopt a new
definition of "community" for use in obscenity prosecutions involving
BBSs and left for another day the First Amendment questions implied, but not
directly presented, by this case.
The First Amendment of the United States Constitution states that
“Congress shall make no law … abridging the freedom of speech.”[viii]
However, the Constitution does not protect all forms of speech. Obscene material
does not receive any First Amendment protection. Indecent speech, on the other
hand, is constitutionally protected, and regulations restricting it are subject to strict scrutiny. This means
that in order for a regulation to survive strict scrutiny “the State must show
that its regulation is necessary to serve a compelling state interest and is
narrowly drawn to achieve that end.” The strict scrutiny standard is applied
to any government legislation that limits the content of protected speech. The
Court, however, treats forms of communicative technology differently when
applying First Amendment standards because of the different characteristics of
each medium.[ix]
In the landmark case of Reno v. ACLU,[x]
the Court decided to give the Internet the same level of protection as that
received by the print media. In Reno,
the ACLU claimed that the Communications Decency Act of 1996 (CDA) provisions
concerning the Internet infringed upon protected First and Fifth Amendment
rights.[xi]
The U.S. District Court for the Eastern District of Pennsylvania conducted an
extensive fact-finding including everything from the history of the Internet to
the specific issue of sexually explicit material.[xii]
In a unanimous decision, the three-judge panel enjoined enforcement of the CDA
and ruled it unconstitutional. The case was then
directly appealed to the Supreme Court, where the Court affirmed the decision.[xiii]
The Court analyzed the statute under strict scrutiny. It first found the statute
overly broad, and then noted that alternatives less restrictive of First
Amendment rights were available for achieving the same purpose. The majority
therefore concluded that the CDA was not narrowly tailored to its objective of
protecting minors.[xiv]
The Internet is a new area of media technology, and the Court chose not to
stifle it by letting an overly broad statute stand. In terms of First Amendment
protection, the Internet is now closer to print media than to broadcast media or
common carriers. The Court’s reasoning in Reno seems to place the Internet in the realm of protection offered
by Miami Herald Publ’g Co. v. Tornillo,[xv]
and suggests that a vague statute implicating First Amendment freedoms will not
pass the strict scrutiny standard used by the Court. Reno is indicative of the strong protection the United States affords
expression, in contrast to many other countries.
Interestingly, the United States is not immune to the threat of offensive
content that other countries and cultures could find acceptable, or at least
allowable within their legal frameworks. The FBI recently began an investigation
of an Internet site, Bonsaikitten.com, run by MIT students, after receiving
complaints from people offended by the site's depiction of animals being treated
cruelly.[xvi]
Although the site is clearly a spoof, claiming to apply the ancient bonsai
techniques of Japan to kittens, FBI officials apparently were not amused. The
law at issue is a federal statute passed in December 1999 that prohibits
transferring depictions of animal cruelty across state lines, a prohibition that
extends to the Internet. Although the law has not yet been challenged on
constitutional grounds, it is possibly unconstitutional. Regardless of its
constitutionality, it illustrates how the United States, with its tradition of
allowing a free flow of information, may have limits of its own that the
borderless Internet can cross.
Singapore
As one might expect, the proliferation of cyberporn has motivated some
nations to regulate Internet content in different ways. In policing the
Internet, some nations have prohibited content in broad terms, such as material
that is against the public interest or public morality. For example, in 1996,
Singapore enacted an elaborate administrative law framework for Internet content
reflecting its tight control of the media. In developing its Internet content
regulations, Singapore had to resolve the obvious tension between its aggressive
information technology growth strategies that allowed colossal amounts of
uncensored information into the country via the Internet and the government’s
traditional restrictions on media.
Singapore has policies in place to actively encourage Internet
development.[xvii]
Singapore’s Internet Code of Practice forbids Internet Service Providers (ISPs)
and Internet Content Providers (ICPs) from providing “material that is
objectionable on the grounds of public interest, public morality, public order,
public security, national harmony, or that is otherwise prohibited by applicable
Singapore laws.”[xviii]
ISPs and ICPs must use their “best efforts” to comply with the Code and must
act to ensure that nothing is included in any broadcasting service (which
includes Internet content) that is against what the government considers to be
in the public’s interest and in good taste or decency.[xix]
Although content providers within Singapore are prohibited from providing
offensive materials, ISPs are not required to monitor the Internet or its users.
They are, however, required to eliminate access to 100 “high impact”
pornographic sites, as identified by the Singapore government, as a “statement
of societal values.”[xx]
In contrast to the rather lenient policies Singapore has adopted toward offensive
content, the actual enforcement of those policies is perhaps more stringent.[xxi]
Despite Singapore’s high degree of technological advancement and the
relatively small number of users and content it must regulate, censoring the
Internet has proved virtually impossible. Singapore has realized that it is
unfeasible to censor the Internet in the same manner as other types of media.
Although the government has received assistance from ISPs (which are either
owned by or connected to the government) in censoring the Internet, the head of
the Singapore Broadcasting Authority, the agency in charge of Internet
regulation, has acknowledged that “there is a limit to what domestic legislation
can achieve in the face of a global and borderless medium like the Internet”
and that it is impossible to fully regulate the Internet.[xxii]
Australia
Another example of a country that has taken a different approach to the
regulation of Internet content is Australia. Due to fears that the easy
availability of virtually any form of content via the Internet would undermine
the complex system of controls and regulations Australia places on
telecommunications, publishing, and broadcasting, conservative elements pushed
through the adoption of the Broadcasting Services Amendment Act 1999.[xxiii]
This legislation, which attempts to protect Australia’s children from threats
such as pornography, neo-Nazis, pedophiles, and bomb-making recipes, has faced
mixed reactions from the public. The Act establishes a complaint-based system
that gives the Australian Broadcasting Authority (ABA), analogous to the Federal
Communications Commission in the United States, the power to require illegal or
offensive Internet sites to be taken down or to require that access to those
sites be blocked.
Australia’s regulatory framework attempts to work on the principle that
what is illegal in traditional communications should also be illegal online. The
statute treats content differently depending on whether it is hosted within
Australia or overseas. If a complaint is filed about material hosted in
Australia, the host is ordered to cease carrying the offending material.
Alternatively, if a complaint is filed about material hosted outside Australia,
ISPs are required to take reasonable steps to prevent end-users accessing the
prohibited material. Obviously, if no complaint is made, content is not
restricted. Also, due to the sheer number of potentially offending sites, it
would be impossible to regulate them all. There have as yet been no legal
challenges to the Australian legislation, and since Australia’s constitution
contains no free speech protection, it is not clear that there would be any
grounds compelling a court to reject the regulations.
After a French court ordered Yahoo! to filter its system to make the offensive memorabilia unavailable to people in France, Yahoo! removed the offending items from its auctions. Yahoo! did not formally comply with the court order, however, instead filing a countersuit in California arguing that: 1) it is technologically impossible for Yahoo! to comply with the French court's filtering order, and 2) the French court has no jurisdiction over U.S.-based Yahoo!. Although the offending items were removed from its Web site, Yahoo! claims the action was due to a change in its own policies and not an effort to comply with the French court.[xxiv] Yahoo! likely does not wish to pursue a possibly lengthy and complex court battle. While French activists seem pleased with Yahoo!'s actions, the matter is far from settled, and they have expressed their desire to police offensive material on other sites, such as eBay.com, which also host the sale of Nazi memorabilia.
Technology
As individuals, governments, and content providers grapple with the
problems of offensive content, many see the problem that was created by
technology as best solved by more technology. Content filtering software is
becoming increasingly available in the marketplace. Filters come in different
forms. While some are implemented "upstream," at the level of proxy
servers that control access for whole schools, libraries, or businesses, others
are implemented "downstream," at individual workstations or PCs.
Conceptually, they all work similarly: they start with "control lists"
of addresses of unacceptable sites, then add automatic keyword filters to block
additional sites that contain certain words and phrases. The user of the filter
can specify the categories of sites to block, for example, "hate
speech" or "sex acts." Technology also exists to monitor the
sites accessed by users.[xxv]
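To make these mechanics concrete, the short Python sketch below shows how a filter of this kind might combine a control list of blocked addresses with keyword filters grouped by category. It is a minimal illustration only, not the design of SurfWatch or any other actual product, and the blocklist entries, keywords, sample pages, and function names are hypothetical.

    # Minimal sketch of a content filter combining a "control list" of blocked
    # hosts with keyword filters grouped by user-selectable categories.
    # All data here is hypothetical and purely illustrative.
    from urllib.parse import urlparse

    # Control list: addresses already judged unacceptable.
    CONTROL_LIST = {"blocked-example.com", "another-blocked-site.org"}

    # Keyword filters, grouped by the categories a user can switch on or off.
    CATEGORY_KEYWORDS = {
        "sexually explicit": {"xxx", "porn"},
        "hate speech": {"racial hatred", "white power"},
    }

    def is_blocked(url, page_text, active_categories):
        """Return True if the page should be blocked: either its host appears
        on the control list, or its text contains a keyword from a category
        the user has enabled."""
        host = urlparse(url).netloc.lower()
        if host in CONTROL_LIST:
            return True
        text = page_text.lower()
        for category in active_categories:
            for keyword in CATEGORY_KEYWORDS.get(category, ()):
                if keyword in text:
                    return True
        return False

    if __name__ == "__main__":
        enabled = {"sexually explicit"}  # the only category this user blocks
        print(is_blocked("http://blocked-example.com/page", "anything", enabled))         # True: host on control list
        print(is_blocked("http://example.com/catalog", "antique oak dressers", enabled))  # False: no keyword match

Even this toy version hints at the weaknesses discussed below: the control list must be compiled and updated by hand, and the keyword match has no understanding of context.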
There are inherent inadequacies in the present filtering systems. To
block objectionable material, the filter must be over-broad, blocking sites that
aren't objectionable at all. For example, in a recent study, 1,000 randomly
chosen addresses in the dot-com domain were submitted to the SurfWatch filter.
Of the sites it blocked as "sexually explicit," more than four out of
five were misclassified, including the sites of an antiques dealer in Wales,
a Maryland limo service, and a storage company in California.[xxvi]
This overblocking occurs because it is impossible to single out porn
sites simply by the words they use. Other types of objectionable content are
even harder to filter because they do not give themselves away so easily with
genre-specific keywords such as "XXX". Another problem is that as the
Web continues to grow, the amount of objectionable material will grow along
with it.
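To see why keyword matching overblocks, consider a hypothetical page that merely contains a flagged string inside an innocent word. Continuing the illustrative sketch above (the keyword and page text are again invented):

    # Hypothetical illustration of overblocking by naive substring matching:
    # the flagged string "sex" appears inside the place name "Sussex".
    KEYWORDS = {"sex"}
    page = "Antique dealers in Sussex offer restored Victorian oak furniture."
    blocked = any(keyword in page.lower() for keyword in KEYWORDS)
    print(blocked)  # True, although the page is entirely innocuous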
Solutions
As the Internet continues its growth and an increasing amount of content
is included within its realm, every conceivable type of information will become
available, every conceivable viewpoint will be expressed in one way or another,
and countless types of human creations will be displayed for all to see. This
wealth of information, viewpoints, and creations will inevitably cross borders,
sometimes to places where such offerings are not welcome. Such will be the cost
of participation in the global network of which every society will be a part.
This does not mean, however, that we will live in a homogenized world, where the
only standard will be the lowest common denominator. Each of our cultures will
form the basis for our participation in the global environment, the way in which
we view the world, and the way in which the world views us. What will change is
that underlying each culture's involvement in the global environment will be a
heightened awareness of, and tolerance for, the societal standards of others.
Countries and cultures that more quickly and easily embrace the free flow of
information will have a clear advantage over those that try to hold onto rigid
standards of information control. And yet even the most libertarian cultures
will face limits on their tolerance and need to somehow limit some types of
Internet content in some situations.
There will be three keys to solving this balancing problem: 1)
technology; 2) education; 3) global cooperation. First, there will always be
technological solutions to technological problems. Problems created by
technology (the proliferation of offensive content via the Internet) can be
solved by technology (filters). These solutions, however, will only be partial,
and will have their own drawbacks. This is a fact of nature in the technological
environment. New technology always creates problems that must be dealt with -
the technology is always one step ahead of the solution. Further, the
pace of technological advancement is far quicker than policy advancement, so
that legal standards in place to limit Internet content based on technological
capabilities will always be at least a step behind. Governments will use
technology to set standards on Internet content available to their citizens, but
these technological solutions will not be fully adequate to solve the problem.
Second, individuals, governments, and service providers will become more
educated about the usefulness of the Internet and the opportunities it provides.
This will lead to a better understanding of how to deal with the problem. A sort
of maturity will develop in which people and institutions will learn to make
available for themselves the content they want and ignore that which is
offensive. For example, governments can provide recommended or required sites to
be used as access points to the Internet, which could direct people toward
content that meets their societal standards. Also, as individuals become more
savvy, they can select content providers that will cater to their particular
content requirements. A sophisticated user with access to the Internet will
always be able to find his way into those dark alleys that contain offensive or
obscene content, but the majority of people will stick to the well-lit highways
that provide the content they need in the easiest manner possible.
Third, global cooperation is required to strike the balance between a
flourishing Internet and respect for cultures. Eventually, all countries and
most people will likely have a presence on the Internet. People will associate
through language, not location. Regardless of how the Yahoo! case, and those
like it, come out, it will be entirely impractical and ridiculous for one
jurisdiction to attempt to censor a content provider over whom it has no power.
The mere existence on the Internet of a piece of information, and therefore the
availability of that information to anyone in the world with Internet access,
does not give any one country the power to judge that information and determine
its fate. The burden will be on those who attempt to restrict the free flow of
information, not on those who provide information, no matter how offensive it
might be. It should be the common goal to allow the free flow of all information
to all nations to the extent each deems acceptable.
Achieving this goal will require international cooperation. An international
Internet oversight body should be developed, at least part of which is focused
on striking the balance that each country must find between the free flow of
information on the Internet and societal standards that require content
restriction. This body should have as its purpose the development and
distribution of essential technologies to allow nations to adopt their own
content standards to the extent possible. Also, this body should encourage the
education of institutions and people about the possibilities and hazards of the
vast array of content available online. The one overarching principle that this
body must realize and embrace is that to maximize the effectiveness of this
unprecedented resource for all people, undue burdens cannot be placed on the
free flow of information.
[i]
See, e.g., Jon Henley, Guardian Unlimited, Nov. 21, 2000 (visited
Mar. 1, 2001) <http://www.guardianunlimited.co.uk/internetnews/story/0,7369,400681,00.html>.
[ii]
See David R. Johnson & David Post, Law and Borders - The Rise of Law in Cyberspace, 48
Stan. L. Rev. 1367, 1371 (1996) (noting that there are still important
territorial borders in cyberspace, but these new borders exist within a
virtual space where the "power to control activity ... has only the
most tenuous connections to physical location.").
[iii]
See Barbara Esbin, Internet
Over Cable: Defining the Future in Terms of the Past, 7 CommLaw
Conspectus 37, 45.
[iv]
See id. at 25 (discussing the creation of the World Wide Web in 1989
as a “global, online store of knowledge, containing information from a
diversity of sources and accessible to Internet users around the world”).
[v]
ACLU v. Reno, 929 F. Supp. 824, 831 (E.D. Pa. 1996), aff’d sub nom.
Reno v. ACLU, 521 U.S. 844 (1997).
[vi]
See www.isoc.org/inet2000/cdproceedings/8e/8e_4.htm#s5
(containing numerous Internet statistics).
[vii]
74 F.3d 701 (6th Cir. 1996).
[viii]
U.S. CONST. Amend. I. The full text of the First Amendment is: “Congress
shall make no law respecting an establishment of religion, or prohibiting
the free exercise thereof; or abridging the freedom of speech, or of the
press; or the right of the people peaceably to assemble, and to petition the
Government for a redress of grievances.”
[ix]
Red Lion Broadcasting Co. v. FCC, 395 U.S. 367, 386 (1969).
[x]
ACLU v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996), aff’d,
521 U.S. 844 (1997).
[xi]
See id. at 864 (noting that the ACLU’s principal argument was that
the statute was “vague” and “overbroad”).
[xii]
See ACLU v. Reno, 929 F. Supp. 824, 830-849 (E.D. Pa. 1996)
(recognizing that both parties cooperated to successfully construct a
factual background of a sort and to an extent new to the court).
[xiii]
Reno v. ACLU, 521 U.S. 844, 855 (1997).
[xiv]
Id. at 857.
[xv]
418 U.S. 241 (1974) (finding a state statute requiring newspapers to provide
political candidates a right of reply to assaults on their character
unconstitutional under strict scrutiny).
[xvi]
See Declan McCullagh, FBI
Goes After Bonsaikitten.com, Wired.com, Feb. 9, 2001 <http://www.wired.com/news/politics/0,1283,41733,00.html>
(visited Mar. 27, 2001); see also
<http://www.bonsaikitten.com>.
[xvii]
Singapore adopted a three-pronged approach to encourage Internet
development, including: 1) promoting public awareness of the positive aspects
and hazards of using the Internet through public education; and 2) instituting
a light-touch policy framework for regulating content, which is regularly
fine-tuned based on consultation. Joseph C. Rodriguez, A
Comparative Study of Internet Content Regulations in the United States and
Singapore: The Invincibility of Cyberporn, 1 Asian-Pacific
L. & Pol’y J. 9, 17 (2000).
[xviii]
INTERNET CODE OF PRACTICE (No. 3810/97) (visited Mar. 20, 2001) <http://www.sba.gov.sg/work/sba/Internet.nsf/pages/code>
(This statute is part of the Singapore Broadcasting Authority Act, which
first went into effect in November 1997).
[xix]
See Rodriguez, supra note
17, at 18 (outlining Singapore’s Internet regulation framework).
[xx]
SBA’s Approach to the Internet (visited Mar. 20, 2001) <http://www.sba.gov.sg/work/sba/Internet.nsf/ourapproach/1>.
[xxi]
See Rodriguez, supra note
17, at 19 (quoting former Prime Minister Lee on the free flow of information
on the Internet: “The top 3 to 5 percent of a society can handle this
free-for-all, this clash of ideas.” Such statements highlight the sharp
contrast between the way countries such as Singapore view information freedom
and the way the United States and other countries do.)
[xxii] Rodriguez, supra note 17, at 18.
[xxiii]
See Robert Trager, The
Internet Down Under: Can Free Speech Be Protected in a Democracy Without a
Bill of Rights?, 23 U. Ark. Little
Rock L. Rev. 123, 127-131 (2000).
[xxiv]
See Kristi Essick, Yahoo to
Defy French Court Order, The Industry Standard, Feb. 21, 2001; see
also Order Rendered on November 20, 2000 <http://www.nantaka.com/Yahoo-case.html>
(visited Mar. 25, 2001) (English translation of the French order requiring
Yahoo! to filter offensive content from French Internet users).
[xxv]
See Geoffrey Nunberg, The
Internet Filter Farce, The American Prospect, vol. 12, issue 1, Jan. 1,
2001 <http://www.prospect.org/print/V12/1/nunberg-g.html> (visited
Mar. 8, 2001).
[xxvi]
Id. at 9.