What's a Little Information Between Neighbors? -- Policing Offensive Content on the Internet
By Brian Ekstrom
April 13, 2001
[Nicholas Johnson's University of Iowa Cyberspace Law Seminar Spring 2001]

Imagine that you operate an Internet site where users can offer personal items for sale to the entire Internet community. You operate your site from the United States, attracting thousands of users from around the world buying and selling millions of items. Transactions are impossible to monitor due to the large number of users and items sold. The site is divided into numerous categories, from antiques to automotive supplies, to make it easier for users to post items. Users of your system buy and sell everything from cars to computer supplies to books to jewelry. Such sites are sometimes called online auctions. However, your site is different from the traditional notion of an auctioneer. Once the site is in place, it is up to the users to do all the work. The users post their items for sale, select items to purchase, and are responsible for completing the transaction.
 
Your business is operating smoothly, facilitating millions of transactions, when you discover that you have been charged with a crime in a foreign country. The government of that country claims you have broken its laws, even though you have never even set foot there! Nobody from that foreign country had even used your site, although anybody with an Internet connection could have found materials for sale that are illegal to buy and sell in that country. The foreign government claims that by operating your Internet site you are committing a crime under their law. They give you an ultimatum: either monitor the contents of the goods on your system and filter out those offending items from users in that country, or face harsh fines.

The Internet company Yahoo! faced just such a problem in 2000. The French government charged it with violating a French law prohibiting the advertisement, exhibition, or sale of any objects likely to incite racial hatred. The charges came after Jewish and anti-racist groups in France expressed outrage that Nazi memorabilia were among the items available on Yahoo!'s auction pages. Although Yahoo!'s French language portal, Yahoo.fr, does not contain any of the offensive material, French users are able to access the offensive material through the Yahoo! English language site. There were no documented instances of any French citizens ever purchasing or making available such material.1
 
This example illustrates the problem that the world's numerous conflicting cultural and political differences create for e-commerce. Traditionally, territorial borders determine a government's domain and thus the government's power to control the people within its borders. The laws those governments implement reflect the moral and religious beliefs of the people within their boundaries. Laws may prohibit activities that violate those beliefs. The Internet, by contrast, has no territorial boundaries, because the cost and speed of message transmission on the Net is almost entirely independent of physical location.2  Is it possible for a government to protect its people from material on the Internet that offends its moral and religious beliefs? Can a single nation legislate standards that are applicable to the whole of the Internet? Does Yahoo! have an obligation to prevent all of its customers from accessing material on its system that might offend the few who live in the Middle East?

This paper will discuss these questions and the solutions that exist when conflicting cultures meet the global entity known as the Internet. First, it addresses the various forms this conflict takes and how different countries have dealt with the problem. Second, it examines the technological capabilities currently in place that allow governments and Internet providers to customize the content available to different countries. Third, it examines possible solutions to the problem and offers a recommendation to maximize the potential of the Internet while respecting the interests of all parties.


Cultural Conflict and the Internet

The Internet is a widespread series of interconnected computer networks, often referred to as a “network of networks.”3  The owners of the computers and networks that make up the Internet include government and public institutions, non-profit organizations, and corporations.4  One court described this new technology as "a decentralized, global medium of communications - or 'cyberspace' - that links people, institutions, corporations, and governments around the world.”5  A defining characteristic of the Internet is this absence of geographic boundaries: once a user is connected to the Internet from anywhere in the world he or she has access to all of its resources.6

With the Internet's expansion into formerly unserved countries, cultural conflicts arise. In 1995 over two-thirds of Internet users were located in the United States. By the year 2005 the United States is projected to have less than one-third of all Internet users. The fastest growth will be in regions such as the Middle East, South and Central America, and Asia.7 These areas of the world have cultures that often differ dramatically from those of the United States and Western Europe. As Internet users from diverse cultures become more and more prevalent, solutions will be needed to deal with the inevitable growth in cultural conflicts.

These conflicts can also arise within a country. In United States v. Thomas, a California couple operated a "for fee" Bulletin Board Service (BBS) that allowed members to download pornographic materials. The Thomases were indicted by a grand jury in the Western District of Tennessee and were convicted under, among other federal statutes, 18 U.S.C. § 1465. The charge was that they knowingly used a facility and means of interstate commerce -- in this case, the combined computer/telephone system -- for the purpose of transporting obscene, computer-generated materials in interstate commerce. The couple claimed that under the First Amendment the "community standards" by which the "obscene" nature of the materials should be measured are those of the "cyberspace community" and not those of Memphis, Tennessee, where the materials were viewed.

One prong of the First Amendment analysis of obscenity cases is whether the average person, applying contemporary community standards, would find that the work in question, taken as a whole, appeals to the prurient interest. It was argued on behalf of the defendants that the relevant community standards were not those of Memphis, Tennessee, where the defendants had been prosecuted, but rather a new definition of community was needed that recognized a community of cyberspace, rather than the geographic locations of people. The court rejected this argument, first holding that "obscenity is determined by the standards of the community where the trial takes place" and that "it is not unconstitutional to subject interstate distributors of obscenity to varying community standards."

The court acknowledged the defendants' concern that such a ruling might lead to an impermissible chill on protected speech. Because BBS operators cannot select who receives the materials they make available on their bulletin boards, they would be forced to censor their materials so as not to run afoul of the standards of the community in America with the most restrictive standards. However, the court believed that the First Amendment was not implicated by this prosecution because, unlike the hypothetical bulletin board operator who has no knowledge or control over the jurisdictions where materials are being distributed for downloading or printing, access to the defendants' BBS could be obtained only by revealing one's geographic location. Because the defendants "had in place methods to limit user access in jurisdictions where the risk of a finding of obscenity was greater than in California" and could avoid liability in jurisdictions with less tolerant obscenity standards by refusing membership to persons in those communities, the court held that it did not need to adopt a new definition of "community" for use in obscenity prosecutions involving BBSs. It left for another day the First Amendment questions implied, but not directly presented, by this case.

The First Amendment of the United States Constitution states that “Congress shall make no law … abridging the freedom of speech.”8  However, the Constitution does not protect all forms of speech. Obscene material does not receive any First Amendment protection. Indecent speech, on the other hand, is constitutionally protected. Its regulation is subject to "strict scrutiny." This means that in order for a regulation of indecency to be upheld the regulation must be necessary to serve a compelling government interest and be narrowly drawn to achieve that interest. The strict scrutiny standard is applied to any legislation that limits the content of protected speech. There are, of course, some differences in the Court's approach to different media, such as radio, newspapers and television.9

In the landmark decision in Reno v. ACLU10 the Court accorded the Internet the same level of protection as that received by the print media. The ACLU claimed that the Communications Decency Act of 1996 (CDA) infringed upon protected First and Fifth Amendment rights.11  The U.S. District Court for the Eastern District of Pennsylvania conducted extensive fact-finding, covering everything from the history of the Internet to the specific issue of sexually explicit material.12  In a unanimous decision, the three-judge panel enjoined the enforcement of the CDA and ruled it unconstitutional. The case went directly to the Supreme Court, which affirmed.13  The Court analyzed the statute under the strict scrutiny standard, first characterizing it as overly broad, and then noting that less restrictive alternatives were available.14

The Internet is a new medium that the Court chose not to stifle with an overly broad CDA. In terms of First Amendment protection, the Internet is now closer to print media than to broadcast media or common carriers, and the Court's reasoning in Reno suggests that a vague statute implicating First Amendment freedoms will not pass strict scrutiny. Reno is indicative of the serious consideration the United States affords the protection of speech on the Internet, in contrast to some other countries.

As the Thomas case demonstrates, the United States is not immune to finding content offensive that other countries and cultures could find acceptable, or at least allowable within their law. The FBI recently began an investigation of an Internet site, Bonsaikitten.com, run by MIT students, after receiving complaints from people offended by the site's depiction of cruelty to animals.15  Although the site is clearly a spoof, claiming to apply the ancient bonsai techniques of Japan to kittens, FBI officials were not amused. The possible crime involved is a federal law passed in December 1999 prohibiting the transfer across state lines of a depiction of animal cruelty. Because Internet transmissions cross state lines, the law clearly applies to the Internet. Although this law has not yet been challenged on constitutional grounds, it is probably unconstitutional. Regardless of its constitutionality, it illustrates how the United States, with its tradition of allowing a free flow of information, may also have limits potentially violated by the borderless Internet.


Singapore

Internet regulations vary from nation to nation, reflecting the different social values held by diverse cultures. In policing the Internet, some nations prohibit content based on broad terminology, such as being against public interest or public morality. For example, in 1996 Singapore enacted an elaborate administrative law framework regulating Internet content, imposing stringent controls on the media. In developing these regulations, Singapore had to resolve an obvious tension: it has an aggressive information technology growth strategy that allows vast amounts of uncensored information into the country via the Internet, a technological freedom at odds with Singapore's traditional restrictions on media.

Singapore has policies in place to protect its citizens from potentially harmful material, while attempting to still encourage Internet development.16  Singapore forbids Internet Service Providers (ISPs) and Internet Content Providers (ICPs) from providing material that is objectionable on the grounds of “public interest, public morality, public order, public security, national harmony, or that is otherwise prohibited by applicable Singapore laws.”17  ISPs and ICPs must use their “best efforts” to comply with the Code.18  They must also act to prevent specified content from inclusion in any broadcasting service (including Internet content).19  The prohibited content includes material that the government deems against the public's interest, in bad taste, or indecent. Although ICPs within Singapore are prohibited from providing offensive materials, ISPs are not required to monitor the Internet or its users. However, they are required to eliminate access to 100 “high impact” pornographic sites. The designation of these few sites, among the thousands of potentially offending sites, was made as a “statement of societal values.”20  The reality of enforcement of Singapore's regulations is likely even more stringent than the written laws.21

Censoring the Internet has proved impossible for Singapore. This is true despite Singapore’s high degree of technological advancement and the relatively small number of users and content it must regulate. Singapore is unable to impose the same level of restriction on the Internet as it imposes on other types of media. Although the government receives assistance from ISPs (which are either owned by or connected to the government) to help its censoring efforts, it has still failed. The head of the Singapore Broadcasting Authority, the agency in charge of Internet regulation, notes that “there is a limit to what domestic legislation can achieve in the face of a global and borderless medium like the Internet... [I]t was impossible to fully regulate the Internet.”22


Australia

Australia takes a different approach to Internet content regulations. The Broadcasting Services Amendment Act was adopted in 1999.23  This regulatory scheme was a response to fears that the complex system of controls and regulations in Australia on telecommunications, publishing, and broadcasting would be threatened by the availability of any form of content via the Internet. The Act, which attempts to protect Australia’s children from threats such as pornography, neo-Nazis, pedophiles, and bomb-making recipes, has faced mixed reactions from the public. The Act established a complaint-based system. The Australian Broadcasting Authority (ABA) has the power to require illegal or offensive Internet sites to be removed from the Internet or for access to those sites to be prevented.

Australia’s regulations attempt to treat Internet content the same as other content, so that what is illegal in traditional media is also illegal online. However, the statute treats domestic content differently from content originating outside Australia. If a complaint is filed about material hosted in Australia, the ICP is ordered to cease carrying the offending material. Alternatively, if a complaint regarding foreign material is filed and the material is found to be prohibited, ISPs are required to take reasonable steps to prevent end-users from accessing it. Obviously, if no complaint is made, no content is restricted. Due to the number of offending sites on the Internet, it is impossible to regulate them all. There has not yet been a legal challenge to the Australian legislation. Since Australia does not have any free speech protection in its constitution, it is not clear that any grounds exist to compel a court to overturn the regulations.

After a French court ordered Yahoo! to filter its content to make offensive memorabilia unavailable in France, Yahoo! removed the offending items from its auctions. However, Yahoo! did not comply with the court order, instead filing a countersuit in the United States arguing that: 1) it is technologically impossible for Yahoo! to comply with the French court's filtering order, and 2) the French court has no jurisdiction over U.S.-based Yahoo!. Although the offending items were removed from its site, Yahoo! claims this was due to a change in its own policies, and not to comply with the French court.24  Yahoo! likely did not wish to pursue a possibly lengthy and complex court battle. While French activists were pleased with Yahoo!'s actions, their campaign is far from over: they have expressed their desire to prohibit offensive material on other sites, such as eBay.com, which also hosts the sale of Nazi memorabilia and other offensive materials.


Technology

As individuals, governments, and content providers grapple with the problems of offensive content, many see technological solutions as the best approach. Content filtering software is increasingly available in the marketplace. Filters come in several forms. While some are implemented "upstream," at the level of servers that control access for whole schools, libraries, or businesses, others are implemented "downstream" at individual workstations. Conceptually, they all work similarly. First, "control lists" are generated, containing the addresses of unacceptable sites. Then automatic keyword filters are added to block additional sites that contain certain words and phrases. The user of the filter can specify the categories of sites to block, such as "hate speech" or "sex acts". Technology also exists to monitor the sites accessed by users.25
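The two-pass design described above -- a control list of known sites, followed by keyword matching, organized into user-selectable categories -- can be sketched in a few lines of code. This is a minimal illustration only; the category names, site addresses, and keywords below are invented for the example and are not drawn from any real filtering product.

```python
# A simplified model of a category-based content filter: each category
# pairs a "control list" of known unacceptable sites with keywords used
# to catch additional pages automatically. All entries are hypothetical.
BLOCKED_CATEGORIES = {
    "hate speech": {"sites": {"example-hate.com"}, "keywords": {"racial supremacy"}},
    "sex acts":    {"sites": {"example-adult.com"}, "keywords": {"xxx"}},
}

def is_blocked(url, page_text, enabled_categories):
    """Return True if the URL or its text matches any enabled category."""
    host = url.split("//")[-1].split("/")[0].lower()
    text = page_text.lower()
    for category in enabled_categories:
        rules = BLOCKED_CATEGORIES[category]
        # First pass: check the control list of known unacceptable sites.
        if host in rules["sites"]:
            return True
        # Second pass: automatic keyword filtering of the page content.
        if any(keyword in text for keyword in rules["keywords"]):
            return True
    return False

print(is_blocked("http://example-adult.com/page", "", {"sex acts"}))   # True
print(is_blocked("http://example.org/news", "daily news", {"sex acts"}))  # False
```

A user who enables only the "hate speech" category would see the adult site pass through untouched, which is how commercial filters let purchasers tailor blocking to their own standards.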

There are inherent inadequacies in the present filtering systems. To block objectionable material, the filter must be over-broad, blocking unintended sites. For example, in a recent study, 1,000 randomly chosen addresses in the dot-com domain were submitted to the SurfWatch filter. Of the sites it blocked as "sexually explicit," more than four out of five were misclassified; the sites of an antiques dealer in Wales, a Maryland limousine service, and a storage company in California were all blocked.26 The reason for this overblocking is that it is impossible to single out objectionable sites simply by the words they use. Other types of objectionable content are even harder to filter because they do not give themselves away with genre-specific keywords such as "XXX". Another problem is that as the Internet continues to grow, the amount of objectionable material will grow with it.
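The overblocking problem is easy to demonstrate: a keyword filter cannot tell whether a flagged word appears in an objectionable context or merely inside an innocent word or phrase. The snippet below, with invented page titles, shows how a filter keyed on the word "sex" sweeps in place names like "Sussex" and "Middlesex" -- precisely the kind of misclassification the SurfWatch study found.

```python
# Why naive keyword filtering overblocks: substring matching cannot
# distinguish context. The page titles and flags below are invented.
KEYWORDS = {"sex"}

pages = {
    "Sussex Antiques, fine Welsh furniture": False,  # innocuous
    "Middlesex Limousine Service":           False,  # innocuous
    "XXX adult site, explicit sex acts":     True,   # genuinely objectionable
}

def keyword_blocked(text):
    """Return True if any filter keyword appears anywhere in the text."""
    text = text.lower()
    return any(keyword in text for keyword in KEYWORDS)

for title, objectionable in pages.items():
    if keyword_blocked(title) and not objectionable:
        print(f"false positive: {title!r}")
```

Here every page is blocked, but two of the three blocks are false positives, a ratio in the same spirit as the four-out-of-five misclassification rate reported in the study.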


Solutions

As the Internet continues its growth, every conceivable type of information, viewpoint, and artistic creation will be available for all users to encounter. This wealth of information, viewpoints, and creations will inevitably cross borders, sometimes to places where they are not welcome. Such will be the cost of participation in the global network containing all cultures. This does not mean, however, that we will live in a homogenized world, where the lowest common denominator will be the only standard.

Each of our cultures will form the basis for our participation in the global environment, the way in which we view the world, and the way in which the world views us. What will change is that underlying each culture's involvement in the global environment will be a heightened awareness and tolerance of the societal standards of others. Countries and cultures that more quickly and easily embrace the free flow of information will have a clear advantage over those that try to hold onto rigid standards of information control. And yet even the most libertarian cultures will face limits on their tolerance and need to limit Internet content at times. There will be three keys to solving this balancing problem: 1) technology; 2) education; and 3) global cooperation.

First, there will always be technological solutions to technological problems. Problems created by technology (the proliferation of offensive content via the Internet) can be solved by technology (filters). These solutions, however, will only be partial, and will have their own drawbacks. This is a fact of life in the technological environment: new technology always creates problems that must be dealt with, and the technology is always one step ahead of the solution. Further, the pace of technological advancement is far quicker than that of policy. Legal standards in place to limit Internet content based on technological capabilities will always be at least a step behind. Governments will use technology to set standards on Internet content available to their citizens, but these technological solutions will not be fully adequate to solve the problem.

Second, individuals, governments, and service providers will become more educated about the opportunities and threats of the Internet. This will lead to a better understanding of how to deal with the problem. A sort of maturity will develop in which people and institutions will learn to seek out the content they want and ignore that which is offensive. For example, governments can provide recommended or required portals through which people access the Internet, directing them toward content that meets their personal and cultural standards. Also, as individuals become more savvy, they can select content providers that cater to their particular requirements. A sophisticated user with access to the Internet will always be able to find his or her way into those dark alleys that contain offensive or obscene content, but the majority of people will stick to the well-lit highways that provide the content they need in the easiest manner possible.

Third, global cooperation is required to strike the balance between a flourishing Internet and respect for cultures. Eventually, all countries and most people will likely have a presence on the Internet. People will associate through language, not location. Regardless of how the Yahoo! case and those like it come out, it will be entirely impractical for one jurisdiction to attempt to censor a content provider over whom it has no power. The mere existence on the Internet of a piece of information, and therefore the availability of that information to anyone in the world with Internet access, does not give any one country the power to judge that information and determine its fate. The burden will be on those who attempt to restrict the free flow of information, not on those who provide information, no matter how offensive it might be. It should be the common goal to allow the free flow of all information to all nations to the extent each deems acceptable.

Achieving this goal will require international cooperation. An international Internet oversight body should be developed, focused in part on helping each country strike the balance it needs between the free flow of information on the Internet and societal standards that require content restriction. This body should have as its purpose the development and distribution of essential technologies to allow nations to adopt their own content standards to the extent possible. Also, this body should encourage the education of institutions and people about the possibilities and hazards of the vast array of content available online. The one overarching principle that this body must realize and embrace is that to maximize the effectiveness of this unprecedented resource for all people, undue burdens cannot be placed on the free flow of information.


Endnotes

1. See, e.g., Henley, Jon, Guardian Unlimited, Nov. 21, 2000 (visited Mar. 1, 2001) <http://www.guardianunlimited.co.uk/internetnews/story/0,7369,400681,00.html>

2.  See Johnson, David R., Post, David, Law And Borders - The Rise of Law in Cyberspace, 48 STAN. L. REV. 1367, 1371 (noting that there are still important territorial borders in cyberspace, but these new borders exist within a virtual space where the "power to control activity ... has only the most tenuous connections to physical location.").

3.  See Barbara Esbin, Internet Over Cable: Defining the Future in Terms of the Past, 7 COMMLAW CONSPECTUS 37, 45.

4.  See id. at 25 (discussing the creation of the World Wide Web in 1989 as a “global, online store of knowledge, containing information from a diversity of sources and accessible to Internet users around the world”).

5.  ACLU v. Reno, 929 F. Supp. 824, 831 (E.D. Pa. 1996), aff’d, Reno v. ACLU, 521 U.S. 844 (1997).

6.  Another of the Internet's defining characteristics is its durability. As envisioned by its original creators, the Internet can still function if portions of it do not work, by finding alternate routes for information to travel through the system.

7.  See <http://www.isoc.org/inet2000/cdproceedings/8e/8e_4.htm#s5> (containing numerous Internet statistics).

8.  U.S. CONST. Amend. I. The full text of the First Amendment is: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

9.  Red Lion Broadcasting Co. v. FCC, 395 U.S. 367, 386 (1969).

10.  ACLU v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996), aff’d, 521 U.S. 844 (1997).

11.  See id. at 864 (noting that the ACLU’s principal argument was that the statute was “vague” and “overbroad”).

12.  See ACLU v. Reno, 929 F. Supp. 824, 830-849 (E.D. Pa. 1996) (recognizing that both parties cooperated to successfully construct a factual background of a sort and to an extent new to the court).

13.  Reno v. ACLU, 521 U.S. 844, 855 (1997).

14.  Id. at 857.

15.  See McCullagh, Declan, FBI Goes After Bonsaikitten.com, Wired.com, Feb. 9, 2001 <http://www.wired.com/news/politics/0,1283,41733,00.html> (visited March 27, 2001); see also http://www.bonsaikitten.com.

16.  Singapore adopted a three-pronged approach to encourage Internet development: 1) promote public awareness of the positive aspects of using the Internet through public education; 2) promote public awareness of the hazards of using the Internet through public education; 3) institute a light-touch policy framework in regulating content which is regularly fine-tuned based on consultation. Rodriguez, Joseph C., A Comparative Study of Internet Content Regulations in the United States and Singapore: The Invincibility of Cyberporn, 1 ASIAN-PACIFIC L. & POL’Y J. 9, 17 (2000).

17.  INTERNET CODE OF PRACTICE (No. 3810/97) (visited Mar. 20, 2001) <http://www.sba.gov.sg/work/sba/Internet.nsf/pages/code> (This statute is part of the Singapore Broadcasting Authority Act, which first went into effect in November 1997).

18.  See Rodriguez, supra note 16, at 18 (outlining Singapore’s Internet regulation framework).

19.  See id.

20.  SBA’s Approach to the Internet (visited Mar. 20, 2001) <http://www.sba.gov.sg/work/sba/Internet.nsf/ourapproach/1>.

21.  See Rodriguez, supra note 16, at 19 (quoting former Prime Minister Lee on the free flow of information on the Internet: “The top 3 to 5 percent of a society can handle this free-for-all, this clash of ideas.” Such statements highlight the sharp contrast in the way countries such as Singapore view information freedom with that of the United States and other countries).

22.  Rodriguez, supra note 16, at 18.

23.  See Trager, Robert, The Internet Down Under: Can Free Speech Be Protected in a Democracy Without a Bill of Rights?, 23 U. ARK. LITTLE ROCK L. REV. 123, 127-131 (2000).

24.  See Essick, Kristi, Yahoo to Defy French Court Order, The Industry Standard, Feb. 21, 2001; see also Order Rendered on November 20, 2000 <http://www.nantaka.com/Yahoo-case.html> (visited Mar. 25, 2001) (English translation of French order requiring Yahoo! to filter offensive content from French Internet users).

25.  See Geoffrey Nunberg, The Internet Filter Farce, The American Prospect, vol. 12, issue 1, Jan. 1, 2001 <http://www.prospect.org/print/V12/1/nunberg-g.html> (visited March 8, 2001).
 
26. Id. at 9.