Denying Children the World:
The Conditioning Of Federal Funding Upon School and Library Internet
Filtering
Sex, nudity, perversion, and the sexual solicitation of children run rampant over the Internet, and federal legislators have renewed efforts to block minors' access to certain information online by, in effect, bribing public schools and libraries. To continue their protect-the-children agenda while avoiding the Supreme Court ruling1 that struck down vague, overbroad portions of the Communications Decency Act of 1996 ("CDA"),2 several members of the 105th Congress introduced bills aimed at preventing young people from accessing material deemed inappropriate or harmful.3 Despite steering clear of the issues decided by the Supreme Court in Reno, this legislation might nevertheless unconstitutionally infringe the freedom of expression of young people in the United States who use the Internet ("netizens").4 This paper focuses on the Safe Schools Internet Act, introduced into the House and Senate by Representative Franks and Senator McCain, respectively, which conditions federal funding for public school and library computer service upon the installation of Internet filters to block material that minors should not experience.
The Supreme Court has consistently validated federal funding that is dependent upon conditions that further government policy while appearing to inhibit constitutional rights under the First Amendment, including freedom of expression and the right to receive information.5 Limits apply to such governmental power, however, especially in the realm of education.6 The United States educational system plays a vital role in society and its traditions of freedom of expression and association.7 The Court adheres to the idea that the First Amendment's vagueness and overbreadth doctrines limit the government's ability to condition federal funding on cooperation with policies that suppress free speech in educational settings.8 The question becomes whether the federal government, through the McCain-Franks Safe Schools Internet Act, can constitutionally condition federal funding for universal computer service in public schools and libraries on the installation of Internet filtering and blocking software that censors material deemed harmful to or inappropriate for minors according to local community standards. This paper examines the possible First Amendment objections to this legislation.
The Unconstitutional Communications Decency Act9
In Reno, the American Civil Liberties Union filed suit challenging the constitutionality of two provisions of the CDA.10 One provision prohibited the knowing dissemination, to anyone under eighteen years of age, of obscene or "indecent" material.11 Another prohibited the knowing computer dissemination or display of "patently offensive" material in a "manner that is available to anyone under eighteen years of age."12 The CDA offered ways around these restrictions short of curtailing Internet use altogether: two affirmative defenses limit the application of the provisions -- "good faith, reasonable, effective, and appropriate actions under the circumstances to restrict or prevent access by minors" using available technology,13 and the restriction of access by "requiring use of a verified credit card, debit account, adult access code, or adult personal identification number."14
In a suit brought by the American Civil Liberties Union, among others, the District Court entered a temporary restraining order against the part of the CDA prohibiting indecent transmissions because the descriptor "indecent" is too vague a basis for criminal prosecution.15 Twenty-seven plaintiffs, including activist groups, newspaper associations, and computer corporations, then filed a second suit.16 The two cases were consolidated, and a three-judge District Court panel entered a preliminary injunction against enforcement of both the "indecent" and "patently offensive display" provisions.17 The U.S. Supreme Court affirmed, striking down the two challenged provisions because they violated the First Amendment to the U.S. Constitution.18
The Court in Reno distinguished regulation of the Internet from regulation of other media that the Court previously found constitutional.19 The factors that make regulation of a medium constitutional include a history of extensive government regulation of the medium under scrutiny,20 a channel "scarcity" necessitating government allocation among users,21 and the medium's invasive, as opposed to passive, nature.22 The Court found that the Internet exhibits none of these characteristics.23 Although laws apply to the Internet, this new communication medium has not been and is not "subject to the type of government supervision and regulation that has attended the broadcast industry."24
The Internet's freedom from extensive government intrusion stems mostly from the second and third factors listed above -- the Internet's huge capacity and non-invasive nature. In support of these factors, the Court in Reno cited a government estimate that "[a]s many as forty million people use the Internet today, and that figure is expected to grow to two hundred million by 1999."25 This explosive growth and "relatively unlimited, low-cost capacity for communication of all kinds"26 distinguish the Internet from other media, such as radio and television. No medium other than the Internet permits or enables users to employ so many forms of communication to express themselves, including text, audio, video, photography, and interactive real-time dialogue.27
In addition to the Internet's enormous capacity alluded to by the government and the Court in Reno,28 this medium does not need extensive government intervention because it is non-invasive; it requires affirmative steps on the part of the reader or viewer to access its information.29 The District Court in Reno found that "[c]ommunications over the Internet do not 'invade' an individual's home or appear on one's computer screen unbidden."30 Furthermore, warnings almost always precede potentially offensive material, which reduces chance or accidental encounters with such communications.31
After distinguishing the Internet from highly regulated media such as radio and television, the Court proceeded to hold the two challenged CDA provisions unconstitutional.32 The Court so held because the provisions were vague, overbroad, not tailored narrowly enough to meet the government interest in shielding children from inappropriate material, and unnecessary.33 The use of both "indecent" and "patently offensive" in two different provisions, and the omission of a specific definition of either term, would likely confuse a speaker not only about whether her or his speech violates the CDA but also about which particular provision applies.34 The vague terminology in the challenged CDA provisions "presents a ... threat of censoring speech that, in fact, falls outside the statute's scope ... [and] unquestionably silences some speakers whose messages would be entitled to constitutional protection."35 The Court also stated that, because the CDA criminalizes significant amounts of speech to which adults have a right of access, and because alternatives that do not violate the First Amendment exist, the CDA is both unacceptable and unnecessary.36
Legislative Sequels to the CDA
Continuing legislative efforts to shield children from adult material on the Internet adopt new, unproven tactics because the CDA standards of "indecent" and "patently offensive display" clearly lack sufficient specificity to pass constitutional muster. Two distinct approaches emerged after the Reno decision, one targeting the distribution end of the communication, one targeting the receiving end.37 The first approach, embodied in the Coats bill, parallels the CDA by targeting the distributor.38 The second approach, embodied in the Safe Schools Internet Act, focuses on the receiving or accessing end of the medium by conditioning federal funding for schools and libraries on the installation of filters or blocking software on Internet-connected computers that children use.39
Of the two bills, the Safe Schools Internet Act comes closest to being constitutional. The Coats bill's ban on commercial distribution will not withstand scrutiny by the Supreme Court. The Coats bill uses the phrase "harmful to minors" as the standard for determining illegal material, based partly upon the currently accepted obscenity standard.40 The Supreme Court's Reno ruling invalidating the CDA illustrates that a statute must define its terminology in order to withstand a constitutional challenge based on vagueness.41 Senator Coats attempts to comply with this requirement by including guidelines for determining what exactly is harmful to minors for purposes of his bill.42 The bill seeks to alleviate confusion among speakers by directing the Attorney General and the Federal Communications Commission to post definitions on their web sites, supposedly to inform the public regarding which speech the bill criminalizes.43 The bill itself offers a definition of the targeted "harmful to minors" material.44 While the bill addresses the specificity requirement, it does not satisfy the Court's Miller obscenity standard.45 The Court in Miller required that the determination of what appeals to the "prurient interest" and is "patently offensive" be made according to local community standards, rather than a universal or national standard.46 No wonder Senator Coats failed to find anyone to cosponsor S. 1482.47
The McCain-Franks Safe Schools Internet Act, however, comes close enough to meeting the constitutional test to enjoy support.48 The bill provides federal funding for public school and library computer programs only if they use filters or blocks to prevent children from accessing "inappropriate" material.49 The Safe Schools Internet Act allows local communities to determine what is "inappropriate" according to their own local standards.50 This is the Miller local-standard element vital to legislation infringing on the First Amendment.51
To ensure that schools, school boards, libraries, or local authorities can make an autonomous determination, the Safe Schools Internet Act proscribes federal government intervention.52 The federal government may not "(A) establish criteria for making that determination; (B) review the determination made by the certifying school, school board, library, or other authority; or (C) consider the criteria employed by the certifying school, school board, library, or other authority ..."53 To avoid problems of vagueness and a blanket national standard, the bill leaves everything but compliance certification to the local access site. By doing so, the bill reserves little power to the federal government and does not seriously further the government's objective of censoring adult material.
Conditional Federal Funding and Education
The Supreme Court has consistently held government support of speech in the form of subsidies and tax expenditures that discriminates in favor of one speaker over another to be a legitimate exercise of government power that does not violate the First Amendment.54 A government's failure to finance everyone's exercise of a fundamental right does not amount to suppression of that right, and this distinction removes such regulation from strict judicial scrutiny.55 However, conditioning federal funding for education on restrictions on speech presents special constitutional problems.
Education and the First Amendment go hand-in-hand.56 Arising from challenges to patriotic loyalty oaths as a requirement for keeping a public institution teaching position,57 the policy that educational settings deserve the highest First Amendment protection has not changed over the years and advances in technology.58 The Safe Schools Internet Act clearly targets two sanctified educational settings -- schools and libraries.59 Although public libraries have not been the subject of litigation the way schools have, the two settings are analogous because of their free flow of democratic communication. This justifies at least equal First Amendment protection for institutions such as libraries, protection that schools already enjoy.
The Supreme Court broadly permits conditioning of federal funding on adherence to policies that may be construed to suppress freedom of expression in non-media contexts, such as political lobbying60 and health care.61 When such laws are upheld, it is usually because the Court finds that they do not actually suppress speech; instead, they merely refuse to grant tax exemptions or to pay people to say things with which the government does not agree.62 In Rust, for example, Title X of the Public Health Service Act effectively forbade health care professionals from discussing abortion with patients served by federally funded clinics.63 The Court found this constitutional because the restriction only reached speech uttered in the course of one's duties as a federally funded employee.64 Staff were free to say whatever they wanted outside the confines of the federally funded project. The Rust case focused on health care professionals and their freedom of expression while employed on a federally funded project, whereas the schools and libraries controversy is over the right of public school children to access or receive information, a right the First Amendment protects.
Filters Might (Not) Be The Answer
Concerned lawmakers, communities, and parents who wish to restrict their children's Internet access currently face the difficult question of how best to protect children from Internet material that those responsible for their upbringing consider inappropriate or harmful.65 Of course, parents and guardians who do not home-school their children must surrender a certain amount of control over what their children experience in public institutions. This is where legislators step in. Instead of trusting schools and libraries to properly supervise children in their charge, federal legislators now seek to force the installation of filtering software by threatening to deny them federal funding.66 The Reno District Court predicted that reasonably effective filtering software would soon be available to allow parents to control their children's Internet exploration.67 However, several groups dedicated to freedom of expression on the Internet have published convincing reports that filtering software blocks material far beyond that which is inappropriate for children.68
Disagreement exists over the sufficiency of filtering software. The Center for Democracy and Technology ("CDT") has confidence in existing technology, but the Electronic Privacy Information Center ("EPIC") does not.69 The CDT contends that filtering software is now widely available, easy to use, and effective; that it can accommodate a diversity of family values and educational needs; and that it protects children without suppressing freedom of expression.70 EPIC, on the other hand, recently conducted a study of filtering software and found that it blocks large amounts of material that is entirely appropriate for children.71 Thus, although the Safe Schools Internet Act leaves to local authorities the selection of filtering software and the determination of what exactly "inappropriate for minors" means, children's access to materials on the Internet could still be unconstitutionally suppressed if acceptable software does not exist.
To discover whether filtering will serve the purpose of shielding minors only from harmful or inappropriate material, EPIC searched for information about schools, charitable and political organizations, educational, artistic, and cultural institutions, and general concepts of interest to minors.72 EPIC used Family Search, a web-based search engine introduced by Net Shepherd and Alta Vista.73 EPIC's filtered searches for terms such as "American Red Cross," "San Diego Zoo," "Christianity," and the "Bill of Rights" blocked almost ninety percent of Internet material containing these terms.74 The CDT cites several statistics supporting its contention regarding the availability of plenty of inexpensive or free filtering software,75 but does not include evidence of the software's effectiveness or narrow scope.
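The kind of over-blocking EPIC documented is easy to reproduce with the crude string-matching that early filters relied upon. The sketch below is purely illustrative and hypothetical -- it reproduces no actual product's blocklist or algorithm -- but it shows why a filter keyed to words like "sex" sweeps in entirely innocent material:

```python
# Illustrative only: a naive keyword-based filter of the sort criticized in
# the EPIC study. The blocklist and matching rule are hypothetical, not any
# vendor's actual implementation.

BLOCKED_KEYWORDS = {"sex", "nude", "xxx"}  # hypothetical blocklist


def is_blocked(page_text: str) -> bool:
    """Block a page if any blocklisted keyword appears as a substring."""
    text = page_text.lower()
    return any(word in text for word in BLOCKED_KEYWORDS)


# Over-blocking: innocuous pages trip the substring match.
assert is_blocked("Middlesex County Public Library hours")    # "sex" in "Middlesex"
assert is_blocked("Sussex nature preserve field trip")        # "sex" in "Sussex"
# Under-blocking: objectionable content with no listed keyword passes.
assert not is_blocked("Explicit content described euphemistically")
```

The asymmetry matters legally as well as technically: the filter silences protected speech (over-blocking) while failing to reach the speech it targets (under-blocking).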
Three months after the CDT published the online article How Filtering Tools Enable Responsible Parents to Protect Their Children Online, it published an article submitting a framework for analyzing "user empowerment approaches" to blocking and filtering Internet content.76 The CDT asserts that, in order to pursue the equally important objectives of protecting children and freedom of expression, Internet policy and practice should be judged based on four principles: "effective and trusted solutions," "promotion of a diversity of voices," "sustainability in the unique Internet environment," and "minimization of government censorship of protected expression."77 The CDT measures three broad approaches by applying these principles. The approaches include using no filtering, omnipresent uniform self-labeling, and multiple, third-party filtering.78
The no-filtering approach clearly protects freedom of expression, but it leaves serious doubts about protecting children. Although there is "general agreement that healthy, constructive use of the Internet by children begins with parental involvement and responsibility,"79 the safety of poorly supervised children and children outside of the home is still reasonable cause for concern. The librarians and teachers in well-funded and developed schools and libraries with low student-to-teacher ratios might be able to adequately proctor children's Internet exploration, but such conditions are far from universal.
The second option, ubiquitous self-labeling, is also insufficient at this time.80 The difficulty of establishing a single, objective self-rating system and ensuring universal participation on the Internet is evident. Although pornographic sites are easily treated, software still cannot distinguish between a mass murder reported on www.MSNBC.com and sado-masochistic orgies illustrated on www.teenagesex.com.
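The structural weakness of self-labeling can be stated as a dilemma: most sites will never label themselves, so the filter must either block the unlabeled web wholesale or wave it through. The following sketch is a hypothetical illustration of that dilemma, not the actual PICS label format or any deployed system:

```python
# Illustrative sketch of the self-labeling dilemma. The label table and
# categories are hypothetical; this is not the PICS wire format.

SITE_LABELS = {  # hypothetical self-assigned labels
    "www.msnbc.com": "news",
    "www.teenagesex.com": "adult",
}


def allowed(host: str, block_unlabeled: bool) -> bool:
    """Decide access from a site's self-assigned label, if it has one."""
    label = SITE_LABELS.get(host)
    if label is None:
        # The vast majority of sites never label themselves, so either
        # policy choice fails: block them all, or let them all through.
        return not block_unlabeled
    return label != "adult"


# Blocking unlabeled sites over-blocks nearly the whole web;
# allowing them defeats the filter for any unlabeled adult site.
assert allowed("www.msnbc.com", block_unlabeled=True)
assert not allowed("www.unlabeled-example.org", block_unlabeled=True)
assert allowed("www.unlabeled-example.org", block_unlabeled=False)
```

Until participation approaches universality, then, a self-labeling regime is either grossly over-inclusive or trivially evaded.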
The CDT's third option, multiple, third-party rating services, seems more promising.81 Commercial enterprises have developed close to fifteen different third-party rating services, evidence of this option's viability.82 The competitive, open market for such blocking and filtering services, combined with new testing by computer and consumer magazines, means that "market pressures are forcing competing products to do a better and better job of finding sites which should be blocked."83
Conclusion
Federal legislators continue to seek to achieve the goals of the unconstitutional provisions of the CDA through similar but untested new legislation. These efforts make it extremely pressing to develop and implement alternative means of protecting children without suppressing freedom of expression. The intricacy and vastness of the Internet, as well as its international reach, prevent an easy solution through government regulation. A policy of conditioning federal funding on installation of Internet filtering software on school and library computers will result in gross, unconstitutional infringement of the First Amendment in the most fundamental free expression forums.
1. Reno v. ACLU, 117 S.Ct. 2329 (1997).
2. Communications Act of 1934, as amended, 47 U.S.C.A. §§ 223(a), (d)
(1997).
3. S. 1482, 105th Cong. 1st Sess. (1997) (introduced by Sen.
Coats) ("A bill to amend section 22 of the Communications Act of 1934 to
establish a prohibition on commercial distribution on the World Wide Web
of material that is harmful to minors"); S. 1619, 105th Cong. 2d Sess.
(1998) (introduced by Sen. McCain) ("A bill to direct the Federal Communications
Commission to study systems for filtering or blocking matter on the Internet,
to require the installation of such a system on computers in schools and
libraries with Internet access," known as the "Internet School Filtering
Act"); H.R. 3177, 105th Cong. 2d Sess. (1998) (introduced by Rep. Franks)
("Safe Schools Internet Act of 1998," House version of S. 1619).
This paper refers to S. 1482 as "the Coats bill" and to Sen. McCain's S. 1619
and Rep. Franks' H.R. 3177 together as the "Safe Schools Internet Act."
4. See, e.g., American Library Association, Statement of the
American Library Association to the Senate Commerce, Science and Transportation
Committee on Indecency on the Internet for the Hearing Record, Feb. 10,
1998 <http://www.ala.org/washoff/mccain.html>; Blue Ribbon Campaign,
The Struggle Isn't Over Yet, (visited Mar. 3, 1998) <http://www.eff.org/blueribbon.html>.
5. See, e.g., Rust v. Sullivan, 500 U.S. 173 (1991); Regan v.
Taxation Without Representation of Washington, 461 U.S. 540 (1983).
6. See, e.g., Keyishian v. Bd. of Regents, 385 U.S. 589 (1967)
(striking down as unconstitutionally vague a New York statute requiring
termination of professors of public institutions of higher learning for
"treasonable or seditious" comments or actions).
7. Keyishian, 385 U.S. at 603, 605-606.
8. Id. (federal funding in Keyishian took the form of public
employment).
9. Title V of the Telecommunications Act of 1996, Pub.L. 104-104,
110 Stat. 56.
10. Supra note 1.
11. 47 U.S.C.A. § 223(a) (Supp. 1997).
12. 47 U.S.C.A. § 223(d) (Supp. 1997).
13. 47 U.S.C.A. § 223(e)(5)(A).
14. 47 U.S.C.A. § 223(e)(5)(B).
15. Reno, 117 S.Ct. 2329.
16. Id. Plaintiffs that filed the first suit included the
American Civil Liberties Union, the Electronic Privacy Information Center,
the Journalism Education Association, and AIDS Education Global Information
System. Plaintiffs that filed the second suit included the American
Library Association, America Online, American Society of Newspaper Editors,
and CompuServe Incorporated.
17. Id.
18. Id. at 2342.
19. Id. at 2343.
20. Reno, 117 S.Ct. at 2343 (citing Red Lion Broad. Co. v. FCC,
395 U.S. 367 (1969); FCC v. Pacifica Found., 438 U.S. 726 (1978)).
21. Reno, 117 S.Ct. at 2343 (citing Turner Broad. Sys., Inc. v. FCC, 512
U.S. 622 (1994)).
22. Reno, 117 S.Ct. at 2343 (citing Sable Communications of Cal., Inc. v.
FCC, 492 U.S. 115 (1989)).
23. Reno, 117 S.Ct. at 2343-44.
24. Id. at 2343; John Perry Barlow, Declaration of Independence
of Cyberspace (visited Mar. 5, 1998) <http://www.eff.org> (declaring
to the "weary giants of flesh and steel" -- industrial world governments
-- that Cyberspace is the "new home of the Mind" and governments have no
sovereignty over Netizens, so the "global social space we are building
[is] naturally independent of the tyrannies you seek to impose on us ...
[y]ou have no moral right to rule us nor do you possess any methods of
enforcement we have true reason to fear...").
25. Reno, 117 S.Ct. at 2344 (citing Juris. Statement 3 (citing
929 F.Supp., at 831 (finding 3))).
26. Id. at 2344.
27. Id.
28. Id.
29. Id.
30. Reno, 117 S.Ct. at 2344.
31. Id. at 2343.
32. See Reno.
33. Id.
34. Id. at 2344.
35. Id. at 2346.
36. Id. The authorities already enforce laws against disseminating
obscenity and child pornography, the two forms of expression relevant hereto
that are clearly constitutionally suppressible. See 18 U.S.C. §§ 1464-1465,
2251.
37. The Coats bill focuses on distribution, while the Safe Schools
Internet Act focuses on receiving. See supra note 3.
38. S. 1482, § 1(e)(1).
39. S. 1619, H.R. 3177, § 1(a).
40. Miller v. California, 413 U.S. 15 (1973) (holding the definition
of obscenity to be: "1) the average person, applying contemporary community
standards" would find that "the work, taken as a whole, appeals to the
prurient interest" and 2) the work "depicts or describes, in a patently
offensive way, sexual conduct specifically defined by the applicable state
law" and 3) the work, taken as a whole, "lacks serious literary, artistic,
political, or scientific value").
41. Reno, 117 S.Ct. at 2344 ("Given the absence of a definition
for either term ("indecent" or "patently offensive display"), this difference
in language will provoke uncertainty among speakers...").
42. S. 1482, § 1(e)(7).
43. S. 1482, § 1(b).
44. S. 1482, § 1(a)(7) (defining "harmful" matter as that which
"(i) taken as a whole and with respect to minors, appeals to a prurient
interest in nudity, sex, or excretion; (ii) depicts, describes, or represents,
in a patently offensive way with respect to what is suitable for minors
... [sexual matter] and (iii) lacks serious literary, artistic, political,
or scientific value").
45. Miller, 413 U.S. 15. See supra note 40.
46. Id.
47. Lexis Bill Tracking Report, 105th Cong. 1st Sess. (1997).
48. Lexis Bill Tracking Report, 105th Cong. 2d Sess. (1998) (S.
1619 cosponsored by Sens. Hollings, Coats, and Murray).
49. S. 1619, § 1(a)(4).
50. Id.
51. Miller, 413 U.S. 15.
52. Id.
53. S. 1619, § 1(a)(4)(A)-(C).
54. See, e.g., Regan, 461 U.S. 540; Buckley v. Valeo, 424 U.S.
1 (1976); Rust, 500 U.S. 173.
55. Regan, 461 U.S. at 549. Strict scrutiny, applied to
protected speech such as in schools and libraries and over the Internet,
requires a regulation to be necessary to serve a compelling state interest
and be narrowly drawn to achieve that end. Widmar v. Vincent, 454
U.S. 263 (1981).
56. Shelton v. Tucker, 364 U.S. 479, 488 (1960); Keyishian, 385 U.S.
589; Bd. of Educ. v. Pico, 457 U.S. 853 (1982).
57. Keyishian, 385 U.S. 589.
58. Pico, 457 U.S. 853.
59. S. 1619, H.R. 3177, supra note 3.
60. Regan, 461 U.S. 540.
61. Rust, 500 U.S. 173.
62. Id.
63. Id.
64. Id.
65. Center for Democracy and Technology, How Filtering Tools
Enable Responsible Parents to Protect Their Children Online, July 16, 1997,
<http://www.cdt.org>.
66. S. 1619, H.R. 3177, supra note 3.
67. Reno, 117 S.Ct. at 2347.
68. Blue Ribbon Campaign, The Struggle Isn't Over Yet,
(visited Feb. 26, 1998) <http:// eff.org/blueribbon.html>; Electronic
Privacy Information Center, Faulty Filters: How Content Filters Block Access
to Kid-Friendly Information on the Internet, Dec. 1997, <http://www.epic.org>.
69. See CDT, Filtering Tools, supra note 65.
70. Id. at 2-3.
71. EPIC, Faulty Filters.
72. Id.
73. Id.
74. Id.
75. See CDT, Filtering Tools ("All major national Internet services offer
filtering ... ISPs serving eighty-five percent of all Internet users offer
at least one form of filtering software"; "[o]ver ten different filtering
software products [exist], reflecting a diversity of values"; "[t]hree
PICS-based labeling services have rated over 300,000 sites around the
world"; and "[t]hirty percent of web browsers have built-in filtering
capability using PICS ... available at no cost.").
76. David J. Weitzner, Center for Democracy and Technology, Blocking
and Filtering Content on the Internet after the CDA: Empowering Users and
Families Without Chilling the Free Flow of Information Online, Oct. 15,
1997, <http://www.cdt.org/speech/rating_issues.html>.
77. Id.
78. Id.
79. Id.
80. Id.
81. Weitzner at 8.
82. Id.
83. Id.