The Communications Decency Act has been widely seen as the nadir of Congressional
regulation of communications technology. Badly drafted, inconsistently
worded(25) and palpably unconstitutional,
it appeared to most of the Internet community to be a case of technological
ignorance run rampant. Here was a Congress regulating what it did not understand,
and doing so in a way that would be practically futile because of the amount
of content that came from beyond the jurisdiction of the United States.
The reactions ranged from condescending amusement at Congress's lack of technological knowledge to proprietary anger that the law was overtly asserting
its power over the electronic frontier. "Keep your laws off our Net"
went the slogan.
When the CDA was struck down by two different three-judge panels(26) and then by a unanimous Supreme Court,(27)
the decisions were seen as an inevitable vindication of these libertarian
views. That the lower court opinions pointed to the constitutional problems the CDA faced because it could not reach much of the content on the Net merely sweetened the victory. Federal judges had
come a long way towards recognising both the technological resistance of
the Net to censorship and the fact that a global net could never
be effectively regulated by a single national jurisdiction.(28)
Two of the three parts of the Internet trinity had been acknowledged in
the Federal Reporters. What's more, they had actually been plugged into the framework of conventional First Amendment analysis. Given that the CDA was likely to be ineffective, could we possibly say that it passed strict First Amendment scrutiny?(29)
Wasn't this a case of substantially restricting "the freedom of speech"
without effectively achieving the compelling state interest?
Seen through the lens provided by the jurisprudence of digital libertarianism,
these reactions were entirely warranted. A command backed by threats uttered
by a sovereign and directed towards a geographically defined population
had met and been annihilated by a right held by citizens against intrusion
by state power, in part because of the sovereign's inability to regulate
those outside its borders. The Communications Decency Act vanishes as if
it had never been -- an utter failure. Yet this analysis misses the most important developments surrounding the CDA: not the public criminal sanction, but the shaping and development of privately deployed, materially based, technological methods of surveillance and censorship.
The Communications Decency Act aimed to protect minors from indecent
material; however, if it did so by substantially limiting the speech of adults, it would be held unconstitutional as overbroad, "burning down the house to roast the pig" in the words of Justice Frankfurter.(30)
The CDA's answer to this problem was to create safe harbours for indecent
but constitutionally protected speech aimed at adults, provided that speech
was kept from the eyes of minors.(31) The
Act offered a number of methods to achieve this goal, such as "requiring
[the] use of a verified credit card, debit account, adult access code,
or adult personal identification number."(32)
Given the technology and economics of the Net, however, the most important
safe harbour for non-profit organisations was clearly going to be that
provided by §223(e)(5)(A), offering immunity to those who had used
"any method which is feasible under available technology."(33)
It is here that the irony begins. When the Communications Decency Act
was first proposed, a number of computer scientists and software engineers
decided that they would do something more than merely rail against its unconstitutionality. They were convinced that the perceived need for regulation could be met within the language of the Net itself.(34)
I am not using the "language of the Net" as part of some deconstructive or Saussurean trope; the idea was literally to provide a filtering system whose markers would be built into the language that makes the World Wide Web possible, HyperText Markup Language or HTML. Conceiving of technical solutions as intrinsically more desirable than the exercise of state power by a sovereign, as facilitators of private choice rather than threats of public sanction, they offered an alternative designed to show that the Communications Decency Act was, above all, unnecessary. It is called the Platform for Internet Content Selection, or 'PICS', and it allows tags rating a web page to be embedded within "meta-file" information provided by the page about itself.(35) The system can be adapted to provide both first party and third party content labelling and rating.(36) (A simplified sketch of such a rating check appears below.) The system is touted as
"value-neutral" because it could be used to promote any
value-system. Sites could be rated for violence, for sexism, for adherence
to some particular religious belief, for any set of criteria that was thought
worthwhile. The third party filtering site could be the Christian Coalition,
the National Organization for Women or the Society for Protecting the Manifest
Truths of Zoroastrianism. Of course, in practice, we might believe that the PICS technology would be disproportionately used to favour a particular
set of ideas and values and exclude others, just as we might believe that
in practice a Lochner regime of "free contract" would
actually favour some groups and hurt others, despite the fact that each
is -- on its face -- value neutral. But this kind of legal realist insistence
on looking at actual effects and scrutinising actual, rather than formal, power is much less a part of our First Amendment discourse than of our private law discourse, as Owen Fiss, Jack Balkin and Richard Delgado have
each pointed out, though in very different contexts.(37)
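To make the mechanism concrete: a PICS label is simply a machine-readable set of ratings, supplied either by the page itself (first party labelling) or by a separate labelling bureau (third party labelling), which filtering software can compare against somebody's chosen thresholds. The sketch below is a minimal illustration in Python rather than the actual PICS-1.1 label grammar; the rating service URL, the category names and the numeric scores are all hypothetical.

    # A deliberately simplified sketch of PICS-style filtering, not the real
    # PICS-1.1 label grammar. The rating service URL, category names and
    # scores are hypothetical, chosen only to illustrate how a first party
    # or third party label can be checked against a user's settings.

    page_label = {
        "service": "http://ratings.example.org/v1",   # hypothetical labelling bureau
        "ratings": {"violence": 1, "nudity": 0, "language": 2},
    }

    user_profile = {
        "service": "http://ratings.example.org/v1",
        "maximums": {"violence": 2, "nudity": 0, "language": 1},
    }

    def blocked(label, profile):
        """Return True if any rated category exceeds the profile's maximum."""
        if label["service"] != profile["service"]:
            return True  # not rated by a service the profile trusts: fail closed
        return any(score > profile["maximums"].get(category, 0)
                   for category, score in label["ratings"].items())

    print(blocked(page_label, user_profile))  # True: the "language" score of 2 exceeds 1

Nothing in that comparison cares who set the thresholds; a parent's browser, a library terminal or a national proxy server can all run the same check, a point that matters for the discussion of PICS-filtered firewalls below.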
While PICS and a variety of other systems offered a technical solution
at the "speaker" end of the connection, other software programs
also offered technical solutions at the listener end. These programs would
not offer speakers a safe harbour from the reach of the Act. Rather they
would "empower" computer users to protect their families from
unwanted content through the use of software filters, thus raising in civil libertarians' hearts the hope that the whole Act was unnecessary. Programs such as SurfWatch, CyberPatrol, NetNanny and CyberSitter would block access to unsuitable material and do so without the need for constant parental intervention.(38) Typically, these programs maintained a list of forbidden sites as well as a text-search filter that would refuse to load documents containing forbidden strings of words; a sketch of this approach appears below.
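The basic logic of such listener-side filters is easy to state, and its crudeness matters for the discussion that follows. The sketch below is a minimal illustration rather than a reconstruction of any particular product; the blocked hosts and banned words are invented, and the commercial packages shipped far larger lists, encrypted so that users could not inspect them.

    from urllib.parse import urlparse

    # A minimal sketch of a listener-side filter: a blocklist of sites plus a
    # crude text search. The entries below are invented for illustration only.
    BLOCKED_HOSTS = {"badsite.example.com"}
    BANNED_WORDS = {"sex", "xxx"}

    def allow(url, page_text):
        """Return True if the page may be shown, False if it should be blocked."""
        host = urlparse(url).hostname or ""
        # Site-level blocking: refuse anything on, or under, a listed host.
        if any(host == h or host.endswith("." + h) for h in BLOCKED_HOSTS):
            return False
        # Text-level blocking: refuse documents containing a forbidden string.
        lowered = page_text.lower()
        return not any(word in lowered for word in BANNED_WORDS)

    # The crudeness is the point: a bare string match on "sex" also catches a
    # page about Essex or Middlesex, and the user never learns why it vanished.
    print(allow("http://badsite.example.com/page", "hello"))            # False
    print(allow("http://news.example.org/essex", "Essex county news"))  # False
    print(allow("http://news.example.org/", "weather report"))          # True

Both features of this sketch, the hidden list of sites and the blunt string match, reappear in the journalists' findings discussed below.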
The irony that I mentioned is that these technical solutions were used
by both sides in the dispute over the CDA. Those challenging the CDA argued
that the availability of privately implemented technological fixes meant
that the CDA failed First Amendment scrutiny: clearly it was not the least
restrictive means available to achieve the objective. "Listener-centered"
blocking software would allow parents to control what their children saw
while "Speaker-centered," or third party, rating systems such
as PICS would offer a private solution to the problem of rating the content
available on the Net.
The government took the opposite position, arguing that the availability
of systems such as PICS meant that the CDA was not overbroad. Adult speakers
would not be burdened by the law because such systems provided adequate
methods for adult speakers to segregate their indecent but protected speech
from the eyes of minors. Thus, in their eyes, the PICS scheme, developed
to destroy the CDA, actually saved it.(39)
The Supreme Court ultimately disagreed, though Justice O'Connor left open
the possibility that future technical developments might change that conclusion.(40)
Before the decision was even handed down, President Clinton was already signalling his political preference for a technical solution to the question
of regulating speech on-line, talking vaguely of a "V-chip for the
Net."(41) Bills have already been
advanced in Congress which would require Internet Service Providers to
provide filtering software to customers and aim at the development of an
"E-chip."(42)
So where does on-line speech stand after the Supreme Court's decision
in Reno v. ACLU? From the perspective of the digital libertarian, the Net
remains unregulated and the Internet trinity is undisturbed. From the perspective
I have been developing here, things seem much more mixed. As the CDA was
being constitutionally voided, the technological "solutions"
were proceeding apace, some because of the CDA, some in spite of it. In contrast to the extensive attention given to the CDA, much of this process
was effectively insulated from scrutiny because of the assumptions about
law and state I have been exploring here.
PICS is a wonderful tool for content selection, and if one assumes a world
very much like the idealised version of the marketplace of ideas, in many
ways an unthreatening and beneficial one. Yet its technological goal --
to facilitate third as well as first party rating and blocking of content
-- helps to weaken the Net's supposed resistance to censorship at the same
moment that it helps provide a filter for user-based selection. If national
networks can be more easily run through a kind of PICS-filtered firewall,
what happens to the notion that the Internet tap can only be turned
to "off" or "full"? One wonders how China or Singapore
or Iran would choose to employ this "value-neutral" system. The
technological component of the Internet faith does not fall, but it is weakened.
The state may not be able to deploy Austinian commands backed by threats over the Net, but the technology provided by PICS gives it a different arsenal
of methods to regulate content materially rather than juridically, by everyday
softwired routing practices, rather than by threats of eventual sanction.
As for the listener-based software filters, they present even more problems. Journalists studying these programs found that their lists of blocked sites were problematic and -- most importantly -- were actually hidden from the users.
A close look at the actual range of sites blocked by these apps shows
they go far beyond just restricting "pornography." Indeed, some
programs ban access to newsgroups discussing gay and lesbian issues or
topics such as feminism. Entire domains are restricted, such as
HotWired. Even a web site dedicated to the safe use of fireworks is blocked.
All this might be reasonable, in a twisted sort of way, if parents were
actually aware of what the programs banned. But here's the rub: Each company
holds its database of blocked sites in the highest security. Companies
fight for market share based on how well they upgrade and maintain that
blocking database. All encrypt that list to protect it from prying eyes.(43)
The programs turned out to ban sites ranging from the National Rifle
Association to the National Organization for Women and to do so in a way
that is often undetectable by their purchasers. Nevertheless, enthusiasm for these programs continues unabated. President Clinton promises that the government is working on an Internet V-Chip,(44)
Boston city libraries are installing them on computers accessible to children(45)
and Texas is considering mandating that Internet access companies make
copies of these programs available to all their new customers.(46)
Representative Markey introduced a Bill into this session of Congress which
would require both the creation of an "E-chip" and the provision
of free or "at cost" blocking software.(47)
In constitutional terms this raises interesting questions of state action.
One of the attractions of the technical solution is often that it allows
the state to enlist private parties to accomplish that which it is forbidden
to accomplish directly. But this state action problem is merely the constitutional
incarnation of the political limitations of the jurisprudence of digital
libertarianism -- its sole focus on state power, narrowly defined, and its blindness towards the technical and economic shaping, rather than the legal sanctioning, of the communications environment.
I do not want to overstate the effect of the mindset that I am describing.
Not everyone in the digital world thinks this way. Libertarians, too, have been worried by the dangers posed by technologically invisible filtering of communication -- indeed, one of the most interesting things about Internet politics is that it has forced libertarians to confront some of the tensions in their own ideas.(48) Finally, other commentators have made the points I make here, though they have also lamented the blindness imposed by an entirely libertarian focus.(49)
Nevertheless, the result of the Supreme Court's decision in Reno v. ACLU
will simply be to sharpen the turn to the kinds of filtering devices described here, and it is unlikely that this will leave the Net as free, or the state as powerless, as the digerati seem to believe.
ENDNOTES TO SECTION III
25. Compare 47 U.S.C. §223(a)(1)(A)(ii) "obscene, lewd, lascivious, filthy, or indecent" with §223(a)(1)(B)(ii) "obscene or indecent" and §223(d)(1)(B) "in terms patently offensive as measured by contemporary community standards." None of these terms are defined and it is not clear that they are intended to be distinct from each other. The Telecommunications Act of 1996, Pub. L. No. 104-104, tit. V, §§ 501-61, 110 Stat. 56 (1996). With some reservations, the lower courts treated both phrases as equivalent to "indecency" as defined in Pacifica (FCC v. Pacifica Foundation, 438 U.S. 726 (1978)). The Supreme Court was less willing to wave away the statute's internal inconsistencies. "Regardless of whether the CDA is so vague that it violates the Fifth Amendment, the many ambiguities concerning the scope of its coverage render it problematic for purposes of the First Amendment. For instance, each of the two parts of the CDA uses a different linguistic form. The first uses the word "indecent," 47 U. S. C. A. §223(a) (Supp. 1997), while the second speaks of material that "in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs," §223(d). Given the absence of a definition of either term, this difference in language will provoke uncertainty among speakers about how the two standards relate to each other and just what they mean." Reno v. ACLU, No. 96-511, WL 348012 (U.S. June 26, 1997). Perhaps in desperation, the government's strategy in the case was to argue that the Act was intended to regulate only "commercial pornography," a phrase that appears nowhere within it. This argument was rejected both in the three-judge panel below, ACLU v. Reno, 929 F. Supp. 824, 854-55 (E.D. Pa. 1996), and in the Supreme Court, Reno v. ACLU at __.
26. See ACLU, 929 F. Supp. 824 (E.D. Pa. 1996). In striking down the CDA, the District Court held that "[j]ust as the strength of the Internet is chaos, so the strength of our liberty depends upon the chaos and cacophony of the unfettered speech the First Amendment protects. For these reasons, I without hesitation hold that the CDA is unconstitutional on its face." Id. at 883 (Dalzell, J., concurring).
27. Reno v. ACLU, No. 96-511, WL 348012 (U.S. June 26, 1997).
28. ACLU v. Reno, 929 F. Supp. 824, 832 (E.D. Pa. 1996) (discussing findings of fact): "There is no centralized storage location, control point, or communications channel for the Internet, and it would not be technically feasible for a single entity to control all of the information conveyed on the Internet."
But cf. Chief Justice Rehnquist's question during oral arguments (visited Jun. 24, 1997) <http://www.aclu.org/issues/cyber/trial/sctran.html>: "But if 70 percent [of indecent speech on the Internet] is shielded and 30 percent isn't, what kind of an argument is that against the constitutionality of the statute?"
29. Charles Nesson & David Marglin, The Day the Internet Met the First Amendment: Time and the Communications Decency Act, 10 Harv. J.L. & Tech. 113, 115 (Fall 1996).
30. Butler v. Michigan, 352 U.S. 380, 383 (1957), quoted in Sable Communications v. FCC, 492 U.S. 115, 127 (1989).
31. The Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56 (1996) (to be codified at 47 U.S.C. §223(e)(5)(A)).
32. The Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56 (1996) (to be codified at 47 U.S.C. §223(e)(5)(B)).
33. The Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56 (1996) (to be codified at 47 U.S.C. §223(e)(5)(A)).
(5) It is a defense to a prosecution under subsection (a)(1)(B)
or (d) of this section, or under subsection (a)(2) of this section with
respect to the use of a facility for an activity under subsection (a)(1)(B)
of this section that a person--
(A) has taken, in good faith, reasonable, effective, and appropriate actions under the circumstances to restrict or prevent access by minors to a communication specified in such subsections, which may involve any appropriate measures to restrict minors from such communications, including any method which is feasible under available technology;
34. See Paul Resnick and Jim Miller, The CDA's Silver Lining, Wired (1996) vol. 4(8) at 109.
35. See generally Albert Vezza, Platform for Internet Content Selection: What Does It Do? (visited Jun. 24, 1997) <http://www.w3.org/PICS/951030/AV/StartHere.html>. For a critique of PICS, see Lessig, Cyber Rights Now: Tyranny in the Infrastructure, supra note 24.
36. First party rating is rating provided by the person posting the information. Third party rating is rating provided by some other entity. See World Wide Web Consortium, PICS Statement of Principles (visited Jun. 24, 1997) <http://www.w3.org/PICS/principles.html>.
37. See Owen M. Fiss, Free Speech and Social Structure, 71 Iowa L. Rev. 1405, 1424-25 (1986) ("Today abolition of the fairness doctrine can be passed off as just one more instance of 'deregulation.' It seems to me, however, that there is much to regret in this stance of the Court and the [First Amendment] Tradition upon which it rests. The received Tradition presupposes a world that no longer exists and that is beyond our capacity to recall--a world in which the principal political forum is the street corner."); Owen M. Fiss, Liberalism Divided (Westview Press 1996); J.M. Balkin, Some Realism About Pluralism: Legal Realist Approaches to the First Amendment, 1990 Duke L.J. 375 (1990) ("In assessing what constitutes substantial overbreadth or vagueness, I do not think it inappropriate to employ common sense judgments about the way the world works. Although the distinction between public power and private power is significant, even more significant for me are what power relations (public or private) exist in the standard case in which the statute operates."); Richard Delgado, First Amendment Formalism Is Giving Way to First Amendment Legal Realism, 29 Harv. C.R.-C.L. L. Rev. 169 (Winter 1994) ("The transition to the new [legal realist] paradigm is, however, far from complete."). But cf. Steven G. Gey, The Case Against Postmodern Censorship Theory, 145 U. Pa. L. Rev. 193, 195-97 (Dec. 1996) ("The theoretical advances celebrated by Delgado and other progressive critics of the First Amendment are not really advances at all. They are simply refurbished versions of arguments used since the beginning of modern First Amendment jurisprudence to justify government authority to control the speech (and thought) of citizens. ... Moreover, despite the different objectives of the new censors, their reasons for supporting government control over speech are not significantly different from those of their reactionary predecessors. ... The postmodern censorship theory offered by this new generation of politically progressive legal scholars is neither progressive nor, for that matter, even "postmodern." In the end, it is just censorship.").
38. See generally Kathryn Munro, Filtering Utilities, PC Magazine, Vol. 16, No. 7 (Apr. 8, 1997) at 235 (describing and reviewing various filtering software products).
39. For a fuller version of this argument, see James Boyle et al., Before the Supreme Un-Court of the United States (visited Jun. 24, 1997)
<http://www.wcl.american.edu/pub/faculty/boyle/unreno.htm> (Justice Un-Scalia, dissenting)
40. "Despite this progress, the transformation of cyberspace is not complete. Although gateway technology has been available on the World Wide Web for some time now, id., at 845; Shea v. Reno, 930 F. Supp. 916, 933-934 (SDNY 1996), it is not available to all Web speakers, 929 F. Supp., at 845-846, and is just now becoming technologically feasible for chat rooms and USENETnewsgroups, Brief for Federal Parties 37-38. Gateway technology is not ubiquitous in cyberspace, and because without it "there is no means of age verification," cyberspace still remains largely unzoned--and unzoneable. 929 F. Supp., at 846; Shea, supra, at 934. User based zoning is also in its infancy. For it to be effective, (i) an agreed upon code (or "tag") would have to exist; (ii) screening software or browsers with screening capabilities would have to be able to recognize the "tag"; and (iii) those programs would have to be widely available--and widely used--by Internet users. At present, none of these conditions is true. Screening software "is not in wide use today" and "only a handful of browsers have screening capabilities." Shea, supra, at 945-946. There is, moreover, no agreed upon "tag" for those programs to recognize. 929 F. Supp., at 848; Shea, supra, at 945." Reno v. ACLU, No. 96-511, WL 348012 at 24 (U.S. June 26, 1997) (O'Connor, J., concurring in part and dissenting in part).
41. Remarks by President Clinton at Town Hall meeting in Bridgeport, W. Va. (May 22, 1997) "[I]t may be that what we have to do is to try to develop something like the equivalent of what we are developing for you for television, like the V-chip ... It's technically more difficult with the Internet. ... But I think that is the answer; something like the V-chip for televisions, and we are working on it."
42. See, e.g., Ed Markey, Empowerment Act (Fed. Doc. Clearing House 1997) (press release June 19, 1997).
43. Declan McCullagh and Brock Meeks, Keys to the Kingdom (visited Jun. 24, 1997) <http://www.eff.org/pub/Publications/Declan_McCullagh/cwd.keys.to.the.kingdom.0796.article>
45. Geeta Anand, Library OK's limits on 'Net access; Compromise calls for filter software only on computers used by children, Boston Globe, Mar. 22, 1997, at A1.
46. Marc Ferranti, Site-filtering issue goes to state level, InfoWorld, Apr. 21, 1997, at 60.
47. Ed Markey, Empowerment Act (Fed. Doc. Clearing House 1997) (press release June 19, 1997).
48. With a cavalier disregard for the problems that this raises for my thesis, some of the best investigative reporting on, and discussion of, the politics of private technological censorship has been done by the cyber journalist Declan McCullagh and his "Fight Censorship" discussion list. (See Meeks & McCullagh, Keys to the Kingdom, supra note __.) In one sense, this raises the issue that I discussed earlier -- the politics of the Net are up for grabs and the conventional categories of political ideology and theory are much more mutable there.
49. "Although many people were surprised at [the revelations in the McCullagh and Meeks article], it was in fact completely predictable from a historical perspective. Too much discussion of the future of unfettered electronic communications takes place in a social vacuum, from an extremely simplistic viewpoint (I refer to this the "net.libertarian" mindset). Because of a perspective that might be rendered "government action bad, private action good" there's great unwillingness to think about complicated social systems, of private parties acting as as agents of censorship." Seth Finkelstein, Internet Blocking Programs and Privatized Censorship, The Ethical Spectacle, August 1996 <http://www.spectacle.org/896/finkel.html>