This paper was presented at the Association for Education in Journalism and Mass Communication conference in San Francisco, August 2006.
Evaluating Cross-Border Internet Hate Speech Regulation: A Normative Framework
School of Journalism and Mass Communication
University of Minnesota
Tel: 612 309 7679
Address: 4412 Washburn Ave S.
This paper tries to develop a normative framework for assessing
regulations of Internet content originating from another country.
This framework is rooted in a representational concept of sovereignty, the end-to-end principle and the more practical principle of effectiveness. The framework was designed to assess regulatory attempts of European countries to limit hate speech originating from the United States, but it can also be applied to other instances of cross-border Internet content regulation.
Evaluating Cross-Border Internet Hate Speech Regulation: A Normative Framework
It is a well-documented fact that American First Amendment law is more tolerant than European law when it comes to regulating hate speech.1 Unless hate speech amounts to threats, incitement to illegal conduct or fighting words, it is generally protected under the First Amendment. Most European countries, however, like most other democracies, have much more stringent hate speech laws. The European
Court of Human Rights has upheld convictions of people denying the
Holocaust2 or distributing leaflets promoting racist ideas.3
According to Article 10 of the Convention for the Protection of Human
Rights and Fundamental Freedoms4 and case law of the European Court
of Human Rights (ECHR), the right to free speech does not cover
speech that is threatening, that denies or destroys human dignity and integrity, or that directly incites harm or advocates violent behavior against other human beings.
At the European level, as well as at the country level, attempts have
been made to limit hate speech online. In 2001, the Council of Europe
drafted "The Additional Protocol to the Convention of Cybercrime
Concerning the Criminalisation of Acts of a Racist and Xenophobic
Nature Committed through Computer Systems" (the Protocol). It
requires that signatories criminalize "distributing, or otherwise
making available, racist and xenophobic5 material to the public
through a computer system."6 The United States declined to sign on to
the Protocol (which was opened for signature to non-member countries
as well) because it is inconsistent with American Constitutional
guarantees.7 This is but one example of how the Internet has exposed
the different approaches that exist towards regulation of hate speech
in the United States and Europe. This paper will not discuss this
difference, but refers the reader to the numerous scholarly articles
describing these different approaches.8
As a consequence of these different regimes, European courts,
anti-racism organizations and lawmakers are struggling with the
problem of how to regulate hate speech that reaches Europe via the
Internet but originates in the United States, where it is
constitutionally protected. One result of aggressive prosecution of online hate speech, combined with the failure of these prosecutions to reach American conduct, is that many European Web masters and providers of speech that is banned in their home countries host their sites in the United States.9 As Europe cracks down on Internet hate speech, this trend will most likely only grow stronger. Reliable figures for the extent of this phenomenon are hard to come by, but the United States is often referred to as the country from which most online hate emanates. A former French minister of justice said that 2,500 of the 4,000 racist sites counted worldwide were hosted
in the United States.10 European countries trying to block the
availability of online hate speech could do so in various ways: trying to regulate foreign content providers and ISPs, imposing filtering technology, issuing blocking orders against local ISPs, regulating search engines, or trying to convince American ISPs to adopt acceptable use policies that would ban hate speech, to name a few.
This paper does not try to propose a solution to this problem, but attempts to design a set of criteria by which attempts to regulate speech originating from outside a country's borders should abide. The
normative criteria will be developed on the basis of more general
principles that should guide attempts to regulate hate speech
originating from across borders. The normative model outlined below
allows critical evaluation of proposed solutions to the issue of
trans border content regulation and can be applied not only to hate
speech but also to other types of speech. The normative model is
steered by three fundamental principles: respect for the open, layered structure of the Internet; a representational concept of sovereignty; and effectiveness.
2. Maintaining the layered structure of the Internet
2.1. Regulating the Internet: normative and practical considerations
Before proposing a normative framework to evaluate content
regulation, it is important to ask whether or not the Internet can
and should be regulated. The answer to this question depends on how
the Internet is conceived. If one sees the Internet as a "machine,"
as the French Judge Gomez did in the Yahoo! case, there is little
reason to assume that the Internet should not be subjected to the
same kinds of regulations as any other "machine." If, on the other
hand, one sees the Internet not as a tool but as a separate, sovereign place (a "cyberspace" with its own norms and values), a distinct regulatory approach for the Internet is more likely to be advocated. Claims of "cyber independence" were particularly strong up to the early-to-mid nineties, when the Internet was used or "inhabited" by a relatively small community that had specific values
and norms. For example, quite contrary to the Internet as we know it
now, commercial activity was not tolerated.11 In 1994, "the Internet
was viewed not as a space for commercial advertising but as a place
for community, sharing, and public discourse."12
Out of this maverick ethos emerged a strong feeling of autonomy or
"cybersovereignty." Fundamental to this ethos was the conviction that
cyberspace is a space separate and different from "real space," and
that the governments of the real world should respect cyberspace's
autonomy and not try to impose their rules on this new space:
"Governments of the Industrial World, you weary giants of flesh and
steel, I come from Cyberspace, the new home of Mind. On behalf of the
future, I ask you of the past to leave us alone. You are not welcome
among us. You have no sovereignty where we gather."13 These
"cyber-independents" believed in the liberating potential of this
alternative reality, and therefore thought that the real world should
not impose its laws upon this new "space" where different norms and
values apply. Cyberspace was conceived to be the new Western
frontier, a place of relative lawlessness, a lawlessness that does
not lead to chaos but that is conducive to prosperity and development.14
This cyberlibertarian argument is based upon both factual and
normative arguments that ought to be distinguished from each other.
The factual argument states that attempts to regulate the Internet
are futile because of certain characteristics of the Internet. Those
who make this argument see the Internet as a separate place that
cannot be intruded upon. They point out that because the identity and location of Internet users are often unknown, and because the network is decentralized and borderless, traditional local laws cannot be applied to the Internet.15 Because of the geographical indeterminacy of the network architecture, an IP address tells you nothing about the geographic location or identity of the user;16 in fact, the Internet was designed not to permit the flow of geographical information.17
Therefore, regulation critics have argued that local governments
should accept the fact that they are unable to control information
flow across their borders. This argument states that the Internet
poses problems of choice of law as well as enforcement of these laws
of such magnitude that Internet regulation through law is nearly impossible.
The normative argument, on the other hand, claims that governments
should not regulate the Internet because to do so would be
detrimental to its development and to the libertarian ideals upon
which it was built. The normative argument is rooted in the belief
that the Internet is a separate, autonomous place where local
governments have no authority. Internet regulation, those pioneers
believed, should emerge from within and be enforced by the Internet
community and not imposed from above. This could be done through, for
example, contractual relationships and user preferences.18 There have
indeed been a number of instances of collective action on the
Internet in the mid nineties that were successful in enforcing these
values against governmental and commercial players.19
The scholarly translation of this idea can be found in an important
1996 article by David Johnson and David Post20 in which they argued
that the Internet has challenged the ability and legitimacy of nation
states to regulate because of the disconnect between the Internet and
any specific jurisdiction. They maintained that "[t]he Net thus
radically subverts the system of rule-making based on borders between
physical spaces, at least with respect to the claim that Cyberspace
should naturally be governed by territorially defined rules."21 (Note
the use of a capital in the word "Cyberspace.") The challenge for
these scholars was to formulate what this cyberlaw should look like,
what the underpinnings should be for this new law that would apply in
this new sovereign jurisdiction. During numerous symposia held in
1996 and 1997, theorists formulated proposals for what kind of
distinct regulatory approach would be best suited to the Internet and
analyzed legal doctrines that could work in cyberspace. This
scholarship was unified by the belief that cyberspace is and should
be an autonomous place, regulated independently from the "sovereigns"
of the real world.22 At that time, the idea of cybersovereignty was
well established and a substantial amount of scholarship did not
question it, but was dedicated to establishing a regulatory model for
the Internet, independent of the laws of the "real world."
However, as the Internet became more popular and more commercial in
the second half of the nineties, this view would come under attack.
Jack Goldsmith criticized the factual argument of the "regulation
skeptics," arguing that there is no reason why local laws could not
apply to cyberspace.23 The Internet, according to Goldsmith, may
present some new legal challenges, but he disagreed with the notion
that transactions and interactions taking place on the Internet are
qualitatively different from transactions taking place offline, where transnational transactions also sometimes pose legal problems.
Goldsmith argued that in cyberspace, transactions between people from
different jurisdictions can take place easily, and that these kinds
of transactions may pose jurisdictional problems, but this
choice-of-law problem is not unique to Internet transactions and does
not mean that these kinds of transactions cannot be governed by
traditional law. At around the same time, scholars also began to chip
away at the normative argument supporting cyberindependence. They
argued that there is no reason to believe that the liberal democratic
ideals of individual liberty, popular sovereignty and consent of the
governed could be guaranteed by cyber governance any better than by traditional territorial governments.24 Still, even though the cyberlibertarians' values emerged in a time
when the Internet's uses were not as varied as they are today, the
open structure of the Internet they envisioned has advantages. Many
observers have noted that the open and free flow of information, with
few rules and no central authority, has been crucial to the
development of the Internet because it allows, among other things,
for low-cost innovation.25 On the Internet, innovators can design
new applications or create new content without having to worry about
their compatibility with the network, without having to obtain
permission or command large financial resources to release them. From a
technological perspective, the Internet is a far more egalitarian
medium than traditional mass media; it treats all users and messages
alike, regardless of the content of the messages or the identity of
the users. Once the information is broken down and transported, it
does not matter whether the data packets are part of the New York
Times Web site or a local blog about ice fishing. Naturally, at the
content level, not all information is egalitarian, and the New York
Times Web site is not equal in power, resources and popularity to an
amateur Web site. But as far as the medium is concerned, it takes few
resources to have one's content distributed. This egalitarian nature
has facilitated the emergence of, for example, blogs as important
voices in national and international debates and news stories. But
the low-cost-of-innovation argument applies mainly to technological
innovation. A 16-year-old whiz kid can develop a new application in
his bedroom and make it available to the Internet community by means
of a couple of mouse clicks.
This simplicity, ease of access and low cost of innovation have been
contributing factors to the Internet's development and have created an
environment conducive to innovation, both technological and cultural:
Not innovation in just the dotcom sense, but innovation in the ways
humans interact, innovation in the ways that culture is spread, and
most importantly, innovation in the ways in which culture gets built.
The innovation of the Internet -- built into its architecture -- is
an innovation in the ways in which culture gets made. Let the dotcom
era flame out. It won't matter to this innovation one bit. The
crucial feature of this new space is the low cost of digital
creation, and the low costs of delivering what gets created.26
But the Internet has become much more than a place to exchange ideas
or create culture. It is now a medium through which one can offer
goods for sale, shop, file taxes, register for classes, rent movies,
play fantasy football, book trips, etc. With this proliferation of functions, complexity may have to be built into the network to ensure its security and reliability, making some form of
government oversight necessary. However, a balanced approach is
needed, an approach which "would retain some measure of the original
network's simplicity and lack of structure, and aspire toward
alternatives that do not needlessly fragment on-line users and their
communities."27 One does not need to subscribe to the notion that
cyberspace is a distinct geographical location with distinct rules to
agree with the proposition that any kind of regulation should try to
respect and maintain the structure of the Internet that has made the
development of the medium possible. Solum and Chung, in an expansive law review article,28 develop this principle further,
proposing some specific guidelines for Internet regulation, relying
heavily on the work of Lawrence Lessig.
2.2. The code thesis
Solum and Chung rely on Lessig in developing a criterion to suggest
how this balance between openness and regulability can be struck and
how a set of principles can be developed that should guide regulation
of the Internet. In order to do this, they draw on Lessig's notion
that code is what operates and determines activity on the Internet.29
Lessig introduced his code concept in an influential 1999 book,30 in which he explains how computer code determines behavior on the Internet, capturing this idea in the catchphrase "code is law."31 Code, according to Lessig, determines the
parameters for behavior and action on the Internet. Because of the
particular way the Internet is designed, because of its architecture,
which is determined by code,32 certain things can be done with ease
while others are hard or impossible. The Internet's architecture is
analogous to the road system in a city. Just like the road system of
a city determines where you can go with your car, so does the
Internet architecture determine how you can move around in
cyberspace. Cities can decide to make their city center accessible
only for buses, to levy tolls to use certain lanes at certain times
or to require permits to drive in certain areas of town at certain
times. Cities may try to reward certain behaviors such as carpooling
or using public transportation and they may try to discourage or
punish others, such as speeding or driving during rush hour. Code,
according to Lessig, does the same thing for the Internet: It
determines how we can move around on the Internet.
The Internet was designed to allow a rapid flow of information,
uninhibited by physical borders and without central authority. This
"design" has consequences for how you can move around on the
Internet. Because of these architectural features, users in France or
Germany have access to Web sites hosted on American servers that may
be illegal in their own country. Currently, there are relatively few
opportunities for governments to regulate Internet traffic or speech
on the Internet. This has led Lessig to state that the architecture
of the Internet has exported a First Amendment in code "more extreme
than our own First Amendment in law."33 This does not mean that code
always enables the free flow of information. For example, it is also code that enables content providers to make material available only for viewing and not for downloading, or to allow access to certain content only to paying subscribers or to people willing to divulge personal information.
The French order in the Yahoo! case34 serves as a good illustration
of the tenuous relationship between architecture and regulation of
the Internet. In 2000, a French court ordered Yahoo! to use geolocation software to determine whether surfers accessing its auction site, on which Nazi memorabilia (the public display of which is forbidden in France) were offered for sale, were French, and to block their access if they were. Experts testified that
70% of the French users could be identified this way. Part of the
reason that only 70% of the users could be identified as French
through geolocation software was that French AOL subscribers' IP
addresses come up as being located in the United States. This is
because AOL uses the services of the UUNET network, a commercial
Internet service provider. As a result of this setup, dynamic IP
addresses (IP addresses assigned to a surfer for the length of his
connection) attributed by AOL appear as being localized in Virginia,
where UUNET is located.35
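A sketch of the kind of filtering the court had in mind might look like this (illustrative only: the address-to-country table below is invented, and real geolocation services rely on large, frequently updated databases of address ranges):

    import ipaddress

    # Invented IP-to-country table; real databases map millions of ranges.
    IP_TO_COUNTRY = {
        "80.12.0.0/16": "FR",    # hypothetical French provider range
        "205.188.0.0/16": "US",  # hypothetical AOL/UUNET range in Virginia
    }

    def country_of(ip: str) -> str:
        addr = ipaddress.ip_address(ip)
        for net, country in IP_TO_COUNTRY.items():
            if addr in ipaddress.ip_network(net):
                return country
        return "UNKNOWN"

    def may_view_auction(ip: str) -> bool:
        # A French AOL subscriber surfing from a Virginia-registered
        # address is misclassified as "US": the gap behind the 70% figure.
        return country_of(ip) != "FR"

    print(may_view_auction("80.12.3.4"))    # False: identified as French
    print(may_view_auction("205.188.1.2"))  # True: misclassified AOL user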
However, this does not necessarily have to be so. It is conceivable
that AOL would make changes in the way it assigns IP addresses or
would organize its network differently. Geolocation software could (and did) become more sophisticated and find other ways to identify French users. In other words, the cyberlibertarian claim
that location does not matter on the Internet is not one that
captured an inherent characteristic of the Internet,36 but one that
described the Internet architecture and code at a certain point in
time. A central idea in Lessig's work is that this architecture can
be changed, and that these changes may have consequences for the way
information and culture are distributed over the Internet. In this
respect, Lessig distances himself from the factual claims (though he
certainly shared their normative claims) of the cyberlibertarians
that stated that the essential nature of the Internet is that it is free.37
While the architecture of the Internet as it is now (or as it was at
the time of his book in 1999) makes it hard for governments to
regulate behavior on the Internet, it is not impossible for lawmakers
to regulate the architecture of the Internet.38 Lessig's attitude towards government regulation of the Internet is somewhat ambivalent.
He seems to have no problems with governments trying to impose their
laws on the online environment. He does not share the notion that the
Internet is a sovereign space the government has no business
regulating, but he warns against governmental attempts to regulate the architecture of the Internet because of the stifling consequences this could have for innovation and creativity.
When developing a normative framework for Internet regulation, Solum
and Chung focus on this open architecture that has been conducive to
innovation and development as a normative starting point. The fact
that the code of the Internet reflects American First Amendment
values is not by itself a sufficient argument for maintaining this
openness, since it is only convincing to those who subscribe to the
American First Amendment concept to begin with. But a normative theory that takes the open structure of the Internet and its importance to innovation as its starting point has a broader appeal.
Solum and Chung develop this argument further by borrowing another
concept from Lessig: the end-to-end principle.
2.3. The end-to-end principle and the layered structure of the Internet
The end-to-end principle39 means that on the Internet, intelligence
is kept at the ends (the applications) of the network, while the
network itself is kept simple and basic. In order to apply this
concept and arrive at their normative principle for Internet
regulation, Solum and Chung use the concept of a layered Internet.40
They argue that a key feature of the Internet is that it is comprised
of different "layers," each with its own function in the information
processing and transporting process that constitutes Internet
activity.41 The layer model is a spatial metaphor in which
information is passed on from the top layer to the lower layers. Data
contain a header with instructions for each layer, and when a layer
receives data from a different layer, it performs its function
according to the information found in the header, and passes it on to
the next layer.42
Their model43 describes the layered nature of the TCP/IP protocol,
the protocol that enables the connection of networks and constitutes
the code of the Internet.44 The first layer they identify45 is the
(1) content layer, referring to the content of a given Web site,
email or other Internet application; the specific signs that
constitute the message. The second layer is the (2) application
layer, the layer that handles the details of a certain Internet
application. Examples of application layer protocols are the HTTP
protocol, which enables the World Wide Web, the FTP protocol for file
transfer, the Domain Name System (DNS) for mapping domain names to IP addresses, and the SMTP protocol for email. The (3) transport layer (TCP)
provides the flow of data between two hosts. It is where the data
from the application layer are broken up into data packets and handed
to the network (IP layer). At the receiving end, data packets
received from the IP layer are assembled and delivered to the
application layer. The (4) IP layer (Internet Protocol), or the
network layer, handles the movement of data packets around the
network and the encoding of IP addresses (to figure out where data
need to be sent to). The (5) link layer is the layer handling the
physical linking of interfacing computers' hardware with the network
hardware. When new hardware is used for Internet communication, all that needs to be done is to find a way to hook it up to the Internet. Typically, this is accomplished by a device driver for the specific piece of hardware. This way, the upper layers are independent of the hardware, and the TCP/IP stack is not burdened by having to adapt to new hardware. The (6) physical layer is the last layer, the physical
infrastructure over which the actual transfer of bits takes place.
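To make this encapsulation concrete, consider the following sketch (in Python; the function names and readable string headers are hypothetical simplifications, as real protocols use binary headers):

    # A toy sketch of encapsulation through the layers described above.
    def application_layer(message: bytes) -> bytes:
        # Layer 2: e.g., wrap the content in an HTTP request header.
        return b"HTTP|" + message

    def transport_layer(data: bytes) -> bytes:
        # Layer 3: add TCP information (ports, sequence numbers).
        return b"TCP|" + data

    def ip_layer(segment: bytes) -> bytes:
        # Layer 4: add source and destination IP addresses.
        return b"IP|" + segment

    def link_layer(packet: bytes) -> bytes:
        # Layer 5: add hardware (e.g., Ethernet) addressing.
        return b"ETH|" + packet

    # The content (layer 1) is handed down layer by layer; each layer
    # only reads and writes its own header.
    wire = link_layer(ip_layer(transport_layer(application_layer(
        b"GET /index.html"))))
    print(wire)  # b'ETH|IP|TCP|HTTP|GET /index.html'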
Solum and Chung claim that this layer separation is fundamental to
the design of the Internet. The lower layers do not "know" what
happens in the layers above them; their functioning is not determined
by the upper layers. Solum and Chung clarify this point by explaining
the workings of the Internet in some more detail. Most Internet users
know the HTTP protocol as the application (layer 2) that enables the
World Wide Web, yet other applications are also commonly used. For
example, the Simple Mail Transfer Protocol (SMTP) is the application
that makes email possible and the File Transfer Protocol (FTP) is the
protocol that usually is used when one downloads a file from a server
on the Internet.
So when we state that the lower layers don't "know" what happens in
the layers above, or that intelligence is kept at the "ends" of the
network, this simply means that the lower layers cannot distinguish
between data packets that are part of an email, a Web page or an MP3
file that is being shared through a file sharing network. Once the
transport layer (3) has broken the data into several small data packets, the IP layer (4) cannot differentiate between PDF and MP3
files. Nor can the Internet ensure that all data packets arrive at a receiver's computer at the same time: the majority of the data packets making up a network communication may arrive almost immediately, while the remaining ones may take much longer because they are being rerouted.46 Once it is traveling over the
Internet, all information is equal.47
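This indifference can be illustrated with a minimal sketch (hypothetical names and a toy routing table):

    # Toy illustration: the forwarding decision reads only the IP header.
    routing_table = {"192.0.2.7": "interface-1"}  # hypothetical route

    def forward(packet: dict) -> str:
        # The payload is never inspected; only the destination matters.
        return routing_table[packet["dst_ip"]]

    # An email fragment, a Web page and an MP3 chunk are treated alike.
    for payload in (b"MIME e-mail...", b"<html>...", b"MP3 frames..."):
        packet = {"dst_ip": "192.0.2.7", "payload": payload}
        assert forward(packet) == "interface-1"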
2.4. Guidelines for regulators
Solum and Chung argue that regulation of the Internet should respect
the layered nature of the Internet, because it guarantees that cost
of innovation is kept low. Based on this end-to-end principle, the
layered structure of the Internet and the code thesis, Solum and
Chung derive two specific guidelines for regulators of the Internet:
(1) They "should not adopt any regulation that would require one
layer of the Internet to differentiate the handling of data on the
basis of information available only at another layer, absent a
compelling regulatory interest." 48 (principle of layer separation)
The second guideline requires that "if compelling regulatory
interests require a layer-crossing regulation, public internet
regulators should adopt the feasible regulation that minimizes the
distance between the layer at which the law aims to produce an effect
and the layer targeted by legal regulation."49 In other words, this
principle states that layer crossing should only be done if no other
regulatory alternative is available, and if the interest that is
furthered by regulation is a compelling one. Even then, the layer
crossing should be minimized. The problem of regulating hate speech manifests itself at the content layer, so according to these rules, regulation of content should be directed at the content layer (1) or, if this is impossible, at a layer as close as possible to it (layers (2)-(6)).
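One way to make this second guideline concrete is to read it as a minimization rule. The following sketch is an illustrative formalization, not code from Solum and Chung's article, and the candidate regulations are invented:

    # Layers numbered as in the text: 1 = content ... 6 = physical.
    LAYERS = {"content": 1, "application": 2, "transport": 3,
              "ip": 4, "link": 5, "physical": 6}

    def layer_distance(problem_layer: str, target_layer: str) -> int:
        return abs(LAYERS[problem_layer] - LAYERS[target_layer])

    # Hate speech manifests at the content layer (1). Hypothetical
    # candidate regulations, each tagged with the layer it operates at:
    candidates = {
        "prosecute the speaker directly": "content",
        "require search engines to delist": "application",
        "order ISPs to block the source IP": "ip",
    }
    best = min(candidates,
               key=lambda name: layer_distance("content", candidates[name]))
    print(best)  # the content-layer measure wins whenever it is feasible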
In their article, Solum and Chung then apply this layer principle to
a variety of examples. They discuss the Burmese government's policy
to strictly control the physical layer by limiting access to
communication lines, equipment and network hardware (physical layer
(6)) in order to prevent government criticism (content layer (1)) as an example of a layer-crossing violation.50 China's Internet policy is
also discussed in this context. The Chinese government also exerts
control over the physical layer to control content; for example, it
has a monopoly over all Internet connections going in and out of the
country and ISPs are required to register with the government.51
China also blocks numerous foreign sites such as the New York Times
or sites dealing with human rights issues. ISPs are required to block
access to all sites originating from a certain IP address, so this is regulation of content at the level of the IP layer (4). According to the model of Solum and Chung, this is also a layer-crossing violation, but a less serious one than trying to regulate
content through controlling the physical layer, as it does not cross
as many layers.
These kinds of regulations, according to Solum and Chung, do have
negative consequences for the Internet. For example, the IP blocking
interferes with the fluidity of the network. IP blocking usually
means that a router (a device that, upon receiving a data packet,
determines the next network point to which a data packet should be
forwarded on its way towards its destination) will drop the data
packet. The router accepts the data packet but is programmed to drop it if it originates from a blocked IP address, even if the final destination of the information is outside China. This slows down and
complicates Internet traffic.52 If every country adopted these kinds of measures, the Internet would be reduced to a set of interconnected intranets. By eliminating the "stupidity" of the
network and by making the IP layer intelligent and discriminatory,
the end-to-end principle is violated.
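The layer-crossing character of such blocking can be sketched as follows (a simplified illustration, not an actual router configuration; the blocked address is invented):

    # Hypothetical illustration of IP blocking at the router.
    BLOCKED_SOURCES = {"203.0.113.5"}  # invented address of a banned host

    def route(packet: dict) -> str:
        # The rule operates at the IP layer (4) but aims at content
        # (layer 1): every packet from the blocked address is dropped,
        # whatever it carries, even if its final destination lies
        # outside the regulating country.
        if packet["src_ip"] in BLOCKED_SOURCES:
            return "dropped"
        return "forwarded to next hop"

    print(route({"src_ip": "203.0.113.5", "payload": b"any content"}))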
Solum and Chung also discuss the Yahoo! order in this context, pointing out that the order of the French judge also constitutes a layer violation, because it requires blocking that takes place at the
IP layer. However, the authors suggest that in this case there may
have been no alternative and that a change in Internet architecture
may be justified since France is pursuing a compelling government interest.53
The model provided by Solum and Chung is applicable to the issue of
hate speech. The model suggests that hate speech, which takes place
at the content layer, should be addressed at the content layer level.
If this cannot be done, regulation should be targeted at layers
closer to the content layer. Solum and Chung's model allows for layer
violations if there is no alternative available and if there is a
compelling government interest in regulating. However, it is not
entirely clear what constitutes a compelling government interest and
how one can determine that there is no non-layer violating
alternative solution available. The compelling-government-interest requirement is somewhat troublesome because this is exactly the crux of the problem with Internet regulation: there are many local interests that may all be very compelling, but accommodating them all would be detrimental to the open structure of the Internet. The
condition that layer violations are permissible only if there are no
non-layer violating solutions available is too vague. There are
numerous possible ways content can be regulated, most of which have
some problems. They may be unfeasible technologically or politically,
they may be only partially effective or they may easily be
circumvented. More specific criteria about what qualifies as an
available alternative solution are needed.
Solum and Chung's argument against layer violation is powerful, and
layer violation is a concern that needs to be taken into
consideration when regulating content online. But we also need
clearer guidelines to determine under which circumstances and to what
extent a government is justified in trying to further its
compelling interest on the Internet, as doing so may have an effect
on other countries.
3. Sovereignty and Jurisdiction
3.1. Imperialism or democracy?
One of the main issues of contention in the Yahoo! case was whether
or not the French court erred in asserting jurisdiction over Yahoo!,
an American company located in the United States with no assets in
France. Did the French court act in an imperialistic way or was it
merely trying to uphold its own laws, as a sovereign nation is
entitled to do? Should the French court have asserted jurisdiction
over Yahoo!, merely on the basis that its Web sites could be accessed
in France? Guidelines are needed that spell out in which cases, if ever, regulators from one nation should attempt to regulate speech that originates outside its borders but can be accessed within them. These guidelines will depend on what one considers the prerogatives of a sovereign nation to be in enforcing its laws on the Internet; in other words, on what one understands by the term "sovereignty."
In a 1999 Harvard Law Review note,54 the problem of nation states and
their desire to have their laws respected on the Internet is
discussed in the context of the "sovereignty" concept. The author
tries to devise a normative sovereignty concept that could guide
politics regarding Internet regulation. A distinction is made between
three conceptions of state sovereignty: the realist, the
representational and the postmodern. The postmodern sovereignty
concept relies on the factual assumption that cyberspace is an
independent, sovereign space responsible for its own regulation, an
assumption whose factual basis has become highly questionable. We will therefore mainly focus here on the realist and representational sovereignty concepts as normative anchor points for Internet regulation.
3.2. The realist sovereignty concept
The realist conception of sovereignty is derived from realist theory in international relations.55 This theory states that nation states
are the primary actors in international politics and that states'
main and rational concern is the maximization of their power.
According to this theory, states will try to have their laws respected at all costs, even at the expense of (actors in) other states. On this view, sovereignty means that the state, and the state alone, is the supreme authority that has exclusive jurisdiction over its citizens and internal affairs. Any limitation of this authority is seen as a limit on this sovereignty. The author
points out that states attempting to regulate the Internet often do
so on realist assumptions. They see the Internet as a threat to their
sovereignty and try to impose their laws on the medium to fight off
that threat. States operating within the realist paradigm rely on two
principles for asserting jurisdiction: the effects principle and the
territoriality principle.56 The territoriality principle (applied to
Internet communication) states that "a state has authority to
regulate the transmittal of information across its borders and the
use of that information by individuals within its territory."57 The
effects principle is invoked when states impose their domestic rules
and laws upon out-of-state actors based on the fact that their speech
or actions, even though originating in a forum where they are legal,
have effects in a place where they are not.58 The author mentions China's attempts to block certain information from entering its territory as an example of this approach.59 The claims of the
cyberlibertarians, who argue that cyberspace is separate from the
"real world," also rely on the realist assumption of territorial
control because they think that cyberspace should be regulated from within and not by outside laws.60
Reliance on realist assumptions cannot provide a normative framework
of sovereignty in which one can anchor an approach towards Internet regulation. First of all, it is unlikely to be successful.
In this respect, the cyberlibertarians were right in arguing that it
may be hard for nations to enforce their local rules on a global
medium. States can assert jurisdiction, but as long as the person (or
business) over whom they assert jurisdiction does not reside within
their territory, does not have assets there or is not subject to
extradition, any judgment rendered against him will be unenforceable.
However, the effectiveness of these kinds of measures is not a
consideration at this point. A more important reason to reject this
approach is that it would lead to a situation in which every nation would try to impose its norms on the Internet, which, if successful, would lead to an Internet governed by the lowest common denominator.
In a realist paradigm, the attempts of a country like France to
uphold its democratically adopted laws are not distinguishable from
non-democratic governments trying to limit citizens' ability to
gather and spread information critical of the government. In both
instances, countries are trying to unilaterally impose their laws on
the Internet. A normative model of sovereignty in which to anchor guidelines for Internet regulation needs to be more fine-tuned than the one provided by the realist concept of sovereignty.
3.3. The representational sovereignty concept
The second model of sovereignty the author discusses is the "representational conception,"61 which has its roots in liberal
ideology. According to this view, the individual, not the state, is
the most important unit of analysis in the international system. In
this model, the state derives its sovereignty from the fact that it
represents the general will of its people. Under this sovereignty
concept, citizens in any state should have the right to have their
democratically passed laws enforced, granting the state the power to
ensure that their will is followed. States can regulate Internet
content such as pornography, gambling or hate speech provided that a
democratic consensus exists that these kinds of speech are not to be
tolerated. This is the argument that Reidenberg makes in his article
in which he defends the French court's decision in the Yahoo case.62
This theory distinguishes countries like France from repressive
regimes that try to restrict their citizens' access to the Internet
or that try to control, block and or filter content because it could
challenge their power or because it is contrary to state-imposed
doctrine. Because these countries do not represent the will of their
people, they cannot make claims of sovereignty. However, a state has
to recognize that other democratic states also try to uphold their
democratically passed laws and have the right to have their laws
respected within their borders:
[T]he legitimacy of applying a state's laws to conduct that occurs in
another state's territory depends on whether such laws 'would prevent
[that] State from functioning as a sovereign; that is, the extent to
which such generally applicable laws would impede a state
government's responsibility to represent and be accountable to
citizens of the State.63
In the context of hate speech this is important. Engaging in (most
types of) hate speech is a constitutionally protected activity in the
United States; therefore, trying to limit this right would interfere with the United States' responsibility to uphold its constitutional
principles. The representational concept of sovereignty suggests that
nations have to be aware of the fact that other states that respect
human rights and the principle of democratic representation can also
make claims of sovereignty,64 and that restraint should be exercised
when attempts to enforce local policies encroach upon other nations'
ability to represent the will of their people. Of course this is a
vague guideline. In the Yahoo! case, for example, one
could argue that the French court "exercised restraint;" it addressed
questions of jurisdiction and tried to come up with a solution
(filtering based on location) that would not affect American Internet
users. Arguments that the French court did not use caution can also
be easily made, as the many articles criticizing the French decision
illustrate. If this concept is to be of any use, it needs to be established more clearly what is meant by "exercising restraint." Contrary to the assertions of the author, the
representational conception of sovereignty provides some specific
guidance for Internet content regulation in an international context.
First of all, as stated above, it allows for a distinction to be made
between democratic and non-democratic states' attempts to regulate
the Internet. If we add to this the notion that a state should be
less inclined to pursue its policy goals if doing so is likely to
interfere with other governments' abilities to represent the will of
their citizens and act in a sovereign way, we have the beginning of a
broad normative framework. According to this normative framework,
states should not pursue their interests when doing so has negative
consequences for actors in other states with representative
governments. An important practical consequence of this normative
framework in the context of Internet regulation is that it requires
abandoning the effects test. Countries should not assert jurisdiction
or regulatory power over out-of-state actors merely because their
speech can be accessed within their borders, as this would not always
respect other nations' sovereignty.
If two democratic regimes have different ideas about permissibility
of Internet content, the representational model of sovereignty argues
that a nation should try to have its laws respected (because they
represent the will of the people), without imposing "negative
externalities" upon actors in other states. In other words, whereas
the realist paradigm advocates a unilateral approach towards Internet
regulation, this version of the representational conception proposes
a multilateral approach in which various interests are balanced
against each other. This of course is still a general principle that
needs to be translated into more specific guidelines. In order to do
so, it is illuminating to look at how jurisdictional issues have been
dealt with within the United States.
3.4. Assessing jurisdiction online: Lessons from the American approach
The federal structure of the United States has forced the legal
system to deal extensively with Internet jurisdiction issues. The
American experience provides some insights that can also be useful in
the international context. While jurisdictional issues arising in a
federal nation such as the United States are substantially different
from those arising in an international context, some of the rationales applied by the courts in regard to Internet jurisdiction are instructive.
3.4.1. The "traditional approach"
Traditionally, jurisdiction over out-of-state residents within the United States has been based on International Shoe v. Washington.65
Absent a traditional basis for jurisdiction (such as residence),
courts will assert jurisdiction on the basis of whether a state's
long-arm statute extends to the non-resident defendant in the light
of specific facts. Courts must determine whether or not exercising
personal jurisdiction in a particular case comports with due process
guaranteed by the constitution.66 This requires examining whether or
not the defendant has had minimum contacts with the forum state, i.e.
if he has "purposefully availed" himself of the laws of the forum
state through his activities. Lastly, courts need to determine
whether or not asserting jurisdiction comports with "traditional
notions of fair play and substantial justice." In order to establish
whether or not the minimum contacts requirement has been met, a
three-pronged test was developed following International Shoe:67 (1)
There must be an act by which the defendant purposefully avails
himself of the laws of the forum state. Purposeful availment includes
the conduct of the defendant, and whether he intends this conduct to
have an effect in the forum state. (2) The claim must arise as a
result of the defendant's activities. (3) Exercise of personal
jurisdiction must be reasonable.68 With the rise of the Internet, courts were faced with the question whether maintaining a Web site constitutes purposeful availment. Initially, the answer of American courts to this question was that it did.
In Inset Systems, Inc. v. Instruction Set,69 a trademark infringement
case, the federal district court in Connecticut held that a Massachusetts-based company using a Web site to advertise its products had manifested purposeful availment in all jurisdictions where Internet access is available. Other courts followed this
rationale,70 stating that having a Web site constitutes purposeful
availment, implying that everyone who operates a Web site could be
hauled into any courtroom in the country. Web advertising was seen as
advertising to the whole nation, and by trying to reap sales
nationwide (even if the business is clearly local), Web site
operators should assume the risk of being sued outside their home
states, the argument went.71 The court likened advertising over the Internet to a continuous advertisement and ruled that Instruction Set had purposefully directed its activities towards Connecticut and should therefore have anticipated being hauled into court there.72 The court
provided no in-depth analysis of the Internet as a medium. Instead,
it made an analogy with traditional media forms, and applied existing
law. In doing so, it proceeded much in the same way as the French
court in the Yahoo! case had. It considered only the effects of the
Internet communication, and asserted jurisdiction based on the
Internet's reach, without considering the actual intent of the
content provider. In the wake of this decision, many courts followed
this approach,73 though some deviated.74
3.4.2. The Zippo test
The Inset rationale clearly cast too wide a net by establishing that
everyone operating a Web site purposefully avails himself of the laws of every jurisdiction where the site can be accessed. A subsequent line of
cases would present a more nuanced test by considering the kind of
Web site that is being operated. This new test was developed in Zippo
Manufacturing Co. v. Zippo Dot Com, Inc.,75 an infringement case in
which the Pennsylvania-based manufacturer of the famous lighters sued a California-based company for various trademark infringements.76
Zippo Dot Com operated a Web site advertising and offering paid and
free access to its Internet news service. To become a paid
subscriber, prospective members had to submit names and addresses via
an online form and pay for their memberships via credit cards, either
over the phone or online. They were then sent a password to access
Internet news group messages. At the time of the lawsuit, about 3,000
Pennsylvanians had subscribed to the service. The court had to decide
whether personal jurisdiction arose out of the contacts Zippo Dot Com
had established with Pennsylvania through its Web site. Rather than
engaging in the analysis established by Inset, the court decided that
the likelihood that personal jurisdiction can be exercised is
"directly proportionate to the nature and quality of commercial
activity that an entity conducts over the Internet."77
In what came to be known as the Zippo test, courts weighed the
relative interactivity of a Web site to determine whether assertion of
jurisdiction is appropriate. At the one end of the spectrum are the
active Web sites, where "a defendant clearly does business over the
Internet. If the defendant enters into contracts with residents of a
foreign jurisdiction that involve the knowing and repeated
transmission of computer files over the Internet, personal
jurisdiction is proper."78 At the other end of the sliding scale are
passive Web sites, where someone merely posts information that is
accessible to users in other jurisdictions, which is not sufficient
basis for asserting jurisdiction.79 The middle ground of the sliding
scale is occupied by Web sites where a user can exchange information
with a host computer, in which cases the level of interactivity and
commercial nature of the activity needs to be taken into
consideration when making decisions about exercising jurisdiction.80
In the Zippo case, the court ruled that, given the nature of the
contacts Zippo Dot Com had forged through its Web site with
subscribers from Pennsylvania and ISPs based in Pennsylvania,
jurisdiction was proper.81 In the years following, the Zippo test was
used in numerous cases.82
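Read as a rough decision procedure, the sliding scale might be paraphrased as follows (an illustrative sketch; the categories are the court's qualitative ones, and the boolean inputs are simplifications):

    def zippo_scale(site: dict) -> str:
        # Active end: knowing and repeated transmission of files under
        # contracts with forum residents supports jurisdiction.
        if (site["contracts_with_forum_residents"]
                and site["repeated_transmissions"]):
            return "jurisdiction proper"
        # Passive end: merely posting accessible information does not.
        if not site["interactive"]:
            return "jurisdiction improper"
        # Middle ground: weigh interactivity and commercial nature.
        return "weigh level of interactivity and commercial nature"

    print(zippo_scale({"contracts_with_forum_residents": False,
                       "repeated_transmissions": False,
                       "interactive": False}))  # passive site: improper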
The Zippo test was clearly a step forward from the analysis presented
by the Inset court, as it looked to find a balance between a lawless
Internet and an over-regulated one.83 Yet, it has come under
increasing criticism in recent years. While the Zippo test provided a
more fine-tuned tool than the test in the Inset case (which was not
really a "test"), it still suffers from some shortcomings. It does
not provide a clear standard that allows a business to gauge the risk of out-of-state lawsuits to which it exposes itself by taking its business online. Most Web sites are neither completely
active nor completely passive, and fall in the "middle zone." In that
case, the court's analysis of the specific facts of the case will
determine whether the online activity constitutes purposeful
availment.84 It remains unclear how much interactivity or commercialism (is a site that does not offer anything for sale but makes money from banner advertising a site that "does business"?) is required to assert personal jurisdiction.85
The Zippo test also seems to be inapplicable to libel cases, as it
applies mainly to commercial and interactive sites. It seems to
suggest that jurisdiction claims in cases where defamatory statements
are made on a passive non-commercial Web site with the knowledge and
purpose to harm the plaintiff in the forum state are to be
dismissed.86 Nevertheless, the Zippo test has been applied in libel
cases.87 It is also not always easy to distinguish an active Web site from a passive one, as ostensibly passive sites may employ data-collecting technologies.88 Standards for what is considered
active and what is passive may shift constantly as technology
changes.89 These weaknesses of the Zippo test have caused some
scholars to propose abandoning the test altogether and applying classical standards to determine purposeful availment instead of designing a test specifically for the Internet.
3.4.3. Beyond the Zippo test
In recent years, some courts have moved away from Zippo's
active-passive test and have started to adopt the effects doctrine90
established in the Supreme Court decision Calder v. Jones.91 The name
"effects doctrine" is somewhat misleading in this context, as it is
not the same as the blanket effects rationale applied in the Yahoo!
case or the Inset case, where jurisdiction was asserted on the basis
that Web sites could be accessed in a certain territory. It has come
to be interpreted more as a targeted test, in which the intention of
the sender to "target" a specific jurisdiction is taken into
consideration. The Calder doctrine holds that jurisdiction over a
defendant is proper when "the defendant's actions are expressly aimed
at, and the brunt of the injury is felt in, the forum state."92
Calder v. Jones and its companion, Keeton v. Hustler Magazine,93 both
dealt with jurisdiction issues in libel cases, and in both cases
defendants were hauled into out-of-state courts based on the fact
that their publications had substantial circulations in those
jurisdictions.94 This seems to imply that Internet publishers could also be hauled into every courtroom in the country, as they can be considered to have a nationwide publication. But this is not how some courts have applied Calder to jurisdiction issues in online libel cases.
In Griffis v. Luban,95 a target-based test was applied in an online
defamation case. The case resulted from an argument between two
Egyptologists on an Internet news group in which Marianne Luban, a
Minnesota resident, questioned the credentials of Katherine Griffis,
an Alabama resident. Among other things, Luban had stated that
Griffis had received her degree from "a box of
crackerjacks."96 Griffis sued for libel in an Alabama court and was
awarded S25,000 in a default judgment.97 Luban fought the enforcement
of the judgment in Minnesota. The state trial court and the appellate
court ruled that Luban had had minimum contacts with Alabama,98 but
the Minnesota Supreme Court reversed, holding that the judgment was
not enforceable in Minnesota because the Alabama court did not have
jurisdiction over Luban.
It argued that the message posted in the news group did not
specifically target Alabama, as the forum was nationwide in scope.
Even though the defamatory statements could be read in Alabama, this
did not demonstrate that Alabama was the focal point of Luban's tortious conduct, the court argued. It rejected the view that Calder
supports a broad effects based test in which jurisdiction is
supported merely because the effects of a tort committed in another
jurisdiction can be felt in a given forum.99 The standard expressed
by the Minnesota Supreme Court supports jurisdiction in libel cases
only if the statements are expressly aimed at the forum state:
While the record supports the conclusion that Luban's statements were
intentionally directed at Griffis, whom she knew to be an Alabama
resident, we conclude that the evidence does not demonstrate that
Luban's statements were "expressly aimed" at the state of Alabama.
The parties agree that Luban published the allegedly defamatory
statements on an internet newsgroup accessible to the public, but
nothing in the record indicates that the statements were targeted at
the state of Alabama or at an Alabama audience beyond Griffis herself.100
Young v. New Haven Advocate101 presents another example in which a
narrow target test is applied in an online defamation case. In this
case, a Virginia prison warden sued two Connecticut newspapers, which
had no or a very limited circulation in Virginia, for libel in the
federal District Court for the Western District of Virginia.102 The
newspapers had printed articles and columns describing poor
conditions in the prison where Young was the warden, and allegedly
implied that he held racist beliefs. The papers did, however, maintain online versions that could be accessed in Virginia. The Fourth
Circuit Court of Appeals applied Calder, and held that the papers had
not had minimum contacts with Virginia, since most of the content of the papers was directed at a Connecticut readership and most of the advertising in the papers was also clearly aimed at Connecticut residents.103 The court here interpreted the Calder test as a
targeted test: "We thus ask whether the newspapers manifested an
intent to direct their website content - which included certain
articles discussing conditions in a Virginia prison - to a Virginia
audience."104 Conducting this analysis, the court found that as a
whole, the papers' sites would not be of interest to residents of Virginia.
In Winfield Collection, Ltd. v. McCauley,105 a copyright infringement
case, the United States District Court for the Eastern District of
Michigan mounted a pointed criticism of the Zippo test and suggested that in the absence of a specific test to establish what constitutes "minimum contacts" on the Internet, traditional legal principles can be applied:
However, the distinction drawn by the Zippo court between actively
managed, telephone-like use of the Internet and less active but
"interactive" web sites is not entirely clear to this court. Further,
the proper means to measure the site's "level of interactivity" as a
guide to personal jurisdiction remains unexplained. Finally, this
court observes that the need for a special Internet-focused test for
"minimum contacts" has yet to be established. It seems to this court
that the ultimate question can still as readily be answered by
determining whether the defendant did, or did not, have sufficient
"minimum contacts" in the forum state. The manner of establishing or
maintaining those contacts, and the technological mechanisms used in
so doing, are mere accessories to the central inquiry.106
In this case, the court ruled that selling items over eBay does not
automatically mean that one purposefully avails himself of the
governing laws of the buyer's jurisdiction. The seller cannot be
expected to have advance knowledge of where the items will be sold.
The court was not prepared to hold that "the mere act of maintaining
a Web site that includes interactive features ipso facto establishes
personal jurisdiction over the sponsor of that Web site anywhere in
the United States."107
Whether the cases discussed above are based on a misreading of the Calder test (which in its original formulation was an effects-based test) or on a logical return to first principles after the contrived Zippo test is not a question that needs to be addressed here, but this development has inspired some scholars, who thought the latter was the case, to propose a similar target-based test in an international context.
3.5. A target-based test for Internet transactions
Observing this shift away from Zippo, Geist proposes a new test based
on a target-based analysis for asserting jurisdiction in Internet
transactions.108 The test proposed by Geist would have courts take three factors into consideration when determining whether jurisdiction is proper in Internet cases. The first factor considers whether or
not either party utilized a contractual provision to specify which
law should govern their transactions.109 The second factor would
consider whether or not the Web page sponsor used technology on the
Web site to either target or avoid jurisdiction.110 Geist focuses
here mainly on whether or not the sender of the information used
technology to target a certain geographical area. The third factor
assesses whether or not a party has or ought to have knowledge about
the geographic location of the online activity.111 By this, Geist
means that in certain instances where a Web site has no technology in
place to target a specific region or in which there is no contractual
arrangement between parties, a Web site sponsor may still have
sufficient knowledge that his Web site targets a specific
jurisdiction to confirm jurisdiction. Geist gives the example of a
gambling Web site whose owners claimed they did not know that they
were taking bets from New York residents (which would be illegal), as
the site required people to give their residence before accepting
bets, and that it would not let New Yorkers place bets. However, the
court112 ruled that the casino operators very well knew that most
people could easily circumvent this by entering a fake out-of-state address.113
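To make the circumvention problem concrete, the following minimal
sketch (our own illustration, not drawn from the case or from Geist;
all names and values are hypothetical) shows the kind of
self-declared residence check at issue and why it is so easily
defeated:

    # Hypothetical sketch of a self-declared residence check of the
    # kind at issue in the World Interactive Gaming case.
    BARRED_STATES = {"NY"}  # jurisdictions where taking bets is illegal

    def may_place_bet(declared_state: str) -> bool:
        """Accept or refuse a bet based solely on the user's own claim."""
        return declared_state.upper() not in BARRED_STATES

    # The check blocks only honest users: a New York resident who
    # simply declares an out-of-state residence passes it.
    print(may_place_bet("NY"))  # False: the honest New Yorker is refused
    print(may_place_bet("NJ"))  # True: the same New Yorker, lying, gets through

Because the site relies entirely on unverified self-declaration, the
court could conclude that the operators had reason to know they were
still serving New York residents.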
2.6. Asserting jurisdiction
Henn114 builds upon Geist's proposal to formulate a target-based test
for determining whether minimum contacts have taken place on the
Internet in an international context. She points out that Geist's
proposal applies only to active commercial Web sites, whereas in many
instances hate speech or other controversial speech is conveyed
through "passive" Web sites.115 Henn suggests another three-pronged
test to determine whether a Web site's content provider purposefully
avails himself of the laws of a certain jurisdiction.
A primary test that could be applied to a non-interactive Web site,
according to Henn, is to consider whether the site uses a foreign
language. For example, neo-Nazi sites based in the United States but
written in German could be considered to be targeting Germany.
Secondly, Henn argues that a content provider has targeted a foreign
jurisdiction if the information available on the Web site directs the
viewer to local information.116 For example, if a Web site provides
links to Web sites that are clearly local, or if users are directed
to physical locations in that forum, this requirement would be met.
It is surprising that Henn does not include in her proposal the
requirement that the information itself be clearly of a local
character and deal with local topics; she proposes the slightly
stricter criterion that the information contained on the site
"directs the viewer to local information."117 Finally, Henn also
suggests that a Web site that uses software to target its advertising
to users in a specific jurisdiction avails itself of the laws of that
forum: "If the content provider has tools that are sophisticated
enough to target advertising, then they should also have the ability
to monitor what country's citizens are accessing their Web site and,
thus, have reason to know the laws to which they could potentially be
subjected."118
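Henn's three prongs can be read as a simple decision procedure. The
sketch below is our own illustrative encoding of her test, not an
implementation she proposes; the inputs, and the assumption that any
single prong suffices, are ours:

    # Illustrative encoding of Henn's three-pronged targeting test.
    # The inputs and the "any one prong suffices" rule are assumptions.
    def targets_forum(page_language: str,
                      forum_language: str,
                      links_to_local_info: bool,
                      geo_targets_ads: bool) -> bool:
        """Return True if, under Henn-style factors, the site can be
        said to purposefully target the forum jurisdiction."""
        # Prong 1: the site is written in the forum's language.
        language_prong = page_language == forum_language
        # Prongs 2 and 3: local links/directions, geo-targeted ads.
        return language_prong or links_to_local_info or geo_targets_ads

    # A U.S.-hosted, German-language site linking to German addresses:
    print(targets_forum("de", "de", links_to_local_info=True,
                        geo_targets_ads=False))  # True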
So what would be the practical application of this proposal? It is
designed to single out Web sites hosted on American servers that
clearly target other jurisdictions. Henn's test is designed to assess
the intent behind a Web site, as an alternative to basing
jurisdiction on an effects doctrine. A German-language neo-Nazi site
discussing German policy on immigrants, for example, or a
German-language message board discussing similar topics would,
according to this proposal, avail itself of the laws of Germany, even
if hosted in the United States. Henn's test would bar courts from
asserting jurisdiction over content that is illegal in their country
but has not targeted it; only content providers who explicitly target
certain jurisdictions would open themselves up to prosecution there.
This proposal is, of course, not without problems: it lacks clarity,
would need to be accepted at an international level through a treaty
and could still raise First Amendment concerns if applied to hate
speech. The First Amendment also protects non-English speech dealing
with non-American topics. The effectiveness and feasibility of such a
test for determining jurisdiction will be discussed in greater depth
in the next section. But as a normative concept, the target approach
can fine-tune the representational model of sovereignty in its
rejection of the effects-based test. For our purposes, Henn's test
contains some elements that can help to build a normative framework
for Internet regulation across borders. Based on her test and the
representational concept of sovereignty, a guideline can be developed
stating that any kind of Internet regulation in which an actor,
whether governmental or private, tries to regulate speech originating
across borders in a way that puts a burden upon actors outside its
borders should be based on this targeted approach.
For example, European regulators wanting to bar their citizens from
accessing an American site on which the Holocaust is denied, but
which does not specifically target other nations through content or
advertising, should do so in a way that does not burden out-of-state
actors. This does not mean that states have no power to restrict
their citizens' access to proscribed speech, but it means that they
should do so without burdening out-of-state actors, such as foreign
ISPs or content providers. "Burdening" could include unilaterally
imposing any kind of restrictions or pressure on out-of-state actors
that, if successful (which in many cases they will not be), would
limit their ability to exercise their free speech rights; it could
also mean restricting access to this kind of speech for people living
in jurisdictions where it is protected. As the Supreme Court decision
in Reno made clear, American First Amendment provisions do not allow
speech to be made less available to those who have a constitutional
right of access to it in an attempt to limit access for those who do
not have this right. The French Yahoo! order obviously failed this
test, but this would not mean that a French anti-racism organization
could not attempt to enter into a debate with providers of hate
speech content in the United States in order to convince them to
remove certain content, as this approach is based on dialogue and
cooperation rather than unilateral enforcement. Such an approach
would, however, avoid attempting to drag actors into a foreign
courtroom for engaging in speech protected by their own laws. Trying
to convince ISPs in the United States to remove content or to change
their terms of service to ban certain materials would be permissible.
As discussed above, the representational concept is against
unilateralism but supports dialogue and mutual agreement, and this
kind of informed self-regulation is therefore not at odds with this
sovereignty concept. It may, however, raise concerns that private
groups would become de facto censors of the Internet.
2.6.1. Loci of content control on the Internet
This still somewhat vague and general requirement can be made more
concrete by specifying the concept of out-of-state actors as it
relates to the Internet. Who are the players in the Internet
communication process that can influence communication over the
Internet and can themselves be regulated? Where, in other words, are
the different loci of control that exist on the Internet?
Zittrain119 identifies four loci of control: the source, the source
ISP, the destination and the destination ISP. For regulators, it is
important to distinguish between these actors as "loci of control."
1. Control at the source:120 The easiest way to control content is at
the level of the source instigating the transfer of information. The
sender of the information can restrict access to the material, for
example by requiring passwords, can remove the material, or can
choose to do nothing at all.121 This is the level at which control of
content can be exercised most effectively. However, most people, or
at least most purveyors of hate speech, use the Internet precisely
because it offers a potentially global audience at low cost, and they
have little incentive to limit that audience.
2. Source ISP:122 Zittrain makes a distinction between ISPs (Internet
Service Providers) and OSPs (Online Service Providers). ISPs serve as
a link between a client and the Internet, allowing an individual to
connect to the Internet; as such, ISPs pass along packets of
information to and from an individual's computer. In addition, ISPs
sometimes also host content placed on their servers by subscribers.
ISPs can remove content from their servers if they choose to do so,
for example because the hosted content violates acceptable use
policies or because the authorities order them to. When Yahoo!
decided to remove certain Web pages from its Geocities Web hosting
service following the French court order, it exercised this power.
It is important to make this distinction between source and source
ISP. A source does not need to be located in the same place as the
source ISP: one can maintain a Web site hosted on an American server
without being in the United States oneself, and one can upload
content anonymously to an American server from Germany. While the
American server cannot be regulated by German authorities, the
provider of the content can be subjected to German law, provided, of
course, that German authorities know the content provider's identity.
3. Destination:123 Content control can also occur at the destination,
the recipient of the information, just prior to an Internet user's
exposure to the content. This requires action at the level of the
Internet user's computer, through installing software or changing
browser settings. For example, libraries may install filtering
software that blocks pornographic content, and classroom browsers can
be configured so that only a limited set of pre-approved sites can be
accessed (a minimal sketch of such an allowlist follows this list).
However, this kind of content regulation can be successfully achieved
only if the owner of the computer agrees to take the necessary steps.
4. Destination ISP:124 Unlike a source ISP, a destination ISP has no
relationship with the content provider and cannot remove his content
from a server or cancel his account. As Zittrain describes,
destination ISPs are merely "off ramps" for data solicited by their
customers. Of course, an ISP can be, and usually is, both a source
and a destination ISP, depending on the specific data transfer.125
When performing the function of a destination ISP, a provider cannot
remove material or make it unavailable to the whole Internet, but it
can make material unavailable to its own subscribers if it wants to
do so. However, this is not always simple and can impose a burden on
ISPs. In 2002, the Pennsylvania legislature enacted a law requiring
an ISP to remove or disable access to child pornography "residing on
or accessible through its server"126 upon receiving notice from the
Pennsylvania attorney general. The Center for Democracy and
Technology, the American Civil Liberties Union of Pennsylvania and
Plantagenet (an ISP) challenged the law in court. They argued, among
other things, that efforts to disable access to child pornography had
led to overblocking in violation of the First Amendment and that the
procedure spelled out in the law amounted to an unconstitutional
prior restraint on speech, an argument with which the court would
agree in striking down the law.127 In the context of Internet
regulation, it is important to note that governments and regulators
often have no power over source ISPs located outside their
jurisdictions, whereas destination ISPs are usually located in the
same jurisdiction as their subscribers and are therefore easier to
regulate.
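As an illustration of control at the destination (point 3 above), the
following minimal sketch shows the kind of allowlist filtering a
classroom browser or library machine might enforce; the listed hosts
and function names are hypothetical:

    # Minimal sketch of destination-side content control: an allowlist
    # filter such as a classroom or library computer might apply.
    from urllib.parse import urlparse

    ALLOWED_HOSTS = {"www.example-encyclopedia.org",
                     "www.example-library.org"}

    def may_load(url: str) -> bool:
        """Permit a request only if its host is on the approved list."""
        return urlparse(url).hostname in ALLOWED_HOSTS

    print(may_load("https://www.example-encyclopedia.org/article"))  # True
    print(may_load("https://www.unapproved-site.com/"))              # False

As the text notes, this kind of control works only because the owner
of the machine has chosen to install it; nothing at the destination
compels it.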
This section started out with an attempt to establish a normative
framework for evaluating Internet regulation based on a normative
sovereignty concept. The representational concept of sovereignty
provided the most appropriate model, as it neither relied on the
notion that the Internet is its own sovereign domain nor assumed that
nations should adopt an effects-based approach, as the realist
sovereignty model suggests. However, avoiding these two extremes did
not provide any specific guidelines. By combining Henn's target test
with Zittrain's loci of control for the Internet, more specific
normative guidelines can be developed.
The normative model adopted here demands that, when trying to
regulate hate speech, regulators do not target out-of-state content
providers or out-of-state source ISPs that do not specifically target
the regulator's jurisdiction. Rather, solutions should be sought at
the level of the destination ISP, the destination of the content, or
the source of the content if that source is located within the
regulator's jurisdiction.128 However, when speech specifically
targets a forum, and Henn's test provides guidance in determining
this, attempts to regulate content at the level of the content
provider (the source) and the source ISP can be made, even if they
are located abroad. Even though Henn's test is not without problems,
in many instances the intent of the content provider to target a
specific jurisdiction will be obvious. The fact that regulatory
attempts that fulfill this criterion may be unsuccessful does not
matter at this point: this criterion only tries to outline under what
circumstances what kind of regulatory attempts are appropriate; their
effectiveness will be discussed as a separate criterion below.
3. Effectiveness
A last requirement is that a regulation be effective. "Effectiveness"
means not only that the regulation works, but also that it is not
overly broad, that it is feasible and that the content the regulation
tries to affect is in fact illegal.
3.1. Efficacy
Any measure should fulfill the regulatory goal that is set for it. In
the Yahoo! case, for example, it is not clear how successful the
French judge's order ultimately was in enforcing French law. One
could argue that its effect has been minimal, since it could not be
enforced in the United States, but it may have been an effective
strategy for trying to change the behavior of foreign ISPs. In order
to assess the effect of a regulatory measure, one needs to identify
its regulatory goals. A measure does not always need to be 100
percent efficacious to fulfill a regulatory goal. For example, the 70
percent accuracy with which French users could be identified and
blocked may seem low, but it may be an acceptable figure for the
policy the order was supposed to serve. Solutions do not have to be
perfect in order to be effective. When evaluating Internet
regulation, its efficacy needs to be considered.
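To see how such imperfect blocking plays out in the aggregate,
consider a minimal simulation (our own illustration; the lookup is a
hypothetical stand-in for the commercial geolocation databases of the
period, with the 70 percent figure taken from the discussion above):

    import random

    # Simulated country-based blocking with imperfect geolocation, as
    # in the Yahoo! order. The 0.70 accuracy mirrors the figure
    # discussed above; the lookup itself is a stand-in.
    GEO_ACCURACY = 0.70
    BLOCKED_COUNTRIES = {"FR"}

    def lookup_country(true_country: str) -> str:
        """Return the right country 70% of the time, else no answer."""
        return true_country if random.random() < GEO_ACCURACY else "UNKNOWN"

    def should_block(true_country: str) -> bool:
        return lookup_country(true_country) in BLOCKED_COUNTRIES

    # Roughly 70% of simulated French visitors are blocked; the rest
    # slip through unidentified.
    trials = 10_000
    hits = sum(should_block("FR") for _ in range(trials))
    print(f"blocked {hits / trials:.0%} of simulated French users")

Whether that 70 percent satisfies the regulatory goal is a policy
judgment rather than a technical one, which is exactly the point made
above.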
3.2. Against overinclusiveness
However, regulations should not be too efficacious either; they
should not be "overinclusive" and affect more speech than the speech
that is targeted. In the case of hate speech, regulations should only
affect speech that is illegal under the hate speech laws of the
regulators' country. In addition, no more people should be affected
by the regulation than necessary. For example, regulations should not
have the effect of barring American residents from accessing certain
types of speech merely because that speech is illegal in another part
of the world (as long as the speech does not specifically target that
jurisdiction). By demanding that regulations do not unilaterally
affect source ISPs or content providers, the risk that regulations
affect more people than strictly necessary is reduced.
Related to this concept of overinclusiveness, and to the
representational concept of sovereignty, is the demand that
regulations reflect the will of the people. There should therefore be
safeguards in place to ensure that content that is regulated (banned,
removed, blocked, and so on) because it violates certain laws is, in
fact, in violation of those laws. The provider of the content should
be able to appeal the measure, and the institution responsible for
the measure or for flagging the content should be transparent and
accountable to the general public, in order to prevent a private
organization from becoming a de facto censor of the Internet.
A last component of effectiveness is the practicality or feasibility
of a measure. How easy or complicated it is to implement a certain
regulation or measure will also determine its success. For example,
having all providers of hate speech voluntarily label their speech as
hate speech so that it could be filtered more easily might be an
effective measure, but it is not likely to happen. For any kind of
measure, practicality needs to be a consideration. Although it is
hard to establish fixed evaluative criteria for practicality or
feasibility, a measure will usually be more feasible if its success
depends on the efforts of a few rather than many, and if the cost and
time it requires are relatively low.
4. Conclusion
In this article, a set of criteria was developed to evaluate European
regulatory approaches towards online hate speech in general and hate
speech originating in the United States in particular. This set is
based on three general principles: (1) Internet regulation should
respect the open, layered structure of the Internet; (2) it should be
based on a representational concept of sovereignty; and (3) it should
be effective.
The first principle led us to adopt Solum and Chung's guideline that
Internet regulations should cross layers only if no other solution is
possible, and that even then the distance between the layer at which
the regulation aims to produce an effect and the layer targeted by
that regulation should be minimized. The representational concept of
sovereignty adopted here demands that regulation not target
out-of-state content providers or out-of-state source ISPs when
trying to regulate hate speech that does not target the regulator's
jurisdiction; solutions should instead be sought at the level of the
resident destination ISP, resident content providers or the
destination of the content. However, when the speech involved
specifically targets a forum, attempts to regulate content at the
level of the content provider and source ISP may be appropriate, even
though these attempts may not be successful.
The normative framework also demands that regulatory measures be
efficacious without being overinclusive. This means that regulations
should fulfill their regulatory goal without targeting more speech
than needed or making speech unavailable to more people than
necessary. Effective regulation of hate speech also demands that the
people or body responsible for determining what speech to regulate be
accountable to the public, and that their decisions can be appealed.
Lastly, effective regulation demands that the proposed methods be
feasible. In this model, it is important that effectiveness is
separated from the demands of the representational sovereignty
concept. In most cases, attempts to regulate out-of-state actors will
also prove inefficacious or infeasible, but that is not necessarily
always so. Even if out-of-state actors could be regulated effectively
(because they have assets in other jurisdictions, or because changes
in the technological and legal landscape make it easier), the second
criterion demands that this be done only in specific cases. The
guidelines developed here are not limited to the issue of hate
speech; they also apply to other kinds of speech about whose legality
there is no consensus among democratic nations.
1 See for example: William B. Fisch, "American Law in a Time of
Global Interdependence: U.S. National Reports to the XVIth
International Congress of Comparative Law: Section IV Hate Speech in
the Constitutional Law of the United States," 50 Am. J. Comp. L. 463
(2002) at 471-476; Stephanie Farrior, "Molding the Matrix: The
Historical and Theoretical Foundations of International Law
Concerning Hate Speech," 14 Berkeley J. Int'l L. 3 (1996) at 14;
Michel Rosenfeld, "Conference: Hate Speech in Constitutional
Jurisprudence: A Comparative Analysis," 24 Cardozo L. Rev. 1523 (2003).
2 X v. Federal Republic of Germany, Appn. No. 9235/81, 29 DR 194 (1982).
3 Glimmerveen and Hagenbeek v. The Netherlands Appn. Nos. 8348/78 &
8406/78, 18 DR 187 (1979).
5 Defined in article 2.1. as "any written material, any image or any
other representation of ideas or theories, which advocates, promotes
or incites hatred, discrimination or violence, against any individual
or group of individuals, based on race, colour, descent or national
or ethnic origin, as well as religion if used as a pretext for any of
these factors."
6 Article 3, 2.
7 Amy Oberdorfer Nyberg, "Is All Speech Local? Balancing Conflicting
Free Speech Principles on the Internet," 92 Geo. L.J. 663 (2004) at 670.
8 See footnote 1.
9 Brian Levin, "Because of the Constitution's First Amendment, the
United States Now Hosts Hundreds of European Language Hate Sites,"
Southern Poverty Law Center: Intelligence Report (2003).
10 "Statement by Mr. Gérard Kerforn, Introducer at the Fourth Session
of the Conference on Racism, Xenophobia and discrimination."
Vienna, 4-5 September, 2003.
11 When two lawyers used the Internet in 1994 to advertise their
services through mass emails and mailing lists they subscribed to,
the reaction of the "Internet community" was one of outrage, and they
were forced to stop their online commercial solicitation. See Laura
J. Gurak, Persuasion and Privacy in Cyberspace: The Online Protests
Over Lotus MarketPlace and the Clipper Chip (1997) at 13.
12 Laura J. Gurak, Cyberliteracy: Navigating the Internet with
Awareness (2001) at 130.
13 John Perry Barlow, "A Declaration of Independence of Cyberspace,"
February 8, 1996. <http://homes.eff.org/~barlow/Declaration-Final.html>
14 Alfred C. Yen, "Western Frontier or Feudal Society?: Metaphors and
Perceptions of Cyberspace," 17 Berkeley Tech. L.J. 1207 (2002) at 1224.
15 Ibid. at 1225-1226.
16 However, geolocation software has made it possible to determine
the location of a user.
17 Matthew Fagin, "Regulating Speech Across Borders: Technology vs.
Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 404.
18 David R. Johnson and David G. Post, "The New "Civic Culture" of
the Internet." <http://www.cli.org/paper4.htm>
19 See for example: Laura J. Gurak, Persuasion and Privacy in
Cyberspace: The Online Protests Over Lotus MarketPlace and the
Clipper Chip (1997).
20 David R. Johnson and David G. Post, "Law and Borders - the Rise
of Law in Cyberspace," 48 Stanford Law Review 1367 (1996).
21 Ibid. at 1370.
22 Dan Hunter, "Cyberspace as Place and the Tragedy of the Digital
Anticommons," 91 Calif. L. Rev. 439 (2003) at 448-449.
23 Jack L. Goldsmith, "The Internet and the Abiding Significance of
Territorial Sovereignty," 5 Ind. J. Global Leg. Stud. 475 (1998). See
also Jack Goldsmith, "Unilateral Regulation of the Internet: A Modest
Defense," 11 Eur. J. Int'l L. 135 (2000); Allan R. Stein, "The
Unexceptional Problem of Jurisdiction in Cyberspace," 32 Int'l Law.
1167 (1998).
24 Dan Hunter, "Cyberspace as Place and the Tragedy of the Digital
Anticommons," 91 Calif. L. Rev. 439 (2003) at 450-451.
25 Matthew Fagin, "Regulating Speech Across Borders: Technology vs.
Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 405.
26 Lawrence Lessig, "The Architecture of Innovation," 51 Duke L.J.
27 Matthew Fagin, "Regulating Speech Across Borders: Technology vs.
Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 406.
28 Lawrence B. Solum and Minn Chung, "The Layers Principle: Internet
Architecture and the Law," 79 Notre Dame L. Rev. 815 (2004).
29 Ibid. at 827-829.
30 Lawrence Lessig, Code: And Other Laws of Cyberspace (1999) at 297.
31 Ibid. at 6.
32 The exact meaning of the term "code" and its relationship to
"architecture" is not always clear in Lessig's book. At times, his
use of the term code seems to be the meaning it has amongst computer
programmers, sometimes its meaning seems to be more
metaphorical. "Architecture" seems to refer to the hardware,
software and protocols on which the Internet is run and "code" to the
computer languages and the software and hardware environment that
make up the Internet. The terms do overlap in meaning, but code seems
to be a more fundamental concept, underpinning the architecture of
the Internet.
33 Ibid. at 224.
34 Licra and UEJF v. Yahoo! Inc and Yahoo France. Order of May 22,
2000 by the Superior Court of Paris. <http://www.lapres.net/yahen.html>
35 UEJF and Licra v. Yahoo! Inc and Yahoo France. Order of November
20, 2000 by the Superior Court of Paris.
36 Matthew Fagin, "Regulating Speech Across Borders: Technology vs.
Values," 9 Mich. Telecomm. Tech. L. Rev. 395 (2003) at 412-415.
37 Lawrence Lessig, Code: And Other Laws of Cyberspace (1999) at 5.
38 Ibid. at 43.
39 Lawrence B. Solum and Minn Chung, "The Layers Principle: Internet
Architecture and the Law," 79 Notre Dame L. Rev. 815 (2004) at 829.
40 Ibid. at 845.
41 Ibid. at 816.
42 Ibid. at 842.
43 Ibid. at 839-840.
44 Ibid. at 838.
45 Ibid. at 852.
46 Ibid. at 829-831.
47 Ibid. at 846.
48 Ibid. at 866.
50 Ibid. at 878-888.
51 Ibid. at 896.
52 Ibid. at 908-909.
53 Ibid. at 920.
54 "Cyberspace Regulation and the Discourse of State Sovereignty,
Developments; the Law of Cyberspace," 112 Harv. L. Rev. 1680
(1999) at 1680-1697.
55 Ibid. at 1683.
56 Ibid. at 1683.
57 Ibid. at 1683.
58 Ibid. at 1684.
59 Ibid. at 1683-1684.
60 Ibid. at 1685.
61 Ibid. at 1686.
62 Joel R. Reidenberg, "The Yahoo Case and the International
Democratization of the Internet," Fordham Law & Economics Research
Paper no. 11 (2001) at 4.
63 "Cyberspace Regulation and the Discourse of State Sovereignty,
Developments; the Law of Cyberspace," 112 Harv. L. Rev. 1680 (1999) at 1687.
64 Ibid. at 1687.
65 326 U.S. 310 (1945).
66 Dennis T. Yokoyama, "You Can't Always Use the Zippo Code: The
Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54
DePaul L. Rev. 1147 (2005) at 1152.
67 Titi Nguyen, "A Survey of Personal Jurisdiction Based on Internet
Activity: A Return to Tradition," 19 Berkeley Tech. L.J. 519 (2004) at 521.
68 Ibid. at 522.
69 937 F. Supp. 161 (D. Conn. 1996).
70 Dennis T. Yokoyama, "You Can't Always Use the Zippo Code: The
Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54
DePaul L. Rev. 1147 (2005) at 1157.
72 937 F. Supp. 161 at 165.
73 Dennis T. Yokoyama, "You Can't Always Use the Zippo Code: The
Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54
DePaul L. Rev. 1147 (2005) at 1157.
74 For example Bensusan Restaurant Corporation v. King, 937 F. Supp.
295 (S.D.N.Y. 1996), aff'd 126 F.3d 25 (2d Cir. 1997), a case in
which a New York City club called "The Blue Note," which owned the
federal trademark in that name, brought a trademark infringement and
dilution action against a club in Missouri with the same name. The
Missouri club had a Web site offering general information, a calendar
of events and ticket information; however, tickets could not be
bought online, nor were they sent through the mail. The court had to
rule whether the presence of this Web site constituted purposeful
availment and justified haling the Missouri club owner into a New
York courtroom. Given that the Web site was merely passive, the court
ruled that it did not cause any infringing activity in New York.
75 952 F. Supp. 1119 (W.D. Pa. 1997).
76 Ibid. at 1124.
81 Ibid. at 1126-1127.
82 See: Michael A. Geist, "Is there a there there? Toward Greater
Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345
(2001) at footnote 114.
83 Ibid. at 1370.
84 Ibid. at 1377-1379.
85 Titi Nguyen, "A Survey of Personal Jurisdiction Based on Internet
Activity: A Return to Tradition," 19 Berkeley Tech. L.J. 519 (2004) at 529-530.
86 Ibid. at 538.
87 Dennis T. Yokoyama, "You Can't Always Use the Zippo Code: The
Fallacy of a Uniform Theory of Internet Personal Jurisdiction," 54
DePaul L. Rev. 1147 (2005) at 1176.
88 Michael A. Geist, "Is there a there there? Toward Greater
Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345
(2001) at 1379.
89 Ibid. at 1379-1380.
90 Ibid. at 1371.
91 465 U.S. 783 (1984) at 789.
92 Titi Nguyen, "A Survey of Personal Jurisdiction Based on Internet
Activity: A Return to Tradition," 19 Berkeley Tech. L.J. 519 (2004) at 531.
93 465 U.S. 770 (1984).
94 Patrick J. Borchers, "Personal Jurisdiction in the Internet Age:
Internet Libel: The Consequences of a Non-Rule Approach to Personal
Jurisdiction," 98 Nw. U.L. Rev. 473 (2004) at 478.
95 646 N.W.2d 527 (Minn. 2002).
96 Ibid. at 530.
97 Ibid. at 529.
98 Ibid at 531.
99 Ibid at 533.
100 Ibid. at 535.
101 315 F.3d 256 (4th Cir. 2002).
102 Ibid. at 259.
103 Ibid at 259-260.
104 Ibid. at 263.
105 105 F. Supp. 2d 746 (E.D. Mich. 2000) at 750.
107 Ibid. at 751.
108 Michael A. Geist, "Is there a there there? Toward Greater
Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345
(2001) at 1386.
109 Ibid. at 1386-1392.
110 Ibid. at 1393-1402.
111 Ibid. at 1402-1404.
112 People v. World Interactive Gaming 714 N.Y.S.2d 844 (Sup. Ct. 1999).
113 Michael A. Geist, "Is there a there there? Toward Greater
Certainty for Internet Jurisdiction," 16 Berkeley Tech. L.J. 1345
(2001) at 1392.
114 Julie L. Henn, "Targeting Transnational Internet Content
Regulation," 21 B.U. Int'l L.J. 157 (2003).
115 Ibid. at 174-175.
116 Ibid. at 175.
119 Jonathan Zittrain, "Internet Points of Control," 44 B.C. L. Rev.
653 (2003).
120 Ibid. at 659.
122 Ibid. at 664.
123 Ibid. at 669.
124 Ibid. at 672.
125 ISPs can also be neither, if they merely transfer and reroute packets.
126 Center for Democracy and Technology v. Pappert, 337 F. Supp. 2d
606 (E.D. Pa. 2004) at 610.
127 Ibid. at 611.
128 The source, or content provider, can be located in a different
forum than the source ISP, for example if a German citizen uploads
content to an American server from his home computer.