It may, at first glance, seem paradoxical to suggest that counter-speech is likely to be less
effective in an environment of informational abundance. Yet that abundance includes an
abundance of hate speech, whose pervasiveness and permanence are assured by the
internet’s archiving, hyperlinking and searching capabilities.
Whether overall informational abundance will drown out the abundance of hate speech, or dilute its impact,
is too broad a question to answer in abstracto.
Another relevant consideration is that enhanced individual selection and filtering capacities
allow individuals to choose (or “pull”) their own content instead of having particular content
“pushed” towards them by general intermediaries, as the institutionalized media have
traditionally done. These capacities increase the ability of individuals to avoid exposure to
particular types of content. The broader consequence is that they also reduce the
chances of conflicting opinions meeting each other head-on in an online environment.
Such individual selection and filtering capacities can affect communicative practices at a
societal level in different ways. Growing reliance on these capacities can lead to the creation
of a multitude of “public sphericules” instead of a unified public sphere, and to the
proliferation of communities of interest in which ideological insulation and intensification
take place. The online forums in which particular types of information, and especially
particular viewpoints, are reinforced through amplification have been described as “online echo
chambers”.
As a result of these informational and communicative trends, the likelihood of intergroup
engagement and interaction in cyberspace cannot simply be assumed; its potential is
significantly reduced compared with the offline, real-world context. Granted, “alternative
(mini-) spheres” can prove vitally important for intragroup communication, for purveyors
of hate and minority groups alike. Some empirical research even suggests that deliberation in
online echo chambers does not necessarily lead to more entrenched or extreme positions,
and that intra-group deliberation can benefit inter-group deliberation.
Nevertheless, in order for more speech or counter-speech strategies to have any prospect of fostering tolerance,
there must be, as a minimum, communicative intent and actual communicative contact.
The failure of internet-based expression to achieve linkage to “the general public domain”
could lead to communication being predominantly spatial and insufficiently social. Online
hate speech has real-life consequences, as explained above, so it is crucial for online
counter-speech also to realize its potential for offline effects. The promotion of targeted
educational, media literacy (generally understood as “the ability to access, analyze, evaluate,
Elizabeth Phillips Marsh, “Purveyors of Hate on the Internet: Are We Ready for Hate Spam?”, op. cit., at 391.
For a general discussion of selection and filtering issues concerning the Internet, see: Jonathan Zittrain, “A History of Online Gatekeeping”, 19(2) Harvard Journal of Law and Technology (2006), 253-298.
See further: Todd Gitlin, “Public Sphere or Public Sphericules?”, in Tamar Liebes and James Curran, Eds., Media, Ritual, Identity (London, Routledge, 1998), pp. 168-175.
See further: Cass R. Sunstein, Republic.com 2.0 (Princeton, NJ, Princeton University Press, 2007).
Donald R. Browne, Ethnic Minorities, Electronic Media and the Public Sphere: A Comparative Approach (Cresskill, New Jersey, Hampton Press, Inc., 2005), p. 11. For a broad discussion of relevant issues, see: John Downing and Charles Husband, Representing ‘Race’: Racisms, Ethnicities and Media (London, SAGE Publications, 2005), esp. Chapter 9, “The Multi-Ethnic Public Sphere and Differentiated Citizenship”.
See, for example: Cass R. Sunstein, “Ideological Amplification”, 14(2) Constellations (2007), 273-279; Cass R. Sunstein, Why Groups Go to Extremes (Washington, D.C., The AEI Press, 2008).
Myria Georgiou and Eugenia Siapera, “Introduction: Revising multiculturalism”, 2(3) International Journal of Media and Cultural Politics (2006), 243-247, at 246.
See further: Alexander Tsesis, “Hate in Cyberspace: Regulating Hate Speech on the Internet”, 38 San Diego Law Review (2001), 817-874, at 836 et seq.