Date of Award

Spring 1-1-2013

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

First Advisor

Robert Trager

Second Advisor

Andrew Calabrese

Third Advisor

Shu-Ling Berggreen

Fourth Advisor

Robert Nagel

Fifth Advisor

Peter Simonson

Abstract

Social media are rife with hate speech. A quick glance through the comments section of a racially charged YouTube video demonstrates how pervasive the problem is. Although most major social media companies such as Google, Facebook and Twitter have their own policies regarding whether and what kinds of hate speech are permitted on their sites, the policies are often inconsistently applied and can be difficult for users to understand.

Many of the decisions made by the content removal teams at these organizations are not nearly as speech-protective as the First Amendment and U.S. Supreme Court precedent on the subject would mandate. Thus, the current situation gives social media companies unprecedented power to control what videos, text, images, etc., users may or may not post or access on those sites.

In an effort to identify solutions for curtailing hate speech in social media, this dissertation will explore the scope and nature of the problem of hate speech in social media today, using YouTube as an exemplar. A review of arguments for and against regulating hate speech online is then presented, along with an overview of current U.S. hate speech and Internet regulations and relevant jurisprudence. The approaches proposed by other legal and communication scholars about whether and how to limit hate speech online are examined and evaluated. Finally, a solution that seeks to minimize hate speech on social media Web sites, while still respecting the protections established by the First Amendment, is proposed.

Specifically, a recommendation is made to encourage self-regulation on the part of social media companies, which involves a move from a “.com” generic top-level domain to one called “.social.” To be part of the consortium of companies included in the “.social” domain, which will ideally include YouTube, Facebook, Twitter, Instagram and others, an organization must abide by industry-developed, uniform rules regarding what kinds of hate speech content are and are not permitted on these sites. A working group composed of social media decision-makers will develop the policy, and staff members of the Federal Communications Commission will facilitate the process. Ideally, the resulting approach will better reflect precedent on the issue, which hesitates to place restrictions on expression, regardless of how offensive it may be.
