This thesis explores the evolving legal environment surrounding speech liability, and the accompanying scope of free speech, on social media platforms. This project began with a class titled "The First Amendment in 2017." That was two years ago, and little did I know that it would spark my interest in the First Amendment and lead me to the topic of this paper. When Alex Jones, an American radio host and prominent conspiracy theorist, was banned by Twitter for violating its abusive behavior policy in early September 2018, the discussion centered on his First Amendment rights; when the Center for Immigration Studies was barred from using the phrase "illegal alien" on Twitter, it likewise turned to the First Amendment to make its case.1 Left and right, situations like these are prompting users to raise First Amendment complaints against technology companies, claiming that the platforms are in fact public forums and that users therefore have the right to post whatever they choose.
While the First Amendment protects speech from restrictions imposed by government actors, it does not necessarily protect speech posted on privately owned platforms. This thesis will therefore encourage industry solutions in which social media companies are both proactive and reactive in how they regulate content on their platforms.
CURRENT LEGAL ENVIRONMENT: INTERMEDIARY LIABILITY AND SECTION 230 OF THE COMMUNICATIONS DECENCY ACT
Strict liability holds an intermediary service responsible for third-party or user content even if the service does not know that the content is illegal, or does not know that the content exists at all. At the other end of the spectrum is broad immunity, exemplified by Section 230 of the Communications Decency Act (CDA) in the United States. As it currently stands, Section 230 of the CDA is the applicable legal standard for online intermediary liability cases in the United States.
Under Section 230 of the CDA, social media platforms cannot be treated as publishers, such as newspapers or broadcasters; they are therefore immune from civil actions for libel or defamation and cannot be held liable for the content their users post on their platforms. Although these platforms are categorized as an "interactive computer service" under the language of the CDA, the law does not necessarily address the legal responsibilities of technology companies adequately.
[Figure 1: Legislative timeline of Section 230 of the CDA]
THE CONTENT-MODERATION PROCESS AND CONCERNS WITH SECTION 230'S BROAD PROTECTION OF INTERNET FREE SPEECH
For example, as mentioned earlier, China's strict liability model allows for greater censorship of individuals' online expression than the broad immunity model in place in the United States. This contrast necessitates a discussion of the dynamics of online speech, how speech and content are moderated online, concerns about censorship, and why Section 230 protects the cultural tradition of American free speech. It is true that, even in the physical world, the free speech clause of the First Amendment does not grant the people an absolute right.
However, that broad protection has backfired in the past by shielding platforms that host harmful speech online. Some argue that Section 230 should be amended to make these companies responsible for eliminating such speech. The opposing side relies on Justice Oliver Wendell Holmes's "marketplace of ideas" theory, grounded in the philosophy of John Stuart Mill, which holds that open competition among ideas will determine which are accepted and which are rejected.
Corynne McSherry, legal director at the Electronic Frontier Foundation, argued on the More Perfect episode "Twitter and the Law" that amending Section 230 would lead to greater censorship of speech and ideas and could be wielded against acceptable and valuable speech. In fact, McSherry said it is companies' own First Amendment right to allow any speech they choose on their platforms. Those who want to change Section 230 often believe that the government and those in power should do more to limit or regulate unwanted speech, such as hate speech and fighting words; those who want Section 230 to remain as it is trust the ability of public opinion to reject hate speech and fighting words without government intervention.
These companies are based in the United States, so their speech standards are likely to reflect the culture of those who created them. Yet cultural standards differ: while images satirizing Thailand's king would be considered political cartoons in the United States and protected under the First Amendment, insulting the king is illegal in Thailand and was punishable by up to 15 years in prison at the time. Content moderation is thus a balancing act that sometimes requires decision makers at these companies to act almost like publishers.
In the spring of 2018, in response to this and similar cases, lawmakers had to decide how to amend Section 230 so that it would no longer protect sites serving as platforms for illegal activities, such as human trafficking.37 The resulting legislation, the FOSTA-SESTA package signed into law that April, carved out an exception to Section 230's immunity for such sites. The most recent Supreme Court ruling dealing with criminal activity online is the decision in Packingham v. North Carolina.
A NEW MEDIUM: DEFINING ONLINE SERVICES' ROLE
Although the hesitancy to impose on new media the same responsibilities borne by traditional media has been conducive to the growth of the Internet and social media, it does not answer the question of what role social media platforms actually play. In Miami Herald Publishing Co. v. Tornillo, the Herald challenged Florida's "right of reply" statute as a violation of the Free Press Clause of the First Amendment, and the Supreme Court ruled in favor of the newspaper, 9-0. But newspapers operate through journalists, while content on social media platforms is created by users' individual decisions to post without editorial approval.
Although the Court in Reno ruled that broadcasting and radio's invasive nature, history of regulation, and scarcity of frequencies did not apply to the Internet, social media has redefined the nature of online platforms enough to revisit the comparison. In Red Lion Broadcasting Co. v. FCC, the issue before the Court was whether the FCC's Fairness Doctrine rules violated the Free Speech Clause of the First Amendment. The language in that case that ties it to today's social media platforms is the Court's observation that "without government control, the medium would be of little use because of the cacophony of competing voices, none of which could be heard clearly and predictably." Though it described the broadcast environment of its time, this phrase almost perfectly fits the environment social media users encounter every time they log on.45
While comparisons can be drawn to the legal protections and responsibilities of traditional media, social media remains a whole new frontier. No existing sector is truly comparable to the social media giants that have taken over the internet. It may therefore be necessary to deviate somewhat from precedent to define them in legal terms. Some would grant them publisher status, while others would treat them as a public utility or even a public forum, but none of these labels captures the entirety of social media's various roles.
Finding solutions to these issues is difficult precisely because no one can agree on what social media should be, and so no one can decide what it should become. The solution will look different for social media, which is unlike any industry that has come before, but looking back at industry solutions in these other areas of media provides some guiding ideas for what tech companies can do.47 Any definition should not rest entirely on previous ones, as this is completely new ground that warrants innovative language.48 It is this culmination of definitions that has led to some ideas on how to define and regulate the social media giants.
MAINTAINING PUBLIC DISCOURSE AND INTERNET FREE SPEECH
Social media serves a different purpose altogether, ultimately combining several functions, the most important of which are hosting and online distribution.49 These changes would allow users to monitor content that could potentially cause problems for them in the future. By redesigning the Facebook platform in this way, Zuckerberg hopes to address the growing list of concerns users have about social media.
King's concern stemmed from the feeling that social media sites such as Facebook are biased and that the moderation process is more likely to remove content from conservative Republicans, a growing complaint among that group. Although he never offered a specific proposal, former President Barack Obama has spoken out against social media's dangerous capacity to spread misinformation when platforms do not regulate such content. Speaking at the Nelson Mandela Annual Lecture in 2018, he said: "We need to guard against the tendency for social media to become solely a platform for spectacle and outrage and disinformation."56
Warner, a former tech executive who worked in Silicon Valley, believes the 1990s Section 230 framework is outdated given the growth social media platforms have since experienced. He said, "by about 2016, more than half of the American people got their news from Facebook, let alone social media as a whole. Suddenly, that framework from the 1990s might not be quite right." Warner believes that changing the doctrine would not "destroy the public space" but would rather update the law to function in the modern world of social media.58
The latter framework, described in "Free Speech in the Algorithmic Society: Big Data, Private Governance, and the New School Speech Regulation" (UC Davis Law Review), is one in which the companies that own social media platforms privately run their spaces as a liaison between the people and the state, while remaining autonomous from the territorial government. Klonick calls social media companies "The New Governors of the Digital Age." Rather than thinking of them as mere corporations, then, she suggests that people view them as mini-governments of their own that administer online activity.62
CONCLUSION: WHERE THIS ISSUE STANDS AND WHERE IT IS HEADED
"Inside the Private Justice Department Meeting That Could Lead to New Investigations into Facebook, Google and Other Tech Giants." The Washington Post, September 25.
"Governance of and by Platforms." Culture Digitally, Sage Handbook of Social Media, 2017, culturedigitally.org/wp-.
"Harmful Content: The Role of Internet Platform Companies in the Fight Against Incitement to Terrorism and Politically Motivated Disinformation." NYU Center for Business and Human Rights, Nov.
"Intermediary Liability in the United States." Berkman Center for Internet and Society at Harvard University, February 16.