Do global legal frameworks hold social media platforms accountable for hosting content that incites violence?

dc.contributor.advisor: Lutchman, Salona
dc.contributor.author: Rubenstein, Ruvenna Samantha
dc.date.accessioned: 2026-01-22T12:11:13Z
dc.date.available: 2026-01-22T12:11:13Z
dc.date.issued: 2025
dc.date.updated: 2026-01-22T10:58:11Z
dc.description.abstract: The digital era has witnessed an unprecedented expansion in social media platforms' use, influence, and societal impact.1 Sixty per cent of the global population uses social media, and the daily exchange of messages runs into the billions.2 As of 2023, Facebook had 2.98 billion monthly active users,3 YouTube exceeded 2.68 billion,4 and X (formerly Twitter) had 450 million.5 These platforms give users a largely unrestricted capacity to express views and communicate, often under minimal and inconsistent oversight, while also enabling users to conceal their identities.6 While this technological advancement has opened new avenues for global connectivity and communication, it has also given rise to an alarming increase in the spread of hate speech.7 Over the last twenty years, these online platforms have evolved into environments where hateful narratives and stereotypes flourish unchecked, aimed primarily at marginalized groups and contributing to increased communal violence, ethnic cleansing, and even genocide.8 Major platforms such as Facebook, X, and YouTube have been criticized both for failing to remove harmful content promptly and effectively and for mistakenly removing content that does not breach their policies.9 This research comprehensively investigates the accountability of social media platforms for addressing and mitigating the impact of hate speech that fuels acts of violence in the public sphere. Legal, ethical, and technological perspectives will be considered in examining the responsibilities borne by social media platforms in moderating user-generated content. A detailed analysis of existing national and international legal frameworks governing hate speech and its consequences will be conducted to evaluate whether social media platforms are held accountable for content that incites violence.
A comparative analysis of diverse social media platforms will be integral to this research, considering variations in their policies, enforcement mechanisms, and responsiveness to instances of hate speech that incites violence. Case studies will be examined to illustrate specific incidents, shedding light on the challenges social media platforms face and the repercussions of inadequately addressing hate speech. This research aims to determine the legal responsibilities and accountability of social media platforms for hosting content that incites violence, and to examine whether current measures are sufficient to address this critical issue.
dc.identifier.citation: Rubenstein, R.S. 2025. Do global legal frameworks hold social media platforms accountable for hosting content that incites violence? University of Cape Town, Faculty of Law, Department of Public Law. http://hdl.handle.net/11427/42658
dc.identifier.uri: http://hdl.handle.net/11427/42658
dc.language.iso: en
dc.language.rfc3066: eng
dc.publisher.department: Department of Public Law
dc.publisher.faculty: Faculty of Law
dc.publisher.institution: University of Cape Town
dc.subject: Social media
dc.subject: Violence
dc.subject: Legal
dc.title: Do global legal frameworks hold social media platforms accountable for hosting content that incites violence?
dc.type: Thesis / Dissertation
dc.type.qualificationlevel: Masters
dc.type.qualificationlevel: LLM
Files
Original bundle
Name: thesis_law_2025_rubenstein ruvenna samantha.pdf
Size: 1.5 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.72 KB
Format: Item-specific license agreed upon to submission