Abstract
The regulation of sexual exploitation on social media is a pressing issue that has been addressed by government legislation. However, laws such as FOSTA-SESTA have inadvertently restricted consensual expressions of sexuality as well. Through four social media case studies, this paper investigates the ways in which marginalized groups have been impacted by changing censorship guidelines on social media, and how content moderation methods can be made inclusive of these groups. I emphasize the qualitative perspectives of sex workers and queer creators in these case studies, in addition to my own experiences as a content moderation and social media management intern for Lips.social. This paper concludes with potential solutions to current biases in social media content moderation.
Faculty Advisor/Mentor
Dr. Michael Lapke
Document Type
Paper
Disciplines
Applied Ethics | Computer Sciences | Gender and Sexuality | Quantitative, Qualitative, Comparative, and Historical Methodologies
DOI
10.25776/jftv-sx08
Publication Date
12-1-2023
Lip(s) Service: A Socioethical Overview of Social Media Platforms’ Censorship Policies Regarding Consensual Sexual Content