Tuesday saw the release of a report by a Northumbria University researcher outlining policy proposals to address disparities in platform governance around social media's use of ‘flagging’ and ‘de-platforming’.
Dr. Carolina Are, whose research focuses on the intersection of online abuse and censorship, wrote the report, Co-designing platform governance policies, in collaboration with the World Wide Web Foundation and Superbloom, as part of a larger research project she is leading at the Centre for Digital Citizens (CDC).
As a user with personal experience of online harassment and de-platforming, Dr. Are is interested in redesigning platform regulations related to de-platforming, also known as account deletion, and malicious flagging, the abuse of reporting mechanisms to silence certain accounts.
According to her research, unscrupulous people can ‘ban’ or de-platform other social media users with whom they disagree by abusing the flagging or reporting tools on social media sites. This has disproportionately affected marginalised users, including pole dancers, journalists, activists, and LGBTQIA+ (lesbian, gay, bisexual, transgender, queer, intersex, or asexual) and BIPOC (black, indigenous, and other people of colour) content creators.
The report’s central thesis is that content moderation frequently overlooks the human experience and lacks the necessary compassion for users who are subjected to abuse, censorship, loss of livelihood, network disconnection, and emotional pain. It claims that users who are de-platformed from social media often lose access to their communities, networks, career opportunities, information, and education; research has shown that this harms users’ mental health and welfare.
According to Dr. Are, an Innovation Fellow in the Psychology Department at Northumbria: "Too often in my research and personal interactions with social media companies, the time, resources, and attention allocated to engagement with the stakeholders who are directly impacted by technology are awarded sparingly."
"The idea underpinning this report is that content moderation often fails to take the human experience into account in order to prioritise speed and platform interests, lacking the necessary empathy for users who are experiencing abuse, censorship, loss of livelihood and network, as well as emotional distress."
"As a result, this report is a free resource both for users, to feel seen in a governance process that often erases them, and, crucially, for platform workers, to stop avoiding stakeholder engagement in the drafting of their policies."
The report’s recommendations were compiled through a series of workshops organised in accordance with the World Wide Web Foundation’s Tech Policy Design Labs (TPDLs) playbook. The report was co-designed with 45 end-users, who Dr. Are claims are frequently disregarded when drafting the rules governing the spaces they depend on for their social and professional lives.
Because participants found that current legislation falls short of protecting them on social media, they pushed for radical transparency and a duty of care from platform conglomerates, demanding information, workers’ rights, and compensation when platforms fail to protect their users from censorship and/or abuse.
The paper therefore offers social media platforms user-centred, research-based recommendations to improve the design and functionality of their flagging and appeals features.
The workshops were organised by Dr. Henry Collingham, an Ageless Citizen Innovation Fellow and product designer, and funded by the Engineering and Physical Sciences Research Council and the Policy Support Fund at Northumbria University.