Technology-facilitated violence (TFV) is any form of violence that uses technology to cause harm. It includes behaviors such as harassment, image-based sexual abuse, hate speech, sexual extortion, defamation, impersonation, doxing, and stalking in digital spaces, including on social media websites or through text messages. As with physical violence and abuse, members of equality-seeking groups are often targeted with TFV because of underlying systemic discrimination that makes them vulnerable to violence. Trans and cis women and girls, as well as agender, non-binary, and gender-variant individuals, are subject to technology-facilitated gender-based violence and face online attacks related to their gender and sexuality. Attacks often become more frequent and severe for those who are also racialized, Indigenous, disabled, or have other intersecting social locations. TFV significantly limits the freedoms of those who are attacked. It can make them feel unsafe in digital and physical spaces, harm their reputations and their ability to find work, and lead to the reduction or silencing of their voices in digital spaces.
Digital platforms are central spaces for maintaining relationships, employment, and community. They are the modern-day public square where political and social ideas are debated. Because digital platforms play such a central role in our everyday lives, they are often the sites where TFV occurs. As this problem proliferates, governments and social media companies have struggled to implement adequate solutions, and there is a substantial need for better responses to TFV from social media platforms, including through government regulation. For years, feminists have been making recommendations to governments and social media companies on how those institutions could respond to and prevent TFV. These feminist models and proposals could and should be used to shape the future of social media content regulation and government responses to TFV. Multiple feminist digital rights organizations worldwide are working on this issue, and governments and social media companies could draw direct suggestions for change from their work. This essay reviews the work of several Canadian feminists who have recently examined and proposed regulatory changes that could help respond to and prevent TFV.
Better Social Media Content Moderation
Social media companies have not developed adequate solutions to address TFV on their platforms. This has been documented for many years but was recently highlighted once again in the Wall Street Journal’s investigation, The Facebook Files. This investigation showed that Facebook is not only aware of some of the harmful effects its platforms have on women and girls, but also that it maintained a specific system exempting certain high-profile users from the company’s content moderation policies, which are supposed to apply to everyone.
Facebook’s guidelines, as published on its website, state:
We remove intimate images or videos that were shared without the permission of the person pictured, or that depict incidents of sexual violence. Additionally, we remove any images, videos, or other content that threaten to share intimate images without permission of the person pictured, or that depict incidents of sexual violence. In most cases, we also disable the account that shared or threatened to share the content on Facebook or Instagram.
Yet, in a case reported in the Facebook Files, a famous soccer star was able to post nude images of a woman who had accused him of sexual assault, along with her name and personal information, on Facebook’s platforms without immediate consequence. Because the normal content moderation rules did not apply to special users like this famous soccer player, the images were not taken down. Instead, they remained up for some time and were viewed by around 56 million people on Facebook and Instagram. The woman was doxed, had her nude images shared around the world, and faced substantial harassment online following the incident.
As mentioned above, Facebook’s content moderation guidelines require the company to delete sexual photos shared without consent and, in some cases, disable the accounts of the people who posted them. That did not happen in the case detailed above because of Facebook’s special treatment of some of its most valuable users, demonstrating a preference for retaining users with massive followings over the safety of its other users.
The lack of response to TFV is clearly not limited to what was reported by the Wall Street Journal. For years, advocates have been pointing out the prevalence of TFV on social media websites and the inadequate responses to this harmful content. For example, in 2018, Amnesty International published its report #ToxicTwitter, which documented the staggering number of cases of TFV against women and girls on Twitter, particularly against those who are racialized, LGBTQ+, or disabled. The report also highlighted the lack of a meaningful response by Twitter to address this harmful content. On multiple occasions, the former CEO of Twitter, Jack Dorsey, has acknowledged that his platform has failed to deal systematically with hate and abuse. #ToxicTwitter is just one of many reports examining a particular social media company, but the issue of TFV is pervasive across most social media sites, and to date, not a single company has come up with a comprehensive, workable solution to TFV on its platform.
Social media companies play an important role in moderating TFV and should have relevant content moderation rules that apply to all users of their platforms. The problem is that even when companies do have content moderation policies on the books, how they implement those rules remains opaque, and many users continue to complain about their ineffectiveness. There is a lack of understanding of how decisions about harmful content are being made, and there is little to no accountability for these companies when it comes to ensuring that they implement their own rules and standards fairly and consistently. With social media companies playing such a powerful role in our society, and their use leading to demonstrable harms to many of their users, there is a need to regulate some of their practices.
In her recent report for the Women’s Legal Education and Action Fund (LEAF), “Deplatforming Misogyny”, Cynthia Khoo developed a proposal for a feminist-informed regulatory response to TFV. Her report examines the role digital platforms play in the spread of TFV and the ways in which social media companies could be held accountable for TFV through regulation and laws in Canada. She made 14 recommendations for law reform in relation to platform liability and accountability.
The recommendations included four central points for developing a feminist governmental response. First, any legislative reforms must center human rights, substantive equality, and intersectionality to ensure that new legislation serves the needs of equality-seeking groups. Substantive equality recognizes that individuals and groups may have particular needs that must be addressed to achieve equality of outcome, in contrast to formal equality, which assumes all people and groups should be treated exactly the same way. Second, legislative reforms should include the development of a government body that would provide remedies to victims/survivors of TFV, such as assistance in removing harmful content, along with funding and promoting research and education on TFV. Third, it should be legally obligatory for platform companies to have accessible reporting mechanisms that meaningfully respond to victims/survivors’ complaints of TFV on their platforms, and to publish transparency reports on their content moderation practices. Fourth, governments must fund education resources and training programs on TFV, including developing public information campaigns and funding frontline victim services organizations so they have the capacity to respond to clients impacted by TFV. This final recommendation will help address the longer-term prevention of TFV and help shift social norms around this behavior.
Social media companies’ content moderation practices could also be improved with government regulation, such as those recommended by LEAF. Ensuring that social media companies provide transparent reports of their content moderation practices will allow users to better understand their policies and identify where there is a need to advocate for change by the companies whose products they are using. If understandable and meaningful content moderation practices were in place and implemented effectively, many victims/survivors could access the solution they most desire through the social media company: having the harmful content taken down.
Government Bodies that Help with TFV
As proposed by LEAF, governments should provide direct supports to victims/survivors of TFV. A statutorily developed body to address TFV can provide accessible options for victims/survivors. One of the challenges victims/survivors of TFV face is that they need immediate support to manage their situation, but there are few victim-services organizations with an understanding of TFV. There are criminal and civil laws that make some forms of TFV illegal but, as will be discussed below, there are barriers within the criminal justice and civil systems that may make these legal avenues undesirable for some victims/survivors. Further, traditional spaces of support for victims/survivors of gender-based violence, such as transition houses and victim service centres, are often under-resourced and lack expertise in the technological aspects of gender-based violence, such as knowing how to detect stalkerware on a phone.
As proposed by LEAF, one solution to this problem is for governments to develop and fund statutorily-based bodies that provide the immediate support victims/survivors need and share information about TFV with community organizations whose clients may be impacted by it. These bodies can work on longer-term societal measures to prevent TFV generally, such as funding and disseminating research and information about TFV. They can also take much of the burden off the shoulders of victims/survivors seeking help and can provide the technical and emotional support that many of them desire. In addition, governments should fund anti-violence organizations that already provide community-based support. This will help create another source of support for victims/survivors of TFV. This is particularly important for members of marginalized communities who may trust community-based organizations more than governmental ones.
For some victims/survivors of TFV, it is important to have the criminal justice system recognize the harms they have experienced and hold the perpetrator accountable for their actions. If an incident of TFV meets a criminal threshold, a victim/survivor can report it to the police. For example, in Canada there are multiple criminal offenses applicable to TFV, including the non-consensual distribution of intimate images, harassment, threats, and mischief in relation to data. However, victims/survivors face several challenges when approaching police with these complaints. First, the justice system has a disappointing track record when it comes to addressing sexual and gender-based crimes. Research has shown that complaints of sexual crimes are not always taken as seriously as they should be by the police. This negative trend has continued with TFV: many victims report that their complaints are minimized or discounted by the police, or that the police lack training on how to handle these types of complaints. Second, even when criminal complaints are taken seriously, the criminal law system is slow and may not provide the remedies that victims/survivors desire. For victims/survivors who are primarily interested in getting content taken down quickly, the criminal law system may not meet their needs, as a decision by the courts may take months or even years.
The civil system is also an option for some victims/survivors who wish to be compensated for their losses and/or are seeking a civil remedy, such as an injunction to get the content removed from a social media platform. In Canada, there are a variety of civil options for a victim/survivor of TFV. They can sue under causes of action such as defamation, copyright infringement, intentional infliction of mental suffering, and multiple privacy torts. The civil system can order an injunction to have the content taken down but, like the criminal system, this may take some time. One of the biggest barriers to pursuing a civil response is cost. A civil case may not be a viable option for many victims/survivors because of the cost of litigation, and even if they are awarded damages for their losses, it can be difficult to collect those damages if the person who harmed them lacks the means to pay.
Despite these challenges, many victims/survivors will want to seek a formal legal remedy. If they do so, their complaints should be taken seriously and they should be treated with respect by the justice system.
It is important to ensure that there are extra-judicial remedies available for victims/survivors who do not wish to seek a solution through the justice system. This can include government bodies and civil society organizations that address TFV. In several countries, governments have already created governmental bodies or funded civil society organizations that can help victims/survivors of TFV not only navigate the legal system but also provide non-legal support. Furthermore, such organizations can also inform and educate people about TFV.
The benefits of a diversity of responses to TFV were noted in Alexa Dodge’s examination of CyberScan, a government body in Nova Scotia, Canada that helps victims/survivors of cyberbullying and non-consensual image sharing. Part of CyberScan’s mandate is to assist victims/survivors who wish to report their abuse to the police or file a civil claim. After reviewing CyberScan’s materials and interviewing several of its staff, Dodge found that most victims/survivors reached out to CyberScan for emotional and technical support, rather than support with a formal legal response. Having their experiences validated by the CyberScan staff was an important aspect of the support some victims/survivors sought. The staff often provided informal and technical support on how to remove content and manage the situation. After receiving that support, most people did not want to pursue additional legal avenues. For those who did want to pursue a legal response, the CyberScan staff could provide information about the legal process. CyberScan’s practices are a move in the right direction; however, Dodge made a series of helpful recommendations on how CyberScan could improve its services, including its restorative justice practices and educational initiatives.
We need more such bodies worldwide. In her article, “The Potential of Centralized and Statutorily Empowered Bodies to Advance a Survivor-Centred Approach to Technology-Facilitated Violence Against Women”, Pam Hrick details the ways in which governmental bodies can provide a meaningful and survivor-centred approach to addressing TFV. Hrick argues that adopting a survivor-centred model allows victims/survivors to choose how they want to address the TFV they are experiencing. She identifies three central ways that these bodies can fulfil a survivor-centred response. First, by providing direct services to people experiencing TFV. Second, by delivering education and information on TFV that can shape social norms around healthy and unhealthy behaviors online. Third, by conducting research on TFV to support evidence-based education and responses.
In her article, Hrick reviewed New Zealand’s Netsafe, the eSafety Commissioner in Australia, Nova Scotia’s CyberScan unit in Canada, and the Canadian Centre for Child Protection in Manitoba, Canada. While none of these bodies is a perfect example of a survivor-centred approach, they are building on a useful model. Hrick notes that these bodies approach TFV “through education, research, and direct involvement in resolving instances of violence.” In New Zealand, the government tasked a non-profit agency, Netsafe, with supporting people experiencing TFV. Netsafe provides direct support to complainants through a free helpline and an online service for reporting TFV, using a variety of techniques including advice, negotiation, mediation, and persuasion, alongside legal remedies. In Australia, the eSafety Commissioner is a government body that has relationships with large social media companies like Facebook and Twitter, which can help facilitate the removal of harmful content involving cyberbullying or the non-consensual distribution of intimate images. The eSafety Commissioner has reported a 90% success rate in getting non-consensually shared nude images removed from the internet. It also conducts research and disseminates education and research on online safety to the public. In Canada, CyberScan provides public educational materials on online safety and supports victims/survivors seeking either a legal remedy or an informal response. Informal responses advertised include voluntary advice, negotiation, mediation, and restorative justice approaches. The Canadian Centre for Child Protection has a tip line for child sexual exploitation and provides some intervention and educational material on image-based abuse.
In addition to providing direct support, many of these organizations conduct research and develop information campaigns on TFV. The eSafety Commissioner has established itself as the gold standard for such bodies. It produces a wide variety of research that helps educate the public and informs its own responses to TFV. The educational and research work undertaken by these bodies is essential for longer-term societal change. Ultimately, governments should work towards shaping social norms around healthy and unhealthy behaviors online, and encourage positive and respectful engagement in digital spaces. To do this, people of all ages will need to be educated on what TFV is and why it is important to stop it. Research also allows governments to better understand how TFV is manifesting in digital spaces and what responses would be most meaningful for victims/survivors.
A feminist response to TFV requires a dynamic and multi-sectoral approach. It can include legal responses, such as criminal or civil laws, access to well-funded governmental and civil society organizations who can help victims/survivors with the violence they are experiencing, substantial education and research campaigns, and regulating social media companies’ content moderation practices, among other things. Victims/survivors of TFV need social media companies to better respond to the TFV that happens on their platforms. To date, these companies have not managed to find adequate solutions. Consequently, it is time for governments to regulate some aspects of these companies to ensure that their digital platforms are free of violence.
More importantly, those who have been harmed by TFV need more direct support options. Governments should provide funding and training to civil society organizations that are doing anti-violence work but need assistance in managing TFV cases. Additionally, governments should establish or support organizations that can conduct research, develop educational campaigns, and provide direct support to victims/survivors of TFV. Governments and social media companies need to take action in order to create a safer, more feminist internet.
This is the sixth piece from our special issue on Feminist Re-imagining of Platform Planet.