
Cybersecurity must learn from and support advocates tackling online gender-based violence

Julia Slupska, Toby Shulruff, Tara Hairston

Technology-enabled abuse – like stalking, harassment, and image-based sexual abuse (regrettably known as ‘revenge porn’) – has too often been excluded from threat models used in cybersecurity research and industry. As feminist international politics and security theorists have long argued, by treating gendered vulnerabilities as “private matters” we exclude these security problems from the public and political domain. 

The lack of attention to gender in cybersecurity has contributed to an environment in which a certain level of harmful misuse of technology is accepted, or at least tolerated. Although they often dominate public debate, emerging technologies such as artificial intelligence and the Internet of Things are not the only technologies with gendered impacts. More accessible and widespread technologies, such as SMS messaging, social media, and features designed to locate lost devices, are more commonly used to perpetrate tech abuse. Global positioning system (GPS) receivers in phones and automobiles, for example, can be used to stalk, harass, and abuse survivors of gender-based violence (GBV).

As UNIDIR’s recent report “Gender Approaches to Cybersecurity” has shown, a gendered approach to cybersecurity can better mitigate or prevent these harms, and create environments where abuse is not considered an inevitable cost of online participation. To advance gendered cybersecurity practices, this commentary outlines further areas for action to be pursued by States, the tech industry and the third sector (i.e. civil society, voluntary and non-profit organisations working on gender-based violence). Working in collaboration, these actors can more effectively tackle online GBV and abuse, providing a more robust and inclusive range of options for justice for victims and affected communities. 

Re-examining cyber threats, response and justice through a gender lens

Cybersecurity threat models tend to assume that the attacker is a network-based hacker or thief external to the household. But a distinct set of risks and challenges emerges when the attacker is an abusive partner or family member who may have physical access to devices, the ability to guess or coerce passwords, or other means of circumventing cyber hygiene measures. Security efforts often do not account for ‘abusability’: the ways technology can be used exactly as designed, but for the purpose of perpetrating violence or abuse. Previous research has highlighted this gap, along with the need to include marginalised groups in threat modelling (a method known as participatory threat modelling). Beyond that, it is crucial to broaden our concept of what counts as cybersecurity to include harms to people marginalised by surveillance and abuse.

Inattention to gender considerations in cybersecurity has contributed to a response system in which users are left to fend for themselves and routinely encounter barriers to accessing evidence, reporting abuse, and re-establishing control of compromised accounts and devices. Human interaction is typically replaced with automated systems: virus scans, FAQs, web forms, and the like. Cybersecurity response systems can also promote hierarchical, expert-driven procedures and mechanisms that create distance between cybersecurity professionals and victims. These practices and assumptions place the onus for security squarely on users, in this case victim-survivors of tech abuse. Shifting responsibility to victim-survivors is problematic: it calls to mind discredited “rape myths” in which a person’s choice of clothing or activity is said to invite violence against them.

Furthermore, crimes in cyberspace, and the legal responses to them, remain ill-defined. This lack of definition is not limited to cybercrime. The World Bank notes that 45 countries have no domestic violence legislation, and that of the 155 that do, only 101 define domestic violence to include physical, psychological, sexual, and economic abuse. Whether these differences reflect inherent biases, a need for capacity building, or both, it is important to acknowledge that criminal justice systems can amplify rather than resolve existing gender inequalities.

Black feminist activists and women of colour have long led calls for approaches that move beyond carceral responses rooted in retribution and punishment: restorative justice, which elevates the voice of the victim or survivor, recognises the impact of violence on community members, and allows the perpetrator to more fully understand, and sometimes repair, harm; and transformative justice, which opposes not only the criminal justice system but also reform measures that can serve to further legitimise the existing system of crime control. Such approaches may be particularly valuable for people of colour and gender-nonconforming folk, who have often been failed or even victimised by state systems.

How to address online GBV and abuse

One of the goals of the Generation Equality Forum is to prevent and eliminate online and technology-facilitated GBV. Ahead of the Forum’s meeting in Paris, we outline a set of actions that States, industry, and the third sector can pursue to tackle online GBV and abuse.

States can shape the market by channelling procurement towards technologies that take intimate partner violence into account in their threat models. This will also require engaging the tech sector to expand how industry defines security- and privacy-by-design to include interpersonal threat modelling, and incentivising technologists and tech companies to build safer products and services.

States can also adopt legislation establishing a duty of care that makes companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services (see, for example, the forthcoming UK Online Safety Bill). These online harms must include GBV such as stalking, harassment, and image-based sexual abuse.

As part of multi-stakeholder efforts, States can incentivise industry to build and deploy safer technology informed by participatory threat modelling and by consultation with marginalised and vulnerable populations, civil society, and academia. Resources like the guide to Coercive Control Resistant Design highlight ways in which designers can account for, and mitigate, abusive personas and use cases. Leveraging frameworks already familiar to industry professionals, such as design principles, maturity models, holistic anti-abuse policies, assessment tools, skills-building regimes, and standards, could help focus attention on countering abusability in technology.

When working with survivors of tech abuse, States and industry can learn from the third sector, which has developed informal approaches to supporting survivors despite gaps in security provision by police and formal cybersecurity institutions. Victim advocates are often doing cybersecurity work, though it is rarely recognised as such, largely without support from the broader security field. Their work could be strengthened through partnerships with cybersecurity practitioners; the Coalition Against Stalkerware is one example. This informal cybersecurity work within NGOs should be supported with funding and other concrete resources from States and the private sector.

For such collaboration to work, cybersecurity practitioners must not approach NGOs and survivors with a benevolent but patronising one-way flow of “expertise”. This flow should be two-way, with formal cybersecurity practitioners and institutions recognising the lived experience of survivors and of those who support them at NGOs. Tech practitioners must also be open to learning from the GBV sector’s creative approaches, deeper knowledge of GBV threat models, and experience working with limited resources and high-stakes risks.

In the long term, cybersecurity must evolve through cross-pollination with the expertise of NGOs like the Safety Net project in the US or Chayn internationally, whose perspectives could deepen understanding of users, expand threat models, and improve the usability of response tools. This work could extend to collaborating on tools that inform the design and development of future products and systems, building in safety-by-design rather than adding it as an afterthought.

Authors

  • Julia Slupska, Oxford Internet Institute, University of Oxford

Julia Slupska is a doctoral student at the Centre for Doctoral Training in Cybersecurity and the Oxford Internet Institute. Her research focuses on technologically mediated abuse, such as image-based sexual abuse (‘revenge porn’) and stalking, as well as emotion, care, and metaphors in cybersecurity.

  • Toby Shulruff, School for the Future of Innovation in Society, Arizona State University, and Safety Net, National Network to End Domestic Violence

Toby Shulruff is pursuing a Master of Science in Public Interest Technology at Arizona State University, and is Senior Technology Safety Specialist at the Safety Net Project of the National Network to End Domestic Violence (US).

  • Tara Hairston, Kaspersky, Coalition Against Stalkerware

Tara Hairston serves as the Executive Director of the Coalition Against Stalkerware, a group of 40+ organizations committed to protecting individuals from stalkerware and other forms of tech-mediated abuse. She also leads public affairs and government relations work for Kaspersky, a cybersecurity vendor, in Canada and the United States.


Note from the authors 

Although this is a global problem, with experiences of technology-enabled abuse reported throughout the world, our experience as advocates and researchers working to end such abuse is limited primarily to the US and the UK. Research on technology-enabled abuse has focused disproportionately on Anglo-American contexts, and it is therefore critical to include a wider variety of perspectives in the global governance of this issue.
 
