
Malign Creativity: How Gender, Sex, and Lies are Weaponized Against Women Online


January 2021

Executive Summary

This report strives to build awareness of the direct and indirect impacts of gendered and sexualized disinformation on women in public life, as well as its corresponding impacts on national security and democratic participation. In an analysis of online conversations about 13 female politicians across six social media platforms, totaling more than 336,000 pieces of abusive content shared by more than 190,000 users over a two-month period, the report defines, quantifies, and evaluates the use of online gendered and sexualized disinformation campaigns against women in politics and beyond. It also draws on three in-depth interviews and two focus groups to illustrate the impacts gendered abuse and disinformation have on women’s daily lives.
Key conclusions:

  • Gendered and sexualized disinformation is a phenomenon distinct from broad-based gendered abuse and should be defined as such to allow social media platforms to develop effective responses. The research team defines it as “a subset of online gendered abuse that uses false or misleading gender and sex-based narratives against women, often with some degree of coordination, aimed at deterring women from participating in the public sphere. It combines three defining characteristics of online disinformation: falsity, malign intent, and coordination.”
  • Malign creativity—the use of coded language; iterative, context-based visual and textual memes; and other tactics to avoid detection on social media platforms—is the greatest obstacle to detecting and enforcing against online gendered abuse and disinformation.
  • Gendered abuse and disinformation are widespread.
    • Gendered abuse affected 12 of the 13 research subjects, and nine of the 13 were targeted with gendered disinformation narratives; these narratives were racist, transphobic, or sexual in nature. The overwhelming majority of recorded keywords relating to abuse and disinformation appeared on Twitter and were directed toward then-Senator, now Vice President Kamala Harris, who accounted for 78% of all recorded instances.
    • Sexual narratives were by far the most common, accounting for 31% of the total data collected; the majority of these targeted Vice President Kamala Harris. Transphobic and racist narratives accounted for only 1.6% and 0.8%, respectively. While these shares appear low, both types of disinformation narrative were relatively widespread, affecting a number of different research subjects.
    • Online gendered abuse and disinformation is often intersectional in nature, with abusers engaging with both sex- and race-based narratives and compounding the threat for women of color.
  • Online gendered abuse and disinformation is a national security issue. In interviews with women targeted by Russian, Iranian, or Chinese state media, the research team found that gendered tropes formed the basis of disinformation and smear campaigns against them.
    • This presents a democratic and national security challenge: as adversaries attempt to exploit widespread misogyny, women may be less likely to participate in public life.
  • A number of challenges complicate detecting and responding to online gendered and sexualized abuse and disinformation: 
    • Malign creativity is perhaps the greatest obstacle to detecting, challenging, and denormalizing online abuse because it is less likely to trigger automated detection and often requires moderate-to-deep situational knowledge to understand.
    • Platform policies lack a coherent definition of “targeted harassment,” meaning much of the abuse women face does not violate platforms’ terms of service and abusers can continue their activities without facing consequences. There is also a lack of intersectional expertise in content moderation, which results in abuse toward women, people of color (POC), and other marginalized communities going unaddressed.
    • Targets bear the onus of detection and reporting: managing an onslaught of abuse on social media requires time to block, report, and mute abusers. These burdens are often discounted, yet they affect targets’ daily lives offline.
  • Social media platforms, lawmakers, and employers have largely ignored this threat to democracy and national security. In order to mitigate the threat, the researchers recommend:
    • Social media platforms should introduce incident reports that allow women to report multiple abusive posts at once to provide more context and a more holistic view of the abuse they are experiencing.
    • They should also regularly update platform classifiers or keywords to reflect and root out malign creativity, improve automated detection methods, and introduce nudges or friction to discourage users from posting abusive content.
    • Finally, they should create a cross-platform consortium to track and respond to online misogyny, similar to the existing consortia that counter terrorism and extremism.
    • Lawmakers should include content moderation transparency reporting requirements in social media regulation bills to improve understanding of the problem and introduce accountability for women’s online protection.
    • They should create clear standards that prohibit the use of gendered and sexualized insults and disinformation in official business.
    • Critically, US lawmakers should reauthorize the Violence Against Women Act (VAWA) and include provisions against online gender-based harassment.
    • Employers should develop robust support policies for those facing online harassment and abuse, including mental health services and support for employees’ or affiliates’ legal fees and other expenses (such as anti-doxxing service subscriptions).
    • They should also outline clear mechanisms for targets to report such campaigns against them to official communications and human resources staff.

Gendered and sexualized abuse and disinformation online is sprawling, and even assessing it in the broadest terms presents obstacles to detection and analysis. As this report indicates, abusers’ malign creativity means addressing this problem will not be easy. Dedicated, collective efforts by platforms, policymakers, and employers can elevate this issue from its misidentification as a special-interest concern to a question of the right to equal participation in democracy and public life without fear of abuse and harassment.


Science and Technology Innovation Program

The Science and Technology Innovation Program (STIP) serves as the bridge between technologists, policymakers, industry, and global stakeholders.