Annie

Online But Not Safe: Technology and the Rise of Gender-Based Violence

⚠️ Trigger Warning (TW):
This blog post contains discussions of gender-based violence, including stalking, online harassment, non-consensual pornography, deepfake abuse, coercive control, and sexual violence. Please take care while reading. If you or someone you know is affected, support is available (see the end of the post for resources).


In the digital age, technology has become an inescapable part of our daily lives. We use it to connect, communicate, organise, and resist. It gives us space to raise our voices, challenge injustice, and build communities that transcend geography. But for many women and gender-diverse individuals, the internet is not a safe or liberating space—it’s another arena where violence, misogyny, and control are perpetuated, often without consequence.

Technology-Facilitated Gender-Based Violence (TFGBV) is not simply an extension of traditional abuse. It is its own form of harm: one that is evolving rapidly, embedded in our infrastructure, and often overlooked by institutions that claim to protect us. This post explores what TFGBV looks like, how it impacts real lives, and what must be done to create safer, feminist digital futures.


What Is TFGBV?

Technology-Facilitated Gender-Based Violence refers to any act of harm or abuse that is committed, facilitated, or amplified through digital technologies—most often targeting women, girls, LGBTQ+ people, and other marginalised individuals. It is rooted in the same power dynamics as offline abuse but takes on new forms in a connected world.

Here are the most common, and often overlapping, forms of TFGBV, each with a definition to clarify its scope:

🔍 Cyberstalking

Definition: Persistent and unwanted digital surveillance of a person’s online activity, location, or communication.
Examples: Tracking someone’s movements via GPS apps, monitoring their social media, or repeatedly sending them messages across platforms.

🗨️ Online Harassment

Definition: The repeated sending of abusive, threatening, or degrading messages through digital channels.
Examples: DMs filled with hate speech, rape threats, or body shaming; comment threads flooded with abuse.

📸 Non-Consensual Image Sharing (Revenge Porn)

Definition: Sharing intimate or explicit images of someone without their consent, often to shame or blackmail them.
Examples: A partner posting private photos after a breakup or threatening to send them to employers or family.

🤖 Deepfake Pornography

Definition: AI-generated explicit content that falsely depicts someone, often by digitally inserting their face into pornographic material.
Examples: A woman’s face being edited into porn and shared online as if it were real.

💣 Doxxing

Definition: Publishing someone’s private information (like home address, phone number, or workplace) online without their consent.
Examples: Posting a woman’s home address after she speaks publicly about feminist issues.

📱 Digital Coercion and Blackmail

Definition: Using technology to force or manipulate someone into sending explicit content or complying with demands.
Examples: Threatening to release nudes unless more are sent or coercing someone into staying silent about abuse.

🕵️‍♀️ Spyware and Monitoring Tools

Definition: Apps or software used to monitor someone’s activity without their knowledge or permission.
Examples: A partner secretly installing spyware to read texts, check calls, or track movements.

🧠 Hate Speech and Mass Trolling

Definition: Organised or sustained online abuse campaigns, often targeting women or marginalised people based on gender, race, sexuality, or beliefs.
Examples: Online mobs harassing a female journalist or influencer after she posts about gender inequality.


The Scale of the Crisis: What the Numbers Tell Us

While TFGBV is widespread, it is still vastly underreported—and dangerously normalised.

  • 73% of women worldwide have experienced online abuse (UN Broadband Commission, 2015).

  • 38% of women globally have personally faced online harassment, and 85% have witnessed others being abused (World Wide Web Foundation, 2021).

  • 1 in 3 women under 35 in the EU has experienced online violence (European Institute for Gender Equality, 2022).

  • The UK’s Revenge Porn Helpline handled over 12,000 cases in 2022 alone.

  • In South Korea, the number of non-consensual deepfake cases rose by over 100% in one year, with most victims under the age of 30.

  • LGBTQ+ and disabled women face disproportionately higher rates of digital abuse, and are often less likely to receive institutional support.

These statistics are not abstract—they reflect a systemic failure to protect women in one of the most powerful spaces of modern life.


Real Stories, Real Impact

🕊️ Lilie James: Coercive Control Turned Deadly

In 2023, 21-year-old Australian coach Lilie James was brutally murdered by her ex-boyfriend after she ended their relationship. Prior to her death, he had displayed obsessive behaviour—including online stalking and social media monitoring. Lilie’s case is a tragic reminder that digital abuse is often a precursor to physical violence, and that patterns of coercive control must be taken seriously before it’s too late.

📉 South Korea’s Deepfake Porn Crisis

In South Korea, young women—some still in secondary school—have been targeted in a wave of AI-driven deepfake pornographic content. Faces are stolen from social media profiles and inserted into pornographic videos, then spread anonymously through messaging apps and websites. Victims face shame, trauma, and often withdraw from public life altogether. Despite recent laws, enforcement remains weak, and perpetrators frequently walk free.

🎓 The New Misogyny in Schools

Teachers in Australia and the UK report a disturbing rise in misogynistic attitudes among male students, linked to influencers like Andrew Tate and violent online porn. Female students and even teachers are increasingly facing verbal abuse, sexualised threats, and tech-based humiliation. The digital world is shaping boys’ attitudes toward women in real time, and without intervention, the effects will echo across generations.


The Psychological Toll

TFGBV leaves long-lasting scars. Victims often experience:

  • Anxiety, depression, and PTSD

  • Social withdrawal or self-censorship

  • Career disruption or academic decline

  • Physical health issues (headaches, insomnia, chronic stress)

  • In extreme cases: suicide

When a woman is silenced or driven offline, it’s not just her voice that disappears—it’s the knowledge, experience, creativity, and leadership she brings to public discourse.


Why Is Justice So Elusive?

Legal systems globally are failing to keep up with the pace of digital abuse:

  • Many countries have no laws against deepfakes or non-consensual image sharing.

  • Police frequently dismiss digital abuse as a “civil issue” or tell women to “just get offline.”

  • Platforms like Facebook, X (Twitter), and TikTok often fail to remove harmful content, even after it’s reported.

In the rare cases where perpetrators are prosecuted, they often receive light sentences or avoid jail altogether—especially if they are young, male, and from privileged backgrounds.


Feminist Resistance and the Fight for Digital Justice

Despite the scale of the problem, resistance is growing. Women and activists are building tools, networks, and campaigns to fight back:

  • #MeToo, #RevengePornIsAbuse, and #NotYourPorn have mobilised global awareness and policy change.

  • Apps like Block Party, HeartMob, and Silent Link help women protect themselves and report abuse more safely.

  • Feminist scholars and technologists are calling for design justice—ensuring digital platforms are built with safety, equity, and accountability at the core.

But this work cannot fall on victims and activists alone. We need system-wide change.


What Needs to Happen

  1. Stronger Laws: Governments must criminalise all forms of digital abuse, including deepfakes and spyware, and enforce them consistently.

  2. Platform Accountability: Tech companies must invest in robust moderation systems, transparency, and survivor-centred design.

  3. Education and Prevention: Schools must teach digital citizenship, consent, and respect—starting early and continuing throughout education.

  4. Survivor Support: Funded services must be made widely available, including mental health support, legal aid, and digital safety training.

  5. Intersectional Awareness: All responses must consider how race, class, sexuality, and disability shape how women experience and survive digital violence.


Conclusion: A Feminist Internet Is Possible

At Sisters for Justice, we believe the internet should be a place of freedom—not fear. A space where women and marginalised communities can speak, thrive, and challenge the systems that oppress them. But that future will not be given to us—we have to build it.

TFGBV is not just a tech problem. It is a feminist issue. A justice issue. A human rights issue.

The internet reflects the world we live in—and the one we want. Let’s make sure it reflects our values, our safety, and our power.


If You’ve Been Affected, You’re Not Alone

Here are some trusted organisations you can turn to for help:
