New Whitepaper Launch: Rethinking Accountability in Image-Based Sexual Abuse

How a Multi-Layered Model of Harm Reduction Can Address Perpetrator Behaviour

We are proud to announce the release of a new whitepaper, produced in collaboration with STISA, that explores perpetrator behaviour and harm reduction in the context of image-based sexual abuse (IBSA). Titled “A Model of IBSA Harm Reduction: Image Angel’s Proposed Impact on Perpetrator Behaviour”, the report offers both a sobering assessment of current failures and a promising roadmap forward.

Why Now?

The need for this conversation has never been clearer. While some victims do successfully take their abusers to court, this remains rare, not least because perpetrators are so often anonymous. For most victims, justice feels out of reach. Within the sex industry, the situation is even more precarious. Content theft, non-consensual sharing, and covert filming are disturbingly common.

Yet platform inaction, legal ambiguity and deep-rooted stigma mean many never report their abuse, let alone see a resolution. This whitepaper draws on both lived experience and academic research to explore how IBSA manifests, who it affects and what can be done to stop it.

Understanding the Harm

IBSA is not a one-off incident; it is a form of sustained abuse. Victims consistently report profound psychological consequences: anxiety, depression, hypervigilance, isolation and, in some cases, suicidal ideation. Many are forced to leave jobs, delete social media accounts or abandon careers altogether.

Sex workers, who rely on digital visibility to survive, face a unique form of weaponisation: private images used to out them, threaten them or destroy their relationships. Despite high levels of victimisation, sex workers are often excluded from mainstream conversations about online safety.

This whitepaper centres their experiences, while also examining why the systems meant to protect victims—laws, platforms, enforcement bodies—can fail.

Introducing Image Angel: A New Model of Deterrence

Rather than placing the burden solely on victims to report abuse, pursue takedowns or relive their trauma, Image Angel proposes a preventative approach: one that targets perpetrators through traceability, accountability and increased risk.

At its core, Image Angel is a forensic watermarking tool that embeds invisible, user-specific data into content. If an image is leaked, it can be traced back to the source.
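To make the idea concrete, here is a minimal illustrative sketch of how a user-specific identifier can be hidden invisibly in an image's pixel data. It is not Image Angel's implementation; the toy least-significant-bit scheme, the numpy and Pillow libraries, and the function names below are assumptions chosen purely to show the general principle of forensic watermarking.

```python
# Illustrative sketch only: a toy least-significant-bit (LSB) watermark.
# It is not Image Angel's actual method; it simply shows how a user-specific
# identifier can travel invisibly with an image so a leaked copy can be
# traced back to the account it was served to.

import numpy as np
from PIL import Image


def embed_user_id(image_path: str, user_id: str, out_path: str) -> None:
    """Hide a user identifier in the lowest bit of the image's pixel bytes."""
    img = np.array(Image.open(image_path).convert("RGB"), dtype=np.uint8)
    flat = img.flatten()

    # Encode the identifier as bits, preceded by a 32-bit length header.
    payload = user_id.encode("utf-8")
    header = len(payload).to_bytes(4, "big")
    bits = np.unpackbits(np.frombuffer(header + payload, dtype=np.uint8))

    if bits.size > flat.size:
        raise ValueError("Image too small to hold the watermark")

    # Overwrite the lowest bit of the first len(bits) bytes; the change is
    # visually imperceptible.
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits

    # Save losslessly so the embedded bits are preserved.
    Image.fromarray(flat.reshape(img.shape)).save(out_path, format="PNG")


def extract_user_id(image_path: str) -> str:
    """Recover the identifier from a watermarked copy."""
    flat = np.array(Image.open(image_path).convert("RGB"), dtype=np.uint8).flatten()
    length = int.from_bytes(np.packbits(flat[:32] & 1).tobytes(), "big")
    bits = flat[32 : 32 + length * 8] & 1
    return np.packbits(bits).tobytes().decode("utf-8")
```

A production forensic watermark must survive screenshots, re-compression and cropping, which this toy scheme does not; the point is simply that when every viewer receives a uniquely marked copy, a leaked file carries the identity of the account it was served to.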

This kind of technology doesn’t just support takedowns. It disrupts the illusion of anonymity that so many perpetrators rely on. It introduces real consequences, real traceability and real fear of exposure. That’s what deterrence looks like.

But the solution doesn’t stop at tech.

Image Angel also proposes a layered model of harm reduction that includes:
  • Mandatory consent training for users on adult platforms
  • Early intervention education targeting harmful myths about image-sharing
  • Automated evidence capture to support survivors in legal proceedings
  • Policy recommendations for platform integration and regulation
  • A global approach, connecting NGOs and public authorities on the ground, alongside platforms and technology partners such as Image Angel

The result? A new kind of safety net, one that works before harm is done, not just after.

The whitepaper, “A Model of IBSA Harm Reduction: Image Angel’s Proposed Impact on Perpetrator Behaviour,” is now available to download.

It is a vital read for:

  • Platform designers and tech developers
  • Lawmakers and policy advisors
  • Anti-violence advocates
  • Journalists covering online safety
  • Educators and prevention programme leaders

[Download the whitepaper here]

Final Thoughts

We are long past the point where taking down content is enough. Survivors deserve systems that prevent harm at the point of access.

This whitepaper offers a bold but achievable vision: one where perpetrators are deterred, victims are supported and platforms are made to take responsibility.

With the right tools, the right partnerships and the right public pressure, we can move towards a digital landscape where consent is not optional and accountability is built in by design.

#News & Events
