Reframing the Shame

We all have a role to play

The digital world has opened doors for connection, creativity, and self-expression. But it has also created new opportunities for abuse. While these crimes are committed by perpetrators, the blame too often falls on those who have been harmed.

Words matter - Use them wisely

The way we talk about digital crimes shapes how we see them, how we respond, and how survivors feel about their experiences. Too often, the language used minimises the harm and shifts blame onto the victim rather than the perpetrator.

Take the term "revenge porn". It makes the crime sound like a personal dispute, as if the victim did something to deserve revenge. But our work with NotYourPorn shows that's not the reality. The real story is about someone trusting the wrong person, only to have that trust shattered when their private images are shared without consent.

Calling it revenge porn frames it as a scandal rather than what it really is: intimate image abuse. It's not about revenge, and it's certainly not pornography. It's a violation.

The way we talk about romance scams and online fraud is also problematic. When someone is deceived online, the first thing they often hear is, "How could you fall for that?" But scammers are experts at manipulation. They spend weeks, even months, gaining someone's trust, using carefully crafted messages, emotional blackmail, and even deepfake technology to make their deception convincing. The fault lies with the fraudster, not the person they deceived.

Perhaps the most common phrase that needs to change is, "They shouldn't have sent that picture." This statement places responsibility on the victim rather than on the person who chose to betray their trust. The reality is that people in relationships, whether online or offline, share intimate moments. The issue isn't that a private image exists; the issue is that someone chose to steal, exploit, and share it without permission.

When we shift the language we use, we shift the way society sees these crimes. Instead of questioning a victim’s choices, we should be questioning the people and systems that allow these violations to happen in the first place.

Beyond language, we also need real solutions. Platforms and technology providers must step up; victims shouldn't have to carry the burden of protection alone.

Identity Verification to Stop Scammers
Scammers thrive in anonymity. OneID is a tool that allows people to verify their identity safely using online banking, making it harder for fraudsters to create fake accounts and prey on others. If a platform isn’t using tools like this, ask why.

Invisible Watermarking for Image Protection
Image Angel’s technology embeds invisible watermarks into images at the server level, so that if an image is shared without consent, the origin of the misuse can be tracked and identified.

Platform Accountability
Social media, dating sites, and other online platforms must invest in safety features to protect their users. If a platform doesn’t offer protections, it’s prioritising engagement over safety.

Before sharing or engaging online, check what protections are in place. If they don’t exist, demand better.

If you’ve been a victim of anything we’ve covered in today’s story, head to the NO MORE global directory to find the help you need, where you need it.