When There’s No Proof: Sarah’s Experience of Intimate Image Abuse

Case Study Summary

Sarah’s* experience of intimate image abuse did not begin with a public leak. It began earlier, within a private relationship built on trust and in systems that offered her no protection when that trust was broken. During their relationship, Sarah shared intimate images and videos with her partner consensually. Some were taken on her phone, others on his, and some were sent directly between them. She repeatedly asked that the content be deleted after viewing, and was reassured that it had been. At the time, Sarah had no reason to doubt this. Like many people, she believed that once something was deleted, it was gone.

*Names and identifying details have been changed to protect anonymity.

From Trust to Threats

After the relationship ended, Sarah began receiving threats. Her ex-partner warned that he would share the intimate images with people who mattered most to her, including her parents and her college peers. Because they attended the same college and were part of shared group chats, the threats felt immediate and real.

When Sarah refused his demands for money and his pressure to resume the relationship, the situation escalated. One video was shared in a college group chat. Later, a naked image of Sarah appeared publicly on a social media page, an intimate image she did not even know had been taken. False narratives followed, including claims that she was selling content online.

Living Without Proof

A defining part of Sarah’s experience was the absence of evidence. Although she believed she knew who was responsible, she had no way to show how the intimate images were being kept, restored, or shared. The threats originated from multiple phone numbers and online accounts, making it difficult to connect incidents or identify patterns of behaviour.

Without proof, Sarah felt unable to report what was happening to platforms or authorities. As she later described it, “It’s like taking someone to court without proof of what they did to you.”

She was also left trying to defend herself socially, repeatedly explaining the situation to friends, classmates, and her parents without anything concrete to support her account. The responsibility to be believed rested entirely with her. “I was alone in the whole situation,” she said. “No one to lean on. No one to believe me.”

The Impact

The harm Sarah experienced went far beyond the intimate images themselves. She described intense anxiety, particularly once the threats began. Not knowing whether the situation would escalate made it difficult to feel safe. She also described feeling isolated. While people expressed sympathy, there was no clear way for anyone to step in or help stop what was happening. The situation continued until the damage was already done.

Where Image Angel Could Potentially Have Made a Difference

Sarah was clear that no technology could undo what had already happened. But there were moments when things might have unfolded differently had Image Angel been part of the platform where she shared her intimate images with her partner. 

“If I could trace where those photos were coming from, I would have reported to the authorities,” she explained. “That would have been enough proof for me to defend myself. It would have made it believable.”

If Image Angel had been embedded at the point intimate images were accessed or downloaded, those interactions could have carried traceable identifiers. This could have created technical evidence linking repeated image access back to the same underlying device or session, even when different accounts or numbers were used. 

Having that kind of evidence may have reduced the burden on Sarah to prove what was happening. Instead of relying only on personal explanations, she might have been able to support reports to platforms or authorities with verifiable data. Access to proof at the point when threats began could have helped with earlier reporting and intervention.

As Sarah reflected later, “Back then I was helpless. This would have helped people like me fight.” She described how earlier support might have changed the trajectory of what happened: “It would have stopped the flame before it turned into fire.”

Why This Matters

Sarah’s experience highlights a gap in how intimate image abuse is currently addressed. Too often, systems respond only after harm becomes visible, while individuals are left without support when abuse begins.

Tools like Image Angel are being developed to explore how accountability and traceability could be built into platforms earlier, shifting responsibility away from those experiencing harm and towards those who misuse, steal, and distribute images without consent.

Sarah’s story reflects one person’s experience, but it raises important questions about how trust, technology, and accountability intersect.

Disclaimer

The examples, quotations, and case studies referenced in this document are drawn from voluntary potential user interviews conducted for product development, safety design, and contextual understanding.

These interviews are not academic research, nor are they designed or presented as formal studies, surveys, or statistical analysis. They do not aim to produce representative samples, generalisable findings, or population-level conclusions. The purpose of these interviews is to understand how image-based abuse and content misuse are experienced in practice, in order to inform the design, deployment, and evaluation of protective technology and operational safeguards.

Individual accounts are presented as contextual illustrations of real-world use cases, not as evidence of prevalence, causality, or typical outcomes across all users or platforms. Participation was voluntary and based on informed consent. Identifying details have been removed or altered where necessary to protect privacy, safety, and personal autonomy.
