As data justice and digital crimes like image-based abuse increasingly enter public consciousness, Image Angel offers vital digital safeguarding against the misuse of technology to harm women and girls. We welcome the UK government’s heightened focus on Violence Against Women and Girls (VAWG), particularly its technology-facilitated forms, but the latest landscape review by the Open University reveals both the scale of the challenge and the urgency of systemic change.
The report exposes what many victims, survivors and frontline officers already know: policing in the digital age remains behind the curve. While existing offences like domestic abuse and stalking have long demanded attention, their digital evolution — through image-based abuse, deepfakes, cyberflashing and covert surveillance — has rendered many established policing models outdated. These are no longer fringe threats: technology-facilitated gender-based violence (TFGBV) is now endemic.

Fragmented Response
What alarms us most at Image Angel is the fragmented and inconsistent policing response to tech-enabled abuse. The report acknowledges this alongside some of the existing strategies adopted by the government to tackle TFGBV. For example, Operation Modify offers an optional e-learning course for frontline officers on digital investigation and tech-enabled abuse, while digital safeguarding involves collaboration between cyber police and the National Cyber Security Centre to protect victims of VAWG online.
However, the report criticises how victims in one part of the country might receive informed support and digital safeguarding, while others face confusion, inaction or disbelief. The absence of a national standard defining what an effective response should look like only deepens this disparity. The report rightly acknowledges the successful evolution of a national model for tackling online child sexual abuse and exploitation. It took two decades to build. We cannot afford to wait that long for a similar approach to image-based abuse and TFGBV in general. Women and girls should not be expected to suffer in silence while digital harms multiply and policing fails to catch up.
Image-Based Abuse: Our Frontline Concern
As a platform that exists to protect against non-consensual image sharing, we are particularly dismayed by the report’s findings regarding image-based abuse. Despite widespread recognition of the harm caused, there is inconsistent reporting and recording of this crime. Services like the Revenge Porn Helpline have seen a surge in demand, with over 22,000 reports in 2024 alone, reflecting a growing crisis. The report also cites research suggesting that over 1.4% of UK women experience non-consensual intimate image abuse each year, equating to hundreds of thousands of victims annually. Yet support services like the Suzy Lamplugh Trust and the Cyber Helpline face chronic underfunding. We will actively support and collaborate with these charities by signposting victims to their services, ensuring they receive specialised help and guidance.
The report also identifies under-reporting and the lack of data sharing from social platforms, helplines and academic institutions as a significant remaining challenge. Image Angel can help by offering victims a way to report offenders to police or alert platforms directly. Using its Sentinel system, repeat offenders attempting to rejoin affiliated platforms can be flagged and blocked, helping prevent further harm and enabling more robust, coordinated responses.
Given the capabilities of our Sentinel system to record perpetrator details and generate hashes, we now have the potential, though not yet the infrastructure, to tackle this issue by tracking where and how image-based sexual abuse is occurring across platforms. Such data could inform targeted intervention strategies, policing and even platform-specific policy responses. This data can help answer critical questions such as: Why is image-based abuse more prevalent on certain platforms or in particular regions? And how can this knowledge help reduce reoffending and support victims more effectively?
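To make the mechanism concrete, cross-platform tracking of this kind could rest on hashing reported images and checking new uploads against a shared registry. The following is a minimal Python sketch under stated assumptions, not Image Angel's actual implementation: the class and method names are illustrative, and a production system would likely use a perceptual hash rather than the exact-match SHA-256 shown here, so that re-encoded copies of an image still match.

```python
import hashlib
from collections import defaultdict


class AbuseHashRegistry:
    """Illustrative registry of hashes of reported images.

    Records which (platform, region) pairs each reported image has been
    seen on, so repeat sharing can be flagged and prevalence aggregated.
    """

    def __init__(self):
        # hash -> set of (platform, region) sightings
        self._sightings = defaultdict(set)

    @staticmethod
    def hash_image(image_bytes: bytes) -> str:
        # Exact-match cryptographic hash for illustration only; real
        # systems typically use perceptual hashing to survive resizing
        # and re-compression of the same image.
        return hashlib.sha256(image_bytes).hexdigest()

    def report(self, image_bytes: bytes, platform: str, region: str) -> str:
        """Register a reported image and return its hash."""
        h = self.hash_image(image_bytes)
        self._sightings[h].add((platform, region))
        return h

    def is_known(self, image_bytes: bytes) -> bool:
        """Check an upload against previously reported images."""
        return self.hash_image(image_bytes) in self._sightings

    def prevalence_by_platform(self) -> dict:
        # Count distinct reported images per platform -- the kind of
        # aggregate that could inform targeted interventions and policy.
        counts = defaultdict(int)
        for sightings in self._sightings.values():
            for platform, _ in sightings:
                counts[platform] += 1
        return dict(counts)


# Hypothetical usage: the same image reported on two platforms.
registry = AbuseHashRegistry()
registry.report(b"example-image-bytes", "PlatformA", "UK")
registry.report(b"example-image-bytes", "PlatformB", "UK")
assert registry.is_known(b"example-image-bytes")
print(registry.prevalence_by_platform())
```

The per-platform and per-region aggregates produced by a registry like this are exactly the data that could answer where image-based abuse concentrates and how intervention should be targeted.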
Rettenberger and Eher (2024) demonstrate that reforms based on the risk principle—matching interventions to an offender’s risk level—led to more conditional releases and lower sexual violence recidivism, showing that evidence-based management can reduce reoffending. By enabling early identification and accountability in image-based sexual abuse, Image Angel could support prevention and intervention efforts that align with this principle and help reduce recidivism.
A Call to Co-Design the Future
We echo the report’s call for a new, cross-sectoral network focused on technology-facilitated VAWG. The challenges are too vast, too fast-moving and too complex for policing alone. A community of practice — bringing together tech providers, survivor advocates, researchers and digital forensics experts — must shape the next chapter. Crucially, survivors themselves must be co-authors in that story.
We urge the National Centre for VAWG and Public Protection to ensure that platforms like Image Angel are not only consulted but embedded in this new ecosystem. Tech providers bear a responsibility to anticipate harm and build protective infrastructure into digital environments. But law enforcement must have the tools, training and operational models to pursue justice effectively in these same spaces.
We also welcome the notion of innovation trials to test and scale new investigative tools and safeguarding interventions. Image Angel offers such an intervention model to be tested at scale, to evaluate the real-world impact of protective and reporting mechanisms against image-based abuse. In addition, Image Angel can be integrated into the report’s recommendation for ‘policy and practice dialogues to drive legislative and operational improvements.’