On day one of our ‘Sixteen Activists or Organisations Around the Globe Fighting to End Digital Violence Against all Women and Girls’ campaign for UNite, Yiğit Aydinalp of the European Sex Workers’ Rights Alliance discusses digital violence against sex workers and why decriminalisation is essential.
Yiğit Aydinalp is currently working as Interim Executive Director of the European Sex Workers’ Rights Alliance (ESWA), while conducting a PhD in Sociology at the University of Sheffield. ESWA’s network of over 100 member organisations in more than 30 countries is driving crucial global recognition that sex workers’ rights are human rights, across both offline and online environments. Yiğit has been a human-rights activist for most of his life, working primarily on sex workers’ and LGBTQI+ rights. Over the years, he has helped build organisations and been involved in grassroots movements in both Turkey and the UK. This experience has taught him that ‘meaningful change happens when the people most affected lead the work themselves’, a principle that sits at the heart of ESWA’s approach. Participatory research and the co-production of knowledge are indeed increasingly accepted as the gold standard for ethical research practice with marginalised communities.
Yiğit first joined ESWA in 2014 as the organisation’s Admin and Finance Officer, supporting the alliance’s growth and its network of sex-worker-led groups across Europe and Central Asia. In 2020, Yiğit developed ESWA’s Digital Rights Programme and has since moved into the position of Senior Programme Officer and, most recently, his current role as Interim Executive Director. Through this work, Yiğit has developed significant expertise in, and leadership of, ESWA’s work on technology, digital safety and online advocacy. It was the recognition that the digital sphere had become fundamentally central to sex workers’ rights that inspired Yiğit to establish the programme. He points out that shifts in advertising and client screening, alongside the increasing use of adult platforms and online payment systems, have meant that ‘the intersections of labour rights, privacy, surveillance, platform policy and digital violence become unavoidable.’
ESWA has become a leading voice advocating for sex workers’ digital rights amid the growing challenges posed by AI, with support over several years from the European AI & Society Fund. As adult platforms like OnlyFans have become essential to sex workers’ livelihoods, AI-driven content moderation, non-consensual deepfakes, and overlooked protections in emerging AI regulation have created significant risks. With funding since 2020, ESWA has advanced sex workers’ inclusion in EU policymaking on the Digital Services Act and AI Act, educated community members about digital harms, and developed innovative tools such as training sessions and a deck of “Sex Work and Tech Tarot” cards to promote online safety. Their advocacy has strengthened relationships with major social media platforms, including Meta, and pushed for fairer content moderation.
In this interview, Yiğit talks to us about ESWA’s research and programmes of support, the ways in which digital violence manifests towards sex workers, stigma, research ethics and the critical changes needed to end violence within this marginalised labour sector.
What’s the experience been like advocating for sex workers’ rights in the digital sphere?
Advocating in the digital sphere has been both deeply rewarding and persistently challenging. On the one hand, we get to intervene in what might previously have felt like abstract technology debates (for example: AI systems, platform content moderation, financial exclusion) and bring the lived realities of sex workers to those debates. On the other hand, many of these systems were built without sex workers in mind, so we constantly find ourselves pointing out unintended harms, invisible exclusions and structural barriers.
For example, through ESWA’s digital rights work we’ve hosted peer-to-peer learning sessions such as Digital Rights 4 All (the convening for sex work, technology & human rights), developed resources such as the Sex Work & Tech Tarot Cards to engage with issues like online safety, surveillance, platforms and algorithmic policing, and conducted research on various topics relevant to the rights of sex workers, such as Sex Workers Belong On Social Media: Recommendations for Platform Accountability (2025) and Labour rights, safety and privacy: How to centre sex workers’ needs on advertising platforms (2024).
So, the experience has involved research, training, advocacy, alliance-building with tech-policy actors, platforms and regulators, and the constant translation of sex-worker realities into the language of digital policy.
But at the same time, it’s often frustrating: sex workers’ voices are still too often excluded, policies are made without our input, platforms apply blunt rules that criminalise or censor sex workers, and the harms are real. That tension is what keeps the work urgent.
How does digital violence manifest towards sex workers?
Digital violence against sex workers shows up in many interconnected ways:
Censorship, de-platforming and shadow-banning: Sex workers are often removed, hidden or suspended even when they are simply using social media or advertising platforms in ways that others use without sanction. For instance, ESWA’s research reports show that content moderation algorithms flag posts or accounts that mention “sex work,” “hustle,” or related terms. This can even include posts that are entirely non-commercial or non-sexual in nature.
Financial exclusion and payment-platform bans: Many sex workers rely on online platforms and digital payments; when services like banks, payment processors or “creator” platforms restrict or ban sex-work-related activity, that’s digital violence through economic exclusion.
Surveillance, privacy threats and algorithmic policing: Technologies developed for anti-trafficking, surveillance or platform moderation are used to monitor, profile and control sex workers, especially migrant, racialised or LGBTQ+ sex workers. Examples include the weaponisation of AI, predictive policing and data-driven surveillance.
Intimate-image abuse, doxing and online harassment: Sex workers face threats of non-consensual sharing of images, stalking or doxing through online platforms. While discussions often focus on other groups, sex workers are also very exposed.
Platformisation of work and loss of agency: As more sex work moves onto digital platforms, sex workers find themselves subject to new forms of control: platform rules, algorithmic visibility, forced disclosures, surveillance. This undermines autonomy and can create new harms. For example, the ESWA report Conditions, Control and Consent: Exploring the impact of platformisation of sex work (2023) addresses this.
In sum, digital violence isn’t just bad comments online; it’s systemic, structural and tied to the broader digital economy, platform governance and regulatory frameworks.
Is there adequate visibility of sex-worker voices and sex-worker rights organisations within research on technology-facilitated gender-based violence?
No, there is a significant gap. While there is increasing research on technology-facilitated gender-based violence (TFGBV), sex-work-specific voices and organisations are too often missing or marginalised. There are a few recurring issues.
For example, research often frames sex workers as victims or clients rather than as agents, rights-holders, or experts in their own work.
Many studies omit sex workers’ organisations in designing instruments, setting questions or interpreting findings. This replicates the “nothing about us, without us” problem.
Technology research (on platforms, harassment, image-based abuse, algorithmic bias) tends to concentrate on more visible or mainstream populations; the particular legal, economic and social vulnerabilities of sex workers (including criminalisation, stigma, digital exclusion) mean that many of the findings simply don’t map well onto their realities.
ESWA has pointed out that many of the new policy debates (for example, around the EU’s Digital Services Act (DSA)) did not meaningfully include sex-worker communities in their consultations, meaning that the needs of those most at risk (sex workers) were not central in the regulatory design.
You recently published really timely ethical gold standards for research with sex workers. Can unintended digital violence arise for sex workers who partake in academic research?
Yes, and this is precisely one of the reasons those new “Gold Standard Guidelines for Safe and Ethical Sex Work Research in Europe” (2025), published by ESWA and the European Sex Work Research Network (ESWORN), matter.
Unintended digital violence in research participation can take many forms:
Being outed via research participation: If a sex worker’s participation becomes visible (through data, publication, listing, network links), that could expose them to criminalisation, stigma, loss of livelihood or violence.
Data security and privacy failures: Digital tools used in research (surveys, interviews, storage) may inadvertently expose identities, personal data, location or working practices. If that data becomes accessible (e.g., by law-enforcement or hostile actors), sex workers are at risk.
Poor compensation, tokenisation and lack of meaningful involvement: When sex workers are treated simply as “subjects” rather than co-designers, research can inadvertently reinforce extractive practices, causing further marginalisation.
Publication of findings without protecting anonymity or context: Some research names specific working practices, platforms, country contexts; if this information is accessible online, it might enable surveillance or platform moderation targeting the participants or their networks.
Digital dissemination of findings can amplify risk: For example, if a published study is widely shared, referencing specific groups or individuals, even aggregated data might lead back to individuals in small networks or communities.
The Gold Standard Guidelines emphasise principles like ongoing consent (not one-off consent), making sure sex workers have control over how findings are used, making research outputs accessible (not behind paywalls), and acknowledging the risk of digital harms.
As someone working at the intersection of academic research, sex-workers’ rights and digital safety, I’ve seen first-hand that even well-intentioned studies can trigger digital violence by exposing communities to platform bans, algorithmic exclusion, or surveillance. We must treat research participation as a digital safety question as much as an ethics question.
What work is ESWA doing to address and end digital violence towards women and girls? Are there ways in which digital violence towards sex workers, and the advocacy efforts around it, differ between member countries?
ESWA is doing multiple things:
- Producing briefing papers and resources.
- Running capacity-building and peer-learning workshops: For example, convenings where sex workers and activists discuss digital safety, platforms, algorithmic threats and develop peer tools.
- Engaging policy processes: ESWA feeds into EU policy, such as monitoring the DSA, providing input on AI regulation, raising concerns about mandatory phone registration for adult content creators.
- Advocating for digital inclusion: Ensuring that sex workers have access to digital literacy, safety, privacy tools, and that platforms and regulators recognise sex workers’ rights as part of digital rights.
- Raising visibility: Making sure sex-worker-rights organisations are part of digital rights coalitions and tech-policy conversations, not excluded.
Yes, the manifestation of digital violence and the scope of advocacy work vary significantly between countries because of differing legal regimes, social stigma, technological infrastructure and platform access. In countries where sex work is criminalised or highly regulated, digital surveillance or “anti-trafficking” AI tools may be more pervasive and used to police sex workers under the guise of technology. This means digital violence is amplified by offline law-enforcement mechanisms.
In some countries digital advertising or payments are tightly restricted for sex-work-related content, leading to economic exclusion, which is itself digital violence. Digital exposure (via platforms, payment systems, data capture) intersects with immigration control, which can differ hugely between countries. While global platforms have certain policies, local enforcement and language-based moderation differ, so sex workers in different languages or regions might experience “shadow-banning” or censorship differently.
What needs to change on a policy, platform and societal level to address digital violence towards sex workers?
Policy level changes:
- As a crucial first step, decriminalise sex work so that sex workers can claim human rights like any other workers. ESWA recommends meaningful inclusion of sex-worker communities in policy design.
- Ensure digital regulatory frameworks (e.g., the DSA, AI regulation, data protection laws) explicitly account for sex workers as a marginalised group. For example, impact assessments must include how platform moderation, algorithmic systems or platform payment systems affect sex workers.
- Mandate transparency and accountability for platforms: how algorithms moderate content, what redress mechanisms exist, how payment/financial exclusion occurs. Sex workers need a seat at the table.
Platform level changes:
- Platforms must revise community standards and content-moderation systems with an awareness of sex work (and of the difference between sex work and trafficking). Blanket bans or removal of sex-work-related content or language (“sex work”, “hustle”) need reviewing.
- Payment processors, banks and financial services must eliminate discriminatory practices against sex workers and ensure transparent decision-making and appeal mechanisms.
- Design “safe by default” systems: tools and features that protect the privacy of sex workers (e.g., pseudonymity, separation of identities, safe payment flows, safe deletion). Increased digital literacy and peer-training is also key.
- Require platforms to consult sex-worker organisations when designing features, algorithms or moderation systems; this avoids harmful unintended consequences.
Societal level changes:
- Reduce stigma around sex work: one of the root causes of digital violence is that sex work is seen as deviant or criminal, which allows exclusion, censorship, financial bans and algorithmic discrimination to go unchallenged.
- Raise digital literacy and safety among sex-worker communities and sex-worker-rights organisations so that they can navigate platform risks, surveillance, doxing and privacy threats.
- Amplify sex-worker-led research and narratives so platforms, policy-makers, technologists recognise sex workers as rights-holders and experts rather than “problems” to be policed.
- Build alliances across digital-rights, labour-rights and sex-worker-rights movements so that sex workers are not isolated when confronting platform or state harms.
What are your hopes for ending digital violence towards sex workers and the part ESWA will continue to play in such a complex task?
My hope is for a future where sex workers can fully participate in digital spaces as professionals, workers, creators and organisers without fear of exclusion, censorship, economic marginalisation or surveillance simply because of their work. I hope for platforms, regulators and society to recognise that sex work is work, and that digital rights apply equally.
I believe ESWA will continue to push this agenda. Ultimately, I envision a moment when digital violence towards sex workers is recognised not as a “sex work issue” only, but as a core digital-rights issue and when the solutions are co-designed with us, not done to us. ESWA is already doing important work here. I’m excited to see what happens next.