We must act now on deepfake abuse

I never imagined that one day I would wake up to see a false video of myself circulating online via WhatsApp. One I knew I had never been part of, in a setting I had never seen. Yet to the untrained eye, to colleagues, constituents and my family, it looked all too real, writes Cara Hunter MLA.
Deepfake videos are real videos onto which another person’s face or likeness has been superimposed, typically using artificial intelligence.
These videos can take a significant emotional and psychological toll on their victims. As a public representative, I have grown used to scrutiny and political attacks. However, nothing prepares you for the violation of seeing your own face superimposed onto something degrading; a grotesque distortion of your identity, shared for shock value in the final three weeks of an election. These fake images and videos are not a joke. They are not a novelty. They can ruin lives.
Deepfake pornography is a uniquely modern abuse. It is insidious, hard to track, and easily weaponised. The victim experiences a profound loss of control over their image, their reputation, and their sense of self. The impact is far-reaching: from strained personal relationships to professional harm, anxiety, depression, and even withdrawal from public life. It is a form of dehumanisation that leaves deep psychological scars.
I know this, because I have lived it.
That is why I strongly welcome the recent announcement of a public consultation from the Department of Justice in Northern Ireland on the criminalisation of deepfake abuse.
This cannot be just another exercise in box-ticking. It must lead to robust, victim-centred legislation that recognises the specific harms caused by the creation and sharing of sexually explicit deepfake images and videos. This is about justice and safety.
The North cannot afford to be left behind.
The South has already taken steps to respond to this technological threat. Under the Harassment, Harmful Communications and Related Offences Act 2020 (commonly known as ‘Coco’s Law’), the non-consensual distribution of intimate images is now a criminal offence. More recently, lawmakers in Dublin have begun to consider specific legislative responses to AI-generated deepfake content, recognising that this issue is evolving rapidly and requires ongoing attention.
Meanwhile, Norway has also emerged as a leader in this space. Earlier this year, it passed legislation making the creation and sharing of non-consensual deepfake images and videos a criminal act. Crucially, Norway’s laws focus not just on intent, but on the impact on the victim, recognising that even so-called “jokes” can have devastating consequences.
These are models we must learn from. They are examples of governments stepping up, listening to victims, and showing political courage in the face of new challenges.
In the North, we must ensure that any legislation emerging from this consultation is similarly bold and informed by the lived experience of survivors. Offenders must face serious consequences. The law must clearly state that using AI to sexually exploit or humiliate another person is not just immoral; it is illegal.
While the current consultation focuses on adult victims, we cannot ignore the growing threat these technologies pose to children and young people. From deepfake nudes of schoolchildren to AI-generated grooming tactics, the risks are real and increasing. Protecting our children must be a legislative priority.
I urge everyone, especially those who have felt powerless in the face of this abuse, to respond to the consultation, which will be live until 6 October 2025. Your voice matters. We need to send a clear, unified message that Northern Ireland will not allow technology to be used as a weapon of sexual harm and degradation.
The SDLP will continue to fight alongside campaigners, survivors, and legal experts to ensure that we do not lag behind the rest of these islands or our European neighbours. This is not just about policy. It is about dignity, justice, and for me, it is deeply personal.