Online abuse and trolling

In May 2021, the UK Government published the draft Online Safety Bill in a bid to establish a new regulatory framework to tackle harmful online content. When passed, the Bill will apply to Northern Ireland.

Speaking in a recent interview with agendaNi, Justice Minister Naomi Long MLA outlined her belief that the use of social media had become a “law of diminishing returns” for public representatives, given the levels of online abuse and trolling that are now prevalent across most platforms.

Long, on whose behalf the PSNI is currently pursuing a number of individuals in relation to online abuse, is conscious that the current levels of online abuse and trolling could act as a deterrent to young people, particularly women entering politics or public life.

However, while Long praised the work of the PSNI and their technical team in tracing abusive accounts and attempting to end some of the most egregious examples of online abuse, government in Northern Ireland has little power in creating laws to directly tackle the changing face of online abuse and trolling.

Following devolution, telecommunications was retained as a reserved policy area, meaning any decision on online offences and the regulation of the internet is a matter for the UK Government.

For context, Northern Ireland’s current legal framework follows the general principle that an act which is illegal offline is also illegal online. Various pieces of legislation, including the Malicious Communications (NI) Order 1988 and the Protection from Harassment (Northern Ireland) Order 1997, can be applied to some forms of online abuse and trolling to prosecute criminal activity. In essence, the Northern Ireland Executive can keep aspects of the criminal law under review to ensure appropriate action is taken to strengthen it where necessary.

In comparison, while telecommunications is also a reserved matter in Scotland, a number of existing offences can address online abuse and trolling where the behaviour amounts to criminal activity, such as stalking under the Criminal Justice and Licensing (Scotland) Act 2010 and the improper use of a public electronic communications network under the Communications Act 2003. In April 2021, the Scottish Parliament passed a Hate Crime Bill following a 2018 independent review of hate crime legislation. The Bill considered, but did not incorporate, a public petition on abusive and threatening communication.

In the Republic of Ireland, the Government approved the integration of the Broadcasting (Amendment) Bill, 2019, into the Online Safety and Media Regulation Bill and the introduction of three further Heads of Bill in May 2021. The Bill introduces online safety codes to instruct how designated online service providers should address harmful online content, establishes an Online Safety Commissioner as a regulator and defines harmful online content. Provision is also made in the Bill for the addition of further categories of harmful online content in the future.


In the UK, the Internet Safety Green Paper of 2017 and the Online Harms White Paper of 2019 were precursors to the publication of the draft Online Safety Bill in May 2021, the product of several years of examination by the UK Government of how the internet can be made safer for users through rules governing online behaviour.

The Law Commission’s work in recent years has examined the criminal law provisions that apply to individuals and not the liability of platforms.

On 12 May 2021, the Government published the draft Online Safety Bill, all provisions of which will apply across England, Wales, Scotland and Northern Ireland. The draft Bill would impose duties of care on providers of online content-sharing platforms and search services. Ofcom would enforce compliance, with powers to fine companies up to £18 million or 10 per cent of annual global turnover, whichever is higher, and to block access to sites.

‘Regulated content’ would be considered harmful if:

  • it is designated in secondary legislation as “primary priority content” that is harmful to children or “priority content” that is harmful to children or adults;
  • a service provider has “reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact” on a child or adult of “ordinary sensibilities”; or
  • a service provider has “reasonable grounds to believe that there is a material risk” of the dissemination of the content “having a significant adverse physical or psychological impact” on a child or adult of “ordinary sensibilities”.

Part 2 of the draft Bill sets out the duties of care that would apply to providers of user-to-user and search services. All regulated services would have to take action to tackle ‘illegal content’ and ‘content that is harmful to children’. Category 1 regulated services would also have to address “content that is harmful to adults”.

Interestingly, the framework would not put any new limits on online anonymity. However, under the duty of care, companies would be expected to address anonymous online abuse that is illegal through ‘effective systems and processes’.

The UK Government has said it is working closely with the devolved administrations on a number of areas where there is possible interaction with devolved competencies, and has stressed that its legislation is not seeking to change the law in relation to offences in devolved regions but rather to “clarify the responsibility of businesses to tackle this activity on their services”.
