The Internet is fantastic. It offers limitless opportunity for interaction, engagement and expression. But it also provides a platform for abuse, often with tragic consequences. The Lord Chief Justice Sir Declan Morgan stressed the need for a more robust legislative response to this soaring area of crime at a conference run by the Justice Committee in October 2015. Since then, we have been working on a potential solution and tabled an amendment to the Justice (No.2) Bill.
The amendment wasn’t selected to go forward for debate, but we won’t let the issue go. Nobody likes a bully. Bullies are insidious and destroy lives. What’s even worse is a bully hiding behind a computer screen, often shielded by anonymity as they torture their victim.
Our proposals sought to put more responsibility on the social media companies. They must take reasonable steps to make the safety and protection of their users a priority, or face heavy fines. There is also a need for a mechanism to strip the bullies of their anonymity without the difficulty and expense of obtaining a court order, unless entirely necessary. We also wanted tougher penalties for those who cause real harm and suffering to their victims, whether mental or physical.
Critics of the proposals say they are an attack on free speech. Of course they aren’t. Free speech is a fundamental principle of democracy and a Convention right. However, that right is not absolute. It is not a right to be enjoyed at the expense of someone else’s physical or mental well-being.
It is a balancing exercise which, ultimately, should be decided by the courts in the most serious of scenarios. The ‘reasonable person’ test lies at the heart of our common law system, and would be applied in all cases concerning the amendment as a result of the definition of ‘harmful electronic communication’.
It is really disappointing that this won’t go forward for debate on Wednesday. The phenomenon of cyberbullying has become a real worry for people, young and old, but is of particular concern to schools, who are struggling to protect students. Therefore, Basil McCrea MLA has committed to bring forward a Private Member’s Bill in the next mandate to strengthen the law and tackle the trolls.
Here is a copy of the proposed amendment. Comments and feedback are welcome.
Harmful electronic communications
(1) A person who, without lawful authority or reasonable excuse, intentionally or recklessly shares a harmful electronic communication that—
(a) includes sexually explicit content of another person, or
(b) causes alarm, distress, or injury including psychological harm to another person, shall be guilty of a summary offence and liable to up to 12 months’ imprisonment or a fine not exceeding £5,000 or both.
(2) Where the harmful electronic communication—
(a) incites or encourages another to cause serious harm to themselves,
(b) incites or encourages another to commit suicide,
(c) is part of an ongoing campaign of harmful communication, or
(d) has been communicated by multiple accounts used by one or more people, the person shall be convicted of an indictable offence and liable to up to 7 years’ imprisonment or an unlimited fine or both.
(3) If on the evidence the court is not satisfied that the person should be convicted of an offence under subsection (1) or (2), the court may nevertheless make any of the following orders upon application to it if, having regard to the evidence, the court is satisfied that it is in the interest of justice to order—
(a) that the person remove or delete specific electronic communication(s);
(b) that the person shares an apology or correction as the court deems appropriate in the circumstances;
(c) that the person shall not, for such period as the court may specify, communicate by any means with the other person or that the person shall not approach within such distance as the court shall specify of the place of residence or employment of the other person;
(d) a person who fails to comply with the terms of an order under this section shall be guilty of a summary offence and liable to up to 12 months’ imprisonment or a fine not exceeding £5,000 or both.
(4) For the purposes of this section, a social media platform is defined as any person or company who provides an online platform for people to post any public content.
(5) All social media platforms must have adequate policies and procedures in place to balance the right of free speech with the duty to protect the physical and mental well-being of people.
(6) Social media platforms have a duty to maintain a full record of a person’s identity and to establish any linked accounts—
(a) for the purposes of this section, a linked account is an account which may have a different name but emanates from the same person, or the same IP address, or an account shared with another person for a common purpose;
(b) failure to do so, or failure to have adequate protections for people in place will result in the social media platform being guilty of an offence and liable to a fine to be determined by the court.
(7) A person who is the subject of a harmful communication may apply to the social media platform to have the perpetrator identified.
(8) A social media platform must identify a person who has posted harmful communication(s) on receipt of an application which—
(a) specifies the complainant’s name,
(b) sets out the harmful communication concerned and explains why it is harmful to the complainant,
(c) specifies where on the social media platform the harmful communication was posted, and
(d) satisfies the social media platform that the communication is harmful and may constitute an offence under subsection (1) or (2).
(9) Where a person has made an application concerning multiple or linked accounts, communications between the accounts held by the social media platform may also be requested.
(10) Should the social media platform be satisfied that the application meets the requirements in subsection (8), the complainant should be notified of the perpetrator’s identity within 30 days of the complaint being received.
(11) Where a responsible body, including, but not exclusively, a Board of Governors, makes an application on behalf of a person under the age of 18, the social media platform has a duty to remove any harmful communication which meets the requirements of subsection (8) within 24 hours of the complaint being received.
(12) Should the social media platform refuse to identify users and then be compelled to do so by court order, the social media platform will be guilty of an offence and liable to a fine to be determined by the court.
(13) In this section, “harmful electronic communications” means any method of electronic communication which would be grossly offensive, threatening, intimidating, or menacing to a reasonable person.