Authored By: Anisah Uddin
University Of Roehampton
Introduction
Social media has become central to public discourse, offering a platform for global expression, debate, and idea-sharing. However, challenges such as hate speech, misinformation, and cyberbullying have emerged. In the UK, balancing freedom of speech with protecting citizens from harm has become increasingly complex. This article explores how UK law addresses this balance.
Legal Foundations of Free Speech in the UK
In the UK, the right to freedom of expression is protected under Article 10 of the European Convention on Human Rights (ECHR), affirming the country’s commitment to free speech while allowing for certain limitations. 1 Alongside this, the UK’s common law traditions uphold the principle of free expression, though it must be carefully balanced with other rights, such as privacy and public safety. 2
There are key legal restrictions on free speech in the UK. The Public Order Act 1986 3 and the Racial and Religious Hatred Act 2006 4 criminalize hate speech that incites racial or religious hatred, particularly when it threatens violence or discrimination. 5
Defamation law, under the Defamation Act 2013 6, distinguishes between damaging speech and matters of public interest, with social media becoming a frequent platform for defamation disputes. 7 Additionally, the Terrorism Act 2006 8 prohibits speech that encourages or glorifies terrorism, including online content, with social media companies under pressure to prevent the spread of extremist propaganda. 9
The Role of Social Media in Free Speech
In 2025, platforms like Facebook, X (formerly Twitter), and Instagram are major arenas for public debate, where free expression is exercised on a global scale. With billions using these platforms for instant communication, they provide unprecedented access for individuals to share opinions and participate in discussions. 10 Social media has lowered barriers to public discourse, amplifying a wider range of voices.
The Challenge of Regulating Speech Online
One challenge in regulating free speech on social media is the lack of traditional checks and balances. 11 Unlike traditional media, where content was produced by professional journalists and editorial boards, social media allows anyone to share their views with the entire world. 12 This makes it difficult to moderate harmful content before it spreads rapidly and widely. 13
Another challenge is anonymity. Social media users can post anonymously, making offensive statements without facing the social and legal consequences they would in person, because identifying and punishing those individuals is harder. 14 The borderless nature of social media further complicates the enforcement of UK law, creating conflicts over which country’s laws should apply. 15
The Growing Issue of Harmful Content on Social Media
Hate Speech:
Hate speech is any speech, gesture, conduct or writing that may incite violence or discrimination against a particular group based on race, religion, gender or other protected characteristics. 16 In the UK, social media has become a significant platform for hate speech, with many incidents of racial abuse on platforms such as X and Facebook being widely reported. 17
The UK’s principal legal tool against hate speech is the Public Order Act 1986. 18 The Act prohibits threatening, abusive, or insulting words likely to stir up hatred against racial and religious groups. 19 Online enforcement often relies on automated systems, such as flagging and reporting tools, though critics argue these systems are not always effective. 20
Misinformation and Disinformation:
Misinformation refers to false information spread without intent to deceive, while disinformation is false information deliberately created to mislead others or to cause harm. 21 In the UK, misinformation has caused significant public harm, spreading through false election campaigns, public health matters such as COVID-19, and debates over immigration. 22
The Online Safety Act 2023 aims to tackle misinformation by placing a duty of care on social media platforms to identify and remove harmful and false information, especially where it poses a danger to individuals or society. 23
Cyberbullying and Harassment:
Cyberbullying involves the use of digital platforms to harass, intimidate, or harm others. This includes bullying based on race, gender, or sexual orientation, and can have severe mental health consequences. 24
The Malicious Communications Act 1988 25 and the Communications Act 2003 26 are often used to prosecute cyberbullying offences in the UK, but social media companies face growing pressure to do more to prevent these harms on their platforms. 27
UK Legal and Regulatory Responses to Harmful Content
The Online Safety Act 2023 28, which began life as the Online Harms White Paper and the Online Safety Bill, is the UK government’s primary initiative to regulate online content, establishing a duty of care for social media companies to protect users from harmful material. 29 Key provisions require platforms to remove illegal content; earlier drafts also required action against material that was “legal but harmful,” such as disinformation, cyberbullying, and hate speech, though the duty toward adult users was narrowed before the Act was passed. 30 The Act includes specific protections for children, shielding them from harmful content such as pornography and violence. Companies that fail to comply face substantial fines or could be blocked from operating in the UK. 31
However, the Act has sparked concerns over free speech, with critics warning that fear of penalties could drive platforms to over-censor lawful but controversial speech. 32 As part of broader moderation efforts, platforms operating in the UK, such as X, Facebook, and YouTube, are increasingly held responsible for managing harmful content. 33 They rely heavily on AI and algorithms to detect and remove problematic material, but these technologies often struggle with language and context, leading to both under- and over-moderation. 34
The Ethical and Practical Challenges of Balancing Free Speech with Harm Prevention
Social media has sparked many ethical debates about free speech. On one hand, protecting the right to express oneself, as guaranteed by Article 10 ECHR, is vital, even when individuals share controversial or offensive views. 35 On the other hand, this freedom can incite violence, spread falsehoods, or promote discrimination, which can harm society. 36
The challenge is finding a balance between free speech and societal interests, requiring both effective legal frameworks and ethical guidelines to manage content moderation and speech regulation. 37
One concern with stricter content moderation is the potential chilling effect, where individuals may self-censor out of fear of being flagged, removed, or even arrested for lawful speech. 38 This could deprive people of their right to express differing views, leading to the suppression of opposing opinions, which is undemocratic. For example, someone who criticizes the ruling Labour Party should not be arrested or demonetized for expressing their views.
Content moderation became a significant issue after the Southport riots, which followed the murder of three girls by Axel Rudakubana. 39 After false claims spread online that the attacker was an illegal immigrant, viral social media posts fueled anti-immigrant riots. Some high-profile individuals contributed to the chaos by sharing racist messages online. 40
Police arrested individuals for racist posts, including Tyler Kay, who called for attacks on asylum seekers and immigration solicitors. 41 He was sentenced to 38 months in prison for stirring up racial hatred under the Public Order Act 1986. 42 Rosemary Ainslie, Acting Head of the CPS Special Crime and Counter Terrorism Division, said: “Online actions have real consequences. 43 This kind of behavior will not be tolerated, and offenders like Kay will be brought to justice swiftly.” 44 This shows that police and prosecutors now treat hate spread online as seriously as offline crime. 45
These arrests raised concerns about content moderation. While critics argue it stifles free speech, others see it as necessary regulation in a digital world. Without careful regulation, online hate could lead to broader censorship, suppressing important debates.
Future Directions in the UK’s Legal Framework for Social Media
As technology advances, the UK’s approach to regulating social media must adapt. Key developments include AI, legislation, and platform accountability, as shown below:
- AI and machine learning offer more efficient ways to detect harmful content, but concerns about bias remain. 46
- Ongoing implementation of the Online Safety Act 2023 may introduce clearer definitions of harmful content and stronger enforcement standards. 47
- The growing influence of big tech companies prompts calls for greater transparency and accountability, particularly in how algorithms and moderation decisions are made. Ensuring social media companies are held responsible for the content they host will be essential in shaping a fair regulatory future. 48
Conclusion
In the age of social media, balancing the right to free speech with the need to prevent harm has become an increasingly complex challenge for the UK. Current laws aim to protect individuals while preserving fundamental freedoms. However, care must be taken to ensure that regulation does not suppress lawful expression or debate. A balanced approach is essential: one that holds platforms accountable for harmful content without compromising democratic values. As technology continues to evolve, the UK’s legal framework must remain committed to both protecting public safety and upholding the right to free expression.
Bibliography:
Legislation
Council of Europe, European Convention for the Protection of Human Rights and Fundamental Freedoms (adopted 4 November 1950, entered into force 3 September 1953) ETS No 5, art 10
Communications Act 2003
Defamation Act 2013, s 6
Malicious Communications Act 1988
Online Safety Act 2023, s 7
Public Order Act 1986
Racial and Religious Hatred Act 2006, s 1
Terrorism Act 2006
Reports and Parliamentary Documents
Sarah Tudor and Russell Taylor, ‘Freedom of Expression Online: Communications and Digital Committee Report’ (House of Lords Library, 19 October 2022) https://lordslibrary.parliament.uk/freedom-of-expression-online-communications-and-digital-committee-report/ accessed 27 April 2025
Online Sources
Crown Prosecution Service, ‘Hate Crime’ https://www.cps.gov.uk/crime-info/hate-crime accessed 17 April 2025
Crown Prosecution Service, ‘Man Jailed Just Two Days After Posting Online During Public Disorder’ (9 August 2024) https://www.cps.gov.uk/cps/news/man-jailed-just-two-days-after-posting-online-during-public-disorder accessed 13 April 2025
Josh Halliday, ‘Axel Rudakubana: From ‘Unassuming’ Schoolboy to Southport Killer’ The Guardian (25 January 2025) https://www.theguardian.com/uk-news/2025/jan/25/axel-rudakubana-from-unassuming-schoolboy-to-notorious-southport-killer accessed 10 April 2025
National Bullying Helpline, ‘Cyber Bullying and Online Harassment Advice’ https://www.nationalbullyinghelpline.co.uk/cyberbullying.html accessed 14 April 2025
The Alan Turing Institute, ‘More than 90% of the UK Population Have Encountered Misinformation Online’ (30 May 2024) https://www.turing.ac.uk/news/more-90-uk-population-have-encountered-misinformation-online accessed 17 April 2025
1 Council of Europe, European Convention for the Protection of Human Rights and Fundamental Freedoms (adopted 4 November 1950, entered into force 3 September 1953) ETS No 5, art 10.
2 ibid.
3 Public Order Act 1986, s 5.
4 Racial and Religious Hatred Act 2006, s 1.
5 ibid.
6 Defamation Act 2013, s 6.
7 ibid.
8 Terrorism Act 2006, s 8.
9 ibid.
10 Sarah Tudor and Russell Taylor, ‘Freedom of Expression Online: Communications and Digital Committee Report’ (House of Lords Library, 19 October 2022) <https://lordslibrary.parliament.uk/freedom-of-expression-online-communications-and-digital-committee-report/> accessed 27 April 2025.
11 ibid.
12 ibid.
13 ibid.
14 ibid.
15 ibid.
16 Crown Prosecution Service, ‘Hate Crime’ <https://www.cps.gov.uk/crime-info/hate-crime> accessed 17 April 2025.
17 ibid.
18 Public Order Act 1986.
19 ibid.
20 ibid.
21 The Alan Turing Institute, ‘More than 90% of the UK Population Have Encountered Misinformation Online’ (30 May 2024) <https://www.turing.ac.uk/news/more-90-uk-population-have-encountered-misinformation-online> accessed 17 April 2025.
22 ibid.
23 ibid.
24 National Bullying Helpline, ‘Cyber Bullying and Online Harassment Advice’ <https://www.nationalbullyinghelpline.co.uk/cyberbullying.html> accessed 14 April 2025.
25 Malicious Communications Act 1988.
26 Communications Act 2003.
27 ibid.
28 Online Safety Act 2023, s 7.
29 ibid.
30 ibid.
31 ibid.
32 ibid.
33 ibid.
34 ibid.
35 Council of Europe, European Convention for the Protection of Human Rights and Fundamental Freedoms (adopted 4 November 1950, entered into force 3 September 1953) ETS No 5, art 10.
36 ibid.
37 ibid.
38 Sarah Tudor and Russell Taylor, ‘Freedom of Expression Online: Communications and Digital Committee Report’ (House of Lords Library, 19 October 2022) <https://lordslibrary.parliament.uk/freedom-of-expression-online-communications-and-digital-committee-report/> accessed 27 April 2025.
39 Josh Halliday, ‘Axel Rudakubana: From ‘Unassuming’ Schoolboy to Southport Killer’ The Guardian (25 January 2025) <https://www.theguardian.com/uk-news/2025/jan/25/axel-rudakubana-from-unassuming-schoolboy-to-notorious-southport-killer> accessed 10 April 2025.
40 ibid.
41 Crown Prosecution Service, ‘Man Jailed Just Two Days After Posting Online During Public Disorder’ (9 August 2024) <https://www.cps.gov.uk/cps/news/man-jailed-just-two-days-after-posting-online-during-public-disorder> accessed 13 April 2025.
42 ibid.
43 ibid.
44 ibid.
45 ibid.
46 Sarah Tudor and Russell Taylor, ‘Freedom of Expression Online: Communications and Digital Committee Report’ (House of Lords Library, 19 October 2022) <https://lordslibrary.parliament.uk/freedom-of-expression-online-communications-and-digital-committee-report/> accessed 27 April 2025.
47 ibid.
48 ibid.