
How to Fix the Online Safety Act: A Rights-First Approach

In this report, we analyse the Online Safety Act 2023 (OSA or ‘the Act’), which imposes new duties on online service providers to protect children from harmful content, and Ofcom’s guidance on compliance with these duties. The OSA is the UK’s contribution to an international trend towards increased content moderation and age verification that is reshaping speech on the Internet. Ofcom is the regulator designated to implement the OSA’s requirements.

Executive Summary

Open Rights Group believes that Ofcom should root its regulation, and the UK Parliament should base its law, in human rights law, taking advantage of prior work on content moderation such as the 2013 Rabat Plan of Action; former UN Special Rapporteur David Kaye’s 2018 report on content regulation; and the collected case decisions of Meta’s independent Oversight Board. Ofcom should also consult the standards set in the EU’s Digital Services Act. In this analysis of the law and Ofcom’s guidance on compliance, Open Rights Group uses these as a lens through which to consider the impact on privacy, freedom of expression, legality, and legitimacy of aims.

Hundreds of thousands of sites are expected to fall within the OSA’s scope. The most significant of these are sites and services operated by the biggest technology companies, which are divided into category 1 (user-to-user services with more than 7 million UK users), category 2A (search engines), and category 2B (user-to-user services with 3.5 to 7 million UK users). Any potentially regulated site must perform an annual risk assessment and, if children may encounter any of 17 categories of primary priority content, implement “highly effective” age verification.

The requirement to conduct a risk assessment may apply even to very small sites operated by a sole trader or a team of volunteers, and the cost of implementing age verification may make their survival impossible unless they join larger, better-resourced providers. A very large non-commercial site, such as Wikipedia, may also find it impossible to comply. Ofcom itself admits that the costs of compliance may force smaller providers to leave the UK market, further consolidating the dominance of a handful of vast tech companies.

Ofcom’s guidance to date appears to encourage restricting content beyond the law’s requirements in order to avoid the risk of non-compliance. This ‘bypass strategy’ would remove legal content, often disproportionately, creating a chilling effect on freedom of expression. Ofcom’s guidance also fails to provide redress for users other than news publishers, journalists, and others who believe their content is journalistic. Open Rights Group believes that this lack of redress will disproportionately affect speech of democratic importance relating to already marginalised issues.

Ofcom’s consultation discusses a number of age verification methods that might qualify as “highly effective”, noting that whatever method is adopted must be technically accurate, robust, reliable, and fair. Ofcom leaves the choice of solution to service providers, who seem likely to subcontract age assurance to third-party vendors. Open Rights Group is particularly concerned about the privacy consequences of creating a large new market for largely untested services that will handle sensitive data. Open Rights Group believes that service providers should allow users to choose the method and identity provider they use, and favours allowing app stores on users’ own devices to handle verification, since user profiles on personal devices already store extensive sensitive personal data locally under users’ own control. Such a method avoids creating new security and privacy risks.

Finally, the current state of regulation presents a risk to legislators and policy makers. Because the policies neither tackle the centralisation of market power in a handful of dominant platforms nor attempt to empower users systemically, the online harms approach risks failure on its own terms. The OSA’s interventions are acting as a force that further concentrates market power, thereby exacerbating the underlying reasons for unwanted content to circulate. Furthermore, because the policies necessarily tackle the most extreme kinds of content, yet are unlikely to succeed, there will be pressure for further action in the field of legal but unwanted content (often called ‘lawful but awful’ or ‘legal but harmful’). Future legislation is therefore very likely, but is also likely to repeat the same mistakes. We make recommendations for how the government can avoid this by concentrating on measures designed to introduce market incentives for users to choose platform experiences that favour trust and pleasant interaction over misdirection and provocation.

Recommendations

This report contains detailed recommendations for the UK government and Ofcom. Our key recommendations are:

  • Align with international human rights standards
  • Ensure transparency and accuracy through incentives
  • Ensure proportionate takedown decisions
  • Protect privacy and security, including end-to-end encryption
  • Exempt low risk and public interest sites, services and purposes
  • Protect news and free expression including appeals to the courts

Below are more detailed recommendations setting out the main steps needed to achieve these goals.

Align with international human rights standards

  1. The Government should pause further regulation until it is clear how the current legislation can align with fundamental rights.
  2. Ofcom should explain how UK and EU regulatory duties can align to avoid duplication.
  3. Where the EU’s DSA’s content regulations provide more robust protections for free expression and privacy, Ofcom should note the divergence and explain whether to adopt the EU standard as ‘best practice’ or adapt it for UK legal circumstances.
  4. Ofcom should reference the Rabat Plan standards in policy analysis and, where appropriate, guidance.
  5. Ofcom should ensure that the OSA’s implementation complies with international standards on restricting free expression, especially legality, legitimate aim, and necessity and proportionality.
  6. Ofcom should ensure that the OSA’s implementation considers the high standard of evidence for content removal required by the Rabat Plan.
  7. Future legislation should be written to align with these international standards, to which the UK is committed.
  8. Ofcom should use Meta’s Oversight Board decisions for guidance on applying human rights to platform policies and enforcement.
  9. Ofcom should use Meta’s Oversight Board decisions for guidance on how to publicly release potentially confidential or sensitive information about platform policies and enforcement.
  10. Ofcom should recommend that platforms incorporate tests of necessity and proportionality in their content enforcement practices.
  11. Ofcom should recommend that platforms ensure their policies comply with legality and legitimate aim requirements.

Ensure transparency and accuracy through incentives

  1. Ofcom should use its transparency powers to enable trusted third parties from academia and civil society to understand and inspect platforms’ use of automation, ranking algorithms, and other protected proprietary technology. This will increase public confidence and lead to better standards across the sector.
  2. Ofcom should provide incentives for greater accuracy as far as it is able within the Act. These could include transparency on the frequency of mistakes, egregious examples, and best practices for rectifying errors.
  3. Ofcom should ensure that platforms seriously consider false positives, especially in light of the need for measures to be necessary and proportionate.
  4. Future legislation should provide incentives to improve accuracy and reduce incorrect takedowns.

Ensure proportionate takedown decisions

  1. Ofcom should include in its guidance reference to ensuring that decisions are proportionate.
  2. Future legislation should require specific safeguards for free expression, such as routes for appeal, and for privacy, such as ensuring compliance standards are in place and enforced.
  3. Ofcom should work with platforms and regulators to develop means of bridging the gaps in fundamental rights protections, such as clear standards for appeal processes or privacy compliance.
  4. Ofcom should remove all reference to the bypass strategy and instead recommend that platforms ensure that decisions affecting free expression are made on the basis of the Act, and remind platforms of the need to apply tests of necessity and proportionality to those decisions.
  5. Ofcom should reference appeals processes in its guidance as a means of meeting necessity and proportionality tests.
  6. Future UK legislation should align appeals processes with the rights enjoyed under the EU’s Digital Services Act.
  7. Ofcom should include guidance on what is meant by harm and how it may be assessed, and should make reference to other concerns that need to be taken into account when assessing a proportionate response, such as free expression.
  8. In its guidance, Ofcom should consistently define ‘illegal harms’ narrowly.
  9. Search engines should notify sites that may be de-indexed.
  10. Search engines should provide a complaints process for sites that have been wrongly de-indexed or have resolved past problems.
  11. Ofcom should set out thresholds for judging the illegality of content and require human review.

Protect privacy and security, including end-to-end encryption

  1. Ofcom should not require client-side scanning through the OSA.
  2. Platforms should provide users with detailed documentation regarding the use of their data so that they can understand the risks to their privacy and data.
  3. Ofcom and the ICO should work with industry to create a high standard for privacy in age verification.
  4. Ofcom should recommend that age verification solutions use high, independently managed data protection standards, and meet interoperability and accessibility needs.
  5. Future legislation should incorporate privacy, accessibility, and interoperability requirements for age verification and assurance.
  6. Section 81 duties should be redrafted in future legislation to ensure no impact on privacy and as little impact as possible on free expression.

Exempt low risk and public interest sites, services and purposes

  1. Ofcom guidance should ensure that responsible public interest services such as Wikimedia are clearly exempt from age assurance or verification duties, especially where compliance would disproportionately require the wholesale removal of existing privacy-preserving practices.
  2. Ofcom should issue guidance that allows bloggers, small businesses and small federated services to minimise the administrative burden placed on them.
  3. Public interest services, small blogs, small businesses and small federated services should be exempted through secondary or primary legislation.
  4. Search is a vital mechanism for freedom of information and expression. Regulation of search should be limited to the strictly illegal.
  5. Ofcom guidance should ensure that online search services are excluded from age verification or assurance where compliance would impose a disproportionate impact on privacy protections.
  6. The definitions in Schedule 5 should be rewritten to exclude small, low risk sites, and low risk, well-managed public interest sites.
  7. Ofcom must make clear to the public that their access to content that may fall within the wide categories of “harmful to children” will soon be contingent on age verification. This has huge implications for user privacy and anonymity on the Internet.

Protect news and free expression including appeals to the courts

  1. The law should be amended to protect democratic and news content by category, rather than by actor. Current protections for news publishers should be extended to all users who assert they are publishing news or democratically important content.
  2. UK law should provide an appeals process similar to the EU’s DSA, whereby content takedowns can be adjudicated by a lower court without resorting to onerous challenges under contract law.

A Guide for Organisations Working with the Online Safety Act
