11.03.2023 | Updates

Last week, the UK’s Online Safety Bill received royal assent and became law. With this development, Ofcom, the regulator for the new Online Safety Act (the Act or OSA), has published a roadmap to explain how the Act will be implemented over the next two years.

Ofcom has made it clear that it will move quickly to implement the Act and develop codes of practice in three phases: (1) duties regarding illegal content; (2) duties regarding the protection of children; and (3) additional duties for certain designated services to provide transparency and empower users. This Update offers an overview of the scope, key requirements, and expected timeline for each of the three phases.

Phase 1: Illegal Content—Risk Assessments and Safety Duties for All Users

Scope. All user-to-user and search services with a significant number of UK users or that target the UK market, regardless of where they are located, are in scope. Ofcom has indicated that more than 100,000 online services, large and small, could be subject to these rules. The Act recognizes that different risk governance and safety measures will be appropriate for different types of services.

Key requirements. The Act requires companies to assess and manage safety risks arising from online content and conduct. Ofcom has been explicit that, like the EU’s Digital Services Act (DSA) and Australia’s Online Safety Act, the Act is about systems change and risk governance. Ofcom does not expect providers to eradicate illegal or harmful content, but rather to institute safety and product design measures that mitigate content risks while protecting free expression and privacy.

While the precise requirements will vary between categories of services, most providers will need to do all of the following:

  • Conduct risk assessments of the risk of harm to users from illegal content, assessed separately for each kind of “priority illegal content” defined in the Act (including terrorism and child sexual exploitation and abuse content) and for other categories of illegal content.
  • Take effective and proportionate steps to manage and mitigate risks identified through the risk assessments.
  • Provide clarity in the terms of service about how users will be protected.
  • Provide easy-to-access reporting and complaint mechanisms for illegal content and content harmful to children.

Timeline. The expected timeline for implementation is as follows:

  • Q4 2023. Ofcom will publish its first draft codes of practice for illegal content, including child sexual abuse material and terrorist content, and engage in consultation with relevant stakeholders.
  • Q4 2024 – Q1 2025. Ofcom will finalize illegal harms codes, services in scope will complete an illegal harms risk assessment, and services must comply with the illegal content safety duties.

Phase 2: Child Safety Duties and Pornography

Scope. Online services that are “likely to be accessed by children” are in scope. User-to-user and search services will need to undertake a “children’s access assessment” to determine whether their service is “likely to be accessed by children.”

Key requirements. The second phase of Ofcom’s work will address the Act’s children’s safety duties, which are focused on protecting children (under the age of 18) from harmful (but legal) content, including:

  • Pornography.
  • Content related to suicide, self-harm, and eating disorders.
  • Content that is abusive and targeted at, or incites hatred against, people on the basis of protected characteristics.
  • Bullying.
  • Content that depicts serious violence.

To address these categories of content, providers in scope may need to comply with the following requirements:

  1. Age assurance mechanisms. Online services that host pornographic content will need to use age assurance mechanisms to prevent children from accessing that content. Ofcom will consult on draft age assurance guidance in December 2023.
  2. Children’s safety risk assessments and mitigation. Providers in scope will need to comply with children’s safety duties in the Act, including risk assessments and risk mitigation. Providers will have three months to carry out access assessments once Ofcom publishes final guidance (currently anticipated for early 2025).

Timeline. The expected timeline for implementation is as follows:

  • Q4 2023. Ofcom consultation on age assurance mechanisms.
  • Q2 2024. Ofcom will publish proposals for consultation on, among other things:
    • Draft guidance for services on carrying out children’s access assessments.
    • Ofcom’s analysis of the causes and impacts of harms to children.
    • Draft guidance on carrying out children’s risk assessments.
    • Draft codes of practice setting out recommended measures to protect children online.
  • 2025. Ofcom intends to take the following steps:
    • Issue final guidance on children’s access assessments (Q1 2025).
    • Issue its main statement on children’s safety duties (Q2 2025).
    • Submit children’s code of practice to the secretary of state (Q2 2025).
    • Begin investigations and impose sanctions for noncompliance with child protection safety duties (Q2 2025).
    • Consult on draft guidance on protecting women and girls, including on content and activity that disproportionately affect women and girls (Q2 2025).

Phase 3: Additional Duties for Categorized Services

Scope. Providers designated as “Category 1” or “Category 2B” user-to-user services, or as “Category 2A” search services or combined services (a regulated user-to-user service that includes a public search engine), are in scope. Ofcom will establish a register of services that fall within these categories, and the secretary of state will make regulations specifying the threshold conditions for each category. The criteria will include the number of users and the functionalities of the service, as well as any other characteristics or factors that the secretary of state considers relevant.

Key requirements. Ofcom’s third phase will focus on an additional set of heightened online safety duties relating to transparency reporting, user empowerment, fraudulent advertising, and user rights.

Timeline. The expected timeline for implementation is as follows:

  • Q2 2024. Ofcom intends to publish guidance to the secretary of state on categorization of services and draft guidance on its approach to transparency reporting.
  • Q4 2024. Assuming the secretary of state sets threshold categorizations in secondary legislation by summer 2024, Ofcom intends to publish a register of categorized services by the end of 2024.
  • 2025. If the above timing remains accurate, Ofcom anticipates that it will then:
    • Publish further proposals on categorized services duties, including a draft code of practice on fraudulent advertising (Q1 2025).
    • Issue transparency notices (Q2/Q3 2025) depending on the timing of secondary legislation.
    • Issue its final codes and guidance (Q4 2025).

Enforcement and Penalties

Voluntary compliance and supervision. Ofcom has indicated that its preference is to work with services to encourage voluntary compliance. This will include offering support and resources as companies implement the codes of practice.

For the largest and riskiest services, Ofcom will engage regularly by making formal and enforceable requests for information. These requests will seek to understand the nature of content and conduct harms on a service, assess the effectiveness of safety measures, and, as appropriate, recommend improvements. For smaller companies and startups, Ofcom plans to offer digital tools for support and guidance.

Penalties. Where Ofcom identifies compliance failures, it can impose fines of up to £18 million or 10% of worldwide revenue (whichever is greater). In the most serious cases of noncompliance, Ofcom is authorized to seek a court order that would require third-party internet service providers (ISPs) or services to withdraw or limit access to the offending service.

Takeaways

Companies that offer user-to-user and search services may wish to start identifying content risks and opportunities to integrate UK OSA preparation into existing content regulation compliance processes. While there are key distinctions between the OSA and content and safety laws in other key jurisdictions, such as the United States, European Union, Australia, and Singapore, a coordinated, global approach may enable useful efficiencies.

© 2023 Perkins Coie LLP


 
