On October 26, 2023, the UK Online Safety Act (the “Act”) received Royal Assent, making it law in the UK. The Act seeks to protect children from online harm and imposes obligations on relevant organizations, including social media platforms, to prevent and remove illegal and harmful content. In a press release, the UK Government stated that the Act “takes a zero-tolerance approach to protecting children from online harm, while empowering adults with more choices over what they see online.” For example, the Act requires relevant organizations to:
- remove illegal content quickly or prevent it from appearing in the first place;
- prevent children from accessing harmful and age-inappropriate content, including pornographic content; content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders; and content depicting or encouraging serious violence or bullying;
- be transparent about the risks and dangers posed to children on their sites, including by publishing risk assessments; and
- provide parents and children with clear and accessible ways to report problems online when they do arise.
The Act will be enforced by Ofcom, the UK regulator for the broadcasting, telecommunications and postal industries. Failure to comply with the Act may result in fines of up to £18 million or 10% of annual global revenue, whichever is higher. Ofcom is expected to begin implementing the Act immediately, starting with the rules on illegal content; it published its first consultation, on illegal harms, on November 9, 2023. Ofcom will take a phased approach to bringing the Act into force, prioritizing enforcement against the most harmful content as soon as possible. The majority of the Act’s provisions will come into force within two months of Royal Assent.