Online platforms must begin assessing whether their services expose users to illegal material by 16 March 2025 or face financial penalties as the Online Safety Act (OSA) begins taking effect, the BBC reported.
Ofcom, the regulator enforcing the UK’s internet safety law, on Monday published its final codes of practice for how firms should deal with illegal online content.
Platforms have three months to carry out risk assessments identifying potential harms on their services, or they could be fined up to 10% of their global turnover.
Ofcom head Dame Melanie Dawes told BBC News this was the “last chance” for industry to make changes.
Under Ofcom’s rules, platforms need to identify if, where and how illegal content might appear on their services, and the ways they will stop it from reaching users.
Many large tech firms have already introduced safety measures for teenage users, along with controls giving parents more oversight of their social media activity, in a bid to tackle dangers for teens and pre-empt regulation.
For instance, on Facebook, Instagram, and Snapchat, users under the age of 18 cannot be discovered in search or messaged by accounts they do not follow.
The Guardian reported: Social media platforms have a “job of work” to do in order to comply with the UK’s Online Safety Act and have yet to introduce all the measures needed to protect children and adults from harmful content, the communications regulator said.
Ofcom on Monday published codes of practice and guidance that tech companies should follow to comply with the act, which carries the threat of significant fines and closure of sites if companies breach it.
The regulator said many of the measures it is recommending are not followed by the largest and riskiest platforms.
Every site and app in scope of the act, from Facebook, Google and X to Reddit and OnlyFans, now has three months to assess the risk of illegal content appearing on the platform.
CNBC reported: The U.K. officially brought its sweeping online safety law into force on Monday, paving the way for stricter supervision of harmful online content and massive fines for technology giants like Meta, Google, and TikTok.
Ofcom, the British media and telecommunications watchdog, published its first-edition codes of practice and guidance for tech firms, laying out what they should be doing to tackle illegal harms such as terror, hate, fraud and child sexual abuse on their platforms.
The measures form the first set of duties imposed by the regulator under the Online Safety Act, a sweeping law requiring tech platforms to do more to combat illegal content online.
In my opinion, it sounds like the UK’s regulator is intent on making sure that children, and adults, who use social media are protected from seeing things that do not belong online.