The BBC reported that the Department for Culture, Media and Sport has proposed an independent watchdog and a code of practice that tech companies would have to follow. Senior managers of social media platforms and other websites would be held liable for breaches, with possible fines.
The Online Harms White Paper covers a range of issues. The Guardian reported that these include child abuse, terrorist acts, revenge pornography, cyberbullying, the spread of disinformation, and the encouragement of self-harm. Senior social media executives could be held personally liable for failure to remove such content from their platforms.
The new laws will apply to any company that allows users to share or discover user-generated content or interact with each other online, including social media platforms, file hosting sites, public discussion forums, messaging services, and search engines.
Other proposals in the Online Harms White Paper:
- Government powers to direct the regulator on specific issues such as terrorist activity or child sexual exploitation
- Annual transparency reports from social media companies, disclosing the prevalence of harmful content on their platforms and what they are doing to combat it
- Co-operation with police and other enforcement agencies on illegal harms, such as incitement of violence and the sale of illegal weapons
- A code of practice that companies will be asked to comply with, setting out the steps they are taking to meet the duty of care – including designing products and platforms to be safer, and pointing users who have suffered harm towards support
The Online Harms White Paper comes after the Australian government passed legislation, the Sharing of Abhorrent Violent Material bill, to crack down on violent videos on social media.
The Australian bill creates new offences for content service providers and hosting services that fail to notify the Australian federal police about, or fail to expeditiously remove, videos depicting “abhorrent violent conduct” – defined as terrorist acts, murders, attempted murders, torture, rape, or kidnapping.
Personally, I believe that social media (and similar websites) truly do need some regulation. The specific issues listed in Australia’s bill and in the UK’s white paper are things that reasonable people would immediately recognize as content that does not belong online.
It is abundantly clear that social media sites, and other websites that allow users to post content, are failing at self-regulation. Those companies value growth and interaction over the health and safety of their users. I suspect that when social media managers are fined by governments – for failure to clean up their websites – the cost will eventually grow high enough for those companies to do the right thing.