Tech News: Online Safety Bill Amendment Gets Mixed Reaction
The Government's decision to amend the Online Safety Bill to allow "legal but harmful" content has received a mixed reaction, with criticism coming from the Samaritans.
What Is The Online Safety Bill?
The UK government's Online Safety Bill, originally proposed under former PM Theresa May, is draft legislation designed to place a "duty of care" on internet companies that host user-generated content, in order to limit the spread of illegal content on these services.
The idea of the Bill is to prevent the spread of illegal content and activity (e.g. images of child abuse, terror material, and hate crimes), and to protect children from harmful material. Until the recent amendment, it was also designed to protect adults from legal but harmful content.
Where?
The Online Safety Bill applies to social media platforms, video-sharing platforms, search engines, plus other tech services, and requires them to implement systems and processes to remove illegal content as soon as they become aware of it. The Bill also requires these services to take additional proactive measures with regard to the most harmful "priority" forms of online illegal content.
The Amendment – Legal But Harmful Material OK?
Following five months of delays, plus a return to Parliament, Culture Secretary Michelle Donelan confirmed an amendment to the Bill which shifts its emphasis away from general safety towards child safety. The amendment means that social media sites will no longer need to remove material designated "legal but harmful".
Michelle Donelan said: "Any incentives for social media firms to over-remove people's legal online content will be taken out of the Online Safety Bill. Firms will still need to protect children and remove content that is illegal or prohibited in their terms of service, however the Bill will no longer define specific types of legal content that companies must address." She added: "This removes any influence future governments could have on what private companies do about legal speech on their sites, or any risk that companies are motivated to take down legitimate posts to avoid sanctions."
Why?
The government says that the amendment has been made over fears that the Bill would stifle free speech and create a quasi-legal category between illegal and legal, in which sizable platforms and companies would have had to remove both illegal content and any content designated legal but potentially harmful. This echoes concerns from free speech campaigners that the Bill could allow the government or tech platforms to censor content.
Replaced With What?
The government says the legal but harmful measures will be replaced with a "triple shield", which it says will "strike the right balance with its protections for free speech." This will take the form of "new duties which strengthen the Bill's free speech requirements on major online platforms", which it says will make them more accountable for their policies. The new duties will prohibit online platforms from removing or restricting user-generated content, or suspending or banning users, where the content doesn't actually go against the platform's terms of service or the law.
Concerns
The amendment has been met with mixed reactions. Although there is broad agreement that the Bill's emphasis on protecting children is a good thing, some groups have expressed concern. For example, the Samaritans said in an open letter online: "We have been pleased to see continued commitment from the Government to protecting vulnerable children as it considers modifying the Bill. But susceptibility to harm from suicide and self-harm content does not end when people reach the age of 18. Anyone, including young adults aged 18-24, can be just as vulnerable to harm from this type of content."
What Does This Mean For Your Business?
In practical terms, businesses like social media platforms will now need to do less assessing, restricting, and removing of content, and will instead judge content on whether it is legal or prohibited in their terms of service. Platforms may also need to do other work, e.g. looking more closely at their terms of service, showing how they enforce their user age limits, answering to a toughened regulator, being more transparent about how their algorithms work, and publishing details of when the regulator Ofcom has taken action against them. Although the amendment still provides protection for children, there is an argument, as made by the Samaritans, that young people over 18 also need protection from certain content.
That said, the change should please free speech campaigners, and there are other aspects of the Bill that should provide a wide range of protections in law, particularly for young people, from many types of harmful content and online behaviours.