Social media companies will be held to account for potentially harmful material on their platforms by broadcasting regulator Ofcom, it has been reported.
The Government has previously mooted a statutory duty of care for internet companies with an independent regulator enforcing new guidelines against so-called online harms.
The Financial Times has reported that Culture Secretary Nicky Morgan will announce a widened remit for Ofcom as an internet watchdog on Wednesday.
A role for Ofcom in dealing with technology firms such as Facebook, Instagram and YouTube had previously been suggested, following the publication of a Government white paper on online harms in April.
Proposals were put forward in August which suggested allowing the regulator to issue fines against platforms and websites it judges to have failed to protect users from seeing harmful videos such as those depicting violence or child abuse.
The suggested Ofcom scheme would serve as an interim arrangement until an online harms regulator is appointed under future legislation.
But the latest proposal could extend the regulator’s remit further, with the Government white paper seeking to cover everything from “illegal activity and content to behaviours which are harmful but not necessarily illegal”.
Concerns about the impact of social media on vulnerable people have been heightened by deaths such as that of 14-year-old schoolgirl Molly Russell in 2017, who was found to have viewed harmful content online.
Molly’s father Ian spoke of the urgent need for greater action in an emotional foreword to a Royal College of Psychiatrists report, in which he described the “wrecking ball of suicide” that “smashed brutally” into his family, blaming “pushy algorithms”.
He said of her social media accounts: “Among the usual schoolfriends, pop groups and celebrities followed by 14-year-olds, we found bleak depressive material, graphic self-harm content and suicide-encouraging memes.
“I have no doubt that social media helped kill my daughter.”
Ofcom was established in 2003 and took on duties previously performed by the separate regulators of broadcasting, telecommunications and radio.
Andy Burrows, the National Society for the Prevention of Cruelty to Children’s head of child safety online policy, said: “Any regulator will only succeed if it has the power to hit rogue companies hard in the pocket and hold named directors criminally accountable for putting children at risk on their sites.
“Boris Johnson can protect families and support law enforcement by standing firm against some of the world’s most powerful companies.
“To do that it’s imperative that we have a duty of care model that puts the onus on big tech to prevent online harms or answer to an independent regulator.”
Press Association