The UK government is proposing new rules that would make internet companies legally responsible for unlawful and damaging content.
Tech executives could be hit with substantial fines and criminal penalties under the proposal unveiled Monday.
The government said that an independent regulator would be created to enforce the new rules, which focus on removing content that incites violence, encourages suicide or constitutes cyber-bullying. Content related to terrorism and child abuse would face even stricter standards.
The regulations would create a statutory “duty of care” for social media companies such as Facebook and Twitter to protect young people who use their sites. The rules would be overseen by an independent regulator funded by a levy on internet companies.
“No one in the world has done this before, and it’s important that we get it right,” Media Secretary Jeremy Wright told the BBC. “And I make no apologies for the fact that we will put forward proposals here, which we believe are the right way to approach this, but we will then listen to what people have to say about them.”
A 12-week consultation will now take place before the draft bill is published.
While the United States has largely relied on market forces to regulate content in a country where free speech is revered, governments in Europe have signaled they are willing to take on the tech companies to block harmful content and prevent extremists from using the internet to fan the flames of hatred.
‘Clean up their act’
Britain’s Home Secretary, Sajid Javid, criticized tech firms for failing to act despite repeated calls for action against harmful content.
“That is why we are forcing these firms to clean up their act once and for all,” Javid said.
Facebook’s UK head of public policy, Rebecca Stimson, said the goal of the new rules should be to protect society while also supporting innovation and freedom of speech.
“These are complex issues to get right and we look forward to working with the government and Parliament to ensure new regulations are effective,” she said.
Wright insisted the regulator would be expected to take freedom of speech into account, balancing it against the need to prevent harm.
“What we’re talking about here is user-generated content, what people put online, and companies that facilitate access to that kind of material,” he said. “So this is not about journalism. This is about an unregulated space that we need to control better to keep people safer.”
In the US
Sparks will fly when reps from Facebook and YouTube appear before the House Judiciary Committee on Tuesday on a panel about the rise of white nationalism through social media with reps from groups like ADL and… Candace Owens. As Oliver Darcy pointed out, Owens recently had to clarify comments she made about Hitler.
And on Wednesday reps from some of the social media companies will be back on the Hill for a Senate hearing titled “Stifling Free Speech: Technological Censorship and the Public Discourse.” Expect a lot of talk about perceived algorithmic anti-conservative bias.