Facebook parent company Meta has asked its oversight board to advise on whether its current Covid-19 misinformation policy is still appropriate, as it says the status of the pandemic has “evolved”.
Meta’s president of global affairs, Sir Nick Clegg, said the company wanted guidance on whether its extensive measures to tackle misinformation linked to the virus, introduced in the early days of the pandemic, were still appropriate now that in many places “attempts have been made to return to a more normal life”.
The oversight board was established in 2020 and is able to make binding decisions about Facebook’s content removal actions and policies, even overruling the platform and executives.
The former Liberal Democrat leader and deputy prime minister said tough measures to curb the spread of misinformation were vital during the outbreak of Covid-19, but the social networking firm felt now was the right time to ask whether they “remain the right approach for the months and years ahead”.
“The world has changed a lot since 2020,” Sir Nick wrote in a blog post, adding that vaccination rates were high in many countries, while online tools and resources to identify and dispel misinformation, and to inform people of its risks, were now widespread.
But he acknowledged that this was not the case everywhere, and the company now wants guidance on how best to protect people from harmful content while protecting free speech.
“It is important that any Meta policy is appropriate for the full range of circumstances that countries find themselves in,” he said.
“Meta is fundamentally committed to free expression and we believe our apps are an important way for people to make their voices heard.
“But some misinformation can pose a risk of physical harm, and we have a responsibility not to spread that content. The policies in our Community Standards seek to address this dangerous content while protecting free expression.
“But the inherent tension between free expression and safety is not easy to resolve, especially when faced with unprecedented and fast-moving challenges, as we have been in the pandemic.
“That’s why we are seeking advice from the Oversight Board on this matter. Its guidance will also help us respond to future public health emergencies.
However, online safety campaigners have accused Meta of trying to deflect from what they say was a failure to stop the spread of massive amounts of misinformation during the pandemic.
Calum Hood, head of research at the Center for Countering Digital Hate (CCDH), said: “This move is designed to distract from Meta’s failure to act on the flood of anti-vaccine conspiracy theories spread during the pandemic by opportunistic liars who made millions of dollars by exploiting social media’s massive audience and algorithmic amplification.
“CCDH’s research, as well as Meta’s own internal analysis, shows that the vast majority of anti-vaccine misinformation originates from a small number of highly prolific bad actors.
“But Meta has failed to act against key figures who are still reaching millions of followers on Facebook and Instagram.
“Platforms like Meta should not have absolute power over life-and-death issues that affect billions of people. It’s time for people in the UK and elsewhere to have democratic oversight of life-changing decisions made thousands of miles away in Silicon Valley.”
Sir Nick said Meta’s policies had helped remove Covid-19 misinformation on an “unprecedented scale”, with more than 25 million pieces of content taken down globally since the start of the pandemic.