Facebook confirms tests of a new anti-extremism warning prompt
Facebook is testing prompts that link users to anti-extremism support and resources if the company believes they know someone who could be on the path to extremism, or if they have been exposed to extremist content themselves, according to a report by CNN Business.
In a statement to The Verge, a Facebook spokesperson said that the test is part of the company’s “larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk.”
Facebook says it will continue to remove extremist content that violates its rules, though the company has struggled to track down and remove such content, even from groups it has actively tried to kick off the platform. Facebook has long faced scrutiny from the public and lawmakers, many of whom say its algorithms divide people and push them toward extreme ideologies, something the company itself has acknowledged.
Facebook asks: Are your friends becoming extremists?
July 1 (Reuters) – Facebook Inc (FB.O) is starting to warn some users they might have seen “extremist content” on the social media site, the company said on Thursday.
Screenshots shared on Twitter showed a notice asking “Are you concerned that someone you know is becoming an extremist?” and another that alerted users “you may have been exposed to harmful extremist content recently.” Both included links to “get support.”
The world’s largest social media network has long been under pressure from lawmakers and civil rights groups to combat extremism on its platforms, including U.S. domestic movements involved in the Jan. 6 Capitol riot, when groups supporting former President Donald Trump tried to stop the U.S. Congress from certifying Joe Biden’s victory in the November election.
Facebook said the small test, which is running only on its main platform and only in the United States, is a pilot for a global approach to preventing radicalization on the site.
“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” said a Facebook spokesperson in an emailed statement. “We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”