Facebook is launching a UK initiative to train and fund local organisations it hopes will combat extremism and hate speech. The UK Online Civil Courage Initiative's initial partners include Imams Online and the Jo Cox Foundation. Facebook's chief
operating officer, Sheryl Sandberg, said: "The recent terror attacks in London and Manchester - like violence anywhere - are absolutely heartbreaking. No-one should have to live in fear of terrorism - and we all have a part to play in stopping violent extremism from spreading. We know we have more to do - but through our platform, our partners and our community we will continue to learn to keep violence and extremism off Facebook."
Last week Facebook
outlined its technical measures to remove terrorist-related content from its site. The company told the BBC it was using artificial intelligence to spot images, videos and text related to terrorism, as well as clusters of fake accounts.

Facebook explained that it was aiming to detect terrorist content immediately as it is posted and before other Facebook users see it. If someone tries to upload a terrorist photo or video, the systems look to see whether it matches previously identified extremist content, to stop it going up in the first place.
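Facebook has not published implementation details, but matching uploads against previously removed material is typically done by comparing compact fingerprints (hashes) of the media against a database of fingerprints of known content. A minimal sketch of the idea in Python, using a simple perceptual "average hash" and a hypothetical store of known hashes; the names and threshold are illustrative rather than Facebook's:

```python
from PIL import Image

# Hypothetical stand-in for a database of hashes of material that
# moderators have already removed; here it is simply an empty set.
KNOWN_EXTREMIST_HASHES = set()

def average_hash(image_path, hash_size=8):
    """Compute a 64-bit perceptual 'average hash' of an image."""
    img = Image.open(image_path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1, h2):
    """Number of bits on which two hashes differ."""
    return bin(h1 ^ h2).count("1")

def matches_known_content(image_path, max_distance=5):
    """True if an upload is near-identical to previously removed content."""
    h = average_hash(image_path)
    return any(hamming_distance(h, known) <= max_distance
               for known in KNOWN_EXTREMIST_HASHES)
```

In practice such fingerprints would need to be robust to re-encoding, cropping and other edits, which is where more sophisticated matching and the human review described below come in.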
A second area is experimenting with AI to understand text that might be advocating terrorism. This involves analysing text previously removed for praising or supporting a group such as IS and trying to work out the text-based signals that suggest such content may be terrorist propaganda.
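Facebook has not said which models it uses for this. The general approach of learning text-based signals from posts moderators have already removed can be sketched as follows, here with scikit-learn; the function names and training data are placeholders rather than Facebook's code:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def train_propaganda_classifier(removed_posts, ordinary_posts):
    """Fit a classifier on text previously removed for supporting a group
    (label 1) versus ordinary posts that were left up (label 0)."""
    texts = list(removed_posts) + list(ordinary_posts)
    labels = [1] * len(removed_posts) + [0] * len(ordinary_posts)
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),  # word and two-word signals
        LogisticRegression(max_iter=1000),
    )
    model.fit(texts, labels)
    return model

def propaganda_score(model, text):
    """Score from 0 to 1: how strongly a new post resembles removed propaganda."""
    return float(model.predict_proba([text])[0][1])
```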
The company says it is also using algorithms to detect clusters of accounts or images relating to support for terrorism. This involves looking for signals such as whether an account is friends with a high number of accounts that have been disabled for supporting terrorism.
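That "friends with disabled accounts" signal can be pictured as a simple graph statistic: the share of an account's friends that have already been removed for supporting terrorism. A minimal sketch, with hypothetical data-access callables and an illustrative threshold:

```python
def disabled_friend_ratio(account_id, friends_of, is_disabled_for_terror_support):
    """Fraction of an account's friends disabled for supporting terrorism.

    friends_of and is_disabled_for_terror_support are hypothetical callables
    standing in for the platform's own data access."""
    friends = list(friends_of(account_id))
    if not friends:
        return 0.0
    disabled = sum(1 for f in friends if is_disabled_for_terror_support(f))
    return disabled / len(friends)

def flag_for_review(account_id, friends_of, is_disabled_for_terror_support,
                    threshold=0.3):
    """Queue an account for closer review once the signal crosses a threshold.

    The 0.3 threshold is illustrative; in practice a signal like this would
    be one input among many rather than a decision on its own."""
    ratio = disabled_friend_ratio(account_id, friends_of,
                                  is_disabled_for_terror_support)
    return ratio >= threshold
```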
The company also says it is working on ways to keep pace with repeat offenders who create accounts just to post terrorist material and look for ways of circumventing existing systems and controls. Facebook has previously announced it is adding 3,000 employees to review content flagged by users. But it also says that more than half of the accounts it removes for supporting terrorism
are ones that it finds itself. Facebook says it has also grown its team of specialists, so that it now has 150 people working specifically on counter-terrorism, including academic experts on counter-terrorism, former prosecutors, former law
enforcement agents and analysts, and engineers. One of the major challenges in automating the process is the risk of taking down material relating to terrorism but not actually supporting it - such as news articles referring to an IS propaganda
video that might feature its text or images. An image relating to terrorism - such as an IS member waving a flag - can be used to glorify an act in one context or as part of a counter-extremism campaign in another.