3 July 2017

Facebook's secret weapon for fighting terrorists: Human experts and AI working together

By Conner Forrest 

Facebook has declared that it is actively fighting terrorism online, and it is using artificial intelligence (AI) to do so. In a Thursday blog post, the company detailed its strategy for removing terrorist content from Facebook and how it's working to protect users from such material.


The post said that radicalization typically occurs offline, but there's no denying that the internet is a major communication channel for terrorist groups around the world. The Islamic State (ISIS) is thought to maintain hundreds of social media accounts and even run recruiting drives on those platforms.



It's a massive problem, and Facebook wants to help solve it.


"We remove terrorists and posts that support terrorism whenever we become aware of them," the blog post said. "When we receive reports of potential terrorism posts, we review those reports urgently and with scrutiny. And in the rare cases when we uncover evidence of imminent harm, we promptly inform authorities."


With billions of users speaking some 80 languages, the post noted, the challenge is enormous. But Facebook said it believes that AI can act as a solution.



One of the ways the technology can help is by matching images and videos to known terrorist content. The hope is that the company will be able to prevent other accounts from uploading a photo or video that was previously removed from the site for its ties to terrorist activity, the post said.
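
The post doesn't explain how that matching works under the hood, but a common approach is perceptual hashing: fingerprint each removed image, then compare new uploads against a blocklist of those fingerprints. Below is a minimal Python sketch of that idea, using a simple average hash and a hypothetical KNOWN_BAD_HASHES blocklist; Facebook's actual system is not public and is certainly far more sophisticated, especially for video.

```python
# Minimal sketch of hash-based re-upload matching (not Facebook's actual system).
# Assumes a simple average-hash fingerprint; production systems use far more
# robust perceptual hashing and video fingerprinting.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Compute a 64-bit average hash: grayscale, shrink to 8x8, threshold at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical blocklist of hashes from previously removed content.
KNOWN_BAD_HASHES = {0x0F0F0F0F0F0F0F0F}


def matches_known_content(path: str, threshold: int = 5) -> bool:
    """Flag an upload if its hash is within `threshold` bits of any known hash."""
    h = average_hash(path)
    return any(hamming(h, bad) <= threshold for bad in KNOWN_BAD_HASHES)
```

The distance threshold is what makes this more than exact matching: a re-uploaded copy that has been cropped, resized, or recompressed can still land within a few bits of the original fingerprint and get caught.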



Facebook's systems also look out for language. Text that praises extremist groups, or seems to promote the work of terrorist organizations, can be recognized and flagged for removal. The site also uses signals to determine whether a particular page serves as a central hub for a cluster of terrorist accounts so it can remove the page, the post said.
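
Again, the post doesn't describe the model behind this language understanding, but the general shape of text-based flagging can be sketched with an off-the-shelf classifier. The example below is a rough sketch assuming a TF-IDF plus logistic regression pipeline and a hypothetical labeled training set; it routes high-scoring posts to human review and is not Facebook's actual model.

```python
# Minimal sketch of text-based flagging (not Facebook's actual model).
# Assumes a generic TF-IDF + logistic regression classifier trained on a
# hypothetical labeled dataset; real systems use far richer language models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: (post text, 1 = flag for review, 0 = leave alone).
train_texts = [
    "post praising a banned extremist group",    # placeholder example
    "post sharing propaganda for a terror org",  # placeholder example
    "photo of my dog at the park",
    "recipe for vegetable soup",
]
train_labels = [1, 1, 0, 0]

# Fit the classifier on the labeled examples.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)


def flag_for_review(text: str, threshold: float = 0.8) -> bool:
    """Route a post to human reviewers if the model's score exceeds the threshold."""
    score = model.predict_proba([text])[0][1]
    return score >= threshold
```

A high threshold like this reflects the division of labor the post describes: the model surfaces likely violations, and human reviewers make the final call on removal.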



According to the post, Facebook is also working harder to eliminate fake accounts used to circumvent the site's policies, and it is attempting to tackle terrorist activity on WhatsApp and Instagram as well.



AI isn't the only solution; people are also a big part of Facebook's anti-terrorism strategy. In addition to reports and reviews from its Community Operations team, Facebook employs some 150 counter-terrorism experts, including academics, former prosecutors, former law enforcement agents, analysts, and engineers, the post said. And if a threat is imminent, a separate Facebook team communicates with law enforcement.



Additionally, specialized training, partner programs, industry collaboration, and government partnerships all play a role in Facebook's work against terrorists online.

The 3 big takeaways for TechRepublic readers
Facebook is fighting terrorists online, using a combination of AI and human experts to flag content for removal and protect users.
Image matching, language understanding, eliminating fake accounts, and taking down terrorist cluster pages are all a part of Facebook's plan.
Facebook also employs 150 counter-terrorism experts, along with its Community Operations team, to add human expertise to its strategy.
