After Facebook, YouTube now details its efforts to curb extremist content online

HIGHLIGHTS

YouTube largely seems to have the same idea as Facebook, although Google is enlisting help from third parties as well.

When it comes to managing extremist content on its platform, YouTube seems to have the same idea as Facebook. Days after Facebook announced its plan to take on extremism on its network, Kent Walker, senior VP and general counsel of Google, explained YouTube’s initiatives on the same front. And the two are remarkably similar, driven by a combination of human and AI elements. Walker’s post, which was originally published as an op-ed in the Financial Times, has now been posted on Google’s blog as well. In it, he explains four steps that YouTube is taking to curb extremist content on the video streaming platform.

First amongst these initiatives is the use of machine learning algorithms to identify and remove extremist videos. “This can be challenging: a video of a terrorist attack may be informative news reporting if broadcast by the BBC, or glorification of violence if uploaded in a different context by a different user,” Walker writes. YouTube has been using video analysis models to “find and assess” terrorism-related content it has removed over the past six months. Walker says YouTube will devote “more engineering resources” and apply the company’s “most advanced machine learning research” to train new “content classifiers”.
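To make the idea of a “content classifier” concrete, here is a minimal, hypothetical sketch in Python using scikit-learn. Nothing in it comes from Google: a production system would learn from video, audio and metadata signals at enormous scale, while this toy trains on a handful of invented text descriptions. What it does illustrate is Walker’s point that context, not just the footage itself, drives the label.

```python
# Illustrative sketch only -- NOT Google's system. Training examples,
# labels and thresholds are all invented for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy examples: similar footage can be labelled differently
# depending on the surrounding context (uploader, framing, intent).
examples = [
    "news report terror attack verified broadcaster eyewitness account",
    "documentary analysis extremist propaganda expert commentary",
    "glorification violence join fight recruitment anonymous upload",
    "martyrdom praise attack celebrate reupload no source",
]
labels = [0, 0, 1, 1]  # 0 = allowed context, 1 = violating context

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(examples, labels)

# Score a new upload; borderline scores would go to human reviewers.
video = "reupload attack footage celebrate recruitment"
prob_violation = model.predict_proba([video])[0][1]
print(f"P(violating) = {prob_violation:.2f}")
if prob_violation > 0.8:        # illustrative threshold
    print("auto-flag for removal")
elif prob_violation > 0.5:
    print("route to human review")
```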

Next, YouTube is going to use human resources to make the “nuanced decisions” involved in removing videos. YouTube will expand its Trusted Flagger Programme for this, and Walker says these flaggers are accurate over 90 percent of the time. Unlike Facebook, YouTube is also enlisting third parties in its effort: the expansion of the Trusted Flagger Programme adds 50 expert NGOs to the 63 organisations already onboard. These NGOs will receive operational grants from YouTube, which will draw on their expertise in flagging and removing extremist content.

The third point in Walker’s op-ed is perhaps a direct result of the advertiser boycott YouTube faced in Europe recently. It involves taking a “tougher stance” on videos that come close to violating the company’s policies without clearly doing so. Such content will appear behind interstitial warnings and won’t be monetised. YouTube will also bar comments on these videos, and users won’t be allowed to endorse them either. Essentially, YouTube is making these videos tougher to find and reducing engagement on them, hoping to curb their spread as well. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” writes Walker.
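To visualise what this “limited state” might look like in practice, here is a hypothetical sketch. YouTube has not published how these restrictions are represented internally, so the field names and rules below are invented purely for illustration.

```python
# Hypothetical "limited state" for borderline videos -- field names
# and behaviour are invented; YouTube exposes no such API.
from dataclasses import dataclass, replace

@dataclass
class VideoFlags:
    monetised: bool = True
    comments_enabled: bool = True
    endorsements_enabled: bool = True   # user likes/recommendations
    searchable: bool = True             # surfaced in search and suggestions
    interstitial_warning: bool = False

def apply_limited_state(flags: VideoFlags) -> VideoFlags:
    """Restrict a video that skirts, but does not clearly violate, policy."""
    return replace(
        flags,
        monetised=False,
        comments_enabled=False,
        endorsements_enabled=False,
        searchable=False,
        interstitial_warning=True,
    )

print(apply_limited_state(VideoFlags()))
```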

Lastly, YouTube is taking more steps towards counter-radicalisation through its Creators for Change programme. Here, the company will redirect users targeted by ISIS and other extremist groups to counter-extremism content instead. “In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages,” Walker writes.
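The mechanics of such a redirect can be sketched in a few lines. Again, this is purely illustrative, with invented search terms and a placeholder playlist URL, and is not the actual system Google deployed.

```python
# Illustrative-only sketch of the redirect idea: match risky search
# queries and surface counter-narrative content instead. The terms
# and the playlist URL below are placeholders, not real data.
from typing import Optional

COUNTER_NARRATIVE_PLAYLIST = "https://youtube.com/playlist?list=EXAMPLE"
RISKY_TERMS = {"join isis", "jihad recruitment"}  # placeholder terms

def ad_for_query(query: str) -> Optional[str]:
    """Return a counter-extremism ad target if the query looks risky."""
    if any(term in query.lower() for term in RISKY_TERMS):
        return COUNTER_NARRATIVE_PLAYLIST
    return None

print(ad_for_query("how to join ISIS"))  # -> the counter-narrative playlist
```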

Prasid Banerjee

Trying to explain technology to my parents. Failing miserably.
