Extremism: YouTube wants to take tougher action against violence

About 98 percent of the videos deleted from the platform are flagged by self-learning machines. YouTube is now investing in more staff and infrastructure.


Google's video platform YouTube is responding to criticism and political pressure with tougher action against violence and extremism. Among other things, the number of people reviewing content will increase to 10,000 in the coming year, announced YouTube CEO Susan Wojcicki.

The portal has been misused to "mislead, manipulate, harass, or even inflict suffering," said Wojcicki. In the meantime, her company has developed technology to detect videos with extremist content or content that could endanger children's safety.

Machine learning helps human reviewers remove almost five times as many videos. Since June, 150,000 videos have been deleted for violent extremism; 98 percent of them were flagged by the self-learning systems.

Machines replace 180,000 employees

This allows YouTube to delete such content faster. Nearly 70 percent of it was removed within eight hours of upload, and almost half within two hours. Since June, the algorithms have processed a volume of videos that would otherwise have required 180,000 people working 40-hour weeks.

YouTube has been put under pressure by advertisers after their ads landed in the context of extremist videos and paedophile comments on videos featuring children. Wojcicki announced "a new approach to advertising on YouTube" so that "ads only run where they are supposed to run". Among other things, there should be more human oversight alongside the algorithms and a more careful examination of which channels and videos are eligible for advertising.

In Germany, the provisions of the so-called Network Enforcement Act, which requires the rapid deletion of banned content such as hate speech and violence, also take effect in January.

IS content is quickly found

Facebook also uses machine learning and automatic detection systems to find terrorist content. For example, 99 percent of photos, videos, and texts related to IS and al-Qaeda are discovered before publication, says the company. It is more difficult for other terrorist groups: the many languages in which their content is disseminated make it harder to find. That the automatic systems could also be used against regionally active terrorist groups seems unlikely at present.

More than 50 companies in the Global Internet Forum to Counter Terrorism (GIFCT) also exchange so-called hashes of terrorist content, something like digital fingerprints. Facebook also uses this technique to find revenge pornography. In addition, external organizations such as Flashpoint, the Middle East Media Research Institute (MEMRI), the SITE Intelligence Group, and the University of Alabama search for and report terrorist content on Facebook.
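The idea behind such hash sharing can be sketched in a few lines: each company computes a fingerprint of known banned material and checks new uploads against a shared database of those fingerprints. The sketch below is a simplified illustration, not GIFCT's actual system; the database contents and function names are hypothetical, and it uses a cryptographic hash (SHA-256), which only matches byte-identical files, whereas real deployments rely on perceptual hashes such as PhotoDNA that survive re-encoding and cropping.

```python
import hashlib

# Hypothetical shared database of fingerprints of known banned content,
# of the kind GIFCT members might exchange. A cryptographic hash like
# SHA-256 only catches byte-identical copies; production systems use
# perceptual hashes that tolerate re-encoding.
shared_hash_db = {
    hashlib.sha256(b"bytes of a known banned clip").hexdigest(),
}

def is_known_banned(upload_bytes: bytes) -> bool:
    """Fingerprint an upload and look it up in the shared database."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in shared_hash_db

print(is_known_banned(b"bytes of a known banned clip"))  # True
print(is_known_banned(b"an unrelated cat video"))        # False
```

Because only hashes are exchanged, companies can block re-uploads of known material without sharing the files themselves.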

Date Of Update: 06 December 2017, 12:04