Following recent pressure on social media platforms’ efforts to reduce extremist content, YouTube has outlined the steps it is taking to confront the issue.
In an op-ed published in the Financial Times, Kent Walker, senior vice-president of Google, writes that YouTube has been working in tandem with law enforcement agencies and government officials to tackle extremist content, and that it has invested heavily in systems to support that mission.
Walker also acknowledged that more work is needed across the industry, and said that any developments must be made as quickly as possible.
The first of YouTube’s new plans is to expand its automated systems so they can more easily identify extremist or terror-focused videos.
YouTube plans to use machine learning to “train new ‘content classifiers’ to help [YouTube] more quickly identify and remove such content.”
Alongside this, the company will also increase the number of ‘Trusted Flagger’ users, a group with special privileges that allow them to review any content flagged as violating the site’s guidelines.
The company will almost double the number of people taking part in the program “by adding 50 expert NGOs to the 63 organisations who are already part of the programme,” which Google will fund by providing additional grant money.
It is hoped that the expansion of the program will allow for greater specialisation in finding unwanted content, such as material promoting self-harm and terrorism.
YouTube will also be taking a much firmer stance on videos that, while not necessarily violating the site’s community standards, “contain inflammatory religious or supremacist content”.
Videos of this nature will not be removed, but will come with a warning and will earn no advertising revenue, building on the company’s changes to its ad policy from earlier this year.
The company will also make further use of its ‘Creators for Change’ program, which will redirect users who have been targeted by extremist groups to counter-extremist content.
“This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining,” Walker writes.
Walker also explains how YouTube is working with other large social media companies, including Twitter and Facebook, to develop new techniques to stop terrorist content from being available online.
The call for increased regulation has been widespread across Europe, with Germany considering imposing large fines against any social media companies that do not remove extremist content.