YouTube has an extremist content problem even more serious than the one it thinks it is solving

The last time we talked about Robert Kyncl, YouTube's Chief Business Officer, it was in connection with the Logan Paul scandal and what the platform planned to do about certain content. Kyncl is still trying to clean up the company's image, and in an interview with The Hollywood Reporter he has offered more interesting details about this controversy.

Remember that YouTube, like Google, is a platform heavily based on advertising. For this reason, the company is striving to regain advertisers' confidence by trying to prevent ads from appearing alongside videos that could be considered inappropriate.

Every minute, 400 hours of content are uploaded to YouTube, so it is practically impossible to review everything that is published by hand. Thanks to its algorithms, Kyncl claims, the company can root out extremist content far more easily:

“Between July and December 2017, the algorithms carried out the work of 180,000 people working 40 hours a week, exclusively on extremist videos.”

Kyncl says that “when everyone has a chance to have a voice, some of these people may not have good intentions.” On the case of Logan Paul in particular, Kyncl states that after pausing his “original projects” and removing him from Google Preferred, Paul “has been cooperating in the process” so he can fit back into the platform's ecosystem.

He acknowledges that “they should have done better, being quicker in the response,” but he did not want to clarify whether this will be the method used to punish YouTubers who emulate this kind of behavior.

Power to the algorithm

Every time you open YouTube, the home page greets you with a large selection of content that might interest you. Obviously, it is based on your tastes: channels you are subscribed to, videos you have watched, videos you have liked, and so on.
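To picture how recommendations of this kind can be built from those signals, here is a minimal sketch in Python. It is purely illustrative and says nothing about YouTube's real system: the topic tags, weights and function names are all hypothetical.

```python
from collections import Counter

# Hypothetical signal weights: subscriptions count more than likes,
# likes more than plain views. The values are arbitrary, not YouTube's.
SIGNAL_WEIGHTS = {"subscribed": 3.0, "liked": 2.0, "watched": 1.0}

def build_taste_profile(signals):
    """Aggregate topic counts from user activity into a weighted profile."""
    profile = Counter()
    for kind, topics in signals.items():
        for topic in topics:
            profile[topic] += SIGNAL_WEIGHTS[kind]
    return profile

def rank_candidates(profile, candidates):
    """Order candidate videos by how much their topics overlap with the profile."""
    scored = [
        (sum(profile[t] for t in topics), video)
        for video, topics in candidates.items()
    ]
    return [video for score, video in sorted(scored, reverse=True)]

# Example: a user who mostly watches political rally clips.
signals = {
    "subscribed": ["politics"],
    "liked": ["politics", "rallies"],
    "watched": ["politics", "rallies", "news"],
}
candidates = {
    "rally_highlights": ["politics", "rallies"],
    "cooking_show": ["food"],
    "fringe_commentary": ["politics", "conspiracy"],
}
print(rank_candidates(build_taste_profile(signals), candidates))
```

Even this toy scorer keeps surfacing more of whatever the user already consumes, which is exactly the self-reinforcing loop the rest of the article describes.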

In fact, The New York Times harshly criticized YouTube just a few days ago, calling the platform a “great radicalizer” of our society. The author of the article, Zeynep Tufekci, experimented with the YouTube algorithm and found that the recommendations kept pushing ever more extreme versions of the same story.

During the 2016 elections she had to watch several videos of Donald Trump rallies. After that, YouTube started recommending far-right videos, including among the videos that play automatically.

Radicalized recommendations that seek to attract visitors and pull you into a repetitive loop.

Since she did not normally consume that kind of content, she decided to create another YouTube account and watch rallies by Bernie Sanders and Hillary Clinton. In this case, the recommendations were related to conspiracy theories about the government's involvement in the attacks of September 11, 2001.

As we can see, in both cases YouTube recommended thoroughly extreme content, on both the left and the right. Perhaps this type of controversial video serves to grab users' attention, in an attempt to keep them on the platform for as long as possible.

In addition, users may feel stuck in a bubble, something that also applies to social networks like Facebook. The goal is for advertising and recommended content to match our tastes more closely, but measures of this kind could end up dividing society even further.

Kyncl maintains that YouTube operates under four freedoms: freedom of expression, freedom of opportunity, freedom to belong and freedom of information. When asked whether the company makes decisions based on political ideas, the CBO defends himself by stating that YouTube “does not try to position itself on one side or the other”.
