This year, YouTube has been caught up in several scandals involving videos of children that attracted pedophile comments, as well as inappropriate videos appearing on YouTube Kids.
According to a report by BuzzFeed, YouTube gives its human moderators, who are responsible for training the platform's Artificial Intelligence (AI), a very confusing manual, which allows inappropriate videos to reach children.
As a result, these distortions can allow inappropriate content to rank well in search results. As one moderator put it:
"Even if a video is disturbing or violent, we have to flag it, but we still have to say it's high quality."
The moderator even mentioned a child exploitation video that racked up many views because it was well produced and professionally edited, exactly the qualities YouTube's moderation policy considers ideal for content featured on the platform.
Finally, the moderator cites another example. According to him, YouTube asks moderators to flag videos as "OK" or "Not OK" for children ages 9 to 12. Taylor Swift's "Bad Blood" music video was not approved, while videos showing animal suffering end up passing.
"An example of what I call a 'value misalignment.' Controversial and extreme content - video, text or news - spreads better and therefore leads to more views, more platform usage and increased revenue," said Cornell University artificial intelligence professor Bart Selman.
As we well know, YouTube's problems with violent videos are nothing new. Recently, the company even said that it would step up its review of this type of content.
We hope that in 2018 the platform will manage to block this kind of content and prevent it from being made available.