How Safe Are YouTube Algorithms?
How safe do you find the internet? We surf it around the clock, unconstrained by spatial, cultural, political, or economic boundaries. Even if we set all the familiar cybersecurity risks aside, the internet, and specifically Google's video-sharing platform YouTube, can harm us in ways we can hardly imagine. Our data might be safe and our financial transactions secure, but our minds remain exposed, vulnerable to the worst the internet keeps feeding our contemporary era.
If the content available on YouTube merely lacked noble motives, the situation would be unfortunate but not threatening. The situation, however, is not as harmless as it seems: the countless hours of video uploaded to YouTube carry ulterior motives that go well beyond commercial advertising. While the origins and motives behind such inappropriate content remain obscure, the anguish these videos can cause is considerable.
Let’s consider a utopian model: all searching and browsing is strictly filtered, and there is zero deviance on the consumer’s end. Has the problem disappeared? Have we found our way out of this technical malaise? The answer is a definite ‘no’. Even if morbid user engagement were completely eliminated, the software mechanisms and algorithms on which the platform is built would still contribute to the problem.
Role of YouTube Auto-play
Ever heard of YouTube Autoplay? For the less tech-savvy, it plays videos back to back without requiring you to select the next one manually. The feature has technically always existed, but it was not switched on by default until 2015.
First, let’s understand how keywords work. The search results listed on YouTube are generated by matching the relevant keywords in a query against video metadata. If the query words match the words in a video’s title, that video will show up. For instance, if someone keys in ‘Batman and Catwoman fight’, the video ‘S*xy Batman and Catwoman fight’ would appear, though perhaps only on the tenth page of the results.
But with autoplay switched on, that video can surface automatically as a person keeps watching. Videos are wheeled in one after another, and each new video carries some relevance to the original search keywords. So if a troll or prankster has packaged a video with the same keywords using basic SEO knowledge, it will ultimately hit the screen.
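To see why keyword stuffing works, here is a minimal sketch of relevance ranking based purely on word overlap between a query and a title. This is a hypothetical illustration, not YouTube's actual algorithm, which is far more complex; the point is that any relevance-only measure scores a keyword-stuffed troll upload just as highly as a legitimate one.

```python
def keyword_score(query: str, title: str) -> float:
    """Score a title by the fraction of query words it contains."""
    query_words = set(query.lower().split())
    title_words = set(title.lower().split())
    if not query_words:
        return 0.0
    return len(query_words & title_words) / len(query_words)

def rank_videos(query: str, titles: list[str]) -> list[str]:
    """Return titles sorted by descending keyword overlap with the query."""
    return sorted(titles, key=lambda t: keyword_score(query, t), reverse=True)

titles = [
    "Batman and Catwoman fight scene",
    "Cute cat compilation",
    "S*xy Batman and Catwoman fight",  # keyword-stuffed upload
]
ranked = rank_videos("Batman and Catwoman fight", titles)
# The stuffed title contains every query word, so it scores a perfect
# match, identical to the legitimate video, and a relevance-driven
# autoplay chain has no basis for keeping it off the screen.
```

Under this toy model, both the legitimate clip and the stuffed one score 1.0, while the unrelated video scores 0.0; nothing in the ranking signal distinguishes good intent from bad.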
Problems of Supervision of YouTube Content
YouTube is vast, used by millions of people with little technical knowledge, or none at all. Content management and supervision are difficult at this scale, and not every video can be reviewed by hand. Expanding the employee base is of only limited help. YouTube therefore announced that it would run trials and audits of its existing algorithms to limit the spread of misinformation, fake news, and fabricated data on the platform.
The algorithms were indeed revised, carefully analyzed, and revamped, all to cater to the demands of digital safety and data integrity. By doing so, YouTube was able to deliver safer and more relevant search results, cutting off the most direct routes into its darker corners.
However smart YouTube is, the trolls have proved smarter. They carefully probed the revamp for loopholes, gaming the algorithms to their own ends. So even if you search for the right content, dark, violent, and sexually explicit material is still there.
The more dangerous aspect of YouTube’s algorithms is that they cannot look past the uploader’s claim of what a video contains. An anonymous user might imitate the animation style of famous cartoons and place the characters in scenarios that are simply traumatizing for children. A ten-minute plagiarized Mickey Mouse video can show the famous character committing suicide three minutes in, and it will easily escape the digital eye.
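The blind spot above can be sketched in a few lines. The following is a hypothetical, deliberately naive moderation filter that judges a video solely by its uploader-supplied title and tags; because it never inspects the video frames themselves, a disturbing video labeled with innocuous metadata sails straight through.

```python
# Words a metadata-only filter might block (illustrative, not YouTube's list).
BANNED_WORDS = {"violence", "suicide", "explicit"}

def metadata_filter(title: str, tags: list[str]) -> bool:
    """Return True if the video is allowed, judging ONLY by its metadata."""
    text = " ".join([title, *tags]).lower()
    return not any(word in text for word in BANNED_WORDS)

# A troll labels a disturbing video with child-friendly metadata:
allowed = metadata_filter(
    "Mickey Mouse Fun Cartoon Compilation",
    ["kids", "cartoon", "funny"],
)
# The filter approves the upload without ever examining a single frame,
# which is exactly the failure mode described above.
```

An honest title like "graphic violence clip" would be rejected, but the whole point is that trolls do not write honest titles; any system that trusts the uploader's own description inherits the uploader's intent.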
James Bridle in his Medium post expresses the biggest concern for us all,
“What concerns me is that this is just one aspect of a kind of infrastructural violence being done to all of us, all of the time, and we’re still struggling to find a way to even talk about it, to describe its mechanisms and its actions and its effects.”
A few updates are not enough to resolve the issue. There should also be a system that prevents unwanted content from appearing on YouTube in the first place; the surest way to do that is to cut off the ad revenue for those videos. And YouTube needs not more algorithms but more people to sift through the content.
These cynical times call for more active prohibition of explicit content, efficient cyber policing, and dire consequences for offenders.