Video site YouTube – owned by tech titan Google – has found itself at the centre of scandal over its dated and ineffective algorithms. On one side, users are complaining that innocent content has been hidden by YouTube’s “restricted mode”, while on the other, big brands have been left red-faced by the revelation that their ads are being shown alongside inappropriate videos.
In the first of the two criticisms, YouTube’s restricted mode filter has seemingly been blocking content that its creators – and members of the wider YouTube community – do not consider inappropriate for children. While the filter has to be actively switched on (meaning such content is not hidden by default), it’s not just individual videos that vanish from view when it’s in place – in some cases, entire channels are affected.
Restricted mode’s issues were first brought to light by the LGBTQ+ community, after videos that contained no objectionable content (i.e. violence, nudity or bad language) were blocked by the filter. Content from other categories – such as video games – was also found to be affected, with some adult content still available and some child-friendly content blocked.
YouTube’s algorithms are meant to scan videos, along with their tags and titles, to decide what constitutes “adult content” – but after claiming that “[only] videos that discuss more sensitive issues” were blocked under the filter, YouTube’s VP of Product Management, Johanna Wright, later admitted that “This feature isn’t working the way it should… and we’re going to fix it.”
Aside from YouTube’s issues with vloggers, algorithms have also proved problematic when it comes to placing ads on the site – whether banners shown during streamed content or commercials played before the main video. The algorithms depend on keyword filters and visual recognition software, but uploaders of extremist content are often creative with the tags they use. As a result, big brands (such as M&S and Audi, and even the UK government) pulled their ads when it emerged that they appeared to be endorsing racist, homophobic and even terrorist content.
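To see why keyword-based filtering is so easy to defeat – and why it produces both kinds of error described above – consider a minimal sketch of the general technique. This is purely illustrative: the blocklist, function name and example videos are invented for the demonstration, not YouTube’s actual system.

```python
# Illustrative sketch of a naive keyword blocklist filter, of the general
# kind the article describes. The blocklist and sample videos are invented
# examples, not anything from YouTube's real moderation pipeline.

BLOCKLIST = {"violence", "extremist", "nudity"}

def is_restricted(title, tags):
    """Flag a video if any blocklisted word appears in its title or tags."""
    words = set(title.lower().split()) | {t.lower() for t in tags}
    return bool(words & BLOCKLIST)

# False positive: an innocent video mentions a blocklisted word.
print(is_restricted("A history of violence in cinema", ["film", "essay"]))
# → True (blocked, despite being harmless discussion)

# False negative: a misspelled tag slips straight past the filter.
print(is_restricted("Watch this now", ["v1olence", "edgy"]))
# → False (allowed, despite the evasive tag)
```

Exact word matching means a creator discussing a sensitive topic is blocked, while an uploader who simply swaps a character in a tag is not – which is why keyword filters alone cannot keep ads away from extremist content.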
Google is promising to step up its efforts to police the site, including introducing a new “loyalty scheme” in which users are rewarded for flagging inappropriate content and cases where ads appear in the wrong context – but with the sheer volume of content uploaded every minute (estimated at around 400 hours), let alone the amount already on the site, this seems an impossible task for any human team to undertake.