Global brands suspend YouTube and Google advertising after ads appear alongside videos exploited by paedophiles
Big brands are pulling advertising from YouTube and Google after their ads were found to be displayed alongside content being exploited by paedophiles.
Despite YouTube promising to take an “even more aggressive stance” against predatory behaviour, the confectionery giants Mars and Cadbury, the supermarket Lidl, Deutsche Bank and Adidas have led a wave of brands removing advertising from YouTube.
According to investigations by BBC News and the Times, tens of thousands of predatory accounts are estimated to be evading protection mechanisms to leave indecent comments on videos of children. Some of the videos are posted by paedophiles, while many are innocently posted by youngsters.
Some of the comments are said to be sexually explicit, while others reportedly encourage children posting the videos to perform sexual acts.
A Mars spokesperson said: “We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content. We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally. Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
Lidl said: “It is completely unacceptable that this content is available to view and it is, therefore, clear that the strict policies which Google has assured us were in place to tackle offensive content are ineffective. We have suspended all of our YouTube advertising with immediate effect.”
Deutsche Bank said: “We take this matter very seriously and suspended the advertising campaign as soon as we became aware of it.”
A Cadbury spokesperson said: “Whilst we investigate this matter we have suspended all advertising on the channel until we have clarity from YouTube on how this situation occurred and are satisfied that an acceptable solution has been put in place.”
A spokesperson for Adidas said: “We recognise that this situation is clearly unacceptable and have taken immediate action, working closely with Google on all necessary steps to prevent this from happening again.”
HP, Diageo and Now TV, which is owned by Sky, are also reported to have pulled their advertising from the video site.
Some of YouTube’s volunteer moderators from the site’s “trusted flagger” scheme told the BBC there could be “between 50,000 to 100,000 active predatory accounts still on the platform”, while another told the Times there are “at least 50,000 active predators” on the site.
As well as trusted flaggers, YouTube also uses algorithms to identify inappropriate sexual or predatory comments. But the system is said to be failing to tackle the problem and paedophiles are continuing to comment on videos of children.
Anne Longfield, the children’s commissioner, said the findings were “very worrying”, while the National Crime Agency said it was “vital” online platforms had robust protection mechanisms in place when they were used by children.
Tony Stower, the public affairs manager for the NSPCC, said sites such as YouTube should not be “marking their own homework”.
“Government intervention is vital to protect children from the moment they sign up to social networks, rather than waiting until social networks deem it the right time to act,” he said.
“We need a set of rules enshrined in law to make social networks design protections into their sites, and we need an independent regulator to enforce those rules. That also means fining social networks when they fail to protect children.”
Ads for several major international brands, including a global sportswear brand and food and drink giants, appeared alongside the videos, raising concerns that they could be indirectly funding child abuse.
YouTube said it had noticed a growing trend of content “that attempts to pass as family-friendly, but is clearly not” in recent months and announced new ways to toughen its approach.
Johanna Wright, vice-president of product management at YouTube, said in a blogpost: “We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors.
“Comments of this nature are abhorrent and we work … to report illegal behaviour to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
Longfield told the BBC: “This is a global platform and so the company need to ensure they have a global response. There needs to be a company-wide response that absolutely puts child protection as a number one priority, and has the people and mechanisms in place to ensure that no child has been put in an unsafe position while they use the platform.”
The Guardian