After facing a major backlash from several advertisers, YouTube has finally responded to the issue of ads appearing on “sexualized videos of children” — videos that had drawn comments from suspected paedophiles.

YouTube says it has terminated more than 270 such accounts, removed more than 150,000 unsuitable videos, and turned off comments on over 625,000 videos that drew child predators. The video-streaming giant, which gave the statement to VICE News, has also withdrawn ads from videos that were erroneously presented as family-appropriate.

Here’s what YouTube said in a statement:

"Over the past week we removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content. Content that endangers children is abhorrent and unacceptable to us."

This is not the first time YouTube has faced backlash over its videos and ad policies. Volunteer moderators have claimed in the past that the tools used to moderate comments are ineffective, and that inappropriate comments from child predators were still being posted from between 50,000 and 100,000 accounts.

Disturbing suggestions in autofill

YouTube was also called out for auto-filling search results with disturbing paedophilic terms — for example, when a user typed something like “how to”, the autocomplete generator came up with suggestions such as “have s*x kids” and “have s*x with your kids”. These results were first reported by BuzzFeed after users spotted them on the site. However, when we searched, we did not find these terms turning up in the suggestions.

YouTube says that the issue is under investigation. “Earlier today our teams were alerted to this awful autocomplete result, and we worked to quickly remove it. We are investigating this matter to determine what was behind the appearance of this autocompletion,” the company said in a statement given to BuzzFeed News.

In a previous blog post, Google explained that predictions are based on the popularity and freshness of search terms. The company also clarified that its search algorithm filters out offensive, hurtful, or inappropriate queries. After the recent backlash from advertisers, YouTube said it would take action on videos that attract paedophiles.