Paedophiles are using YouTube as a shop window to advertise abused children to other predators before exchanging underage pornography, it has been claimed.

An investigation by The Times found numerous suggestive videos of boys and girls on Google's video sharing platform alongside details of how to see them nude via encrypted messaging apps such as Telegram and Wickr.

It has led the NSPCC to condemn the content as "despicable" and to demand YouTube do more to pull the videos down.

The findings are just the latest in a long line of bad press for Google as serious questions are raised over the content its platforms host.

The Times investigation found that one Brazilian YouTuber, called Samara Santos, had been allowed to upload a dozen short videos to the site this month alone, showing girls under the age of 10 suggestively licking their lips or dancing.

One of the more brazen clips even had a masked girl saying: "Hey guys, I got new underwear."

All the videos were emblazoned with Santos's email address; when contacted, he boasted of having 315 gigabytes of high-definition video available – equivalent to 13 days' worth of footage.

Asked by an undercover reporter at The Times whether the videos showed girls dancing or some other content, he replied: "No dancing, just naked whatsapp webcam exclusive and paid."

Another paedophile under the username "HornyPastor" was also allowed to upload numerous suggestive videos of children to YouTube, including one called, "12 yr old Nancy twerking in grey outfit".

Tony Stower, NSPCC head of child safety online, condemned the content.

"It is absolutely despicable that abusers are able to blatantly use YouTube as a shop front for the distribution of indecent videos," he said.

"But YouTube must take responsibility. YouTube must step up measures to proactively remove known abusers and block indecent content using simple search terms. It must immediately act to remove these dangerous users from its platform today."

YouTube said content that endangered children was "abhorrent" and that the company worked aggressively to take down such videos.

It said it had removed more than 150,000 videos in recent weeks and worked with third parties to prevent child sexual abuse imagery from being uploaded.

It added that it was hiring more people to help combat child abuse on its platform and working on technological solutions to "detect this type of content quickly and at scale".