Youngsters aged 11-13 accounted for nearly half of all child abuse images flagged by the UK organisation responsible for identifying and removing such content from the internet last year, an annual report has shown.

Figures published by the Internet Watch Foundation (IWF) also showed hundreds of babies were subjected to the worst abuse, while girls featured far more often than boys.

Nearly three in 10 (29%) of the web pages taken down by the IWF contained self-generated imagery – content created using webcams and shared online, often after children have been groomed.

The IWF flagged 130,915 images to be removed from the internet last year, up from 103,529 in 2018, and significantly higher than the 76,939 in 2017.

The vast majority of those images were hosted in the Netherlands, where digital freedoms are much greater than in the UK.

Susie Hargreaves, IWF chief executive, told the PA news agency: “The key issue for us in this year’s annual report is not just that numbers have gone up across the board but that nine out of 10 URLs are in Europe, the majority of it in the Netherlands.

“If we could just address that by all working together we would have an immediate impact on the amount of child sex abuse available in the world.”

Of last year’s images, 63,533 (48%) were found to contain children in the 11-13 age group, while 45,744 were in the 7-10 age category.

Some 1,609 images last year featured babies and toddlers aged 0-2, with 71% of those deemed category A – the most serious classification.

Similarly, among the 3-6 age group there were more category A images (6,218, or 41% of the total for that age range) than images in any other category.

Overall, around one in five images (27,005) across all age ranges was deemed category A – depicting rape or sexual torture – up from 23,879 the previous year.

The data also showed that 92% of all images featured girls, while a further 3% showed both girls and boys.

Of the self-generated content, three in every four images (76%) showed a girl aged 11 to 13 – a figure the IWF said was at risk of rising during lockdowns, when children are likely to spend more time in their bedrooms on computers.

Ms Hargreaves said: “It’s a massive issue and one we have been raising awareness of, particularly during coronavirus.

“Those children are so young, they’re so vulnerable to being coerced and tricked into sharing sexual images and engaging in sexual activities on webcams, and they don’t have the emotional wherewithal to know they are being exploited.

“By the time that is videoed and recorded it is on a child sexual abuse website which is where we see it.

“It’s a sad fact that the younger the child, the worse the level of abuse.”

The figures also showed that nine out of 10 images analysed in 2019 were hosted in Europe (89%), with the Netherlands alone hosting 71% of images worldwide, followed by Slovakia (6%) and the US (5%).

Fewer than 1% of images were hosted in the UK.

Ms Hargreaves said: “It is not necessarily that the Netherlands is a bad country or anything.

“The big issue is that they don’t take it down, it’s tied into digital freedoms and digital rights, and the fact you have to get a court order to take it down – that’s tens of thousands of trips to court.

“We also have an absolutely zero-tolerance attitude to it in the UK.

“They (the Netherlands) recognise they have a major issue and will do what they can to revise the legislation.

“I am confident something will happen but it needs to happen quickly.

“If every country stood up to it like we did in the UK, it would have nowhere for it to go.”

Andy Burrows, head of child safety online policy at the NSPCC, said: “Hundreds of thousands of child abuse images continue to be accessed by offenders in the UK and more needs to be done to tackle the demand for and circulation of this terrible material.

“Tech firms should be required to proactively scan files being uploaded to cloud devices to identify and remove child abuse images before they can be shared more widely.”