AI algorithms identify pedophiles for the police — here’s how it works

Dutchman Leon D. was exposed by a church window.

In 2011, he was convicted of molesting two young boys and of producing and distributing abuse imagery.

The window, visible in his imagery, led investigators to a church in the small town of Den Bommel.

Leon D.’s home was right across the street.


Worring and his team developed an algorithm that can recognize objects commonly used in sexual imagery of children.


“Things like beds or teddy bears, for example,” he tells me.

And obviously, children.
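The article doesn’t publish Worring’s model, but the flagging logic it describes can be sketched. Below is a minimal, hypothetical version that assumes some detector has already returned labeled objects with confidence scores; the label list, threshold, and detections are all illustrative assumptions, not the team’s actual system.

```python
# Hypothetical sketch: flag an image when high-confidence detections
# include objects the article names as indicators (beds, teddy bears,
# children). The detector itself is stubbed out as precomputed tuples.

INDICATOR_LABELS = {"child", "bed", "teddy bear"}  # examples from the article
CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff, not from the source

def flag_image(detections):
    """Return the indicator objects found with high confidence."""
    return sorted(
        label
        for label, confidence in detections
        if label in INDICATOR_LABELS and confidence >= CONFIDENCE_THRESHOLD
    )

# Hypothetical detector output for one image:
detections = [("bed", 0.93), ("lamp", 0.88), ("child", 0.81), ("dog", 0.40)]
print(flag_image(detections))  # ['bed', 'child']
```

In a real pipeline the detections would come from a trained object detector; the point here is only that a handful of mundane object classes can act as a first-pass filter before a human ever looks at an image.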

The ambiguity of what counts as abuse poses another challenge.

But with 17-year-olds, most human experts can’t even tell the difference.

Another algorithm by researchers at the University of Amsterdam translates images into text.

“Textual descriptions are more specific because they also convey the relationship between objects,” adds Worring.
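Worring’s point about relationships can be made concrete with a toy example: a bare object list (“teddy bear, bed”) loses information that a sentence keeps. This sketch is an assumption of mine, not the Amsterdam system; it derives a crude spatial relation from bounding boxes and phrases it as text.

```python
# Toy illustration: text can encode spatial relations that a plain
# object list cannot. Boxes are (x, y, width, height) in image
# coordinates with y growing downward; all values are hypothetical.

def relation(box_a, box_b):
    """Very crude vertical relation between two boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    if ay + ah <= by:
        return "above"
    if by + bh <= ay:
        return "below"
    return "next to"

def describe(objects):
    """Turn two labeled boxes into a one-line textual description."""
    (name_a, box_a), (name_b, box_b) = objects
    return f"{name_a} {relation(box_a, box_b)} {name_b}"

print(describe([("teddy bear", (40, 10, 20, 20)), ("bed", (30, 50, 60, 30))]))
# teddy bear above bed
```

Real image-captioning models learn far richer relations than this, but the output format is the same idea: a sentence, not a set of labels.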

Often, these tools are openly available to other parties.

“We are talking about extremely privacy-sensitive information here; we can’t just upload those pictures to Google Cloud.”

And even if their software is completely secure, we would still be sharing data with third parties.

What if someone working for Google decides to make a copy?

“Sometimes the end will justify the means,” adds Duin.

In 2017, 130 Dutch victims could be identified.

Unfortunately, the number of reported cases is also increasing drastically.

This number increased to 12,000 in 2016 and, this year, it will probably reach 30,000.

“The internet has completely changed the playing field,” says Duin.

“These days, we seize hard drives with up to a million files.”

In that sense, technology has been a double-edged sword in the battle against sexual imagery of children.

And despite its abundance on the web, TBKK still employs the same number of officers: about 150 people.

There’s no budget available to expand the team.

Worring predicts future tools will be able to do so by combining different kinds of data.

The way human experts interact with these algorithms will change as well, he adds.

Right now, the algorithm spits out results and a detective checks if these images really depict abuse.

But their feedback isn’t shared with the algorithm, so it doesn’t learn anything.

In the future, the process will be a two-way street where humans and computers help each other improve.
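That two-way street can be sketched in miniature. The toy below is my own illustration, not a description of any police system: a detective’s verdict on each flagged image nudges per-object weights, so labels that keep producing false positives gradually count for less.

```python
# Hypothetical sketch of a human-in-the-loop feedback cycle:
# detective verdicts update a toy linear scorer's per-label weights.

from collections import defaultdict

class FeedbackLoop:
    def __init__(self):
        # Every detected label starts with the same weight.
        self.weights = defaultdict(lambda: 1.0)

    def score(self, labels):
        """Higher score = more likely to be surfaced for review."""
        return sum(self.weights[label] for label in labels)

    def review(self, labels, is_abuse, lr=0.5):
        """A detective's verdict nudges the labels' weights up or down."""
        for label in labels:
            self.weights[label] += lr if is_abuse else -lr

loop = FeedbackLoop()
loop.review(["teddy bear"], is_abuse=False)   # false positive: down-weight
loop.review(["bed", "child"], is_abuse=True)  # confirmed: up-weight
print(round(loop.score(["teddy bear"]), 2))    # 0.5
print(round(loop.score(["bed", "child"]), 2))  # 3.0
```

A production system would retrain a real model on the verified labels rather than adjust hand-rolled weights, but the loop structure — flag, verify, feed the verdict back — is the same.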

Worring isn’t sure that will ever be possible.

“But we can make the system more accurate to minimize exposure,” he adds.

Now that much crime has gone digital, the Dutch police need tech talent more than ever.

Check out the various tech jobs they have to offer.

Story by Anouk Vleugels

