The $300 system in the fight against illegal images
A security researcher has built a system for detecting illegal images that costs less than $300 (£227) and uses less power than a lightbulb.
Christian Haschek, who lives in Austria, came up with the solution after he discovered an image showing child sex abuse had been uploaded to his image hosting platform, PictShare.
He called the police, who told him to print it out and bring it to them.
However, it is illegal to possess images of child abuse, whether digitally or in print.
"Erm... not what I planned to do," Mr Haschek said.
Instead he put together a homegrown solution for identifying and removing explicit images.
Mr Haschek used three Raspberry Pis powering two Intel Movidius sticks, which can be trained to classify images. He also used Yahoo's open source NSFW (Not Safe For Work) model for identifying explicit material, which is available free of charge.
He set it to flag any image the system judged with 30% or more certainty to contain pornography. He said he set the threshold deliberately low so as to be sure not to miss anything.
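The low-threshold step can be sketched in a few lines; this is a minimal illustration only, with hypothetical function names and scores, not Mr Haschek's actual code (the real system runs Yahoo's NSFW model on the Movidius hardware to produce the scores):

```python
# Deliberately low cut-off: borderline images are flagged rather than missed.
THRESHOLD = 0.30

def flag_for_review(scores):
    """Return the IDs of images whose NSFW probability meets the threshold."""
    return [image_id for image_id, score in scores.items() if score >= THRESHOLD]

# Hypothetical probabilities (0.0 to 1.0) as a classifier might return them.
uploads = {"cat.jpg": 0.02, "beach.jpg": 0.35, "unknown.png": 0.91}
flagged = flag_for_review(uploads)  # these would be queued for human review
```

A 30% threshold means many harmless images get flagged too, which is the trade-off Mr Haschek accepted: false positives cost review time, but false negatives would leave illegal material online.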
He has since discovered 16 further illegal images featuring children on his platform, all of which he reported to Interpol and deleted.
He then contacted a larger image hosting service, which he declined to name, and found thousands more by running images uploaded to their platform through his system as well.
"When I first started working on my open source image hosting service PictShare I didn't think anyone but myself would use it," Mr Haschek said on his blog.
"Over the years the usage has increased and with increased usage of a site where you can upload images anonymously, there will be those who upload illegal things.
"There are thousands of images on PictShare - I can't look them through even in a year so I had to think of something else."
Prof Alan Woodward from the University of Surrey said Mr Haschek's project was encouraging.
"Law enforcement agencies around the world are struggling to find this horrible material and have it taken down. Sadly, the police have to work with tech firms and that takes time," he said.
"I like the idea that this particular site has taken responsibility and found a solution that mitigates the problem.
"The scale of the problem faced by the large tech firms is admittedly enormous and although these solutions could be scaled up, it takes money and effort. However, where there's a will there's a way."