How innocent photos of children have been exploited on Twitter
Despite attempts by social networks to clamp down on child porn, some Twitter users have been swapping illegal images and have used tweets to sexualise otherwise innocent photos.
They begin as innocuous selfies or pictures taken by friends or family members. But in the eyes of a small cohort of warped Twitter users, they become something else entirely.
"The pictures are usually young girls in their school uniform or a swimsuit," says Joseph Cox, a freelance journalist writing for Motherboard, part of Vice News. "Some have been taken by the girls themselves. It's not clear whether they've then sent them to a boyfriend who's uploaded them… others appear to have been ripped from their social media sites."
Cox's investigation into this underground world started with a search of one hashtag which threw up one of the otherwise innocent-looking photos.
"Users were asking to trade pictures of similar aged girls and they were commenting on her appearance and how attractive they found her," he says. "Some of the comments did get very explicit."
The pictures themselves are not pornographic but Twitter's guidelines are clear: child sexual exploitation isn't tolerated.
Its policy on the issue states: "When we are made aware of links to images of or content promoting child sexual exploitation they will be removed from the site without further notice." In addition, users face a permanent ban for promoting child sexual exploitation. Most of the posts that Cox found were later taken down by Twitter.
But the murky world of comments and replies is not the only exploitation problem on the social network. One American woman who spoke to BBC Trending said she uncovered a huge amount of child pornography on Twitter after reading rumours about it on Reddit.
"There was a minimum 14,000 accounts involved in the creation, distribution or retweeting of child porn," says Molly (not her real name).
The victims? "Girls as young as five, and definitely under 15."
Molly, a game developer, says that once she started probing, it didn't take her long to uncover the network and see that sexual images of children were being swapped with startling openness.
Once she had found one account, "you click on their retweets and that opens up more accounts and it creates this rabbit hole where you just keep finding more and more child porn," says Molly.
Some of the images, she believes, appear to be produced by paedophiles, while others are nude selfies that young people have texted to one another.
Molly says she's reported the images to Twitter and the US Federal Bureau of Investigation. In addition, she publicly named and shamed the users sharing child porn. Twitter, she says, reacted quickly, shutting all the accounts down.
Other social networks have also clamped down on sexualised or sexual images of children. In 2011, Reddit faced a similar issue with a forum called "Jailbait" - although many of the images shared there weren't illegal, the site closed it down.
Cox, the journalist, says he was surprised by how openly people were talking about child exploitation in the posts he viewed.
"One account even ran a poll asking its followers: 'Hey how old are you?'" he says. "The vast majority said over 20 years old, so if that's accurate these are adults communicating and looking at these pictures."
In a statement, Twitter told BBC Trending: "We do not tolerate child sexual exploitation," and said it works with authorities and organisations including the Internet Watch Foundation in the UK to combat the exploitation of children.
Reporting by Sam Judah