Reddit has become the latest social-media platform to admit that Russian propaganda was used on its site during the 2016 US presidential election.
It follows a report by news site The Daily Beast showing a Russian troll farm was active on the website.
Co-founder Steve Huffman said that it had removed "a few hundred accounts" suspected of being of Russian origin.
In a blogpost, he said "indirect propaganda", which was more complex to spot and stop, was the biggest issue.
"For example, the Twitter account @TEN_GOP is now known to be run by a Russian agent. Its tweets were amplified by thousands of Reddit users, and sadly, from everything we can tell, these users are mostly American and appear to be unwittingly promoting Russian propaganda."
Mr Huffman added: "I believe the biggest risk we face as Americans is our own ability to discern reality from nonsense, and this is a burden we all bear.
"I wish there was a solution as simple as banning all propaganda, but it's not that easy. Between truth and fiction are a thousand shades of grey.
"It's up to all of us—Redditors, citizens, journalists—to work through these issues."
The @TEN_GOP account appeared to be run by Republicans in Tennessee. It tweeted a mix of pro-Trump content and conspiracy theories, as well as more obvious fake news stories.
The Daily Beast investigation suggested no outright support of any particular candidate or viewpoint and concluded that Russia's aim was to provoke and divide Americans on the internet and, as a result, in the physical world too.
Social media 'weapon'
Social media platforms are under increased scrutiny from the US Congress over the issue of Russian meddling in the 2016 election.
Facebook has given the Senate Intelligence Committee thousands of ads believed to have been purchased by Russian agents.
The Washington Post reported that Reddit was now likely to be questioned over its involvement in the "weaponisation of social media" during the election.
Special counsel Robert Mueller has charged 13 Russians with interfering in the US election, all of whom are linked to troll farm the Internet Research Agency.
Meanwhile, pressure is mounting on Reddit to clean up the content on its platform.
In February, it banned a group that was generating fake porn - imagery and videos that superimpose a person's face onto an explicit photo or video without permission.
This week, it emerged that another subreddit was sharing images of dead babies and animals being harmed.
Mr Huffman said the company was aware of the group, which currently has nearly 19,000 subscribers, and that the community was "under review".