Facebook adds human reviewers after 'Jew haters' ad scandal


Facebook will add more human reviewers to its advertising system after admitting it failed to prevent, or even notice, anti-Semitic targeting on the network.

Sheryl Sandberg, the network’s chief operating officer, said she was "disgusted" by the findings of a ProPublica investigation published last week.

The report found that ads could be bought to specifically target users who described themselves as “Jew haters”, as well as other similarly hateful terms.

“We never intended or anticipated this functionality being used this way – and that is on us,” Ms Sandberg wrote.

In a post on Facebook, she added: “Seeing those words made me disgusted and disappointed - disgusted by these sentiments and disappointed that our systems allowed this.

“Hate has no place on Facebook – and as a Jew, as a mother, and as a human being, I know the damage that can come from hate.

“The fact that hateful terms were even offered as options was totally inappropriate and a fail on our part. We removed them and when that was not totally effective, we disabled that targeting section in our ad systems.”

Hold tech firms accountable

Responding to the announcement, the Anti-Defamation League said it welcomed Ms Sandberg’s comments.

“We spoke to Facebook last week to understand what happened and asked for detailed steps they'd take to prevent this sort of hateful ad-targeting,” said Jonathan Greenblatt, the ADL’s chief executive.

“We are glad that they are taking immediate, meaningful action, and ADL will continue to hold tech companies accountable for following through on these actions.”

Ms Sandberg said the team of human reviewers would be responsible for monitoring the terms that can be used to sell advertising.

In addition, she said the company was working on a new system that would allow users to report inappropriate advertising categories, much in the way normal posts can be reported today.

Facebook is not the only company to fall foul of poorly designed advertising algorithms in recent months. Google also had to take action after its system allowed search ads to be placed next to racist and anti-Semitic terms, and Twitter has had to act against similar abuses of its ad system.

The stories are being seen as a wake-up call for companies that rely on algorithms to do the heavy lifting on their platforms, often without enough thought given to the potential for abuse.

Facebook's ads were also in the spotlight earlier this month when it was discovered that a Russian group was likely behind the purchase of politically charged advertising targeting American voters. That information was passed on to a wider US investigation into Russian meddling in last year's presidential election.


Follow Dave Lee on Twitter @DaveLeeBBC

You can reach Dave securely through encrypted messaging app Signal on: +1 (628) 400-7370
