Facebook’s Community Standards Report – The Buyer’s Journey 62

Facebook’s community standards report is here, a look into how well the platform is doing moderating the content posted to it. We discuss!

Facebook’s Community Standards Report

Mark Zuckerberg believes that time spent on Facebook should be quality time. To that end, he worries less about keeping users on the site longer and focuses more on time well spent. Facebook’s latest algorithm update, which puts greater emphasis on posts from your closest friends, is one example of this mentality. Appropriately enforcing its community standards is another. Facebook’s community standards report is here, offering a peek at how the platform has handled this issue thus far.

[Image: A chart showing a variety of data from Facebook’s community standards report.]

Here are some quick numbers from Facebook’s community standards report:

  • Between 90% and over 99% of community standards violations, depending on the category, are caught proactively by Facebook. This means no user had to report the content; Facebook’s algorithms found it on their own.
  • Facebook caught only 14% of the 2.6 million instances of harassment it acted on before users reported them.
  • Facebook internally flagged 65.4% of the 4.0 million pieces of hate speech it took action on.
  • In Q1 2019, Facebook found and removed 2.19 billion fake accounts, an increase of roughly 1 billion over the previous quarter.
  • 1.76 billion instances of spammy content were removed in the first quarter.
  • Facebook said it removed 5.4 million pieces of content depicting child nudity or sexual exploitation.
  • 33.6 million pieces of violent or graphic content were also taken down.*
  • Facebook reversed itself 152,000 times out of the 1.1 million appeals it heard related to hate speech.
  • Facebook was also unlikely to accept appeals for posts related to the sale of regulated goods like guns and drugs.

The Commexis Take

As Josh and I discuss on the podcast, we’re happy to see Facebook being transparent about its community standards enforcement. It’s great to see that spammy content is now becoming a major target for removal. Furthermore, the removal of fake accounts is fantastic. It’s wild that 2.19 billion accounts approaches the 2.38 billion users Facebook has according to Statista. Of course, Facebook has countermeasures in place to stop fake accounts sooner, but it’s still quite a large number.

That said, there are clearly areas where moderation can be improved. While catching 14% of the 2.6 million instances of harassment is a start, there is plenty of room for improvement. Unfortunately, we don’t know how Facebook goes about flagging that kind of behavior. Does it only look at wording? Frequency of messages? It’s unclear. Perhaps greater transparency on this point would help explain the low number.

Jonathan Shieber at TechCrunch also writes about the independent third-party report on Facebook’s content moderation. Be sure to check that out here.

Check out our next episode where we’ll discuss a news article covering the Retention stage of The Buyer’s Journey.

Tune in to more of The Buyer’s Journey by checking out our YouTube and SoundCloud, and take us on the go on iTunes, TuneIn, Google Play Music, and Stitcher.

*Josh and I mention an article about the high influx of graphic content on Facebook and how that content is moderated. You can read the disturbing story about moderators’ working conditions and mental health concerns here.
