I learned an important lesson on Facebook today:
I learned that people speaking out against hate speech and violent imagery are more offensive to Mark Zuckerberg and his team of administrators than the hate speech itself.
This morning I shared a post asking people to report a Facebook page. Taking my cue from my friend Chris Conzen, I spread the message because it was apparent that Facebook was not taking action. The page, titled “RIP Trayvon Martin,” was anything but a page in his memory. Rather, it included photoshopped images of 17-year-old Trayvon in a concentration camp and being hanged.
Let me reiterate that: There were photoshopped, edited images of a child being hanged and lynched.
Shortly after submitting my report, I received an e-mail from Facebook telling me that they had reviewed my report but found nothing that violated their hate speech terms, and so the page would remain.
Unsurprisingly, I clicked the “Give Feedback” option and informed Facebook that their hate speech terms may need to be reviewed more closely, and that they may need to include references to incidents of bias in order to be more inclusive.
Hours later, I received a text message from Kathryn informing me that my status asking people to report the page had been removed, not only from my page but from her page and Chris’s as well.
Let me be clear: Facebook refused to remove the RIP Trayvon Martin page with its offensive and violent imagery, but my own post, my own words tied to my own name and not hidden behind the cowardly anonymous bigotry of a page, was removed from my personal page. There was no notification of the removal, only an error message when I tried to go back to it by following earlier notifications.
Facebook came under fire last month for not doing more to intercede in incidents of violence against women and the perpetuation of rape culture. But their feeble attempts to recover in that arena have revealed them to be impotent in facing a larger issue: their larger-than-life social network is an arena for oppression of all sorts. No one is safe, not even the whistleblowers and advocates.
Facebook has admitted that it has too few humans reviewing reported items. There is work to be done, certainly, and in the meantime, Facebook and its leadership owe an apology and an explanation to those who volunteer their time flagging the offensive, hateful, harmful, and hurtful.
Edit: I am editing this to add that the RIP Trayvon page is gone, though it’s not clear whether Facebook removed it or whether it was removed by the page’s owner, who was being tracked by angry users via an exposed IP address. It has also been suggested that Chris, Kathryn, and I may have violated Facebook’s ToS by requesting a mass reporting. I can’t argue with a ToS I agreed to, but I can point out the flaw: a mass reporting can be a solid indicator of a problem, like a neon sign pointing to it, making it easier to find and address.