MediaJustice

Contact: Libeth Morales, [email protected]

Malkia Cyril, executive director and co-founder of the Center for Media Justice, has issued the following response to the article published June 28, 2017, by ProPublica on the algorithms that Facebook’s censors use to differentiate between hate speech and legitimate political expression:


“The Center for Media Justice has long worked with groups like Black Lives Matter to address the censoring of Black activists on Facebook. Black activists report consistently being censored for mentioning race, referring to or describing structural racism, or reacting to incidents of police violence with frustration or sadness. By contrast, explicit white supremacists have called for the death of Muslims, threatened to kill Black, indigenous and other activists of color and more without being found to violate Facebook community standards. This is clearly discriminatory.

That’s why, in October 2016, the Center for Media Justice, Color of Change, Sum of Us and Daily Kos sent a letter signed by over 80 racial justice organizations urging Facebook CEO Mark Zuckerberg to make changes to Facebook’s censorship policies. Facebook executives met with our coalition, but used the time to “explain” why there was nothing they could do. In response to our proposed recommendations, we received a formal yet brief and noncommittal response from Facebook’s Head of Global Policy, Joel Kaplan.

Our coalition sent a second letter in January 2017 requesting a meeting, following Kaplan’s inadequate response. When we heard nothing back, we sent Facebook more than 570,000 petitions urging change to their censorship policies. They declined to meet or formally respond.

Now we know why — racial discrimination is built into the structure of how they manage content, and it harms communities of color using the platform. It’s time for a change.

There are three big changes we believe are needed. First, reduce the heavy reliance on algorithms; algorithms don’t understand power imbalances and cannot effectively take them into account when identifying hate speech. More human censors with more training are needed. Second, greater transparency is critical: not only should Facebook make the criteria for censorship known to its users, but it should publish an annual report on trends in who and what it censors. Finally, Facebook is run by a handful of Ivy League-educated white men (and one or two women); its diversity record at the highest levels of management is poor, and this affects its decision-making about content and censorship.

Facebook needs thoughtful collaboration with racial justice partners in a number of countries who can ensure that human rights with regard to race, national origin, and religion, among others, are respected and rooted in reality. In short: training, transparency, and thoughtful collaboration with community partners would at least be a very good start.”
