Facebook’s Preposterous Community Standards

Read Alex Jones or tune him out. It’s a choice that has been in your hands. It’s always been in your hands. You could always have made the censorship decision yourself. It is the users of Facebook, Apple and YouTube, not these companies, who have always been their own censors.

Nobody actually requires these companies to decide for them what may or may not be read and viewed. However, these companies have always censored content, which means that they have removed some censoring decisions from the hands of individual users. They did this as a business decision. Presumably, they hoped to gain more customers who liked the convenience and content of corporate censorship than they stood to lose from those who preferred to do their own censoring. No company’s product fully satisfies everyone.

Facebook’s censorship is preposterous. The Facebook community standards that guide their censorship have always been vague and malleable. The critical language in those standards is baloney that cannot be applied in any objective or fair way.

Compare the treatment of Alex Jones against this Facebook statement:

“Our mission is all about embracing diverse views. We err on the side of allowing content, even when some find it objectionable, unless removing that content can prevent a specific harm. Moreover, at times we will allow content that might otherwise violate our standards if we feel that it is newsworthy, significant, or important to the public interest. We do this only after weighing the public interest value of the content against the risk of real-world harm.”

This is all vague baloney, providing no known criteria for Alex Jones or anyone else. The claim is that “allowing content” is their foremost objective, unless it will cause a “specific harm”. How does Facebook determine that something will cause a specific harm? What kind of harm? They say that harm includes “physical, financial, and emotional injury.” Is it credible that Facebook can judge “emotional injury” or even these other forms? What did Alex Jones say that raised the specter of causing specific harm? Does Facebook monitor and censor billions of communications by applying these criteria? This is an impossibility.

Facebook clearly singled out Alex Jones. It didn’t embrace “diverse views”. It didn’t remove him because of some “specific harm” he was causing. If he were doing some harm, Facebook could have reported him to the police for investigation, or the people being harmed could have sued Jones.

Yet Facebook backs off this already vague statement by giving itself the room to allow content “that might otherwise violate our standards”. Their “feeling” controls their decision. They must “feel” that the content is “newsworthy, significant, or important to the public interest”. Where do these feelings and judgments come from? For users of Facebook who may face censorship, there are no known criteria. The fact is that Facebook is incapable of judging the “public interest” content of the communications shared on its circuits. It has no known capacity to judge the “risk of real-world harm”. Weighing one against the other in any rational or explicit fashion known to its users is impossible. This is all baloney.

In banning Alex Jones, has Facebook produced an internal document that assesses his lengthy record of communications and goes through the steps of judging the public interest, judging the real-world harms and weighing one against the other? We’d like to see such a document.

Facebook has a right to produce a product in which it exercises censorship in arbitrary and biased ways. It also has a right to claim that it’s adhering to some rational standards to be found in its “Community Standards”. That claim, however, is nonsense.

Facebook’s removal of Alex Jones is politically motivated. It’s coming from the Left, but it would not be surprising if well-known right-wing politicians and deep-state members joined in the attack on Jones because he questions the Establishment and conventional narratives. Alex Jones is “proud to be listed as a thought criminal against Big Brother”. His basic orientation is to question appearances and to propose that the considered judgments and aims of large-scale, powerful and monied interests lie behind important events. In his own case of being banished by major communications companies, this appears to be true!

1:57 pm on August 7, 2018