When Did Mark Zuckerberg Become Public Enemy No. 1?

 This week, Facebook is experiencing a media gang-tackle the likes of which we have rarely seen. The Associated Press sternly announces, “The Facebook Papers represents a unique collaboration between 17 American news organizations, including the AP. Journalists from a variety of newsrooms, large and small, worked together to gain access to thousands of pages of internal company documents obtained by Frances Haugen, the former Facebook product manager-turned-whistleblower.”

But the “four revelations from the Facebook Papers” that the Financial Times touts (link at Ars Technica) seem… fairly mundane, compared to the accusations that Facebook swung the 2016 election, idled as Russian propaganda brainwashed Americans, provided fertile ground for extremist movements, and so on.

The first revelation is “Facebook is often accused of failing to moderate hate-speech on its English-language sites but the problem is much worse in countries that speak other languages.” That’s not good news, but it’s not really why so many Americans are angry at Facebook. You would have to look far and wide to find Americans who are up in arms because the company isn’t doing a good enough job of rooting out hate speech in Burmese or Arabic dialects.

The second revelation is “news feed ranking is more influenced by people who share frequently than those who share and engage less often, which may correlate with race. That results in content from certain races being prioritized over others.” But if you build a system that prioritizes the posts shared by those who share the most often, you have no guarantee that each demographic will choose to share posts with equal frequency. The only way for Facebook to balance this out would be to prioritize posts shared by minority users, a sort of affirmative action for Facebook posts.

The third revelation is “according to a March 2021 note by a group of researchers, the company takes action on only as little as 3 to 5 percent of hate speech and 0.6 percent of violent content. Another memo suggests that it may never manage to get beyond 10 to 20 percent, because it is ‘extraordinarily challenging’ for AI to understand the context in which language is used.”

It would be preferable to know just how these researchers define “hate speech.” As for the failure of artificial intelligence to understand context, this is again not good news, but not necessarily surprising. Back in 2018, Facebook set up a policy of blocking ads with nudity… and banned a tourism campaign for the Belgian region of Flanders, because it featured nude paintings from a Rubens museum. (I guess you could say the anti-nudity policy is… Baroque-en.)

The fourth revelation is “memos show that the company switched off certain emergency safeguards in the wake of the November 2020 election, only to scramble to turn some back on again as the violence flared. One internal assessment found that the swift implementation of measures was hindered by waiting for sign-off from the policy team.” It’s hard to fault Facebook for “switching off certain emergency safeguards in the wake of the November 2020 election,” because the election was over. And once again, one department at a company slowing down the decision-making of another department is not good, but it’s also not particularly surprising.


When you set up a system where any user can register an account for free and start posting to his heart’s content, catching offensive, objectionable, or harmful content is always going to be a game of whack-a-mole. When the system grows so large that the number of users is in the billions, no administrator is going to be able to review, evaluate, and judge the appropriateness of every post of every user with any sense of speed.

An online social media network is a tool, just like a hammer, a computer, or a gun. What impact that tool has on the world is determined almost entirely by who uses it and for what purpose.

Because social media networks can be used by anyone, you’re going to find bad people using those networks. Social media companies undoubtedly bit off a lot more than they could chew when they initially declared that they were not media companies, and not responsible for what gets written on their sites – that they were no more responsible for the content of what is posted than bricklayers and bathroom stall makers were responsible for what graffiti is written on their work.

If you give the whole world a blank canvas, some people will create great art, some will doodle, some will draw stick figures, some will write profanities, and others will create horrible, hateful stuff. Sooner or later, these companies were going to censor somebody, because someone was going to come along who was so terrible, indefensible, unjustifiable, and irredeemable that the companies would feel compelled to say, “no, we didn’t set up this network so something like that could find an audience.” Way back in 2008, Clay Shirky’s book Here Comes Everybody: The Power of Organizing Without Organizations discussed online pro-anorexia groups. The world is full of nutjobs, weirdos, extremists, lunatics, and people who hold all kinds of strange, abhorrent, repulsive, and taboo beliefs. When your company puts out a welcome mat to the world, some of those weirdos are going to come in with the rest.

Robby Soave at Reason argues the Facebook Papers represent “a Big Fat Nothingburger” and concludes, “The social media site is unsafe because there’s too much content that the mainstream media and the government would prefer users not see. They’re upset that the person in charge of deciding what belongs on Facebook is Mark Zuckerberg and not Joe Biden—and no amount of handwringing about addictive platforms or monopolistic practices can disguise the fact that the site is losing popularity with young people, and increasingly looks like a dying star.”

Spending a lot of time on social media may well have some unhealthy consequences. But it’s hard to believe that Facebook is some sort of unique societal menace, while Twitter, Instagram, YouTube, and the rest are just fine and dandy. (A much fairer and more consequential criticism of Facebook is the evidence that the company knows its subsidiary Instagram is a toxic influence in the lives of many teenage girls, and doesn’t seem to care.)

What does seem to be a factor is that Facebook is the most popular social media network among older people — and older people tend to skew conservative and vote for Republicans, which suggests some of the ire directed toward Facebook is really ire at the types of people most likely to be using it.

Note that the project’s title, “The Facebook Papers,” is no doubt meant to evoke the “Pentagon Papers” and the “Panama Papers.”

(Did you notice that when the multiple batches of the “Panama Papers” were released, American news audiences largely yawned? As I noted at the time, there weren’t that many Americans named in the Panama Papers, and the ones that were mentioned had fraud indictments or other reasons to hide their assets from the prying eyes of investigators. If the Icelandic prime minister is hiding his money, that’s really an issue for Icelandic law enforcement and Icelandic voters to address. Regarding the more recent release of the “Pandora Papers,” maybe it’s something of a scandal that former United Kingdom prime minister Tony Blair is using offshore accounts to shield assets from taxes. But is anybody surprised? And is anybody really surprised that Jordan’s King Abdullah II or associates of Vladimir Putin do the same? Did anybody think this crowd was an all-star team of financial and ethical rectitude?)

One other odd wrinkle: All of the news institutions that are participating in this massive exposé of the alleged evils of Facebook… have Facebook pages.
