Mark Zuckerberg wrote a 6,000-word manifesto on the history of humanity and how Facebook will shape humanity’s future. Zuckerberg’s manifesto has drawn a large backlash among the intelligentsia of the internet, and although it has problems, I’m glad he is finally publicly examining the moral force Facebook exerts on society. In contrast, Ezra Klein thinks Zuckerberg would probably be better off just focusing on making as much profit as possible.
[Zuckerberg suggests] that Facebook will become crucial not just to learning about politics but participating in it. He says that in the 2016 US election, Facebook’s voter registration program “was larger than those of both major parties combined,” and suggests that Facebook could “enable hundreds of millions of more people to vote in elections than do today, in every democratic country around the world.”
Will Zuckerberg succeed in all, or any, of this? I have no idea. Skepticism is surely in order….
But to do that, he has to move Facebook beyond being a neutral platform and tie it to an idea of where humanity can and should go next. Religions do this. Political parties do this. National governments do this. And now Facebook is doing it too. What Zuckerberg is offering here isn’t a business plan so much as it’s a philosophy or an ideology. But philosophies and ideologies are harder and more dangerous to follow than business plans. [emphasis added]
This last sentence is a dangerous idea itself. It is a version of the positivist fallacy. Nobody can avoid moral philosophy and ideology. Every choice we make is driven by an ethical ideology. In this example, Klein’s moral philosophy is just another version of Milton Friedman’s ideology that the only moral responsibility of business is to make as much profit as possible. Zuckerberg is merely recognizing that his business choices are already having a profound impact upon democracies around the world, and that some of these impacts have been harmful. For example, Facebook has amplified political divides and is the premier distribution network for fake news. This isn’t necessarily the most profitable business model for Facebook. Facebook has many choices about how to make a profit. Some choices are probably more harmful than others even when they are equally profitable, so recognizing a social responsibility doesn’t even have to hurt profits. Klein objects that:
The harder Facebook pushes to curate media content, or to generate political participation, the greater the threat certain governments will perceive from its presence.
But Facebook already curates media content. Its algorithms determine what you see first and what you don’t see at all. That power can easily sway elections (and undoubtedly already has). Facebook has been pushing fake news and fostering extremist bubbles that amplify partisanship because that gets users more “engaged” with Facebook, and that is more profitable.
Government objections are inevitable for any organization with that much power. Facebook is going to be controversial no matter what it does, because it is too late to avoid having a big impact upon politics, and Zuckerberg is finally accepting a little bit of the heavy responsibility that must accompany the power he wields. I’d welcome a lot more public discussion of how aware Facebook is of the ways its business decisions influence politics. That would give me more confidence that it isn’t being deliberately malevolent, even if it exhibits an accidental malevolence caused by the blind pursuit of profit.