WIRED: As the summer wore on, it became unmistakable that Facebook’s problems ran deeper than fake news. In June, Facebook officials reportedly met with the Senate Intelligence Committee as part of that body’s investigation into Russia’s election interference. In August the BBC released an interview with a member of the Trump campaign saying, “Without Facebook we wouldn’t have won.”
At last, in September, Facebook broke its silence. The company admitted it had received payments for ads placed by organizations “likely operated out of Russia.” These were troll operations with a wide range of phony ads designed to fan the flames of American racism, anti-LGBT sentiment, and fervor for guns—as well as to build opposition to Clinton. Zuckerberg announced that the ads had been turned over to Congress, and he intimated that an internal investigation at Facebook would likely turn up more such ad deals: “We are looking into foreign actors, including additional Russian groups and other former Soviet states, as well as organizations like the campaigns, to further our own understanding of how they used all of our tools.”
The statement sounded more like fact-finding than soul-searching. Zuckerberg seemed to be surveying a different Facebook from the one that allowed possibly Kremlin-backed entities to target people who “like” hate speech with racist propaganda. A Facebook like that would need a gut renovation; Zuckerberg’s Facebook just needed tweaks.
Facebook is indeed a new world order. It determines our digital and real-world behavior in incalculable ways. It does all this without any kind of Magna Carta except a vague hypothesis that connectivity is an intrinsic good. And yes, it’s largely unregulated, having styled itself as nothing more than a platform—a Switzerland pose that lets it seem as benign as its bank-blue guardrails, which stand as a kind of cordon sanitaire between Facebook and the rest of the unwashed internet.
In 2006, a college kid talked me off Myspace and onto Facebook by insisting that Facebook was orderly while Myspace was emo and messy. That kid was right. Facebook is not passionate; it’s blandly sentimental. It runs on Mister Rogers stuff: shares and friends and likes. Grandparents and fortysomethings are not spooked by it. Like the animated confetti that speckles Facebook’s anodyne interface, our lives on Facebook—the bios and posts—seem to belong to us and not to the company’s massive statehouse, which looks on indifferently as we coo over pups and newborns. (Or is it a penal colony? In any case, it keeps order.) Facebook just is the internet to huge numbers of people. Voters, in other words.
But that order is an illusion. Nothing about Facebook is intrinsically organized or self-regulating. Its terms of service change fitfully, as do its revenue centers and the ratio of machine learning to principled human stewardship in making its wheels turn. The sheen of placidity is an effect of software created by the same mind that first launched Facemash—a mean-spirited hot-or-not comparison site—but then reinvented it as Facebook, an “online directory,” to prevent anyone from shutting it down. The site was designed to make the libertarian chaos of the web look trustworthy, standing against the interfaces of kooky YouTube and artsy Myspace. Those places were Burning Man. Facebook was Harvard.
Siva Vaidhyanathan, whose book about Facebook, Anti-Social Media, comes out next year, describes Zuckerberg as a bright man who would have done well to finish his education. As Vaidhyanathan told me, “He lacks an appreciation for nuance, complexity, contingency, or even difficulty. He lacks a historical sense of the horrible things that humans are capable of doing to each other and the planet.”