Another former Facebook exec has spoken out about the harm the social network is doing to civil society around the world. Chamath Palihapitiya, who joined Facebook in 2007 and became its VP for user growth, said he feels “tremendous guilt” about the company he helped make. “I think we have created tools that are ripping apart the social fabric of how society works,” he told an audience at Stanford Graduate School of Business, before recommending that people take a “hard break” from social media.
Facebook is being called to public account for Russian meddling in the 2016 election. People describe its algorithm-driven News Feed as a Frankenstein-like monster that even its creators can’t control. Meanwhile, Facebook’s CEO uses his profile to craft an image of himself as an approachable and empathetic everyman, an effort that is itself a sizable operation.
Facebook’s founders knew they were creating something addictive that exploited “a vulnerability in human psychology” from the outset, according to founding president Sean Parker. Parker, whose stake in Facebook made him a billionaire, criticized the social networking giant at an Axios event in Philadelphia this week. Now the founder and chair of the Parker Institute for Cancer Immunotherapy, Parker took the time to provide some insight into the early thinking at Facebook at a time when social media companies face intense scrutiny from lawmakers over their power and influence.
The average Facebook user sees only 20 percent of the 1,500 stories per day that could have shown up in their news feed. The posts you receive are determined by algorithms whose bottom line is Facebook’s bottom line. The company is constantly adjusting all kinds of dials, quietly looking for the optimal mix to make us spend more of our time and money on Facebook. The more we’re on Facebook, the more information they have about us to fine-tune their formulas for picking ads to show us.
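The ranking described above can be illustrated with a toy sketch. This is not Facebook’s actual algorithm, which is proprietary; it is a minimal, hypothetical example assuming each story carries a single predicted-engagement score, showing how sorting by that score and keeping the top slice yields roughly the “20 percent of 1,500 stories” the article cites.

```python
# Hypothetical sketch of engagement-weighted feed ranking. All names and
# numbers here are illustrative assumptions, not Facebook internals.

def rank_feed(stories, keep_fraction=0.2):
    """Sort stories by a toy engagement score and keep the top fraction."""
    scored = sorted(stories, key=lambda s: s["predicted_engagement"], reverse=True)
    cutoff = max(1, int(len(scored) * keep_fraction))
    return scored[:cutoff]

# 1,500 candidate stories with made-up engagement scores.
candidates = [{"id": i, "predicted_engagement": (i * 37) % 100} for i in range(1500)]

feed = rank_feed(candidates)
print(len(feed))  # 300 stories shown, 1,200 never surface
```

The point of the sketch is the dial the article mentions: change the scoring function or `keep_fraction` and a different 20 percent of posts wins, invisibly to the user.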
The business model: We create and give Facebook, for free, the content they use and the data they mine to hold our attention, which Facebook in turn sells to advertisers.
A journalist remarked that Internet.org sounded like a gateway drug. Mr. Zuckerberg retorted that he preferred to think about it as an “on-ramp to the Internet” — an on-ramp that would shunt an increasing amount of content through Facebook, giving it enormous influence over not just how its users got access to entertainment or news, but also how they received education, health, banking and other social services.
Most folks consider such thinking nefarious. Silicon Valley thinks it’s virtuous. As Zuckerberg put it, one of its goals is to show “people why it’s rational and good for them to spend the limited money they have on the Internet.” This model shows something else: If you run a site or an app, it’s also rational for you to move your content inside Facebook’s ecosystem, so that your audience will not have to pay to access it.
First, Facebook’s leadership doesn’t seem to understand the nuances of diverse identities. Second, Facebook’s approach to most issues, including authentic names or hate speech, is to create one-size-fits-all policies it claims will work for the majority of users. Third, Facebook does not share the details of its enforcement guidelines, release data on the prevalence of hate speech, or give users a chance to appeal decisions and receive individualized support. Fourth, Facebook appears unwilling to invest adequate resources to address these issues.
The task posted by Global Science Research appeared ordinary, at least on the surface. The company offered turkers — workers on Amazon’s Mechanical Turk crowdsourcing platform — $1 or $2 to complete an online survey. But there were a couple of additional requirements as well. First, Global Science Research was only interested in American turkers. Second, the turkers had to download a Facebook app before they could collect payment. Global Science Research said the app would “download some information about you and your network … basic demographics and likes of categories, places, famous people, etc. from you and your friends.”
When Instagram demonstrated that sharing photos on mobile devices was a new social media platform, Facebook acquired it. When WhatsApp demonstrated that messaging was alive and well in a pure form, Facebook acquired it. In each case Facebook paid handsomely — but did little else. In effect, Facebook bought an option against possible disruption but did not want to touch or risk anything else about those companies. By buying the competition, Facebook neutralizes potential disruptions before rivals that begin with niche use cases can pick off its customers.
The suspicion that Facebook was satisfying a need I didn’t even know I had — training me for a certain behavior — made me wonder if I needed a support group. That my child had become online “content” was even creepier. I wasn’t always like this.
A few years back, my brother invited me to join, and I first logged in merely to look at photos from my far-flung family. He posted for a while and then seemed to completely vanish from his page — no more updates, no more pictures. He had sent an ominous email, something to the effect of “Be careful. It’s addicting.” We never spoke of it again.
This feature is designed to provide people some of the tools they need to make informed decisions about which stories to read, share, and trust. For links to articles shared in News Feed, we’re testing a button that people can tap to easily access additional information without needing to go elsewhere.