Cities suffer when communication reinforces social, professional and ideological barriers. How are people using social tech to connect and understand and support each other? How can social tech be used to create opportunities, build trust and strengthen communities?
Facebook has been shaken to its core as it came to grips with being perhaps the most powerful media outlet in the world. Among other things, the company grappled with claims of liberal political bias, accusations that it was destroying the free press, and outrage that it had sold ads to Russians trying to influence the 2016 election. What was it like inside the company as these crises unfolded? How has Facebook changed as a result? What is it doing now to address its shortcomings?
WIRED’s Nicholas Thompson and Fred Vogelstein spoke with more than 50 current or former Facebook employees to answer these questions in an enthrallingly detailed article titled “Inside the Two Years that Shook Facebook.” Thompson and Vogelstein realized in October that they were both interested in writing features on different aspects of the epic tale, so they decided to team up. Having worked together on “The Plot to Kill Google” in 2009, they knew it would be a happy marriage. The result is an investigative tour de force.
Years of limited oversight and unchecked growth have turned Facebook into a force with incredible power over the lives of its 2 billion users. But the social network has given rise to unintended social consequences, and they’re starting to catch up with it.
Facebook is behind the curve in understanding that “what happens in their system has profound consequences in the real world,” said Fordham University media-studies professor Paul Levinson. The company’s knee-jerk response has often been “none of your business” when confronted about these consequences, he said.
That response may not work much longer for a company whose original but now-abandoned slogan — “move fast and break things” — still seems to govern it.
“There’s a general arrogance — they know what’s right, they know what’s best, ‘we know how to make it better for you so just let us do it,’” said Notre Dame professor Timothy Carone, who added that it’s true of Silicon Valley giants in general. “They need to take a step down and acknowledge that they don’t have all the answers.”
Apple users usually expect their devices to perform basic system management and maintenance, monitoring background processes so that a rogue task doesn’t drag down the currently active app, for example.
But when Apple confirmed users’ suspicions that a recent update was aggressively slowing older devices, the story quickly gained national attention, culminating in the company cutting the price of battery replacement service and apologizing for the misunderstanding in an open letter to customers. Though Apple never goes as far as to admit wrongdoing in the letter, its direct appeals to customers’ “trust” and “faith” serve as an implicit acknowledgement that the company disregarded a boundary somewhere.
Given how social media and messaging services have, as Jenny Davis says, “extended beyond apps and platforms, taking on the status of infrastructures and institutions,” Apple’s moves to smooth device performance and subtly automate connectivity make some sense. “They have become central to society’s basic functions, such as employment, public safety, and government services,” Data & Society scholars argued in response to Carpenter v. United States.
The ubiquity of networked phones not only facilitates access but furnishes society’s layers of contingency – the many convenient, useful and at times crucial services we enjoy and rely on every day. When our societal infrastructure shifts, as it inevitably does, we feel it and often anticipate its impact.
Indeed, as part of the “cyborgian bargain,” we both expect and are specially equipped to continually renegotiate our status within ever shifting socio-technical systems. For the trust we exercise conditionally with and through society’s mediating infrastructures and institutions, we do not expect an equitable exchange so much as we demand reciprocation, however tenuous and incomplete, commensurate with our wants and desires.
The sort of user we are becoming now might be better described as interstitial, a status emerging from our agency in relation to and actions afforded by socio-technologies. Instead of the ancillary user that platforms imagine molding and fixing in place, the interstitial user contends that our interests and desires necessarily defy simple categorization and we will use what options we have at our disposal to pursue our aims in spite of designers’ wishes.
The most important thing that’s lacking is actually any kind of private space where you are not being monitored by the corporations whose tools you’re using to have whatever conversation you’re having.
So, every time you have a conversation in a digital environment, all of it, there’s a third party who’s got that information — always a corporation. And then all of that exchange is also being monitored by the government.
If the fundamental premise is that this activity of non-profits happens outside of those realms, it literally doesn’t exist in digital space, because we’re playing in their house, if you will. We may well need and would all benefit from an environment that provides some protections for us in those spaces as they exist.
When we talk about digital civil society we always say, ‘Look, we need to invent this, because we don’t have it.’ The best way to protect somebody else’s digital data in that environment is to not collect it. If you don’t have it, then it’s not at risk.
Non-profits have been excited to use things like free online documents and spreadsheets that are stored in the cloud and shared across organizations, and this comes at no direct financial cost to them. But if you upload to those systems the names of everyone participating in your programs, along with their addresses, emails, and phone numbers, you’ve just given that information away to other parties.
But, if you collect that information and don’t store it online, for one, or you encrypt it, for two, or you store it on your own servers and not in other people’s houses, as I like to think about it, then you are providing the same degree of integrity to that data that you again provide to the money that you rely on to do your business in the first place. You’re treating it with integrity toward your mission.
And if your mission, for example, is helping vulnerable people in your community, don’t do it in such a way that you essentially make them more vulnerable.
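One minimal sketch of that principle — protect data by not exposing the raw version of it — is to pseudonymize personal fields with a keyed hash before anything goes into a shared cloud sheet. The record fields and the key handling below are hypothetical illustrations, not a prescription for any particular organization:

```python
import hmac
import hashlib

# Hypothetical secret kept only on the organization's own machines,
# never uploaded alongside the data it protects.
SECRET_KEY = b"keep-this-offline"

def pseudonymize(value: str) -> str:
    """Replace a piece of personal data with a keyed hash.

    The hash still lets you link records for the same person across
    spreadsheets, but the cloud copy no longer contains the raw
    name, email, or phone number.
    """
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# A hypothetical program record before upload...
record = {"name": "Jane Doe", "email": "jane@example.org", "attended": 5}

# ...and the version that is safer to put in a shared cloud spreadsheet.
safe_record = {
    "participant_id": pseudonymize(record["email"]),
    "attended": record["attended"],
}
print(safe_record)
```

The same keyed hash always maps the same person to the same ID, so program staff can still count visits and de-duplicate rows — without handing a vendor the underlying identities.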
In the first episode of the new Netflix series, My Next Guest Needs No Introduction with David Letterman, former President Barack Obama reflects on political tensions and life after the White House, and Dave visits Selma with Representative John Lewis.
Obama: One of the biggest challenges we have to our democracy is the degree to which we don’t share a common baseline of facts. What the Russians exploited (but was already here) is that we operate in completely different information universes.
Letterman: I was under the impression that Twitter would be the mechanism by which truth was told around the world.
Obama: If you are getting all your information off algorithms being sent through a phone, and it’s just reinforcing whatever biases you have, which is the pattern that develops… And that gets more reinforced over time.
That’s what’s happening with these Facebook pages where more and more people are getting their news from. At a certain point you just live in a bubble. And that’s part of why our politics is so polarized right now. I think it is a solvable problem, but I think it’s one that we have to spend a lot of time thinking about.
Letterman: It seems like a valuable tool that has turned against us.
A “toxic shock” has resulted from the algorithmic infection proliferated by News Feed, Google Search and other neocolonialist forms of digital content curation. The simple fact that Facebook impairs the ability to obtain objective information and engage meaningfully is reason enough to keep the social network at arm’s length.
As of 2014, all HuffPost comments run on Facebook’s system. This implies a conflict of interest for editors who would promote opinions that portray the network in a bad light. It was a smart move by a social network in crisis-control mode: managing how millions of left-leaning millennials learn and share about it.
Facebook founders have since come out against the social network, admitting to what many suspect: that Facebook is, as a hacker might say, designed to exploit a vulnerability in human psychology. But we’re here. What happens now?
Want to bypass the drama and create a stronger bond with your audience?
Expand your reach to additional platforms;
Facilitate and implement diversified content streams;
Go deeper with your engagement;
Start a podcast;
Set up a listserv for each demographic or interest you serve; and
Most importantly, be proactive, listen, and reciprocate.
Recode’s Kara Swisher said last year that journalists needed to be tougher on serial liars in tech and politics. Last month, Swisher returned to Recode Media with Peter Kafka to grade whether the media lived up to that goal in 2017 — and the impact of the Silicon Valley companies whose platforms distribute most of their content.
Swisher is frustrated by the unwillingness of tech leaders to accept their share of responsibility in the media space, and not because they’re blind to the problem.
“I think they know that these platforms are being badly misused, and they don’t know what to do about it,” Swisher said. “I think it was a slow burn, a slow dawning on them. The penny dropped really lugubriously.”
On the new podcast, Swisher also shares why she’s more impressed by Snapchat CEO Evan Spiegel than by his peers, why Silicon Valley isn’t thinking about AI’s potential for reinforcing bias, and why she’s tired of tech’s perpetual-victim mentality.
In Clicking Bad, once you’ve clicked your way to selling $20 of meth, you can buy a Storage Shed, which cooks a batch every five seconds — without requiring you to click at all. On the distribution side, you can acquire a Drug Mule, and eventually a Drug Van — just like that, you’ve moved from labor to management. Your scrappy start-up is on its way to becoming a corporate powerhouse.
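The loop described above — click for income until you can buy automation that earns without clicking — is the core mechanic of every idle game. A toy sketch (the names and numbers are illustrative, not Clicking Bad’s actual values):

```python
from dataclasses import dataclass

@dataclass
class Business:
    cash: float = 0.0
    sheds: int = 0            # each Storage Shed produces automatically
    batch_value: float = 20.0
    seconds_per_batch: float = 5.0

    def click(self) -> None:
        """Manual labor: one click, one sale."""
        self.cash += self.batch_value

    def tick(self, dt: float) -> None:
        """Passive income: sheds produce without any clicking."""
        self.cash += self.sheds * self.batch_value * (dt / self.seconds_per_batch)

biz = Business()
biz.click()                   # earn the first $20 by hand
if biz.cash >= 20:
    biz.cash -= 20
    biz.sheds += 1            # buy a Storage Shed: labor becomes management
biz.tick(10.0)                # ten idle seconds pass...
print(biz.cash)               # two automatic batches: 40.0
```

The design insight is that once `tick` out-earns `click`, the player’s role shifts from working to allocating capital — which is exactly the trajectory the game is satirizing.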
Our society is allowing its wealth to concentrate in the holdings of a few companies like Apple and Facebook, because the games are playing us. And, unlike [another clicker game] Universal Paperclips, they often don’t look like games. They are decoratively skinned as social media, giving us a sense of connection to people we kinda, sorta know, or as infotainment platforms that make us informationally obese.
Who do you associate with online? In a brief video from 2010, internet activist Ethan Zuckerman argues that cultural barriers are preventing us from using the internet to tackle global issues. Flame wars be gone. On the danger of the “ideological echo chamber effect” on society by today’s mainstream social networks, Zuckerman says, “What you’re looking for is a conversation, not to win a fight.”
Social networking forms an important part of Web users’ online activities. However, social networking sites present two problems. Firstly, these sites form information silos: information on one site is not usable in the others. Secondly, such sites do not allow users much control over how their personal information is disseminated, which results in potential privacy problems.
This paper presents how these problems can be solved by adopting a decentralized approach to online social networking. With this approach, users do not have to be bound to a particular social networking service. This can provide the same or an even higher level of user interaction as many of the popular social networking sites we have today, while allowing users more control over their own data.
The decentralized social networking framework described is based on open technologies such as Linked Data [Berners-Lee 2006], Semantic Web ontologies, open single sign-on identity systems, and access control.
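The core of the Linked Data approach can be sketched in a few lines: each user’s profile is a document at a URL they control, and friendship links simply point at profiles hosted on other servers, so no single site owns the social graph. The sketch below (stdlib only; the domain names are hypothetical) renders a minimal FOAF-style profile in Turtle syntax:

```python
def foaf_profile(profile_url: str, name: str, friend_urls: list[str]) -> str:
    """Render a minimal FOAF-style profile as a Turtle document."""
    lines = [
        "@prefix foaf: <http://xmlns.com/foaf/0.1/> .",
        "",
        f"<{profile_url}#me> a foaf:Person ;",
        f'    foaf:name "{name}" ;',
    ]
    # Each foaf:knows link can point at a profile on a *different* server.
    for url in friend_urls:
        lines.append(f"    foaf:knows <{url}#me> ;")
    # The final statement in a Turtle block ends with "." rather than ";".
    lines[-1] = lines[-1][:-1] + "."
    return "\n".join(lines)

# Hypothetical users hosting their own profiles on separate domains.
doc = foaf_profile(
    "https://alice.example/profile",
    "Alice",
    ["https://bob.example.net/card"],
)
print(doc)
```

Because the friendship edge is just a URL, Alice’s server and Bob’s server need share nothing but the open vocabulary — which is precisely how this approach dissolves the information silo.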