News content

Facebook whistleblower describes competing motivations in the company’s approach to news content

Is Facebook bad for society, and is the company knowingly contributing to division and angst in order to maximize usage and profit?

It’s a key question that has persisted for years, especially since the 2016 U.S. election. And now we get a glimpse of Facebook’s own thinking on the subject – over the past two weeks, The Wall Street Journal has published a series of reports based on internal studies, and the responses to them from Facebook executives, leaked by a former Facebook staffer seeking to expose the company’s failure to address key flaws in its design.

That former employee was revealed last night, on CBS’ 60 Minutes, to be Frances Haugen, an algorithmic design expert who had worked on Facebook’s civic integrity team before it was dissolved following the 2020 US election. According to the information shared by Haugen, Facebook has indeed knowingly avoided taking more aggressive steps to address the worst aspects of its platform, due to the impacts such measures could have on usage, and therefore on profits.

And while Facebook has disputed Haugen’s claims, her statements match what many previous reports have suggested, highlighting major concerns about the societal impacts of Zuckerberg’s social giant.

Haugen’s key claim is that Facebook has knowingly overlooked or downplayed the conclusions of its own research in favor of sustaining usage and engagement.

As explained by Haugen:

“What I saw over and over at Facebook was that there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

Which, to some extent, makes sense – Facebook is, after all, a business, and as such, it’s driven by profit, and by delivering maximum value to its shareholders.

The problem, in Facebook’s case, is that it operates the largest network of interconnected humans in history, approaching 3 billion users, many of whom use the app to stay informed, on various fronts, and to glean key insights into the news of the day. As such, it has significant power to influence opinion.

This means, as Haugen notes, that its decisions can have big impacts.

“Facebook makes more money when you consume more content. People like to engage with things that elicit an emotional response. And the more they’re exposed to anger, the more they interact, and the more they consume.”

Indeed, among the various findings highlighted in Haugen’s “Facebook Files” – the thousands of internal documents that she essentially smuggled out of Facebook’s head office – are suggestions that Facebook:

  • Ignored the prevalence and impact of hate speech on its platforms, because such content also generates more engagement among users
  • Minimized Instagram’s negative impacts on young users, despite findings showing that the platform amplifies negative body image
  • Did not address key concerns about Facebook usage in developing regions, in part due to a cost/benefit analysis
  • Failed to combat the spread of anti-vaccine content

Again, many of these issues have been widely reported elsewhere, but Haugen’s files provide direct evidence that Facebook is indeed well aware of each of them, and has at times chosen not to act, or not to take major counter-measures, in large part due to conflicts with its business interests.

The Facebook PR team has worked hard to counter such claims, providing point-by-point responses to each of the Facebook Files reports, and noting that the very existence of this internal research shows that Facebook is striving to address these concerns, and to combat these problematic elements.

Facebook has highlighted various changes it has made to Instagram to provide more protection and control options for users, while it’s also working to improve its algorithmic ranking to limit exposure to content that sows division and angst.

But at the same time, Facebook has played down the impacts of such things on a larger scale.

As Facebook’s Vice President of Policy and Global Affairs Nick Clegg noted, in response to the suggestion that Facebook played a key role in fueling the post-election protests at the Capitol:

“I think the assertion [that] January 6 can be explained because of social media – I just think that’s ridiculous.”

Clegg’s point is that Facebook is only one small element of a broader societal shift, and that it simply can’t be the central cause of such major conflicts, in various regions.

It’s impossible to quantify Facebook’s exact impact in this respect, but Haugen’s files make clear that its systems are key contributors.
Anger is the emotion that elicits the most responses, the most engagement, and Haugen essentially argues that Facebook profits from this by facilitating the delivery of hate-inspiring content which, as a by-product, amplifies division.

“When we live in an information environment that is full of angry, hateful, polarizing content, it erodes our civic trust, it erodes our faith in each other, it erodes our ability to care for each other. The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”

There are two sides to this, and both can be equally correct. One, as Haugen notes, is that Facebook has an underlying incentive to facilitate the delivery of hateful content, which drives more engagement among its users while exacerbating societal division – which, at Facebook’s scale, can have a significant impact.

On the other hand, as Facebook notes, it doesn’t conduct such research for nothing. Turning a blind eye to these issues would be tantamount to not conducting the studies at all, and while Zuck and Co. may not be taking as much action as all parties would like, there is some evidence to suggest that the company is trying to address these concerns, but in a more measured way that, ideally, also reduces the business impact.

The question, then, is whether “business impact” should be factored into such consequential decisions at all.

Again, Facebook operates the largest interconnected network of people in history, so we don’t know what the full impacts of its algorithm-influenced sharing may be, as we have no other example to refer to – there is no precedent for Facebook and its wider influence.

In some ways, Facebook, given its scale and influence, should arguably be treated as a public utility, which would then change the company’s motivations – as Haugen notes:

“No one at Facebook is malicious, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People like to engage with things that elicit an emotional response. And the more they’re exposed to anger, the more they interact, and the more they consume.”

Basically, this is the main problem – we now have a situation where one of the main vehicles for distributing and disseminating information is motivated not by keeping people reliably informed, but by generating as much engagement as possible. And the way to do that is to elicit an emotional reaction, with hatred and anger being among the most powerful motivators of response.

According to research, almost a third of American adults regularly access news content on Facebook – meaning that at least 86 million Americans get direct insight into the latest events from a platform with a clear motivation to show them the most divisive, emotionally charged takes on every issue.

News editors know this, as do politicians – in fact, according to the Facebook Files, various political groups have shifted their messaging to more partisan and confrontational approaches specifically in order to appeal to Facebook’s algorithm.

When you consider the scale of the platform’s reach, and its influence over this exposure, it’s clear that Facebook shapes how we engage.

But with competing motivations and a need to maximize engagement in the face of growing competition, can we really expect Facebook to change its approach for the greater good?