If you haven’t heard already, Facebook is being called to account – again. But this time, the silver bullet appears to have been shot from the inside by one woman: Frances Haugen. And while the information she’s brought to light isn’t exactly surprising if you’re a Facebook skeptic, it’s perhaps the most damning account of the social media giant’s own perspectives yet.
Frances Haugen got her start in social media working for Google before eventually landing at Facebook, where her team became central to investigating the platform’s role in spreading political falsehoods, conveying misinformation, and stoking violence. But following the conclusion of the 2020 U.S. presidential election, her team was dissolved and its members shuffled to other projects.
Haugen left Facebook in May after becoming frustrated by what she described as the company’s lack of openness about its potential for harm – and its unwillingness to adopt safety initiatives that made it harder to attract, engage with, and profit from its users. But she didn’t just leave: she took with her tens of thousands of documents exposing the company’s flaws in lurid detail.
“60 Minutes” revealed her identity for the first time on Sunday night after she shared the documents with The Wall Street Journal, which used them as the backbone of a multi-part series known as “The Facebook Files” that dove deep into the company’s dark underbelly.
Haugen also testified on Tuesday before a Senate committee that hopes to toughen laws protecting children online. Her testimony included remarks on how Facebook prioritizes profits over safety, stokes discord and division, and creates products that harm children.
Of course, none of this is news to those who follow Facebook in the media. In fact, the company has battled intense criticism for years, especially following reports in 2016 and 2017 that Russia used the platform to meddle in the U.S. presidential election. However, the documents obtained by Haugen paint the clearest picture of a company that, time and again, and despite congressional hearings, pledges, and media attention, chooses its profits over safety and meaningful action.
Just minutes after Haugen’s testimony concluded, Facebook issued a statement to discredit its former employee, stating that Haugen was with the company “for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question.”
In the next breath (or keystroke), Facebook also noted that it agrees that “it’s time for Congress to act” and “create standard rules for the internet” rather than expecting the industry to regulate itself.
CEO Mark Zuckerberg also posted a lengthy message on his personal Facebook page addressing the testimony. In particular, he took umbrage at how “many of the claims don’t make any sense” and “most of us just don’t recognize the false picture of the company that is being painted.” He also claimed that both the whistleblower testimony and the media have mischaracterized the company’s work and motives, calling the argument that Facebook deliberately pushes anger-inducing content for profit “deeply illogical.”
At the same time, Mr. Zuckerberg also underscored Facebook’s importance in researching tough issues. Moreover, he echoed his company’s calls for Congress to act, stating that “at some level, the right body to assess tradeoffs between social equities is our democratically elected Congress.”
But the company’s mass outage on Monday – just one day after Haugen’s “60 Minutes” interview, and the day before her Senate committee hearing – was, Facebook assures us, merely due to configuration changes in its backbone routers.
The Bug is the Feature
In societal and media discourse, it’s common to talk about Facebook like it’s a neutral space, the digital equivalent of a bulletin board to which anyone can tack their post. And in some circles, it’s not uncommon to hear that Facebook is essentially a message board censoring certain viewpoints – typically, these critics claim, the wrong viewpoints.
But Facebook is neither of those things, and it’s far from random. Rather, the system is built on an engagement algorithm. And it’s this algorithm that determines everything at Facebook, from its profits and growth to its usage and usefulness.
Essentially, the more time you spend on Facebook, the more time the company has to show you ads – ads that generate billions of dollars in revenue annually. So, naturally, it’s in Facebook’s best interest to keep you scrolling longer and to learn more about your habits, so it can target you more precisely.
The problem is that there are literally thousands of apps vying for your attention at any given time (not to mention the real world). As such, Facebook needs a way to hold your engagement – and it has done so by making a product that’s as “fundamentally addictive” as cigarettes. From status updates to likes to photo tagging, every feature of the platform is designed to draw you in.
And once you’re in, the algorithm’s job is to keep you there by putting content on your screen that grabs you. It does this in several ways, but two of the most prominent are showing you content that others have already engaged with, and promoting content similar to what you’ve liked and engaged with in the past.
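The two signals above can be sketched as a hypothetical candidate-selection step. To be clear, none of the function names, fields, or rules below come from Facebook’s actual system – they are illustrative assumptions only, meant to show how “what your friends touched” and “what resembles your past likes” can filter a feed.

```python
# Illustrative sketch of the two candidate-selection signals described
# above. All names, fields, and rules are hypothetical, not Facebook's.

def friends_engaged(post, friend_ids):
    """Social proof: did people you know already interact with this post?"""
    return any(uid in friend_ids for uid in post["engaged_by"])

def matches_history(post, liked_topics):
    """Similarity: does the post overlap with topics you've liked before?"""
    return bool(set(post["topics"]) & liked_topics)

def select_candidates(posts, friend_ids, liked_topics):
    """Keep posts that clear at least one of the two signals."""
    return [
        p for p in posts
        if friends_engaged(p, friend_ids) or matches_history(p, liked_topics)
    ]

candidates = select_candidates(
    posts=[
        {"id": 1, "engaged_by": {"alice"}, "topics": {"sports"}},
        {"id": 2, "engaged_by": set(), "topics": {"politics"}},
        {"id": 3, "engaged_by": set(), "topics": {"gardening"}},
    ],
    friend_ids={"alice", "bob"},
    liked_topics={"politics"},
)
# Posts 1 and 2 survive (a friend engaged with one; the other matches a
# liked topic); post 3 matches neither signal and is filtered out.
```

Even in this toy version, note that the user never asks for any of this filtering – it happens upstream, before anything reaches the screen.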
Anger and Divisiveness Lead the Way (and the Profits)
In Ms. Haugen’s first interview with “60 Minutes” on Sunday, she expounded upon what she believes to be “systemic” problems with Facebook’s ranking algorithm. In particular, she noted, Facebook’s own research suggests that “angry” and divisive content gets the most engagement.
Said Haugen, “When you have a system that you know can be hacked with anger, it’s easier to provoke people into anger. And publishers are saying, ‘Oh, if I do more angry, polarizing, divisive content, I get more money.’ Facebook…is pulling people apart.”
As Haugen reports, this problem was amplified by a 2018 algorithm change meant to promote “meaningful social interactions” through “engagement-based rankings.” In other words, content that gets more comments, likes, and shares is more likely to reach a wider audience.
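The mechanics of an engagement-based ranking can be sketched in a few lines. The weights below are purely illustrative assumptions – Facebook’s actual values are not public – but they capture the dynamic Haugen describes: when comments and shares count for more than likes, content that provokes reactions rises to the top.

```python
# Illustrative sketch of engagement-based ranking. The weights are
# hypothetical placeholders, not Facebook's real values.

def engagement_score(post):
    """Score a post by a weighted sum of its engagement signals."""
    weights = {"likes": 1, "comments": 4, "shares": 8}
    return sum(post.get(signal, 0) * w for signal, w in weights.items())

def rank_feed(posts):
    """Order candidate posts so the highest-engagement items appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "calm_update", "likes": 120, "comments": 3, "shares": 1},
    {"id": "divisive_take", "likes": 40, "comments": 60, "shares": 25},
])
# The divisive post outranks the calmer one despite having far fewer
# likes, because comments and shares carry heavier weights.
```

The design choice at issue is the weighting itself: if arguments generate comments and outrage generates shares, a scoring rule like this one rewards exactly the content Haugen says Facebook’s own research flagged as harmful.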
But as Facebook’s own internal records show, negative emotions and experiences are a more powerful driver of engagement – meaning the algorithm can leave you miserable even as it keeps you hooked, exacerbating conflict, anger, and even misinformation.
And it’s on the back of that anger – and all the other emotions and actions wrought thereafter – that Facebook now builds its empire.
The leaked research surrounding Facebook’s harmful impact on users is certainly cause for concern – as is this week’s several-hour outage. Investors may want to keep a close eye on how Facebook continues to handle the scrutiny that critics claim Zuckerberg was initially evading, as well as on what Congress does or does not do to effect change.
Facebook stock has had a lot of momentum, largely outperforming the S&P 500. But the last few weeks have thrown it for a loop, and it has plummeted 15 percent from the all-time high it hit last month. Despite these losses, Zuckerberg has built an arguably defensible platform that nearly three billion people use. Regulatory changes and constructive criticism may only mean room for improvement and opportunity down the line – opportunity that, ideally, sacrifices neither the health nor the safety of users. Or, yes, it could all go up in flames.