Why Facebook Will Never Fix Facebook

The last two years have shown that what the platform needs is something closer to an overhaul — stripping out some of the guts of a system that ruthlessly prioritizes engagement.

On Tuesday afternoon, Facebook rolled out a new feature: an in-app dashboard that allows Facebook and Instagram users to calculate (and ideally manage) the time they spend on the platforms’ mobile versions. The dashboard’s timing is curious — though first announced in August, the rollout comes in the middle of yet another crisis for Facebook, one that’s left lingering questions about its leadership’s fitness to steward its 2 billion–plus users. At first glance, the feature may feel like a sign of progress — a rare move to give users more control, perhaps even at the expense of coveted “time spent” metrics.

In reality, the dashboard does little to address what ails Facebook. Far from delivering on its “time well spent” initiative, the in-app calculator is little more than graphically enhanced window dressing. It gives the impression of empowering the user while offering few ways to cut down on usage (save for a notification that can be quickly dismissed). The tool is also further confirmation that Facebook seems intent on addressing the platform’s myriad marginal issues while ignoring the engagement-maximizing architecture that allows everything from fake news and harassment to hyperpartisanship to spread.

A quick tour of the dashboard provides some insight into how Facebook seems to calculate engagement (in a word: bluntly). Users can see how many minutes they’ve spent on the app on a particular device each day for the past week, as well as their average daily time on the app. Given the mass of data collected and analyzed by Facebook, this wellness snapshot lacks granularity. As Josh Constine noted for TechCrunch, the dashboard “seems to ignore the research Facebook itself has presented about digital well-being on social networks” by treating all time spent on Facebook as one bulk figure, rather than dividing it based on what users are actually doing. Facebook could, for example, show how much time someone spends commenting on posts and photos versus scrolling through the News Feed versus interacting in Groups or planning events or even messaging.

Instead, it gives people hardly more information than they can get from the digital well-being tools already built into the iPhone and Google Pixel.
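To make Constine’s distinction concrete, here’s a minimal sketch in Python of the difference between the bulk figure the dashboard reports and a per-activity breakdown. The session log and activity names are entirely hypothetical; Facebook’s real event taxonomy isn’t public.

```python
from collections import defaultdict

# Hypothetical session log of (activity, minutes) pairs. These activity
# names are illustrative only; Facebook's actual event data is not public.
sessions = [
    ("news_feed_scroll", 22),
    ("commenting", 6),
    ("groups", 11),
    ("messaging", 9),
    ("news_feed_scroll", 17),
]

# What the dashboard reports: one bulk figure.
total_minutes = sum(minutes for _, minutes in sessions)
print(f"Time on Facebook today: {total_minutes} min")

# What it could report instead: time divided by what the user is doing.
by_activity = defaultdict(int)
for activity, minutes in sessions:
    by_activity[activity] += minutes
for activity, minutes in sorted(by_activity.items(), key=lambda kv: -kv[1]):
    print(f"  {activity}: {minutes} min")
```

The same log supports both views; the dashboard simply chooses the blunter one.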

Perhaps this shouldn’t come as a surprise. Facebook, after all, is a platform historically powered by prioritizing and rewarding a very simplistic idea of user engagement. You react to something, or you don’t. And if you do, you’ll see more of the same.
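As a thought experiment, here’s a crude sketch of that loop in Python. It’s a deliberate caricature, not Facebook’s actual (proprietary, far more elaborate) ranking system: every reaction counts the same, and engagement alone decides what surfaces next.

```python
from collections import Counter

# Toy model of "react and you'll see more of the same." A caricature of
# the generic engagement-feedback pattern, not Facebook's real ranking code.
interest = Counter()  # per-topic engagement tally for one user

def record_reaction(topic: str) -> None:
    # Any reaction counts the same, regardless of kind or sentiment.
    interest[topic] += 1

def rank_feed(candidates: list[tuple[str, str]]) -> list[str]:
    # Sort candidate posts purely by how often the user engaged with
    # each post's topic; Counter returns 0 for unseen topics.
    return [text for topic, text in
            sorted(candidates, key=lambda post: -interest[post[0]])]

record_reaction("caravan_meme")  # one angry click on an inflammatory meme...
feed = rank_feed([
    ("wedding", "A friend's wedding announcement"),
    ("caravan_meme", "Another inflammatory meme"),
])
print(feed[0])  # ...and the inflammatory topic now outranks the wedding
```

Nothing in the loop asks whether the engagement made the user happier or better informed; it only asks whether there was engagement.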

In 2014, my colleague embarked on a 48-hour binge, liking every single post that came his way — an experiment that ended with a ravenous News Feed sending him down a dark, polarized rabbit hole that in many ways presaged the hyperpartisan hellscape to come. Two years later, I attempted a similar experiment, doing everything Facebook’s algorithms told me to do. The result was a Facebook experience that felt anything but human; for all the information I fed it, the algorithm never really knew me, making clumsy choices and mistaking passing fancies for deep obsessions.

These same machine-made calculations and rewards created Facebook’s toxic feedback loops. They’re the dizzying mathematical equations that define a system that processes an advertisement, a xenophobic meme about a migrant caravan, and a wedding announcement as roughly equivalent units of content. It’s the architecture that prioritizes engagement, even if that engagement is making us miserable. It’s the set of rules that still aims to keep you logged in and engaged for as long as possible.

This disconnect isn’t just algorithmic; it comes from the top. Guided by its CEO, Mark Zuckerberg, the company is plagued by an inability to see itself the way that outsiders do. That blind spot has persisted for years and helps explain the mixed results of everything from the Facebook “Home” phone to its “Free Basics” push to become the internet for the developing world.

These initiatives always followed what felt like an overly idealistic understanding of not just Facebook’s users, but the internet at large. Facebook’s Internet.org promotional materials argued, “the more we connect, the better it gets,” while Zuckerberg’s own ideology surrounding the company’s ruthless expansion boiled down to one simplistic credo: “It is always better to have some access than none at all.”

Even on issues of privacy, Zuckerberg has steadfastly promoted the false idea that users are in control of their own Facebook experience, despite the fact that much of the experience is controlled by what its algorithms feed the user. Zuckerberg’s own description of his platform in front of Congress this year projects a facile version of his product. “That’s what the service is: You can connect to the people that you want and share what matters to you,” he said. “That’s photos or links or posts. And you get control over who you share it with, and you can take it down if you want, and you don’t need to put anything up.”

All the while, the company — which did not respond to a request for comment — has never meaningfully acknowledged that the public has grown increasingly wary of its platform. For a business so deeply committed to collecting and analyzing information, it has been historically inept at deciphering the data.

The company’s new dashboard reflects this. It flattens the user experience, but more importantly, it offers only superficial solutions to a complex problem. Users can tweak preferences on photos and videos and posts in order to “make the most of your time.” But those fixes are cosmetic at best — individual users might be able to tailor their feeds slightly, but the incentives that govern the platform (gamifying and monetizing attention, as well as prioritizing clicks, shares, ads, and money over quality of information) will largely remain the same.

A similar myopia applies to many of the proposed fixes for Facebook. In a Washington Post op-ed after a bombshell New York Times story last week, former Facebook security chief Alex Stamos proposed that Congress “codify standards around political advertising” and work with the platform to “draw a thoughtful line between the responsibilities of government and the large technology companies.” While Washington would do well to heed Stamos’s advice, the proposal leaves out a key player: Facebook, the platform.

Internally, the social network’s remedies fixate on the margins. Last Thursday, the company held a conference call to tout a number of new initiatives aimed at better enforcing community standards. Among the announcements: new technology to fight child exploitation and better ways to proactively take down hate speech, bullying, and harassment. And while these are laudable efforts, they ultimately ignore the platform mechanics that produced the glut of hate speech, bullying, and harassment in the first place: namely, that such content has historically been rewarded, either with audiences or money or both.

What Facebook and many of its allies struggle to see is that meaningfully fixing Facebook is not really about putting the granular ad disclosures in place (though they’d be welcomed) or finding the perfect balance between government and internal regulation (though it might help protect users). The last two years have shown that what the platform needs is something closer to an overhaul — stripping out some of the guts of a system that ruthlessly prioritizes engagement.

Ultimately, the failure to fix what ails Facebook is a failure of imagination. It’s a failure to conceive of the social network as anything other than a viral advertising platform with a side of political discourse, life updates, and baby photos. As writer L. M. Sacasas argued this summer, criticisms of Facebook are “not so much a rejection of the machine ... but, at best, a desire to see the machine more humanely calibrated.”

Facebook’s new dashboard is a calibration. What’s needed is something more like a realignment. That’s why the feature feels so impotent. It’s a tool designed to treat visible symptoms rather than the underlying disease, built by a company that can’t see itself as anything other than what it is now.

Which is another way of saying that it’s a tool that only Facebook could build.

