Facebook isn’t committed to change

The platform has proved time and again that it only reacts when it needs to salvage its reputation.

It took a whistleblower, Congressional testimony and nearly two weeks of rage from the press for Facebook to announce it is working on features to protect teens' mental health on Instagram.

In an interview with CNN’s State of the Union on Sunday, Facebook’s VP of global affairs Nick Clegg said Instagram will introduce a feature called “take a break,” which will encourage young people to log off the platform if they are determined, by Facebook’s algorithms, to be using it in an unhealthy way.

Instagram will also “nudge” teen users who are falling down dangerous content rabbit holes to look at something else, and introduce new controls for parents.

These are baby steps in the right direction, and it took a lot to get here.

The timing is not a coincidence; it was just weeks ago that Facebook was getting ready to roll out a separate version of Instagram for kids. Facebook is implementing safety features directly in response to the vitriol that’s been directed at it over the past few weeks, underscoring again that this company only reacts to bad PR.

If Facebook has proved anything in its ongoing response to The Wall Street Journal’s Facebook Files series, it’s that denying until there’s no more denying is its strategy for fixing its gravest flaws.

There are still so many unanswered questions, and no timeline for these features to roll out.

What will trigger Instagram to tell a user to “take a break?” Will someone be told to “take a break” immediately after looking at content that glorifies eating disorders, or only after an hour? Does the same rule apply to consuming hours of horseback riding or cooking content? I hope these are questions advertisers are asking.

Advertisers, of course, have done this dance with Facebook many times before. For years, the industry has tried to work with the platform and other social media giants on brand safety issues (cough, YouTube), appealing to their better angels.

Major brands have even walked away from the platform, albeit briefly, in movements such as Stop Hate for Profit. Agencies have created brand safety consortiums and hired executives solely to look after brands getting caught in unsavory situations on social media.

Still, time and again, Facebook doesn’t change. And I’m not talking about fixing its algorithms and figuring out content moderation overnight. I’m talking about its willingness to actually change — to, in former Facebook employee and whistleblower Frances Haugen’s words, put people and civic integrity over profits.

At this point, the advertising industry is simply too reliant on Facebook to hold it to account. Agency execs and advertisers are torn about Facebook’s ability to redeem itself, still holding out hope that Facebook can fix its flaws. I wonder how much of that is wishful thinking from brands and ad executives who struggle to turn elsewhere.

Facebook may very well make its platform safer for teens. It may also start moderating content in a fairer way, or double down on efforts to demote rageful content and misinformation.

But the company’s responses to its challenges these past few weeks, and frankly the past few years, offer another proof point that, unless its reputation is on the line, Facebook won’t make any real changes to improve its platform. And even then, the changes are half-hearted and perfunctory, and the motivations behind them are questionable.

Buyer, beware.
