It’s my belief that Facebook is a force for good and a net positive for society. I believe this, because it’s what I see and experience every day.
In my job, I’m privileged to see and hear stories of people who connect with old and new friends, people who feel less alone because they’ve been able to join a group that shares their passion or concerns, and people who have been given a voice that would have in the past been unheard.
I believe this because of the other ways our platforms help people to build community, whether that’s seeing people on Facebook raise more than £750m for charities with our fundraiser tools or seeing our Safety Check feature notify people that their families and friends are safe more than three billion times.
In the UK, it’s the four million-plus businesses that use our services, from some of the most established, sophisticated companies and agencies in the world to the people with the courage to start their own business, helping to drive sales, fuel growth and create jobs.
I say all this not to distract from the challenges we face – hard and complex issues, some of which were this week raised by Gideon Spanier and that I want to address here – but to underline that we see all of human nature on our platforms. Some of it is bad, but the vast majority is good and positive.
We recognise that while the internet has transformed how billions of people live, work and connect with each other, new forms of communication also bring huge challenges.
It’s also right that we acknowledge we’ve made mistakes. As we continue to build new things, we can be sure there will be more hard trade-offs and we won’t get all of them right. That’s why we will rely on our partners to keep challenging us.
One challenge that Gideon rightly identifies is how you regulate the internet. It’s a big question with profound implications and one our chief executive recently addressed in The Washington Post.
We know that to be truly effective in thwarting bad actors, we need to work together with government and regulators. The UK government’s Online Harms White Paper is an important opportunity to do this. As Mark Zuckerberg said last month, new regulations are needed so that we have a standardised approach across platforms and that private companies such as Facebook aren’t making so many important decisions alone.
It’s right to be held to account and for us to live up to the expectations of people, businesses and government.
However, we’re not just waiting for regulation, we’re taking action now.
Thanks to investments in people and artificial intelligence, we’re making progress in keeping harmful content off our platform: taking down more than 99% of fake accounts, spam and terrorist content before any user reports it to us, and more than 96% of adult nudity, sexual activity and violent content, while making progress in other important areas such as hate speech.
Our regular Transparency Report reflects our ongoing efforts to reduce bad content on our platforms.
We’ve also increased the number of people focused on safety and security to 30,000, about half of whom are content reviewers who review more than two million pieces of content every day.
To put this in perspective, this year we’re spending more in this area than our whole revenue at the time of our public offering.
I also want to address the subject of brand safety on Facebook.
On any given day, more than two billion unique news feeds are generated; everyone’s is different. What people see in their feed depends on a range of factors, including their friends, who they follow, the content they view and the actions they take, such as likes and shares. And because everyone’s news feed is different, we don’t target ads to content but to people.
Where we do allow ads that are linked to the content – for example, in Watch and Audience Network – we give advertisers a great deal of control and choice over where their ads appear.
Let me also be frank: zero tolerance does not mean zero occurrence. But be assured, we’re doing everything we can to stay ahead of these challenges. While we can’t promise that every person with malicious intent will be stopped, we continue to work hard at minimising the bad and maximising the good.
It’s this good that I see on our platforms, along with the investment and the dedicated people working hard to keep our community safe, that makes all of us proud to come to work.
I promise that we’ll keep working hard every single day to bring the world closer together and to provide value for the people, communities and businesses that use and rely on our platforms.
Steve Hatch is vice-president, northern Europe, at Facebook