You have asked that your board of directors provide direct feedback and guidance on issues of serious consequence to Facebook. We believe that we are at such a time, and your board felt the need to provide you and your management team with our views regarding recent events, as well as, frankly, a dose of displeasure. We sense that Facebook's content safeguards are not being designed or managed properly. Here are our thoughts:
We understand that Facebook has 2.3 billion content sources. And we know that both human and technical controls are required. And yes, we see that your team has certainly tried hard. But, by any objective measure, Facebook's controls have failed miserably, as evidenced by the sickening tragedy in New Zealand that was broadcast live over our social network. It’s time for Facebook to grow up. Now.
First, many of us on the board were surprised to see Monika Bickert's team profiled so openly in Vanity Fair. Of course we are glad to see our Community Standards made public, but our belief is that a team focused on policing hate speech and highly inappropriate content should maintain a more modest profile. And that (frankly) tiny-looking group of camera-ready team members made this grave function look flippant.
As you will recall, we were happy with the recent note you posted on the topic. It was well-structured, clearly argued, and 100% reasonable. But as we discussed then, it was possible that such focus on collaboration, artificial intelligence, and respect for individual expression – while admirable – might be too weak. Perhaps the time has come for the Facebook team to become stronger – and yes, perhaps even unreasonable. Here's what we recommend:
Tighten the Acceptability Threshold – We know that Facebook content is more engaging as it shifts to the edge of acceptability. And yes, it's good for business to keep this loose. But by tightening this threshold of acceptability, you can improve things considerably. And yes, a side effect is that Michelangelo's David will get flagged, but we need much tighter content requirements. Monika should focus on this immediately.
World-Class, Real-Time Ops – Many of us on the board have built professional operations centers, which are run with maniacal attention to detail, every second of every hour. And these centers are organized with a real-time command structure, usually led by ex-military personnel. Please instill in your team the belief that one bad post could literally end the world. With this in mind, we advise creation of a live, full-alert nerve center for policing content.
Oversight With Real Consequences – As a board, we will not advocate government regulation. But we all learned through experience that necessity is the mother of invention. It's time now to accept that your mission is not best-effort filtering of the vast majority of bad posts. Rather, your mission must be to stop all unacceptable posts. Period. With no exceptions. And when that mission is violated, the consequences to you personally will be severe.
None of this is fun. And it will cost a lot of money. Facebook will suffer. But we are confident that your team will identify creative solutions. Here's an example (and maybe Monika does this now): Of the 2.3 billion content sources, we'd bet that half would never, ever stream a live murder. This includes IBM, the USDA, Catholic Charities, and so on. These groups live at the steep front end of a Pareto distribution, and thus require much less focus.
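The Pareto point above can be made concrete with a small simulation. This is an illustrative sketch only, not Facebook data: risk scores for content sources are drawn from a hypothetical Pareto distribution, and the shape parameter (1.16, which yields roughly an 80/20 split) is an assumption.

```python
import random

random.seed(0)

# Hypothetical risk scores for content sources, drawn from a Pareto
# distribution: a few sources carry most of the risk, and a long tail
# (the IBMs and Catholic Charities of the world) carries almost none.
N = 100_000
risks = sorted((random.paretovariate(1.16) for _ in range(N)), reverse=True)

total = sum(risks)
top_20 = sum(risks[: N // 5])  # risk carried by the riskiest 20% of sources
print(f"Top 20% of sources carry {100 * top_20 / total:.0f}% of total risk")
```

If risk really were distributed this way, review effort concentrated on the steep front end of the curve would cover the large majority of total risk while leaving the benign tail largely alone.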
Your board estimates that you will need to hire three thousand full-time experts to work in this area, developing new solutions (such as time-delays for live streaming), as well as to staff the nerve center. Assuming salaries of about $150K, this will cost about $450 million. But this is reasonable for a company with a market cap of half a trillion. None of us would blink at requesting that Verizon do this sort of thing. And their market cap is half ours.
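One of the solutions mentioned above, a time delay for live streaming, amounts to holding each segment in a buffer long enough for a check to run before it reaches viewers. The sketch below is hypothetical: the class name, the review callback, and the delay length are our illustrative assumptions, not a description of any real Facebook system.

```python
from collections import deque
import time


class DelayedStream:
    """Hold live-stream segments for a fixed delay before release, so an
    automated check (or a human in the nerve center) can drop flagged
    segments. Minimal sketch; the is_acceptable hook is a placeholder."""

    def __init__(self, delay_seconds, is_acceptable):
        self.delay = delay_seconds
        self.is_acceptable = is_acceptable  # callback: segment -> bool
        self._buffer = deque()  # (arrival_time, segment) pairs, oldest first

    def push(self, segment, now=None):
        """Accept a segment from the broadcaster."""
        now = time.monotonic() if now is None else now
        self._buffer.append((now, segment))

    def release(self, now=None):
        """Return segments whose delay has elapsed and that passed review."""
        now = time.monotonic() if now is None else now
        out = []
        while self._buffer and now - self._buffer[0][0] >= self.delay:
            _, segment = self._buffer.popleft()
            if self.is_acceptable(segment):
                out.append(segment)  # flagged segments are silently dropped
        return out
```

For example, with a 30-second delay and a toy filter, a flagged segment never reaches viewers:

```python
stream = DelayedStream(30.0, is_acceptable=lambda s: "violence" not in s)
stream.push("frame-1", now=0.0)
stream.push("violence-frame", now=1.0)
stream.release(now=10.0)  # nothing released yet; delay not elapsed
stream.release(now=31.0)  # releases "frame-1" only; flagged frame dropped
```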
Mark, we've stuck with you through thick and thin, and we've admired your wonderful ability to evolve personally and professionally with the changing landscape. This goes for your management team as well. But the recent live streaming of a mass murder is too much for this board – and we will not sit by and watch ineffective and insufficient security methods be used to police content. Get started on this now. We will be watching.
The Board of Directors of Facebook.