Why Mark Zuckerberg should set up a #deletefacebook team
March 2018 has not been a great month for Facebook.
The revelations about Cambridge Analytica showed that Facebook was at best complacent, at worst complicit, in the wholesale harvesting of data for improper purposes. By the time Mark Zuckerberg broke his silence — with a qualified mea culpa and some detail on how Facebook plans to do better — a great deal of damage had been done.
There was a precipitous drop in Facebook’s share price, and the #deletefacebook phenomenon: users signalling that now is the time for them to take the social media giant out of their lives. Even WhatsApp co-founder Brian Acton — who sold WhatsApp to Facebook for $16bn — joined the #deletefacebook chorus. This was followed by Elon Musk deleting the Facebook pages for Tesla and SpaceX, each with approximately 2.6 million followers.
So what does the #deletefacebook phenomenon mean? Could it be the beginning of the end for Facebook? What went wrong, and what can Facebook do about it? And, importantly, what can others do to learn from Facebook’s errors?
Could #deletefacebook be the beginning of the end?
In short, yes it could, but it does not need to be. Facing existential threats is not new to Facebook. The company understands that the competitor that could kill it might emerge at any moment, and it deploys all of its resources and ingenuity to stay ahead. It lives by the mantra “if we don’t create the thing that kills Facebook, someone else will”. The difference is that this time the threat comes from within.
What went wrong?
As a psychologist I know more about people than data or technology, so a key question for me is: how did a group of capable, intelligent and driven individuals make such a series of bad decisions? I believe clues can be found in a very revealing interview that Roger McNamee — an early mentor of Mark Zuckerberg and investor in Facebook — gave to Channel 4 News¹. When asked why Zuckerberg had not yet come forward to speak, he said the following:
“I think he is afraid. I think that his self image is that he has given the world this great gift, and that we should all be on our knees thanking him for having done this.”
If this is true, I think this indicates an interesting and unusual form of optimism bias: optimism bias as applied to social good rather than commercial success. Put simply, Mark Zuckerberg and Facebook may believe so fervently that they are a force for good in the world, that the opposite possibility does not enter their thinking. And this may also mean that those who disagree with that optimistic view of Facebook would either not be listened to, or would censor themselves for the sake of their own careers.
When Aleksandr Kogan — along with many others — was using Facebook’s liberal privacy settings to harvest data, no one stepped up to say “this could be harmful!”. When Facebook realised what had happened and demanded that Cambridge Analytica delete the data, no voice was heard saying “shouldn’t we check they did it?”. The voice of what could go wrong was not heard.
What can Facebook do about this?
Fortunately for Facebook, optimism bias has a treatment, and that treatment is the premortem². Devised by Gary Klein, and popularised by Daniel Kahneman, a premortem is a methodology that asks a team to imagine failure in the future, and then work out what would have gone wrong to cause that failure.
This has at least two benefits: first, it encourages the entire team to switch their mindset and apply their imagination to pessimistic rather than optimistic versions of the future; second, it legitimises the voice of those team members who already harbour doubts.
I would argue, though, that the challenges at Facebook are sufficiently severe that the company should take this idea a step further. Rather than having premortems for individual projects, Mark Zuckerberg should use some of his near-infinite resources to create a #deletefacebook team. He should hire the best and the brightest Facebook opponents, and dedicate them to showing that Facebook is not a social good. They should explore the potential dark consequences of Facebook’s reach into our lives, answering questions such as: how can Facebook’s data be used to cause harm? What is the cost of social media in terms of our relationships in real life? How does social media consumption affect self-esteem and self-image? The team’s output should be embedded in Facebook’s overall governance, ideally reporting into the board in the same way as an audit function would. WhatsApp co-founder Brian Acton would be a good first hire.
What does this mean for everyone else?
If your company — small or large — seeks to combine an inspiring social vision with an opportunity for commercial success, be on your guard. Great companies are built around a vision, but there is a fine line between believing passionately in what you are doing and gulping down the corporate Kool-Aid. And the products that are being built are powerful. As Nir Eyal quotes in Hooked, “if it can’t be used for evil, it’s not a superpower”. If it is your company that seeks to profit, it is your responsibility — not the responsibility of others — to ensure that you have a way to challenge your vision, and the way you are moving towards it. Building the premortem into your decision making, and your governance, is a good way of doing this. You can even have some fun with it: imagine Rick and Morty travel to a dimension where your product has destroyed the world, and then work out how that happened.
¹ https://www.channel4.com/news/roger-mcnamee-on-mark-zuckerberg-i-think-hes-afraid