Last week Unilever announced it would boycott Facebook's advertising if the platform did not clean up the swamp. And in Belgium a judge ruled that Facebook violated privacy laws. What is going on? Is it over and out for social media and other digital platforms?
The End of an Era?
Unilever wants Facebook and Google to clean up their act. It does not want to see its ads alongside messages that could diminish the value of its brands. Indeed, hate messages, sexist content, racist insults, threats, dubious humor… you can find it all on social media.
So the second-largest advertiser in the world wants to withdraw. The main reason is self-protection. Fast-moving consumer goods companies are very sensitive about the integrity of their brands. They don't want their ads to appear alongside “contrasting content”. In traditional media this is fairly easy to control; on social platforms it isn't. An algorithm decides what you get to see, and that is pretty annoying for both the user and the advertiser.
So it is possible that a hateful message appears right after an ad, or that the ad is published alongside other, irrelevant ads. There is no way for an advertiser to control that. In fact, there is very little control over what Facebook is doing at all. But acceptance of that seems to be coming to an end.
No Control over Digital Giants
Are we getting fed up with the unlimited growth of digital giants that seem to escape any form of control? Some of these players, like Google, have already been fined.
Privacy laws are increasingly strict. Facebook is publishing ads in traditional media to confirm that privacy is important to it, but it did not do so until regulators intervened. Privacy issues, fake news, algorithms, fines, and now a possible ad boycott: it has not been easy for social media platforms lately. Other digital platforms like Uber and Deliveroo are also seeing their aspirations curbed by new regulations, the application of old regulations, or the re-regulation of the industries they operate in.
App Fatigue
There is no unbridled growth anymore either. The question now is: are users withdrawing? There are some early signs that they might be. Twitter has made a profit for the first time, but its number of users is no longer growing. Facebook, too, has had to admit that use of its platform is less intense. And there is a surge, however modest, in the return to analogue alternatives: the paper notebook, the scrapbook, the pub, the paper book, vinyl, the face-to-face conversation. We are suffering from app fatigue.
But this does not mean that digital is going away. We will order our scrapbook on platforms like Amazon, look up restaurant reviews on Tripadvisor, and evaluate our employer on Glassdoor. Everything we do has a digital dimension. We share, buy and learn digitally, supported by digital platforms.
Trust
Digital has changed the way we do things: how we work, play, communicate, shop. But what has not changed is what people need. Digital platforms should not forget the single most important aspect of relationships: trust. Trust makes relationships easy. Without it, people will not share, learn or connect. People need to be able to trust the platform and to trust the other users on it.
Trust is King
That trust has been violated. These platforms have been the cradle of many good things, but they have also been a haven for the worst behaviour. And there was no control.
Digital platforms have repeatedly refused to accept responsibility for the content they distribute. In a peer-to-peer society they have not taken on the role of regulator, moderator or guardian of good taste. They have ignored the fact that people's cultural veneer is rather thin: the moment there is no control, people seem to lose their good manners. One of the most extreme examples of this was the rape of passengers by drivers working for Uber in India.
Digital platforms cannot hide behind the claim that they are merely technology companies offering a platform to their users while refusing to accept responsibility for the behaviour of those users.
Why Do People Misbehave (More) on Digital Platforms?
The main point is that people dehumanize others the moment they think there are no consequences for their antisocial behavior. They scold, insult, threaten and harass other people and see no problem with it. In social situations without visual contact, or worse, where interactions are anonymous, social norms seem to evaporate.
Facebook and others have clearly underestimated that. The world is not a nice place where everyone means well. As there is no self-correction on social media, and no control of identity, an army of trolls gets the freedom it is denied in the physical world. They might get the occasional push-back, but that does not stop them. And because most people are passive on social media, they do not intervene, defend or correct, just like in real life.
Back to reality
In real life, people have invested time and effort in creating laws that structure society. Over generations, a legal system has been built that reflects the norms of the era and regulates behavior. These laws define what is right and what is not, and how non-compliant behavior is dealt with. But this old-fashioned notion of law only works when there is also law enforcement.
And here comes trust. We can trust most others to behave according to social norms, which are often made explicit in laws. It is not a contradiction to base trust on norms: norms and laws define our behavior. Many of those laws are meaningful. Some come naturally, and some require extra enforcement.
But I can walk the streets of my city, because I know most people will not rob me. I can send my daughters to school, knowing that they will not be harmed. I can cross a street when there’s a green light because I know the cars will wait.
Trust and Verify
And no, I am not naïve. I have to teach my daughters to be vigilant. And when I cross the street, I still check whether the cars are really waiting. But in most cases there is no problem. And that is what trust is about.
Trust is determined by someone's competence (including in applying social norms), someone's loyalty and benevolence (they will not harm me), and someone's integrity (they will act according to the prevailing values).
And I have to be vigilant on digital platforms.
Digital Trust is a Matter of Control
So digital platforms need to emulate those norms and their enforcement. Technology needs to integrate the need for trust. And just as we used to trust people, we will also need to be able to trust technology; trust remains very human. When I book a house on Airbnb, I make sure I leave it behind in an orderly state, not only because I think that is the right thing to do, but also because I know that otherwise I will be exposed as a filthy guest and my reputation will suffer. So trust is not based on spontaneous behaviour alone; it is also based on the human control mechanisms built into the digital platforms.
Today, Facebook and others are investing more in moderation and vigilance. They kick out terrorist accounts, remove inappropriate content and enforce the rules of the platform. In a sense, they are policing their own platform. But how many people do you need to monitor, and intervene in, a community of more than a billion users? Here come the algorithms again.
Small is beautiful?
Maybe the platforms have become too big and too slow to adapt. Where are the new platforms that live by the rule of “small is beautiful”? Where are the platforms that emulate society better? Where are the platforms that offer algorithms that are unbiased and inspire trust?
They exist. They work like Slack and other collaboration tools. These will be the new micro-communities where people who know each other and want to work together meet digitally: closed communities that are willing to pay to be ad-free. These communities will have their own leaders and contributors. And maybe people will program their own platforms to escape the reach of billionaires who use their data as raw material.
Facebook might have understood this. They want their platform to focus on what friends and relatives have to say, and to shield that intimacy from the big bad world. I wonder if they will be able to do that.
A new Digital Era?
I think digital is entering a new era. Old solutions (like the rule of law) will correct the much-hailed free-speech principles of social media, and from that conflict something better will emerge. But it won't look like a giant Facebook anymore. It will be small and intimate. And above all, it will be based on trust and the right amount of social norms and enforcement, combined with the incredible power of digital.