Enabling Diversity Through Decentralised Social Media

Awareness of the problematic aspects of social media has been brewing for a long time, but recently it has exploded into public debate, driven by the effects of “fake news” and “echo chambers”, and by questions about the editorial role of Facebook. The response of Facebook has not impressed commentators. To make my position clear from the start: this is a very complex problem to solve, and it involves social, economic and psychological problems that need to be addressed. However, I wish to make the case that the technology also matters, and that an approach must necessarily begin with changing the underlying technology. In this blog post, I would like to list the primary social problems that I have personally identified. I will try to take a positive angle: this is not just about Facebook and its failings, it is about what we should create for the future. The echo chamber problem is only one of these social problems.

The Echo Chamber Problem

My personal history with this problem doesn’t start here; it started 20 years ago, when I took on managing the Web site of the Norwegian Skeptics Society. The subjects we were concerned with were fairly marginal and easy to label as superstition: astrology, UFOs, and so on. It soon became quite apparent that the problem wasn’t reaching out with information; the actual problem was much, much harder: getting that information into closed minds. Since then, this has grown from a marginal concern considered only by a few into a major social problem.

Now, some may say that the problem is not technological, and possibly that technology is indifferent to the problem: any technological platform can help get information into closed minds, and any technological platform can provide infrastructure for opposing viewpoints and enable a debate between them. I disagree, and I think recent events make that clear. Even if technology alone cannot open closed minds, there are technological choices that are critical in enabling the needed platform for an open public debate.

First, it is important to constrain the scope of the technological problem by understanding which problems have a different origin. The reason why fake news thrives on Facebook is complicated, and this article argues it comes down to the emotions of the people in your social network. This article contains a discussion of why “popping the bubbles” is problematic. It is also important to be reminded of the effects of identity protective cognition. Facebook has itself published research on the issue. What is interesting is that nowadays my Facebook feed is full of anti-Facebook sentiments, but none of these articles showed up there. I had to go looking for them; only after I started sharing such articles in my News Feed myself did similar articles start to surface. Now, the “echo chamber” and “filter bubble” metaphors may not reflect the true nature of the problem, but Facebook can only argue that they are not doing so badly because of the lack of an ambitious baseline; we do not yet know what could be achieved if a structured, cross-disciplinary approach were taken. Even if the most important effects are social and psychological, information that isn’t available certainly cannot be acted upon.

To further understand the problem, we should listen to Facebook’s Patrick Walker, who responded to the “Terror of War” photo removal with a keynote given to Norwegian editors. The keynote is well worth watching, not just because it provides insight into the core problems, but also because it hints at the road ahead.

Patrick Walker himself gives an excellent summary of the accusations they face:

“[…] people said that we weren’t transparent, there’s no way of knowing what our algorithms do, that publishers don’t get any warnings about changes to the news feed, that we refuse to accept that we were editors and were abdicating our responsibility to the public, that we were trying to muzzle in on the media business and eroding the fundamentals of journalism and the business models that support it. That by encouraging editors to optimize for clicks and showing people what they want to see, we were creating filter bubbles and echo chambers.”

He then goes on to present the values of Facebook: to give people the power to share and to make the world more open and connected. I have no reason to doubt that they are actually trying to do that. In fact, they may often be succeeding. However, it is the Web that, to a much greater extent, gives people the power to share. The Web is what truly empowers people, Facebook included; Facebook is merely a thin layer on top of the Web. The problem is that it is truly a layer. If Facebook were just yet another possible social network, with a limited scope just like Mark Zuckerberg proposed, that would be fine.

But it isn’t; it wields much more power than that, since it has effectively taken the function of a social network into its own hands and controls it. Patrick Walker then goes on to describe how difficult it is to create global community standards for Facebook, and how problematic it would be if these standards did not apply to all of Facebook. He then concludes that people are free to say whatever they want elsewhere, but not on Facebook. This part of the talk is very compelling, and it calls into question the appropriateness of calls for national or EU-mandated regulations. But it also makes it clear that Facebook cannot play the role of a public debate platform; he said that pretty much himself. The moment an opinion becomes unpleasant, and those are the opinions that need the most protection, it has no place on Facebook. He says it clearly: “Facebook is not the Internet”. This makes it clear that to solve most of the problems he mentioned, we have to create an alternative to Facebook that provides the platform for an open, public debate.

It is also clear from his talk that many of the problems Facebook faces stem from Facebook needing a single set of rules, and while he has made a good case for that for Facebook, it doesn’t have to be that way on the Internet. In fact, the architecture of the World Wide Web is decentralised; no single actor, such as Facebook, should control a feature as important as the social network. Decentralising the social network will have the effect of allowing a plurality of standards. Facebook only has to solve Facebook’s problems; these problems are not universal, and that is why I say that the underlying technology matters. A decentralised social network has a different set of problems, some of them difficult, but it is my clear opinion that the hardest problems are created by Facebook itself, and they can be solved because decentralisation enables diversity and pluralism.
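To make the point about a plurality of standards concrete, here is a minimal, entirely hypothetical sketch in Python; it is not any existing protocol or product, and the names are made up for illustration. The idea is simply that each community runs its own node with its own moderation policy, and people choose which nodes they publish to, rather than one global rule set being imposed on everyone.

```python
# Hypothetical sketch: in a decentralised network, each node applies its
# own community standards, instead of one global rule set for everyone.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Post:
    author: str
    text: str

@dataclass
class Node:
    name: str
    # Each community defines its own policy: a function deciding what it hosts.
    policy: Callable[[Post], bool]
    posts: List[Post] = field(default_factory=list)

    def publish(self, post: Post) -> bool:
        # The node itself, not a single global operator, decides what it accepts.
        if self.policy(post):
            self.posts.append(post)
            return True
        return False

# Two nodes with different standards, coexisting on the same network.
family_node = Node("family-friendly", policy=lambda p: "graphic" not in p.text)
debate_node = Node("open-debate", policy=lambda p: True)

post = Post(author="alice", text="A graphic but historically important photo")
print(family_node.publish(post))  # False: rejected by this community's policy
print(debate_node.publish(post))  # True: accepted elsewhere on the network
```

The point is purely architectural: a post like the “Terror of War” photo could be rejected by one community and welcomed by another, without either decision having to be global.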

The General Lack of Diversity

As Walker noted, they have been accused of attacking the ability of the press to perform quality journalism. There is some merit to the argument, even though it was easy to predict, more than 15 years ago, before the social networks grew large and strong, that social media would become the most important distribution channel. Now, the press has to choose between letting Facebook be the editor-in-chief, and hopefully a benevolent provider of a working business model, or maintaining its autonomy, essentially starting over and figuring out how to make money in social media on its own.

The problem is not just about Facebook or the press. Recently, Elon Musk said of Tesla’s carpooling efforts that it “wasn’t Tesla vs. Uber, it is the people vs. Uber”, implying that Uber is a monopoly problem waiting to happen.

Centralisation is not only a problem for opinions in the public debate and for business models, though both are important aspects. It also creates difficulties for permissionless innovation, an aspect central to the Web and the reason Facebook itself could succeed. And it limits the research that can be done on the platform; for example, no one else could have carried out the research referenced above, which places the owner of the social network in an ethically problematic, privileged position.

The General Lack of Universality

With diversity challenged, another key aspect of the Web is also challenged: its universality. Not everyone has a place on Facebook. The obvious group that is excluded is pre-teen children, not that they seem to care; in social networks, they certainly have a place. Moreover, people with cognitive disabilities will find the environment on Facebook very hostile, as they can be fooled into helping spread fake news and other material, including material that Facebook may legitimately delete. For some, much damage can be done before appropriate action can be taken, and their friends are not empowered to help. That is not what the social network should have been; what I first had in mind was to port the social cohesion of real life to the Web, but the opposite has happened. This is a great failure, but it is at least a problem that centralised systems could solve if they wanted to.

Combating Abuse

It gets even harder once we get to the problems surrounding revenge porn and “grooming”. I want to make clear that Facebook is not the target of this criticism; I am talking more generally here. The problem is severe, but it has relatively few victims, and I believe it cannot be solved if one thinks only in terms of commercially viable systems. The technical contributions towards solving this problem should, I think, be government funded. Decentralisation is not necessarily helpful technologically, but standards, and the adoption of one approach, could make a large impact. I think it is critical to addressing this problem that we enable trust models to work on the Web, so that people are empowered to look out for each other.

Respect for the Individual

Finally, and this is a key problem for the future as well as the present, there is the respect for the rights of individual citizens. We are moving towards an Internet of Things, where all kinds of sensors will provide lots of data, often connected to persons, and mining that data can yield information that each citizen would certainly consider highly sensitive. I believe we cannot simply go on, neither in research nor in business, and pretend these problems are not significant, or that they are simply Somebody Else’s Problem. I reject the notion that the majority doesn’t care: they care, but they are left powerless to remedy the problem. I hold it as a moral imperative to empower people, and we, as technologists, have to tackle this problem.

I fear that we may face an anti-vax or anti-GMO type of backlash if we do not commit to a privacy-aware infrastructure, so even if the moral imperative is not accepted, one should take this possibility seriously.

A Different World is Possible

I have already pointed out that decentralisation is an important technological enabler for a different world, and stated that this new infrastructure must be privacy aware. Obviously, neither the motivation for nor the idea of a decentralised social network is new; it has been tried for a long time. So, how can we possibly succeed now? I think several things have changed. Most importantly, it is now understood that this is a problem that has large effects on entire societies. Secondly, we have standards, and we have learned much from the past, both from technological mistakes and from research.

Now is the time for action. We need a broad understanding of the problems, but we also need to start building the fundamental technological infrastructure that we need to provide to the world, and to try out several possible approaches, approaches that can further the understanding of cross-disciplinary solutions. I hope to be able to contribute in this space.
