On the morning after the EU referendum, did you wake up and give your phone a perfunctory glance just to check Britain had not left the European Union, only to be stunned when you realised your fellow citizens had voted for Brexit?
[Image credit: Michael Vadon via Wikimedia Commons]
Were you left with a similar feeling of complete disbelief when – more recently – the people of the US voted in a presidential candidate who, as far as you were aware, was widely reviled?
Or maybe you voted for Brexit and were confused by the widespread dismay at the result. As far as you knew, all your friends were against staying in the European Union, and the media said leaving would definitely improve life for Brits.
If any of the above scenarios sound familiar, then it’s likely you have been the victim of a filter bubble. In some way, all Facebook users are.
[Image credit: iStock/TheaDesign]
What is a filter bubble?
Filter bubbles arise from website algorithms that guess what an individual user would like to see based on information about that person. This information may include past click behaviour, location and search history – although organisations like Google and Facebook have traditionally been very secretive about exactly how their algorithms work.
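To make the idea concrete, here is a purely illustrative sketch of that kind of personalisation: rank content by how much it overlaps with topics the user has clicked on before. This is an assumption-laden toy, not Google's or Facebook's actual algorithm – the real systems weigh hundreds of signals that the platforms do not disclose.

```python
# Toy sketch of personalised ranking - NOT any real platform's algorithm.
# Each item is scored by how many of its topics the user has clicked before,
# then the feed is sorted so the highest-scoring items come first.

def personalised_feed(items, click_history):
    """Rank items by overlap between their topics and the user's click history."""
    clicked_topics = set(click_history)

    def score(item):
        return sum(1 for topic in item["topics"] if topic in clicked_topics)

    return sorted(items, key=score, reverse=True)


items = [
    {"title": "Party dresses for winter", "topics": ["fashion", "shopping"]},
    {"title": "Men's shoes on sale",      "topics": ["shoes", "shopping"]},
    {"title": "EU referendum analysis",   "topics": ["politics", "eu"]},
]
history = ["fashion", "shopping", "fashion"]

for item in personalised_feed(items, history):
    print(item["title"])
```

Note what never happens here: nothing a user has ignored in the past ever rises to the top – which is exactly how the bubble forms.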
In his book The Filter Bubble, Eli Pariser, chief executive of Upworthy, described the effect as “that personal ecosystem of information that’s been catered by these algorithms”.
The Facebook News Feed is a prime example of this. Once upon a time, logging into your account would present you with status updates and photos your friends had uploaded, in chronological order. Indeed, in 2006 – back when you needed a university email address to get a Facebook account – my Facebook experience was just one “Craig got 2 drunk last night lol” status after another.
But in 2016, just as your pals have gradually become more savvy when it comes to what to show you on social – Craig is all about heavily-filtered travel snaps now – so has Facebook. You’ll see posts from some friends and not others, and content appears in a seemingly random order. Often there will be more articles and adverts than posts from people you know. These might come from organisations you follow on the social platform, but Facebook also distributes content via ‘suggested posts’, tailored to your specific interests.
In theory, this should be a good thing, as you’re being shown relevant content that you, personally, will be interested in. As a very basic example, if you’re a woman who has recently searched for a Christmas party dress online, you’re likely to be more interested in articles and adverts surrounding female fashion for the winter party season than you are in content focussing on men’s shoes. Helpful, right?
[Image credit: iStock/scyther5]
Why is this a problem?
When it comes to news, this approach can create a filter bubble with sinister implications. Ultimately, it means that people can become isolated in their own cultural or ideological bubble. Everything a user sees on social platforms and search is in line with their established world view, backing up any misinformed opinions. This can also mean they become separated from potentially vital content that disagrees with their opinions, or at the very least, would offer them an alternative.
German chancellor Angela Merkel spoke out against the rise of the filter bubble in a high-profile speech, claiming that Facebook had a duty to show its users unbiased and accurate news. She called for major internet platforms to reveal the secrets of their algorithms, suggesting that filter bubbles endanger debating culture.
She said the issue is a “challenge not just for political parties but for society as a whole”, adding that this lack of transparency could “lead to a distortion of how people perceive things”.
Ultimately, in a world where people are heavily influenced by social media and online content, a system that controls access to facts, information and opinion could not only divide society and cause serious unrest, but actually change the results of elections, threatening democracy.
In some cases, it could potentially change the results of elections without people actually knowing what they’re voting for. Some people believed that if Brexit happened, all immigrants would instantly be expelled from Britain, or that £350 million per week would go into the NHS – myths that were quickly busted once the results were revealed. Reading content other than the articles in their online social bubble could have dispelled the notion that either of these things was likely or even possible.
[Image credit: iStock/Pinkypills]
Would Facebook really get rid of the filter bubble?
I spoke to Catherine Cooke, our head of social at Axonn, who said it’s unlikely that Facebook would remove the filter bubble function – at least without legal coercion – due to the success and popularity of the service.
And this isn’t necessarily a bad thing.
While the filter bubble can be risky when it comes to editorial news and opinion pieces, the analytics and targeting data that power it are actually transforming digital marketing from a consumer perspective.
She said: “Users are less frequently pestered by totally irrelevant ads, and are much more likely to see marketing content that they find interesting.
“The discontinuation of the filter bubble, from the digital marketing perspective, would be a great shame as it would usher back in more irrelevant and inappropriate ads.”
So while the filter bubble can be problematic when it comes to news or opinion content, it can be very helpful from a consumer perspective – ensuring you’re not bored or disengaged by content totally irrelevant to you.
Facebook and fake news
In the wake of the US election, experts are suggesting that ‘fake news’ or misinformation – primarily shared through Facebook – could be another piece of the puzzle behind the shock surrounding this year’s world events.
While satirical sites like The Onion and Clickhole peddle what are clearly spoof stories, there are others that are not so easy to distinguish from the truth.
BBC’s Newsbeat spoke to the ‘chief reporter’ at Southend News Network: a ‘news site’ that deliberately publishes fake stories. He said that while it all started as “a bit of a joke”, his site rakes in up to two million views per month. In a world where online content providers are constantly clamouring for clicks, it’s easy to see why he’d continue with this business model, no matter how harmful it could be.
“Half the people fall for the stories, the other half are genuinely entertained by what they read,” he said.
Facebook has been one of the biggest distributors of such news, and in the run-up to the US election, several world leaders called for the social platform to pull such content, predicting it could influence the outcome of the election, as well as other world events.
In a blog post responding to concerns over misinformation, Facebook chief Mark Zuckerberg said: “Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We’ve been working on this problem for a long time and we take this responsibility seriously. We’ve made significant progress, but there is more work to be done.”
He added that links people report as false on Facebook are penalised in the platform’s News Feed – along with clickbait, spam and scams – so “it’s much less likely to spread”.
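The principle Zuckerberg describes – down-weighting reported links so they spread less – can be pictured as a simple penalty applied during feed ranking. The sketch below is hypothetical: the penalty factor, field names and scoring are assumptions for illustration, not Facebook’s real system.

```python
# Hypothetical sketch of down-weighting reported links in a feed ranking.
# The 0.1 penalty factor is an assumed figure, not a real Facebook value.

REPORT_PENALTY = 0.1  # reported stories keep only a fraction of their score

def rank_feed(posts, reported_urls):
    """Sort posts by engagement, heavily penalising links reported as false."""
    def adjusted_score(post):
        score = post["engagement"]
        if post["url"] in reported_urls:
            score *= REPORT_PENALTY
        return score

    return sorted(posts, key=adjusted_score, reverse=True)


posts = [
    {"url": "real-news.example/brexit", "engagement": 80},
    {"url": "fake-news.example/hoax",   "engagement": 500},
]
reported = {"fake-news.example/hoax"}

ranked = rank_feed(posts, reported)
print([p["url"] for p in ranked])
```

Even though the hoax post has far more raw engagement, the penalty pushes it below the legitimate story – it is demoted rather than deleted, which matches the “much less likely to spread” framing.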
However, Zuckerberg acknowledged that these problems are complex, both technologically and philosophically.
“We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible,” he said. “We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”
What does this mean?
Of course, it is far too simplistic to say that the plethora of fake news on Facebook and its ideologically isolating filter bubble algorithm are solely responsible for electing Donald Trump when there are endless contributing factors.
No-one can doubt the media’s sway over voters, and yet, Trump didn’t have a lot of support among mainstream media outlets, with the Clinton campaign dominating the news agenda. This fact, however, may not have been immediately apparent to right-wing Americans, who might only have seen Trump-supporting stories pop up on their social channels.
As people increasingly go to social platforms for their news, Facebook and other such services are going to be faced with many more questions surrounding censorship. No longer is the internet the global village where every opinion is equally heard – now there are algorithms and influencers, and people with the power and the information to decide what you see.
Whether or not Facebook manages to combat fake news in the community-minded spirit it intends, and whether or not it agrees to be more transparent about its filter bubble algorithm, it will soon have to tackle the ‘technological and philosophical’ problems surrounding censorship and controlling what people see online. And vitally, it will need to tackle these issues without harming its incredibly lucrative marketing-orientated business model.