With Facebook facing ongoing criticism over the sale of users’ data, CommonSpace spoke to political scientists and data security experts in Scotland to learn more about the bigger picture of how social media is shaping our politics
“IF YOU go shopping for a CD online you’ll see you recently bought this and it advertises something similar, whereas you used to go into the shop and see things you didn’t expect. I preferred that.”
In 2014, Dr Mark Shephard, a senior politics lecturer at Strathclyde University, conducted research into the use of social media in the Scottish independence referendum. In 2017, he studied the role of social media in the breakthrough of Labour leader Jeremy Corbyn. Now, Shephard says he is concerned that social media might be limiting our choices and pushing people further into polarised “silos”.
The past two months have seen a new level of scrutiny directed at the role of social media in our lives and our politics amid revelations of the now defunct Cambridge Analytica’s alleged illegal use of Facebook data to influence the US election and, potentially, the Brexit referendum. But experts say that the entirely legal ways in which social media users’ data is shaping political discourse and campaigns should also be of concern to the discerning voter.
Polarisation on social media is often discussed in terms of trends in human and political behaviour, but Shephard suggests that the algorithms used by the likes of Twitter and Facebook could be behind it. What people see on social media, he explains, is based on past behaviour, so the information they’re exposed to tends to reinforce existing viewpoints – meanwhile, those on the “other side” are separated by an information divide.
READ MORE: Fraser Stewart: How social media algorithms can rob us of democratic access
Referring to the example of the independence debate, Shephard says this could serve to accentuate opposing views and obscure the wider spectrum of opinions on offer. “If you’re partisan, you’re going to see extreme Yes or No, but when you speak to people there are a lot of differences in between,” he says.
The consumption of information on social media, he says, can be similar to reading a news source with a particular bias. “For example, if you only watch Fox News, you will only get one opinion – with social media people are being encouraged to only see one view,” Shephard says.
Dr Kami Vaniea, a lecturer in cyber security and privacy at Edinburgh University’s School of Informatics, agrees that social media algorithms can serve to create a “filter bubble” whereby people see only the posts that match their apparent interests.
“If you really want a shock, go on Facebook and look on all your friends’ pages. You’ll see everything they’ve been posting that you never knew about,” she says. “That’s because Facebook does a curated feed of what they think you’ll like.
“You see some of this already – people tend to have a favourite news channel, but the difference is that you know that you’re doing that. With social media, you don’t even realise what you’re not seeing.”
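To make that mechanism concrete, the sketch below is a deliberately crude illustration of the kind of curation Vaniea describes: posts are ranked by how closely they match what a user has already engaged with, so the least familiar views quietly sink. It is written in Python with invented topics and friends, and is not any platform’s actual code.

```python
# Toy illustration of a curated feed (invented data, not a real platform's code):
# score each post by how often the user has engaged with its topic before,
# then show the highest-scoring posts first.
from collections import Counter

def rank_feed(posts, engagement_history):
    """posts: list of (author, topic) pairs; engagement_history: topics the
    user has previously liked, shared or clicked on."""
    interests = Counter(engagement_history)            # past behaviour becomes "interests"
    scored = [(interests[topic], author, topic) for author, topic in posts]
    scored.sort(reverse=True)                          # best-matching topics rise to the top
    return [(author, topic) for _, author, topic in scored]

history = ["indy-yes", "indy-yes", "football"]
posts = [("Friend A", "indy-yes"), ("Friend B", "indy-no"),
         ("Friend C", "football"), ("Friend D", "indy-no")]

for author, topic in rank_feed(posts, history):
    print(author, topic)
# The "indy-no" posts end up at the bottom: the user keeps seeing the view
# they already hold, and is never told that anything was filtered out.
```

Real ranking systems are vastly more complex, but the basic loop is the same: past behaviour shapes what is shown, and what is shown shapes future behaviour.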
What’s more, Vaniea explains, these algorithms can be utilised by those who wish to target particular groups, often in highly refined ways, be that to advertise products or to promote political campaigns. “There is an issue of targeting people but it’s not, as Google is always reminding us, done by a person – it’s an algorithm, so it must be fine,” she says.
In reality, she suggests, such practices could have “major political problems”, because it “takes away some of our ability to have discussions about what a politician is saying because you’re being individually targeted”.
READ MORE: Mick Clocherty: Why fluffy social media use from our police forces is bad in the long run
This is an issue which has come to the fore following the US election and alleged Russian interference, but Vaniea stresses that these methods are “very, very common”. “Obama used this incredibly well – targeting people down to the very house. It used to be seen as novel and interesting, but now that it’s Russia it’s seen as scary,” she says.
For Vaniea, the revelation of how Facebook data was used by Cambridge Analytica “isn’t really a surprise”.
“We’ve had marketing campaigns trying to influence what cereal you eat, and there’s been extensive research on that. It’s just more obvious when you look at politics as it becomes a bigger global issue,” she adds.
Vaniea recalls a New York Times article which revealed how Target used shopping information to determine which of its customers were pregnant in order to market baby-specific items to them before birth. “The idea with political opinions is quite similar – if I can specifically target your opinions on social media I can target information to you.”
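The same logic can be sketched in a few lines. The example below is purely illustrative, with hypothetical traits and messages rather than any real campaign tool: traits inferred from a user’s activity decide which version of a message that user sees.

```python
# Toy illustration of audience targeting (hypothetical traits and ads only):
# match a user's inferred traits against each campaign's target audience
# and serve whichever message fits best.
def choose_message(user_traits, campaigns):
    """user_traits: set of traits inferred from a user's activity;
    campaigns: list of (target_traits, message) pairs."""
    overlap = lambda c: len(c[0] & user_traits)        # how well the user matches each audience
    best = max(campaigns, key=overlap)
    return best[1] if overlap(best) else "generic message"

campaigns = [
    ({"rural", "fishing"}, "Ad stressing fishing policy"),
    ({"urban", "renting"}, "Ad stressing housing policy"),
]

print(choose_message({"urban", "renting", "music"}, campaigns))
# -> "Ad stressing housing policy". Two voters in the same debate can be
# shown entirely different claims, which is the loss of shared ground
# Vaniea describes: there is no single message left to scrutinise together.
```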
The problem at the heart of this issue, she says, is that “these types of approaches are silent from the end-user perspective”. With growing awareness of and resistance to such targeting, Facebook in the US now labels ads that have been paid for politically, and Vaniea says this “has a lot of influence on how people view information”.
While moves like this, and the introduction of the General Data Protection Regulation (GDPR) in the EU this month, are likely to have some positive impact in this area, Vaniea says social media companies continue to be reticent when it comes to transparency.
“Facebook use the word ‘algorithm’ when really they want to say ‘magic’, because they don’t want to tell you,” she says. Now, she adds, “there needs to be an element of accountability”.
Professor Sarah Pedersen, director of research in communication, marketing and media at Robert Gordon University, agrees that social media companies “need to take responsibility” for the content on their platforms. Currently, such companies are not classified as publishers, but Pedersen says this needs to change.
“In this country we have a variety of methods by which newspaper publishers are held to codes of practice. We don’t have similar codes of practice or ways of holding social media companies to account, and perhaps we should be thinking about how that should be done,” she says.
“We’ve seen Facebook CEO Mark Zuckerberg speak to Congress and [social media] executives speaking at parliament, but how do we make sure they act as the publishers they are? It’s clear that they need to be more transparent.”
READ MORE: David Carr: The malign dictatorship of King Zuckerberg
While Zuckerberg has been called to give evidence at the House of Commons as part of the Cambridge Analytica inquiry, Facebook confirmed this week that he has no plans to do so, despite the threat that he could be compelled to appear by a formal summons the next time he enters the UK.
Mark Shephard also thinks it would be “nice” if social media companies were more transparent, and suggests that they could explain to people that, for example, “based on your friends you’re likely to receive one kind of information and you’ll be in a silo”.
Indeed, he says: “It might be nice if there was a guide at the start of any campaign that explained the biases of different traditional and social media.”
With regard to the people behind the political campaigns taking advantage of social media’s capabilities, Shephard is clear that parties have always made use of available methods of targeting and influencing, and that “parties would be silly if they didn’t try to put their best foot forward”.
This, he says, is why it’s so important that people are able to access a broad range of sources of information, and why they ought to be aware that they may not be seeing the full picture. In light of this, he says, there is a need for people to take action themselves.
Shephard compares it to the algorithms that recommend music based on what you already listen to.
“It’s up to us as citizens to say: we’re being sold Ultimate Kylie [Minogue] and actually it’s only a couple of good hits.
“It’s incumbent on people to read widely and have a diversity of friends online as this impacts how information is targeted – so you can try confusing the algorithms if you’ve got those friends and networks.”
READ MORE: Theresa May grilled over government links to Cambridge Analytica
Awareness of these issues is growing and people are “becoming increasingly sceptical of what they’re reading”, Shephard adds, while others are “deleting their accounts” on social media.
Pedersen, who has studied the use of social media and blogging sites to discuss US and UK elections, says that her research indicates that people are already smarter in their use of these platforms than some might suggest.
“The research I’ve done shows that people do think about what information they are consuming and look to the left and right. I think people are aware of the agenda of organisations or media outlets,” she says.
“Sometimes we view social media users as sheep to be led, but they are aware of these issues. I’m not saying it’s not a concern, but we shouldn’t just assume readers of news don’t bring a thoughtful approach to it.”
While the alleged corruption of firms like Cambridge Analytica might hit the headlines, the everyday algorithms of social media may be influencing our politics in ways we have not yet fully comprehended, never mind worked out how to address.
Picture courtesy of Book Catalog
