By Melissa Chang
In the wake of the recent US election, the dust has (somewhat) settled, and the pointing of fingers has subsided (well, almost). However, one figure that has emerged blameworthy in the last few weeks is Facebook, which has been accused of perpetuating and spreading ‘fake news’ that influenced voters, presumably to the benefit of the current President-Elect. Facebook has been around for over a decade now, and for many of us it is hard to imagine our lives without it and the countless other social media networks that have emerged in its wake (e.g. Instagram, Twitter, etc.).
Facebook has kept a steady hold on its 1.79 billion users (Statista) by constantly tweaking the algorithms that select the content users see on their newsfeeds. Swiping through a personalised Facebook feed is presumably more enjoyable. This technology, known as content curation, has attracted some controversy. To be fair, Facebook should not be the scapegoat for curated content: a 2012 survey reported that curation is now the norm, with only 5% of organizations surveyed not sharing content from other outlets (Content Curation Adoption Survey 2012 Report). From online stores to music streaming services, a majority of internet services now use curation in some form. However, Facebook remains the most popular social media network in the world, and our familiarity with the medium enables us to better assess the effects of its curated content.
It used to be the case that we saw content chronologically, with most recent content posted by those we follow on top and older content toward the bottom. Yet, in the last few years, that hasn’t quite been the case. These days, you’re more likely to see links to videos and articles that Facebook deems pertinent to your interests, based on other content you’ve ‘liked’, even if they do not come from users you follow. It’s certainly clever, intuitive, and seemingly harmless. However, we should be more eagle-eyed in light of these developments, and assess if it really is as benign as it appears – particularly in the arena of political news. In this article, I will discuss the ethical implications of content curation with a particular emphasis on political content in Facebook, and offer suggestions as to how these problems may be resolved.
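To make the mechanism concrete, here is a minimal Python sketch of like-based feed ranking. Everything in it – the scoring rule, the data shapes, the topic labels – is a hypothetical illustration of the general technique, not Facebook's actual algorithm:

```python
from collections import Counter

def rank_feed(posts, liked_topics):
    """Toy feed ranker: score each post by how often the user has
    'liked' its topics before, using recency only as a tie-breaker.
    Purely illustrative -- not Facebook's real ranking system."""
    like_counts = Counter(liked_topics)

    def score(post):
        # Relevance = total prior 'likes' across this post's topics.
        relevance = sum(like_counts[t] for t in post["topics"])
        return (relevance, post["timestamp"])

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": ["cats"], "timestamp": 10},
    {"id": 2, "topics": ["politics"], "timestamp": 20},
    {"id": 3, "topics": ["cats", "politics"], "timestamp": 5},
]
liked = ["cats", "cats", "politics"]
ranked = rank_feed(posts, liked)
# The oldest post (id 3) rises to the top because it matches the
# most liked topics -- relevance outweighs chronology.
```

Note how a few extra ‘likes’ on one topic quickly crowd related content to the top, regardless of when it was posted – exactly the shift away from the chronological feed described above.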
It’s one thing for a constant ‘liking’ of cat videos to result in more cat videos on a feed, but having content curated for us can reinforce biases we already hold. We tend to ‘like’ content creators or producers who share our views; several studies have shown that subjects tended to choose news articles from outlets that aligned with their political opinions (Flaxman, Goel & Rao). Such sources cannot be trusted to disabuse us of opinions we hold wrongly.
For example, suppose I liked a certain content producer because of their ethos to ‘tell it like it is’, and I agreed with their views on a certain politician. However, this content producer is associated with other producers who disseminate articles with strong anti-environmental leanings. Even if I have nothing for or against those views, the strong messages popping up on my feed could nudge me toward a stronger attitude. Over time, I might even adopt these pernicious views about the environment, with no intention of doing so. Curation technologies and media do not accommodate subtlety in differing messages – liking one website that publishes controversial content might buy you content from overwhelmingly extreme sources, and influence you in the process.
In this sense, curation technology is not inherently bad. If used to bring more joy to the internet in the form of cute kitten videos, it is perfectly benign. When faced with graver subject matter, however, its lack of subtlety can have harmful effects.
There is something more worrying at work — the curation of content has allowed fake news to spread to a much wider audience. By spreading content tailored to your interests, you may chance upon content you would have never found on your own. But many of these content sources are not completely trustworthy. Furthermore, the line between parody news, news, and straight-up ‘fake news’ is hard to differentiate.
Fake news spreads information that is verifiably false and sows distrust in actual news media. In this past election cycle, for instance, fake news diluted and eroded fact to the point that fact itself was in contention. Take the following headlines: “Pope Francis Shocks World, Endorses Donald Trump for President” and “FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide”. ‘Clickbait’ tactics make sharing and engaging with these stories hard to resist.
Many people entertain verifiably false ideas as a result of consuming fake news. A poll found that 75% of American adults who were familiar with a fake news headline believed that story to be accurate (Silverman & Singer-Vine). This is troubling, since a misinformed voter is quite possibly worse than an uninformed one. Some facts even began to hold less weight: many of Trump’s supporters knew facts about their candidate that should have disqualified him, but those facts simply did not matter to them.
If fake news is a side-effect of content curation, one could certainly make the case that the technology is ethically questionable. However, it is not explicitly clear that we cannot have content curation without encouraging fake news. My Spotify account often uses content curation to find music I enjoy, without exposing me to fake news. If content curation and fake news are not inextricably linked, it is not certain that the technology is ethically questionable. And therefore, this argument is not particularly strong.
Finally, curating content reduces individual autonomy, delegating our content consumption to a nameless and faceless algorithm that decides what we read and absorb. This is sinister because a process over which we once had control is now being delegated to an entity we do not know, let alone trust. This makes us lazy and perhaps less inclined to discover the truth for ourselves; it allows us to be satisfied with being spoon-fed content.
Without content curation, we would be forced to find our own news sources and depend on our own sense of accuracy and correctness to decide what we take to be factual. One could argue that there is something of worth in this process, much like how music was spread by radio and mixtapes before the dawn of digital music. Albert Borgmann distinguished ‘things’, which require a certain level of engagement and skill to operate, build character, and perform a socializing function, from ‘devices’, which merely deliver a commodity without demanding any engagement. In this sense, radio and mixtapes would be ‘things’, while streaming services like Spotify would be ‘devices’. In embracing ‘devices’ over ‘things’ by means of content curation, we may come to miss the sense of responsibility cultivated when we find our own sources.
In sum, curation technology in itself does not seem ethically wrong, but it does seem to have some effects on individual autonomy, which could cause the technology to be considered ethically questionable.
For a website as frequently visited as Facebook, there should be more stringent guidelines on content curation. The site has immense influence, and this comes with great responsibility. One suggestion for combating the negative side-effects of content curation would be to diversify the sources broadcast to the user. Even if a user likes content written from a certain point of view, they should also be exposed to content with an opposing view. The user can then make an informed choice as to what to believe.
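The diversification suggestion could be implemented as a simple post-processing pass over an already-ranked feed. The sketch below is a hypothetical illustration – the stance labels and the interleaving ratio are assumptions made for the example, not any platform's real feature:

```python
from collections import Counter

def diversify(feed, get_stance, ratio=3):
    """Toy diversification pass: after every `ratio` posts sharing the
    user's dominant stance, interleave one post from another stance.
    Hypothetical sketch of the article's suggestion, not a real API."""
    dominant = Counter(get_stance(p) for p in feed).most_common(1)[0][0]
    aligned = [p for p in feed if get_stance(p) == dominant]
    opposing = [p for p in feed if get_stance(p) != dominant]
    mixed = []
    while aligned:
        # Take a batch of same-stance posts...
        mixed.extend(aligned[:ratio])
        aligned = aligned[ratio:]
        # ...then surface one opposing-stance post, if any remain.
        if opposing:
            mixed.append(opposing.pop(0))
    mixed.extend(opposing)
    return mixed

feed = [
    {"id": 1, "stance": "left"},
    {"id": 2, "stance": "left"},
    {"id": 3, "stance": "right"},
    {"id": 4, "stance": "left"},
    {"id": 5, "stance": "left"},
]
mixed_feed = diversify(feed, lambda p: p["stance"])
# The lone opposing post (id 3) is promoted into the first batch
# instead of being buried behind all the aligned content.
```

The design choice here is deliberate: rather than removing aligned content, the pass only guarantees that opposing views surface periodically, leaving the user free to make an informed choice.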
Another suggestion would be to implement a ‘fact checker’. A non-partisan tool such as PolitiFact (http://www.politifact.com) could be of assistance in this area. One could argue that this is not Facebook’s job – Facebook is not a news website, nor has it ever claimed to be. Yet the reality is that many people use Facebook as a real-time news source, and Facebook should shoulder some responsibility if it benefits from such a large audience.
Ultimately, one wonders how much any of these suggestions will really help. Many people are stubborn and will believe what they want to believe, discarding what does not fit into their chosen narrative, even if there is abundant evidence to the contrary. Curating political content certainly does not help to bridge increasingly polarized mindsets. This is possibly the best criticism of the technology in itself – when facts themselves are in contention, there is no even starting ground to discuss the way forward.
Image Credits: storypick, AFP, The Mary Sue