Recent studies have demonstrated that using Facebook can actually make us more biased in our views. According to academic Michela Del Vicario’s new paper, our social circles online are uniquely geared towards sharing the sort of content which reflects our own views back at us, with clusters of different groups around the social media network all sharing material that they agree with – with very little crossover between the clusters.
This phenomenon, also known as reader’s bias or confirmation bias (in which we unconsciously consume information that supports our existing thoughts and beliefs), is not “news” to Facebook. In fact, it’s practically hard-wired into Facebook’s DNA.
Yes, the psychology and social dynamics of the social media network appear to organically steer us towards narrow minds and biases, but the algorithm behind the platform’s News Feed unintentionally performs a very similar function, leading users to see almost exclusively content they already agree with.
Here’s how the News Feed algorithm plays its part in the phenomenon: the algorithm behind Facebook’s News Feed is a closely guarded secret, yet its core purpose is to keep users happy (so that they’ll keep coming back and seeing advertisements on the platform). That means showing users content that they like. To decide what a particular user will like, the algorithm mines the wealth of data linked to that individual, using thousands of factors to determine what will be shown most prominently.
Many of these key factors are based on users’ relationships with other users. How many times you write on a friend’s Timeline, like their posts, view their photos or chat with them on Facebook Messenger all figure into how prominently their posts, and the content they like, will appear in your feed. As we tend to be closest to those with similar views, the content we see is already skewed towards our own biases.
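The real ranking model is secret, but the idea described above can be sketched in a few lines: score each post by the viewer’s past interactions with its author, weighted by how strong a signal each interaction type is. The signal names and weights below are invented purely for illustration, not Facebook’s actual values.

```python
# Hypothetical sketch of relationship-weighted feed ranking.
# Weights are invented for illustration; Facebook's real model is secret.
WEIGHTS = {
    "timeline_posts": 2.0,   # times the viewer wrote on the author's Timeline
    "likes_given": 1.0,      # viewer's likes on the author's posts
    "photo_views": 0.5,      # author's photos the viewer opened
    "messages_sent": 3.0,    # Messenger chats between the pair
}

def affinity_score(interactions: dict) -> float:
    """Combine interaction counts into a single affinity score."""
    return sum(WEIGHTS[k] * interactions.get(k, 0) for k in WEIGHTS)

def rank_posts(posts: list) -> list:
    """Order posts so those from high-affinity authors appear first."""
    return sorted(posts, key=lambda p: affinity_score(p["interactions"]),
                  reverse=True)

feed = rank_posts([
    {"author": "close_friend",
     "interactions": {"messages_sent": 40, "likes_given": 25}},
    {"author": "acquaintance",
     "interactions": {"photo_views": 3}},
])
print([p["author"] for p in feed])  # close_friend ranks first
```

Because close friends rack up far more of these interactions, their posts, and whatever they engage with, dominate the top of the feed.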
There are also many other factors at play behind the scenes of the News Feed, which assess your actions in truly minute detail. For instance, liking an article after you’ve clicked on it is a more powerful positive signal than liking it without clicking, as the former suggests you actually read the content. Metrics like these feed the vast feedback loop that drives the News Feed, continually supplying users with content most similar to what they, or their closest (and therefore most similarly minded) friends, have liked in the past.
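The feedback loop itself can be illustrated with a toy model: each engagement signal nudges a topic’s score upward, with a like-after-click counting for more than a shallower interaction, so the topics a user already engages with come to dominate. Again, the signal names, weights and learning rate here are assumptions made for the sketch.

```python
# Hypothetical sketch of the News Feed feedback loop.
# Signal weights and learning rate are invented for illustration.
SIGNAL_WEIGHTS = {
    "like_after_click": 3.0,  # liked after reading: strong positive signal
    "like_no_click": 1.0,     # liked without opening: weaker signal
    "click_only": 0.5,        # opened but gave no other feedback
}

def update_topic_scores(scores: dict, topic: str, signal: str,
                        learning_rate: float = 0.1) -> dict:
    """Nudge a topic's score in the direction the signal suggests."""
    scores = dict(scores)  # copy so each update is explicit
    scores[topic] = scores.get(topic, 0.0) + learning_rate * SIGNAL_WEIGHTS[signal]
    return scores

scores = {}
# The user repeatedly reads and likes one kind of content...
for _ in range(5):
    scores = update_topic_scores(scores, "familiar_topic", "like_after_click")
# ...and barely touches another.
scores = update_topic_scores(scores, "unfamiliar_topic", "click_only")

# The feed now strongly favours the familiar topic.
print(sorted(scores, key=scores.get, reverse=True))
```

Run for long enough, this loop surfaces ever more of what the user already agrees with, which is exactly the clustering effect Del Vicario’s paper describes.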
But Facebook’s goals are much grander than simply spoon-feeding us what we like. The social media network now hopes to give us the content we don’t even know we want yet: “If you could rate everything that happened on Earth today that was published anywhere by any of your friends, any of your family, any news source…and then pick the 10 that were the most meaningful to know today, that would be a really cool service for us to build. […] That is really what we aspire to have News Feed become.”