In 2012 Facebook tweaked the algorithm to manipulate the emotional content appearing in the news feeds of 689,003 randomly selected, unwitting users. Posts were identified as either ‘positive’ (awesome!) or ‘negative’ (bummer) based on the words used. In one group, Facebook reduced the positive content of news feeds, and in the other, it reduced the negative content. ‘We did this research because we care about the emotional impact of Facebook and the people that use our product,’ Kramer says. ‘We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.’ Did tinkering with the content change the emotional state of users? Yes, the authors discovered. The exposure led some users to change their own behaviours: the researchers found that people who had positive words removed from their feeds made fewer positive posts and more negative ones, and vice versa. It could have been an online version of monkey see, monkey do, or simply a matter of keeping up with the Joneses. ‘The results show emotional contagion’, Adam Kramer and his co-authors write in the academic paper.
Excerpt from: Who Can You Trust?: How Technology Brought Us Together – and Why It Could Drive Us Apart by Rachel Botsman