Facebook makes us more narrow-minded - Study

07 January 2016
Research finds that users seek out information that reinforces their beliefs, which is then shared and given increasing weight, whether accurate or not.
Facebook reinforces the beliefs of users because they tend to seek out news and views that tally with their own opinions, according to a new study.

The social networking site creates an "echo chamber" in which a network of like-minded people share controversial theories, biased views and selective news, academics found.

This means that any bias users hold is simply repeated back to them, unchallenged, and accepted as fact.

The research, published in the Proceedings of the National Academy of Sciences, analysed Facebook data about the topics people discussed on the social network between 2010 and 2014.

It concluded: "Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation and polarisation.

"This comes at the expense of the quality of information and leads to proliferation of biased narratives fomented by unsubstantiated rumours, mistrust, and paranoia."

The researchers, from several Italian institutions and Boston University in the US, found that once a piece of information was accepted as fact, it spread rapidly throughout that particular online "community of interest," despite having no proven basis in science.

The paper noted that scientific information could often be traced, whereas the origins of conspiracy theories, such as the controversial belief that vaccines cause autism, were difficult to identify.

"Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest," the paper said.

"Whether a news item, either substantiated or not, is accepted as true by a user may be strongly affected by... how much it coheres with the user`s system of beliefs."

The researchers said the way in which common ideas were shared and reinforced may explain how certain phenomena, such as the rejection of global warming evidence, become widespread.

"This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumours, mistrust, and paranoia," they added.

The scientists warned that the problem of unreliable information going "viral" online had become so serious that it was classed as one of the biggest social threats by the World Economic Forum.

"Massive digital misinformation is becoming pervasive in online social media to the extent that it has been listed by the World Economic Forum as one of the main threats to our society," they said.
