According to a study, most social media users share content without reading it
Congratulations! Opening the link and reading this far is something only a minority of readers do today, especially when the content in question has been shared on Facebook, according to research conducted in the USA.
After analyzing more than 35 million public posts with widely shared links on the social media platform between 2017 and 2020, researchers found that about 75% of the shares were made without users first clicking on the link.
Political content is the most shared
Of the total analyzed, political content from both ends of the spectrum (left and right) was shared without clicking more frequently than politically neutral content.
The findings suggest that social media users tend to only read news headlines and summaries rather than fully engage with the main content.
The results, published in the journal Nature Human Behaviour, indicate that a similar phenomenon may occur on other social media platforms, and help explain why “fake news” spreads so easily.
“Scary” discovery
“It was a real surprise to find that more than 75 percent of the time, links shared on Facebook were shared without the user clicking them first,” said corresponding author S. Shyam Sundar, a professor at Penn State University in the United States.
“I thought that if someone shared something, they would have read it and thought about it, that they would support or even defend the content. You might expect that some people would occasionally share content without thinking, but for the majority of shares to happen this way? It was a surprising and very scary discovery.”
How the data was obtained
Access to the Facebook data was granted through Social Science One, a research consortium hosted by Harvard’s Institute for Quantitative Social Science, focused on obtaining and sharing social and behavioral data responsibly and ethically.
The data was provided in partnership with Meta, Facebook’s parent company, and included users’ demographic and behavioral information, including a “political affinity score” that outside researchers determined from the pages users followed, such as media outlets and political figures.
How political affinity was determined
The researchers used the political affinity score to classify users into one of five groups: very liberal, liberal, neutral, conservative and very conservative.
To determine the political content of shared links, the researchers used machine learning, a form of artificial intelligence, to identify and classify political terms in the links’ content. They then rated each link on the same political affinity scale, from very liberal to very conservative, based on how many times each affinity group shared it.
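To make that share-based scoring concrete, here is a minimal sketch, assuming a five-point numeric scale and group labels that are illustrative stand-ins rather than the study’s actual variables: a link’s affinity is the share-weighted average of the scores of the groups that shared it.

```python
# Illustrative sketch only, not the study's code: score a link's political
# affinity as the share-weighted average of the affinity groups that shared it.
# The -2..+2 scale and the group labels are assumptions for this example.
GROUP_SCORES = {
    "very_liberal": -2,
    "liberal": -1,
    "neutral": 0,
    "conservative": 1,
    "very_conservative": 2,
}

def link_affinity(shares_by_group: dict[str, int]) -> float:
    """Share-weighted mean affinity; negative leans liberal, positive conservative."""
    total = sum(shares_by_group.values())
    if total == 0:
        raise ValueError("link has no shares")
    return sum(GROUP_SCORES[g] * n for g, n in shares_by_group.items()) / total

# A link shared mostly by conservative groups scores toward +2.
print(link_affinity({"very_liberal": 50, "liberal": 100, "neutral": 200,
                     "conservative": 900, "very_conservative": 750}))  # 1.1
```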
This new political content affinity variable was built from the 35 million Facebook posts spanning four years of US election seasons. That window is considered long enough to capture news-sharing patterns on social media, according to study co-author Eugene Cho Snyder, an assistant professor at the New Jersey Institute of Technology.
Political bias in the press
The team validated the political affinity of news domains, such as CNN or Fox, against the media bias chart produced by AllSides, an independent company committed to helping people understand bias in news content, and against a classification system developed by researchers at Northeastern University.
Using these classification systems, the team manually classified 8,000 links, first identifying them as political or non-political content. The researchers then used this dataset to train an algorithm that evaluated 35 million links shared more than 100 times on Facebook by users in the United States.
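As a rough illustration of that label-then-scale workflow, the sketch below trains a small text classifier on hand-labeled examples and then applies it to unseen link text. The library (scikit-learn), the TF-IDF features, and the toy training data are all assumptions; the article does not describe the study’s actual pipeline.

```python
# Hedged sketch of the two-step approach described above: train a classifier
# on a small hand-labeled set, then apply it at scale. Features and model
# choice here are assumptions, not the study's actual implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny stand-ins for the ~8,000 manually classified links.
train_texts = ["senate passes budget bill", "ten easy weeknight dinners"]
train_labels = [1, 0]  # 1 = political, 0 = non-political

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_texts, train_labels)

# The trained model can then score millions of widely shared links.
print(clf.predict(["senate budget bill stalls"]))  # -> [1]
```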
Content in line with the political profile
One pattern identified and confirmed is that the closer the content was to the user’s political alignment, the more it was shared without clicking. “They are simply broadcasting things that appear, at first glance, to agree with their political ideology, without realizing that they may sometimes be sharing false information,” Snyder noted.
The findings support the theory that many users browse news based on headlines and summaries alone, Sundar said, explaining that Meta also provided data from its third-party fact-checking service, which identified 2,969 shared URLs linked to false content.
More than 41 million shares!
Researchers found that these links were shared more than 41 million times without being clicked! Of these, 76.94% came from conservative users and 14.25% from liberal users. The researchers explained that the vast majority – up to 82% – of links to false information in the dataset came from conservative news domains.
How to stop this behavior?
To reduce sharing without clicking, Sundar said social media platforms could introduce “frictions” to slow sharing, such as requiring people to acknowledge that they have read the content in full before sharing it.
“Shallow processing of headlines and summaries can be dangerous if false information is shared and never investigated,” Sundar said, explaining that social media users may believe content has already been fact-checked by the people in their network who share it, but this work shows that is unlikely.
“If platforms implemented a warning that content might be fake and made users acknowledge the danger when sharing it, it could help people think before sharing it.”
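As a sketch of what such a friction could look like, the toy logic below blocks a share until the user has opened the link, and requires an explicit acknowledgment when the URL has been flagged. Every name and data store here is hypothetical; no real platform API is implied.

```python
# Hypothetical friction logic: require a click before sharing, and an explicit
# acknowledgment for flagged links. In-memory stores stand in for real systems.
FLAGGED_URLS = {"example.com/false-story"}        # e.g. fed by fact-checkers
opened_links: set[tuple[str, str]] = set()        # (user_id, url) pairs

def record_click(user_id: str, url: str) -> None:
    opened_links.add((user_id, url))

def try_share(user_id: str, url: str, acknowledged: bool = False) -> str:
    if (user_id, url) not in opened_links:
        return "blocked: open and read the link before sharing"
    if url in FLAGGED_URLS and not acknowledged:
        return "blocked: this link was flagged; confirm before sharing"
    return "shared"

print(try_share("u1", "example.com/false-story"))        # not opened yet
record_click("u1", "example.com/false-story")
print(try_share("u1", "example.com/false-story"))        # flagged, no ack
print(try_share("u1", "example.com/false-story", True))  # shared
```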
Disinformation campaigns
That wouldn’t stop intentional misinformation campaigns, Sundar said, and individuals still have a responsibility to fact-check the content they share.
“Disinformation or false information campaigns aim to sow doubt or dissent in a democracy – the extent of these efforts came to light in the 2016 and 2020 elections,” Sundar said.
“If people share without clicking, they may be unknowingly contributing to these campaigns, fueling organized efforts by hostile adversaries to sow division and distrust.”
Why share without clicking?
“The reason this happens might be because people are just bombarded with information and don’t stop to think about it,” Sundar said.
“In such an environment, misinformation is more likely to go viral. We hope that people will learn from our study and become more media literate, digitally savvy, and ultimately more aware of what they are sharing.”
Source: Terra
