Social media sites continue to amplify misinformation and conspiracy theories. To address this concern, an interdisciplinary team of computer scientists, physicists and social scientists led by the University of South Florida (USF) has devised a way to ensure social media users are exposed to more reliable news sources.
In their study published in the journal , the researchers focused on the recommendation algorithms that social media platforms use to prioritize the content displayed to users. Rather than measuring engagement by the number of users and pageviews, the researchers looked at what content gets amplified on a newsfeed, focusing on a news source's reliability score and the political diversity of its audience.
"Low-quality content is engaging because it conforms to what we already know and like, regardless of whether it is accurate or not," said Giovanni Luca Ciampaglia, assistant professor of computer science and engineering at USF. "As a result, misinformation and conspiracy theories often go viral within like-minded audiences. The algorithm ends up picking the wrong signal and keeps promoting it further. To break this cycle, one should look for content that is engaging, but for a diverse audience, not for a like-minded one."
In collaboration with researchers at Indiana University and Dartmouth College, the team created a new algorithm using data on the web traffic and self-reported partisanship of 6,890 individuals who reflect the diversity of the United States in sex, race and political affiliation. The data was provided by the online polling company YouGov. The team also reviewed the reliability scores of 3,765 news sources based on the NewsGuard Reliability Index, which rates news sources on several journalistic criteria, such as editorial responsibility, accountability and financial transparency.
They found that incorporating the partisan diversity of a news audience can increase the reliability of recommended sources while still providing users with relevant recommendations. Since the algorithm isn't based exclusively on engagement or popularity, it is still able to promote reliable sources, regardless of their partisanship.
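The idea can be illustrated with a small sketch. The code below is hypothetical, not the researchers' actual algorithm: it scores a source by weighting a reliability score (as NewsGuard-style ratings provide) by the Shannon entropy of its audience's partisan mix, so that a source engaging only a like-minded audience ranks below an equally reliable source with broad appeal. The function names, label scheme, and the multiplicative combination are all illustrative assumptions.

```python
import math
from collections import Counter

def partisan_diversity(audience_leanings):
    """Shannon entropy of the audience's partisan mix
    (0 = entirely like-minded; higher = more politically diverse).
    `audience_leanings` is a list of self-reported labels,
    e.g. 'left', 'center', 'right'."""
    counts = Counter(audience_leanings)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def rank_score(reliability, audience_leanings):
    """Hypothetical ranking signal: weight a source's reliability
    score by its audience's partisan diversity, rather than ranking
    on raw engagement alone."""
    return reliability * partisan_diversity(audience_leanings)

# Two sources with identical reliability: one reaches only a
# like-minded audience, the other a politically mixed one.
echo_chamber = rank_score(0.8, ["left"] * 10)
broad_appeal = rank_score(0.8, ["left"] * 4 + ["center"] * 3 + ["right"] * 3)
```

Under this sketch, the like-minded audience yields zero entropy, so the source's score collapses to zero no matter how engaging it is, while the mixed-audience source keeps a positive score.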
"This is welcome news for social media platforms, especially since they have been reluctant to introduce changes to their algorithms for fear of criticism about partisan bias," said co-author Filippo Menczer, distinguished Luddy professor of informatics and computer science at Indiana University.
The researchers say that platforms could easily incorporate audience diversity into their own recommendation algorithms, because diversity measures can be derived from engagement data, and platforms already log this type of data whenever users click "like" or share something on a newsfeed. Ciampaglia and his colleagues propose that social media platforms adopt this strategy to help curb the spread of misinformation.
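To show that the diversity signal really can be derived from data platforms already collect, here is a minimal sketch assuming a hypothetical engagement log of (user, source, action) events and a per-user partisanship label (in the study, partisanship came from YouGov survey data; a platform might infer it instead). All names and the log format are illustrative assumptions.

```python
import math
from collections import defaultdict

# Hypothetical engagement log: platforms already record events like
# these whenever a user clicks "like" or shares a post.
engagements = [
    ("u1", "siteA", "like"), ("u2", "siteA", "share"),
    ("u3", "siteA", "like"), ("u1", "siteB", "like"),
    ("u2", "siteB", "like"),
]

# Per-user partisanship labels (self-reported in the study's data).
partisanship = {"u1": "left", "u2": "right", "u3": "center"}

def audience_diversity(source):
    """Entropy of the partisan mix of users who engaged with `source`,
    computed purely from the engagement log above."""
    counts = defaultdict(int)
    for user, src, _action in engagements:
        if src == source:
            counts[partisanship[user]] += 1
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

In this toy log, siteA's engagers span left, center and right, while siteB's span only left and right, so siteA comes out as the more diverse, and hence more promotable, source.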