A pair of studies published Thursday in the journal Science offer evidence not only that misinformation on social media changes minds, but also that a small group of committed “supersharers,” predominantly older Republican women, were responsible for the vast majority of “fake news” in the period analyzed.
The studies, by researchers at MIT, Ben-Gurion University, Cambridge and Northeastern, were conducted independently but complement each other well.
In the MIT study, led by Jennifer Allen, the researchers note that misinformation has often been blamed for vaccine hesitancy in 2020 and beyond, but that the phenomenon remains poorly documented. That is understandable: not only is social media data immense and complex, but the companies involved are reluctant to participate in studies that could cast them as the main vector of disinformation and other ills. Few doubt that they are, but suspicion is not the same as scientific verification.
The study first shows that exposure to misinformation about vaccines (in 2021 and 2022, when the researchers collected their data), particularly anything that claims a negative health effect, actually reduces people’s intention to get vaccinated. (And intention, previous studies show, correlates with actual vaccination.)
Second, the study showed that articles flagged by moderators at the time as misinformation had a greater per-piece effect on vaccine hesitancy than unflagged content, which validates the flagging. The catch is that the volume of unflagged misinformation was vastly greater than the flagged material. So while the unflagged content had a smaller effect per piece, its overall influence was probably much greater in the aggregate.
This type of misinformation, they clarified, was more likely to be mainstream media publishing misleading coverage that mischaracterized risks or studies. For example, recall the Chicago Tribune headline “Healthy doctor died two weeks after receiving COVID vaccine; the CDC is investigating why.” As the journal’s commentators point out, there was no evidence that the vaccine had anything to do with his death. Yet despite being highly misleading, the story was not flagged as misinformation, and the headline was ultimately viewed about 55 million times — seen by six times as many people as viewed all the flagged material combined.
“This conflicts with conventional wisdom that fake news on Facebook was responsible for low vaccine uptake in the United States,” Allen told TechCrunch. “It could be the case that Facebook use is correlated with lower vaccine acceptance (as other research has found), but it could be that it’s this ‘gray area’ content that’s driving the effect, not the outlandishly fake stuff.”
The finding, then, is that while suppressing blatantly false information is useful and justified, it amounted to a small drop in the bucket of the toxic farrago in which social media users were swimming.
And who were the swimmers who spread this misinformation the most? It’s a natural question, but beyond the scope of Allen’s study.
In the second study published Thursday, a multi-university group reached the surprising conclusion that 2,107 registered American voters accounted for the spread of 80% of “fake news” (their term) during the 2020 election.
It’s a broad claim, but the study supports it pretty convincingly. The researchers examined the activity of 664,391 voters matched with active Twitter users.
These 2,107 users exerted (with algorithmic help) an enormous network effect by promoting and sharing links to politically flavored fake news. The data shows that 1 in 20 American voters followed one of these supersharers, putting them massively ahead of the average user’s reach. On any given day, about 7% of all political news shared linked to misleading news sites, but 80% of those links came from these few people. People were also much more likely to engage with their posts.
However, these were not state-sponsored plants or bot farms. “The massive volume of Supersharers did not appear automated, but rather was generated through manual and persistent retweets,” the researchers wrote. (Co-author Nir Grinberg clarified to me that “we can’t be 100% sure that supersharers are not puppets, but using state-of-the-art bot detection tools, analyzing temporal patterns and application usage, they do not appear automated.”)
They compared supersharers to two other groups of users: a random sample and those who share the most non-fake political news. They found that these fake news peddlers tend to fit into a particular demographic: older, female, white and overwhelmingly Republican.
The supersharers were about 60% female compared to the panel’s even split, and significantly, though not dramatically, more likely to be white than the comparison group, which was already mostly white overall. But they were much older (58 on average versus 41 overall), and about 65% were Republicans, compared to about 28% in the Twitter population at the time.
The demographic data is certainly revealing, though it is worth remembering that even a large and statistically significant majority is not the whole story. Millions of people, not 2,107, retweeted that Chicago Tribune article. And even the supersharers, the Science commentary article notes, “are diverse, including political pundits, media personalities, contrarians and anti-vaxxers with personal, financial and political motives for spreading untrustworthy content.” It’s not just older women in red states, although they figure prominently.
As Baribi-Bartov et al. grimly conclude: “These findings highlight a vulnerability of social media for democracy, where a small group of people distorts political reality for many.”
One is reminded of the saying famously attributed to Margaret Mead: “Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.” Somehow I doubt this is what she had in mind.