Many articles and blog posts arguing for the vaccine-autism link have the trappings of genuine academic research: tables, graphs, citations, and scientific jargon. Some of the authors have credentials such as M.D. or Ph.D. degrees. None of these things guarantees scientific value, as the history of science is full of crackpot theories (e.g., AIDS denialism) that are the heavily footnoted products of people with letters after their names. But most people will not be able to spot the scientific weaknesses of such work. Outside of academia, few understand concepts such as peer review. Jordynn Jack describes one dubious article that appeared in a non-peer-reviewed publication: “Regardless of the scientific validity of the article, though, the writers perform the writing style quite effectively. It would be difficult for the layperson to distinguish this article from any other scientific research paper, especially if one did not investigate the nature of the journal … or of the scientific response to the article.”
Perceived experts (i.e., medical professionals and biomedical scientists) are trusted sources of medical information who are especially effective at encouraging vaccine uptake. However, the role of perceived experts as potential antivaccine influencers has not been characterized systematically. We describe the prevalence and importance of antivaccine perceived experts by constructing a coengagement network of 7,720 accounts based on a Twitter data set containing over 4.2 million posts from April 2021. The coengagement network broke primarily into two large communities that differed in their stance toward COVID-19 vaccines, and misinformation was predominantly shared by the antivaccine community. Perceived experts had a sizable presence across the coengagement network, including within the antivaccine community, where they made up 9.8% of individual, English-language users. Perceived experts within the antivaccine community shared low-quality (misinformation) sources at similar rates, and academic sources at higher rates, compared to perceived nonexperts in that community. Perceived experts also occupied important network positions, both as central antivaccine users and as bridges between the antivaccine and provaccine communities. Using propensity score matching, we found that perceived expertise conferred an influence boost: perceived experts were significantly more likely to receive likes and retweets in both the antivaccine and provaccine communities, and the magnitude of this boost did not differ significantly between the two communities. Social media platforms, science communicators, and biomedical organizations may need to focus on systemic interventions to reduce the impact of perceived experts in spreading antivaccine misinformation.
From the article:
Information consumers often use markers of credibility to assess different sources (25, 26). Specifically, prestige bias describes a heuristic whereby one preferentially learns from individuals who present signals associated with higher status (e.g., educational and professional credentials) (27–29). Importantly, prestige-biased learning relies on signifiers of expertise that may or may not be accurate or correspond with actual competence in a given domain (25, 30). Therefore, we use the term perceived experts to denote individuals whose profiles contain signals that have been shown experimentally to increase the likelihood that an individual is viewed as an expert on COVID-19 vaccines (31), although credentials may be misrepresented or misunderstood (user profiles may be deceptive or ambiguous, audiences may not understand the domain specificity of expertise, and platform design may impair assessments of expertise if partial profile information is displayed alongside posts). We focus on the understudied role of perceived experts as potential antivaccine influencers who accrue influence through prestige bias (4, 13). Medical professionals, biomedical scientists, and organizations are trusted sources of medical information who may be especially effective at persuading people to get vaccinated and at correcting misconceptions about disease and vaccines (29, 32–35), suggesting that prestige bias may apply to vaccination decisions, including for COVID-19 vaccines (36, 37).
In addition to signaling expertise in their profiles, perceived experts may behave like biomedical experts, making scientific arguments and sharing scientific links, while also propagating misinformation by sharing unreliable sources. Antivaccine films frequently utilize medical imagery and emphasize the scientific authority of the perceived experts who appear in them (42, 43, 48). Although vaccine opponents reject scientific consensus, many still value the brand of science and engage with peer-reviewed literature (49). Scientific articles are routinely shared by Twitter users who oppose vaccines and other public health measures (e.g., masks), but sources may be presented in a selective or misleading manner (40, 49–53). At the same time, misinformation claims from sources that often fail fact checks (i.e., low-quality sources) are pervasive within antivaccine communities, where they may exacerbate vaccine hesitancy (6, 18, 54, 55).