The ‘bot holiday’ and why clinicians can’t tackle disinformation alone

The war in Ukraine has had an unexpected side effect noted by many health care workers on Twitter: “a bot holiday” — the sudden decline of social media accounts espousing anti-vaccine views and attacking health care workers and scientists trying to correct their disinformation.

While disinformation campaigns are not new, they were once spread through rather primitive vehicles, from door-to-door snake oil peddlers to mass-printed propaganda leaflets. The advent of the internet and social media has transformed the battlefield for disinformation. The army of bots deployed on social media enables false information to be deliberately and widely spread around the world at the speed of light.

While there is global consensus from the World Health Organization, the US Surgeon General, and others that disinformation is a problem, how to best solve it remains uncertain.


While individual clinicians are taking to social media to tackle disinformation, their efforts would do more to counter destructive disinformation campaigns if they were part of a substantive, multi-pronged approach. We recently wrote a perspective piece in The New England Journal of Medicine on how to support clinicians who use social media to tackle misinformation. It highlights three major strategies:

  • eliminating disinformation at its source
  • supporting clinicians who are being attacked and harassed for countering misinformation
  • establishing a broad training strategy to help all clinicians tackle misinformation

While eliminating disinformation at its source is the hardest to do, it would have the greatest impact, as evidenced by the bot holiday. We appreciate efforts by the Surgeon General to hold companies accountable for spreading disinformation.


The lack of action against social media companies is causing measurable harm. An estimated 12 million people in the US may be unvaccinated against Covid-19 because of misinformation and disinformation, leading to more than 1,200 excess hospitalizations and 300 excess deaths each day since vaccines became widely available.

Critics of attempts to regulate social media companies point to First Amendment protections of freedom of speech, defined as the right to express any opinion in public without censorship by the government. While free speech is a fundamental right, there are, in fact, limits to what’s acceptable under the law. Businesses, for example, are not legally able to use misleading or deceptive advertising to promote health products. This is critical to keep in mind, since spreading disinformation can be a profitable business opportunity. The popular newsletter platform Substack, which employs a paid subscription model, generates more than $2.5 million per year for the authors of two newsletters that spread anti-vaccine disinformation.

It’s no coincidence that celebrities and others are cashing in on spreading disinformation. Joe Rogan uses his platform on Spotify to amplify anti-vaccine rhetoric that appeals to his audience. Gwyneth Paltrow has built an entire wellness and lifestyle brand on unproven, sometimes dangerous products. Others use this method to build their brand as an entry point into the political arena.

These examples underscore an important ethical tension: Why should people be allowed to profit by sharing false medical information that is harmful?

To avoid limiting free speech, many scholars say that the best way to neutralize disinformation is for individuals to counter it with truthful information in the so-called marketplace of ideas. The marketplace of ideas sounds attractive, as if it were a place where the best and most accurate ideas will prevail. But that’s not the case. Not only does disinformation spread faster on social media, but its spread is often intentionally seeded through astroturfing and other techniques designed to amplify it. To make matters worse, once disinformation takes hold, it’s notoriously difficult to counter.

Another concern about trying to regulate disinformation is who gets to decide what counts as disinformation. Science evolves, something we have seen firsthand with respect to wearing face masks. In fact, medical reversal, in which evidence-based guidance is later changed, is not uncommon.

One way to identify and halt the dissemination of disinformation is to look for patterns of willful spread of false information that contradicts broad, evidence-based consensus from medical and scientific bodies. For example, some in the anti-vax movement have alleged for years that routine childhood vaccines for vaccine-preventable diseases such as measles are harmful. They have been successful in some parts of the country, where vaccination rates are now not only suboptimal but measles has resurged.

When health care professionals are the ones spreading disinformation, additional actions can be taken. Some state medical boards, for example, have suspended practitioners’ licenses for spreading disinformation. Unfortunately, some politicians have seized the moment to handcuff state medical boards, limiting their ability to self-regulate in this area. This is why specialty boards and societies, which exist outside of state jurisdictions, can be a critical lever. Communities of peers are already used to advise on cases related to disciplinary action or malpractice, and they could be repurposed to assess whether an unambiguous pattern of spreading disinformation has been established.

The existence of social media demands a fresh approach to addressing disinformation in medicine and health care. While individual clinicians can and will continue to do their part by talking with their patients and their communities and by posting on social media, we won’t see the results we need until a more comprehensive, multi-pronged approach to tackling disinformation is implemented. What’s needed is a permanent “bot holiday” so health care professionals can do the work they are trained to do: guide decisions based in science and, ultimately, save lives.

Shikha Jain is a medical oncologist, assistant professor of medicine in the Division of Hematology and Oncology at the University of Illinois in Chicago, and associate director of oncology communication and digital innovation for the University of Illinois Cancer Center. Vineet Arora is a hospitalist, dean of medical education, and professor of medicine at the University of Chicago Pritzker School of Medicine. Eve Bloomgarden is an endocrinologist at NorthShore University Health Systems and director of thyroid care and of endocrine innovation and education for the Division of Endocrinology at NorthShore University. All three are co-founders of IMPACT, a coalition of physicians and health professionals working to identify and meet the needs of health care workers and communities in Illinois.
