Government Intervention Needed to Address Vaccination Misinformation

GW faculty argue in a new commentary that regulation is necessary to curb the spread of false information about vaccines on social media.

Misinformation about vaccines often mimics scientific language, reasoning and justifications. (Photo: National Institute of Allergy and Infectious Diseases)
September 05, 2019

By Kristen Mitchell

Government regulation may be needed to address the growing proliferation of misinformation about vaccines on social media platforms, according to two George Washington University researchers. Misinformation about vaccinations is a public health risk that contributes to an increase in unvaccinated children and undermines herd immunity, they said.

Y. Tony Yang, a professor in the GW School of Nursing and the Milken Institute School of Public Health, and David Broniatowski, an associate professor in the School of Engineering and Applied Science, along with a colleague from the University of California, Hastings College of the Law, recently published an article in JAMA Pediatrics detailing how governments can combat misinformation while operating within the existing legal framework that protects the Internet’s free market of ideas.

Read more about Dr. Yang and Dr. Broniatowski’s recommendations and what they believe can be achieved:

Q: Some people think content regulation should be left up to social media platforms. Why do you say governments should have a more active role in regulating misinformation around vaccines?
Dr. Broniatowski: There are serious concerns with having social media companies determine what content can or cannot be shared. Because there are many such platforms, disallowing content on one platform may simply drive it to another. This suggests the need for a consensus approach among platforms.

Social media companies also may not have the in-house expertise to determine whether specific content should or should not be shared. Beyond the potential legal and censorship issues that stem from deleting content in a manner that may seem arbitrary, removal can hurt those who seek to understand the rationales behind vaccine refusal. If public health officials never hear the concerns of the vaccine hesitant because those concerns are deleted before they can be expressed, how can we adequately respond to them?

Q: How do you suggest governments regulate misinformation while also respecting the First Amendment right to free speech?
Dr. Broniatowski: This depends both on the nature of the content and the technology of the platform. Content whose primary purpose is to generate ad revenue or sell a product is not simply an expression of free speech—it is a form of business transaction and therefore may be subject to the same kinds of regulations designed to prevent false advertising. Similarly, content that can be traced to state-sponsored propaganda is also not designed to inform.

Regarding the technology, some platforms, such as Facebook, primarily spread vaccine misinformation through groups and pages. These groups have administrators who are charged with ensuring that the platforms’ terms of service are respected. Without restricting speech, governments may be able to encourage platforms to uphold these terms, holding both the platforms and the group administrators accountable for failing to do so. Similarly, platforms such as Twitter are frequently prone to automated malicious content, which also violates platforms’ terms of service. This automation is often designed to game platforms’ content-promotion algorithms; however, platforms could make it much more difficult for such automation to be used.

Q: Can you explain what changes social media companies should make to their terms-of-service agreements to crack down on misinformation?
Dr. Broniatowski: In many cases, platform terms of service already ban some types of misinformation; however, the terms differ among platforms. The government could productively convene a series of meetings designed to help platforms generate a consensus set of terms for what sorts of online behaviors are acceptable. Some platforms may choose not to participate in this process; in so doing, however, they would be violating the industry standard and could potentially be held liable in a manner analogous to holding a non-board-certified physician liable for malpractice.

Q: Why do you urge public health professionals and physicians to take to social media themselves to combat vaccine misinformation?
Dr. Yang: While social media is unfortunately a source of much vaccine misinformation, it is also a platform that can be used to counter lies with truth. When qualified and respected health professionals use social media to disseminate accurate scientific information about vaccines, it can be a powerful tool. The medium is versatile, allowing people to pair engaging images and hashtags with factual data. Using social media to combat vaccine misinformation has the potential to reach a much broader audience, including the vaccine hesitant, who would not otherwise seek out scientific materials.

Q: How would health literacy training minimize the effect of misinformation?
Dr. Yang: Information about vaccines is complex, and people with low health literacy may find it difficult to comprehend and adequately evaluate the facts. Moreover, critical thinking and evaluation skills are required for people to seek out and locate information—and even more so to avoid the abundant misinformation online. People without these health literacy skills may be prone to falling victim to misinformation and may be more likely to exhibit vaccine hesitancy. Training in health literacy therefore gives individuals the skills they need to navigate a complex health system and discern the best choices for themselves and their families based on accurate information.

Q: Do you have any tips for helping individuals identify this type of content when they come across it online? What should they be looking for?
Dr. Yang: Vaccine misinformation comes in many different forms. Some preys on the fears people have about potential vaccine side effects. Some proposes supposedly safer, natural alternatives. One thing most effective misinformation has in common, though, is that it mimics scientific language, reasoning and justifications. That is one reason advanced health literacy is beneficial in helping discern truth from pseudoscience. If you are unsure of the scientific validity of a piece of dubious vaccine information, cross-check it with your physician or another trusted scientific source. Also, seek out content that reinforces the truth: vaccines are safe, reliable and one of the best tools we have to prevent disease.

 
