
The Critica Protocol

February 21, 2023


How We Propose to Counteract Scientific Misinformation on the Internet

An increasing number of people obtain information about health and science online. At the same time, the mass dissemination of misinformation throughout the Internet is well documented and widely discussed. This means that counteracting misinformation about health and science requires approaches that work on social media and other Internet platforms. Arguing scientific issues in the traditional media or at conferences is useful but clearly insufficient.

A main goal for Critica is to develop and test methodology to address scientific misinformation in real time, directly on social media platforms. To do this, we have developed a protocol that we plan to implement and test in the near future. The protocol can be initiated once we select relevant Internet platforms and develop algorithms that identify misinformation about particular topics as it is posted. Such algorithms have been developed by other organizations and are effective in identifying specific postings, so this is not a major technological hurdle for us. For example, a researcher at RWTH Aachen University in Germany was able to survey YouTube to find videos making claims, both accurate and inaccurate, about climate change. A team of scientists from MIT and the Qatar Computing Research Institute used artificial intelligence to determine which Internet sources are accurate or politically biased. DrugTracker, developed by scientists at the New Jersey Institute of Technology, monitors online platforms such as Twitter and Reddit to find out where people are obtaining illegal drugs.
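
To make the detection step more concrete, here is a minimal illustrative sketch in Python of how a topic-specific classifier might flag candidate posts for review. Everything in it is a placeholder: the training examples, the model choice, the threshold, and the `flag_post` function are hypothetical and are not Critica's actual detection algorithm, which this post does not describe in technical detail.

```python
# Illustrative sketch only: flag posts on a chosen topic that resemble
# known misinformed claims. Training data and threshold are hypothetical.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical labeled examples: 1 = misinformed claim, 0 = accurate claim.
training_posts = [
    "vaccines overwhelm a child's immune system",            # misinformed
    "the recommended vaccine schedule is safe for children",  # accurate
    "measles outbreaks follow drops in vaccination rates",    # accurate
    "spacing out shots protects babies from immune damage",   # misinformed
]
training_labels = [1, 0, 0, 1]

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
features = vectorizer.fit_transform(training_posts)

classifier = LogisticRegression()
classifier.fit(features, training_labels)

def flag_post(text: str, threshold: float = 0.5) -> bool:
    """Return True if a new post looks like topic-relevant misinformation."""
    probability = classifier.predict_proba(vectorizer.transform([text]))[0, 1]
    return probability >= threshold

# A flagged post would then be routed to a trained Critica team member
# for a real-time, on-platform response.
print(flag_post("too many vaccines at once overwhelm the immune system"))
```

In practice, a system like this would sit behind whatever platform monitoring a project uses; the point of the sketch is only that per-post, topic-specific flagging of this kind is a solved problem, as the examples above from other organizations suggest.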

We began this effort by creating criteria for the kind of topic we will address. They are:

  1. The topic must be something that affects human health, whether individual, population, or both.
  2. There must be a clear scientific consensus on what the correct information about the topic is.
  3. Misinformation about the topic must be frequently found throughout the media and have the potential to harm individual and/or public health and safety.
  4. Critica team members must feel capable and competent to address the topic area.

As an example, consider the anti-vaccination phenomenon. This meets all four criteria: It represents a clear health issue; the overwhelming scientific consensus is that most vaccines are safe and effective and should be administered to all children who are free of medical contraindications; there is a considerable amount of anti-vaccination sentiment continuously found on the Internet and this has led to people refusing to vaccinate their children and to outbreaks of vaccine-preventable illnesses like measles; and Critica team members have done considerable work addressing the anti-vaccination movement.

On the other hand, topics in nutrition like whether red meat increases risk for adverse health outcomes, while interesting and prone to online misinformation, would not meet our criteria because there is not yet a clear scientific consensus on the topic. Similarly, smoking combustible cigarettes would not be an appropriate Critica topic because although there is a clear scientific consensus that this is a substantial health risk, misinformation about smoking safety is not likely to have much effect on actual smoking rates (i.e. most people will not decide to smoke because they see postings on the Internet claiming smoking is safe and does not cause lung cancer).

Once we select a topic and develop an algorithm to detect misinformed online statements, a Critica team member will interact with the individual who created the post, and with others who join the conversation, in real time and directly on the site on which the statement appears. We believe doing this as close in time as possible to when the misinformed posting appears is a critical step because the longer misinformation goes unchallenged, the more difficult it may become to counteract it and its subsequent behavioral effects.

Similarly, we do not believe that posting counteracting information on a different website will work as effectively as placing it on the same site where the misinformation appeared. Finally, there is evidence that interactive models of science communication work better to convince people of a scientific consensus than purely text-based approaches.

As an example, let us take vaccine safety and effectiveness. A typical incorrect statement found on some social media platforms would be: “vaccines overwhelm a child’s immune system and make them more susceptible to getting sick.” This statement is clearly not supported by the empirical evidence, yet it has the potential to influence parents who are on the fence about vaccinating a child to refuse vaccination or to demand vaccine schedules that differ from those recommended by the CDC and the American Academy of Pediatrics (AAP).

The next step in our protocol for counteracting online misstatements is for a trained Critica team member to enter into a conversation with the poster of the original misstatement (in this example, about vaccine safety) and anyone else who subsequently enters the chain of comments. Our approach has several overriding principles:


  1. Lack of information (the basis of the “deficit model”) is not the cause of many incorrect science beliefs; therefore, while restating the facts about an issue is an important component of challenging science misstatements, it is generally insufficient to change minds.
  2. The people we are addressing are usually not deliberately trying to harm the public for personal gain; they often truly believe the misstatements they make and are speaking out from a sense of duty to prevent harm.
  3. The steps we take have a basis in the scientific communication literature, either by virtue of high-quality laboratory studies, field studies, or both.
  4. We must be transparent that we are trying to minimize the spread of misinformation; this is not a “Nudge” approach that seeks to disguise that fact.
  5. We will be careful to avoid condescending and disrespectful messaging, but we recognize that there is some evidence supporting the value of using satire and humor in communicating science.
  6. We will try not to enter into discussions of topics with which we have any potential conflicts of interest, but when this is not possible we will disclose all real or potential conflicts of interest on our part.
  7. Our protocol represents a model that must be subjected to rigorous and ongoing evaluation and adjusted according to what those evaluations show.

The protocol as currently formulated and as it would be applied to the example misstatement on vaccines is as follows:

  1. Indicate that you have read the posted comments. Ensuring a person is heard is an important step toward engagement. However, in doing so, try not to repeat the misstatement verbatim, as this can reinforce the false belief.
    Ex. I saw and read your comment about vaccines and their effect on the immune system and I am very interested in what you have to say.
  2. Use motivational interviewing (MI) techniques to find common ground with the poster. MI is an evidence-based psychological method that has been shown to be effective in reducing drug addiction behaviors and improving medication adherence. It is a non-judgmental approach that attempts to establish a patient’s goals and elicit approaches to changing attitudes and behaviors that the patient will feel comfortable undertaking. Although combating misinformation in the general public is a slightly different context, we think that many of the same principles apply, so we are testing motivational interviewing techniques as a step in changing minds about scientific misinformation.
    Ex. Like you, it is very important to me that we do the best thing possible for our children. The last thing we want is to cause any harm to a child, whether it’s our own kid or someone else’s.
  3. Technique inoculation: An evidence-based approach to dissuading people from attitudes that contradict scientific evidence is to begin by “inoculating” them. This means giving an initial warning that the poster is about to receive information that contradicts their misstatement. We begin by addressing the technique or method used to generate the misinformation. Although it is not entirely clear how inoculation works, the theory behind it is that warning people that they are about to receive information challenging either the method behind the misinformation or its content elicits an anticipatory fear response that places the individual in a heightened state of attention. Then, a weakened version of the misinformation is presented in a rebuttal of either technique or topic, which stimulates a problem-solving mode. Finally, correct information is presented that can be more easily absorbed.
    Ex. There is a problem with the way the information behind what you said about vaccines was gathered, and I would like to point it out to you.
  4. Technique rebuttal: Now the actual nature of the problem with the method used to gather the misinformation is presented, usually after the poster or others have made comments about points one through three. First, the incorrect claim is stated in slightly weakened form and then immediately refuted.
    Ex. The technique used to develop the claim that vaccines can have a negative effect on the immune system stems from an incorrect generalization from the way illnesses like colds and the flu affect the immune system. It is easy to misunderstand the impact of vaccines on the immune system by using examples of how the immune system works when confronted with an actual source of illness, like bacteria or a virus that makes us sick. In that case, the response can indeed be overwhelming and even cause symptoms itself. But those examples don’t apply to how vaccines work.
  5. Topic inoculation: We warn the poster that we are about to provide information that directly contradicts the content of their claim.
    Ex. I am going to offer some facts that are quite different from what you have written.
  6. Topic Rebuttal: After a few exchanges, we now provide the actual content that contradicts the misstatement.
    Ex. It turns out that vaccines do not in fact cause an unusually large or harmful immune response. With every breath we inhale millions of tiny particles that cause an immune response greater than what occurs with a vaccine. Because vaccines are made of either dead bacteria and viruses or very small parts of them, they actually provoke only a very minimal immune response. Even if we gave a young child all the recommended vaccines at once, which of course we don’t do, it would not be enough to harm the immune system.
  7. Tell stories whenever possible. An often-quoted statistic is that stories are remembered 22 times more often than facts. Demonstrating that the Critica team member practices the behavior being advocated can make the message more believable.
    Ex. I didn’t like watching my son get the shots. He screamed a lot and he developed a fever. But I did make sure he got all of the immunizations on schedule and I am very glad I did that. Fever is temporary but the benefits of vaccines last decades to a lifetime. Now I feel he is completely protected from things that could harm him.
  8. We encourage a sense of self-efficacy by showing people that they are fully able to research and understand the facts we have provided on their own. People are also capable of recognizing their own biases and taking them into account as they weigh conflicting facts and evidence. Self-affirmation or self-efficacy, a sense of confidence that we are in control of our motivations, ideas, and behaviors, has been shown to be important in promoting behavior change. Feeling there is something we can do helps us change our minds, even when the topic is something we have firm prior beliefs about.
    Ex. There are lots of ways you can check out what I am saying. Talking to your pediatrician is one way. Or I can suggest some sites you can go to on the Internet that have reliable information about vaccines.
  9. Appeal to the need to be part of a group of like-minded people. Beliefs can become difficult to change if a person holds them because of a group affiliation, in part because social networks restrict the information a member of the network receives. We will try to show that there are good social groups that adhere to the correct scientific information on a given topic that people can join.
    Ex. Lots of parents have come to accept the fact that vaccinations are safe for their children and work to prevent some really awful diseases that we don’t want anybody’s child to get. Check out this example: https://vaccinatesc.org/.
  10. Repeat the information multiple times because many people will get only a superficial idea of what is being offered and will forget the content of a corrective statement if they see it only once.
  11. Evaluate the effectiveness of the interventions: We will develop a coding method to judge whether the comments and responses that follow our implementation of the protocol on a social media site appear to be changing minds and convincing people to make evidence-based decisions. These will be rated by independent assessors who are not told the study hypotheses. We will also use surveys and questionnaires to gauge whether we are having an impact and which elements of the protocol are most and least effective. A simple sketch of how such ratings might be scored appears below.
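
As a hedged illustration of the evaluation step, the Python sketch below shows one plausible way to score agreement between two hypothesis-blind assessors. The three-level coding scale, the rater data, and the "movement toward consensus" summary are all hypothetical assumptions for illustration; our actual coding scheme is still being developed.

```python
# Illustrative sketch of scoring independent assessors' ratings.
# The coding scale and ratings below are hypothetical placeholders.

from sklearn.metrics import cohen_kappa_score

# Each conversation is coded by two blinded assessors:
# 0 = no evidence of change, 1 = partial openness, 2 = accepts consensus.
rater_a = [0, 1, 2, 1, 0, 2, 1, 1]
rater_b = [0, 1, 2, 0, 0, 2, 1, 2]

# Agreement between the two assessors, corrected for chance.
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Inter-rater agreement (Cohen's kappa): {kappa:.2f}")

# Proportion of conversations both raters scored as showing at least
# partial movement toward the scientific consensus.
moved = sum(1 for a, b in zip(rater_a, rater_b) if a >= 1 and b >= 1)
print(f"Conversations showing movement per both raters: {moved}/{len(rater_a)}")
```

Ratings like these would be complemented by the surveys and questionnaires mentioned above, so that no single measure decides whether an element of the protocol is working.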

We stress that this protocol is a work in progress and that we will make modifications, however extensive, as our evaluation findings dictate. Because we have based this protocol on research we consider to be of high quality, we believe the Critica protocol has a reasonable chance of succeeding in guiding people to accept scientific consensus and make evidence-based decisions about their health and well-being.
