The Washington Post

On social media, vaccine misinformation mixes with extreme faith

Even with renewed efforts by tech companies, religious-themed misinformation is among the hardest to police

February 16, 2021 at 6:00 a.m. EST
(Franziska Barczyk for The Washington Post)

In an insular world on the social media app TikTok, young Christians act out biblically inspired scenes in which they are forced to take a vaccine for the coronavirus, only to end up splattered in fake blood and on the brink of death.

The melodramatic videos are an attempt to represent how the introduction of coronavirus vaccines could herald the biblical End Time. Along with hundreds of thousands of other vaccine-questioning posts by social media users around the world, they show how health misinformation is targeting Christians, with some posts reaching sizable audiences.

Some churches and Christian ministries with large online followings — as well as Christian influencers on Facebook, Instagram, TikTok, Twitter and YouTube — are making false claims that vaccines contain fetal tissue or microchips, or are drawing connections between vaccine ingredients and the devil. Others talk about how coronavirus vaccines and masks contain or herald the “mark of the beast,” a reference to an apocalyptic passage from the Book of Revelation that suggests the Antichrist will test Christians by asking them to put a mark on their bodies.

The rapid spread of this material has triggered debate and concern among U.S. Christian leaders and experts who believe the religious movement against vaccines is growing, even as many leaders such as Pope Francis and Southern Baptist Convention policy leader Russell Moore are urging people to get shots. Both approved vaccines, Pfizer-BioNTech and Moderna, passed rigorous federal safety reviews and were shown to be more than 94 percent effective at preventing disease.

“In the summertime, I thought, these are just fringe beliefs. But the further we got into the pandemic, I realized, these are very widely held, and I was surprised by how many Christians and churches subscribe to this,” said Emily Smith, an epidemiologist at Baylor University, a private Christian university in Waco, Tex. She runs a large Facebook page dedicated to discussing covid-19.

Smith, who is Christian and married to a Baptist pastor, said her posts trying to counter anti-vaccine sentiment have been met with hostile responses and threats.

“It’s one of the scariest and most disheartening parts of this, that so many people think that when you put on a mask, it is the mark of the beast or signals that you don’t have faith or God isn’t in control,” she added.

The prevalence of these baseless or distorted claims is yet another example of the way technology companies have failed to control the spread of harmful and problematic material on their platforms. Social media companies have all banned misinformation about the coronavirus and the vaccine, citing the potential for such material to cause “imminent harm.” But enforcement has been spotty.

Groups opposed to vaccines have long used social media to get out their message. But with the development of coronavirus vaccines and the pandemic shutdowns, the companies have enabled misinformation about vaccination and the virus itself to become much more widespread. Conspiracy theories about covid-19 that proliferated on social platforms were a major force behind people protesting shutdown measures and government restrictions over the last year, while a misleading documentary called “Plandemic” became a top trending video on YouTube.

White evangelicals, along with Black Americans of different faiths, are some of the groups with the highest levels of vaccine skepticism in the United States. Just under a third of U.S. adults say they will probably or definitely not get the vaccine, compared to 44 percent of those who identify as White evangelicals, according to a January Washington Post-ABC News poll. Other polls have found higher levels of vaccine skepticism among White evangelicals.

Some creators said they were not entirely certain that the material they were sharing was accurate or certain to become reality, but that it was important to raise questions — and to warn and potentially protect fellow citizens.

“If you believe without a shadow of a doubt that a bus was coming at you, would it not be evil of me to not tackle you and try to get you out of the way?” said Tyler Lackey, a 27-year-old Dallas-based Christian influencer and rapper who goes by the username BeyondHymn on TikTok and other platforms.

The Book of Revelation describes the End Time as a bloody battle filled with persecution, during which a beast forces “all people, great and small, rich and poor, free and slave, to receive a mark on their right hands or on their foreheads,” according to the New International Version. “They could not buy or sell unless they had the mark, which is the name of the beast or the number of its name,” 666.

A video by Lackey in July — “Could vaccines be the mark of the beast?” — speculated that mask mandates in stores and other rules might signal the mark because they were the first step in preventing people from being able to buy and sell, which Revelation suggests will be prohibited in the End Time. The video was viewed over 100,000 times.

He said that video and another he had made about Bill Gates were “shadow banned” by TikTok, meaning they were blocked from the service’s recommendations page.

In December, Facebook and Facebook-owned Instagram banned covid-related content associated with the mark of the beast, fetal tissue myths and misinformation connecting vaccine ingredients to the Antichrist, as part of a broader ban on vaccine-related conspiracy theories. Such material has become harder to find but is still available, according to a Washington Post review using the Facebook-owned analytics tool CrowdTangle.

For example, in late December, a Facebook page with 61,000 followers called Exposing Satanic World Government posted a screenshot from the right-wing news site Gab, in which a user claimed that Amazon, banks and airlines would force people to take the vaccine. The screenshot shows Gab founder Andrew Torba commenting that such practices were the “Mark of the beast.” The post, which Facebook removed after The Post shared it with the company, had 315 likes and shares. (Amazon founder and chief executive Jeff Bezos owns The Post.)

TikTok banned two hashtags, #MarkOfTheBeastIsTheCovid19Vaccine and #VaccineIsTheMarkOfTheBeast, in late January in response to The Post’s inquiry. The company said the hashtags and affiliated ones, which had over 700,000 views in December, were tied to misinformation about coronavirus vaccines and violated the company’s medical misinformation policy. The company has banned such misinformation since March.

Thousands of tweets related to the mark of the beast disappeared in January after Twitter conducted a large sweep of accounts connected to the QAnon movement — a sprawling set of false claims that have coalesced into an extremist ideology that has radicalized its followers — in the wake of the Capitol riots, according to Zignal Labs, an analytics firm that identified nearly 12,000 references to the mark of the beast across social media in December. That suggests a significant overlap between those sharing Christian-themed conspiracy theories and those sharing other conspiracy theories about the election and covid, according to Zignal.

Experts said that religious-themed content can fall into a gray area of policy enforcement for technology giants, making it some of the hardest misinformation to police. Tech platforms have tried to draw a distinction between people expressing opinions or negative viewpoints about the coronavirus and people putting out misinformation about it. They also try to discern overall intent. But such distinctions are challenging to suss out, particularly when content is framed as a question.

“Linking misinformation to religion can be difficult to combat because a person may feel like they are being attacked for their religious beliefs,” said Kolina Koltai, a vaccine misinformation researcher with the University of Washington’s Center for an Informed Public.

Conspiracy theories are often most potent when they are targeted at specific communities rather than aimed for mass appeal, she said, because they “are connected to deep-rooted parts of a person’s value system.”

The technology platforms can supercharge people’s exposure to misinformation when their algorithms direct users to content that is similar to what they have previously viewed or encourage them to join like-minded groups and communities where misinformation can fester unchecked.

Fashion influencer Taylor Rousseau, 21, said she and her family received numerous threats after posting a viral music video on TikTok last year in which she pretended to be injected with a deadly microchip for refusing the vaccine. In the video, Rousseau, wearing her own brand of thick faux eyelashes, is splattered with fake blood while a James Arthur song plays. She prays tearfully for a savior, who appears as a light shining from the clouds at the end of the clip, which was viewed nearly 700,000 times.

“If someone doesn’t get the vaccine ’cause of me, I don’t really feel that is harming them,” Rousseau said in an interview, adding that she got the idea for making her video from another creator who had made a similar one. “Social media is already causing an overwhelming amount of fear.”

Rousseau explained that many Christians she knows believe the social unrest and division of the past year are a sign of the world ending, and that her video was meant as a metaphor for what could happen if the apocalypse were to come to pass — not a claim that the vaccine itself contains a microchip.

“Everybody missed the point,” she said. “The video was 100% a POV [point of view] to portray what life would be like in the End Times.”

The video was removed from TikTok for violating the company’s “integrity and authenticity policy,” which prohibits “medical misinformation that can cause harm to an individual’s physical health,” but she appealed the decision and it was reinstated two weeks later, according to records provided to The Post.

Along with misinformation, social media platforms are hosting spirited religious debates about these topics. A YouTube video titled “Is the COVID 19 vaccine the Mark of the Beast?” by a pastor of a megachurch in Riverside, Calif., who argues that the vaccine is not the mark, has over 202,000 views.

Conversations within communities are sometimes more effective at combating misinformation than labeling or other measures from social media companies, said Renee DiResta, research manager at the Stanford Internet Observatory.

“People who are understanding of the framework of belief and empathetic to the underlying religious concern have a huge role to play in combating this misinformation,” she said. “It has to come from someone who occupies a position of trust.”

The claim that the mark of the beast is a microchip within the vaccine is a Christian-themed spin on a distorted narrative about Microsoft co-founder and philanthropist Bill Gates. That claim has been circulating all year among different groups, but the hoax gained new force when repurposed as a religious claim, DiResta said.

Misinformation becomes increasingly difficult to combat when religious leaders and celebrities endorse it. Ties between the vaccine and the mark of the beast have been drawn by pastor Guillermo Maldonado of Miami megachurch King Jesus International Ministry — which President Donald Trump visited in 2020 to launch his campaign outreach to Latinos — and by the Amazing Power of Prayer, a Christian ministry focused on virtual prayer that has more than 246,000 Facebook followers.

King Jesus International did not respond to requests for comment. Amazing Power of Prayer founder Trevor Winchell said in an email that he had come to his conclusions after spending “many long hours digging and researching vaccines and microchips and the possible combination of a microchip being put into a vaccine.”

When rapper Kanye West called the vaccine the mark of the beast in July, searches for the terms “mark of the beast” and “vaccine” spiked on Google and Facebook and were more popular than at any other point during the year, according to CrowdTangle and Google Trends. A spokesperson for West did not respond to a request for comment.

Foreign players, including Russia and China, have also gotten involved in sowing division over vaccination issues. In January, articles by state media service Russia Today and China Plus America about Facebook blocking posts by a Mexican cardinal who called the vaccine the mark of the beast received several thousand likes and shares, according to CrowdTangle.

Another creator, Sam Smith, a 33-year-old sales representative in St. Louis who posts popular Christian-themed content on Instagram and other services, said TikTok took down a video that, he said, did not actually claim the vaccine was the mark of the beast but merely raised questions about it. A similar video is still on Instagram.

“The moment you say something that goes against anyone’s agenda, you are banned or shadow banned,” he said. His next video, he said, would be about his claim that the same number of people died in 2019 as in 2020.

Experts say that viral claim is false and that U.S. deaths in 2020 topped 3 million, by far the most ever counted.

Michelle Boorstein contributed reporting.
