Tech giants have long relied on a relatively lean-staffed business model to make social media a profitable enterprise. That model has been undermined by significant user abuses, including hate speech, terrorist propaganda and child pornography. Now the war on COVID-19 vaccination poses a new threat with immediate, lethal consequences. Facebook and other platforms have taken steps to stem the tide of disinformation, so far with little impact. Yelp, however, believes it may have a more effective answer.
Social media is not winning the war against COVID-19 vaccine disinformation
Whether social media platforms ought to police their content is the core of the issue. After all, editors in traditional media, from print outlets to radio and television, have the power to exercise pinpoint control over their content.
That power does not necessarily come down on the side of truth and justice, as outlets like Breitbart News and various television personalities amply demonstrate. Nevertheless, the power exists: conventional news organizations can and do select their content on a granular, human scale.
The sheer size and pace of social media platforms make them an entirely different animal, one that is impossible to police one post at a time.
Until recently, social media platforms relied on the argument that they are merely technology tools and therefore not responsible for user content. A significant crack in that armor emerged in 2016, when the grassroots organization Sleeping Giants launched its boycott campaign. Rather than asking individual consumers to boycott media organizations, Sleeping Giants asks them to pressure advertisers and other stakeholders to withdraw their advertising dollars.
The vaccine disinformation problem festers
Despite paying some attention to vaccine disinformation in recent years, Facebook and other social media platforms have become powerful weapons in the war against COVID-19 vaccines.
Last May the Center for Countering Digital Hate (CCDH) name-checked just 12 individuals it deemed responsible for circulating the majority of COVID-19 vaccine disinformation on Facebook and other social media platforms.
The investigation highlighted the failure of social media platforms to detect even the most egregious abuse, despite all the resources at their command.
In particular, CCDH singled out Facebook for permitting “private and secret anti-vaccine Groups where dangerous anti-vaccine disinformation can be spread with impunity.”
That should have prompted action, but earlier this week the organization Media Matters drew attention to the "Dan Stock" vaccine misinformation video, which apparently spread from YouTube to Facebook and Twitter.
“In a little more than three days, a viral video pushing misleading claims about coronavirus vaccines and masks has earned more than 90 million Facebook engagements from uploads to streaming platforms, receiving millions of views. The video is spreading despite YouTube and Facebook’s rules against coronavirus misinformation, and its reach is significantly higher than the numbers for earlier coronavirus conspiracy theory videos,” Media Matters reported, chastising both platforms for failing to learn from earlier experiences.
Turning up the pressure on tech companies
The business-to-business boycott strategy of Sleeping Giants has motivated next-level action from at least one advertiser. In July 2020, Pernod Ricard introduced an app that appears to draw from the Sleeping Giants crowdsourcing model. The app enables the company to get alerts directly from consumers about abusive social media content.
The app provides Pernod Ricard with a pathway for proactively protecting its brand and communicating abuses to its advertising platforms, rather than waiting to be flagged by boycott campaigns.
By October 2020, the company was spearheading an industry-wide campaign to adopt the crowdsourcing alert model for hate speech.
As of this writing the initiative has not pivoted to include vaccine misinformation, but industry observers have speculated that an “Internet Superfund” could help clean up abuses related to COVID-19 information.
The Yelp social media solution
As a tech company and social media platform, Yelp is also at risk of reputational loss due to vaccine disinformation. However, over the years Yelp has built a relatively effective system for identifying and punishing businesses that post or solicit fake reviews, deploying a combination of software and human moderation.
Yelp also earns good marks for removing abusive rants and other off-topic reviews, including those posted in coordinated attacks known as “review bombing.”
The company is already on the alert for review bombs related to the COVID-19 pandemic, and earlier this week it announced that it is ramping up its efforts to stem the tide.
In a blog post earlier this month, Yelp’s vice president of user operations Noorie Malik introduced a new filter that enables consumers to select businesses according to their COVID-19 vaccine policies for customers and staff. Malik also explained the steps that Yelp is taking to protect businesses against abuses by anti-vaccination reviewers.
The steps mirror the protections Yelp extends, through its Consumer Alerts program, to Black-owned businesses and to businesses identified by other identity attributes.
“For businesses that activate ‘Proof of vaccination required’ and ‘All staff fully vaccinated’ on their Yelp page, we are putting protective measures in place to proactively safeguard them from reviews that primarily criticize the COVID health and safety measures they enforce,” Malik wrote, adding that the past several weeks have seen an increase in review bombing related to vaccine misinformation.
The Facebook social media solution
Yelp’s new vaccine policy filter could help provide businesses with much-needed community support in pushing back against local anti-vaccination agitators.
Meanwhile, in a striking study in contrasts, earlier this week Facebook announced that it had finally shut down a leading anti-vaccination network. Practically on the same day, the company officially launched a new “prayer post” feature, which it has been testing since last year.
“In Facebook Groups employing the feature, members can use it to rally prayer power for upcoming job interviews, illnesses and other personal challenges big and small. After they create a post, other users can tap an ‘I prayed’ button, respond with a ‘like’ or other reaction, leave a comment or send a direct message,” reported the Associated Press.
Whether the new prayer feature can help stop more people from posting, sharing, and absorbing vaccine misinformation on social media remains to be seen.
Meanwhile, an increasing number of boots-on-the-ground religious institutions are holding mass vaccination clinics in a desperate attempt to save their members’ lives.
As the Delta variant plows a bloodthirsty path through unvaccinated populations — including children and infants — the prayer post feature could become a real money-maker for Facebook. Perhaps the company could devote some of that revenue to stepping up its own efforts on vaccine misinformation.
Image credit: Martin Lopez/Pexels
Tina writes frequently for TriplePundit and other websites, with a focus on military, government and corporate sustainability, clean tech research and emerging energy technologies. She is a former Deputy Director of Public Affairs of the New York City Department of Environmental Protection, and author of books and articles on recycling and other conservation themes.