Can you trust medical information on Facebook? Not without careful consideration!
Billions of people use the internet for medical information. Unfortunately, a great deal of false or misleading health information circulates online, and it can be surprisingly hard to tell the accurate from the inaccurate.
Facebook, the biggest social network in the world, could have a positive impact on global health. Instead, its algorithms and lax controls have allowed the platform to become a breeding ground for health misinformation. In fact, experts believe Facebook has failed to keep people safe and informed during the COVID-19 pandemic.
The reach of Facebook is staggering.
Before we dive into the health misinformation threat, it’s important to understand the power and reach of Facebook. According to Statista, Facebook had roughly 2.85 billion monthly active users worldwide in the first quarter of 2021.
Why is it so important to understand the threat of misinformation?
In the “good old days”, the public turned to, and relied on, doctors and scientists for medical information. Now, anyone with a computer can make up medical information, share it and reach thousands.
It doesn’t take a lot of deep thinking to understand why health misinformation is a threat to global health. The impact that Facebook and other social media platforms have on the actions and beliefs of their users is enormous.
Many studies have found a correlation between believing conspiracy-theory misinformation and not following health-protection guidelines on matters such as vaccination or safer sex. For example, misinformation spread across social media by anti-vaxxers contributed to the 2019 measles outbreaks in the US and other countries.
And recent surveys in the UK found that COVID-19 conspiracy theories published on social media discourage believers from following health-protective measures. Additionally, articles touting bogus cures can lead people to try dangerous treatments: one post, with 4.5 million views, implied that colloidal silver is a safe alternative to antibiotics.
Certainly, false health information undermines the efforts of reputable health institutions and sparks fear and distrust. That is the opposite of what the world needs during a pandemic that requires widespread public cooperation to stop the spread.
The US surgeon general issues a strong advisory about misinformation online.
On July 15, 2021, the US surgeon general, Dr. Vivek Murthy, issued a formal advisory (his first) alerting Americans to the serious public health threat posed by health misinformation. The advisory notes that during the COVID-19 pandemic, many people have encountered health information that is false, inaccurate, or misleading according to the best available evidence at the time.
During a news briefing on the advisory, Dr. Murthy made it clear he was targeting tech and social media companies, saying they have a responsibility to be more aggressive in fighting misinformation. Interestingly, he specifically mentioned Facebook as a culprit.
What does Facebook have to say?
According to a New York Times article, a spokeswoman for Facebook responded to Dr. Murthy’s advisory by stating: “We permanently ban pages, groups and accounts that repeatedly break our rules on Covid misinfo, and this includes more than a dozen pages, groups and accounts from some of the individuals referenced in the press briefing today.”
Additionally, in December 2020, Facebook promised to keep people “safe and informed” about the coronavirus. And even before the pandemic, in July 2019, Facebook announced it was increasing efforts to minimize health content that is sensational or misleading.
Has Facebook lived up to these promises? Can it do a better job? What can you do?
Fake accounts flourish.
According to Avaaz, as of August 2020 an estimated 125 million fake accounts were still active on the platform, by Facebook’s own admission.
Where does false medical information on Facebook come from?
Much of the false medical information on Facebook is generated by networks of organizations and people who run websites intent on deceiving the public. Many of these networks have been spreading health misinformation for years, including fake statistics about vaccinations.
However, researchers at the nonprofit Avaaz found that some of these networks spreading misinformation about COVID-19 were not focused on health issues until the pandemic.
Avaaz analyzed false medical information on Facebook.
In an effort to determine the scale and impact of false medical information on Facebook, Avaaz examined all websites that, according to NewsGuard, are untrustworthy and have repeatedly shared false content, including information on health or COVID-19.
They narrowed the list by including only websites with at least one clear example of widely shared health misinformation (as fact-checked by a third party), and then identified the top Facebook pages connected to those websites.
From this group, they kept only public Facebook pages with at least three items of fact-checked misinformation that generated at least 100,000 interactions. Their final list included 82 websites and 42 ‘superspreader’ Facebook pages.
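To make the selection criteria concrete, here is a minimal sketch of that filtering logic under stated assumptions: the domain name, page name, field names, and numbers are purely illustrative, not Avaaz’s actual data or code.

```python
# Hypothetical sketch of Avaaz's filtering criteria (illustrative data only).

# Each candidate website flagged by NewsGuard, with any fact-checked examples of
# widely shared health misinformation and the public Facebook pages tied to it.
candidate_sites = [
    {
        "domain": "example-health-news.com",        # hypothetical domain
        "fact_checked_examples": 1,                  # must be >= 1 to stay in
        "pages": [
            {"name": "Example Wellness Page",        # hypothetical page
             "misinfo_items": 4,                     # fact-checked misinformation posts
             "interactions": 250_000},               # likes, comments, shares
        ],
    },
]

# Step 1: keep only websites with at least one clear, fact-checked example
# of widely shared health misinformation.
sites = [s for s in candidate_sites if s["fact_checked_examples"] >= 1]

# Step 2: from the pages connected to those sites, keep only public pages with
# at least 3 items of fact-checked misinformation and >= 100,000 interactions.
superspreader_pages = [
    p
    for s in sites
    for p in s["pages"]
    if p["misinfo_items"] >= 3 and p["interactions"] >= 100_000
]

print(f"{len(sites)} websites, {len(superspreader_pages)} 'superspreader' pages")
```

Applied to the full candidate set, this kind of two-step filter is what produced the 82 websites and 42 ‘superspreader’ pages in Avaaz’s final list.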
MIT analyzed cancer treatment ads on Facebook.
A 2022 analysis of Facebook ads by MIT Technology Review found scores of ads with misleading or false health claims related to cancer treatments.
Unfortunately, these misleading ads can remain on Facebook for months or even years. Worse, the analysis found that some of the ads promoted treatments proven to cause acute physical harm in some cases, while others touted very expensive treatments with questionable outcomes.
What’s the harm of these misleading ads?
Certainly, it’s not helpful to use cancer treatments that are unproven or ineffective. But the harm can go further: some alternative cancer treatments advertised on Facebook can cause physical harm, and some interfere with conventional treatments such as chemotherapy. Trying them can also delay the start of proven therapies, giving the cancer a chance to advance while complicating and diminishing the effectiveness of conventional treatment.
In fact, research shows that those who first try unproven cancer treatments have worse survival rates.
False medical information on Facebook is widely viewed.
Avaaz found that health misinformation spread by networks on Facebook reached an estimated 3.8 billion views in the last year, spanning at least five countries — the US, the UK, France, Germany, and Italy. Moreover, as COVID-19 became a harsh reality for much of the world, there were an estimated 460 million views of health misinformation on Facebook just in April 2020.
How does this compare with views of reputable sites?
In April 2020, content from the top 10 websites spreading health misinformation drew an estimated four times as many views on Facebook as reliable content from the websites of 10 leading health institutions, such as the WHO and the CDC. Notably, this occurred while Facebook was publishing reliable information through its COVID-19 information center.
Misinformation spreads like cancer.
The number of websites and Facebook posts dedicated to spreading false health information is alarming enough on its own. But the reach of that misinformation explodes as people and organizations share false content to build followers and help it go viral.
These false-information sites often rely on sensational and provocative content, which increases their appeal and, with it, the “likes” and “shares”.
Furthermore, because many of these websites and Facebook pages share false and misleading content frequently to a large audience, it’s almost impossible for the algorithms to detect and fact-check the content before millions of people see the misinformation.
An alarming example of the rapid spread of misinformation.
As COVID-19 took hold around the world, conspiracy theories about Bill Gates and COVID-19 spread quickly, with 1.2 million mentions on TV and social media between February and April 2020.
Consider the article “Gates’ Globalist Vaccine Agenda: A Win-Win for Pharma and Mandatory Vaccination”. On April 9, 2020, the vaccine skeptic organization Children’s Health Defense shared this article, which contains 9 false claims about Bill Gates and his polio vaccine program. The original article reached over 3.7 million views on Facebook.
Furthermore, 29 of Avaaz’s 82 health misinformation websites shared all or part of the article, quoted from it, or linked to it within 10 weeks of the original publication. These copies garnered an additional 4.7 million views in six languages. Additionally, 15 of the 42 ‘superspreader’ Facebook pages shared, linked to, or quoted a version of this specific piece of health misinformation.
Importantly, Facebook labeled the original post as false, but the subsequent posts on other pages appeared without any false-information label.
Who are the top ‘superspreaders’ of false health information on Facebook?
Avaaz concluded that Facebook pages (including public pages, closed groups, and personal profiles) were one of the main channels the 82 websites used to achieve viral spread. The 42 top ‘superspreader’ public Facebook pages identified by Avaaz each generate at least 100,000 interactions on posts, and collectively they have over 28 million followers.
Moreover, these pages have amassed a staggering estimated 800 million views in one year, repeatedly sharing content from, and directing viewers to, the websites.
Avaaz’s report also names the top 10 spreaders of health misinformation on Facebook (see the full report linked at the end of this post).
What about Facebook’s labels to warn users about false information?
Only 16% of the health misinformation Avaaz analyzed had a warning label from Facebook. Even though Facebook promised to fact-check their content, the other 84% of articles and posts remain online without warnings.
With this much misinformation floating around Facebook, you can’t help but ask – why don’t the Facebook algorithms work?
The role of Facebook’s algorithm in amplifying health misinformation.
Facebook uses an algorithm to determine what content each user sees in their News Feed. Among other variables, it weighs the user’s connection to the person or group posting and the number of reactions and comments a post receives.
Because health misinformation is often provocative and sensational, it attracts significant engagement, which in turn prompts the algorithm to boost the content even further. A vicious cycle begins.
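To illustrate that feedback loop, here is a minimal, hypothetical sketch of engagement-weighted ranking. This is not Facebook’s actual News Feed algorithm; the scoring weights, post data, and function names are assumptions made purely for the example.

```python
# Hypothetical engagement-weighted feed ranking: posts that attract more
# reactions, comments, and shares are ranked higher, get shown to more users,
# and so collect still more engagement.

def rank_score(post, affinity):
    """Score a post from engagement signals and the viewer's affinity (0-1)."""
    engagement = post["reactions"] + 2 * post["comments"] + 3 * post["shares"]
    return affinity * engagement

posts = [
    {"title": "Measured public-health update", "reactions": 120, "comments": 10, "shares": 5},
    {"title": "Sensational 'miracle cure' claim", "reactions": 900, "comments": 300, "shares": 450},
]

for round_number in range(3):
    # Rank the feed for a viewer with equal affinity to both sources.
    feed = sorted(posts, key=lambda p: rank_score(p, affinity=0.5), reverse=True)
    top = feed[0]
    # The top-ranked post is shown to more people, so it collects more
    # engagement, which raises its score again on the next pass.
    top["reactions"] += 500
    top["shares"] += 200
    print(round_number, [p["title"] for p in feed])
```

Run over a few rounds, the sensational post keeps winning the top slot because each appearance earns it more engagement, which is exactly the vicious cycle described above.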
Although Facebook claims they aggressively try to downgrade health misinformation and boost reputable sources, the tactics are clearly not working.
Additionally, the estimated 125 million active fake Facebook accounts skew the algorithm on an ongoing basis. How? The engagement data generated by these fake accounts does not represent real users.
The researchers at Avaaz concluded that “Facebook is still failing at preventing the amplification of misinformation and the actors spreading it.” Their findings suggest that health misinformation spreaders weaponize Facebook’s algorithm and/or that the algorithm is biased towards amplifying misinformation.
Furthermore, their findings suggest that Facebook’s policies to counter this problem are not being applied effectively enough.
What could Facebook do to slow the spread of false medical information?
Avaaz presents a 2-step solution to stop the spread of medical misinformation on Facebook:
1. Correct the Record: Provide all users who have seen misinformation with independently fact-checked corrections. Avaaz estimates this could decrease belief in the misinformation by an average of almost 50%.
2. Detox the Algorithm: Downgrade misinformation posts and systematic misinformation in News Feeds, potentially decreasing their reach by as much as 80%.
Unfortunately, Facebook has not effectively applied either of these solutions, so the false medical information on Facebook flourishes, despite the ongoing concern from doctors and other health experts.
What can you do?
Certainly, don’t trust everything you read on Facebook or websites. When seeking medical information, start by speaking to your doctor.
If you want to find medical information on Facebook, look for reputable sites run by government agencies (e.g. CDC), international agencies (e.g. WHO), or top teaching hospitals (e.g. Mayo Clinic, Cleveland Clinic, Johns Hopkins, Massachusetts General Hospital). Additionally, you can find reliable information on the major news outlets (e.g. NBC, CBS, ABC, NPR and STAT).
For more information, read the Avaaz report: Facebook’s Algorithm: A Major Threat to Public Health.
Additionally, read my blog post Can you Trust Medical Information Online?