Doctors Can Digitally Dispel Vaccine Disinformation and Misinformation
Since the early days of AOL and Yahoo, digital information has transformed how people access and share health information. While the digital age has increased access, it has also accelerated the spread of disinformation and misinformation.
To address this issue, Dr. C. Everett Koop spearheaded an industry initiative in 1999 to protect people from unreliable health information by involving doctors, nurses, and pharmacists in content creation.
In the years since, companies such as Healthline and RxWiki elevated this process by implementing rigorous and transparent fact-checking, which has helped to enhance consumer trust in digital health information.
However, with the generative artificial intelligence (AI) revolution of late 2023, trust in health content will never be the same, posing significant societal problems.
Researchers wrote in an article published in PLOS ONE on May 8, 2024, that understanding how people consume news and navigate the online information environment is essential to tackling this trust issue.
This study analyzed news posts on Twitter in France, Germany, Italy, and the U.K. It revealed how essential topics are perceived across nations and the role played by misinformation.
The study's results reveal that reliable, known sources largely shape the information landscape. However, unreliable content is ever-present across all countries and topics.
From a real-world perspective, a recent poll reinforced that healthcare providers are highly trusted in the United States.
Professionals in "white coats" were ranked as the most trusted sources of information about health and vaccines, as most adults (68%) say they usually keep up-to-date with their provider's recommendations.
Unfortunately, government sources of vaccine information, such as the U.S. CDC and local public health departments, fared much worse in the KFF poll.
A potential solution to the digital trust debate may have quietly evolved over the last year.
Elon Musk's transformation of X's Community Notes may be the health industry's best option.
This novel approach empowers diverse contributors to identify posts containing misinformation and to challenge misinformation by appending informative notes to the X posts. Importantly, this process is controlled by the public instead of decision-makers at the company.
A new study recently published in JAMA, led by John W. Ayers, Ph.D., of the Qualcomm Institute at UC San Diego, found that Community Notes, a crowdsourced approach to addressing misinformation, helped counter false health information in popular posts about vaccines with accurate, credible responses.
In a university news release, Ayers stated, "X's Community Notes have emerged as an innovative solution, pushing back with accurate and credible health information."
The sample of Notes studied was attached to X posts that averaged over 1 million views, extrapolating to between 500 million and 1 billion views for all vaccination-related posts noted.
"Our study shifts the focus from talking about misinformation to taking action, offering practical insights into social media strategies that protect public health," explained Mark Dredze, Ph.D., the John C Malone Professor of Computer Science at Johns Hopkins University and study co-author, in a press release on April 24, 2024.
"Although we couldn't examine how these Notes directly influenced people's beliefs or actions, the characteristics we analyzed have consistently been shown to predict messages effectiveness."
Ayers concluded, "Other social media platforms should embrace transparency by open-sourcing their misinformation countermeasures. This step is crucial for enabling independent scientific scrutiny, enhancing public trust, and amplifying adoption of the most impactful strategies."