The news came from a colleague — not a doctor but someone who works in the emergency room and has seen firsthand the devastation caused by the pandemic. “There is a cure for Covid-19,” he said. “It must be true because a doctor friend shared a Facebook post about this cure.”
When confronted with the latest, credible scientific evidence — that there is no cure for Covid-19, that the disease has killed more than 180,000 Americans precisely because we have no effective way of averting death for the millions who are infected — he doubled down. “But I saw it on Facebook,” he said.
In the emergency room and in conversations with the American public through cable news interviews and Op-Eds like this one, we’ve both been working to dissect and debunk the many myths about this new virus, its potential treatments and the possibility of a vaccine. We read the mistruths on our patients’ phones, listen to theories borrowed from internet chat rooms and watch as friends and family scroll through Facebook saying, “Here — it says that this was definitely created in a Chinese laboratory.”
Seven months into the worst pandemic of our lifetime, the virus continues to spread alongside medical myths and health hoaxes. False news is not a new phenomenon, but it has been amplified by social media. A new report about Facebook from Avaaz, a nonprofit advocacy organization that tracks false information, shows how widespread and pervasive this amplification is.
Websites spreading health hoaxes on Facebook peaked at an estimated 460 million views on the platform in April 2020, according to the report, just as the virus was spreading around the world and overwhelming hospitals in New York City. Facebook claims to assess and add warning labels to factually incorrect posts, but in a subset of posts analyzed by Avaaz, only 16 percent of those containing health misinformation carried a warning label.
Facebook’s algorithm rewards and encourages engagement with content that provokes strong emotions, which is exactly the kind of content we warn patients to doubt and carefully assess, since false information is often packaged as novel and sensational. The report’s title calls Facebook’s algorithm “A Major Threat to Public Health” — something our clinical and research experiences amply confirm.
Public health organizations have been unable to keep up with the deluge of sophisticated medical myths and pseudoscience shared on Facebook. Despite the efforts of the Centers for Disease Control and Prevention and the World Health Organization, content from the top 10 health misinformation sites received four times as many Facebook views as content from the C.D.C., W.H.O. and eight other leading health institutions during April 2020.
Facebook enables known misinformation spreaders to share their bunk widely. Networks spreading health conspiracy theories and pseudoscience generated an estimated 3.8 billion Facebook views between May 28, 2019, and May 27, 2020.
The report quantifies the reach of so-called superspreaders of health misinformation and disinformation on Facebook, including websites such as GreenMedInfo and RealFarmacy, which package pseudoscience as credible, believable news. These include false claims that 5G technology is harmful to human health and that certain types of vaccines have never been tested.
While GreenMedInfo has been removed from Pinterest, it thrives on Facebook: In the last year, it received more than 39 million views. And RealFarmacy, which according to Avaaz is on track to become one of the largest health misinformation networks in the world, received an astonishing 581 million views in a year. One article alone, hawking colloidal silver as a treatment for viruses, was viewed an estimated 4.5 million times. We can’t compete with a global platform whose powerful algorithm rewards sensational, false content.
We see the consequences in the clinic and the emergency room. Patients question our evidence-based medical guidance, refuse safe treatments and vaccines, and cite Facebook posts as “proof” that Covid-19 is not real.
While doctors and other health care professionals play a critical role in educating the public, we are not immune to the sophisticated techniques of false information. Colleagues have confided in us that they believe the virus is man-made and diminishing in strength; others have asked us to invest money in Covid-19 “cures.” While we try, each day, to counter these dangerous falsehoods that circulate among our patients and our peers, our ability to counsel and provide care is diminished by a social network that bolsters distrust in science and medicine. Facebook is making it harder for us to do our jobs.
Purveyors of false news will always exist; for as long as there have been epidemics there have been snake oil salespeople exploiting fear and peddling false hope. But Facebook enables these charlatans to thrive. Absent a concerted effort from Facebook to rework its algorithm in the best interests of public health — and not profit — we will continue to throw water on little fires of misinformation while an inferno blazes around us.
Seema Yasmin (@DoctorYasmin) is director of the Stanford Health Communication Initiative and the author of the forthcoming book “Viral B.S.: Medical Myths and Why We Fall for Them.” Craig Spencer (@Craig_A_Spencer) is an emergency medicine physician and director of Global Health in Emergency Medicine at NewYork-Presbyterian/Columbia University Medical Center.
"Opinion" - Google News
August 29, 2020 at 02:00AM
https://ift.tt/2ECQm4O
‘But I Saw It on Facebook’: Hoaxes Are Making Doctors’ Jobs Harder - The New York Times
"Opinion" - Google News
https://ift.tt/2FkSo6m
Shoes Man Tutorial
Pos News Update
Meme Update
Korean Entertainment News
Japan News Update
No comments:
Post a Comment