Over the past few years, many popular internet sites have begun to police hate groups, conspiracy theorists, armed right-wing groups, white supremacists and Christian nationalists. As a result, some have migrated to alternative digital platforms, such as Brighteon, CloutHub and Gab.
Policing the web is difficult. It often relies on complex software algorithms that decide what content users will see and when they see it.
Frances Haugen, the former Facebook employee and whistleblower, recently leaked a trove of revealing internal documents from the company, suggesting the platform allowed misinformation to spread widely in order to keep more people logging on.
In January, Facebook announced that when users search for terms associated with the Holocaust or Holocaust denial, it would direct them to credible information about the Holocaust outside the platform.
A Facebook spokesperson confirmed it is continuing to do that.
“We’re also going further by educating people on Facebook with authoritative information about the Holocaust when someone searches for terms associated with it,” the spokesperson said.
The ADL report suggests Facebook needs to expand the language parameters used to detect Holocaust denial, and combine machine learning techniques with human review.