At a time when Big Tech seems hyper-focused on policing political content, it doesn't appear to have the same gusto for going after terrorist content. A new report from the Tech Transparency Project (TTP) finds hundreds of auto-generated pages on Facebook for terrorist groups like ISIS and Al Qaeda. These pages are created by Facebook's algorithm when users list places or interests in their profiles that don't already exist on the platform. "The company has not only failed to deal with this issue, but they are actually creating pages for those terrorist groups, despite knowing this has been a problem with their platform's features for several years," says Katie Paul, TTP director.
It does seem odd that a mammoth operation like Facebook (under its parent company Meta), which brags about fact-checking pictures and videos and slaps disclaimers on every post about vaccines or climate change, can't seem to get a handle on content from universally recognized terrorist groups. Paul believes Big Tech platforms simply haven't made this a priority. "This should be an easy win for Facebook," she tells KTRH. "But unfortunately we just don't see these companies investing appropriately to combat this content."
Beyond terrorism, the TTP report found Facebook facilitating other illegal activity. "We found that Facebook actually posts advertisements on Marketplace listings for human smuggling services," says Paul. "So the company is not just failing to address these problems, it is profiting from its failure to address those problems."
The solution may be more government oversight of Big Tech---something both parties tend to agree on. "Ultimately, we still need to see Congress hold these companies accountable in the way they already do with other major industries, like oil and gas or tobacco," says Paul.