Self-Regulation Isn’t Working, So How Should Facebook Be Regulated?

by Tanner Stening, October 5, 2021

[Photo by Matthew Modoono/Northeastern University]

Democrats and Republicans appeared united on Tuesday on the need to enact reforms protecting social media users from the harmful effects of Facebook’s algorithms, which were documented in leaks provided to the Wall Street Journal last month.

Frances Haugen, a former Facebook employee turned whistleblower, testified before the Senate Commerce subcommittee about the tech giant’s decision-making in the face of internal research showing that its platform was being used for sinister purposes, and that its algorithms promoted content that triggered outrage or was hateful. Among other things, Facebook was being used to incite genocide against ethnic minorities in Ethiopia, promote human trafficking, and sow doubt about the threat of the COVID-19 pandemic, the Journal reports.

[Photo caption: David Lazer, university distinguished professor of political science and computer sciences and co-director of the NULab for Texts, Maps, and Networks, and John Wihbey, associate professor of journalism and media innovation. Photos by Adam Glanzman and Matthew Modoono/Northeastern University]

Despite this evidence, which Facebook itself compiled and is accused of keeping secret, Haugen maintains the social media company turned a blind eye, neglecting its duty to inform its shareholders and the public about what it knew, and profited from the ill effects of its algorithms.

But as ready as Democrats and Republicans appear to be to work together to shine a light on the inner workings of Facebook and other tech giants, it remains to be seen whether any political solution they devise will reflect a consensus on the fundamental issues at hand, several Northeastern experts say.

“The question is can they craft something that sort of satisfies both sides of the aisle,” says John Wihbey, associate professor of journalism and media innovation at Northeastern. “We’ve seen this before. Whenever tech [executives] go before Congress, they tend to be grilled on the left and the right, and for different reasons.”

Large social media companies like Facebook have come under increased scrutiny for their role in spreading misinformation, hate speech, and other problematic content. But Democrats have aimed their criticism primarily at the companies’ lax moderation of misinformation, while Republicans have called attention to what they perceive as censorship, particularly after former president Donald Trump was banned from Twitter and Facebook following the Jan. 6 attack on the U.S. Capitol.

Those differences were hardly on display during Tuesday’s hearing, as both sides pledged action, rallying around the need to protect users from Facebook’s secretive practices. Lawmakers focused on the way the social media giant, which also owns the picture-sharing platform Instagram, targets children and teens with its “engagement-based ranking” algorithms. Haugen testified that Facebook’s own research has shown that its Instagram algorithms feed insecurity and exacerbate mental health problems, promoting content that glorifies eating disorders, for example, to young female users.
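“Engagement-based ranking,” the mechanism at the center of the hearing, generally means ordering a feed by how likely each post is to provoke interaction rather than by recency or accuracy. The Python sketch below illustrates only that general idea; the Post fields, weights, and scoring formula are hypothetical simplifications chosen for clarity, not Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    p_like: float      # model-estimated probability of a like (assumed input)
    p_comment: float   # model-estimated probability of a comment (assumed input)
    p_reshare: float   # model-estimated probability of a reshare (assumed input)

def engagement_score(post: Post) -> float:
    """Toy engagement score: a weighted sum of predicted interactions.

    The weights are hypothetical. Note that nothing in this objective
    measures accuracy or user well-being; content that provokes
    reactions scores highest.
    """
    return 1.0 * post.p_like + 4.0 * post.p_comment + 8.0 * post.p_reshare

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order candidate posts by descending engagement score."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Calm local news update", p_like=0.10, p_comment=0.01, p_reshare=0.005),
        Post("Outrage-inducing rumor", p_like=0.08, p_comment=0.09, p_reshare=0.060),
    ])
    for post in feed:
        print(f"{engagement_score(post):.2f}  {post.text}")
```

In this toy example the rumor ranks first despite drawing fewer likes, because comments and reshares are weighted more heavily, a simplified analogue of the dynamic the testimony and leaked research describe.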
Democrats and Republicans alike expressed concern over these findings.

“Our differences are very minor, or they seem very minor, in the face of the revelations that we’ve now seen,” Sen. Richard Blumenthal, a Connecticut Democrat, told his Republican colleague, Jerry Moran of Kansas, who suggested the two should “resolve their differences” and “introduce legislation” providing more oversight of Facebook and its practices.

But even as both sides agree on the need for more transparency, partisan tension still simmers in the background, says David Lazer, university distinguished professor of political science and computer sciences at Northeastern.

“The left has been taking a more aggressive stand on suppressing the spread of misinformation, and those on the right believe those standards around misinformation will be unfairly used to suppress conservative voices,” Lazer says.

That is a real point of departure between the two parties, one that could slow or stymie efforts to tackle their joint concerns down the road, Lazer notes.

“Both [parties] are worried about big tech, and both might agree that there should be ways of making what big tech does more transparent,” he says. “But how they interpret what’s happening—that’s where we could see a difference.”

For media inquiries, please contact media@northeastern.edu.