Regulating Facebook: It’s Not That Simple
Facebook is again facing criticism for its moderation policies after internal documents leaked by former Facebook employee Frances Haugen came to light. Outlets like the Washington Post, the Wall Street Journal, and the Associated Press sounded the alarm, and once again, Mark Zuckerberg's infamous creation was under threat. But though the condemnation was almost uniform among commentators, deciding what to do about it is far more difficult.
For the last ten years, Facebook has struggled to regulate itself effectively amid its growing size and a rise in online conspiracy theories and violent rhetoric. Recently, however, those struggles seem to be more intentional. Leaks of Facebook's internal communications show that the company encouraged harmful content by weighting negative reactions more heavily than standard likes, boosting inflammatory posts in its algorithm.
The consequences have not been confined to the United States, though they are certainly felt there. In India, Facebook's failure to stamp out conspiracy theories about child snatchers coming from outside the country led villagers to grab four men from their car and beat one of them to death. One researcher created a new account to see what Facebook looks like from an Indian user's perspective. Summing up the experience, the researcher explained, "Following this test user's News Feed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total…."
Nor was the violence restricted to a single incident against a small group. During the Rohingya crisis, Facebook failed to address rampant anti-Muslim propaganda on its site, some of which came from Myanmar's own military. UN investigators later reported that Facebook was a key component in the violence that drove 660,000 Rohingya from their homes and out of the country. The chairman of the UN's Independent International Fact-Finding Mission explained that Facebook "has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly, of course, a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media…."
In an interview, Dr. Jae Sik Ha, an associate professor in the Department of Communication at the University of Illinois Springfield, explained that this dependency on Facebook does not end with Myanmar: "Myanmar, India, and the Philippines…In those countries, Facebook is equivalent to the Internet." The failure to deal with this growing worldwide dependence has plagued the company for years, and with accusations of genocide and hate speech now piling up, Facebook's brand is taking a beating.
Then there are the politics of regulating Facebook. While some in the United States have called for Facebook to tighten its moderation policies, others are not so sure. A 2020 poll from the Pew Research Center found that 69% of Republicans and 30% of Democrats believe that social media companies favor liberal content over conservative posts. This is despite evidence that right-wing content spreads faster on Facebook and draws more engagement than its left-leaning counterparts: few liberal or left-wing content creators reach the daily top 25 posters, whereas conservative creators consistently do.
Amid these perceptions, conservative activists and politicians have pursued ways to pressure Facebook into changing its policies. Former President Donald Trump encouraged Republican lawmakers to repeal Section 230 of the Communications Decency Act, which shields social media companies like Twitter and Facebook from liability for what users post on their platforms, in the hope of forcing them to change course. Dr. Ha, however, notes that this will probably not happen and would likely backfire: "These social media companies have power. So, if we remove section 230, they will have more power." He added that removing liability protections would push companies like Facebook to police their content more aggressively to avoid lawsuits. In other words, repealing Section 230 is unlikely to be the most successful approach.
Republicans are not the only ones seeking to limit Section 230. Democratic Senators Amy Klobuchar and Ben Ray Luján have proposed a bill that would push Facebook to delete COVID-19 misinformation from its platform, though its legality is uncertain. The proposal, titled the Health Misinformation Act, was written to prevent misinformation from spreading online during a public health emergency. The difficulty is that what counts as misinformation depends on context and is therefore difficult to define uniformly.
Due to these challenges, Facebook remains as it is, with few consequences for its leadership. Though it is unclear what will happen next, the calls for change are growing louder, and a name change at the corporate level isn't going to make them go away anytime soon.