Facebook has a long way to go in reclaiming the trust of conservatives, according to an interim report, overseen by a former U.S. senator, on allegations of liberal bias at the social media giant.

“Many conservatives lost trust in Facebook, believing it discriminated against them,” former Sen. Jon Kyl, R-Ariz., wrote in a Tuesday op-ed for The Wall Street Journal. 

The same day, Kyl released the results of the report, begun in May 2018 at Facebook’s request, which examined whether the company has displayed political bias against conservatives.

Nick Clegg, Facebook’s vice president of global affairs and communications, called Kyl’s report the “first stage of an ongoing process.”

“This work is not an issue of personal political opinion. As at any large company, there is a diversity of political opinions at Facebook and plenty of people who would not describe themselves as conservatives,” Clegg said in a formal statement.

“My own long-held political views have been a subject of public record for years. But regardless of one’s own political views, this is about whether we apply our own policies fairly to all sides, and whether those policies begin with an understanding of how core groups of users express their beliefs.”

News reports surfaced in May 2016 that Facebook routinely suppressed news with a conservative perspective.

A former Facebook employee, who has remained anonymous, said it was common practice for the site to suppress news and information of particular interest to conservatives. 

“Depending on who was on shift, things would be blacklisted or trending,” the former news curator for Facebook said in a report published by Gizmodo. “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”

The new report was compiled by Kyl and a team at Covington & Burling LLP after they questioned more than 130 conservative politicians and leaders.

Conservatives, it says, were concerned that the platform’s algorithm changes worked against conservative content, frustrated that Facebook was unclear about what it designated as clickbait and spam, and skeptical of how the site determined what it called “false news.”

Kyl’s report concluded that “some of the third-party fact-checkers utilized by Facebook at various times (e.g., Snopes, PolitiFact, Factcheck.org, the Associated Press)—which are certified by the Poynter Institute, an entity that owns PolitiFact—have skewed to the ideological left.”

The report also found that users took issue with Facebook’s “hate speech” policy, citing the “highly subjective nature of determining what constitutes ‘hate’—an assessment that may be subject to the biases of content reviewers.” It said the term hate speech “is itself controversial, insofar as it may incorrectly ascribe motive in many cases.”

Among other concerns was Facebook’s practice of flagging content it deems too conservative, which resulted in parts of the Bible, St. Augustine’s writings, and the Declaration of Independence being removed or made less prominent.

The Heritage Foundation hosted Kyl in June 2018 for a listening session with conservative groups about their concerns that Facebook shows political bias.  

Two years earlier, former Sen. Jim DeMint, then president of The Heritage Foundation, and other conservative leaders met with Facebook co-founder and CEO Mark Zuckerberg to discuss the platform’s apparent bias.

“I made it clear that Facebook has every right to be as biased as it wants to be as a private company, and conservatives have every right to look elsewhere for social platforms if they feel Facebook is silencing them,” DeMint, who left Heritage in May 2017, wrote in an op-ed at the time. 

“But if Facebook promises its users an unbiased platform for the free exchange of ideas—all ideas—then it should keep that promise.”

Kyl’s report highlights several ways Facebook is trying to do just that, including (and quoting the report):

  • Helping users understand why they see (or do not see) certain content in their News Feeds: Facebook introduced ‘Why am I seeing this post?’ to inform users on why they see certain content and to enable them to control more easily what they see from friends, Pages, and Groups. Facebook is also improving ‘Why am I seeing this ad?,’ which launched in 2014.
  • Providing additional explanations of News Feed rankings: Although Facebook has taken steps to provide additional clarity around its Community Standards and News Feed ranking, the company told us that it remains committed to providing additional transparency to help people and publishers better understand how content is ordered in personalized feeds. 
  • Making enforcement actions against Pages more transparent: Page managers can now see when Facebook removes content that violates the Community Standards and when Facebook reduces distribution of posts rated “false” by a third-party fact-checker.
  • Sharing additional details about how the Community Standards evolve: Facebook released more information regarding how its policies are developed and debated, including by publishing notes from twice-monthly global meetings in which Facebook employees debate and discuss potential changes to the Community Standards. 

Kyl and his team are expected to submit another report on Facebook within a few months.

In his op-ed, Kyl, a senior counsel at the law firm of Covington & Burling, said Facebook still has significant progress to make in regaining trust. 

“As Facebook considers additional changes, we will continue to help it understand conservative perspectives,” Kyl wrote. “To live up to its vision as a platform for all ideas, I believe Facebook understands it must do all it can to regain the trust of conservative users.”