
The reports reveal that the board is still negotiating its relationship with the social network and says its own credibility depends on Facebook’s being more forthcoming.

Facebook’s Oversight Board issued a strong reprimand of the company in a set of quarterly reports Thursday, accusing it of not being “fully forthcoming” about a key program. The reports highlight the tense negotiations between the two entities as the board tries to force greater transparency from the social media giant despite its limited power.

The board, an experimental panel created by Facebook to oversee its most complicated content decisions, said the company “failed to provide relevant information” about its “XCheck” program, which shields VIP users such as politicians and celebrities from its rules. On other occasions, the information Facebook provided to the board was incomplete, the reports said.

It was “not acceptable,” the board wrote, that Facebook did not mention the “XCheck” system when it briefed the board on its enforcement policies for politicians when the oversight board was reviewing the company’s decision to ban former president Donald Trump.

The public rebuke highlights strains in Facebook’s interactions with the oversight body, which has questioned the quality and quantity of information that Facebook provides.

“The credibility of the Oversight Board, our working relationship with Facebook, and our ability to render sound judgments on cases all depend on being able to trust that information provided to us by Facebook is accurate, comprehensive, and paints a full picture of the topic at hand,” said the board, an international panel of about 20 experts in a variety of fields.

The relationship between Facebook and the board has been tense since September, when reporting in the Wall Street Journal asserted that Facebook had “misled” the board in its description of the XCheck program. Facebook told the board in June that “XCheck” was used only in “a small number of decisions,” but the Journal’s reporting revealed it had included at least 5.8 million VIP users in 2020. After that report, the board said that Facebook recognized that the information it provided could be construed as misleading. The board said Facebook disclosed at the briefing that the company completes an average of fewer than 10,000 cross-check reviews per day.

On Thursday, the board announced that it would launch a review of XCheck, following a request from Facebook, and make recommendations on how the program could be changed. Facebook asked the board for specific guidance on its reviews and on how to promote transparency about the program.

In a statement to The Washington Post, Facebook spokesman Andy Stone thanked the board “for their ongoing work.”

“We believe the board’s work has been impactful, which is why we asked the board for input into our cross-check system, and we will strive to be clearer in our explanations to them going forward,” he added.

Julie Owono, a member of the board who specializes in international law and human rights, said the board is seeking to engage with stakeholders beyond the company, including experts outside the United States and civil society organizations.

“We do not only rely on the company’s words,” said Owono, the executive director of Internet Sans Frontières, an organization that defends digital rights and access to the Internet. Facebook’s request for guidance “is a step in the right direction to have a more public conversation on that mechanism,” she said.

Facebook is navigating a crisis in Washington as it deals with the fallout from documents leaked by whistleblower Frances Haugen, which formed the basis for the Journal’s XCheck reporting. The Oversight Board and Haugen have announced that they plan to meet to discuss the cross-check program.

Earlier this year, Facebook committed to “fully implement” the board’s recommendation that it explain the rationale, standards and review process behind XCheck. But in Thursday’s reports, the board said the company had still not provided that detailed explanation of the program.

An experimental endeavor widely referred to as a “Supreme Court” of Facebook, the board finds itself at a critical point about a year into its existence. The board has sought to position itself as an independent and neutral third party that rigorously interrogates Facebook’s practices. But this relationship is largely dependent on the social network’s benevolence, since the board has no government affiliation or legal standing to compel Facebook to share information on company operations or to comply with board requests.

“Even when the company doesn’t answer, it speaks volumes,” Owono said. “It allows us to actually be more pushy, and look for that information externally, outside of the company.”

Facebook created the Oversight Board to promote a more even distribution of power in response to criticism that chief executive Mark Zuckerberg and a handful of other top executives wield too much power over a fleet of social networks used by more than 3.5 billion people globally. But the board, which is funded by a Facebook-backed trust, has been a lightning rod for controversy since its inception. Critics say it insulates the company from responsibility for the most important decisions, even as Facebook is not required to comply with the board’s recommendations.

“We are responsible for enforcing our policies every day and we make millions of content decisions every week,” Zuckerberg said in a post outlining the vision for the board in 2019. “But ultimately I don’t believe private companies like ours should be making so many important decisions about speech on our own.”

The board’s process also is lengthy, averaging 74 days to decide and implement each case. It is able to review only a tiny fraction of the decisions that Facebook makes. Between October 2020 and the end of June 2021, Facebook and Instagram users submitted about 524,000 cases to the board. The company also referred 35 cases. In total, the board selected 21 cases and proceeded with 17 of them. By the end of June, it had decided 11 cases, overturning Facebook’s decisions eight times.

The board made 52 recommendations to Facebook as part of those decisions, but the company has not agreed to implement all of them. In many instances, it has told the board it is “assessing the feasibility” of the recommendations. Owono said the board is developing a mechanism to track the status of its recommendations.

In the transparency reports, the board reprimanded Facebook for not answering all of its questions. The board had sent 156 questions to Facebook as part of its decisions through the end of June. The company declined to answer in 14 instances, and only partially answered in 12, the board said.

The board’s primary recourse, if Facebook does not stick to its agreements, is to turn to the news media. The new transparency reports could be a key tool for the board to pressure Facebook to adhere to the commitments it makes to the board.

Owono said the board has made progress in pushing the company to improve. After the board highlighted that Facebook had not translated its community standards into Punjabi, a language spoken by more than 100 million people in India and Pakistan, the company said it would implement that recommendation. The board also continues to take on new cases, including one in Ethiopia.

“I feel the steps we’ve taken are extremely encouraging for a task that is absolutely gigantic and has world implications,” Owono said. “We will not solve all the problems overnight. We are continuing to receive cases that will continue to pose challenging questions.”

Source: The Washington Post, by Cat Zakrzewski
