EU residents will have a new way to dispute content moderation decisions by Facebook, YouTube and TikTok


European Union residents will have a new place to turn to settle disputes with Facebook, TikTok and YouTube. A new Appeals Centre, certified by Ireland's media regulator, will soon begin accepting complaints about content moderation decisions.

The concept is similar to Meta's Oversight Board, which weighs in on content moderation decisions across Facebook, Instagram and Threads. Meta has suggested that other social media companies should use its Oversight Board, though there's been little incentive for them to do so. Europe's Digital Services Act (DSA) changed that calculation somewhat, as it enabled the creation of Out-of-Court Dispute Settlement (ODS) bodies that have the ability to help resolve user complaints.

And while the Appeals Centre is a separate entity, there are some notable links between the two organizations. The new Appeals Centre will be led by Thomas Hughes, who was previously the CEO of the Oversight Board Administration. The Oversight Board Trust, which oversees the board's budget, also helped fund the new Appeals Centre with a "one-time grant," according to a statement from its chair of trustees, Stephen Neal. And the first non-executive trustees of the Appeals Centre are also trustees on the Oversight Board.

The Appeals Centre says it expects to be up and running "in late 2024," at which time individuals and organizations will be able to request appeals through its website. Users wishing to appeal a moderation decision from Facebook, YouTube or TikTok will be required to pay a "nominal fee" that will be refunded if the group rules in their favor, according to information posted on its website.

However, it's not clear exactly how this process will work or how many cases the group will be able to take on. Meta's Oversight Board, which has been up and running for years, received nearly 400,000 appeals in 2023 but issued just a few dozen decisions. The Appeals Centre may also end up being less influential than the Oversight Board. Ireland's media regulator notes that "the decisions of ODS bodies are not binding." Still, the Appeals Centre could increase the visibility of the kinds of content moderation issues that often frustrate users, and give them some hope that their situations may be reconsidered.




