A new appeals body, Appeals Centre Europe, has been established to review complaints about policy-violation decisions on major social media platforms such as Facebook, TikTok, and YouTube.
The new independent out-of-court dispute settlement body is supported by Meta Platforms’ Oversight Board and certified by Ireland’s media regulator, Coimisiún na Meán, under the EU’s platform rules, the Digital Services Act (DSA).
The newly formed panel will handle disputes in which users claim they have been wrongfully penalized or had their content removed over alleged policy violations. Initially, the appeals body will take cases related to Facebook, ByteDance’s TikTok, and Alphabet’s YouTube, with more social media platforms to be added over time. The institution will start receiving user disputes before the end of the year.
Appeals Centre Europe will apply human review to every appeal and resolve each case within 90 days. Its board of seven non-executive directors will decide whether the platforms’ decisions are consistent with their own content policies. The Centre will be self-funded through fees charged to social media companies for each case. Users raising a dispute will pay a nominal fee, which will be refunded if the Appeals Centre rules in their favour.
“We want users to have the choice to raise a dispute to a body that is independent from governments and companies, and focused on ensuring platforms’ content policies are fairly and impartially applied.”
Thomas Hughes, inaugural CEO of the Appeals Centre
At the same time, online platform providers may legally refuse to engage with the dispute settlement body, and its decisions are not binding on the parties.
The initiative’s main goal is to bring more transparency and fairness to content moderation, allowing users to challenge rulings made by these platforms. The formation of the appeals body is part of ongoing efforts to improve accountability in digital content regulation.
In the EU, the DSA became applicable to all online platforms last February. It sets clear obligations for online platforms, including social media companies and marketplaces, to protect users from illegal content, misinformation, and privacy violations.
Key DSA rules include removing harmful content swiftly, offering transparency in advertising, and giving users more control over their data. Nevertheless, social media users frequently claim they have been wrongfully penalized or had their content removed. Such complaints often stem from perceived inconsistencies in content moderation, vague community guidelines, or overzealous automated systems. Users report controversial penalties particularly often around sensitive topics such as politics, health, or social justice.
Nina Bobro
Nina is passionate about financial technologies and environmental issues, reporting on the industry news and the most exciting projects that build their offerings around the intersection of fintech and sustainability.