On June 11, 2025, the Center for the Study of Organized Hate (CSOH) and FactReview, a Greece-based fact-checking organization, co-organized a panel discussion on the European Union’s Digital Services Act (DSA) and how South and Southeast Asian diasporic communities in the EU can leverage the regulation to combat and prevent online harms, particularly those that are transnational in origin but local in impact.
The DSA, which came into full effect for all online platforms on February 17, 2024, introduced a robust set of user rights, platform obligations, and an enforcement architecture designed to accommodate the EU’s diverse national landscapes. While many had anticipated that the “Brussels Effect” would shape global standards on hate speech and disinformation, this has yet to materialize. Major platforms have increasingly resisted meaningful accountability.
Enforcing the DSA is not automatic. It depends on coordinated efforts among national Digital Services Coordinators (DSCs), trusted flaggers, civil society organizations (CSOs), and out-of-court dispute resolution bodies. This panel brought together representatives of each of these stakeholder groups to walk participants through the enforcement process, highlight challenges, and explore strategies for community-level engagement.
The DSA and Its Components
The discussion began with Ioanna Choudalaki, Head of the DSA Team at the Hellenic Telecommunications and Post Commission (EETT), Greece’s designated DSC, who outlined the key provisions of the DSA and the obligations it places on digital service providers. These include mandatory reporting mechanisms that platforms must provide for users to report illegal or harmful content, specific requirements for Very Large Online Platforms (VLOPs) to grant researchers access to data, and the responsibility to assess and mitigate systemic risks arising from their services.
She also emphasized the role of voluntary codes, particularly those addressing illegal hate speech and disinformation, which support platforms in meeting DSA requirements and foster collaboration among platforms, trusted flaggers, DSCs, and researchers. She highlighted the critical role of CSOs in identifying illegal content and working with DSA-designated stakeholders to ensure that such content is addressed effectively.
Andronikos Koutroumpelis, Senior Editor at FactReview and himself a trusted flagger, explained how EU countries take distinct approaches to specific violations of the DSA. These differences, he noted, must be carefully considered by CSOs when encountering violative content across jurisdictions. He also clarified the role of trusted flaggers, emphasizing that their mandate from DSCs is limited to areas within their designated expertise.
Andronikos added that users can still file legal removal requests directly through platforms. Because most reports are processed algorithmically rather than by human moderators, he highlighted the value of contacting trusted flaggers across Europe who specialize in hate-speech detection.
Niklas Eder, COO and co-founder of User Rights, an out-of-court dispute resolution body, outlined his organization’s role within the DSA enforcement framework. These bodies are designed to intervene when a user disagrees with a platform’s decision to either remove or retain a piece of content. Under the DSA, any EU resident has the right to appeal such decisions through a certified dispute resolution body, which will assess the case based on both the platform’s community guidelines and the applicable laws of the user’s jurisdiction. While these reports must originate from within the EU, the scope of the dispute can extend to content and harms with a global reach.
Addressing Transnational Online Harms
A key question before the panel was how the growing South and Southeast Asian diaspora in the EU can effectively engage with the DSA framework. Eder addressed the challenge of tackling hate speech and disinformation in regional languages from these areas, noting the importance of identifying whether specific dispute resolution bodies have the capacity to handle such linguistic contexts. He emphasized the crucial role of CSOs in providing cultural and contextual expertise, particularly when harmful content originates outside Europe, where out-of-court mediators may lack the necessary background. Eder also highlighted the European Commission’s responsibility in supporting CSOs by helping them identify the appropriate organizations for addressing violative content.
Andronikos also highlighted the importance of addressing dog whistles and hate speech conveyed through coded language. He noted that fact-checking networks are well-positioned to detect such language, particularly when they understand the cultural and political context in which the hate is spreading. While acknowledging the challenges of moderating content that is deliberately ambiguous, he stressed that CSOs and diaspora communities must collaborate to identify recurring patterns of platform inaction or refusal to act on harmful content.
“One of the strongest forms of evidence is to demonstrate that we reported, say, 100 pieces of harmful content, and the platform removed only 5%. That’s hard evidence of systemic failure,” Andronikos said.
Ioanna advised community members to begin by sending a platform notice that cites the relevant hate-speech or harassment provisions under the host country’s criminal code. A legally framed request, she noted, compels platforms to issue a reasoned response. If the platform fails to act, the same evidence bundle, complete with screenshots, URLs, and timestamps, should be forwarded to a language-appropriate trusted flagger, whose fast-track status under the DSA all but ensures a human review. If this step is also unsuccessful, the case should be escalated to the national DSC, whose statutory mandate includes logging recurring failures as “systemic risks” and reporting them to the European Commission.
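The escalation ladder Ioanna described lends itself to systematic tracking. Below is a minimal sketch, in Python, of how a CSO might record where each case stands. The stage labels, the Case fields, and the escalate helper are illustrative assumptions for this article, not DSA terminology or an official tool.

```python
from dataclasses import dataclass, field

# Illustrative labels for the three escalation steps described above;
# none of these names come from the DSA itself.
ESCALATION_LADDER = [
    "platform notice citing the national criminal-code provision",
    "language-appropriate trusted flagger (fast-track review)",
    "national Digital Services Coordinator (systemic-risk log)",
]

@dataclass
class Case:
    url: str                                            # full URL of the reported content
    evidence: list[str] = field(default_factory=list)   # screenshot paths, timestamps, etc.
    stage: int = 0                                      # index into ESCALATION_LADDER
    resolved: bool = False

def escalate(case: Case) -> str:
    """Move an unresolved case one rung up the ladder and report the next step."""
    if case.resolved:
        return "case closed"
    if case.stage < len(ESCALATION_LADDER) - 1:
        case.stage += 1
    return ESCALATION_LADDER[case.stage]

# Example: a platform ignored the initial notice, so the case moves on.
case = Case(url="https://example.com/post/12345",
            evidence=["screenshots/post12345.png"])
print(escalate(case))  # -> language-appropriate trusted flagger (fast-track review)
```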
What should CSOs do?
All panelists emphasized that the DSA remains in the early stages of implementation, with much of its scope still untested. This reality underscores the urgent need for collaboration among communities, DSCs, trusted flaggers, and all other stakeholders involved in the DSA’s enforcement. The panel discussion offered a detailed look at how each segment of the enforcement ecosystem functions to ensure the DSA’s effective implementation.
CSOs must begin building a robust body of evidence documenting hate speech and disinformation targeting diasporic communities in Europe, particularly when such content originates outside the continent. Evidence collection must be systematic: documenting the content itself, its cross-regional impact on users, and how platforms respond to user reports. CSOs must consistently flag violations, identify the jurisdictions in which users were harmed, and outline steps for remediation. This work also requires sustained capacity-building efforts and the identification of long-term partners committed to the process.
For South and Southeast Asian diaspora CSOs, a simple but effective starting point is maintaining an “evidence diary.” Every time hateful or misleading content targets their communities, they should save a screenshot, record the time, copy the full URL, and note the platform’s response. A well-organized collection of such examples carries significantly more weight than isolated complaints. When submitting a report, it is essential to explain why the content violates the law in the user’s country; citing the relevant hate speech or harassment provision helps establish a pattern of systemic failure.
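To make this concrete, here is a minimal sketch of such an evidence diary as a short Python script that appends entries to a CSV file. The file name, field names, and the example entry (including the cited legal provision) are illustrative assumptions; nothing here is prescribed by the DSA.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("evidence_diary.csv")  # illustrative file name

FIELDS = [
    "recorded_at",        # when the entry was logged (UTC)
    "platform",           # which service hosted the content
    "url",                # full URL of the post
    "screenshot",         # path to the saved screenshot
    "language",           # language of the content
    "legal_basis",        # national provision cited in the report
    "platform_response",  # "removed", "kept", "no reply", ...
]

def log_entry(**entry):
    """Append one report to the evidence diary, writing a header on first use."""
    new_file = not LOG_FILE.exists()
    entry.setdefault("recorded_at", datetime.now(timezone.utc).isoformat())
    with LOG_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(entry)

# Example: a single hypothetical entry.
log_entry(
    platform="ExamplePlatform",
    url="https://example.com/post/12345",
    screenshot="screenshots/2025-06-11_post12345.png",
    language="Tamil",
    legal_basis="Art. 130 StGB (incitement to hatred, Germany)",
    platform_response="no reply",
)
```

Even a spreadsheet kept with the same discipline serves the purpose; what matters is that every entry captures the content, the legal basis cited, and the platform’s response.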
Establishing working relationships with accredited trusted flaggers is also critical. These entities maintain ongoing engagement with platforms, and CSOs should leverage those relationships to help flaggers develop a deeper understanding of how hate speech and disinformation manifest within diasporic communities. If platforms still refuse to act, the next escalation point is an out-of-court dispute settlement body. Submitting a small batch of similar cases as a single appeal strengthens the case by offering a clear, concise dossier. Persistent patterns of neglect should be compiled into quarterly reports and forwarded to the national DSC, as sketched below. A file linking dozens of unresolved reports to tangible offline harm can compel regulators to investigate whether the platform poses a systemic risk.
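Building on the evidence-diary sketch above, a quarterly summary for the DSC could be as simple as counting reports and removals per platform, the same “reported 100, removed only 5%” figure Andronikos pointed to as hard evidence. This is again a hedged sketch: it assumes the CSV format from the earlier example already exists on disk.

```python
import csv
from collections import Counter

def quarterly_summary(log_file: str = "evidence_diary.csv") -> None:
    """Print per-platform report and removal counts from the evidence diary."""
    reported, removed = Counter(), Counter()
    with open(log_file, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            reported[row["platform"]] += 1
            if row["platform_response"] == "removed":
                removed[row["platform"]] += 1
    for platform, total in reported.items():
        rate = 100 * removed[platform] / total
        print(f"{platform}: {total} reported, {removed[platform]} removed ({rate:.0f}%)")

quarterly_summary()
```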
Finally, the push for language inclusion remains essential. Until South Asian and Southeast Asian languages receive full moderation and research coverage, the DSA’s promise of equal protection will remain unfulfilled. Community groups must keep language equity at the forefront of every engagement with both platforms and regulators.