
In April 2025, EDMO BELUX 2.0 welcomed researchers, journalists, media literacy experts, and civil society representatives to a workshop and networking event organised at UCLouvain Saint-Louis Bruxelles and titled “Mitigating Disinformation in Belgium and Luxembourg.” Over 50 participants attended the event, which combined expert presentations with exchanges of views on current challenges and future directions in disinformation mitigation. The sessions explored various dimensions, including media literacy strategies, regulatory frameworks, platform policy developments, fact-checking practices, the pressures faced by legacy media and journalists, as well as trends, threats, and responses related to foreign information manipulation and interference. Through engaging presentations, interactive roundtables, and networking moments, participants tackled two essential questions: “Where are we now?” and “Where are we going?”. Participants adopted several perspectives to critically examine the mitigation of disinformation. They assessed the current state, discussing existing challenges and considering whether a turning point has been reached. They also reflected on possible future issues, evaluated the effectiveness of current initiatives, and explored potential new strategies to better address disinformation. Additionally, discussions highlighted the resources and collaborations needed to strengthen mitigation efforts and move forward.
The EDMO BELUX workshop provided a comprehensive overview of the current state and future directions of disinformation and its mitigation through four thematic panels, each addressing distinct but interconnected aspects of the phenomenon.
Challenges and Perspectives for Media Literacy Approaches
With contributions from different speakers involved in media literacy initiatives, the first session highlighted significant changes and challenges in media literacy education. Historically, events such as the Charlie Hebdo attacks and political developments like Trump’s 2016 election have dramatically reshaped some major objectives of media literacy, moving from critical thinking about news production and media industry processes to fighting conspiracy theories and “fake news”. This raises a paradox for media educators: should they restore trust in legacy media institutions presented as trustworthy, or focus on preserving critical thinking towards all media, including traditional sources? With increasing polarization, the issue of trust in institutions now extends beyond the media industry, affecting other parts of society, such as science.
In the educational sphere, despite growing awareness among teachers of the need to address disinformation, actual implementation remains limited due to factors such as time constraints, insufficient support from educational authorities, and a lack of adequate training and resources. During media literacy workshops, many students, from many different contexts, display limited or no understanding of disinformation concepts. During these interventions, students are more inclined to pay attention to trivial questions, such as journalists’ income or connections with famous people, than to news production processes. Students’ (dis)information knowledge seems to depend mainly on whether the topic was previously addressed in class. Media literacy workshops with journalists can take several forms, with journalists intervening in class or with classes visiting media companies to witness journalists’ work environment and news production practices. The strategies adopted during such workshops raise important questions, for instance: should journalists acknowledge that mistakes can be made in the process of producing news and explain how they try to prevent them?
In higher education settings, where disparities in media literacy among students are high, previous exposure to media literacy training can sometimes lead students to resist additional education, believing further training to be redundant. This resistance is compounded by the challenge of effectively incorporating emerging technologies, especially artificial intelligence, into existing literacy frameworks. Adapted approaches, taking students’ specificities into account, should be developed to overcome these challenges.
Media literacy is seen as essential for everyone, especially more vulnerable groups, and, consequently, it should appear in a diversity of spaces. A better understanding of media and information usage might help develop a coherent media literacy approach.
Regulatory Frameworks and Changing Platform Policies
The second panel explored the current regulatory landscape of disinformation. In Europe, several regulatory tools related to disinformation mitigation have been developed and implemented in recent years. For instance, the European Union’s Digital Services Act (DSA), complemented by the self-regulatory Code of Practice on Disinformation, makes very large online platforms (VLOPs) – and search engines, or VLOSEs – accountable for disinformation mitigation, disinformation being one of the identified systemic risks. The DSA’s rules on platform governance, content moderation, transparency, and due diligence all have potential impacts on disinformation.
Ongoing EDMO research projects are mapping the efforts that digital platforms are making – or not – to counter disinformation in EU countries, focusing on their reported activities and collaborations (in the context of their commitment to the Code of Practice on Disinformation) as well as on how local stakeholders (civil society, government entities, journalists, etc.) can identify forms of collaboration with those platforms. How effectively are VLOPs empowering users, collaborating with media and information literacy experts, supporting disinformation research, and facilitating useful data access?
While the legal framework seems to be evolving in the right direction, it does not overcome all the challenges and comes with its own potential concerns, such as the sufficiency and efficiency of self-regulatory or co-regulatory measures, which currently seem quite limited in holding platforms accountable. Platforms’ collaborations with local communities and media and information literacy experts are currently poor, especially in small countries. The same goes for access to platform data for research purposes. The general context also shows indications that some digital platforms are backsliding on content regulation. This raises the question of next steps, with future political choices that might have to be made.
Another issue concerns the definition of “disinformation” used in such contexts: who defines it, and with what effect? Sometimes the notion is loosely defined; sometimes each platform adopts its own interpretation. Drawing boundaries for disinformation remains difficult, as the notion is closely tied to freedom of expression.
Fact-Checking, Legacy Media, and Journalism Under Pressure
This panel addressed the dual pressure faced by contemporary journalism and fact-checking, particularly in the context of increasing hostility and technological disruption.
First, recent trends indicate intensified attacks on legacy media organizations and individual journalists, driven by diverse political and social groups. Journalists increasingly face targeted harassment from political figures, pressure from activist groups, and legal threats challenging journalistic independence and freedom of expression. Such pressures might contribute to the destabilization of the media landscape, potentially undermining public trust and journalistic credibility. Thus, the media landscape and journalists’ independence should be protected, and a strong and resilient media sector seems crucial.
Second, the growing influence of generative artificial intelligence (AI) raises paradoxical challenges for fact-checking. On the one hand, AI tools frequently propagate inaccuracies and amplify problematic disinformation narratives. AI-driven platforms illustrate these dangers, often generating biased, misleading, or outright false content, thus complicating fact-checking efforts rather than facilitating them. On the other hand, AI tools might be used to facilitate fact-checking processes, but they cannot replace the complex human skills required. While journalists and fact-checkers demonstrate a willingness to experiment and a receptiveness to integrating AI into workflows, clear guidelines, an ethical approach, and training are needed to use these tools cautiously. Research shows promise in supporting fact-checking practices, but it should be conducted with adequate resources, in an interdisciplinary approach, in collaboration with fact-checkers, and with a focus on their needs.
Trends, Threats, and Responses to Foreign Information Manipulation and Interference
The final thematic panel focused on foreign information manipulation and interference (FIMI), highlighting its increasingly complex and transnational nature. Belgium and Luxembourg, like many countries, have been targeted by sophisticated disinformation campaigns involving extensive cross-platform coordination and exploitation of specific societal vulnerabilities. Evidence from structured intelligence tools has revealed intricate disinformation networks utilizing varied tactics, including impersonation of media figures, and extensive disinformation operations via digital platforms. A structured analysis using tools such as STIX and OpenCTI allows a comparative approach and the discovery of recurring patterns, enabling strategic planning, faster responses, and better collaboration. However, current analytical frameworks still lack sufficient specificity and adaptability to fully decode these complex campaigns, indicating a need for enhanced tools and methodologies designed specifically for disinformation analysis.
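To illustrate what such structured analysis looks like in practice, the sketch below hand-builds a minimal STIX 2.1-style bundle in plain Python, linking an indicator for an impersonated media domain to a threat actor. This is the kind of object graph that platforms like OpenCTI ingest and compare across campaigns. The actor name and domain are invented for illustration, and the example uses only the standard library rather than a dedicated STIX toolkit.

```python
import json
import uuid
from datetime import datetime, timezone

def stix_id(obj_type: str) -> str:
    """Build a STIX 2.1-style identifier: '<type>--<uuid4>'."""
    return f"{obj_type}--{uuid.uuid4()}"

now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")

# Hypothetical actor behind a media-impersonation campaign (invented name).
actor = {
    "type": "threat-actor",
    "spec_version": "2.1",
    "id": stix_id("threat-actor"),
    "created": now,
    "modified": now,
    "name": "Example FIMI Actor",
    "threat_actor_types": ["nation-state"],
}

# Indicator for a cloned news-site domain (invented for illustration).
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": stix_id("indicator"),
    "created": now,
    "modified": now,
    "name": "Impersonated media outlet domain",
    "pattern": "[domain-name:value = 'example-news-clone.example']",
    "pattern_type": "stix",
    "valid_from": now,
}

# Relationship object tying the indicator to the actor it points at.
relationship = {
    "type": "relationship",
    "spec_version": "2.1",
    "id": stix_id("relationship"),
    "created": now,
    "modified": now,
    "relationship_type": "indicates",
    "source_ref": indicator["id"],
    "target_ref": actor["id"],
}

# A bundle groups the objects for exchange between analysis tools.
bundle = {
    "type": "bundle",
    "id": stix_id("bundle"),
    "objects": [actor, indicator, relationship],
}

print(json.dumps(bundle, indent=2))
```

Sharing bundles of this shape between teams is what enables the comparative approach mentioned above: identical indicator patterns or recurring relationship graphs across campaigns can then be matched automatically rather than rediscovered case by case.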
Participants emphasized the critical role of community-level vulnerabilities in facilitating FIMI effectiveness. Communities characterized by, among other factors, political polarization and low trust in traditional media have proven especially susceptible. There is a need for a deeper understanding of the micro-level (psychological factors) and macro-level (education, regulation, civil society, and media structure) drivers of disinformation vulnerability, which might help develop adapted responses.
The Current Landscape and Future Directions
In conclusion, the EDMO BELUX workshop painted a nuanced picture of the disinformation landscape in Belgium and Luxembourg, characterized by increased complexity and urgency. Media literacy initiatives remain essential but face obstacles due to societal polarization and educational disparities. Regulatory frameworks like the DSA, while promising, require more consistent enforcement mechanisms. Journalism is increasingly under threat, necessitating renewed protection and public solidarity to uphold media independence. Moreover, while technological advancements such as AI offer substantial opportunities, maintaining human oversight and ethical guidelines remains crucial. Overall, sustained and multidisciplinary cooperation among educators, policymakers, journalists, fact-checkers, researchers and technology experts is imperative to effectively address the evolving threats of disinformation and reinforce democratic processes and societal trust.
Full programme of the event: https://belux.edmo.eu/fr/event/disinformation-in-belgium-and-luxembourg-edmo-belux-2-0-workshop-networking-event/