The European Commission plans to require the largest online platforms to help detect and counter “hybrid threats” under a proposal due in November.
The measures would expand existing obligations under the Digital Services Act (DSA) and form a central plank of a new European Democracy Shield initiative aimed at protecting elections and public debate from manipulation.
Brussels uses the term “hybrid threats” to describe coordinated hostile activity that falls short of armed attack, combining tools such as disinformation, cyber operations and economic pressure. The draft envisages a crisis protocol under the DSA tailored to such threats, to be developed with platforms, national authorities and other stakeholders, providing common procedures to spot incidents and respond quickly.
Platforms designated under the DSA—including Google, Meta, Microsoft, TikTok and X—are already required to assess and mitigate systemic risks and to act on illegal content. The Commission’s new move would broaden those duties in the specific context of democratic processes, reflecting concerns about foreign information manipulation and interference linked to Russia.
A further strand targets the rapid spread of synthetic media. Companies that signed the EU’s Code of Practice on Disinformation would be pressed to analyse threats posed by deepfakes and other AI-generated content to electoral integrity, and to set out counter-measures. The Code’s signatories include Google, Meta, Microsoft and TikTok.
The Commission’s executive vice-president for technological sovereignty, security and democracy, Henna Virkkunen, is expected to unveil the proposal on 13 November, though the timeline and details may still shift. Her portfolio includes oversight of the DSA’s enforcement and broader efforts to streamline the EU’s digital rulebook.
The Democracy Shield has been trailed across EU institutions this year as a framework to harden Europe’s information space, support independent media and civil society, and strengthen resilience to cyber and hybrid operations. Parliamentary briefings and party policy papers point to priorities such as early-warning systems, cooperation with fact-checkers, and clearer escalation pathways when coordinated manipulation is detected.
The push comes amid a wider security agenda. On 16 October, EU foreign policy chief Kaja Kallas presented a Defence Readiness Roadmap 2030, setting out flagship projects on drones, border surveillance and air and space defence—measures the Commission argues are necessary to deter hostile activity and bolster the bloc’s ability to respond to hybrid campaigns. The security context forms the backdrop for the planned information-space measures.
If adopted, the DSA crisis protocol for hybrid threats would give Brussels and national authorities a structured mechanism to request specific steps from very large online platforms during defined incidents: for example, enhanced detection of coordinated inauthentic behaviour, faster access to data for vetted researchers, and prominent labelling or demotion of proven deepfakes that threaten electoral processes. The draft also signals closer alignment between the voluntary Code of Practice and the binding DSA risk-mitigation regime, to reduce the gap between standards on paper and practice during fast-moving campaigns.
Debate will focus on scope and safeguards. Platform groups and civil society have previously urged clarity on triggers for extraordinary protocols, time limits, due-process protections for users and publishers, and transparency around state requests. The Parliament’s research service has also flagged the need to balance action against disinformation with fundamental-rights protections and to ensure the judiciary is not a weak link in the “security chain”. Those questions are likely to resurface as the Commission moves from draft to formal proposal.
For now, the signal is that Brussels wants faster, more coordinated action when hostile actors exploit social networks and AI tools to shape public opinion. With the Democracy Shield, the Commission is seeking to lock in procedures before the next electoral cycle, while maintaining its broader message that existing digital rules will be enforced on all major platforms operating in the EU.