EU Plan: Combating Disinformation and Hate Speech Targeted at Female Politicians
The plan is to take a closer look at websites that appeal to men and boys.
In early March 2026, the European Commission (EC) adopted a new Gender Equality Strategy for 2026–2030 that, among other goals, aims to give female politicians and journalists special protection against gender-based online violence, including disinformation and hate speech, while also scrutinizing online environments and narratives that particularly affect boys and men.
The new strategy is far more comprehensive than its predecessor (2020–2025), addressing eight policy areas aimed at eliminating gender inequality. Several new threats, primarily affecting women, have been added to the strategy’s list, such as gender-based cyberviolence and anti-gender narratives that oppose gender equality, among other issues, according to the European Commission. The Commission estimates that better gender equality in the EU could raise GDP per capita by 9.6% and create 10.5 million additional jobs by 2050.

The Fight Against “Disinformation” and “Hate Speech” Continues
The new strategy explains that current power structures prevent women from participating fully, equally, and meaningfully in politics and public life, and specifically from holding leadership positions. It states that female politicians are often still targeted and attacked over their perceived lack of qualifications, experience, and skills. According to the document, so-called disinformation and misogynist hate are frequently used to undermine the credibility of female politicians and thus to question their legitimacy, eroding public support for them.
For example, one-third of women in politics have reportedly abandoned social media after suffering cyberviolence. Female journalists face an even harsher reality: they are targeted online even more frequently than politicians, and the attacks on them are more vicious and more often sexualised in nature.
It is for this reason that the Commission plans to update the measures that promote women’s participation and leadership in politics, public administration, and parliaments. Under the European Democracy Shield, the Commission has announced its intention to develop recommendations on how to ensure the safety of political candidates and elected representatives both offline and online, including through a fight against disinformation. Indeed, the Commission has placed women at the center of its attention and takes disinformation targeting female politicians extremely seriously. To fulfil this pledge, the Commission has highlighted the Code of Practice on Disinformation as a singularly relevant resource under the Digital Services Act (DSA) – a law referred to by some as the “European censorship law”, requiring large online platforms to reduce the visibility and spread of disinformation (i.e. information the authorities have deemed false or misleading). As you may recall, it was under the auspices of this very code that various working groups discussed methods of addressing so-called disinformation: asking platforms to change their content moderation rules, implement additional measures such as fact-checking, or demonetize certain conservative news outlets. In other words, such methods have previously been aimed not so much at information that is objectively false as at viewpoints the authorities deem undesirable, even when they are accurate.

Funding Female Politicians to Sue Commentators
In its Gender Equality Strategy, the EC takes things a step further by planning to allocate funds through the proposed future AgoraEU programme (not yet adopted), specifically designed to support female politicians in their fight against online hate speech and disinformation. In other words, female politicians are likely to be able to use European taxpayers’ money to sue those same European taxpayers whom they deem to have insulted or unjustly criticised them. To be sure, examples of this already exist in Europe: in Germany, for instance, a man who called former Foreign Minister Annalena Baerbock a “fool” was fined €6,000. Indeed, the majority of “hate speech” convictions in Germany today are handed down for the sole “crime” of insulting or criticising a politician, regardless of gender.
In the fight against disinformation, the EC’s strategy does not overlook member state institutions. According to the ProtectEU European internal security strategy, member states must now be educated on how to use the newly available legal tools for removing any “illegal” web content.
In summary, the Commission is convinced that EU democracies are under threat from narratives opposing gender equality, illegal online content, and their artificial amplification. These threats are said to increase the polarization between men and women, with such content reportedly funded by the world’s wealthiest movements.

A Pat on One’s Own Back: “We’re Doing a Good Job Investigating X”
It is therefore no coincidence that the DSA – through which the EU aims to bring online activities under its control – serves as one of the central regulatory acts in the implementation of this strategy. The Commission notes that, under the DSA, gender-based violence is a systemic risk that Very Large Online Platforms (VLOPs), including social media platforms, and Very Large Online Search Engines (VLOSEs) subject to the regulation must assess and mitigate. In this context, the Commission highlights its own ongoing investigations into the social media platform X – against which the EU imposed its first fine under the DSA (in December 2025, for transparency violations) – and into the AI model Grok, whose scandal over the generation of uncensored (sexually explicit) images erupted earlier this year. The Commission thus emphasises that it plans to strictly enforce all online laws to “protect women and children”.

In addition to risk monitoring, the Commission is gathering evidence under the DSA regarding compliance with the regulation’s requirements and confirms that it will continue “negotiations” with online platforms to ensure better compliance. If necessary, the Commission will issue guidelines to platforms and search engines on how they must mitigate the risks and what measures to implement. Regarding the protection of minors, platforms are required to implement the DSA’s Guidelines on the Protection of Minors, in which the Commission spells out the mandatory measures for those platforms. Here too, the Commission intends to continue issuing information requests and fostering cooperation, this time seeking to identify “how they manage the risks of users being able to download illegal or harmful apps, including so-called ‘nudify’ apps.” It is worth recalling, however, that according to a recent report by the U.S. House Judiciary Committee, the Commission’s supposed cooperation and negotiations involve no genuine necessity, cooperation, recommendations, or guidelines. They amount to a clear-cut, and at times brutal, campaign of pressure on online platforms, using the threat of fines to force the removal of content that the authorities or “the correct” interest groups do not want seen there.
Furthermore, the Commission states in its strategy that there is a definite need to improve the capacity of so-called trusted flaggers to flag content involving gender-based cyberviolence on platforms. These “trusted flaggers” will, however, receive guidelines from the EC (guidelines on trusted flaggers), which allegedly only clarify the flaggers’ role and little else. Given the Commission’s actions to date, one can expect specific content guidance and recommendations in the future. In addition to the flaggers, the guidelines will also assign responsibilities to online platforms regarding how to handle the reports they receive.
