Protecting the Bundestag elections: Federal Minister of the Interior Nancy Faeser meets with representatives of online platforms

Type: press release, Date: 22 January 2025

Faeser: Platform operators need to improve their measures to protect the federal elections from disinformation and must remove punishable content such as death threats more quickly and consistently.

Today at the Federal Ministry of the Interior, Federal Minister Nancy Faeser met with representatives of major social media platforms and big tech companies, along with representatives of other relevant federal ministries, security authorities and civil society organisations, to discuss measures to protect the upcoming Bundestag elections. Dr Ruth Brand, who as Federal Returning Officer is responsible for organising the elections and acting independently to ensure their integrity, also took part in the meeting. The tech companies and platform operators in attendance were Google (YouTube), Meta (Facebook and Instagram), Microsoft, TikTok and X.

The meeting focused on measures to counter disinformation campaigns targeting, for example, the electoral process or specific candidates; to fight hate crimes such as death threats; and to flag political advertising and AI-generated or manipulated content such as deepfakes so that users can recognise it as such.

Federal Minister Faeser said: “Our security authorities are on their guard in every area to detect and stop attempts at foreign influence and targeted disinformation in advance of the federal elections. Lies and propaganda are tools that Russia in particular uses to attack our democracy. It’s also important to protect candidates from online crimes that go as far as death threats in some cases. We know that such threats can lead to real violence. Where people are threatened, democratic debate is no longer possible.

“The major platform providers are responsible for what happens on their platforms. The platform operators must obey the laws that have been democratically decided in Europe. In light of the current discussions, it was important for me to remind them of this. Their scrutiny of unlawful content needs to be stepped up, not cut back. Criminal offences such as death threats need to be reported to investigative authorities and removed from the platforms – faster and more consistently. Political advertising needs to be clearly recognisable as such. AI-manipulated videos must be clearly marked. And we need more transparency about the algorithms so they don’t stoke radicalisation processes, especially among young people.”

Legal regulations

The Digital Services Act obligates platforms with 45 million or more users in the EU to report punishable content to the responsible law enforcement authorities, to take action against the artificial amplification of certain content and against other forms of technological manipulation of content (e.g. by bots or fake accounts). The European Commission is responsible for oversight of the very large platforms’ compliance with these provisions.

The European Regulation on addressing the dissemination of terrorist content online (TCO Regulation) obligates platforms to remove certain terrorist content within one hour of receiving a removal order from the competent authority. In Germany, the Federal Criminal Police Office (BKA) is responsible for issuing such removal orders.

Transparency in political advertising will be obligatory under EU Regulation 2024/900, which will fully enter into force on 10 October 2025.

Potential threats and measures taken by the Federal Government

The Federal Government takes the threat posed by foreign interference through disinformation very seriously and is taking resolute action against it. Protecting Germany’s federal elections from hybrid threats by foreign countries is especially important to ensure free and secure elections, a core element of our democracy. International observations show that foreign countries must be assumed to have a fundamental interest in illegitimately influencing elections; Russia is currently the most prominent actor in this area. Foreign countries, especially Russia, have both the tools to influence elections to their own benefit and the will to use them. A further intensification of cyber attacks, hack-and-leak and hack-and-publish operations, and influence operations can be expected. Such operations aim to unsettle the public, to influence elections or to discredit certain candidates or political stakeholders. Content generated or manipulated by AI plays a major role in this context: deepfake video or audio clips, for example, can be created to deceive voters.

The Federal Office for the Protection of the Constitution (BfV) has established a task force and published extensive information on threats to the federal elections. In this material, the BfV analyses threats including espionage, cyber attacks by foreign intelligence services, sabotage, disinformation, discreditation efforts and illegitimate influence operations. The BfV’s information is available on its website (in German).

Additionally, the Central Office for the Detection of Foreign Information Manipulation (ZEAM), which is located at the Federal Ministry of the Interior, began its work in June 2024. This new office examines the approaches, modes of dissemination and mechanisms of foreign interference through information manipulation on social networks and elsewhere online so that such operations can be identified as early as possible.

The Federal Returning Officer organises and supervises elections and election preparations in Germany at federal level. She is the official, nonpartisan source of information on the electoral process and is also responsible for identifying and combating disinformation related to her remit or to the electoral process in general.

An overview of the measures to protect the 2025 federal elections from hybrid threats and disinformation is available at: https://www.bmi.bund.de/desinfo-bt-wahl-en