Submission to the European Democracy Shield Consultation
Guaranteeing the fair, transparent, human-centred and responsible use of AI in electoral processes requires addressing monetisation incentives
Published: May 14, 2026
Platform monetisation systems are an under-examined aspect of platforms’ operations, yet they play a central role in incentivising inauthentic content and activity and in channelling covert funding to political campaigns.
In our latest submission to the European Commission, we explain how monetisation is a core driver of AI-generated disinformation and can facilitate the covert financing of electoral campaigns.
Monetisation is a core driver of AI-generated disinformation
Inauthentic activity and AI-generated content are a major threat to elections in the EU. Electoral periods are engagement peaks, which attract opportunistic individuals seeking to maximise their monetisation payouts. Often, these actors have no direct connection to the local context or language, and rely on AI to automate the creation and distribution of content.
Recent investigations demonstrate the relationship between platform monetisation and harmful political content ecosystems in the context of the Dutch, Hungarian, and UK elections.
Monetisation facilitates the covert financing of electoral campaigns
Monetisation services can also offer new and poorly regulated sources of financing for political campaigns. Political parties and candidates can use monetisation services to generate revenue from platforms, brands and audiences. They can also use these services to subsidise covert online campaigns or other types of undisclosed campaign activities.
In theory, most platforms have policies that would prevent the use of their monetisation services by political and government actors. In practice, however, these remain largely unenforced.
Our own research, for example, has found that multiple political parties and politicians have been making active use of monetisation services, despite platforms’ stated prohibitions. We found the same to be true of donation features.
RECOMMENDATIONS
Greater transparency and accountability around social media monetisation practices are essential to:
- Ensuring adequate enforcement of laws and regulations on electoral financing and on the financing of foreign information manipulation and interference (FIMI) and disinformation
- Reducing incentives for inauthentic and AI-powered content, activity and disinformation
- Rebalancing platform revenue redistribution towards public-interest media and trustworthy information ecosystems.
♦️ Monetisation Transparency
Platforms should be required to provide meaningful transparency about their monetisation governance, including their monetisation services, terms and policies, enforcement practices, and the effectiveness of that enforcement.
They should also be required to provide adequate context on the monetisation of content and on accounts’ access to monetisation services, through content and account labels and monetisation libraries.
This transparency is necessary to enable users, researchers and regulators to assess whether platform funds may be subsidising inauthentic content and activity or offering financing to political campaigns.
♦️ Fair and more predictable monetisation terms
Automated demonetisation systems frequently generate errors that disproportionately affect legitimate journalism, civil society actors, and public-interest media.
Where demonetisation measures are imposed, platforms should be required to provide:
- Fair, predictable, and plain-language explanations of the restrictions;
- Accessible appeals mechanisms;
- Adequate reparations, including compensation for financial losses resulting from undue restrictions.
♦️ Rebalancing distribution towards Public-Interest Media
Major platforms generate substantial revenues while redistributing comparatively limited amounts to independent journalism and public-interest media ecosystems.
Platforms should be required to redistribute a larger share of their revenues towards:
- public-interest journalism;
- independent media;
- fact-checking ecosystems;
- civic information infrastructure.
This could help counterbalance engagement-driven monetisation models that currently privilege sensationalist, manipulative and inauthentic content.