WHAT TO FIX’s Contribution to the DSA Article 40 Delegated Act

12 December, 2024
As the EU’s Digital Services Act (DSA) continues to take shape, we at WHAT TO FIX are pleased to have the opportunity to contribute to that process.
Last week, we submitted our response to the European Commission’s consultation on the delegated act for researcher data access under the DSA. This is the delegated act foreseen by the DSA’s famous Article 40, which requires VLOPs and VLOSEs to share data with external researchers so that they can study systemic risks to the EU.
If you’ve been following our work on monetization and social media, you’ll already know that we see monetization — specifically ad revenue sharing programs — as an urgent yet puzzlingly overlooked factor contributing to systemic risks across the EU and globally.
That’s why our submission to the Commission focuses on ensuring that researchers in the EU are able to request and receive data related to monetization programs. It’s crucial for platforms to provide transparency around exactly how their monetization policies are designed and enforced, and who exactly participates in these programs.

Here are just a few sample questions we think researchers could explore upon accessing specific monetization-related data under the DSA:

  • To what extent are platforms channeling funds to actors involved in illegal content or activity? We’ve documented examples of Facebook accounts being allowed to monetize despite distributing sanctioned Russia Today (RT) content and committing trademark and copyright violations. We believe this lack of business due diligence is widespread across platforms and presents a systemic risk to the Union that needs further investigation.
  • To what extent did social media monetization contribute to disinformation around specific incidents in the Union, and what responsibility do platforms carry for the associated negative effects? We could, for example, explore the extent to which the proliferation of mis- and disinformation (and the resulting disruptions to emergency response) in the aftermath of the deadly floods in Valencia in November 2024 was fueled by monetized accounts, or look at how many of the accounts spreading disinformation in the lead-up to the June 2024 European elections were monetized and managed from outside of Europe.
  • How is platforms’ current rollout of monetization programs contributing to the proliferation of mis/disinformation? We’ve documented the ways that monetization programs have incentivized the emergence and proliferation of financially motivated actors who operate globally, often during crises and elections, with no knowledge of the language or contexts they are publishing content in. We believe these actors, who are incentivized and subsidized by platforms’ monetization programs, present a systemic risk to the Union that needs further investigation. We would also like to investigate how these monetization programs may be subsidizing the rapid proliferation and increased affordability of disinformation-for-hire and fake account markets.

But wait a sec…

If social media monetization programs potentially play such a huge role in incentivizing, driving and financing mis/disinformation, illegal content and activity, shouldn’t everyone have access to more information around how it all works?
Yes!
That’s why, even though we’re asking the European Commission to require platforms to cough up relevant data to researchers within the framework of the DSA’s Article 40, we’re also advocating for social media platforms to provide transparency about exactly how these programs operate in their regular reporting and public-facing documentation.
Keep an eye on our #AdRevenueSharing project for more.