The Higher Regional Court Cologne Barks Up The Wrong (Data) Tree: The Court’s Interpretation of Article 5(2)(b) DMA in the Meta AI Case

Meta's logo behind a (regulator's?) suspicious glance.

On 23 May 2025, the Higher Regional Court of Cologne dismissed an application for an interim injunction filed by the German consumer association Verbraucherzentrale NRW against Meta’s planned use of publicly shared adult user data from Facebook and Instagram to train its AI systems, specifically evaluating whether this practice violated Article 5(2)(b) of the Digital Markets Act (DMA) (alongside several provisions of the GDPR, which are not further discussed in this post).

The Court reasoned – based on the summary examination typical for interim proceedings – that the DMA’s ban on “combining” personal data covers only the targeted cross-platform profiling of individual users (the kind of data aggregation condemned in the Bundeskartellamt Facebook case), not the inclusion of partially de-identified, disaggregated material in an undifferentiated AI “data silo.” Although the Court’s reasoning appears flawed with respect to Article 5(2)(b) DMA, the order contains important findings and marks a significant step forward for collective DMA actions and the broader private enforcement of the DMA.

 

Background and facts of the case

In a press release dated 14 April 2025, Meta announced its intention to begin training and improving its AI systems, particularly its large language model, on 27 May 2025, using so-called “first-party data” (e.g., public posts, profile pictures, and comments) and “flywheel data” (user interactions with the AI system) from EU users. Like other AI providers that feed on scraped data to train their models, Meta explicitly acknowledged that it would be scraping data from its own social networks for that purpose.

Meta had previously postponed its AI training plans after objections from the Irish Data Protection Commission (IDPC) and consumer organisations. Following a general opinion of the European Data Protection Board on data protection aspects of processing personal data in the context of AI models, the IDPC accepted Meta’s proposed changes – in particular partial data de-identification, aggregation, and an opt-out mechanism – and did not prohibit the processing.

Verbraucherzentrale NRW was not satisfied with the proposed changes and sought an interim injunction to stop Meta’s plan to use first-party data before its implementation. It argued that Meta’s planned data processing violated several GDPR provisions (particularly Articles 6(1)(f) and 9(1)) and constituted an unlawful combining of personal data under Article 5(2)(b) DMA, since the company intended to use combined data from two of its core platform services designated under the DMA (Facebook and Instagram) without explicit user consent.

Meta countered that its use of data did not amount to data combination under Article 5(2)(b) DMA, as the datasets would be partially de-identified and aggregated, not linked at the individual user level. It also asserted that the measures it had taken were sufficient to justify the processing under the GDPR.

 

The Court's reasoning on Article 5(2)(b) DMA

The Court found the application admissible but ultimately sided with Meta on substance.

With regard to a DMA violation, the Court held – on the basis of a summary examination – that Meta is undoubtedly a gatekeeper and that Facebook and Instagram are two of its core platform services (the European Commission designated both as online social networking services within the meaning of the regulation).

In the absence of a clear definition of what constitutes combining “personal data from the relevant core platform service with personal data from any further core platform services or from any other services provided by the gatekeeper or with personal data from third-party services” under Article 5(2)(b) DMA, the Court had recourse to the standard methods of legal interpretation. On the one hand, it took into account views in the literature that the use of datasets from platform services for the improved training of AI should constitute a DMA-violating data combination, and it noted that Article 5(2)(b) DMA (as specified by Recital 36 DMA) aims to prevent “potential advantages in terms of accumulation of data, thereby raising barriers to entry“ and the creation of economies of scope for gatekeepers (para 44 of the order).

On the other hand, it held that the incorporation of partially de-identified and disaggregated data from two central platform services into an unstructured AI training data set does not constitute unlawful data combination (paras 45 – 48). According to the Court, Article 5 DMA contains an exhaustive list of prohibitions, and the case at hand lacks the “targeted” combination of the personal data of the “same user” that the Court deems necessary in the context of Article 5(2)(b) DMA. The above-named purpose of the provision is not in itself sufficient – and, according to the Court, the specific purpose of Article 5(2)(b) DMA is in any event not to restrict data combinations in the context of training AI systems.

Rather, the Court held that the legislator did not have specific questions relating to data processing for the purpose of developing and improving artificial intelligence systems in mind when creating Article 5(2) DMA. As a matter of fact, that would have been logically impossible, since the first ‘explosion’ of generative AI systems and large language models (LLMs) took place in November 2022 (which coincided with the entry into force of the DMA), when OpenAI released the first version of ChatGPT, which reached 1 million users in just five days.

The Court refers to the legislative history linked to the German Bundeskartellamt Facebook case, which concerned cross-platform personalisation through data aggregation and the combination of personal data relating to the same individual in the context of user profiles, but not their “unassigned and even de-identified inclusion in a uniform data silo” (translation by the authors). Only such situations can, in the Court’s view, be regarded as a targeted combination. In that context, the Court also points to the recent European Commission decision on Meta’s pay or consent model (detailed analysis here), which was not yet publicly available at the time of the Court’s decision and was only presented in part to the Higher Regional Court Cologne by the defendant Meta. According to the Court, the decision “expressly refers to the fact that the combining concerns data of the same person” (translation by the authors). Only in such situations does the Court see consent as a potential avenue for legitimising the combination of data (a highly problematic topic in the context of the DMA anyway).

 

Evaluating the Court’s reasoning

In the rush of summary proceedings, some legal subtleties tend to slip through the cracks – a bit like speed-reading a novel and missing the plot twists. Let us take a more nuanced look, first at the Court’s arguments and second at the overall application of Article 5(2)(b) DMA to the case at hand.

The Court's interpretation of Article 5(2)(b) DMA using the standard Savigny methods of interpretation is, in the absence of a further definition of data combination in the DMA, fundamentally the correct starting point, but it is insufficient in some respects. As a provision of Union law, Article 5(2)(b) DMA requires an autonomous interpretation that takes into account the specific aspects of Union law methodology. The purpose and aim of a provision, as often laid down in the recitals of Union law instruments, such as the pivotal Recital 36 DMA, play an important role in EU interpretative methodology. While it is true that the purpose of a law cannot override its wording, especially where the text leaves no room for interpretation, Article 5(2)(b) DMA is, in this case, sufficiently open to allow the underlying intent to be taken into account. The wording “combine personal data from the relevant core platform service with personal data from any further core platform services” alone could encompass Meta’s combination of personal data from the two relevant core platform services.

Looking at the purpose of the provision, the Court does acknowledge Recital 36 DMA and recognises that gatekeepers can achieve economies of scope through data combination (as also stated in the commentary the Court miscites (attributing it to the wrong author) in para 44). However, it overlooks the broader role of Article 5(2)(b) in advancing the DMA’s core objective of fostering “fair and contestable” digital markets and the competition-akin nature of its provisions.

Strikingly, these overarching goals of fairness and contestability are not mentioned even once in the order. Yet contestability, in particular, is the crucial issue here. The Article 5(2)(b) DMA prohibition of personal data combination also aims to address gatekeepers’ enhanced access to the personal data of end users, which gives gatekeepers advantages in terms of data accumulation, thereby raising barriers to entry in markets where contestability is already impaired. Viewed from a competition-like perspective – the origin of the provision (more on this in a minute) – Article 5(2)(b) DMA is not only about exploiting users and their data, but also about excluding or hindering “competitors” through “competitive advantages” gained from data aggregation. The DMA sees an express correlation between data accumulation and the erosion of market contestability, as the European Commission recently further explained in the mentioned Meta pay or consent decision, which has been published in the meantime (paras 32 and 36). One should also consider the emphasis the European Commission places in that decision (albeit in the context of “specific choice”) on the heightened thresholds and obligations applicable to gatekeepers under the DMA (anyone else reminded of the concept of special responsibility?) as a tool for (re-)establishing contestability in digital markets (paras 54 and 55).

In fact, if we go back to the European Commission’s stated interpretation of Article 5(2) in that particular decision, the Court’s interpretation is quite nonsensical when applied in practice. According to the Commission, the provision must be read bearing in mind that two cumulative (and distinct) requirements apply: the user must be granted specific choice AND consent before the gatekeeper can override the prohibition. Specific choice is an autonomous concept within the DMA, justified by the gatekeepers’ special position within the market, and thus amounts to an enlargement of sorts of data protection regulation, supplementing the GDPR. On top of that, the specific choice alluded to in Article 5(2) DMA does not relate to the processing of personal data as such, but rather to the provision of a specific choice of a less personalised alternative that involves less processing of personal data and delivers a similar result in terms of quality and user experience. Consent, however, is an animal of a different kind, and it lies much closer to the GDPR in terms of its interpretation. Consent must be granted by the end user to the requisite legal standard set out in Articles 4(11) and 7 GDPR and, as such, a range of elements must be considered, such as the imbalance of power between the data subject (end user) and the data controller (the gatekeeper), the conditionality of granting consent, or the granularity of the choice provided to the user. The notion of specific choice speaks the language of the DMA, whereas consent resonates with the GDPR.

The Court’s ruling treats both requirements as one and the same by drawing a direct correlation between Meta’s AI case and the Bundeskartellamt’s Facebook case. True, the parties discussed and quarrelled over Meta’s integration of data into its AI with regard to the data protection framework, but the DMA is different in nature and scope and, as such, must be interpreted autonomously as well.

Few, if any, other providers of large language AI models have legal access to such a vast amount of personal data from their own social media services as Meta does when using the aforementioned first-party data from Facebook and Instagram to train its models. Following the same argument, it also makes no difference that the personal data was publicly shared by users on the two social media platforms. That is an issue relevant solely to the GDPR, which may predetermine a gatekeeper’s (non-)compliance with that regulation, but it cannot, on its own, sustain effective enforcement of the DMA provision. Access to that data is not freely granted for the training of other AI models. Meta therefore enjoys a crucial competitive advantage in data accumulation for AI training, an advantage that significantly raises the barriers to entry or expansion which Article 5(2)(b) DMA is intended to prevent, further closing off the already impaired market in which the gatekeeper is active. This is not sufficiently taken into account by the Higher Regional Court Cologne.

However, we believe that such a conflation of meanings may have an understandable cause. Once Article 5(2) DMA made it into the EU’s Official Journal, discussions ensued about the prohibition’s application and how gatekeepers should not be able to dodge it through minimal changes to their data infrastructure. That is, nonetheless, exactly what has happened. Throughout their compliance reports, gatekeepers have reversed the meaning of Article 5(2) DMA: it reads more like a positive obligation to provide ways in which consent can be granted than an absolute prohibition from which exemptions are available only in narrow circumstances. The Court seems to take the same approach, finding one (good) reason – and one alone – to sustain compliance with the DMA, thereby validating Meta’s approach to the regulation.

Turning to the legislative history the Court also alludes to, it is true that Article 5(2)(b) DMA dates back to previous competition law cases, as many obligations in the DMA do, particularly the Bundeskartellamt Facebook case. This case – amongst others – served as an important impetus for including Article 5(2)(b) in the DMA and illustrates the risks, particularly in terms of exploitation, that arise when combined personal data is used to create user profiles for targeted advertising. However, the Bundeskartellamt Facebook case does not necessarily capture the equally critical problems of exclusion and contestability that Article 5(2)(b) DMA is also designed to address.

Furthermore, and contrary to the Court’s view, the legislative inspiration behind this provision does not mean that every application of Article 5(2)(b) DMA must precisely mirror the Bundeskartellamt Facebook case. The Court’s reasoning that neither Article 5(2)(b) nor Recital 36 DMA indicates “a specific purpose for restricting the combination of data in the context of training AI systems” and that “it is not apparent that the legislator had the specific questions relating to data processing for the purpose of developing and improving artificial intelligence systems in mind when creating Article 5(2) of the DMA” (translations by the authors) is therefore equally flawed. Not all future applications of a law can, or should, be spelled out in its wording or in its recitals. Laws are drafted in general language precisely to cover new or evolving contexts that raise similar underlying concerns. That includes the use of combined data not just for social media platforms, but also in other technologies such as large language models, marketplaces, or payment services – contexts not covered by the historical precedent but clearly within the named spirit and purpose of the provision. Any narrower reading would render the DMA even more rigid than it already is, given its exhaustive list of obligations. Interpreting Article 5(2)(b) DMA to apply to the use of personal data in AI training is not an expansion of the DMA’s scope beyond the exhaustive list, but rather an application of an existing prohibition through legal interpretation.

This finding is also in line with the exact paragraph (para 30) of the European Commission’s Meta pay or consent decision that the Higher Regional Court Cologne quotes. The Commission states that Article 5(2)(b) DMA “does not specify any specific purpose for such combination” and that the rule applies “in relation to any purpose for which the data is used”. Such a purpose can be the training of large language models.

Conversely, one has a hard time concluding from that paragraph, as the Court does, that the data combination under Article 5(2)(b) DMA must concern data relating to the same (i.e. individual) person. The Commission uses the plural form “end users,” referring to a group of persons, when stating: “a gatekeeper is prohibited from combining the personal data of end users from one of its CPSs with the personal data of those end users from its other CPSs or distinct services, or with personal data of the same end users from third-party services.” This wording indicates that, while the data must relate to the same end users across services, it can concern multiple end users simultaneously. There is no requirement for a one-to-one linkage of data from a single end user. Had the Commission intended such a narrow interpretation, it would likely have used the singular form in that paragraph. Similarly, the requirement of a targeted linking of data relating to the same person cannot be found in para 96 (or elsewhere) of the commentary that the Commission relies on.

 

Some additional side notes: the bigger picture

While Verbraucherzentrale NRW lost this interim battle, the private enforcement game of the DMA is just beginning. In that context, the order includes important findings.

 

Collective actions & private enforcement for DMA violations

The order makes important initial positive statements on a hotly debated issue: private enforcement of the DMA. Although the point was contested by Meta, the Court sided with the majority of legal scholars in holding that (at least some provisions of) the DMA can generally be privately enforced before national courts (para 30). This is an important general statement for the future of DMA enforcement. Within the overall structure of Member State and EU institutions, both levels can work together to enforce the DMA. The Higher Regional Court Cologne ultimately did not have to decide whether Article 5(2)(b) DMA has direct effect as an underlying feature of private enforcement and therefore establishes rights for end users in the specific case. We believe and argue that it does.

The Higher Regional Court also underlined the explicit avenue of private enforcement that the DMA takes: collective actions (para 31). Article 42 DMA clearly states that consumer representative actions under the Representative Actions Directive (RAD) are a crucial instrument of private enforcement of the DMA. The (updated) Annex of the RAD as well as the German transposition law replicates this. The collective action approach that the DMA takes may come with certain pitfalls and might require some improvements and facilitations down the line – as we also argue elsewhere. Yet, this case clearly demonstrates the potential that consumer representative actions hold when it comes to mass data aggregation: overcoming the classic David vs. Goliath problem. After all, have any of you gone to court to challenge Meta over the use of your personal data for AI training?

 

International jurisdiction

Especially for consumer collective actions, the issue of international jurisdiction can be quite tricky and is subject to ongoing scholarly and judicial debate. The Higher Regional Court brushes over the issue by applying Article 7(2) Brussels Ibis (instead of other possible heads of jurisdiction, such as that for consumer contracts under Articles 17 – 19 Brussels Ibis) to possible Article 5(2)(b) DMA violations, since the contested data processing concerns data of users in Germany. In line with the competition-akin nature of the DMA, violations of its provisions can indeed give rise to tort jurisdiction under Article 7(2) of the Brussels Ibis Regulation. In this case, Verbraucherzentrale NRW represented users located in Germany, establishing the “place where the damage occurred.” However, this may prove more challenging in other digital cases under the DMA, which are inherently transboundary in nature and not easily localised. Notably, jurisdiction under Article 7(2) Brussels Ibis – or any other provision of the Regulation – is not centralised at the seat of the qualified entity.

 

Cooperation with the Commission and further fragmentation

The Court also held that, because it “had no opportunity in the summary proceedings to obtain an opinion from the Commission (Art. 39(1) DMA) nor to refer the matter to the European Court of Justice”, it “assumes that the defendant's announced transfer of partially de-identified and disaggregated data from two central platform services to an unstructured training data set for AI does not constitute a merger within the meaning of Art. 5(2) subpara. 1 lit b) DMA” (para 43). This is a missed opportunity on the EC’s side, since it could have sought to enforce the DMA’s wish for uniformity from the start and prevented conflated understandings of Article 5(2) DMA from circulating across the Member States.

Nonetheless, the most worrying (and largely unnoticed) aspect of the ruling lies in the fact that we did not expect one of the first judicial applications of the DMA to take place in the midst of a (mainly) GDPR-related controversy. For the most part, the Court’s reasoning sought to justify how Meta’s integration of data into its AI complies with the GDPR, and the application of Article 5(2) DMA quickly seeped into the discussion. If we go back to the experience of private enforcement in antitrust law, we might remember those instances where judges in charge of applying Articles 101 and 102 TFEU struggled to quantify harms relating to infringements already established by competition authorities. Now let us reflect on what happened here: a Court with limited experience in the DMA’s interpretation (and only in interim proceedings) went on to define the terms of how data combinations should be interpreted in the future in the context of AI so as to avoid further breaches of Article 5(2) DMA. The European Commission’s lack of support for the Court in this particular instance is more telling than it seems. The DMA’s sole enforcer prefers that courts figure out the DMA for themselves before it is compelled to flesh out how a provision should be interpreted outside of its pre-set enforcement venues, i.e., non-compliance procedures and specification proceedings. In the absence of any guidance, the DMA’s private enforcement is also more and more prone to producing the very thing the regulation seeks to eradicate: regulatory fragmentation in terms of standard-setting at the EU level relating to digital markets dominated by undertakings with a gatekeeper position.

 

Conclusion

The case has shown both the pitfalls and the potential of private enforcement of the DMA and of DMA enforcement altogether. Without further intervention by the IDPC or public enforcement on the part of the European Commission, it was for Verbraucherzentrale NRW to act swiftly and at least try to prevent the AI training with EU first-party data ahead of the planned implementation date of 27 May 2025. Interim proceedings were the only way to achieve that. Unfortunately, interim proceedings only allow for a summary examination by the court. Given the limited decision-making practice of the European Commission, and of national courts, on the matter at hand, the Higher Regional Court Cologne wanted to be cautious. Appeals and the main proceedings might lead to a different outcome – the subsequent view of the European Commission on Meta’s Article 5(2) DMA compliance points in a clear direction.

 

**

Lena Hornkohl is the President of IUS Omnibus, a not-for-profit consumer protection association incorporated under the laws of Portugal, which brings consumer collective actions, including on competition law and data privacy matters. She has not been involved in the case at hand. The views expressed are her own and do not reflect the views of IUS Omnibus.
