Great Expectations Placed on the Draft of the EDPB-EC's Joint Guidelines on the Interplay Between the DMA and the GDPR


A year ago, the European Data Protection Board (EDPB) and the European Commission (EC) announced they would work on joint guidance on the interplay between the DMA and the GDPR. On 9 October, they finally delivered on that promise and issued the draft Guidelines so that stakeholders can have their say via public consultation.

In general, the Guidelines adopt the familiar format of the EDPB’s many documents interpreting the content of the GDPR. In this particular case, they review six of the provisions embedded in Articles 5-7 DMA that raise the most consequential challenges in terms of the interplay between the digital regulation and the GDPR.

 

The (strained) relationship between the DMA and the GDPR

The Guidelines are quick to recognise the complementarity between the DMA and the GDPR in terms of their goals and the protections they provide to individuals, despite the fact that the two instruments pursue different purposes and objectives and have different scopes (paras 3 and 4). Given that gatekeepers may qualify as controllers or processors within the meaning of the GDPR (para 2), there is an inherent overlap when both sets of rules apply to the processing activities of these regulatory targets.

According to the Guidelines, if the DMA were to achieve its objectives, greater fairness and contestability of digital markets would lead to more choice for individuals in the form of more data protection and privacy-enhancing features in their services, in line with the principle of data protection by design and by default enshrined in Article 25 GDPR (para 4). That is to say, the DMA’s application helps secure that data protection standards are upheld. As a matter of fact, the European Commission even argued in its non-compliance decision against Meta for breach of Article 5(2) DMA that gatekeepers must face stricter requirements in terms of the processing activities they perform, due to the data advantages they possess vis-à-vis business users (para 55 of the non-compliance decision). The Guidelines echo this same narrative by indirectly pointing to the fact that the DMA compels gatekeepers not only to adjust their current products and services to comply with the regulation but also to design and fine-tune the services they will roll out in the future with a high level of data protection in mind.

In the non-compliance decision against Meta, the European Commission was quite reluctant to interpret the GDPR notion of consent itself when establishing a breach of Article 5(2) DMA, holding instead that it should be read as equivalent to the concept under Articles 4(11) and 7 GDPR, already fleshed out substantially by the EDPB. The Guidelines convey a similar impression of the strained relationship between the two public bodies, since they were initially conceived as a project of the EDPB (see the Chair of the EDPB’s letter to representatives in the DMA taskforce unit), and it was only after the EC’s explicit request to collaborate that they were drafted jointly. It is true that both regulatory instruments tackle similar challenges. The ’without prejudice’ clause in Recital 12 is clearly as alive as ever. Still, the Guidelines do not clarify whether the DMA acts as a lex specialis to the GDPR in some cases or whether the same requirements may be tempered (or enlarged) due to a gatekeeper’s presence in a given digital market when such processing activities are captured by ex ante regulation.

Stemming from the findings the CJEU produced in Meta Platforms and others (Conditions générales d’utilisation d’un réseau social) (Case C-252/21) (see a comment on the ruling here), the Guidelines also clarify how the principle of sincere cooperation should be applied in the relationships between data protection supervisory authorities and the European Commission as the DMA’s sole enforcer. The findings of a given data protection supervisory authority with respect to a gatekeeper’s processing activities produce no preclusive or binding effects. To ensure a coherent, effective, and complementary enforcement of the DMA and the GDPR, the Guidelines highlight that consultation is necessary in several cases. For instance, the EC must consult with data protection supervisory authorities when it is called to examine whether a gatekeeper’s conduct is compliant with the DMA and that analysis also entails examining whether the regulatory target’s conduct is consistent with the GDPR provisions. The obligation to consult also arises when a data protection authority examines a controller’s conduct under the lens of the GDPR and must additionally consider whether it is consistent with the DMA (para 217 of the Guidelines). In these cases, the EC is not compelled to cooperate with the EDPB, but rather with the lead supervisory authority, i.e., the data protection supervisory authority corresponding to the gatekeeper’s main establishment in the EU (para 218).

 

The bullseye of the Guidelines, Article 5(2) DMA

Even though the Guidelines focus on six different provisions in fleshing out the GDPR implications of their enforcement, for the most part, they examine how the prohibition of processing, combining, and cross-using personal data across core platform services (CPSs) under Article 5(2) DMA relates to the legal requirements under the GDPR.

First of all, the Guidelines term access to personal data a “parameter of contestability, taking into account the use of personal data to develop, create and improve highly targeted services” (para 12). The statement is quite surprising, given that the farthest the Court has gone is to assert that access to personal data is a significant parameter of competition between undertakings in the digital economy (para 51 of the ruling). Through this first outright declaration, the Guidelines equate competition with contestability. In my view, this is a dangerous comparison to make, since the enhancement (or restoration) of contestability in digital markets pre-dates the unfolding of competition (at least, in the form of protection sought by the DMA) and, as such, conflating one with the other can lead to stretching enforcement too far. On top of that, the Guidelines do not account for the unidirectional intent of the contestability objective under the DMA. If access to personal data were a parameter of contestability, any business user or competitor of the gatekeeper could simply contribute to improving it by complying in the same fashion with the regulation’s obligations. That is, however, not the DMA’s regulatory intention. Unwarranted (or excessive) access to personal data is the target of Article 5(2), and it is only up to the gatekeepers to comply with the provision, thereby unidirectionally enhancing their business users’ possibilities to develop, create, and improve highly targeted services.

Moreover, the Guidelines set out in black and white how broad the interplay between Article 5(2) DMA and the GDPR is. All processing activities covered by Article 5(2) DMA qualify as processing operations within the meaning of the GDPR. As such, the two regulatory regimes apply to those processing activities in parallel, but the DMA, to my mind, acts as a lex specialis to the GDPR. For instance, Article 5(2) limits the lawful grounds under which gatekeepers, as controllers, may carry out certain processing of end users’ personal data in their CPSs and other services (para 18).

Another example of the interplay is the exemption from the prohibition: the gatekeeper is not bound by the provision when it has presented the end user with a specific choice and obtained their consent to those same processing activities. This was the precise point of discussion that the EC analysed in depth in its recent non-compliance decision against Meta. It set out that specific choice and the granting of consent do not belong to the same set of considerations: they are two cumulative requirements that the gatekeeper must meet in order to take advantage of the prohibition’s exemption (para 33 of the non-compliance decision). As a matter of fact, the EC pointed out that the ‘specific choice’ tenet is wholly embedded in the DMA’s regulatory design, whereas the notion of ‘consent’ must abide by the terms of Articles 4(11) and 7 GDPR.

The Guidelines follow this same reasoning and clearly distinguish the two legal requirements by equating the first (i.e., specific choice) with the gatekeeper’s enabling end users to freely choose to opt in to the data processing and sign-in practices covered by the prohibition, by offering a less personalised but equivalent alternative with regard to those processing activities (para 23 of the Guidelines). Beyond the factors that the EC considered in its non-compliance decision against Meta, the Guidelines highlight that equivalence between the two options will be measured against the benchmark of their performance, experience, and conditions of access (para 26).

The gatekeeper must display a less personalised but equivalent alternative for non-consenting users that does not include any of the processing activities that would require consent under Article 5(2) DMA. For instance, the Guidelines add, the gatekeeper can still process personal data across its services while providing the less personalised alternative when the processing either concerns cross-use of personal data between services not provided separately or can rely on a legal basis under the GDPR (para 27).

Furthermore, the Guidelines pinpoint that the notion of consent must be informed by the GDPR requirements of ‘specific’ and ‘free’ consent, determining the need for granularity in the consent choices provided to end users. In other words, the gatekeeper must ensure that it complies with the DMA-bred notion of specific choice and seek to make consent specific, in the sense of pointing out the intended purposes of the processing of personal data (para 30). The Guidelines introduce quite a surprising change to the interpretation of Article 5(2) DMA, since they highlight that when gatekeepers seek consent for processing personal data for various purposes, they should provide a separate opt-in for each purpose, to allow users to give specific consent for specific purposes. That is to say, when gatekeepers seek to personalise content, ads, and service development by processing, cross-using, or combining personal data in the sense of Article 5(2) DMA, they must provide the end user with the chance to accept or reject the processing for each one of those purposes (para 31). Consent requests for the same specific purpose can, however, be streamlined into a single consent flow (para 42). This is precisely what the gatekeepers are not doing, as stems from their compliance reports, since they display a centralised pop-up asking users for their consent to exempt themselves from the DMA prohibition.

Aside from that, the Guidelines cast doubt on any feasible granting of consent by an end user to a gatekeeper (as I pointed out in a paper) in the context of Article 5(2) DMA. According to the Guidelines, the analysis of consent can factor in the imbalance of power that may exist between controllers who are gatekeepers and end users to determine whether their consent was freely given (and thus valid). Consent will not be free where there is any element of compulsion, pressure, or inability to exercise free will, i.e., where the controller’s position in the market, by itself or in combination with other factors, leads data subjects to perceive that there are no realistic alternative services available to them (para 35). It will rarely be the case that a gatekeeper does not hold such a prominent position vis-à-vis the data subject, especially bearing in mind the low levels of contestability that the EC (and the Guidelines) assume these markets to have.

Furthermore, the Guidelines also go to great lengths to explain the legal bases that the DMA explicitly excludes gatekeepers from using for the processing activities covered by Article 5(2) DMA. In line with the provision, the gatekeeper cannot rely on Articles 6(1)(b) or (f) GDPR as lawful grounds for processing personal data within the provision’s scope (para 82). It can rely on those legal bases for processing activities falling outside that scope. Against the backdrop of the EC’s current revision of the DMA (and, particularly, with regard to the inclusion of AI within the regulation’s scope), this aspect will prove to be the most consequential for AI providers, since gatekeepers will be de facto forced to keep all the personal data inputted into their AI models anonymised to escape the GDPR’s (and the DMA’s) potential application (see the reasoning in depth here). Given the expected revision of the DMA’s terms, the Guidelines could have, at least, touched upon this particular aspect, since it is impacting the business of gatekeepers today, insofar as the AI functionality and features they are rolling out in the European market remain impacted by the regulation to the extent that they are considered ‘embedded features’ of the currently designated CPSs (for a broader discussion of the topic, see here).

For the remaining lawful grounds, the Guidelines clarify that consent is not to be considered primus inter pares and, as such, the gatekeeper can rely on them subsidiarily when it has not obtained the end user’s consent. This is particularly relevant in the context of Article 5(2) DMA, since data controllers designated as gatekeepers will be able to process personal data (but not, one must assume, combine and cross-use it) as long as the processing has the backing of one of the available legal bases (para 78). In turn, the incentives for gatekeepers to resort to consent flows decrease dramatically, since it will be much easier for a gatekeeper to rely on a GDPR legal basis than to jump through all the hoops of proving that its compliance solution meets both the specific choice and consent requirements, in the terms spelled out by the European Commission.

 

Defining the scope (and technical measures) of portability and access to relevant data to operate an alternative search engine

The Guidelines carefully review the content of four DMA provisions (Articles 6(9), 6(10), 6(11), and 7 DMA) that touch upon data protection legal requirements directly, by setting out the scope of the data impacted and the consequences their application may bring for gatekeepers when considered data controllers. In general, the Guidelines detail the technical requirements that a gatekeeper must comply with from the GDPR’s perspective to ensure the high level of data protection required by the DMA.

One such example is Article 6(9), which expands the scope of the portability right under Article 20 GDPR because it applies irrespective of the lawful ground under which data has been processed and requires gatekeepers to enable continuous and real-time data portability to end users or third parties authorised by them, at no additional cost (para 104). The Guidelines list data falling within the scope of the provision, such as data actively provided by the end user, generated through their activity in the use of a CPS, observed by the gatekeeper from the end user’s behaviour, processed by the CPS, and data concerning other data subjects (paras 107, 108, and 112). Nonetheless, the Guidelines identify types of data that may raise broader problems for processing, notably on-device data stored in the end user’s terminal equipment. The portability of this type of data engages Article 5(3) of the ePrivacy Directive, and access by end users and third parties should be considered only where strictly necessary to provide an information society service explicitly requested by the user (para 110).

Moreover, the Guidelines directly point out that the gatekeepers' legal basis for porting personal data is that of Article 6(1)(c) GDPR (i.e., compliance with a legal obligation). In any case, they cannot be held accountable for the subsequent processing of data by the end user or authorised third party receiving the data (paras 10 and 106). This is precisely the reason why the gatekeeper is entitled to implement authorisation mechanisms for third parties seeking access. Authorisation mechanisms will rarely be acceptable to the enforcer, however, when gatekeepers make data portability conditional upon the business use case or purpose for which the ported data will be used by the authorised third party (para 131). Most gatekeepers require third parties to disclose, for instance, a mock-up of the user experience they will provide the consumer porting the data, although they strenuously deny using such information to reject those requests. The Guidelines set such conditionality as a hard limit for authorisation mechanisms applied in the context of Article 6(9) DMA.

The Guidelines’ instructional reasoning carries over from Article 6(9) to Article 6(11), one of the provisions most under-enforced by gatekeepers. The obligation compels search engine providers (i.e., Google Search, since it is the only designated provider) to provide access to anonymised ranking, query, click, and view data. In its latest exchange with stakeholders at its second compliance workshop, the gatekeeper basically recognised the lack of effectiveness of its technical implementation. Google anonymises the data made available to its competitors through frequency thresholding, including all queries entered by at least 30 signed-in users globally over the 13 months before the end of the relevant quarter and excluding data on results viewed by fewer than 5 signed-in users in a given country and per device type.

According to stakeholders’ estimates, only 1-2% of the whole dataset shared by Google is actually productive data they can use to compete on the merits with the gatekeeper. In turn, the gatekeeper argued that no currently available anonymisation techniques allow it to provide a robustly anonymised dataset that fulfils the obligation’s objective. This is precisely the crux of the provision and where it stands at odds with the effective enforcement threshold, since business users are not de facto taking advantage of the right conferred on them by the DMA.

The Guidelines do not help reverse this situation, either. For instance, they highlight that, when selecting among various possible ways of achieving anonymisation of end users’ data, gatekeepers should select the one that preserves the most quality and usefulness of the data for the third-party undertaking requesting access, while also ensuring that the shared data of end users is anonymised taking into account all the means reasonably likely to be used by the third-party undertaking providing the online search engine to identify end users directly or indirectly (para 180). At the same time, the Guidelines urge the EC to act via a specification proceeding to set out the binding measures that the gatekeeper must implement to ensure effective anonymisation and the eligibility of third parties to receive data (paras 188 and 189). In other words, the Guidelines shy away from endorsing one anonymisation method over another and clearly point to the EC’s responsibility for doing so. This raises the question of whether the gatekeeper’s claims about the unfeasibility of applying other anonymisation techniques ring true to the enforcer’s ears or whether further steps may be technically implemented to further the effects of Article 6(11) DMA.

 

Data protection measures as a means to protect the integrity of the hardware or operating system

For the most part, the Guidelines interpret the interplay between the GDPR and the DMA in its positive dimension. That is to say, the GDPR’s implementation secures (and enhances) the effectiveness of the DMA’s remedies. However, the Guidelines also acknowledge that the GDPR may be used to undermine the DMA’s enforcement in some respects, notably by wielding it as a shield against the enforcement of a particular obligation. The Guidelines’ review of Article 6(4) demonstrates this point: the GDPR can be (and, in fact, has been) instrumentalised as a means to debilitate alternative app and app store distribution.

Article 6(4) opens up alternative app distribution with a caveat. Gatekeepers can implement measures to ensure that alternative app distribution does not endanger the integrity of the hardware or operating system they provide. Data protection and privacy have been raised by gatekeepers as legitimate reasons to justify the introduction of these kinds of measures. For example, Apple has repeatedly stressed that sideloading and alternative app store distribution do not meet the same high privacy and security standards as the App Store, so the DMA’s requirements make it more likely for EU users to be exposed to risks of scams or malware. For this reason, it has introduced a range of restrictions making it difficult for the end user to install an alternative app store, e.g., by requiring the user to navigate the settings page to allow alternatives before the download is even considered (or made).

The Guidelines respond to the challenge in three different ways. First, they encourage gatekeepers to keep an exhaustive and comprehensive list of the measures they put in place (to be made available in their compliance reports, for instance) and the rationale justifying why there are no other effective means of complying with those other legal requirements that would less adversely affect the attainment of the goals of Article 6(4) DMA (para 90). Rather than requiring the EC to prove whether a particular restriction introduced by a gatekeeper meets the necessity and proportionality thresholds, the Guidelines hint at the idea that it is up to the gatekeepers to come forward with an initial justification of their measures.

Second, gatekeepers must introduce such measures consistently with the premises of Article 32 GDPR, ensuring a level of security of data that is appropriate to the risk posed by the changes the DMA imposes on their business models (para 94). For instance, the gatekeeper can decide whether to allow third-party app and app store providers access to certain sensitive information such as location, photos, or contacts, but those measures cannot be more restrictive than those it applies to its own services. In a similar vein to the vertical interoperability obligation, the gatekeeper cannot impose distinct conditions of access to data or features in its ecosystem depending on the economic operator concerned.

Third, the measures gatekeepers take to address integrity and security risks must be complemented by additional measures to demonstrate their compliance with the GDPR and to enable the third parties benefitting from Article 6(4) DMA to comply with their own data protection obligations, such as enabling app store providers to seek valid consent to process data stored on the end user’s device by letting them present consent prompts within the gatekeeper’s operating system, or providing additional protections from malware (para 101). To some extent, the gatekeeper must counterbalance the imposition of restrictive measures based on GDPR considerations by making it easier for the data controllers concerned to comply with their requirements under the same regime.

 

A good first step

The draft Guidelines issued jointly by the EDPB and the EC signal a good first step in trying to reconcile the interplay between the GDPR and the DMA, although they must sit within a broader strategy of both public bodies to square the circle of this intersection. As the Guidelines point out, the initial description and detailing of some of the GDPR’s implications within the DMA for the six provisions does not aim to be exhaustive in setting out every single challenge arising from the interplay. However, the Guidelines leave a gaping hole relating to gatekeepers’ processing activities: the impact of AI and AI-enabled features on the interpretation of the GDPR in the DMA. If the topic is yet to be resolved in the context of data protection, it will hardly be easier to address when obligations such as Article 5(2) DMA are applied by the EC.

Aside from that, the Guidelines reaffirm that gatekeepers remain fully subject to GDPR obligations, but they also indicate that the DMA acts, in practice, as a lex specialis in certain contexts, imposing stricter constraints on data processing, combining, and cross-use activities. Although the statement may further strain the relationship between the regulations, acknowledging such an element may help (more than undermine) in securing enforcement and the DMA’s interpretation.
