On October 9, 2025, the European Data Protection Board ("EDPB") and the European Commission issued, for public consultation, joint guidelines clarifying how the EU’s Digital Markets Act ("DMA") and General Data Protection Regulation ("GDPR") work together. The guidance, if adopted, could be highly consequential for organizations interacting with, supplying to, or dependent on “gatekeepers” and their core platform services ("CPS"). It clarifies when consent is mandatory, when other GDPR legal bases remain available, how gatekeepers must structure consent flows and data access tools, and how data portability, business-user access, anonymization, and interoperability must be implemented without compromising data protection. While the DMA targets market fairness and contestability, its implementation must be compatible with, and is often conditioned by, GDPR requirements.
Key Takeaways
The DMA and GDPR are complementary frameworks that must be applied coherently; gatekeepers cannot use one regime to dilute compliance with the other, which can prove a significant compliance hurdle. Article 5(2) of the DMA tightens reliance on legal bases by requiring valid GDPR consent—and a less personalized but equivalent alternative—before combining or cross‑using data or using third‑party data for advertising, while restricting repeat prompts. Consent journeys must be neutral, granular, and reversible, in line with the GDPR and ePrivacy rules.
Article 6(9) of the DMA substantially expands data portability beyond GDPR Article 20 by covering on‑device and observed data and enabling continuous, real‑time flows to users and their authorized third parties; the gatekeeper’s lawful basis is GDPR Article 6(1)(c). Under Article 6(10) of the DMA, business users are entitled to effective, high‑quality, continuous and real‑time access to aggregated and non‑aggregated data, including personal data where users have consented, with gatekeepers obliged to facilitate consent on a neutral basis. Article 6(11) of the DMA requires effective anonymization—implemented primarily through technical measures—for sharing search interaction data with third‑party search engines, potentially complemented by organizational safeguards through implementing acts. For messaging interoperability under Article 7 of the DMA, the same level of security (including E2EE, where applicable) must be preserved and data minimization observed, with DPIAs frequently required.
End-user consent, choice, and online interfaces (DMA Art. 5(2); GDPR Art. 4(11), 7)
The proposed guidelines make clear that gatekeepers may not rely on contract performance or legitimate interests for processing activities listed in Article 5(2) of the DMA. Instead, consent—or in limited cases another specified lawful basis—is required to process third‑party user data for advertising when those third parties use the gatekeeper’s CPS, to combine personal data across the gatekeeper’s or third‑party services, to cross‑use personal data across separately provided services, and to sign in users across services to facilitate such combination.
Consent must be freely given, specific, informed, and unambiguous, with granular purpose choices for distinct processing goals (e.g., content personalization, ad personalization, service improvement). Gatekeepers must present an equivalent, less personalized alternative to users who refuse—without degraded quality except where degradation is an inevitable result of not processing the data. Re-prompting for the same purpose is limited to once within a year, and designs must avoid nudging or choice fatigue. Special category data cannot be used for ad profiling under the DSA and political advertising rules, and explicit consent under GDPR Art. 9(2)(a) may be required for processing sensitive data outside those prohibitions.
Importantly, processing that is strictly necessary for services provided together with or in support of a CPS (e.g., identity or payments linked to a transaction) can for DMA purposes proceed without Art. 5(2) consent; however, controllers must still satisfy GDPR (e.g., Art. 6(1)(b) or, if appropriate and after balancing, Art. 6(1)(f)). Personalised advertising typically falls outside “strict necessity” for contract performance.
Distribution of app stores and apps (DMA Art. 6(4)) and GDPR security/compliance
Gatekeepers must enable the installation and effective use of third‑party apps and app stores and allow changes to defaults, with any restrictions limited to what is strictly necessary and proportionate to preserve device and OS integrity or user security, and duly justified. The GDPR and ePrivacy rules remain fully applicable. Gatekeepers and app developers are typically separate, independent controllers with distinct responsibilities. Gatekeepers should apply Article 32 GDPR security measures such as encryption, secure storage, granular permissions, and breach response capabilities. Optional consent UI components may be offered to third parties but cannot dictate how those parties meet the GDPR or impose more burdensome defaults than apply to the gatekeeper’s own apps. Finally, accessing or storing data on a device generally requires consent under Article 5(3) of the ePrivacy Directive unless strictly necessary for a user‑requested service.
Right to data portability—end users and authorized third parties (DMA Art. 6(9); GDPR Art. 20)
Article 6(9) DMA broadens portability to data “provided by” or “generated through” an end user’s activity on a CPS, including on‑device and observed data such as engagement and device attributes, while excluding derived or inferred data. Portability must be effective, real‑time, continuous, and free of charge, with device‑to‑device transfer where appropriate and solutions that support multi‑homing and switching. The gatekeeper’s lawful basis to port personal data is GDPR Article 6(1)(c). Gatekeepers must authenticate authorized third parties and ensure secure transmission but are not responsible for their downstream processing, as those parties will generally act as independent controllers. Where datasets include the personal data of other individuals, gatekeepers should ensure transparency (including dashboards identifying recipients) and provide tools to allow requesters to exclude third‑party data where appropriate. If a portability flow would transfer personal data to a non‑adequate third country, the end user’s explicit consent under GDPR Article 49(1)(a) should be obtained after clear risk disclosures.
Gatekeepers should implement neutral, user-friendly choice architecture for portability and avoid dark patterns. Periodic, configurable reminders may be appropriate for long-running flows.
Business-user access to data (DMA Art. 6(10))—including personal data with end-user consent
Gatekeepers must provide business users (and their processors) with effective, high‑quality, continuous, and real‑time access to aggregated and non‑aggregated data provided or generated through the relevant CPS or supporting services. Access to end‑user personal data is limited to data that is directly connected to the end user’s use of the business user’s products or services on the CPS and is contingent on the end user’s consent, obtained through mechanisms that the gatekeeper must make available.
Several implications follow. The gatekeeper’s sharing of personal data under Article 6(10) DMA relies on GDPR Article 6(1)(c) and may occur only after the user’s consent has been captured via gatekeeper‑facilitated interfaces. Consent capture must be neutral and at parity with the gatekeeper’s own processes, and gatekeepers should provide clear privacy notices and maintain recipient dashboards. Real‑time and continuous access will typically be delivered through documented, widely accessible APIs, with integrated tools supporting small‑volume users. Business users, for their part, remain responsible for GDPR compliance, including necessity and data minimization, and for putting in place appropriate processor agreements where applicable.
Sharing of anonymized search interaction data (DMA Art. 6(11)) and effective anonymization
Gatekeepers must share ranking, query, click and view data with third‑party search engines on fair, reasonable and non‑discriminatory terms, and any personal data within those datasets must be anonymized for the recipient, considering means reasonably likely to identify a person. The guidelines stress that technical alteration is indispensable to reduce re‑identification risks to an insignificant level, with organizational and contractual measures only complementing—not replacing—those technical steps. Pseudonymization may render data non‑personal for a specific recipient only if reversal or linkage with other reasonably available data is not feasible for that recipient. Commission implementing acts may specify technical, organizational and contractual safeguards, including restrictions on onward sharing and obligations to cease access and delete data in case of violations.
Gatekeepers should calibrate anonymization to preserve data utility while ensuring residual identification risk is insignificant.
Interoperability of number-independent interpersonal communications (NIICS) (DMA Art. 7)
Gatekeepers must offer free interoperability for basic messaging functionalities at the request of third‑party NIICS providers, preserving the same level of security—including end‑to‑end encryption where applicable—across interoperable services. Implementations should minimize metadata, use well‑defined key exchange, apply strict data minimization so that only data strictly necessary for effective interoperability is exchanged, and deploy opt‑in mechanisms for discoverability and any out‑of‑band identity exchange. Given the scale, sensitivity and novelty of such processing, DPIAs will often be required. Any geographical limitations should be enforced proportionately using minimal data (for example, obfuscated country‑level IP addresses), validated only at appropriate intervals and not repurposed. Measures to protect integrity, security and privacy must be strictly necessary, proportionate and justified, and should favor less intrusive alternatives to profiling.
Implications for GDPR Compliance
Data protection by design and by default: Gatekeepers and organizations integrating with CPS must redesign user journeys and developer tooling to enable valid consent, granular choices, and straightforward withdrawal; support continuous/real-time data portability and business-user access; and maintain security, integrity, and confidentiality in line with Article 32 GDPR.
Lawful bases and purpose limitation: Controllers must carefully select lawful bases for cross-use in “supporting” services and avoid defaulting to contract performance or legitimate interests for advertising or personalization that exceed user expectations. Clear purpose specification and separation of processing contexts are essential to avoid unlawful bundling.
Transparency and accountability: Neutral consent flows, complete privacy notices, dashboards listing recipients, and logs evidencing consent and authorization are needed to meet Articles 5 and 12–14 GDPR. Gatekeepers should keep comprehensive internal records mapping data categories covered by portability and access obligations and their technical interfaces.
Special category data and profiling limits: Advertisements based on special category data profiling are prohibited under the DSA; political ad targeting is subject to heightened restrictions. Where special categories are processed outside these prohibitions, explicit consent under GDPR Art. 9(2)(a) is required.
Children and vulnerable users: Any reliance on legitimate interests, especially for marketing, must account for reasonable expectations and higher protection for children; unconditional objections to direct marketing under Art. 21(3) GDPR still apply.
Governance and enforcement coordination: The guidelines foresee close coordination between the Commission and supervisory authorities to ensure consistent enforcement and to avoid double jeopardy. Gatekeepers should anticipate consultations and prepare evidence demonstrating that their DMA implementation choices least adversely affect DMA objectives while remaining compliant with the GDPR and ePrivacy rules.
Next Steps for Organizations
Organizations should start by mapping interactions with gatekeepers’ CPS to locate flows impacted by Articles 5(2), 6(9), 6(10), 6(11) and 7 DMA and align lawful bases, consent strategies, and transparency tooling accordingly. They should redesign consent and choice architectures to be neutral, granular and user‑friendly, with robust withdrawal mechanisms and sensible reminders that avoid consent fatigue. Technical teams should prepare for real‑time, continuous portability and business‑user data access by adopting scalable API interfaces, granular scopes, strong authentication/authorization, and secure transmission. Marketing and product functions should review personalization against purpose limitation, necessity and reasonable expectations, reassessing any reliance on legitimate interests for advertising. Search data recipients should evaluate their anonymization approaches and readiness to comply with potential implementing acts and contractual safeguards. For NIICS interoperability, organizations should plan for protocols that preserve E2EE, minimize metadata, and support DPIAs, with opt‑in choices for discoverability and cross‑service identity exchange.
As the public consultation period continues (with an end date of December 4, 2025), our team will continue monitoring these developments and their implications for EU and international compliance obligations.
