Article · 9 min read

EU AI Act vs India DPDP Act vs Singapore PDPA: Key Differences for Global Organisations

Organisations operating across the EU, India and Singapore face three regulatory frameworks simultaneously — with different legal bases, different enforcement consequences, and, critically, different concepts of what constitutes regulated AI and personal data. Building a single compliance programme that satisfies all three requires understanding precisely where they align and where they diverge.

Published May 2026 · Regional Compliance · EU AI Act · DPDP Act · PDPA
Executive Summary

Organisations operating across the EU, India and Singapore face a trifecta of AI and data regulation: the EU AI Act (binding from August 2026), India's Digital Personal Data Protection Act (DPDP, enacted August 2023 with rules pending), and Singapore's Personal Data Protection Act (PDPA, with AI-specific guidance through the Model AI Governance Framework). These frameworks share common objectives but differ materially in scope, enforcement, and the specific obligations they create. Organisations attempting to build a single compliance programme that satisfies all three need to understand precisely where the frameworks align and where they diverge.

3 frameworks — EU AI Act, India DPDP Act, Singapore PDPA: three regulatory frameworks that every multinational operating across these markets must navigate simultaneously
Aug 2026 — EU AI Act high-risk obligations take effect; organisations marketing AI to EU customers or operating EU-affecting AI face legal obligations from this date
DPDP Rules — India's DPDP Rules, still awaiting finalisation as of early 2026, will define the detailed implementation requirements of the DPDP Act; their finalisation will significantly clarify compliance obligations for organisations with Indian operations

Why Multinational AI and Data Compliance Is Not Three Separate Projects

The instinctive response to multiple regulatory frameworks is to manage them as separate compliance workstreams — an EU AI Act project, a DPDP Act project, a PDPA project. This approach is expensive, creates inconsistency across the organisation, and misses the significant overlap between the frameworks that would allow efficient integration.

But integration has limits. The frameworks have materially different legal structures, different enforcement mechanisms, different organisational obligations, and different concepts of what constitutes regulated AI or personal data. An integrated programme that treats all three frameworks as equivalent will fail to meet the specific requirements of each. The art is integration where frameworks align and separation where they diverge — and understanding precisely which is which for your specific operations and AI systems.

Where the Three Frameworks Diverge Most Significantly

Scope of regulated activity

The EU AI Act regulates AI systems based on what they do and the risk they create — not simply because they process personal data. A high-risk AI system that makes no decisions about individuals (a manufacturing quality inspection AI, for example) is still subject to EU AI Act high-risk obligations. Conversely, an AI system that processes personal data but makes only low-risk recommendations may have minimal EU AI Act obligations.

The DPDP Act regulates the processing of digital personal data of individuals in India — it applies to processing within India, and to processing abroad in connection with offering goods or services to data principals in India, regardless of whether that processing involves AI. An organisation that processes the personal data of Indian customers without using any AI is subject to DPDP Act obligations. An organisation that uses AI to process only non-personal data is not. The DPDP Act's scope is determined by the nature of the data, not the technology used.

Singapore's PDPA similarly regulates personal data — with the Model AI Governance Framework providing voluntary guidance on responsible AI use that goes beyond the PDPA's legal requirements but does not have the same binding force as the EU AI Act.
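The three independent scope triggers described above can be sketched as a simple first-pass screening helper. This is a hypothetical illustration, not legal logic — the `System` fields and the obligation labels are assumptions for the example, and real scoping requires counsel review of Annex III classification and territorial applicability.

```python
from dataclasses import dataclass

@dataclass
class System:
    """Hypothetical profile of one AI/data system for scope screening."""
    affects_eu_persons: bool             # marketed in the EU or affecting EU persons
    eu_high_risk: bool                   # falls under an EU AI Act Annex III category
    processes_india_personal_data: bool  # digital personal data of individuals in India
    processes_sg_personal_data: bool     # personal data collected/used in Singapore

def applicable_frameworks(s: System) -> dict[str, str]:
    """Illustrative triage: which frameworks plausibly apply to one system."""
    result: dict[str, str] = {}
    # EU AI Act: triggered by the AI system's risk level, not by personal data.
    if s.affects_eu_persons:
        result["EU AI Act"] = ("high-risk obligations (risk management, documentation, "
                               "human oversight, conformity assessment)"
                               if s.eu_high_risk
                               else "limited/minimal-risk obligations")
    # DPDP Act: triggered by the nature of the data, not the technology used.
    if s.processes_india_personal_data:
        result["DPDP Act"] = "data fiduciary/processor obligations (Rules pending)"
    # PDPA: binding data protection duties; MAIGF is voluntary AI guidance on top.
    if s.processes_sg_personal_data:
        result["PDPA"] = "binding data protection obligations + voluntary MAIGF guidance"
    return result
```

Run against the article's manufacturing example — a high-risk quality-inspection AI sold into the EU that processes no personal data — and only the EU AI Act entry is returned, reflecting that the three scope triggers operate independently.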

Enforcement mechanisms and consequence severity

EU AI Act enforcement runs through national market surveillance authorities, with maximum fines of €35 million or 7% of global annual turnover, whichever is higher. Enforcement is mandatory — member states are required to designate national AI authorities and enforce the Act.

The DPDP Act creates the Data Protection Board of India as the enforcement authority, with penalties up to ₹250 crore (approximately £23 million) for significant violations. The enforcement framework is still being established — the DPDP Rules, which will define detailed compliance obligations and enforcement procedures, had not been finalised as of early 2026.

Singapore's PDPA is enforced by the Personal Data Protection Commission (PDPC), which can impose financial penalties of up to 10% of an organisation's annual turnover in Singapore or S$1 million (approximately £600,000), whichever is higher — still significantly lower in practice than the EU or Indian maximums. The PDPC has historically taken a relatively facilitative enforcement posture, preferring corrective directions and undertakings to financial penalties.

Obligations for AI-specific governance

The EU AI Act creates specific, legally binding obligations for AI system governance: risk management, data governance, technical documentation, human oversight, accuracy and robustness. These are prescriptive legal requirements, not guidance.

India's DPDP Act does not create AI-specific governance obligations. It creates data protection obligations that apply to personal data processing — including personal data processed by AI systems. The AI governance dimension in India is addressed through voluntary frameworks (the MeitY Responsible AI framework) and evolving regulatory guidance, not through the DPDP Act itself.

Singapore's approach is more nuanced: the PDPA creates binding data protection obligations, and the Model AI Governance Framework (MAIGF) provides detailed, voluntary AI governance guidance that PDPC strongly encourages but does not legally mandate. AI governance in Singapore is currently compliance-encouraged rather than compliance-mandated.

Dimension | EU AI Act | India DPDP Act | Singapore PDPA + MAIGF
Legal status | Binding EU law | Binding Indian law (Rules pending) | PDPA binding; MAIGF voluntary
Scope trigger | AI system risk level | Personal data processing | Personal data processing, plus voluntary AI governance
AI-specific obligations | Yes — prescriptive for high-risk AI | No — data protection obligations only | No — MAIGF is voluntary guidance
Max penalty | €35M or 7% global turnover | ₹250 crore (~£23M) | S$1M or 10% Singapore turnover
Enforcement body | National market surveillance authorities | Data Protection Board of India | Personal Data Protection Commission
Extraterritorial effect | Yes — EU market or EU persons affected | Yes — goods or services offered to data principals in India | Yes — foreign organisations handling personal data in Singapore
Data residency requirements | No AI-specific data residency requirement | Rules may impose data localisation requirements | No data residency requirement
Consent requirements | Not applicable (AI regulation focus) | Consent as primary lawful basis, with limited exceptions | Consent-based, with limited exceptions

Building an Integrated Compliance Programme

  1. Map your operations to each framework separately and specifically. For the EU AI Act: what AI systems do you develop or deploy that affect EU persons, and how are they classified under Annex III? For the DPDP Act: do you process digital personal data of individuals in India, in what capacity (data fiduciary or data processor), and for what purposes? For the PDPA: do you collect, use or disclose personal data in Singapore, and do you have a PDPA-compliant data protection programme in place?
  2. Identify shared foundation requirements. Several governance capabilities are required by all three frameworks — with different specifics. Data governance and quality controls are required by all three. Consent and transparency obligations appear in all three. Incident notification requirements exist in all three. Individual rights (access, correction, deletion) appear in all three — with different scope and timelines. Building these capabilities once, to the most demanding specification across all three frameworks, eliminates duplicate effort.
  3. Implement EU AI Act-specific requirements as an additional layer for relevant AI systems. The EU AI Act's prescriptive requirements for high-risk AI systems have no equivalent in the DPDP Act or PDPA — they must be addressed specifically. For AI systems affecting EU persons that are classified as high-risk: risk management system (Article 9), data governance (Article 10), technical documentation (Annex IV), human oversight (Article 14), and conformity assessment. These requirements should be implemented within an ISO 42001 management system framework where possible.
  4. Monitor DPDP Rules finalisation and build adaptive compliance architecture. The DPDP Act framework will be significantly clarified when the Rules are finalised. Organisations with material India operations should build a compliance architecture that can accommodate the Rules when they are published — rather than waiting for finalisation before beginning compliance work. The foundational data governance, consent management, and individual rights capabilities required by the DPDP Act are identifiable from the Act itself.
  5. Leverage Singapore MAIGF for voluntary governance that future-proofs against mandatory requirements. The MAIGF represents what Singapore regulators consider responsible AI governance. While voluntary today, implementing MAIGF guidance provides a governance foundation that is likely to be consistent with future mandatory requirements if Singapore moves toward binding AI regulation. It also provides demonstrable AI governance maturity for enterprise clients in the Singapore market.
  6. Centralise privacy and AI governance documentation with jurisdiction-specific overlays. Maintain a central data protection and AI governance programme with jurisdiction-specific documentation overlays — a PDPA-compliant privacy policy, a DPDP-compliant data processing framework, and EU AI Act-compliant technical documentation — rather than building three entirely separate programmes.
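Step 2's "build once, to the most demanding specification" pattern can be sketched in a few lines. This is a hypothetical illustration: the capability names and all figures below are placeholders, not statutory deadlines, and the frameworks are labelled generically to avoid implying real statutory values.

```python
def strictest_spec(reqs: dict[str, dict[str, int]]) -> dict[str, int]:
    """For deadline-style requirements (hours or days to act), a shorter
    deadline is stricter — so the binding specification for a shared
    control is the minimum across all frameworks that impose one."""
    return {capability: min(per_framework.values())
            for capability, per_framework in reqs.items()}

# Placeholder inputs — illustrative values only, NOT real statutory deadlines.
example = {
    "incident_notification_hours": {"framework_a": 72, "framework_b": 48, "framework_c": 72},
    "access_request_days":         {"framework_a": 30, "framework_b": 45, "framework_c": 30},
}
target = strictest_spec(example)
# Build each shared control once, to the strictest deadline found.
```

With these placeholder inputs, the shared incident-notification control would be built to the 48-hour deadline and the access-request workflow to 30 days — one control each, satisfying all three frameworks simultaneously rather than three parallel implementations.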
Multi-Jurisdiction AI and Data Compliance Readiness Checklist
EU AI Act scope has been assessed — both as provider and deployer of AI systems affecting EU persons
DPDP Act scope has been assessed — whether we process digital personal data of individuals in India and in what capacity
PDPA scope has been assessed — whether we collect, use or disclose personal data of Singapore residents
Shared foundation requirements have been implemented to the most demanding specification across all three frameworks
EU AI Act Annex III classification has been completed for all AI systems affecting EU persons
EU AI Act technical documentation (Annex IV) exists for each high-risk AI system
DPDP Act compliance architecture is being built and designed to accommodate Rules when finalised
Singapore MAIGF guidance has been reviewed and implemented where feasible as voluntary governance
Cross-border data transfer mechanisms are in place for all three jurisdictions

Frequently Asked Questions

Which framework should we prioritise if we have limited compliance resource?
Prioritise EU AI Act for any AI systems affecting EU persons — the enforcement mechanism is the most developed, the penalty exposure is the highest, and the August 2026 deadline for high-risk obligations creates a specific compliance timeline. DPDP Act compliance should be built in parallel where material Indian personal data processing exists — even before the Rules are finalised, the foundational consent management, data governance and individual rights infrastructure required by the Act can be established. Singapore PDPA compliance is typically the least resource-intensive of the three to achieve if a basic data protection programme is already in place — start with a PDPA gap assessment and address gaps systematically.
India has not yet finalised the DPDP Rules. Does that mean we should wait before implementing compliance?
No. The DPDP Act itself imposes obligations that are clear without the Rules: the requirement to obtain consent for personal data processing, the obligation to provide a privacy notice, the right of data principals to access and correct their data, and the obligation to implement appropriate security safeguards. The Rules will clarify specifics — categorisation of significant data fiduciaries, data localisation requirements, and enforcement procedures — but the foundational compliance programme can and should be built now. Waiting for the Rules means beginning from zero when they are published, at a point when regulatory pressure will be active rather than anticipatory.
Singapore is considering mandatory AI governance regulation. How should we plan for that?
Singapore's PDPC and MAS have both signalled interest in moving toward more structured AI governance requirements over time, particularly for AI used in financial services and high-impact consumer applications. The MAIGF is widely understood as the precursor to mandatory requirements. Implementing MAIGF guidance now serves two purposes: it demonstrates AI governance maturity to enterprise clients and regulators in the current voluntary environment, and it ensures your programme is aligned with the direction that mandatory requirements are likely to take, minimising the remediation cost when requirements become binding.

How AjaCertX Helps

AjaCertX delivers multi-jurisdiction AI governance and data protection compliance programmes for technology organisations, financial services firms, and multinational enterprises operating across EU, India, Singapore, GCC and UK markets.

  • EU AI Act scope assessment and high-risk AI system compliance programme
  • India DPDP Act gap assessment and compliance programme design — including DPDP Rules readiness architecture
  • Singapore PDPA compliance assessment and MAIGF implementation programme
  • Integrated multi-jurisdiction data protection and AI governance framework design
  • ISO 42001 AI management system implementation
  • Cross-border data transfer mechanism assessment and implementation (SCCs, adequacy, binding corporate rules)
  • Regional compliance programmes for GCC, UK and Asia-Pacific markets
Managing AI and data regulation across multiple jurisdictions?

Regional compliance specialists. Integrated programme proposal within 48 hours.

Conclusion

The EU AI Act, India DPDP Act and Singapore PDPA represent three distinct regulatory frameworks with different legal bases, different scope triggers, different obligations, and different enforcement consequences. An integrated compliance programme is both possible and efficient — but only where frameworks genuinely align. Where they diverge — and the EU AI Act's AI-specific prescriptive requirements are the clearest point of divergence — separate, framework-specific compliance work is required.

The organisations that manage this most effectively build shared foundations where frameworks align, layer framework-specific requirements where they do not, and maintain the programme architecture to accommodate the evolution of all three frameworks — because all three will evolve over the next three to five years.

About AjaCertX
AjaCertX is a specialist compliance, certification and assurance partner serving technology, financial services and regulated industries globally. Our Regional Compliance practice delivers EU AI Act, India DPDP, Singapore PDPA and integrated multi-jurisdiction compliance programmes for multinational organisations operating across EU, UK, GCC and Asia-Pacific markets.