The United Kingdom’s data protection framework rests on the Data Protection Act 2018 (DPA 2018) and the UK General Data Protection Regulation (UK GDPR), which together govern the protection of personal data in a technology-driven world. The Data (Use and Access) Act 2025 (DUAA) is an important development that complements these existing laws. The DUAA introduces changes in areas such as data access and portability, digital verification, and complaint management. It marks a shift from a compliance-focused regime to a more innovative, market-driven approach, addressing challenges brought by technological advances and regulatory change. The DUAA aims to strengthen data rights and protections while streamlining processes, fostering innovation, and embedding accountability. This article examines the framework’s legal foundations and practical implications, focusing on automated decision-making and the challenges posed by emerging technologies.
Data protection laws are built on the principles of privacy, personal freedom, and accountability. The UK GDPR emphasises transparency while giving individuals control over their data. This reflects Alan Westin’s[1] conception of privacy as control over personal information and aligns with the broader human rights framework of the European Convention on Human Rights (ECHR). Specifically, Article 8 of the ECHR guarantees the right to respect for private and family life and has shaped the UK’s approach to data protection.
The DUAA expands on the UK GDPR framework by granting specific powers to both public and private sectors. It also focuses on creating connections between different bodies and implementing technology-driven standards, as detailed in sections 1–7 and Part 2 of the Act. The DUAA is designed to be flexible, with mechanisms for ongoing updates and oversight, and the creation of specific data registers and interfaces. By integrating obligations across sectors, the DUAA manages the oversight of customer and business data and digital verification services, thus establishing frameworks for data flows in both public and private sectors. The DUAA empowers the Secretary of State and the Treasury to make new regulations governing customer and business data,[2] including mandatory data sharing to promote competition policy and public interest access. It refines the data protection framework by adding further lawful grounds for processing and improving procedures for subject access and data portability, such as the statutory ‘stop-the-clock’ mechanism for requests. The DUAA also strengthens the role of the Information Commissioner by granting new enforcement powers and requiring regular reporting to Parliament. The integration of interface bodies and dashboard services, along with statutory duties for periodic regulatory review, illustrates an effort to develop a flexible regulatory system that can adapt as new data-sharing and verification technologies emerge.
The DUAA’s commencement is managed through a four-stage implementation timetable set by the government, ensuring a phased approach to bringing different provisions into effect.[3] This structured rollout allows organisations and regulators to prepare and comply progressively with the Act’s requirements, facilitating smoother adoption and enforcement.
The DUAA’s approach to automated decision-making and legitimate interests addresses specific regulatory challenges posed by technological advancement. Section 14 of the DPA 2018 regulates decisions made solely by automated means and requires safeguards such as human intervention. The rise of AI and algorithmic processes capable of complex decision-making raises questions about the sufficiency of these protections. While the provision aims to ensure fairness, the DUAA revises the restrictions on fully automated decision-making under Article 22 of the UK GDPR, permitting automated decisions, including those made by AI and algorithmic processes, unless they are prohibited or specifically regulated.
Ongoing gaps in statutory and practical guidance suggest that many human reviews are superficial and fail to meaningfully alter outcomes for affected individuals, especially in decisions concerning public benefits or employment. The lack of detailed guidance on mitigating algorithmic bias contributes to ongoing uncertainty. Furthermore, the absence of clear standards concerning algorithmic explainability, transparency, and evidence-based safeguards for profiling places the onus on regulators and courts to define ‘meaningful human involvement’ and redress mechanisms. These requirements are codified in sections 50A to 50D of the DPA 2018, as inserted and amended by the DUAA, which introduce key safeguards for automated decision-making and clarify expectations for human oversight and effective remedies.
The DUAA introduces recognised legitimate interests detailed in regulations, now explicitly including circumstances such as public safety and crime prevention, regulatory enforcement, public health, and protection of vulnerable individuals. This marks a shift from the strictly individualised balancing test approach of the UK GDPR. Using recognised legitimate interests requires documenting the reasons for processing and ensuring that use does not override individual rights and freedoms, with regular reviews and justified policies. The introduction of formalised transparency measures such as layered notices and advanced Data Protection Impact Assessments for high-risk scenarios supports this approach. The formal recognition of sector-specific legitimate interests suggests a move toward more predictable yet potentially broader justifications for processing and poses new challenges in ensuring suitable safeguards and regular accountability mechanisms are enforced.
For businesses, compliance with the DPA 2018 imposes administrative burdens, particularly for small and medium-sized enterprises. The ICO’s role in providing guidance and enforcement is important but often criticised for delays in addressing complaints. Businesses must also navigate complex requirements for data processing agreements and international transfers. With the introduction of the DUAA, open data obligations and broader rights of data access create new friction points where procedural complexity and resource demands disproportionately affect SMEs, as documented in the ICO’s small business outreach. The DUAA's improvements in data access and portability, such as the ‘stop-the-clock’ mechanism, aim to reduce administrative burdens and support efficient dispute resolution.[4] The addition of dashboard services and periodic obligations for data provision, along with transitional compliance periods, increases cost and operational planning challenges for sectors like financial services and utilities, which must now also meet interface body participation requirements.
For individuals, the DUAA reinforces rights such as access, rectification, and erasure. However, awareness and accessibility of these rights remain limited, reducing their practical impact. For example, many individuals are unaware of their right to challenge automated decisions that significantly affect them. According to recent ICO survey data,[5] actual exercise of these rights remains low, especially among vulnerable populations, owing to a mix of digital literacy barriers and fragmented tools for rights requests. The provision of dashboards and electronic interfaces is intended to simplify interactions between data subjects and controllers, signalling an effort to improve accessibility. Official ICO reports for 2024[6] confirm that pilot interface deployments increase rights uptake only when paired with direct user engagement initiatives and simplified digital onboarding, highlighting the need for fair distribution of rights improvements.
Public authorities face their own set of challenges as they balance transparency with data protection, especially in areas like health data sharing during emergencies. The COVID-19 pandemic highlighted the need for clear guidelines on data sharing while maintaining privacy. The documented lag in emergency legal guidance resulted in ad hoc sharing agreements that were later subject to regulatory adjustment, which suggests that the current legislative model struggles with real-time application during dynamic public interest scenarios. The DUAA’s review cycles and requirements for interface body governance are designed to allow quicker, evidence-based policy updates for future crises and to improve accountability for emergency data-use decisions.[7]
In acknowledging the role of emerging technologies, it becomes clear that the DUAA struggles to keep pace with advancements in data-intensive sectors such as AI and blockchain. For example, it lacks specific provisions to address the ethical and legal implications of AI-driven profiling. The rise of generative AI tools further complicates the landscape, as these tools often process vast amounts of personal data without clear accountability mechanisms. The DUAA extends the requirement for Data Protection Impact Assessments to cover high-volume digital verification, the rollout of new interoperability platforms, and any automated decision-making with material effects on individuals. This legal lag creates operational risks for both data controllers and public authorities, as highlighted by the lack of developed case law to offer interpretive certainty and by regulator-commissioned research warning that only dynamic technical codes, rather than static frameworks, are likely to keep pace. The recognised need for sectoral codes and recurring consultation, as discussed by both the Financial Conduct Authority (FCA) and the ICO in their 2025 position statements, is essential for closing gaps between rapid technological deployment and slower legislative cycles.
While the DUAA mandates data security measures, the rise in ransomware attacks and data breaches highlights the need for stronger enforcement and penalties. The ICO’s ability to impose fines is a deterrent, but the frequency of breaches suggests that more proactive measures are needed. Despite increased statutory penalties, the persistent occurrence of major breaches indicates a failure to embed suitable technical safeguards into daily operations, with follow-up ICO enforcement often seen as reactive. While statistical enforcement bulletins[8] and cybersecurity reports indicate that data breaches occur across multiple sectors, specific claims about concentrations in the retail and health sectors require cautious interpretation, as definitive sector-specific figures from official ICO sources remain limited.[9]
To address these challenges, proposed amendments should clarify ambiguous provisions, such as legitimate interests, and introduce safeguards for emerging technologies like AI and distributed ledger technology such as blockchain. Mandatory impact assessments for these systems could improve accountability and transparency. Legislative intent must be underpinned by mandatory and timely publication of impact assessments and accessible external audits so that regulatory aspiration quickly translates into enforceable operational standards. Moreover, continuous updates of conformity assessment and certification for high-risk digital verification services under new trust framework provisions should align with both UK and international standards, as recent official guidance points out.
The ICO should be provided with additional resources to ensure timely enforcement and guidance. Increased funding and staffing could improve its capacity to handle complaints and conduct investigations. Evidence from EU practice demonstrates that procedural efficiency and strong regulator independence, not just broad legal authority, are key to effective deterrence and stakeholder trust, so the UK could explicitly apply these lessons in ICO reform.[10] The DUAA’s new penalty-setting flexibility, combined with transparent publication of enforcement procedures and regular Parliamentary reporting, should support a move towards better accountability alongside resource adequacy.
Public awareness campaigns and simplified processes for exercising data rights can improve user engagement. For instance, creating user-friendly tools for submitting data access requests could increase uptake. Socio-legal research now shows that without public policy addressing deeper structural inequalities in education and digital capabilities, tools alone will not drive meaningful improvement in the exercise of rights, especially for disadvantaged groups. Pilot schemes by the ICO and sector regulators indicate that significant improvements in rights exercise rates occur only when digital interfaces are coupled with local outreach and sustained user education efforts.
In conclusion, the UK’s data protection framework, while comprehensive, requires ongoing adaptation to address emerging challenges and align with global standards. By examining its provisions, comparing it with international frameworks, and proposing reforms, this article provides a measured analysis suitable for legal professionals, policymakers, and scholars. The integration of case law, practical implications, and policy considerations offers pathways for reform that balance privacy, innovation, and economic growth. Moving forward, the responsiveness and adaptability of the UK data protection framework, both in incorporating judicial guidance and responding to empirical evidence of regulatory drift or compliance gaps, will dictate its lasting legitimacy and practical effectiveness. Rigorous implementation of ongoing statutory review obligations and regular consultation duties is necessary, as frequent evidence-led evaluation and cross-sectoral benchmarking are needed to ensure that regulatory changes do not outpace accountability and effective rights protection.
Author: Brian Sanya Mondoh, Barrister and Attorney at Law
Founder, Blockchain Lex Group and CryptoMondays Caribbean and Africa
Disclaimer: This publication is for informational purposes only and does not constitute legal or financial advice. The content is not intended to be a substitute for professional advice or judgment. Please consult with a qualified legal or financial advisor before making any decisions based on the information provided in this publication. The authors and publishers are not responsible for any actions taken as a result of reading this publication.
[1] Alan F. Westin, Privacy And Freedom, 25 Wash. & Lee L. Rev. 166 (1968). Available at: https://scholarlycommons.law.wlu.edu/wlulr/vol25/iss1/20
[2] Data (Use and Access) Act 2025, ss 1–5.
[3] UK Government, ‘Data Use and Access Act 2025: Plans for Commencement’ (GOV.UK) https://www.gov.uk/guidance/data-use-and-access-act-2025-plans-for-commencement
[4] UK Government, ‘Data (Use and Access) Act Factsheet: UK GDPR and DPA’ (GOV.UK) https://www.gov.uk/government/publications/data-use-and-access-act-2025-factsheets/data-use-and-access-act-factsheet-uk-gdpr-and-dpa
[5] Information Commissioner's Office, 'Public Attitudes on Information Rights Survey 2025' (ICO 2025)
[6] Information Commissioner's Office, 'ICO Annual Report 2024/25' (ICO 2025).
[7] Data (Use and Access) Act 2025, ss 92, 93 and 142.
[8] Information Commissioner’s Office, ‘Data security incident trends’ (ICO) https://ico.org.uk/action-weve-taken/complaints-and-concerns-data-sets/data-security-incident-trends/
[9] Information Commissioner’s Office, ‘Data security incident trends’ (ICO) https://ico.org.uk/action-weve-taken/complaints-and-concerns-data-sets/data-security-incident-trends/
[10] Gerard Buckley, Tristan Caulfield, Ingolf Becker, GDPR and the indefinable effectiveness of privacy regulators: Can performance assessment be improved?, Journal of Cybersecurity, Volume 10, Issue 1, 2024, tyae017, https://doi.org/10.1093/cybsec/tyae017