Reconciling Privacy and AI in the Digital Age: A Critical Analysis of AI Governance in Canada

Keywords: Privacy, Artificial Intelligence, AI Governance, Regulation

Nicole Basten

4/4/2025

Author Bio

Nicole is a 2026 Juris Doctor Candidate at the Lincoln Alexander School of Law at Toronto Metropolitan University. She hopes to pursue a legal career in securities regulation, and her current research focuses on consumer protection and information privacy.


Introduction

A recent focus in Canadian legal scholarship is the development of a framework that properly situates artificial intelligence (“AI”) within the schema of Canadian law and society. One major point of conflict in discussions about the use of data, algorithms, and AI in Canadian law concerns the issues that arise specifically with respect to informational privacy. Bill C-27, the Digital Charter Implementation Act, was introduced to the House of Commons in 2022 and currently stands as Canada’s attempt to create a comprehensive and agile regulatory framework to govern digital privacy, data protection, and AI.

Although the future of the Bill is unsettled at this time, it nonetheless reflects the only federal regulatory intervention with regard to AI in Canada and ultimately aims to modernize privacy regulation and promote responsible AI creation and innovation. However, from a law and economics lens, it is obvious that Bill C-27 centers technological innovation and economic interests while peripheralizing privacy protection. I argue that, if adopted, Bill C-27’s proposed Artificial Intelligence and Data Act (“AIDA”) will act as a hindrance to individual privacy interests in Canada, reinforcing discriminatory power imbalances and supporting profit over privacy.

Bill C-27 Overemphasizes Technological Innovation and Development

The thrust of a law and economics analysis is whether law delivers economic efficiency. A critique of this frame of analysis centers on its privileging of economic efficiency over other interests, including personal, social, and cultural ones. Since data is the lifeblood of AI systems, privacy interests have been said to be in serious jeopardy. Technology companies are increasingly devising innovative ways to extract more data than ever. Coupled with that, “weak, industry-friendly laws” are on the rise as countries strive to maintain technological dominance. This sort of weak-regulation critique may be made against Bill C-27, which trades off strong privacy protection for technological efficiency and commercial interests. Three aspects built into Bill C-27 are illustrative of how the Bill prima facie emphasizes innovation and advancement over privacy interests:

  1. Fundamental Purpose

The preamble of Bill C-27 explicitly states that trust in the digital and data-driven economy is key to ensuring its growth. Scholars like Neil Richards posit that “a sustainable and ethical digital society will depend on the trust that is safeguarded by the rules protecting our privacy”; if they are to effectively promote such a trust relationship, these information rules must weigh privacy interests equally against economic interests. The Consumer Privacy Protection Act (“CPPA”), one of the statutes within Bill C-27, states its purpose as the promotion of electronic commerce by protecting personal information that is collected, used, or disclosed in the course of commercial activities. While there is nothing wrong with steering legislation to meet specific objectives, it also should consider other crucial interests.


Although section 5 of the CPPA recognizes “the right to privacy of individuals,” it equally recognizes “the need of organizations to collect, use or disclose personal information,” and it is silent on which interest takes priority. In short, the CPPA tacitly recognizes that these are competing interests but nevertheless fails to reconcile them by not stating expressly that the privacy interest is fundamental. Notwithstanding that Canadian courts have consistently recognized privacy as paramount, that is, as holding a “quasi-constitutional status,” it is important to signal expressly to technology companies that privacy interests trump their thirst for data extraction.

Moreover, because technological evolution occurs rapidly and unexpectedly, privacy interests are often engaged only after personal data has been misused and “are not often prone to anticipation”. This makes it essential that strong enforcement actions be easily accessible: for instance, the private right of action under section 107 should not be made contingent on proof of actual harm or injury, or on the action being sanctioned by the Commissioner.

These sorts of safety valves favour only technology companies and strongly suggest that Bill C-27 seeks mainly to promote and protect technological and economic interests. The Bill’s focus on technological advancement over privacy interests thus becomes clear: focused on the profit-making interests of private corporations, the Bill makes no proactive effort to prevent possible misuses of personal information or to state clearly the paramountcy of the privacy interest, leaving open the possibility of unforeseeable privacy incursions. This also highlights the facade of privacy protection hidden in the Bill: a reactionary, post-mortem measure of protection that stands in stark contrast to the constitutionally entrenched notion that privacy protections must be proactive.

  2. CPPA Compliance is Monitored by the Minister of Innovation

Indicative of the Bill’s focus on innovation over privacy, Bill C-27 is overseen, implemented, and enforced by the Minister of Innovation, Science, and Industry (the “Minister”). While the Minister does work alongside the Privacy Commissioner and other government entities, the Minister, whose objective is ultimately the advancement of Canada’s economic growth and development, is responsible for the administration and enforcement of the CPPA. This poses an interesting tension between the Minister’s experience, background, and overall goal of development and innovation and his responsibility to protect the privacy of individuals, which stands in direct opposition to those economic advancement interests. The risk that the application and enforcement of privacy protections within the Bill will be overshadowed by the pursuit of technological advancement is high, and arguably outweighs any benefit of consolidating oversight in a single ministry. Moreover, the Bill has been criticized for failing to provide a clear definition of the Minister’s role, and for its failure to create independent regulatory bodies to oversee data and AI. Considering that the enforcement of Bill C-27 and its provisions will be the determinative factor of whether the Bill properly protects privacy, oversight by the Minister of Innovation, Science and Industry itself raises questions about the legitimacy of the privacy protections afforded under Bill C-27.


  3. AIDA Fails to Properly Acknowledge the Intersection of AI and Privacy

AI regulation will not be sufficient unless it properly acknowledges, considers, and specifically aims to protect user informational privacy. The proposed AIDA focuses on AI systems, setting standards for ethical use, accountability, and transparency for the purpose of regulating international and interprovincial trade. While s. 4(b) of the AIDA claims to “prohibit certain conduct in relation to artificial intelligence systems that may result in serious harm to individuals or harm to their interests”, scholars like Teresa Scassa highlight the ambiguity present in the AIDA’s provisions.

For instance, Scassa argues that critical terms, such as “high-impact”, remain undefined, to be later delineated by the Minister by virtue of AIDA s. 5(1). The impact of these ambiguities on privacy interests is severe and may result in privacy being foregone in pursuit of AI systems’ development. Since the governance of algorithms is currently “played out on an ad hoc basis across sectors”, the AIDA should have made an attempt to facilitate uniform and comprehensive privacy provisions directly related to the creation, dissemination, and use of AI technologies.

These three issues therefore serve as illustrative examples supporting the conclusion that, prima facie, Bill C-27 and the statutes within it emphasize innovation and technological advancement over privacy interests. Each of these issues is also illustrative of the power dynamics reinforced by the legislation, covered in the next section.

Ambiguity Built Into the Legislation Reinforces Harmful Power Dynamics

AI requires at least two constituent elements: algorithms and data. Data could mean digital records which, in some instances, may include personal information. Databases are filled with information and then used to train AI programs, which in turn predict “everything from traffic patterns to the location of undocumented migrants”. Scholars have rightly stated that “privacy is about power”. The use and distribution of personal information, that is, data, is a critical piece of our increasingly digital society. Privacy law expert Neil Richards likens this data to the oil of the industrial age: “human information is the fuel of the information economy”. The privacy-as-power debate therefore equates the ability to exploit personal information with social power, and the commodification of personal data cements the power dynamics between individuals and the governments and corporations that gather and distribute the information. Richards proclaims that “[i]n an economy that exploits personal data, the battles over privacy will ultimately determine the allocation of power in our economy and our society as a whole”.

Additionally, the idea of “surveillance capitalism”, which captures and commodifies personal data to target technology users (e.g., companies that monitor internet activities to send targeted advertisements), is an increasingly common occurrence in the digital age. It is clear, then, that AI is exacerbating the power dynamics between technology creators and individual users. Indicative of this power dynamic is “the way public discussion about AI is influenced by actors developing these technologies”; consider, for example, Facebook founder Mark Zuckerberg’s 2010 proclamation that privacy is over. As explained by scholars like Richards, Zuckerberg has a clear interest in the dissolution of privacy protection: “many of those calling so loudly for the death of privacy are really seeking its demise so that they can line their own pockets”.

The ability to collect and exploit personal information is power, and the consequences of the erasure of individual privacy in this way are severe, resulting in discrimination and marginalization of already-vulnerable groups. For instance, there is a documented practice of employers discriminating by refusing to hire people with disabilities for the express purpose of minimizing healthcare expenditures. Ultimately, when considered against the background of the underlying power dynamics involved, Bill C-27’s failure to meaningfully account for individual privacy, leaving critical terms to be defined by the Minister, presents ambiguities with serious consequences for individual privacy interests and contributes to technologically produced harms to marginalized populations.

Conclusion

Bill C-27 and the statutes within it ultimately aim to modernize privacy regulation and promote responsible AI creation and innovation in Canada. From a law and economics perspective, Bill C-27 fails to properly balance privacy interests with innovation and development. This short commentary has analyzed Bill C-27 and the intersection of privacy and AI to demonstrate the facade of privacy protection present in contemporary legislative efforts. The piece makes the case that Bill C-27 overemphasizes innovation and development, with serious consequences for individual privacy. Ultimately, if adopted, the AIDA will act as a hindrance to individual privacy interests in Canada, reinforcing discriminatory power imbalances and supporting profit over privacy.