Canadian Journal of Law and Technology


Keywords: artificial intelligence, AI, AI and healthcare, clinical outcomes and AI, bias and discrimination of AI, algorithmic bias


In this article, we canvass why AI may perpetuate or exacerbate extant discrimination through a review of the training, development, and implementation of healthcare-related AI applications, and set out policy options to militate against such discrimination. The article is divided into eight short parts, including this introduction. Part II explains AI, some of its basic functions and processes, and its relevance to healthcare. In Part III, we define algorithmic bias and data bias, explain the difference and relationship between them, note that both can result in discrimination in healthcare settings, and provide some prominent examples of healthcare-related AI applications that have produced discriminatory outputs. Part IV explains in more detail the differences between algorithmic bias and data bias, with a focus on data bias and data governance, including the non-representativeness of data sets used in training AI. We then turn to possible legal responses to the problem of algorithmic discrimination. In Part V, we demonstrate the insufficiency of existing ex post legal protections (i.e., legal protections that offer redress after someone has suffered harm), including claims in negligence, under human rights legislation, and under the Charter of Rights and Freedoms. Part VI explores possibilities within the Canadian ex ante legal landscape (i.e., the regulation of AI applications before they become available for use in healthcare settings), notably through federal regulation of medical devices, and identifies gaps in oversight. Finally, in Part VII we provide recommendations for federal and provincial governments and innovators as to the appropriate governance and regulatory approach to counter algorithmic and data bias that results in discrimination in healthcare-related AI, before concluding in Part VIII.