Modern Law - Droit Moderne

Episode 8: Technology and the Charter’s future

Episode Summary

Yves Faguy speaks with Lex Gill, a research fellow at the Citizen Lab and a lawyer at Trudel Johnston & Lespérance in Montreal, about how emerging technologies are making us rethink our protections under the Charter.

Episode Notes

Yves Faguy speaks with Lex Gill, a research fellow at the Citizen Lab and a lawyer at Trudel Johnston & Lespérance in Montreal, about how emerging technologies are making us rethink our protections under the Charter.

To contact us (please include "Podcast" in the subject line): national@cba.org

Episode Transcription

Technology and the Charter’s future

Yves Faguy: You’re listening to Modern Law, presented by the Canadian Bar Association’s national magazine.

Hi. I’m Yves Faguy. In this episode of Modern Law we discuss how emerging technologies are making us rethink our protections under the charter. It’s April 4th, 2022, and the Canadian Charter is turning 40 this month. That doesn’t seem very old, at least for those of us who were around when it was enshrined. Yet in that short time it’s had plenty of opportunity to evolve as part of our constitutional living tree – changing with the times while hopefully staying rooted in its core values.

But the drafters of the charter conceived it as an instrument intended to bind the actions of the state, not private actors. And who could have foreseen four decades ago the extent to which technology would completely change the Canadian policy space – one in which business actors are not only expected to be advocates for human rights but also one in which they play an outsized role in our freedom of expression debates, and act as gatekeepers of our personal data?

So the question we’re asking today is this: can our charter really keep pace with technological advancement on the scale that we’ve seen, and all that comes with it?

To help us think this through, I have Lex Gill on our show today. She's a research fellow at the Citizen Lab, at the Munk School of Global Affairs and Public Policy. For those of you who are familiar with it, the Citizen Lab focuses its research and policy input on issues that lie at the intersection of information and communications technology, human rights and global security. Lex is also a lawyer at Trudel Johnston & Lespérance, a class action and public interest litigation firm based in Montreal. And because she obviously has a lot of time to kill, she teaches a course at McGill law on technology and the Charter.

Welcome to the show, Lex. Thanks for joining us.

Lex Gill: Thank you for having me, and happy birthday to the charter.

Yves Faguy: And happy birthday to the charter, which is in a couple weeks. Before we dive into that, let's get to know you a little bit. Tell us about your practice, which seems to cover quite a bit, and how you became interested in tech and charter rights.

Lex Gill: Yeah. So my day job is as a litigator at TJL. We do both plaintiff-side class actions and public interest litigation, but it's not really right to describe it that way, in the sense that those two things aren't really that different. The entire orientation of the firm is really towards greater social, economic and environmental justice. And we pick all of our cases – the ones that pay and the ones that don't – with that in mind.

So in that sense our practice is kind of procedurally specialized but really substantively diverse, and it touches all kinds of emerging areas of public and private law with a real focus on state and corporate accountability. So it's a great place to be if you're interested in messy, unsolved problems and in building a legal system that is more just.

Yves Faguy: You told me once that just because the law hasn't been argued before, it's not a reason not to take on the case. Is that it?

Lex Gill: Oh, that's it, and I can imagine that coming out of the mouth of any of the three partners of the firm. [Laughter] You know? And, I mean, that is very much part of the ethos of the firm – that where there is an injustice there's work to do. My practice is pretty eclectic and I don't actually do that much technology law in my day-to-day there, though I do remain a fellow at the Citizen Lab, where I used to work, and I'm still very actively involved on issues of technology law – particularly things like freedom of expression and privacy and equality in the Canadian law context.

Yves Faguy: Let me just ask you: so what is it that you do? Tell me about your work at the Citizen Lab now.

Lex Gill: Yeah. So I'm a fellow at the Citizen Lab, where I used to work full-time, and I'm still really actively involved there on issues of freedom of expression and privacy and equality, particularly in a Canadian law context. The Citizen Lab has largely become known for its work internationally, but the focus of my involvement with the organization in recent years has really been on the ways that these kinds of issues manifest in areas like Canadian national security law, online regulation and all the rest.

And so that's where I wear my technology hat much more clearly, but it's been informed also by other roles I've had in civil society over the years. I did a couple of different stints at CIPPIC at the University of Ottawa and at the Canadian Civil Liberties Association – largely focused on national security and privacy. And, actually, I'm proud to count both CIPPIC and CCLA among our list of clients now. But I've been involved in this space for a number of years and in a few different capacities.

And I guess I'm also a former Supreme Court clerk, to Justice Wagner, and I can't help but think that that informs everything I do and the ways that I think about legal change and the possibility for legal change before appellate courts.

So it all sounds really weird and messy and interdisciplinary when I say it like that, but I guess the common thread is a preoccupation with systemic and unsolved problems in the law. And technology – I'll paraphrase Phil Rogaway, who's a really well-known cryptographer, and say that technology rearranges power. So in that sense, if you're interested in justice, in rebalancing the scales and making the world a more just place, you can't help but be preoccupied by the ways in which technology is shaping and reshaping human rights and what is legally possible in the world.

Yves Faguy: I like that, and I'll take that notion of how it rearranges power. Because, you know, the charter governs our relations with the state. But what is it about this iteration of the technological age that is completely scrambling up those power structures?

Lex Gill: Well, I guess my question in response is: compared to what? I think there's some truth to the sense that the role that private actors play in the digital context is just massively outsized today, and that's reorganized who mediates things like privacy and freedom of expression. But it's not like 40 years ago, or pre-Google or whatever, we were living in a world where people had great protections for these human rights and civil liberties. Before it was the big tech companies it was the church; it was your boss. Or it was the local newspaper run by the guy who goes to your church and who golfs with your boss.

There's really no doubt that the rise of these big technology companies – the Googles and Facebooks – has changed the game, and there's no doubt that those actors play a really outsized role in mediating the limits and possibilities of human rights and civil liberties. But we also have to understand the "both/and" of these emerging technologies, and we need to understand them in their proper historical context. So is – what's the word you used? Muddled or –

Yves Faguy: Scrambled.

Lex Gill: Scrambled. That’s right. I mean, I’m not sure that the technology has scrambled the charter but it might be making more apparent where the gaps have always been.

Yves Faguy: And so where are those gaps as you see them today?

Lex Gill: Yeah. I think that there has always been an issue with the charter – that it doesn't apply to private actors. And I think that in the 21st century perhaps that's becoming more untenable. When we talk about that negative space, I do think that we're talking about a couple of things. The first is that it is largely large, powerful, private actors – normally foreign-domiciled corporations, multinationals – that mediate and define the boundaries of what rights like freedom of expression and privacy and equality look like online.

That's one piece of it. The second is the increasing role that those private actors play in the creation of tools that ultimately end up being adopted by the state for other applications. We can talk about that too. So things like policing technology or automated decision systems.

Yves Faguy: Facial recognition.

Lex Gill: Facial recognition. Exactly.

Yves Faguy: Yeah.

Lex Gill: Yeah. So these kinds of technologies are products that are often developed by the private sector and then sold to meet public needs, and there I think there's a real gap in terms of the judicial review that's practically available when those kinds of technologies are used. And even just real barriers to basic disclosure around when those technologies are used. So that's a gap.

Yves Faguy: So let's talk about that, actually, a little bit. What's the issue there? Is it that the state is outsourcing some of its functions to private actors, or that it's not paying attention to the quality of those products and the extent to which they could be hurting our rights?

Lex Gill: Yeah. I think it’s both. On one hand these kinds of potentially rights-violating technologies are being adopted by state actors at an incredible pace, and it’s often very difficult to directly challenge their use or adoption in court. So, you know, you can imagine that in all kinds of different cases.

But, you know, one example that I think is really easy is something called IMSI catchers which are basically little bits of hardware that pretend to be a cellphone tower and they capture all kinds of identifying information associated with a person’s device or their phone number as they move through a public space.

It's a kind of surveillance technology that was developed years and years ago in the context of national security, and it was used for those purposes. But gradually it became a much more routine part of law enforcement investigative techniques, and was used in Canada in a variety of contexts. For years people were doing access to information requests and getting "neither confirmed nor denied" notices. Eventually information about their use started to trickle out in the context of criminal trial disclosure. But absent, you know, some really novel, weird section 8 class action or something, there's no situation in which you could directly challenge the adoption and use of those technologies.
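To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of why IMSI catchers work – the Phone and Tower classes are invented for this example, and no real radio protocol is implemented. The point is the trust asymmetry: handsets identify themselves to whatever "tower" answers, with no way to authenticate it.

```python
# Toy model only: illustrates the trust asymmetry IMSI catchers exploit,
# not any real cellular stack.
from dataclasses import dataclass, field

@dataclass
class Tower:
    name: str
    log: list = field(default_factory=list)

    def receive_attach_request(self, imsi: str) -> None:
        # A legitimate tower routes traffic; a catcher just records the ID.
        self.log.append(imsi)

@dataclass
class Phone:
    imsi: str  # subscriber identity tied to the SIM

    def attach(self, tower: Tower) -> None:
        # In GSM-era protocols the handset reveals its identity to the
        # strongest "tower" it hears, before any authentication of that
        # tower takes place.
        tower.receive_attach_request(self.imsi)

# An IMSI catcher is simply a Tower that advertises the strongest signal.
catcher = Tower(name="fake-cell-0")
for passerby in [Phone(imsi="302720-redacted-1"), Phone(imsi="302610-redacted-2")]:
    passerby.attach(catcher)

print(catcher.log)  # the operator now holds identifiers for everyone in range
```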

So on one hand the government is adopting new kinds of rights-violating technologies that don't have a natural or obvious path towards judicial oversight. And then on the other hand, the state is responsible for regulating the development of these kinds of technologies – what we at the Citizen Lab often refer to as dual use. So technologies that may have some beneficial utility, for example in the security context, but that can easily be abused.

So we would put those in categories like surveillance technology, or censorship and content filtering technology. You can think about things like keyword filtering tools that might be innocuous when they're adopted to filter out porn in your local library, but are profoundly problematic in the context of nation-state-level ISP filtering, right?

It's the same fundamental technology but the applications are different. And the state plays a role in regulating that stuff, but it ought to be playing a much bigger one, right? And again, we see the same kind of trickle-down effect, where these technologies might be built for use by nation-states but they creep into commercial uses. So the kinds of surveillance technologies that might once have been reserved for, I don't know, high-level statecraft – we're seeing surveillance tools that are now being marketed for everyday policing, or for workplace surveillance of employees, or even for use in the context of interpersonal violence. And there's even a word for that, which is stalkerware – the ways in which commercial spyware is actively marketed towards parents and abusers in the domestic violence context.

So we have to understand that this is a – it’s a marketplace and it can be regulated like a marketplace, and that involves a whole spectrum of tools – you know, regulatory action.
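To make the dual-use point concrete, here is a minimal Python sketch, with blocklists and URLs fabricated for the example: the filtering code is identical whether it runs in a library or at a national ISP; only the list it is handed and the point of deployment change.

```python
# Minimal keyword filter: the same code serves very different masters
# depending only on the blocklist it is given.
LIBRARY_BLOCKLIST = {"porn"}                      # a local content policy
STATE_BLOCKLIST = {"protest", "union", "strike"}  # a hypothetical censor's list

def is_blocked(url: str, blocklist: set[str]) -> bool:
    """Return True if any blocked keyword appears in the URL."""
    return any(keyword in url.lower() for keyword in blocklist)

for url in ["example.com/porn", "example.com/union-drive"]:
    print(url,
          "| library filter:", is_blocked(url, LIBRARY_BLOCKLIST),
          "| ISP filter:", is_blocked(url, STATE_BLOCKLIST))
```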

Yves Faguy: So what seems to be the problem in terms of the government regulating these, I guess, these dual purpose technologies? Is it that they don’t know how to or is it that, you know, they don’t want to?

Lex Gill: Well, in some ways it's the nature of the technology itself, right? The fact that something is a dual purpose technology means that it does have some benefits. Encryption technology might be a good example of that, where encryption massively enables freedom of expression and security. But the government will be quick to tell you that it also enables child predators and bad people on the internet and terrorists and all the rest.

But the reality, too, when we start talking about other kinds of dual-use technologies like commercial spyware or technologies of internet filtering and content moderation, is that some of those technologies are funded directly by the government. In other cases there's just an issue of expertise, or of political will that isn't there. And I think part of the problem too is that the companies that develop these technologies often see the marketplace as a whole, but the state is only going to be interacting with these products and problems in its own little silos. So the national security people don't necessarily see anything wrong with the proliferation of commercial spyware, but the people working on domestic violence certainly do, and they're not the same people.

And so we have to take an ecosystem approach and think about the ways in which these technologies are really developed to meet all kinds of different needs and create all kinds of different harms at the same time.

Yves Faguy: Now, I think what’s also coming up, though, is this notion that, you know, we often expect the state to regulate these things, and to regulate anything that could be harmful to its citizenry. But how much has the landscape changed in terms of shifting that responsibility to private actors? And this may apply to the technology you’re discussing but it could be anything else as well that might be out there.

Lex Gill: It's complicated. I'm not sure that I can generalize a response to that question, honestly. I think freedom of expression is actually a good way to tackle it, right? There really is a shift in what people expect the state to do and what people expect the private sector to do. And I think that the way we're talking about freedom of expression now, at least among a younger generation of lawyers and scholars, is just a lot more nuanced than the debates of previous generations, where the idea was either that you were in favour of a free and liberal marketplace of ideas – where you had a relatively absolute view of that issue – or you were just militantly opposed to the proliferation of harmful expression in media. The strongest example being hate speech, of course, and that is the foundation upon which a lot of our section 2(b) jurisprudence is built.

But we also see parallel debates with pornography or violent video games, and I think today the conversation is just much more textured and thoughtful around freedom of expression, because decades of good scholarship and good advocacy have opened the door to an analysis that's focused around power.

So the challenge is, we need to start from the understanding that social media platforms do allow the proliferation of all kinds of harmful content – hate speech and child exploitation imagery and non-consensual distribution of intimate images and all these other kinds of really serious harms. But, at the same time, narratives about safety and violence and victimhood, we've learned, are often a pretext to open the door to more intrusive kinds of state power, to greater police powers and intelligence powers. And the online harms proposal from the federal government just before this last election is a perfect example of that.

And so when we think about who's responsible – who's responsible for intervening in cases of harmful content online, and how we think about freedom of expression – necessarily there's been a shift in the role that private actors play in that conversation, as well as the role of the state. But I think part of what's going on is not just this idea that perhaps private actors like social media companies need to take up greater space in regulating those kinds of conversations. There's also a diminishing trust in the enforcement tools – in particular the enforcement tool of the criminal law – to solve these harms. At minimum, I think people are more skeptical that when government talks about intervening to protect people online, the tools it's going to use, which is normally the criminal law, are the right toolkit to respond to these kinds of harms.

And so there have been really generative conversations around things like better media literacy, de-radicalization efforts, and harm reduction-type strategies – things like proactive removal of non-consensual distribution of intimate imagery as one potential solution, without having to engage the entire universe of criminal law investigation and enforcement.

So I think that the conversation around freedom of expression has just become more nuanced in that sense, in part because the power analysis is different. And in the digital context we see real examples of the ways in which vulnerable and marginalized people are kind of weaponized to justify new forms of law enforcement and intelligence powers and there –

Yves Faguy: How so?

Lex Gill: Well, I think the canonical example of that is a debate that happened in the United States a few years ago around legislation called SESTA/FOSTA, which was notionally about protecting victims of trafficking but really ended up facilitating the surveillance and criminalization of sex workers and other vulnerable folks.

And so I think we have to be very cautious when we think about harm. And we have to think about the ways in which the internet is, again, this sort of "both/and": when we protect freedom of expression online we're also protecting dissidents campaigning against their governments, or organizers running a union campaign, or survivors denouncing their abusers, or people seeking access to an abortion. When you think about global freedom of expression online, one of the categories of speech that's most often censored is women's health information.

So you have to understand these technologies at a systems level, and understand that when you build the technical capacity to remove content, you're building a tool that is easy to repurpose and where function creep is really possible. That doesn't mean that the government doesn't have a role to play, and it doesn't mean that the platforms don't have a role to play. The reason why most people who work on these issues think of them as extremely hard issues is because they understand that at the infrastructure level, we're opening the door to technologies that can really facilitate different kinds of serious rights violations.

Yves Faguy: Yeah. And I think there's a word that you employed earlier, which is trust. One of the things that we're also struggling with, beyond just the pure technological component of it, is that there has been a lot of talk about our diminishing trust in institutions. It's perhaps not aided by the fact that we live in an information ecosystem that – some might say – has lost its mind. But, again, just to bring this back to this notion that the public policy space has blurred between private and public sectors: if we're talking about the rules that govern our freedom of expression – because there are rules that govern our freedom of expression – how do we ensure that there's enough trust in the right institutions to make sure that that's done in a way that continues building trust as opposed to eroding it?

I mean, sorry, it's a bit of a big question, it's a bit of a heavy one. But we trust our institutions because we know that hospitals will deliver healthcare to us, and that here in Canada the state will deliver a certain amount of healthcare to us. We invest our trust in these institutions. But what happens when you don't know who's moderating free expression online?

Lex Gill: Yeah, I mean, it's a huge question. It's one that I don't have a satisfying answer to, and I think there are few people who do have a complete and satisfying answer.

I think that part of the answer to that question is around process – thinking about what fair process looks like, rather than thinking too much about outcomes at the outset. Our jurisprudence's approach to regulating speech under section 2(b) is what I often refer to as extremely artisanal. It's a really contextual analysis. It looks at power. It looks at the actual words being used. It looks at the images in context, right? And that's a huge part of its strength. But it is an approach that is also massively unwieldy if we try to apply it to, for example, a social media platform where there are billions of pieces of information flying around all at once.

And so rather than thinking about what a perfectly safe or true media environment might look like – because I don't think that's attainable, or maybe even desirable – I think we need to think about what fair process looks like in situations where content might be removed or downgraded or made more difficult to access. And about investing in institutions that build trust through practice, right? That is why we trust our courts. We don't trust our courts because we think that we're guaranteed outcomes; we trust our courts because there is a fair process available to resolve disputes and adjudicate rights.

And in thinking about speech and other kinds of rights abuses online, we have to think in a similar way. If we're serious about the idea of building trust then it's really about building process first.

Yves Faguy: What about the rights themselves? We talk about how the charter and other human rights instruments are rooted in a certain set of core values. But do we need to rethink a little bit what's fundamental in terms of our rights and freedoms, as we find ourselves having to shift the focus of concern to our relationship with private actors, not just the state?

Lex Gill: So there's a question about the substantive rights in there, and there's also a question about the scope of application. In terms of the substantive rights I find myself pretty optimistic, in the sense that lately, for a case, I've been reading a lot of jurisprudence about section 7 and section 15 from the '90s and early 2000s. And those two rights are unrecognizable today. The charter has massively, massively evolved to give those two charter rights real meaning and to give claimants real space to fill the charter with what really matters.

And so I think in terms of substantive rights I’m optimistic on the charter’s ability to evolve, to meet the sort of most important needs and issues that people face, and I think technology is –

Yves Faguy: So in that sense it’s resilient.

Lex Gill: So in that sense it’s resilient, it’s dynamic, it evolves. I feel hopeful about that aspect of the charter.

But on your other point, about this question of application: the reality is that there's no amount of stretching that will have the federal charter apply to purely private actors. I think we can get creative and practical and purposive about situations where the state has effectively delegated public roles to tech companies – I think there, there's work to do.

The reality is that we have to look a little outside of the charter to really make sure that we're fully protecting human rights and civil liberties in Canada. I practice in Quebec, and so we have an easy tool for that, which is the Quebec charter. It creates a private cause of action for human rights violations, and it applies to large technology companies just as it applies to any person – physical or moral – in the province. And so I think there the path is obvious.

Yves Faguy: How is that – sorry, but how is that any different than human rights legislation in other provinces?

Lex Gill: Right. OK. So the Quebec charter is incredibly special, actually. Human rights codes at the provincial level vary, but none of them are as far-reaching or as ambitious, I guess, in terms of their scope and reach. For those common law lawyers who have never taken a look at the Quebec charter, hop on CanLII and give it a read. It looks an awful lot like the federal charter in some ways, but it really applies to everyone, and it gives you a right to claim damages and other remedies, including punitive damages, in litigation under the Quebec charter.

So, I mean, in that sense, obviously it applies to the usual suspects – the employers and the landlords and so on. But it's much more far-reaching. I have a case – I won't talk about it here – seeking purely punitive damages against Facebook for violation of users' privacy rights, in contracting and providing their information to third parties on the platform. In that kind of case, where in a common law province you might raise a private tort like intrusion upon seclusion or something like that, there's really a direct cause of action for a violation of one's right to private life under the Quebec charter.

And so we might imagine a development of human rights laws in other provinces that would impose those kinds of obligations on everyone much more explicitly, rather than hoping that the common law will gradually evolve to create more kinds of actionable rights in the private law for these kinds of issues. So that's one way to think about it.

Yves Faguy: Are we seeing any of that happening?

Lex Gill: Yeah. [Laughter] Slowly. Slowly. I think that one avenue for that is really the introduction of clearer private causes of action into various forms of quasi-constitutional legislation. That could look like human rights codes, but it could also look like privacy laws, right? So we can think about PIPEDA, which has been dying for reform for many moons now. PIPEDA reform – I feel like I talked about that in an interview five years ago and I'll talk about it in an interview five years from now.

So PIPEDA reform is one place to look. But the law evolves, right? The law is complicated. And so I think that lawyers who are interested in these issues are finding all kinds of novel and important ways, in tort and consumer protection, to get these kinds of issues before the courts. But legislative reform has a role to play here too, in creating clearer pathways to debating these kinds of issues before the courts, and I think that's really important.

Yves Faguy: It's a good point, actually – one that Maroussia Lévesque raised in an earlier interview – that lawyers, and constitutional scholars in particular, can perhaps sometimes get a little stuck in their ivory towers. And I don't want to paint all of them with the same broad brush. But the point she was trying to make was that constitutional scholars and experts have to really start doing more in terms of cultivating their own tech competence so that they can argue these rights more effectively. Do you have any thoughts on that? Is that happening?

Lex Gill: Yeah. I mean, I think it is changing very, very quickly. I think even 10 years ago you would have been hard-pressed to find many scholars in Canada doing novel work on Canadian legal issues related to technology and human rights. And now we are very lucky to have a real roster of brilliant, original thinkers in the Canadian context.

Even just thinking about the lawyers in and around the Citizen Lab, there are people like Cynthia Khoo and Kate Robertson and Jon Penney and Siena Anstis – people who really have been developing that expertise, bringing those issues before the courts, in their own scholarship and in representing interveners. Academia is always a bit of a mixed bag, right? But among civil society organizations, I think we've seen them be really nimble, developing this expertise in innovative and important ways that make sense for their existing mandates.

So, for example, over the last half decade the CCLA built out an entire privacy program kind of out of thin air. LEAF has developed a real specialization on issues of platform regulation and gender-based violence in the online sphere. So I think that we're seeing this expertise develop, but in ways that make sense.

And of course the criminal defence bar is at ground zero for a lot of these issues, in terms of the ways that emerging technologies are being adopted in the policing context. So I think that we're getting there, and I think that the courts are really interested in these issues.

Yves Faguy: I was about to ask: what about the courts? How are they responding? Probably a mixed bag too, but.

Lex Gill: Probably a mixed bag too, you know, but I think there are certain judges – at the Federal Court, for example, I'm thinking of someone like Justice Mosley, who's written a lot on different kinds of emerging technologies in the national security context – and there's a sort of judicial innovation around intrusion upon seclusion which is having some consequences in the context of emerging technologies. I think there's a real openness from the courts. And the Supreme Court, I have to say – I don't agree with every decision in this area, certainly, but I think they've made a real effort to lead the way too in understanding the relationship between human rights and emerging technologies. So, again, I'm optimistic.

The problem is, of course, that no matter how eager judges are or how good the courts are, the justice system is slow, and technology moves really fast. And so the reality is that at the best of times you're getting an issue like the ones we're discussing before the courts in a couple of years. And if you're dealing with a sophisticated defendant, add a couple of years to that, because you're going to deal with arguments about jurisdiction and interpretation and applicability and all kinds of fun interlocutory stuff before you ever actually get a decision on the merits, right?

So, I think that part of the challenge is that technology moves fast and our justice system isn’t necessarily set up to adjudicate those kinds of issues.

Yves Faguy: So, back to the private sector again: we do see businesses at least paying lip service, if not trying to do more, to drawing some sort of inspiration from our human rights models – to, I guess, I don't want to say police or enforce protections, but to make sure that at least in their sphere of influence people are trying to follow some of those human rights laws. Is that just a little bit too idealistic and –

Lex Gill: Yeah. I mean, I guess it depends on the private actor you're talking about. But it used to be that the tech sector really took pride in painting itself as a kind of wild west, in its inability to be regulated, and that's really been replaced with a discourse of self-regulation and of responsibility. So now these big commercial actors, almost without fail, really proactively assert their respect for human rights and privacy and speech and equality.

Yves Faguy: Does it stand up to scrutiny?

Lex Gill: Well, those aren't really rights at all, right? They're marketing. They're only rights once those companies accept that they create actual obligations, that violations of those obligations are actionable, and that they're ready to be held responsible when they let users and individuals down. And on that front I think we have a long way to go. So that's part of my answer to that.

Yves Faguy: Well, so if there is, then, this negative space outside the charter where our protections are not being fully defended, how do we remedy that, moving forward? Does the state have to pick up a bigger role somehow, and does it have to figure out a way to connect the dots so that these protections are somehow followed and respected – or at least available – in the private sphere as well?

Lex Gill: Yeah. I mean, again, I think it depends on the specific rights we're talking about and the specific private actors. But again, my focus here is really on process, on remedies. I think that there are real challenges. And the federal government's proposal around online harms – what they're now calling online safety – is a good example of this, where they're seeking an approach to regulate every bad thing that can happen on the internet at once. And I think the risk there – well, there are lots of risks there, but –

Yves Faguy: Overreach?

Lex Gill: Overreach is one of them; inefficiency is another; adopting legislation that will be tied up in constitutional litigation for a decade before it sees the light of day is a third. Certainly. But seeing that the government's attempt to intervene in this space has been fraught doesn't mean that there's no role for the government to play at all; we just have to be thoughtful and strategic about where that work can best be done. And I would love to see the government start with issues that are very clearly within its own sphere of regulation.

So one issue I care a lot about is automated decision-making and the ways in which government is increasingly adopting technologies that replace or augment human decision-making. That can be in areas like risk scoring and sentencing; it can be bail hearings; it can be welfare and social assistance applications; it can be immigration and refugee applications. We're really seeing a trend where the government, in order to render its decision-making processes more efficient, has adopted these kinds of technologies. And the questions around how charter issues are going to play out in that sphere are incredibly fascinating and –

Yves Faguy: What are they?

Lex Gill: I would love to see clear – well, I think there are a couple of things. One: there's a whole barrel of monkeys of messy judicial review questions. How do you apply Vavilov to the decision of an automated system? What kind of disclosure do you get when a decision about you is made in whole or in part by a machine? What do we do about the fact that the algorithms making these decisions are normally owned by private companies, and they're proprietary, and often sheltered from disclosure on that basis? How do you think about issues like due process? And, most importantly, how do we ensure that these technologies aren't just reproducing the existing issues of bias and discrimination that we know are pervasive in our legal system?

So, in the criminal justice system – and I would point to a massive report that Cynthia Khoo and Kate Robertson and Yolanda Song wrote for the Citizen Lab, called To Surveil and Predict, all about the ways in which automated technologies are playing out or will be playing out in the criminal justice system – what does that mean for the policing of certain neighbourhoods? For the targeting of certain groups – Black individuals and other racialized individuals, for example – that we know are already disproportionately targeted by law enforcement?

So I think that there's going to be a lot of work to do in educating the courts, but also in developing really smart jurisprudence contending with the way that these kinds of systems can create the illusion of rationality. There's really this idea that if we get a computer involved then the process is suddenly scientific and rational and efficient. But often what we're actually doing is giving the state greater license to replace those discretionary exercises of public power with an automated process that just reproduces, and in some cases exacerbates, existing forms of inequality. So we need a really cogent framework.
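As a minimal sketch of that "illusion of rationality" point – with entirely fabricated data and a trivial stand-in for a proprietary scoring model – here is how a risk score keyed to historical arrest records reproduces the disparity in past policing and presents it as an objective output:

```python
# Fabricated illustration: arrest history reflects where police patrolled,
# not where offences actually occurred, so a score built on it launders
# past enforcement patterns into a number that looks objective.
history = [
    ("over-policed area", 3), ("over-policed area", 2), ("over-policed area", 4),
    ("other area", 0), ("other area", 1), ("other area", 0),
]

def risk_score(prior_arrests: int) -> float:
    # Trivial stand-in for a proprietary model: more recorded arrests,
    # higher "risk".
    return min(1.0, prior_arrests / 4)

by_area: dict[str, list[float]] = {}
for area, arrests in history:
    by_area.setdefault(area, []).append(risk_score(arrests))

for area, scores in by_area.items():
    print(area, "mean score:", round(sum(scores) / len(scores), 2))
# The gap in mean scores mirrors the gap in past policing, not in conduct.
```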

Yves Faguy: I suppose there's a positive message coming from you, which is that, at a substantive level, the charter is resilient and it's not broken – but that we have some work to do in terms of fixing some of the processes, and maybe rebuilding some of the trust in our institutions, to make sure that these protections are properly defended. But how do you see Canadian constitutional law evolving in response to further technological change over the next decade or so? Because it's not like it's stopping here. We have quantum computing; we obviously have artificial intelligence; and I'm not sure where the metaverse is taking us, or if it will take us anywhere.

But it seems the pace of foundational, revolutionary technologies is just picking up steam. So where does constitutional interpretation go from here?

Lex Gill: Yeah. OK. So, far be it from me to propose some sort of grand, unified theory about how the constitution might evolve, but there are a couple of areas where I think it's easier to see where there's constructive work to be done and where the courts might be ready to pick up the ball. One of those issues is, as we were just discussing, automated decision-making – frameworks for reviewing those kinds of decisions and creating some guardrails to make sure that technologies aren't adopted in ways that exacerbate rights violations. So I think that's one area.

I think another big question that the courts are going to need to figure out, around privacy in particular, is this idea of privacy in public space. In a completely different context, poor and homeless people have already had to contend with this, of course, as do other groups that are over-policed and marginalized. But I think there's a broader question when we look at the proliferation of cameras with facial recognition, the use of drones, gait monitoring technology in airports, and Sidewalk Labs and the smart cities debate. We can really see how that which is private is increasingly becoming disentangled from that which we own and control.

And the analysis under section 8 is still so entangled with those issues of control. I think that that's shifting – we see it in cases like Jones and Marakah – but ultimately questions of ownership and questions of, quote unquote, private spaces still sort of dominate that analysis. And so when we think about what kinds of expectations of privacy you might have walking around your neighbourhood, I think the courts are really going to have to figure that out in a big way. Because maybe the Sidewalk Labs project in Toronto failed, but it's not going to be the last of its kind, and those types of technologies inherently raise really important constitutional questions.

And there are little hints of where the court might be going in Jarvis, which is not a constitutional case but a case about voyeurism. But I think that there's a lot of work to do there.

And then another direction that I think is important to grapple with, and that we can expect the courts to really look at more closely in the coming ten years – it's maybe a totally different take on this idea of the negative space in the charter – is the negative space created by technologies of circumvention and liberation. So we're looking at the kinds of tools that people are building to get around the sort of pervasive corporate surveillance: encryption tools like Signal that allow people to chat securely, free from government surveillance; anonymity tools like Tor that allow people to browse the web without what they're doing being monitored; or technologies like SecureDrop, which allow people to communicate anonymously and confidentially with journalists – probably one of the foremost guarantors of freedom of expression and freedom of the press in the 21st century. Those technologies pose really fascinating constitutional problems.
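A minimal sketch of the property at stake, using the widely available Python cryptography package – a single shared symmetric Fernet key, which is far simpler than Signal's actual protocol: once two people hold the key, any intermediary, whether an ISP, a platform or a state, observes only ciphertext.

```python
# Requires: pip install cryptography. Simplified stand-in for end-to-end
# encryption; real messengers use per-message keys and authenticated key
# exchange, not a single shared Fernet key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # shared out-of-band between the two parties
channel = Fernet(key)

ciphertext = channel.encrypt(b"meet at the union hall at 7")
print(ciphertext)                   # all an eavesdropper on the wire sees
print(channel.decrypt(ciphertext))  # only a key-holder recovers the message
```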

And governments have a real interest in limiting access to those technologies, because they allow people to get around infrastructures of surveillance and control. And of course there's a trade-off to that, right? They're also used to facilitate crimes and other abuses. But the reality is that in a lot of countries these technologies are the final frontier protecting people's ability to organize against a bad boss or an even worse government.

And so the charter – our constitutional order – is all about balancing. But the reality is that if you're going to accept that these kinds of technologies exist, you have to accept that there are certain places that are a little bit beyond the state's reach. And I think that's going to be a hard conversation, but a really important one to have in our legal system in the coming years.

I've done a lot of work on encryption issues, and that's partly because I think that even the widespread use of these technologies is just not incompatible with the rule of law. In fact, it's really essential to democratic life in the 21st century. The law needs a certain kind of imperfection, some kind of imperfect enforcement. It needs some friction, and these are technologies of friction that, in their own ways, really protect and enable human rights.

And so I think that that, when I think about what the messy, hard problems for the courts are in the next 10 or more years, those are some of the things that come to mind.

Yves Faguy: Yeah. And it’s not, you know, it’s not necessarily a crime to want to carve out a private space for yourself as –

Lex Gill: Far from it. It is your internationally and constitutionally protected human right.

Yves Faguy: Yeah. I think that’s a great place to end. We’re out of time anyway but I want to thank you so much, Lex Gill, for joining us today. That was a great conversation.

Lex Gill: Yeah. Thank you so, so much for having me. This was great.

Yves Faguy: We hope you enjoyed this episode of Modern Law, one of our CBA podcasts. You can hear this podcast and others on our main CBA channel, on Spotify, Apple Podcasts, Google Podcasts and Stitcher. Subscribe to receive notifications for new episodes, and to hear some French, listen to Droit Moderne.

If you enjoyed this episode, please share it with your friends and colleagues, and if you have any comments, feedback or suggestions, feel free to reach out to us on Twitter @cbanatmag and on Facebook. And check out our coverage of legal affairs at nationalmagazine.ca.

A big thank you to ACD Production for its help in editing. We’ll catch you next month.