How to craft privacy policies for both Canada and the U.S.
Konata Lake (00:06): Today, we're going to be talking to you about privacy. Molly, maybe to start off, why don't you talk to us about privacy regulations generally, and particularly how they impact fintechs and what you're seeing?
Molly Reynolds (00:22): So we are already working with quite a large patchwork of privacy regulation in Canada. We have a federal privacy law that applies to private sector organizations, and we have provincial privacy laws in several provinces. And all of those are undergoing a lot of reform over the next two years. We are seeing a big focus in those legislative reforms on more transparency to consumers about how their data is being used and why, and a lot more focus on meaningful consent and real choices for consumers on how their data is used. There is an increased focus on deletion of data to actually minimize the risk of breaches, and an enhancement of consumer rights, including access to their information and data portability. And all of that, if the reforms go forward, is going to be wrapped in significant new penalties and fines under these federal and provincial regimes. In addition to that, Konata, we also have sector-specific rules that may apply either directly or indirectly to fintechs in the financial space, for example, Mutual Fund Dealers Association and OSFI requirements to report cybersecurity breaches. So we're already dealing with quite a bit of consumer-friendly privacy legislation in the financial space, and we're only going to see it get more complex and a little bit more burdensome in the years to come.
Konata Lake (01:56): You mentioned the regulation is now mandating or requiring deletion of data. Certainly for fintech companies, one of the main reasons they get into the space is data: collection of data, utilization of data. So maybe you can talk a bit about the policy driving those deletion requirements, how that's being balanced against companies' need to utilize and manipulate data, and how the rules will thread that needle.
Konata Lake (03:24): And maybe talk a bit about what we're seeing in terms of litigation in regard to privacy.
Molly Reynolds (03:31): So across Canada, we continue to see a huge wave of privacy-related litigation, especially class actions. And that's both in the aftermath of a data breach and in circumstances where consumers are saying that an intentional business practice of the company violated their privacy rights, that it went too far or was too creepy. With the upcoming legislative reforms, we will probably see an even larger increase in privacy litigation. And, you know, these cases do claim tens or hundreds of millions of dollars in damages. They may settle for a little bit less, but in many cases, the more successful you are as a company, the greater your risk in this litigation context, because compensation may, for example, be $100 per person. And if you have 100,000 or 500,000 consumers that are affected, that will add up quite quickly. In addition, it's important to note, especially in Québec, that there is a very clear path towards punitive damages for privacy violations. What that means is the court is really punishing the company because it ought to have known better in its compliance practices, even if individuals didn't really suffer any loss or face any harm as a result of the data practice. And under the new Québec privacy law, individuals can seek at least $1,000 per person in punitive damages, which is really going to raise, I think, the risk and the price tag around this litigation.
Konata Lake (05:07): That's a really important point to note, because I think fintech companies, particularly startups, may view this in their risk assessment and say, "I know that I may be non-compliant, but the thing I'm doing is actually not creating any damage. And so I will take my chances and deal with a potential lawsuit." But the point you make is that it's not just about the actual damage that's incurred; there's also the potential for punitive damages. And so I think it's a really important point for fintechs to know as they calibrate the risk and decide on paths forward.
Konata Lake (05:45): And what about our friends to the south in terms of differences between US and Canada that participants in the sector should be aware of?
Molly Reynolds (05:54): Well, I think it's really important, whether you're starting in Canada and then moving into the US or vice versa, to know that although the regimes are largely compatible and you can create a harmonized compliance process for your privacy program, they are not the same. And so it's really important to do some upfront work understanding all of the federal, state and provincial privacy laws that apply, and figuring out what those differences are. Once we really understand the landscape of what we're dealing with, including where you might want to go in your growth plans in the near future, then we can create North America-wide privacy policies and internal compliance programs that meet all of those requirements. They're rarely in conflict; the risk is really just that sometimes we focus too much on one country and risk accidentally being offside the other country's requirements.
Konata Lake (06:49): Maybe talk a bit about what we're seeing in terms of privacy rules and regulations, and regulation of AI in general that participants in the fintech sector should be aware of.
Molly Reynolds (06:59): Yeah, so up until now, we have had only really indirect AI regulation. There might be privacy laws that relate to personal information in that context, or there might be competition regulation, for example. But in June of 2022, the federal government did introduce the Artificial Intelligence and Data Act, and it is the first technology-specific AI regulation in Canada. It hasn't passed yet, but if it is finalized, there are going to be some significant requirements associated with the legislation. The biggest one from the data perspective is that data used in AI technology is going to have to be anonymized. It can't be partially de-identified, and it can't be personal information; it has to actually meet the requirement of being anonymized. In addition to that, any company that's developing, designing or distributing an AI technology is going to have to do a risk and impact assessment and identify, as well as publish, the risk mitigation measures it has taken in response to that assessment.
Konata Lake (08:04): It certainly seems like there's a theme here for participants in the space: understand the regulation, because there are increased compliance obligations and potential compliance costs. So it's certainly something for participants to be aware of and to watch.
Molly Reynolds (08:18): Absolutely.
To discuss these issues, please contact the author(s).
This publication is a general discussion of certain legal and related developments and should not be relied upon as legal advice. If you require legal advice, we would be pleased to discuss the issues in this publication with you, in the context of your particular circumstances.
For permission to republish this or any other publication, contact Janelle Weed.