May 29, 2025

Clearview in the Alberta courts: privacy laws infringe freedom of expression, but mass data scraping for AI facial recognition is still not on the table

In the latest installment of facial recognition company Clearview AI’s litigation saga, the Alberta Court of King’s Bench weighed in on (i) the applicability of provincial privacy laws to extra-provincial companies and (ii) the breadth of the “publicly available information” exception to the requirement for companies to obtain consent under privacy laws. Clearview successfully mounted a constitutional challenge to the wording of the exception, with the result that part of the exception’s restrictive wording was struck out.

What you need to know

  • The Court held that the exception from the requirement to obtain consent for the collection, use and disclosure of “publicly available information” under Alberta’s privacy regulations is unconstitutional, on the basis that the definition of “publicly available information” is too narrow, with the result that the consent requirement restricts the Charter right to freedom of expression more broadly than can be justified.
    • The Court expanded the term to include information in a “publication”, meaning something that has been intentionally made public.
    • The Court specified that this includes personal information (including images) posted to social media without privacy settings.
  • The decision impacts the risk profile of (i) any company that collects personal information from online sources without consent, (ii) the platforms and other companies that make this information available online, and (iii) search engines.
  • Despite the expanded scope of the exception, restrictions still exist: any collection, use or disclosure of personal information must still be reasonable in light of the circumstances. Relevant factors include the original purpose for which the personal information was made publicly available and the commercial nature of any use or commercialization of that information.
  • The Court also confirmed that personal information is subject to the privacy laws of the individual’s jurisdiction of residence—even if the company collecting or using the information is operating elsewhere.

Background

Clearview AI (Clearview) provides facial recognition services by scraping images from online sources such as social media, then using algorithms to analyze and compare scraped images against its database.

In February 2020, the federal Privacy Commissioner and several provincial privacy commissioners, including the Alberta Information and Privacy Commissioner (the Commissioner), launched an investigation into Clearview AI. The commissioners found that Clearview had acted contrary to privacy laws by collecting, using and disclosing personal information (i) without consent and (ii) for purposes that were neither appropriate nor reasonable. In December 2021, the Commissioner ordered Clearview to cease offering facial recognition services in Alberta, cease the collection, use and disclosure of images and biometric facial arrays collected from Albertans, and delete previously collected images and facial arrays.

Clearview sought judicial review of the Commissioner’s order, arguing that: (i) the Commissioner erred in finding that Clearview was subject to Alberta’s privacy legislation and did not have a reasonable purpose for handling personal information; (ii) the Commissioner’s interpretation of what constitutes “publicly available” information was unreasonable; and (iii) the Commissioner’s interpretation of Alberta’s privacy regime was unconstitutional and breached the Charter right to freedom of expression. The first two of these issues were recently considered by the British Columbia Supreme Court in Clearview AI Inc. v. Information and Privacy Commissioner for British Columbia, which we discussed in a recent bulletin.

The decision

The Commissioner’s interpretation of PIPA was reasonable

The Court dismissed Clearview’s application to quash the order.

As in the British Columbia decision, Clearview was unsuccessful in arguing that Alberta’s privacy legislation did not apply to its activities. The Court found that the Personal Information Protection Act (PIPA) applied to Clearview’s image collection and use, as there was a “real and substantial connection” between the province and the company—regardless of the fact that Clearview had ceased carrying on business in the province. The Court noted the need to go beyond the “traditional territorial conception” of jurisdiction in order to protect privacy interests due to the challenges inherent in regulating seemingly “borderless” areas like the Internet.

The Court also upheld the Commissioner’s conclusion that Clearview’s purposes for collecting, using and disclosing personal information were not reasonable. The reasonableness standard in PIPA focuses on what a reasonable person would consider appropriate in the circumstances. In arguing that its purposes were reasonable, Clearview likened its activities to those of Google and other search engines that scrape images and information. However, the Court found that Clearview’s purposes in extracting biometric data were not comparable, even though the source of the information is the same in both scenarios (i.e., the Internet). Therefore, the finding that Clearview’s purposes are unreasonable does not “threaten the operation of the Internet” in general, as Clearview suggested.

The Court agreed with the Commissioner that Clearview’s purposes were potentially harmful, referring to broad-based societal harms caused by mass surveillance. The Court also affirmed that the connection between the purpose for handling personal information scraped from social media and an individual’s original purpose in posting that information is relevant to assessing the reasonableness of purposes in this context.

The narrow “publicly available information” exception in the PIPA Regulations is unconstitutional

PIPA requires organizations to obtain consent for the collection, use and disclosure of personal information, unless the information is “publicly available”. The Court rejected Clearview’s argument that the Commissioner’s narrow interpretation of this exception was unreasonable: like the decision in British Columbia, it recognized the quasi-constitutional interest in protecting privacy and accepted the Commissioner’s reasoning that any exceptions to this important right should be narrowly interpreted.

However, the Court found that the definition of “publicly available information” in the PIPA Regulations breached the Charter right to freedom of expression and is therefore unconstitutional. The Court’s ruling on the constitutional issue stands in contrast with the British Columbia decision, which did not address the question; the ruling did not, however, ultimately affect the validity of the Alberta Commissioner’s order.

The PIPA Regulations breach Clearview’s freedom of expression

The PIPA Regulations define personal information as “publicly available” if it is “contained in a publication, including, but not limited to, a magazine, book or newspaper, whether in printed or electronic form”, provided that the publication is publicly available. The Court found that the text of the provision (rather than the Commissioner’s interpretation) breached Clearview’s freedom of expression for the following reasons:

  1. All expressive activity (commercial or otherwise), short of violence, is protected by the Charter.
  2. Although using a bot to scrape information from the Internet is not itself “expressive content”, doing so with the purpose of gathering images and information can lead to the conveyance of meaning, thereby facilitating “expression”. The Court did not accept the Commissioner’s argument that mass surveillance activities should be excluded from constitutional protection because they conflict with the values underlying freedom of expression.
  3. The Charter applies to Clearview’s expressive activity because Clearview has not permanently removed itself from Alberta, and its expressive activity is restricted by PIPA.

The breach of freedom of expression is not justified

A Charter breach can be justified in certain circumstances, subject to the balancing exercise under section 1. In this case, the Court acknowledged that commercial expression, by its nature, is typically amenable to proportionate regulation.

The Court and Clearview agreed that PIPA and its Regulations serve a pressing and substantial objective, and that their provisions are rationally connected to that objective.

However, the Court found that the narrow “publicly available” exception in the PIPA Regulations renders PIPA overbroad, because it subjects all collection, use and disclosure of any personal information posted on the Internet without privacy settings to a consent requirement. The restriction is “source-based” rather than “purpose-based”, meaning that it applies equally to, for instance, ordinary internet search engines that index online content, even though there is no pressing and substantial justification for requiring consent for that indexing activity.

Though the Court acknowledged that Alberta likely has a legitimate interest in preventing personal information from being used in a facial recognition database like Clearview’s, given the potential harm to individuals’ privacy, the restriction in its current form is overbroad because it also captures expression for which there is no such justification (e.g., ordinary search engines).

To remedy this issue, the Court struck out the words “including, but not limited to, a magazine, book or newspaper” from the definition in the PIPA Regulations, so that “publication” takes on its ordinary meaning of something “intentionally made public”. Despite this change, the collection, use and disclosure of personal information remain subject to the reasonableness requirements under PIPA.

Implications for businesses

Uneven landscape

This decision creates a potentially uneven landscape for businesses (especially those operating across provinces) that scrape publicly posted information from the Internet to train their AI systems. The British Columbia decision, as well as the original privacy commissioners’ findings, suggests that individuals retain control over personal information they post online and that the need to obtain consent to collect and use this information is not obviated simply because users intentionally make it public. This approach is consistent with decisions and guidance from the federal Privacy Commissioner. The Alberta decision, conversely, finds this broad consent requirement with respect to public information unconstitutional.

Note also that the Alberta Court’s constitutional finding may yet be appealed, or rendered moot by privacy legislation reform currently under consideration by the Alberta Legislature.

No carte blanche for screen scraping

However, even if the decision remains authoritative, businesses should not view it as carte blanche to scrape publicly posted information, in Alberta or elsewhere.

Given that this decision breaks with British Columbia’s approach and other privacy regulatory guidance, any national strategy with respect to collecting or scraping data online must still adhere to the limitations set out by other jurisdictions. Even if a business is operating only in Alberta, it would be practically difficult to confirm that all relevant data subjects are resident in Alberta—and unlikely that all data subjects would be Alberta residents in the first place. Moreover, many uses of publicly available information will still be restricted due to the requirement that any collection, use or disclosure must be within the bounds of what a reasonable person would consider appropriate in the circumstances.

Indeed, privacy regulators and courts will likely continue to be wary of mass and indiscriminate scraping of information on the Internet—especially images, particularly when they include children and when the ultimate purpose is in relation to mass surveillance (such as the development of biometric facial recognition arrays).

In short, businesses should be cautious when relying on legislative exemptions, including the “publicly available information” exemption under PIPA, and should consult counsel before doing so.

Use of publicly available information for AI training

Just because personal information is posted publicly online does not mean it can be used to train AI models risk-free. It is also important for companies acquiring data for AI training purposes to ensure that they are clear about the origins of their training data. It may be prudent to purchase curated sets of data rather than collect data through web scraping.

Website and platform operators

Operators of websites with publicly posted user content should be mindful that privacy regulators still expect protections for user data against harmful or unlawful mass scraping practices. This decision highlights the importance of having privacy settings in place for users as a general risk mitigant, as such settings may now be instrumental in providing users with control over whether or not their data will be considered publicly available.

Extraterritorial application

Businesses should also remain mindful that courts do not view the Internet as borderless, and that local privacy regimes may well apply even when a business is not actively conducting its activities in the jurisdiction in question.


To discuss these issues, please contact the author(s).

This publication is a general discussion of certain legal and related developments and should not be relied upon as legal advice. If you require legal advice, we would be pleased to discuss the issues in this publication with you, in the context of your particular circumstances.

For permission to republish this or any other publication, contact Janelle Weed.

© 2025 by Torys LLP.

All rights reserved.