Most international businesses with Canadian customers have no idea they already have active AI compliance obligations. Quebec Law 25 — Canada's most comprehensive provincial privacy law — is fully in force and applies to any organisation that collects or uses the personal information of Quebec residents. Penalties run to $25 million CAD or 4% of global revenue. And this is before Bill C-27, which would create Canada's first federal Artificial Intelligence and Data Act, even reaches the statute books.
The landscape: why Canada is different
Canada's AI compliance picture is more complicated than the EU or UK because it's layered: a federal baseline (PIPEDA), a Quebec law that is already substantially stricter than that baseline, and pending federal legislation (Bill C-27) that would add an entirely separate AI-specific layer on top of both.
For businesses operating in other provinces, PIPEDA applies. For businesses touching Quebec residents — as customers, employees, or users — Quebec Law 25 (formally An Act to modernize legislative provisions as regards the protection of personal information, formerly Bill 64) already applies in full.
The critical point: Law 25 is not coming. It is not "about to come into force." It has been phased in since September 2022; the most demanding phase — including Privacy Impact Assessments, transparency for automated decisions, and mandatory opt-out mechanisms — came into force in September 2023, and the final data portability provisions followed in September 2024. Enforcement is active.
PIPEDA: the federal baseline
The Personal Information Protection and Electronic Documents Act (PIPEDA) has applied to private-sector organisations since 2001. It requires that organisations collect only the personal information they need, identify the purpose before collecting, obtain meaningful consent, and protect the information collected.
For AI systems, PIPEDA creates obligations that most businesses haven't mapped:
- Purpose limitation: If your AI tool was trained or fine-tuned on customer data, the purpose must be documented and disclosed at point of collection.
- Consent: Using personal data for AI-driven profiling, segmentation, or scoring requires consent that is specific to that purpose — not buried in general terms.
- Accountability: You are responsible for personal information in AI systems you use, including third-party SaaS tools. "Our vendor handles compliance" is not a defence.
- Access rights: Individuals have the right to know what personal information you hold about them — including inferences drawn by AI systems.
PIPEDA is enforced by the Office of the Privacy Commissioner of Canada (OPC). While its enforcement powers are weaker than Quebec's Commission d'accès à l'information (CAI), the OPC can investigate complaints, issue findings, and refer matters to Federal Court.
Quebec Law 25: the teeth
Quebec Law 25 is widely described as Canada's GDPR equivalent. For businesses using AI systems that process the personal information of Quebec residents, it is the most consequential law currently in force in Canada.
Privacy Impact Assessments (PIAs) for AI
Law 25 requires a Privacy Impact Assessment for any project involving the acquisition, development, or overhaul of an information system or electronic service delivery system. In practice, this means any AI tool you deploy that processes personal data about Quebec individuals needs a PIA.
The PIA must be completed before deployment, must identify and mitigate privacy risks, and must be documented. Using a new AI-powered CRM, deploying an AI chatbot that collects user data, or running AI-driven analytics on customer behaviour — each requires a PIA under Law 25.
Transparency obligations
Organisations must inform individuals when their personal information is used to make a decision based exclusively on automated processing. This includes:
- Informing the individual that an automated decision was made
- Explaining the factors considered in making that decision
- Providing the individual an opportunity to submit observations
- Providing a mechanism for the individual to request human review of the decision
This is not a hypothetical future obligation. It is in force now. If you use AI to score leads, segment customers, make credit or insurance decisions, or automate any consequential assessment of Quebec individuals — you must be meeting these requirements today.
Human review mechanisms
One of the most operationally significant requirements in Law 25 is the right to human review of automated decisions. Unlike GDPR Article 22 (which applies only to decisions producing "legal or similarly significant effects"), Quebec Law 25 applies broadly to any decision made exclusively on the basis of automated processing.
Many AI-driven tools — scoring engines, recommendation systems, automated approval flows — will need a documented human review pathway added if one doesn't currently exist.
Penalties
The Commission d'accès à l'information can impose administrative monetary penalties of up to $10 million CAD or 2% of worldwide turnover, whichever is higher. Penal offences attract fines of up to $25 million CAD or 4% of worldwide turnover for organisations. These are not hypothetical maximums — the CAI has active enforcement capacity and has issued findings against major organisations.
CASL and AI-generated outreach
Canada's Anti-Spam Legislation (CASL) is often overlooked in AI compliance discussions, but it is directly relevant to any business using AI for outreach, marketing automation, or lead generation that touches Canadian recipients.
CASL's key requirements for AI-driven outreach:
- Express or implied consent required before sending any commercial electronic message (CEM) to a Canadian recipient — including emails generated by AI tools.
- Identification obligation: Every CEM must identify the sender and provide contact information. AI-generated emails that disguise or omit sender identity violate CASL.
- Unsubscribe mechanism: Every CEM must include a functional unsubscribe mechanism, and opt-out requests must be given effect within 10 business days — including in automated sequences.
- No pre-ticked boxes, no bundled consent: Consent for commercial communications must be separate from consent for other purposes — AI-driven form builders should be checked.
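The four requirements above can be turned into a pre-send gate in an outreach pipeline. The sketch below is illustrative only (the field names and `casl_precheck` function are assumptions, and real consent management needs proper records and implied-consent expiry tracking, not a boolean):

```python
from dataclasses import dataclass

@dataclass
class CommercialMessage:
    sender_name: str
    sender_mailing_address: str  # CASL requires sender identification and contact info
    unsubscribe_url: str
    body: str

def casl_precheck(msg: CommercialMessage, has_consent: bool) -> list[str]:
    """Return a list of basic CASL problems; an empty list means these checks pass.
    A sketch only — the 10-business-day unsubscribe deadline and consent
    documentation need real workflow support, not a string check."""
    problems = []
    if not has_consent:
        problems.append("no documented express or implied consent")
    if not msg.sender_name or not msg.sender_mailing_address:
        problems.append("sender identification incomplete")
    if not msg.unsubscribe_url:
        problems.append("no unsubscribe mechanism")
    return problems

msg = CommercialMessage("Acme Ltd", "123 Example St, Toronto", "", "Hi ...")
print(casl_precheck(msg, has_consent=True))  # → ['no unsubscribe mechanism']
```

Wiring a check like this in front of an AI-generated sequence ensures no message leaves the queue without identification and a working opt-out, whatever the model wrote.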
CASL is enforced by the Canadian Radio-television and Telecommunications Commission (CRTC), the Competition Bureau, and the OPC jointly. Penalties run to $1 million CAD for individuals and $10 million CAD for organisations per violation.
Bill C-27 and the Artificial Intelligence and Data Act (AIDA)
Bill C-27 — the Digital Charter Implementation Act — contains three parts: reforms to PIPEDA (renamed the Consumer Privacy Protection Act), a new Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act (AIDA).
AIDA would be Canada's first federal AI-specific law. As of early 2026, Bill C-27 has not been passed — it died on the Order Paper when Parliament was prorogued in January 2025. However, the direction of travel is clear, and businesses that align with the AIDA framework now will be significantly ahead if and when equivalent legislation passes.
AIDA's key obligations as proposed:
- High-impact AI systems: Organisations developing or deploying "high-impact" AI systems (defined by regulation) must implement risk mitigation measures, keep records, and be prepared to publish plain-language descriptions of how systems are used.
- Impact assessments: Mandatory impact assessments for high-impact systems — conceptually similar to, but separate from, the PIA obligations already in force under Law 25.
- Ministerial oversight: The Minister of Innovation would have broad powers to require audits, access records, and order systems to be suspended.
- Serious harm provisions: Systems that cause serious harm to individuals, or produce biased output in consequential decisions, would face injunctions and fines of up to $25 million CAD or 5% of global revenue for the most serious offences.
Even without AIDA, the combination of PIPEDA, Quebec Law 25, and CASL creates substantial compliance obligations for any business using AI with Canadian data subjects.
The cross-border angle: UK and EU businesses with Canadian customers
A common misconception: "We're a UK company, so Canadian law doesn't apply to us."
Quebec Law 25 — like GDPR — applies based on where the data subject is located, not where the organisation is based. If you have Quebec customers, Quebec Law 25 applies to your processing of their personal data. If you have customers anywhere in Canada, PIPEDA applies.
For UK and EU businesses that are already GDPR-compliant, the good news is that the frameworks are structurally similar — consent, purpose limitation, data subject rights, accountability. The Canada-specific additions are manageable: PIAs for AI systems, the human review mechanism for automated decisions, and CASL compliance for outreach. A GDPR-based foundation significantly reduces the work required to meet Quebec Law 25.
For US businesses with Canadian customers, the gap is larger — especially on consent and purpose limitation, which are more strictly interpreted in Canada than in most US state laws.
What action steps look like for most SMBs
For most small and medium businesses, a Canada AI compliance framework needs to cover six areas:
- AI inventory: List every AI system that processes personal data about Canadian individuals — including third-party SaaS tools that use personal data as inputs.
- PIAs for AI deployments: Complete a Privacy Impact Assessment for each AI system identified. Document risks and mitigations before deployment.
- Automated decision transparency: For any AI-driven decision about an individual, create a disclosure mechanism and a documented human review pathway.
- Consent audit: Review consent language for any collection of Canadian personal data used in AI systems. Ensure purpose is specific and consent is meaningful.
- CASL compliance for outreach: Audit any AI-driven email or messaging sequences for CASL compliance — consent documentation, identification, unsubscribe flow.
- AIDA readiness: Map your AI systems against the proposed AIDA high-impact categories now, so you're not caught unprepared when the legislation passes.
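The six areas above hang together as a single inventory exercise. As a sketch of what one inventory row and its gap analysis might look like — field names are illustrative assumptions, not a legal template:

```python
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    """One row in an AI inventory (illustrative fields, not legal advice)."""
    name: str
    vendor: str                   # "internal" or a third-party SaaS provider
    processes_canadian_pi: bool   # personal information about Canadian individuals
    quebec_data_subjects: bool
    pia_completed: bool
    automated_decisions: bool
    human_review_pathway: bool
    sends_cems: bool              # commercial electronic messages → CASL applies

def open_items(r: AISystemRecord) -> list[str]:
    """Map each record to its outstanding compliance actions."""
    gaps = []
    if r.quebec_data_subjects and not r.pia_completed:
        gaps.append("Law 25 PIA required before deployment")
    if r.automated_decisions and not r.human_review_pathway:
        gaps.append("human review pathway missing")
    if r.sends_cems:
        gaps.append("include in CASL outreach audit")
    return gaps

crm = AISystemRecord("AI CRM scoring", "HypotheticalCRM Inc.",
                     processes_canadian_pi=True, quebec_data_subjects=True,
                     pia_completed=False, automated_decisions=True,
                     human_review_pathway=False, sends_cems=False)
print(open_items(crm))  # → ['Law 25 PIA required before deployment', 'human review pathway missing']
```

Even a spreadsheet with these columns does the job; the point is that every AI system, including third-party tools, gets a row and an owner.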
Getting this done
Quebec Law 25 is already in force. PIPEDA applies to every business with Canadian customers. CASL applies to every business that sends commercial messages to Canadian recipients. Most SMBs operating internationally have exposure they haven't assessed.
Our Canada AI compliance packages are fixed-price, built for SMBs, and cover all three active frameworks — Quebec Law 25, PIPEDA, and CASL — plus AIDA readiness. We deliver a documented compliance framework, not a general report.
Canada AI Compliance Packages
Fixed-price frameworks for SMBs. Quebec Law 25, PIPEDA, CASL compliance + AIDA readiness. From £197 (~$330 CAD).
See the Packages →