Canadian law societies have not waited for new legislation before addressing AI in legal practice. The Law Society of Ontario, the Law Society of British Columbia, and the Barreau du Québec have all issued guidance applying existing competence, confidentiality, and supervision obligations to AI use by legal professionals. Quebec Law 25 — Canada's most stringent privacy law, fully in force since September 2023 — governs how law firms process client personal data through AI tools. The Federal Court of Canada has issued a direction requiring disclosure of AI use in materials filed before the court. And PIPEDA creates obligations for firms handling personal data of clients across every province. This guide covers what Canadian law firms must have in place in 2026. It is for you if:
- You use AI for legal research, document drafting, contract review, or discovery analysis
- You have Quebec clients — Quebec Law 25 applies to any AI processing of their personal data
- You file materials in Federal Court or other courts with AI disclosure requirements
- You use AI in any hiring, performance review, or employment decision process at your firm
- You serve EU clients — the EU AI Act can apply extraterritorially where the output of your AI systems is used in the EU, regardless of where your firm is based
Law society guidance: competence and confidentiality already apply
Canada's major law societies have been clear and consistent: existing professional responsibility rules govern AI use in legal practice. No new rules are required — the existing ones already cover it.
Law Society of Ontario
The LSO has confirmed that its Rules of Professional Conduct — particularly the competence obligation under Rule 3.1-2 and the confidentiality obligation under Rule 3.3-1 — apply to AI use. Specifically:
- Lawyers must understand the AI tools they use sufficiently to supervise their output and take responsibility for work product delivered to clients
- Client information uploaded to AI platforms is confidential and must be protected — lawyers must review vendor terms to ensure adequate confidentiality protections exist before using client data with any AI tool
- Supervision obligations require that AI-generated work product is reviewed by a competent lawyer before delivery
Law Society of British Columbia
The LSBC's guidance echoes the same principles, with particular emphasis on the duty of competence under Rule 3.1: lawyers must maintain the skills necessary for their practice area, including understanding the capabilities and limitations of AI tools used in that practice. The LSBC has noted that reliance on AI output without adequate verification is inconsistent with competent representation.
Barreau du Québec
The Barreau has issued specific guidance on the use of AI with client information, emphasising that the professional secrecy obligation — rooted in the Quebec Charter of Human Rights and Freedoms, the Professional Code, and the Code of Professional Conduct of Lawyers — applies to data processed by AI platforms. Attorneys must ensure that client communications, case strategy, and privileged documents are not uploaded to AI tools that lack appropriate contractual confidentiality protections. The Barreau has also noted the compounding effect of Quebec Law 25, which creates statutory obligations on top of professional secrecy for any AI processing of client personal information.
Building an AI compliance framework for your Canadian firm?
Our Canada AI compliance packages cover PIPEDA, Quebec Law 25, professional responsibility mapping, and AI vendor review. Fixed price, delivered in five to seven working days.
See the Canada Compliance Packages →
Quebec Law 25: the most significant active obligation for client data
Quebec Law 25 is fully in force. The final phase — including Privacy Impact Assessments for AI profiling and mandatory human review mechanisms for automated decisions — came into effect in September 2023. Enforcement is active. This is not a future obligation.
For law firms, Quebec Law 25 creates specific requirements when AI processes the personal information of Quebec clients:
- Privacy Impact Assessment (PIA): Required before deploying any AI system that processes client personal information. The PIA must identify privacy risks, document mitigations, and be completed before deployment — not retrospectively.
- Automated decision transparency: Where AI influences a decision about a Quebec individual, the firm must inform the individual, explain the factors considered, and provide a mechanism for human review of the decision.
- Consent specificity: Consent to process personal information must be specific to the AI purpose — general retainer terms do not automatically authorise AI profiling or automated analysis of a client's personal information.
- Data processor oversight: AI vendors processing Quebec client data act as service providers (mandataries) under Law 25. Contracts must ensure adequate protection, and the firm remains accountable for how vendors handle that data.
Penalties: penal fines of up to $25 million CAD or 4% of worldwide turnover, whichever is greater, plus administrative monetary penalties of up to $10 million CAD or 2% of worldwide turnover. The Commission d'accès à l'information has active enforcement capacity.
PIPEDA: federal baseline for all provinces
The Personal Information Protection and Electronic Documents Act applies to private-sector organisations across Canada in the course of commercial activities. For law firms, PIPEDA creates accountability obligations for client personal data processed by AI tools:
- Purpose must be identified before collecting client personal information for AI processing
- Meaningful consent is required — not assumed from general retainer terms where AI profiling is a materially different purpose from core legal representation
- Clients have access rights to their personal information — including inferences drawn by AI systems from that information
- Firms are accountable for personal information transferred to AI processors, including those operating outside Canada
Federal Court of Canada: AI disclosure direction
The Federal Court of Canada has issued a notice to the parties and the profession requiring counsel to declare when AI-generated content is included in documents prepared for the court, and to verify that content for accuracy before filing. This applies to materials filed in Federal Court proceedings.
Several provincial courts and tribunals are developing equivalent requirements. Firms that practice across multiple courts need a systematic tracking mechanism — not a case-by-case check before each filing.
CASL: AI-generated client communications
Canada's Anti-Spam Legislation applies to any commercial electronic message sent to a Canadian recipient — including AI-generated client newsletters, follow-up communications, and marketing outreach. Law firms using AI to generate or personalise client communications must ensure:
- Express or implied consent exists for each recipient before any CEM is sent
- Every CEM identifies the sender and includes contact information
- A functional unsubscribe mechanism is included and honoured within ten business days
- Sender identity and required information must not be disguised or omitted — AI-generated emails that do so violate CASL regardless of how they were produced
Penalties: up to $10 million CAD per violation for organisations.
Preparing for AIDA: Canada's pending federal AI law
Bill C-27 — which included the Artificial Intelligence and Data Act (AIDA) — died on the Order Paper when Parliament was prorogued in early 2025, and AIDA has not been enacted. The policy direction, however, is clear. AIDA would have required organisations developing or deploying high-impact AI systems to implement risk mitigation measures, maintain records, and publish plain-language descriptions of how AI systems are used — and any successor federal AI legislation is likely to follow the same model.
Law firms that build compliance frameworks now against PIPEDA and Quebec Law 25 will be significantly ahead when federal AI legislation does pass — the documentation and oversight structures required are substantively similar.
What a compliant Canadian law firm AI framework requires
- AI inventory: Every AI tool documented with data handling terms reviewed against professional secrecy and PIPEDA obligations.
- Quebec Law 25 PIAs: Completed for any AI tool processing personal information of Quebec clients, before that tool is used in client matters.
- Supervision and verification policy: Written procedures for review of AI-generated work product before delivery — named supervisors, not informal assumptions.
- Court AI disclosure tracker: A practice-wide record of which courts require AI disclosure, updated as new directions are issued.
- CASL review: Audit of any AI-driven client communications for CASL compliance — consent documentation, identification, unsubscribe flow.
Canada AI Compliance for Law Firms
PIPEDA compliance, Quebec Law 25 PIA, professional responsibility mapping, vendor data review. Fixed price, delivered in five to seven working days. Built for Canadian law firms.
See the Canada Compliance Packages →