
85% of Canadians Want AI Guardrails. The Government Gave Them a 30-Day Sprint.

Ottawa's AI regulation died on the order paper in 2025. Its replacement is a 30-day industry consultation that 160 civil society groups rejected as a 'facade for manufacturing consent.'

NW Editorial · March 28, 2026 · 9 min read
Aerps.com / Unsplash — The Canadian government launched a 30-day consultation on AI regulation while 85% of Canadians said they want binding guardrails.
Jun ’22: AIDA tabled as part of Bill C-27
Jan ’25: Parliament prorogued; AIDA dies on the order paper
May ’25: Solomon appointed Canada’s first AI Minister
Sep ’25: 30-day “national sprint” consultation launched
Oct ’25: 160+ groups reject the sprint, launch the People’s Consultation
Jan ’26: Still no federal AI law; strategy expected “later this year”
Key Takeaways
  • 85% of Canadians say governments should regulate AI for ethical and safe use. Only 24% want regulation removed to help the industry compete.
  • Canada’s Artificial Intelligence and Data Act (AIDA) died in January 2025 when Parliament was prorogued. No replacement legislation has been tabled.
  • AI Minister Evan Solomon’s 30-day ‘national sprint’ consultation was rejected by 160+ civil society groups as a ‘facade for manufacturing consent.’
  • Of the government’s 26 consultation questions, only 3 addressed safety and public trust. The remaining 23 focused on industry growth and adoption.

The polling is not ambiguous. In an August 2025 Leger survey, 85% of Canadians said governments should regulate AI tools to ensure ethical and safe use. In a November 2025 North Poll survey, 60% said they would prefer the government to be skeptical and ensure Canadians are not harmed or deceived by the technology. When asked what AI Minister Evan Solomon should prioritize, 60% said legislation to ensure AI is used ethically and safely. Only 24% said the priority should be removing regulation so the industry can compete with the United States.1

85% of Canadians say governments should regulate AI tools to ensure ethical and safe use (Leger, August 2025)

The government heard those numbers and moved in the opposite direction.

Canada had an AI regulation bill. It was called the Artificial Intelligence and Data Act — AIDA — and it was introduced in June 2022 as part of Bill C-27, bundled together with privacy reform. The bill proposed a regulatory framework for high-impact AI systems, enforced by a new AI and Data Commissioner. It spent two and a half years in committee. It was controversial — critics said the bill was vague on what constituted “high-impact” AI, excluded government use entirely, and was drafted without meaningful public consultation. But it was something.2

In January 2025, when Justin Trudeau resigned and Parliament was prorogued, Bill C-27 died on the order paper. AIDA died with it. Canada’s first attempt at comprehensive AI regulation evaporated overnight — not because it was voted down, but because the government let Parliament collapse before the bill could pass.3

Canada is now one of the few G7 countries without a federal AI regulatory framework in force. The European Union has the AI Act. The United States has state-level laws and a White House executive order. Canada has PIPEDA — a privacy law written in 2000 — and a voluntary code of conduct on generative AI that carries no legal force.

Under Mark Carney, the Liberal government appointed Evan Solomon as Canada’s first Minister of Artificial Intelligence and Digital Innovation. Solomon has been explicit about his priorities. He has said he intends to depart from “over-indexing” on harm prevention. His stated approach to regulation is that it must be “light, tight, and right.” He has described the government’s guiding principle as “AI for all” and built his strategy around four pillars: scaling companies, encouraging adoption, promoting digital sovereignty, and building infrastructure.4

What is missing from that list is any mention of regulation, safety, or rights.

3 of 26 questions in the government’s AI consultation survey addressed safety and public trust; the rest focused on industry growth.

85% want regulation. The minister says the priority is adoption.

Jonathan Lim / Unsplash — Canada was the first country to launch a national AI strategy in 2017. It is now one of the few G7 nations without a comprehensive AI regulatory framework in force.

In September 2025, Solomon launched a 30-day “national sprint” to develop a refreshed AI strategy. He appointed a 27-member task force and gave them one month to report back with recommendations. Canadians had the same 30 days to participate in a public consultation through a 26-question online survey. Of those 26 questions, three dealt with safety and public trust. The remaining 23 focused on research, talent, adoption, commercialization, and scaling the AI industry.5

The task force was criticized for being weighted toward industry. University of Ottawa law professor Teresa Scassa, Canada Research Chair in information law and policy, said the makeup was “skewed towards industry voices and the adoption of AI technologies.” She added that instructing task force members to consult their own networks “sounds a lot like insider networking, which should frankly raise concerns.”6

More than 160 academics, lawyers, civil liberties groups, and human rights organizations signed an open letter rejecting the sprint entirely. The signatories included PEN Canada, the BC Civil Liberties Association, the Women’s Legal Education and Action Fund, the Canadian Centre for Policy Alternatives, and the International Civil Liberties Monitoring Group. They called the process “a facade for manufacturing consent for a harmful preordained agenda” and refused to participate.7

This seems like upping the ante on moving fast and breaking things. And so we thought we deserve better, the Canadian public deserves better.

— Cynthia Khoo, tech lawyer and open letter signatory, January 2026

Instead, they launched an independent “People’s Consultation on AI” — a parallel process designed to capture the concerns the government’s sprint did not ask about: environmental impacts, labour rights, mental health effects, the proliferation of non-consensual deepfakes, privacy risks, and the documented inaccuracies of generative AI output.

Tech lawyer Cynthia Khoo, one of the letter’s signatories, said the sprint approach was “upping the ante on moving fast and breaking things” and that Canadians “deserve better.”

A robot telling a robot what to do for the policy, which is a few levels distinct from actually talking to Canadians.

— Alex Kohut, founder, North Poll Strategies, on the AI consultation process

Pollster Alex Kohut flagged another problem. The government’s 26-question survey was long, included required open-ended questions, and was likely completed primarily by industry stakeholders working during business hours. Some respondents may have used AI to fill it out — and with the government using AI to process the responses, Kohut said, the result could be “a robot telling a robot what to do for the policy, which is a few levels distinct from actually talking to Canadians.”8

The pattern is now familiar. The Liberal government spent $2.4 billion on AI infrastructure in the 2024 budget. It launched a Canadian AI Safety Institute. It created a voluntary code of conduct. It appointed Canada’s first AI minister. What it has not done — in the four years since AIDA was tabled — is pass a single binding law that governs how AI systems can be used in Canada, what rights Canadians have when AI makes decisions about them, or what obligations AI companies have to the public.

What Canadians Want vs. What the Government Is Doing

  • Leger / North Poll, August–November 2025: 85% of Canadians say governments should regulate AI for safe and ethical use; 60% want the government to prioritize safety legislation.
  • Solomon, 2025–2026: Solomon says regulation must be “light, tight, and right” and has stated he intends to depart from “over-indexing” on harm prevention.
  • Trudeau government, June 2022: AIDA was tabled to create a regulatory framework for high-impact AI, enforced by an AI and Data Commissioner.
  • Liberal government, January 2025–present: The bill died on the order paper when Trudeau resigned and Parliament was prorogued. No replacement has been tabled under Carney.
  • PEN Canada, BCCLA, LEAF, CCPA and 160+ others, October 2025: Civil society groups called the 30-day sprint “a facade for manufacturing consent” and refused to participate.
  • Solomon’s office, January 2026: Solomon’s office said the process was “broad and multi-channel.” The government’s AI strategy — with no binding regulation — is expected “later in 2026.”

Eighty-five percent of Canadians want AI regulation. The government gave them a 30-day survey weighted toward industry, a task force that civil society groups called unrepresentative, and a minister who says the priority is adoption, not harm prevention. The bill that was supposed to regulate AI died because the government let Parliament collapse. No replacement has been tabled. Canada is running its AI policy on a voluntary code of conduct and a minister who says regulation must be “light.” The question is not whether Ottawa is listening to Canadians on AI. The polling answers that. The question is why it is choosing not to.

Sources

  1. Canadian Press / Nation Newswatch — Critics, pollsters warn Canadians are wary of AI, want government to set guardrails — Leger and North Poll data (2026-01-21)
  2. Montreal AI Ethics Institute — The Death of Canada’s Artificial Intelligence and Data Act: What Happened, and What’s Next (2025-10-20)
  3. Schwartz Reisman Institute / University of Toronto — What’s Next After AIDA? — comprehensive analysis of Bill C-27’s failure and regulatory gap (2025-02-11)
