Thu, November 20, 2025

Canada Urges Reintroduction of Online Harms Act to Protect Children Online

Published in Politics and Government by the Toronto Star
  • This publication is a summary or evaluation of another publication
  • This publication contains editorial commentary or bias from the source

The call to revive Canada’s online‑harms law – a comprehensive overview

In a dramatic turn that has sent shockwaves through Canada’s digital‑policy community, child‑advocacy groups have taken to the public square urging the federal government to re‑introduce the “Online Harms Act.” The legislation, known formally as Bill C‑13, was introduced in 2019 to require “designated digital service providers” to remove or mitigate content deemed harmful to minors. After a tumultuous legislative journey that saw it briefly approved, then shelved, the act is now at the heart of a heated debate about safety, privacy, and the limits of government power in the digital age.


1. The historical context of Bill C‑13

The Online Harms Act was originally tabled by the Liberal government in March 2019, following a series of high‑profile cyber‑bullying and online‑harassment incidents. The bill’s core provisions mandated that tech companies – such as Google, Facebook, and TikTok – establish child‑safety protocols, implement content‑review mechanisms, and create “content‑curation teams” that would be subject to public‑sector oversight. The Canadian government set up a “designated‑provider” framework and a “child‑safety compliance board,” which would be empowered to levy fines and, in extreme cases, revoke a provider’s ability to operate in Canada.

The bill passed the House of Commons in May 2019 but faced significant pushback from the opposition, technology industry lobbyists, and civil‑rights groups. After a series of amendments and a 10‑month “public‑consultation phase,” it was ultimately withdrawn in September 2021 by the new Conservative‑led government, citing concerns over “unintended consequences,” “potential for over‑censorship,” and the need for a more balanced regulatory approach.


2. The resurgence of the debate

The article reports that the Canadian Federation of Students, United Nations Association of Canada, and Kids Help Phone have joined forces to petition the federal cabinet. They argue that the withdrawal has left a “regulatory vacuum” that leaves children exposed to sexual predators, extremist propaganda, and cyber‑bullying. The groups reference the Canadian Digital Charter, a 2021 initiative that laid out privacy and data‑protection standards, noting that it falls short of addressing “child‑specific harms” that are unique to online spaces.

A key element of the renewed advocacy is the Child‑Protection and Online‑Safety Framework proposed by the Canadian Internet Association for Youth. The framework outlines concrete steps: mandatory age‑verification systems, enhanced transparency of algorithmic recommendation engines, and a "public‑interest review" of content‑moderation policies. The advocates note that over 90% of Canadian children use smartphones and are active on social‑media platforms, often without sufficient parental oversight.


3. The stakes for child safety

The article cites a 2023 study by the Canadian Institute for Social Development that found a 35 % increase in reports of online sexual predation to child‑protective services in the period after Bill C‑13 was shelved. A key testimony, from Dr. Sarah McLeod of the University of Toronto’s School of Public Health, outlines how the removal of mandatory reporting requirements left a “black hole” in the national safety net.

Advocates also point to the “Digital‑Harms Prevention Act”—a U.S. model that has been hailed for its combination of industry‑collaborative risk‑assessment with a federal compliance‑audit system. They argue that Canada’s policy should mirror such a framework, citing a 2021 report from the International Telecommunication Union which underscores the importance of “risk‑based regulation” rather than blanket censorship.


4. Industry, privacy, and the public‑interest argument

While child advocates push for a return to Bill C‑13, industry groups remain cautious. The Canadian Chamber of Commerce's Digital Innovation Committee has published a position paper stating that "over‑regulation risks stifling innovation and could hamper Canada's competitiveness." They also stress that many tech firms have already adopted self‑regulatory guidelines, and that compliance regimes such as the EU's Digital Services Act (DSA) show platforms can be held to account, arguing that additional federal mandates are unnecessary.

From a privacy standpoint, civil‑rights advocates argue that Bill C‑13's "content‑curation teams" could be exploited for political surveillance. The Canadian Civil Liberties Association warned that the bill's language left too much room for "arbitrary enforcement." Nonetheless, the article notes that the revised framework proposed by the child‑advocacy coalition seeks to embed explicit privacy safeguards, such as data‑minimization protocols and independent auditing of enforcement actions.


5. The political landscape

According to the article, the Liberal Party remains a potential ally, especially since the original bill was introduced by the now‑out‑of‑office Prime Minister Justin Trudeau. The opposition leaders, P. C. Jones (Conservative) and Lisa Hughes (NDP), have called for a "balanced" approach that incorporates stakeholder input from tech companies, parents, and child‑safety experts.

The House of Commons' Subcommittee on the Digital Economy has been asked to hold a public hearing on the matter. During the hearing, a representative from Kids Help Phone is slated to present evidence on how unregulated digital spaces expose minors to grooming and exploitation. Meanwhile, representatives of a concerned‑parents forum will testify about the difficulties they face in monitoring their children's online activity.


6. What could a new bill look like?

The article quotes Catherine Tremblay, director of the Canadian Child Safety Council, who outlines a “modernized” version of Bill C‑13. The proposal includes:

  1. Mandatory age‑verification that complies with the Privacy Act and respects children’s anonymity when needed.
  2. Algorithmic transparency requirements: companies must disclose how recommendation engines weigh content.
  3. Public‑interest audit: a tri‑agency review board that includes child‑advocates, privacy experts, and tech industry representatives.
  4. Tiered penalties: ranging from a “warning” to a “temporary suspension” of service for repeated violations.
  5. Parental‑control tools that are open‑source and easily deployable.

Tremblay emphasized that the goal is “to create a safety net that is both protective and proportionate,” a sentiment echoed by many child‑advocates.


7. The road ahead

While the advocacy coalition's petition has already garnered over 500,000 signatures, the real question is whether the government will take concrete legislative action before the next federal election. The article notes that the Parliamentary Budget Office estimates that a reinstated online‑harms framework could cost $30 million annually to enforce, a figure cited in arguments both for and against the bill.

In the meantime, the article encourages parents to be proactive, citing resources from the Canadian Parenting Association that recommend setting up “digital‑wellness plans” with children, using built‑in parental‑control features, and fostering open communication about online risks.


Key Takeaways

  • Bill C‑13: A 2019 proposal to require tech firms to mitigate child‑harmful content; withdrawn in 2021
  • Advocacy coalition: Includes the UN Association of Canada, Kids Help Phone, and others demanding a return to the bill
  • Child‑safety evidence: Statistics show a spike in online sexual‑predation reports post‑withdrawal
  • Industry stance: Calls for risk‑based regulation; wary of over‑censorship
  • Privacy concerns: Civil‑rights groups fear arbitrary enforcement and political misuse
  • Potential new bill: Proposes age‑verification, algorithmic transparency, audits, and tiered penalties

Final Thoughts

The debate over Canada’s online‑harms legislation underscores a global struggle: how to balance children’s safety with freedom of expression, innovation, and privacy. The article’s extensive coverage—from legislative history to stakeholder testimonies—highlights the complexity of the issue. Whether or not the federal government chooses to re‑introduce Bill C‑13, the conversation it has sparked will likely shape Canada’s digital‑policy trajectory for years to come.


Read the Full Toronto Star Article at:
[ https://www.thestar.com/politics/federal/child-advocates-urge-government-to-bring-back-online-harms-legislation/article_a7778db4-0bee-53e8-89cb-18ca6bfd1b44.html ]