Visitors gather at a sign outside Meta headquarters on Thursday, March 26, 2026, in Menlo Park, Calif. (AP Photo/Noah Berger)

Verdicts against Meta, Google in U.S. could boost Canadian big tech lawsuits

Apr 9, 2026 | 12:29 PM

OTTAWA — Recent U.S. court verdicts that found Meta and Google liable for harms to children are likely to benefit similar cases launched here in Canada, say experts and the lawyers behind the Canadian litigation.

They say the March verdicts in Los Angeles and New Mexico could affect a class action in B.C. and a case brought forward by a group of Ontario school boards.

“As a first step, this is a landmark moment for holding … social media accountable,” said Emily Laidlaw, a law professor at the University of Calgary.

“The cases in the U.S. bode well for the litigation in Canada. There’s so many cases underway across the world, and so I expect that this will have a ripple effect.”

A Los Angeles jury found both Meta and YouTube liable for harms to children using their services, while in New Mexico, a jury concluded that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its platforms.

In New Mexico, state investigators built their case by posing as children on social media and documenting sexual solicitations they received, as well as Meta’s response. The jury was asked if Meta violated New Mexico’s consumer protection law.

The Los Angeles case had a single plaintiff — who goes by the initials KGM — against Meta, Google’s YouTube, TikTok and Snap. TikTok and Snap settled before trial.

The plaintiff argued the platform features of the two remaining defendants, Meta and YouTube, were designed to be addictive, especially for young users.

KGM is one of a handful of plaintiffs whose cases are testing how these arguments play out before juries, and whether they can lead to broader settlements.

A spokesperson for Meta said in a pair of online posts in March that the company “respectfully” disagrees with the verdicts and will appeal. A spokesperson in Canada declined to comment on the Canadian cases.

Vivek Krishnamurthy, an associate member of the University of Ottawa’s Centre for Law, Technology and Society and an associate professor of law at the University of Colorado, said the verdicts suggest it’s “perhaps a bit more likely that the Canadian litigation against the platforms could reach a similar conclusion.”

While the U.S. decisions aren’t binding in Canada, Krishnamurthy said, they are “certainly persuasive.”

“I have no doubt that the parties in the Canadian litigation will be looking very closely at this and the plaintiffs will be seeking to use those verdicts to maximum advantage,” he said.

One of the lawyers involved in the Canadian cases is Duncan Embury, head of litigation at Neinstein LLP. He’s representing a group of 22 Ontario school boards that have brought claims against Meta, Snapchat and TikTok.

The boards are claiming the platforms’ algorithmic design has disrupted the public education system, leading to additional costs, Embury said. School boards report that students are paying less attention in class and schools are seeing a spike in bullying and mental health issues, he added.

Embury said the U.S. verdicts have “real significance both to this case, but also to all of us more broadly.”

He noted more cases in the U.S. are scheduled to proceed to trial and that “should give all of us pause to really think about what our children are being exposed to and what potential harms those things give rise to.”

In British Columbia, a class action against Meta alleges people have been injured by the platform. While similar cases were filed elsewhere in Canada, they are now on hold while the B.C. case proceeds, said Reidar Mogerman, a lawyer arguing the class action.

He said the platforms use tools to hold people’s attention, leading to “social comparison” and creating depression and anxiety that can manifest in physical injury through things like eating disorders and suicidal thoughts.

“Allegedly, the algorithms are ultimately designed to hold people’s attention. That’s what the platforms are selling to the advertisers. And in order to hold their attention, they’re giving them more and more damaging and radicalized information,” Mogerman said.

He said the U.S. cases offer a road map because juries in the U.S., examining similar sets of facts, concluded the companies did something wrong.

“So they are momentum, but they’re not a complete answer. We have to litigate ourselves our own case up here,” he said.

Taylor Owen, founding director of the Centre for Media, Technology and Democracy at McGill University, said that while the Canadian cases are building on the American ones, “they’re all sort of following similar underlying logic that there’s a sort of a negligence or liability around product design.”

He noted the U.S. cases had to get around Section 230. That’s a U.S. law that generally exempts internet companies from liability for the material users post on their services.

“So they didn’t touch content. It was all about the design of the product,” Owen said.

Owen said comparisons can be made between these social media lawsuits and the decades of litigation that led to tobacco companies being ordered to pay out tens of billions of dollars in compensation for the health effects of their products.

Those tobacco company lawsuits “revealed a host of things that were known internally to those companies, in a similar way as we now know a lot more about what was known internally in social media companies because of these cases,” Owen said.

“But I think the bigger lesson is that those litigations in the tobacco cases weren’t the things that got us all to stop smoking. It was the regulations that followed them.”

The Liberal government is planning to introduce an online harms bill and is currently consulting with an expert advisory group — which includes Owen, Krishnamurthy and Laidlaw — on that legislation.

Luke Stark, an assistant professor in the faculty of information and media studies at the University of Western Ontario, said regulation should take into account the design features that have been singled out in the lawsuits.

“If you understand social media platforms as just another consumer product, the design of which is … intentional by the companies involved, you could see, for instance, a set of prohibitions or a set of regulations about what kinds of interactive features are available,” he said.

This report by The Canadian Press was first published April 9, 2026.

— With files from The Associated Press

Anja Karadeglija, The Canadian Press