In this week’s GME3, FINTRAC levies yet another fine against a provincial gaming operator, OpenAI argues that an ongoing lawsuit brought by Canadian news publishers should be transferred to the U.S., and the Australian government publishes guidance on how social media companies should approach age verification as its under-16 ban looms. Read the full stories below!
Gambling
FINTRAC II: Attack of the Fines
On September 12, 2025, the Financial Transactions and Reports Analysis Centre of Canada (FINTRAC) announced a $1.175 million penalty against the Saskatchewan Indian Gaming Authority (SIGA) for alleged breaches of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act (PCMLTFA). SIGA, a First Nations organization that operates seven casinos across Saskatchewan and the province’s only regulated online platform, PlayNow.com, generated a record $378 million in gross revenue and $146 million in net income in 2024-25, all of which it reinvests in First Nations, community projects, and the provincial government.
FINTRAC alleged three compliance violations: failure to submit four suspicious transaction reports despite clear red flags (categorized as “Very Serious”), failure to include prescribed information in three reports, and failure to maintain adequate risk-assessment policies (both deemed “Serious”). Among the red flags FINTRAC cited were transaction volumes inconsistent with patrons’ financial standing and activity tied to individuals flagged by law enforcement.
SIGA strongly rejected FINTRAC’s findings, emphasizing that the penalty related to administrative reporting obligations only and that no money laundering, terrorist financing, or other criminal activity was uncovered. The authority stated that it maintains a compliance program that is regularly audited, and confirmed it will appeal both the findings and the penalties in Federal Court.
This case follows a recent wave of heightened enforcement by FINTRAC. The Canadian National Exhibition Casino is challenging a $200,000 fine, while the British Columbia Lottery Corporation (BCLC) is contesting a $1 million penalty it claims was imposed after an “ambush” examination. The string of disputes highlights a growing clash between FINTRAC’s assertive enforcement posture and gaming operators’ insistence that alleged shortcomings reflect technical reporting disputes rather than systemic AML failures.
Media
OpenAI Eh?
OpenAI appeared before the Ontario Superior Court on September 17, 2025, seeking to have a copyright lawsuit brought by Canadian news publishers transferred to the United States. The suit, the first of its kind in Canada, was launched by a coalition of major outlets including The Canadian Press, Torstar, The Globe and Mail, Postmedia, and CBC/Radio-Canada. The publishers allege that OpenAI unlawfully scraped and used their copyrighted content to train its ChatGPT system, generating profits without permission or compensation.
OpenAI contests Ontario’s jurisdiction, arguing it has no presence or operations in the province and that all relevant activities – the training of AI models and automated web crawling – occurred outside Canada. It contends that Canadian copyright law does not apply extraterritorially and insists that the dispute belongs in U.S. courts, where similar cases are already unfolding and where the question of whether training AI on copyrighted material falls under “fair use” remains unsettled.
The publishers counter that Ontario is the proper venue because the companies are Canadian-owned, headquartered in Ontario, and produce most of their content in the province. They argue that OpenAI’s scraping directly targeted Ontario-based content, giving the case a “real and substantial connection” to the jurisdiction. Beyond technical disputes over servers and web-crawling protocols, the publishers warn that adopting OpenAI’s position would effectively strip Canada of authority over large swaths of its digital economy, undermining national sovereignty and the role of journalism.
OpenAI has accused the plaintiffs of politicizing the case with appeals to sovereignty, insisting the matter is strictly legal and jurisdictional. The outcome will determine not only where the case is heard, but also how Canada positions itself in the growing global debate over AI, copyright, and digital regulation.
Entertainment
Minor Issues Down Under
Australia will implement the world’s first nationwide ban on social media accounts for children under 16, starting December 10, 2025. Platforms including TikTok, Facebook, Snapchat, Reddit, X, and Instagram will be required to prevent underage users from holding accounts, with penalties of up to AUD $50 million (approximately CAD $45 million) for systemic non-compliance.
The government recently issued guidelines on how the ban should be enforced, clarifying that platforms will not be required to reverify the ages of all users. eSafety Commissioner Julie Inman Grant emphasized that major platforms already hold sufficient data to determine users’ ages and have sophisticated targeting technologies that can be applied to identify minors. She dismissed claims that all Australians would be forced into age verification as a “scare tactic,” noting that companies already use these same tools for precision advertising.
Communications Minister Anika Wells reinforced that the government’s goal is to protect children without undermining the privacy of adults. She pointed out that long-standing platform users – for example, someone active on Facebook since 2009 – clearly exceed the age threshold, making blanket verification unnecessary. Instead, regulators will assess whether companies take “reasonable steps” to exclude underage users, with enforcement focused on systemic failures rather than individual accounts.
Experts, such as RMIT University’s Lisa Given, noted that age verification technologies are imperfect and that platforms will have significant discretion in determining compliance. Authorities acknowledge that not every under-16 account will vanish overnight, but expect platforms to demonstrate credible, proactive enforcement measures.
Government officials plan to meet with U.S.-based tech companies next week to discuss implementation, signaling international attention on how Australia’s precedent-setting policy balances child safety, privacy, and digital rights.
GME Law is Jack Tadman, Lindsay Anderson, and Will Sarwer-Foner Androsoff. Jack’s practice has focused exclusively on gaming law since he was an articling student in 2010, acting for the usual players in the gaming and quasi-gaming space. Lindsay brings her experience as a negotiator and contracts attorney, specializing in commercial technology, SaaS services, and data privacy.
At our firm, we are enthusiastic about helping players in the gaming space, including sports leagues, media companies, advertisers, and more. Our specialized knowledge of these industries allows us to provide tailored solutions to our clients’ unique legal needs. Reach out to us HERE or contact Jack directly at jack@gmelawyers.com if you want to learn more!
Check out some of our previous editions of the GME3 HERE and HERE, and be sure to follow us on LinkedIn to be notified of new posts, keep up to date with industry news, and more!