In this week’s GME3: Ontario signals a tougher stance on unlicensed gambling operators, a U.S. judge delivers a major ruling on AI and copyright, and Australia’s eSafety Commissioner pushes back on YouTube’s exemption from new youth restrictions. Catch up on the latest from the gambling, media, and entertainment worlds below.
Gambling
Slot and Order
Ontario is preparing to escalate enforcement against unregulated gambling operators that continue to serve its residents without a licence. Speaking at the SBC Canadian Gaming Summit, Ontario Attorney General Doug Downey and AGCO Chair Dave Forestell made it clear: the grace period is over. While the province has been largely successful in transitioning over 80% of grey market operators into the regulated market since April 2022, officials signalled that those who have refused to join the legal sector should now expect more aggressive action.
Ontario has followed a three-phase approach: first, by creating an attractive legal market with reasonable taxes and licensing conditions; second, by encouraging hesitant operators to make the transition; and now, by moving into a full enforcement phase. This includes targeting service providers – such as banks and payment processors – that work with unlicensed operators. Forestell noted that the province publishes a list of legal operators and warned that facilitating payments for unlisted sites could amount to supporting illegal gambling activity.
This push comes on the heels of earlier moves by the AGCO, including a May request that media platforms stop running ads for unlicensed brands, with Bodog specifically called out. With the market now considered mature, Ontario officials are sending a blunt message: operators still on the sidelines may soon find the door closed. To read more about this announcement and other headlines from last week’s Canadian Gaming Summit, head over to Gaming News Canada for more excellent coverage.
Media
Throwing the Book at AI
A U.S. federal judge has ruled that AI companies can legally train their models on copyrighted books without the authors’ permission, marking a significant precedent in the ongoing debate over generative AI and intellectual property. In the Northern District of California, Judge William Alsup sided with Anthropic, finding that using copyrighted works to train large language models (LLMs) qualifies as “fair use” under U.S. copyright law.
The case was brought by authors Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson, who alleged that Anthropic had “pirated” millions of books – both purchased and illegally downloaded – to train its AI models, including versions of Claude. Alsup found that the use of lawfully acquired books was “exceedingly transformative” and served a new, creative purpose: enabling AI to generate novel humanlike text rather than replicating the original works. He emphasized that this kind of use did not harm the market for the original books, a key consideration under the fair use test.
However, Alsup distinguished between fair use and theft. While digitizing purchased books passed legal muster, the unauthorized downloading of pirated copies did not. Anthropic could still face liability on that front, though damages may be reduced where the company later purchased legitimate copies of the same books.
This is the first major ruling to clarify fair use in the context of AI training, a decision likely to shape the trajectory of similar lawsuits from authors, artists, and media companies challenging how AI firms source their data.
Entertainment
Koala-fied for Exemption
Australia’s eSafety Commissioner, Julie Inman Grant, is calling for the government to reverse its decision to exempt YouTube from the country’s upcoming social media ban for users under 16. Inman Grant publicly criticized the exemption, arguing it undermines the law’s goal of protecting young Australians from online harm. She cited new research showing YouTube is not only the most-used platform among teens but also the leading source of harmful content, including misogyny, violence, disordered eating, and suicidal ideation.
The law, passed in late 2024 and set to take effect by the end of 2025, has already drawn criticism from platforms like Meta and TikTok, which were not granted exemptions. These companies were further frustrated after reports emerged that YouTube received its carveout following a personal pledge from the Australian government to YouTube’s CEO – before any public consultation had begun.
Inman Grant stressed her role is not to approve legislation but to enforce it, yet she expressed surprise that YouTube had been spared. Meanwhile, YouTube defended its position, arguing it provides valuable educational content and should remain accessible to students and teachers. The platform urged the government to honour its commitment and maintain the exemption.
As enforcement approaches, the debate highlights broader tensions over fairness, consistency, and influence in how tech regulation is applied.
GME Law is Jack Tadman, Lindsay Anderson, and Will Sarwer-Foner Androsoff. Jack’s practice has focused exclusively on gaming law since he was an articling student in 2010, acting for the usual players in the gaming and quasi-gaming space. Lindsay brings her experience as a negotiator and contracts attorney, specializing in commercial technology, SaaS services, and data privacy.
At our firm, we are enthusiastic about assisting participants across the gaming space, including sports leagues, media companies, advertisers, and more. Our specialized knowledge of these industries allows us to provide tailored solutions to our clients’ unique legal needs. Reach out to us HERE or contact Jack directly at jack@gmelawyers.com if you want to learn more!
Check out some of our previous editions of the GME3 HERE and HERE, and be sure to follow us on LinkedIn to be notified of new posts, keep up to date with industry news, and more!