In this week’s GME3, we’re taking a look at the separation of iGO from the AGCO by the iGaming Ontario Act, a recent Alberta court ruling which has declared parts of the province’s privacy law unconstitutional, and the latest AI scandal, in which a Chicago news outlet published a summer reading list including books that don’t actually exist. Read the full stories below!
Gambling
Shuffle Up: Ministries Move
Ontario’s online-gaming ecosystem has undergone some changes. As of last week, the iGaming Ontario Act has split iGaming Ontario (iGO) from its former parent, the Alcohol and Gaming Commission of Ontario (AGCO). iGO, which supervises more than 50 operators and 80 platforms, now reports to the Ministry of Tourism, Culture and Gaming, headed by Minister Stan Cho. AGCO continues to answer to the Ministry of the Attorney General but will limit its role to pure regulation across the entire gambling sector.
The restructuring addresses the conflict flagged by the Auditor General: the AGCO could not credibly police a market while its subsidiary, iGO, was charged with maximizing provincial revenue from it. Housing iGO and the Ontario Lottery and Gaming Corporation under the same ministry will consolidate all revenue-generating functions in one portfolio while freeing AGCO to focus solely on oversight.
Queen’s Park says the change will tighten governance, make iGO a more competitive employer, and let it react faster to market shifts by trimming bureaucratic layers. For players and operators, however, nothing in day-to-day operations changes: existing operating agreements remain intact, and the AGCO retains regulatory authority. iGO’s relationships with licensed operators proceed as before while it continues searching for a successor to retiring executive director Martha Otton, whose term had already been extended through March 2025.
Media
Face-Off in Alberta
A recent Alberta court ruling has declared parts of the province’s privacy law unconstitutional, while still upholding a ban against Clearview AI, a U.S.-based facial recognition firm. Clearview AI scrapes publicly accessible images from the internet to build a massive biometric database marketed to law enforcement agencies. In 2021, Canada’s federal, Alberta, B.C., and Quebec privacy commissioners jointly ordered the company to stop operating in Canada and delete any images of Canadians collected without consent.
In response, Clearview sought a judicial review and challenged Alberta’s Personal Information Protection Act (PIPA), arguing that its web scraping from social media fell within the law’s “publicly available information” exception. Court of King’s Bench Justice Colin Feasby disagreed. He found Alberta’s rules too vague and overly broad, allowing selective enforcement by the privacy commissioner, yet still upheld the order against Clearview, ruling that scraping images for facial recognition isn’t a “reasonable use” of public data and violates individuals’ privacy expectations.
Feasby noted the act’s failure to explicitly address modern data practices like social media scraping and recommended narrowing the definition of “publicly available” to avoid arbitrary application. He acknowledged the legislation is outdated, echoing Alberta Technology Minister Nate Glubish, who said updates are forthcoming.
Clearview must now delete images of Albertans and report on its compliance within 50 days, despite its claim that its system can’t determine where an image was sourced. A similar order was upheld by a B.C. court last year.
Entertainment
AI Cooks the Books
The Chicago Sun-Times is under fire after publishing an AI-generated summer reading list featuring fake books by real authors. The list, part of a “Heat Index” insert dated May 18, included fictional titles like Tidewater Dreams by Isabel Allende and Nightshade Market by Min Jin Lee—none of which exist. Though the list was marketed as must-reads for 2025, only five of its 15 books were real and correctly attributed.
The paper initially couldn’t explain how the piece was published. It later clarified that the content came from King Features, a syndicated content provider, and had not been created or approved by the Sun-Times newsroom. A spokesperson admitted this kind of error is “unacceptable,” and the newsroom’s union condemned the incident, calling it a violation of reader trust and editorial standards.
Writer Marco Buscaglia, who produced the list, acknowledged using AI and failing to verify its output. “No excuses… On me 100%,” he told 404 Media.
This incident adds to a growing list of AI blunders in journalism, from Sports Illustrated’s fake authors to Microsoft’s AI recommending a food bank as a lunch spot. Critics point to the dangers of “AI hallucinations,” where systems confidently generate false information. The Sun-Times said it’s reviewing partnerships and updating editorial policies to prevent such errors in the future.
GME Law is Jack Tadman, Lindsay Anderson, and Will Sarwer-Foner Androsoff. Jack’s practice has focused exclusively on gaming law since he was an articling student in 2010, acting for the usual players in the gaming and quasi-gaming space. Lindsay brings her experience as a negotiator and contracts attorney, specializing in commercial technology, SaaS services, and data privacy.
At our firm, we are enthusiastic about helping players in the gaming space, including sports leagues, media companies, advertisers, and more. Our specialized knowledge in these industries allows us to provide tailored solutions to our clients’ unique legal needs. Reach out to us HERE or contact Jack directly at jack@gmelawyers.com if you want to learn more!
Check out some of our previous editions of the GME3 HERE and HERE, and be sure to follow us on LinkedIn to be notified of new posts, keep up to date with industry news, and more!