Welcome back to the GME3! For this week’s edition, we’re covering a special AML bulletin from FINTRAC, an overview of North American privacy legislation to keep an eye on in 2024, and a mysterious disparity over who is (and isn’t) allowing AI to scrape information from their websites. Read the full stories below!
Gambling
FINTRACking AML Risks
The Financial Transactions and Reports Analysis Centre of Canada (or “FINTRAC”) released a special bulletin earlier this month highlighting the increased threat posed by money laundering in the iGaming space. In creating this report, FINTRAC analyzed suspicious transaction reports related to gambling from 2016 to 2023, as well as “data from other financial intelligence units, assessments from domestic and international government and non-government organizations, and information from open sources.”
Based on that analysis, FINTRAC outlined common methods of laundering money, including:
Using financial entities, like payment service providers, to launder money through both licensed and unlicensed gaming platforms;
Using prepaid cards and vouchers to obscure the real (and likely illicit) source of funds;
Using e-wallets to move funds between Canada and unlicensed, offshore gaming sites; and
Paying via cryptocurrencies to preserve anonymity.
Many of these unlicensed platforms are based in countries that have weak AML regulations and highly secretive banking practices, and that can also function as tax havens.
As a result of this report, some are calling on Canadian provinces to finally introduce their own regulatory regimes for licensed iGaming. As Canadian Gaming Association President Paul Burns said to our friends at Gaming News Canada, “[This] is a message to the other provinces (outside of Ontario) that it’s time to expand to a regulatory regime. Doing nothing is no longer an option.”
If you’re wondering how you can better learn to recognize signs of money laundering, FINTRAC included a long list of indicators as part of the bulletin, and if you’d like to report any suspicious activity to them directly, click here. Otherwise, if you have specific questions about your platform’s AML requirements, feel free to reach out to Jack here at GME Law!
Media
Privacy Week Roundup
It’s everyone’s favourite time of the year! That’s right, it’s finally Data Privacy Week, a time to gather with loved ones, update your VPNs, and make sure that you’ve cleared all the cookies from your browser. To celebrate the occasion, we put together a list of some new privacy legislation that could be coming to a legal jurisdiction near you in 2024.
In the U.S., new privacy bills are currently moving through state legislatures in Kentucky, Maine, Massachusetts, Michigan, Minnesota, Missouri, Nebraska, New York, North Carolina, Ohio, Pennsylvania, Vermont, and Wisconsin. All of these bills would protect consumers’ rights to access and delete their personal information and to opt out of the sale of their personal information to third parties. Some (but not all) also include the right to correct outdated or outright incorrect information, a requirement that businesses obtain opt-in consent before processing sensitive data, and the right to seek civil damages against companies for violations.
Here in Canada, the House of Commons is currently reviewing the Consumer Privacy Protection Act (CPPA). We’ve written about the CPPA previously in the GME3, but to recap quickly: the Act would greatly increase the power of Canada’s Privacy Commissioner, create a new legal regime governing the collection and use of personal data, establish a tribunal to hear appeals of the Privacy Commissioner’s decisions, and enshrine rules around the use of AI.
The CPPA has received substantial pushback from interested parties. For one, some argue that creating a new tribunal is unnecessary, as the courts already serve that purpose. Second, and more substantively, experts have argued that the rules regarding AI, facial recognition technology, and more don’t go far enough. Finally, former BlackBerry CEO Jim Balsillie argued that the approach to regulating data is “misguided” and should be reconsidered as a whole.
There’s a long way to go before Canada’s new privacy legislation is signed into law, and it’s likely the CPPA will see substantial revisions before we reach that point. If you have any questions about the rapidly changing landscape that is privacy law, or your business’s privacy obligations, reach out to Zack here at GME Law. And we’ll see you all for next year’s Data Privacy Week!
Entertainment
Creepy Crawlers
The media has traditionally had a special relationship with web-crawlers (yes, that’s a Spider-Man joke). Beyond J. Jonah Jameson, real-life web crawlers, the tools used to gather information to train generative AI software like ChatGPT, have been blocked by over 88% of the top-ranked news outlets in the US. One sector, however, has continued to allow bots to scrape its sites for data: right-wing outlets.
Almost all major news companies are now blocking web crawlers, including The New York Times, The Washington Post, and The Guardian. However, none of the leading conservative outlets, like Fox, Breitbart, or The Daily Caller, have taken the same steps.
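For the technically curious, this kind of blocking is typically done through a site’s robots.txt file, which tells crawlers (identified by their user agent, such as OpenAI’s “GPTBot”) which parts of the site they may visit. The minimal sketch below, using Python’s standard urllib.robotparser module and a placeholder site URL rather than any particular outlet, shows how one might check whether a given crawler is permitted.

```python
# Minimal sketch: check whether a site's robots.txt permits a given crawler.
# "https://www.example.com" is a placeholder domain, not a specific news outlet.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetch and parse the site's robots.txt

# "GPTBot" is the user agent OpenAI's crawler reports; "*" matches any crawler.
for agent in ("GPTBot", "Googlebot", "*"):
    allowed = parser.can_fetch(agent, SITE + "/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```

It’s worth noting that robots.txt is a voluntary convention rather than an enforcement mechanism: it asks crawlers to stay out, but it can’t technically stop one that chooses to ignore it.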
Experts have been speculating about why these outlets have taken this approach. The most compelling theory so far is that it’s an attempt to combat political bias. Because AI models are essentially predictive text-generators, they reflect the biases of the writing they are trained on. And, if the entire “left-wing” media is blocking access to its work, right-leaning content is free to influence how these bots “think.”
This is an issue that American Republicans have expressed some concern about. Recently, at a hearing on AI, Senator Marsha Blackburn read an AI-generated poem extolling the successes of President Biden. She claimed that generating a similarly positive stanza dedicated to Trump was impossible using ChatGPT.
The jury is out on whether this strategy will actually work. Some argue that allowing their content to be used to train AI could shift how the models respond, but others are more skeptical. AI chatbots are trained on an immense amount of data, and including one perspective slightly more than another is unlikely to have much of an effect.
GME Law is Jack Tadman, Zack Pearlstein, Lindsay Anderson, Daniel Trujillo, and Will Sarwer-Foner Androsoff. Jack’s practice has focused exclusively on gaming law since he was an articling student in 2010, acting for the usual players in the gaming and quasi-gaming space. Zack joined Jack in September 2022. In addition to collaborating with Jack, and with a keen interest in privacy law, Zack brings a practice focused on issues unique to social media, influencer marketing, and video gaming. Lindsay is the most recent addition to the team, bringing her experience as a negotiator and contracts attorney, specializing in commercial technology, SaaS services, and data privacy.
At our firm, we are enthusiastic about aiding players in the gaming space, including sports leagues, media companies, advertisers, and more. Our specialized knowledge in these industries allows us to provide tailored solutions to our clients’ unique legal needs. Reach out to us HERE or contact Jack directly at jack@gmelawyers.com if you want to learn more!
Check out some of our previous editions of the GME3 HERE and HERE, and be sure to follow us on LinkedIn to be notified of new posts, keep up to date with industry news, and more!