AI brings deepfake pornography to the masses, as Canadian laws play catch-up

B.C. recently became the latest to enact new legislation
Taylor Swift, who recently became a prominent victim of deepfake pornography, stands on the field after an AFC Championship NFL football game between the Baltimore Ravens and the Kansas City Chiefs, Sunday, Jan. 28, 2024, in Baltimore. THE CANADIAN PRESS/AP-Julio Cortez

Underage Canadian high school girls are targeted using AI to create fake explicit photos that spread online. Google searches bring up multiple free websites capable of "undressing" women in a matter of minutes. The world's biggest pop star falls prey to a deepfake pornographer, with the images viewed tens of millions of times.

This is the new era of artificial pornography for the masses.

The technology required to create convincing fake pornography has existed for years, but experts warn that it's faster and more accessible than ever, creating an urgent challenge for Canadian policymakers.

Advances in artificial intelligence have made it possible to do with a cellphone what once would have required a supercomputer, said Philippe Pasquier, a professor of creative AI at Simon Fraser University in B.C.

Pasquier said society has "lost the certainty" of what is real and what is altered.

"The technology got a little better in the lab, but mostly the quality of the technology that anyone and everyone has access to has got better," he said.

"If you increase the accessibility of the technology, that means good and bad actors are going to be much more numerous."

Across Canada, legislators have been trying to keep up. Eight provinces have enacted intimate image laws, but only half of them refer to altered images.

B.C. recently became the latest, joining Prince Edward Island, Saskatchewan and New Brunswick.

The B.C. law, which came into effect on Jan. 29, allows people to go to a civil resolution tribunal to get intimate images taken down, regardless of whether they are real or fake, and go after perpetrators and internet companies for damages.

Individuals will be fined up to $500 per day and websites up to $5,000 a day if they don't comply with orders to stop distributing images that are posted without consent.

Premier David Eby said the recent sharing of fake images of pop star Taylor Swift proved no one was immune to such "attacks."

Attorney General Niki Sharma said in an interview that she is concerned people don't come forward when they are the victim of non-consensual sharing of intimate images, real or not.

"Our legal systems need to step up when it comes to the impacts of technology on society and individuals, and this is one part of that," she said of the new legislation.

The province said it couldn't provide specific data about the extent of AI-altered images and deepfakes.

But cases have occasionally been made public elsewhere.

In December, a Winnipeg school notified parents that AI-generated photos of underage female students were circulating online.

At least 17 photos taken from students' social media were explicitly altered using artificial intelligence. School officials said they had contacted police and had made supports available for students directly or indirectly affected.

"We are grateful for the courage of the students who brought this to our attention," said Christian Michalik, superintendent of the Louis Riel School Division, in a letter to parents that was also posted on Facebook by a school division trustee.

Manitoba has intimate image laws, but they don't refer to altered images.

Brandon Laur is the CEO of White Hatter, a Victoria-based internet safety company.

The firm recently conducted an experiment and found it took only minutes using free websites to virtually undress an image of a fully clothed woman, something Laur called "shocking."

The woman used in the experiment wasn't real; she was also created with AI.

"It's pretty surprising," Laur said in an interview. "We've been dealing with cases (of fake sexual images) since the early 2010s, but back then it was all Photoshop.

"Today, it's much simpler to do that without any skills."

White Hatter's experiment used Google to find seven easily accessible and user-friendly websites and applications capable of creating so-called "deep nudes."

In the original photo, a young woman dressed in a long-sleeved blue shirt, white pants and sneakers walks towards the viewer. In the next scenes, she's nude, partially nude or wearing lingerie; White Hatter censored the resultant images with black bars.

LEGAL AVENUES, NEW AND OLD

Angela Marie MacDougall, executive director of Battered Women's Support Services, said her organization was consulted about the B.C. legislation.

She said Swift's case underscored the urgent need for comprehensive legislation to combat deepfakes on social media, and applauded the province for making it a priority.

But the legislation targets non-consensual distribution of explicit images, and the next "crucial step" is to create legislation targeting creators of non-consensual images, she said.

"It's very necessary," she said. "There's a gap there. There's other possibilities that would require having access to resources, and the women that we work with wouldn't be able to hire a lawyer and pursue a legal civil process around the creation of images, because, of course, it costs money to do that."

But other legal avenues may exist for victims.

Suzie Dunn, an assistant law professor at Dalhousie University in Halifax, said there were several laws that could apply to deepfakes and altered images, including those related to defamation and privacy.

"There's this new social issue that's coming up with AI-generated content and image generators and deepfakes, where there's this kind of new social harm that doesn't fit perfectly in any of these existing legal categories that we have," she said.

She said some forms of fakery could deserve exceptions, such as satire.

"As technology evolves, the law is constantly having to play catch-up and I worry a bit with this, that there might be some catch-up with this generative AI."

Pablo Tseng, an intellectual property lawyer in Vancouver, said deepfakes are "accelerating" an issue that has been around for decades: misrepresentation.

"There's always been a body of law that has been targeted towards misrepresentation that's been in existence for a long time, and that is still very much applicable today to deepfakes, (including) the torts of defamation, misrepresentation or false light, and the tort of misappropriation of personality."

But specific laws like the B.C. legislation, he said, are steps in the right direction, combating the issue in tandem with existing laws.

Tseng said he knew of one Quebec case that showcased how the misuse of deepfake technology could fall under child pornography laws. That case led to a prison sentence of more than three years for a 61-year-old man who used AI to produce deepfake child pornography videos.

But Tseng said he wasn't aware of any judgment in which the technology is referenced in the context of misrepresentation.

"It's clear that just because no judgment has been rendered doesn't mean that it isn't happening all around us. Taylor Swift is but the latest example of a string of other examples where celebrities' faces and personalities and portraits have simply been misused," he said.

Dunn said she believed content moderation by websites was likely the best way forward.

She called on search engines like Google to de-index websites primarily focused on creating sexual deepfakes.

"At a certain point, I think some people just give up, even people like Scarlett Johansson or Taylor Swift, because there's so much content being produced and so few opportunities for legal recourse because you would have to sue every individual person who reshares it," Dunn said.

She said that while most video deepfakes involve celebrities, there are cases of "everyday women" being targeted.

"All you need to have is one still image of a person, and you can feed it into these nude image generators and it just creates a still image that looks like they're naked, and most of that technology only works on women."

'PAINFUL AND DEHUMANIZING'

Australian activist Noelle Martin is aware of the peril all too well.

The 29-year-old said in an interview that she did a reverse image search of a photo of herself on Google about 10 years ago.

Her curiosity turned to mortification when she found fake sexually graphic photos of herself.

"It is the most shocking and painful and dehumanizing experience that I've ever been through," she said.

"To see yourself depicted in all these different positions and different circumstances, in the most graphic and degrading way, is sickening."

She went to the police, but because there were no laws against it at the time, she said they told her to contact the websites to try to get the images removed. Some obliged, but others didn't respond, and the faked images, and eventually videos, continued to multiply.

Martin said she still doesn't know who targeted her or why.

She began speaking out publicly, advocating for a national Australian law that would fine companies thousands of dollars if they didn't comply with takedown orders. The law passed in 2018.

Martin, who now works as a legal researcher at the University of Western Australia, said a global approach to combating the issue is necessary given the "borderless" nature of the internet, but it had to start locally.

Though recent conversations about the misuse of AI have focused on public figures, Martin said she hopes the focus shifts to "everyday women."

"Not only do we not have laws in some jurisdictions, in many of the ones that do, they're not enforced. When you put it into the context of this becoming such an easy and quick thing for people to do, it's scary because I know exactly what it's going to be like," she said.

"It's not going to be the experience that we're seeing, for example, in the Taylor Swift case. The world is not going to rally around an everyday person or help them take down the images, and they're not going to be responded to by tech companies in a way that protects them."

Brieanna Charlebois, The Canadian Press

Breaking News You Need To Know

Sign up for a free account today and start receiving our exclusive newsletters.

Sign Up with google Sign Up with facebook

This site is protected by reCAPTCHA and the Google and apply.

Reset your password

This site is protected by reCAPTCHA and the Google and apply.

A link has been emailed to you - check your inbox.



Don't have an account? Click here to sign up




(or

91原创

) document.head.appendChild(flippScript); window.flippxp = window.flippxp || {run: []}; window.flippxp.run.push(function() { window.flippxp.registerSlot("#flipp-ux-slot-ssdaw212", "Black Press Media Standard", 1281409, [312035]); }); }