- By Helen Bushby
- Entertainment reporter at Sheffield Documentary Festival
The director of a documentary on the impact of pornographic deepfakes said she hopes her film will help people understand the profound damage they cause.
Rosie Morris’s My Blonde GF is about what happens to writer Helen Mort when she discovers deepfake images of her face on a pornographic website.
A deepfake is an image in which one person’s face is digitally added to another person’s body.
Helen believes the source photos may have come from an old Facebook account; professional photos of her are also in the public domain.
In the film, we see her flick through photos of herself aged 19 to 32 – smiling at weddings, at family occasions and while pregnant.
These are the images that were digitally edited onto pictures of women in sexually explicit and violent scenes.
“I need to see the pictures myself,” she says, speaking straight into the camera, which makes the audience feel part of an uncomfortable conversation.
“There was a woman sitting on the edge of the bed; it had my face but not my mouth, and she was [doing a sexual act]… the woman’s skin is much more tanned than mine, and she has my exact tattoo.
“She’s looking at some text… an invitation to humiliate the person in the picture, which is me.”
In the text, Helen is described as “my blonde GF” – GF being short for girlfriend – which gave the documentary its title.
Morris wanted to explore the impact the images had on Helen, including the horrific, graphic recurring nightmares and paranoia they triggered.
Helen says she often feels as if “everyone on the street somehow knew about the pictures, and they knew this terrible secret about me, which suddenly felt like my terrible secret”.
However, this is not the first time she has spoken about this, and there have been other documentaries about pornographic deepfakes – so how is Morris’s film different?
“My film doesn’t dwell on the perpetrator – I’m not interested in the headspace of the person who did it,” the director said. “My main aim is for you to accompany Helen through this story.
“You are there with her at every stage – when I met her, she was still processing it, trying to work it out. So the only way you can really feel it is to be there, right next to her.
“What struck me when I met Helen was that you can sexually assault someone without having physical contact with them.
“That’s what motivated me; I think that’s what shocked me the most.”
The trauma caused by deepfake porn is very real. Prof Clare McGlynn of Durham University, an expert on image-based sexual abuse, told the BBC: “The impact can be devastating and life-changing.
“Many victims describe a form of ‘social rupture’, where their lives are split between ‘before’ and ‘after’ the abuse, and the abuse affects every aspect of their life – professional, personal, economic, health, wellbeing.”
Helen says in the film: “I genuinely feel as if those are real images, and it’s hard to explain to people who haven’t seen it happen with their own pictures what it really does – they haven’t actually done anything to me.
“But they’ve put all this stuff in my head, these pictures, and I can’t unsee them. I can’t even look at perfectly innocent, unaltered pictures in the same way.
“Now I look at that picture [an image of her in a dress] and, if I look at it in isolation, I feel like it’s the image of an attack.”
Morris added that Helen had been profoundly affected by the images.
“The most disturbing aspect is that I don’t think you can separate images from memory that easily. You know, when you look back, you often can’t tell whether you’re remembering the moment itself or the photograph of it.
“So what happened to Helen is that these images, already bound up with her memories, were hijacked and almost planted in her mind as a kind of false memory. And you can’t really measure that harm.
“It’s a form of psychological violation, and it’s tied to the emotional attachment we have to the original image.”
‘Minimize and trivialize’
“One comment posted alongside the deepfakes noted that the photo is ‘still your photo… it’s still abusive’,” she said.
A 2019 report by Sensity AI, a company that monitors deepfakes, found that 96% of deepfakes were non-consensual sexual images and that, of those, 99% depicted women.
Professor McGlynn agrees, saying: “Women are far more likely to face this abuse, and it is mostly men who carry it out.
“Society doesn’t have a good record of taking crimes against women seriously, and this is also the case with deepfake pornography. Online abuse is often minimized and trivialized.”
Helen also spoke of the unimaginable anxiety of not knowing who created the photos.
“In each of those pictures, my eyes were fixed on the camera,” she said. “But through it all, this person – this collector, this hoarder of images – has no face.”
She was even more horrified to learn that the police could do nothing to prosecute whoever had created the images.
They told her there was nothing they could do, because no crime had been committed: creating deepfake images was not illegal.
The law in Scotland already allows police to investigate such cases, but the current law in England and Wales does not.
Professor McGlynn said the changes in the bill were “long overdue” but that, for it to really work, “it needs to make online violence against women and girls a priority issue, and include measures to ensure internet platforms take this abuse seriously”.
The Department for Science, Innovation and Technology, the body responsible for the bill, told the BBC it is expected to become law this year.
‘Think how others live’
Morris concluded that her film was “trying to raise questions”, adding: “I don’t have a solution, but I really feel it’s something we need to pay attention to.”
The importance of documentaries should not be underestimated, she said, adding: “I think they give you a window into other people’s lives.
“I feel like now, with social media, we are so caught up in our own experiences and in how we present ourselves.
“It’s important to think about how other people live and experience the world.”