15.12.2019

Interview: Bill Posters on Al Jazeera’s Listening Post

‘Politics, porn and the toxic world of deepfakes’, Listening Post, Al Jazeera

I had the pleasure of being interviewed by The Listening Post’s Tariq Nafi for the December 14th edition of the programme. Evocatively entitled ‘Politics, porn and the toxic world of deepfakes’, the programme takes a critical look at deepfake technologies and the cultural practices that surround their use today. The feature includes coverage of Big Dada by myself and Daniel Howe, as well as my latest project in partnership with Future Advocacy, entitled Partly Political Broadcasts. This project used deepfake technologies to create AI-synthesised personas of UK Prime Minister Boris Johnson and the leader of the opposition, Jeremy Corbyn, in order to raise awareness of the distinct lack of legislation concerning the spread of misinformation online. Released during the 2019 UK election cycle, the project received wide press coverage across all major UK newspapers and included an embedded BBC News team throughout the making of the project.

You can view the segment below:

Synopsis:

Deepfakes are videos produced through the use of artificial intelligence. Melding images and sound, using things like face grafts, body transfers or voice imitations, they make people appear to say things they never said and do things they never did. They appear so real it is often difficult to tell if they are fake.

For now, the vast majority of deepfakes, 96 percent to be precise, target women, mostly celebrities: one woman’s face is superimposed onto another’s body, most often in fake pornographic videos.

Bill Posters, digital artist and researcher, explained to The Listening Post’s Tariq Nafi that: “It’s women’s bodies, identities and rights that are being transgressed and oppressed basically by quite a small but quite a prolific body of actors that are taking famous celebrity female actors’ faces and transplanting those into pornography scenes, and there are huge websites that profit millions of dollars from displaying and sharing and streaming these kinds of deep fake pieces of pornography.”

The bigger concern is this: Deepfakes could be used to spread misinformation, mess with politics, and manipulate electorates by fooling journalists and voters. And there are real-life examples already playing out across the world.

Take Gabon, where there were suspicions that all was not well with President Ali Bongo. Having been abroad for medical treatment, he had not been seen in public for months. Reports of his good health did nothing to convince the public, so the government released a video.

But something about the video did not add up. Bongo’s eyes barely moved or blinked. He stared off camera, his body and hands seemed rigid and unnatural. The video looked fake, fake enough for Gabon’s military to attempt a coup – that is how easily a deepfake can be used to destabilise an entire nation.

In this new era of democratised access to video synthesis, deepfakes are not all that hard to produce. So what happens when those with serious technical know-how get in on the act?

Posters has done exactly that, harvesting the biometric data of British Prime Minister Boris Johnson and leader of the opposition Jeremy Corbyn to create a synthesised video of each man endorsing the other for prime minister.

The frighteningly real results are a statement on some of the most pressing issues of our time: Data, our right to privacy, and, as British voters went to the polls, the integrity of the democratic process.

Henry Ajder, head of Communications and Research Analysis at Deeptrace, explained that: “The mere idea of deepfakes alone is enough to, kind of, destabilise political processes and poison the collective water that is media. We rely on audio-visual media every day to inform us about what’s going on in the world. If deepfakes become commonplace, as we anticipate they will, you know, that will lead to, I believe, quite a significant level of disruption.”

Feature contributors:

Henry Ajder – head of Communications and Research Analysis, Deeptrace

Britt Paris – assistant professor of Library and Information Science, Rutgers University

Bill Posters – digital artist and researcher


Thanks to Tariq and the team at Al Jazeera’s Listening Post for their diligent and in-depth reporting.