Research: AI, Deep Fakes and Censorship

WARNING: This post contains material which is NSFW.

“As the story goes, the web arrived pregnant with possibilities to do away with ascribed identity. It would allow us to rethink who we are by allowing us to transcend geographic location, physical ability, race, gender, age, even species.”

(Jurgenson, N. 2019; 87)

Throughout my MA, the above quote from Jurgenson’s The Social Photo (2019) has been a key reference and inspiration for creating work: it made me question what social media monopolies do with the data we freely give them, who truly owns it, and whether that data ultimately becomes an entity in its own right, orphaned in a sea of data and converted into an ‘immortal’ online self that is of you, yet not you. This analysis is echoed by both Storr and Ravetto-Biagioli, who argue that the more we input, the more our profiles come to identify as us while not belonging to us: a form of idealised perfection achieved through self-censorship. Perhaps unsurprisingly, as a younger millennial I too am a guilty member of the self-curation, self-censorship club in what I do, say, write and photograph; even the little I do post could be too much in the eyes of others who have authoritarian leanings and hold views which oppose or contradict mine.

Figure 1 Murray, J. (Jan 2021) Example of how (Thiel, T & /p (2021) Lend Me Your Face!) works

With the rise of deep fakes, visual depictions are increasingly under threat of manipulation, fuelling not only more disinformation but also the risk of being discriminated against for something you did not actually do; the internet and social media have become places of worry and concern (which is, I suppose, ironic, given that I am doing an online distance course and writing on a social blogging platform!). Some deep fake projects, like The Photographers’ Gallery’s commission of Tamiko Thiel and /p’s Lend Me Your Face! (2021), have been created with good intentions, aiming to educate viewers to question their realities and the media they consume, and to show how little information is needed to create a semi-convincing deep fake video from freely available open-source code.
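
To give a sense of just how low that barrier to entry is, below is a minimal sketch of a crude still-image face swap using only the open-source OpenCV library. To be clear, this is not the reenactment technique behind Lend Me Your Face! (which animates a still portrait), and the file names are placeholders; it is simply an assumed, generic stand-in showing that one source photo and a few dozen lines of freely available code are enough for a rough swap.

```python
# A crude, illustrative face swap using only the open-source OpenCV library.
# This is NOT the technique behind Lend Me Your Face!; it is a stand-in
# showing how little input and code a rough manipulation needs.
# "source.jpg" and "target.jpg" are placeholder file names.
import cv2
import numpy as np

# Haar cascade face detector bundled with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def largest_face(image):
    """Return (x, y, w, h) of the biggest detected face in a BGR image."""
    grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    assert len(faces) > 0, "no face detected"
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return int(x), int(y), int(w), int(h)

source = cv2.imread("source.jpg")   # the face being borrowed
target = cv2.imread("target.jpg")   # the photo it is pasted into

sx, sy, sw, sh = largest_face(source)
tx, ty, tw, th = largest_face(target)

# Resize the source face to the size of the target face region.
face_patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))

# Poisson (seamless) cloning blends the patch into the target photo.
mask = np.full(face_patch.shape, 255, dtype=np.uint8)
centre = (tx + tw // 2, ty + th // 2)
swapped = cv2.seamlessClone(face_patch, target, mask, centre, cv2.NORMAL_CLONE)

cv2.imwrite("swapped.jpg", swapped)
```

The result would be far cruder than a learned model’s output, but that is rather the point: if a throwaway script can do this much, purpose-built tools need very little more from their victims than a single public photograph.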

Figure 2 Murray, J. (Jan 2021) Screenshot of Twitter account deepsukebe

On the other hand, projects like deepsukebe.io have clearly been made with nefarious intentions from the get-go, with their official Twitter bio stating: “AI leveraged nudifier. Revealing truth hidden under clothes. Our mission is to make all men’s dream come true. More powerful than deepnude – undress any dress” (Deepsukebe, Oct 2020). For obvious ethical reasons, I deliberately opted to use my Phicen dolls as the subject to see just how well it worked. The results were, to say the least, disturbing: if I did not know what my dolls look like nude, and if the hints of artifacting were not so obvious, I could quite easily have been fooled into thinking the outcome was genuine.

Figure 3 Murray, J. (Jan 2021) Screenshot of Reddit post “How is this NOT involuntary pornography? Men are advertising an AI website that makes anyone nude.” on r/BlatantMisogyny

Especially given the context in which I found this website (a member of Reddit’s r/BlatantMisogyny shared a screengrab of another Reddit forum discussing the tech), I am inclined to agree, having trialled it, that this website is effectively advocating involuntary pornography. Figure 4 shows an example I selected (an early developmental image from Unsocial Media_, the Phicen doll in a busy urban environment) and demonstrates just how easily the script could be used and abused on mundane street images of individuals, calling into question the very privacy of the self.

Figure 4 Murray, J. (2019) Early image from development of Unsocial Media_ (left) – Deepsukebe’s generated nude (right) – use slider to compare

Deepsukebe is not the first to come up with nudifying still photographs: in 2019, VICE reported on another AI nudifier called DeepNude, which was claimed to have been taken down by its creator due to the backlash received after journalists from various outlets reported on its existence. However, despite appearing to have gone offline, the script seems to have spawned other sites, as well as a .to domain also called DeepNude; whether this site has anything to do with the original developer beyond its name and the script it uses is unclear.

Figure 5 Nolan, Z.W (2020) Medium: The Complete Guide : Best Deepnude App (2020)

What I do know, however, much like the Medium reviewer Nolan (I have reason to believe this is an alias, given that the user’s profile photo is a GAN-generated image from thispersondoesnotexist; a reverse image search revealed no other web presence), whose comparisons used Deepsukebe’s stock examples as their starting point, is that of the options available, Deepsukebe is the most disturbing in terms of success on clothed individuals, especially if you are a woman. Its attempts at making men nude are far less successful, giving one of my male dolls breasts as well as male anatomy. Thankfully, it also struggles if you are wearing clothes which hide your shape, becoming completely confused about where my doll’s legs went in one image (both examples can be found within Figure 6‘s slideshow).

Figure 6 Murray, J. (2019-2020) BTS and developmental images for Unsocial Media_ & The Mirror Hack’d put through ‘Deepsukebe’

Figure 7 Murray, J. (2019-2020) BTS and developmental images for Unsocial Media_ & The Mirror Hack’d put through ‘Deepnude.to’

Meanwhile, the script that ran DeepNude is far less successful: despite using the same images in both experiments, it appeared incapable of coping with busy mundane landscapes with a figure in them, randomly inserting breasts into the image or simply replacing contrasting clothes with breasts (Figure 7 slideshow), producing amusing outcomes that are far less worrying with regard to privacy. I have to admit, though, that the development of projects such as Deepsukebe concerns me about what I put out on the web in terms of selfies, as it only takes one person with bad intentions to create an issue that is ultimately a lie, yet could still negatively impact an individual’s life.

Self-censorship is, of course, a very different beast to actual censorship, which is an angle I am keen to explore in this module, especially given the news this week that Twitter is soft-launching a moderation project called ‘Birdwatch’, which seeks to use its userbase to flag and annotate misinformation on the platform, in an attempt to fix the issues the site has had. Unsurprisingly, this announcement has raised its own concerns, with some drawing a comparison between Birdwatch and Big Brother in 1984, should the concept fail and effectively become a biased echo chamber of its own post-truth. As I explored last module, Twitter is already a cesspit of misogyny, with the tweeters of such content rarely facing any kind of repercussion for their actions, while women on the site face more repercussions for highlighting said misogyny or for calling out a user for sending them unsolicited dick pics. Who’s to say Birdwatch won’t amplify these issues, repeating the errors of Reddit, which has one-sidedly banned radical feminist forums whilst leaving up radical manosphere forums and revenge pornography forums that largely target women? Orwell, I think, aptly sums up the potential issue:

“In the end the Party would announce that two and two made five, and you would have to believe it. It was inevitable that they should make that claim sooner or later: the logic of their position demanded it. Not merely the validity of experience, but the very existence of external reality, was tacitly denied by their philosophy. The heresy of heresies was common sense.”

Orwell, G. (1950; 80)

Figure 8 Murray, J. (2020) BTS for The Mirror Hack’d put through ‘Deepnude.to’ – left original attempt, right second attempt

UPDATE 29/01/2021: Talking to a family member about this technology, they asked whether, if I put an image through twice, the outcome would always be the same. Ironically, this was something I had overlooked, so today I re-ran two of the less successful outcomes: the Phicen female doll in the long flowy dress and the Jiaou male doll, who ended up with breasts in the last trial. Interestingly, in both instances the outcome differed from my original attempts, with the Jiaou male not ending up with breasts and the Phicen female having two distinct legs, albeit still heavily riddled with artifacting (Figures 8 & 9).

Figure 9 Murray, J. (2020) BTS for The Mirror Hack’d put through ‘Deepnude.to’ – left original attempt, right second attempt
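
Why would the same photo come out differently on a second pass? I have no insight into Deepnude.to’s internals, but many image-to-image generative models sample random noise alongside the input image, so each run decodes a slightly different output. The toy PyTorch sketch below illustrates that general, assumed mechanism only; it is not a reconstruction of the actual tool.

```python
# A toy illustration (not Deepnude.to's actual model) of why a generative
# image-to-image network can return a different result for the same input:
# the generator samples fresh random noise on every forward pass, so two
# runs on one photo decode two different outputs.
import torch
import torch.nn as nn

torch.manual_seed(0)  # reproducible input; the two passes still draw different noise

class ToyGenerator(nn.Module):
    """Stand-in for an image-to-image generator that mixes its input with noise."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels * 2, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, channels, kernel_size=3, padding=1),
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        noise = torch.randn_like(image)              # a fresh latent every call
        return self.net(torch.cat([image, noise], dim=1))

generator = ToyGenerator().eval()
photo = torch.rand(1, 3, 64, 64)                     # the "same photo" both times

with torch.no_grad():
    first = generator(photo)
    second = generator(photo)

# Different noise, different output, even though the photo never changed.
print("identical outputs?", torch.allclose(first, second))  # -> False
```

If something like this is happening under the hood, it would also explain why the artifacting shifts around between attempts rather than repeating in the same places.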

References

Figures

Figure 1 Murray, J. (Jan 2021) Example of how (Thiel, T & /p (2021) Lend Me Your Face!) works [Online] Available from: https://vimeo.com/505328146 [Accessed 27/1/2021]

Figure 2 Murray, J. (Jan 2021) Screenshot of Twitter account deepsukebe. [Online] Available from: https://twitter.com/deepsukebeio [Accessed 27/1/2021]

Figure 3 Murray, J. (Jan 2021) Screenshot of Reddit post “How is this NOT involuntary pornography? Men are advertising an AI website that makes anyone nude.” on r/BlatantMisogyny. [Online] Available from: https://www.reddit.com/r/BlatantMisogyny/comments/l3f0db/how_is_this_not_involuntary_pornography_men_are/ [Accessed 27/1/2021]

Figure 4 Murray, J. (2019) Early image from development of Unsocial Media_ (left) – Deepsukebe’s generated nude (right) – use slider to compare

Figure 5 Nolan, Z.W (2020) Medium: The Complete Guide : Best Deepnude App (2020). [Online] Available from: https://medium.com/@ZachWillNolan/the-complete-guide-best-deepnude-app-2020-58a8f3e10da9 [Accessed 28/1/2021]

Figure 6 Murray, J. (2019-2020) BTS and developmental images for Unsocial Media_ & The Mirror Hack’d put through ‘Deepsukebe’

Figure 7 Murray, J. (2019-2020) BTS and developmental images for Unsocial Media_ & The Mirror Hack’d put through ‘Deepnude.to’

Figure 8 Murray, J. (2020) BTS for The Mirror Hack’d put through ‘Deepnude.to’

Figure 9 Murray, J. (2020) BTS for The Mirror Hack’d put through ‘Deepnude.to’

Bibliography

Cole, S. (2019) VICE: Creator of DeepNude, App That Undresses Photos of Women, Takes It Offline. [Online] Available from: https://www.vice.com/en/article/qv7agw/deepnude-app-that-undresses-photos-of-women-takes-it-offline [Accessed 27/1/2021]

Deepsukebe (2020) Twitter Bio. [Online] Available from: https://twitter.com/deepsukebeio [Accessed 27/1/2021]

Jurgenson, N. (2019) The Social Photo: On Photography and Social Media, pg 87. London; Verso.

Nolan, Z.W (2020) Medium: The Complete Guide : Best Deepnude App (2020). [Online] Available from: https://medium.com/@ZachWillNolan/the-complete-guide-best-deepnude-app-2020-58a8f3e10da9 [Accessed 28/1/2021]

Orwell, G. (1950) 1984. pg 80. New York; Signet Classics.

Paul, K. (2021) The Guardian: Birdwatch: Twitter pilot will allow users to flag misinformation. [Online] Available from: https://www.theguardian.com/technology/2021/jan/25/twitter-birdwatch-misinformation-donald-trump-election [Accessed 28/1/2021]

Ravetto-Biagioli, K. (2019) Digital Uncanny. pg 57. New York; Oxford University Press.

Storr, W. (2017) Selfie: How the West became self-obsessed. pg 17 (The Dying Self). London; Picador.

Thispersondoesnotexist; u/ProfeshPress [in] r/TIDNE (2020-) Reddit: This 32-year-old cybersecurity analyst and whistleblower Does Not Exist. [Online] Available from: https://www.reddit.com/r/TIDNE/comments/gfg8p9/this_32yearold_cybersecurity_analyst_and/ [Accessed 28/1/2021]

Thiel, T. &/p (2021) Lend me Your Face! [Online] Available from: https://tamikothiel.com/lendmeyourface/online/GoFakeYourself.html [Accessed 27/1/2021]

The Photographers’ Gallery (2021) Digital Project: Lend me Your Face! Tamiko Thiel and /p. [Online] Available from: https://thephotographersgallery.org.uk/lend-me-your-face [Accessed 27/01/2021]

u/PeachBiPi [in] r/BlatantMisogyny (2020-) Reddit: How is this NOT involuntary pornography? Men advertising an AI website that makes anyone nude. [Online] Available from: https://www.reddit.com/r/BlatantMisogyny/comments/l3f0db/how_is_this_not_involuntary_pornography_men_are/ [Accessed 27/1/2021]