Not Quite Facsimile

Maxwell Mijnlieff



In 2018, I was marked suspicious by an automated facial recognition system. This designation led to a series of policing interactions and, unbeknownst to me, initiated a program of personal surveillance. I still have unanswered questions about its actual effects, but I do know that it has permanently shifted my relationship to photography and to my own image.

Employing various digital tools for face isolation, alignment, and analysis, I am trying to image the entity created by surveillance systems. The images and networks of data these systems produce become a second entity, a kind of imaginary being made of discernible parts of us but never truly representative of our humanity. By analyzing and imaging myself through generative adversarial networks (GANs), neural networks that learn to produce convincing synthetic images, I am exploring the possibility of rewriting, and taking ownership over, that second entity which is created when we are surveilled.

The images used are either photographs of myself processed to obfuscate my identity as a surveillance deterrent, images created by a GAN trained to make new photographs of me, or images of me otherwise interpreted or generated by neural networks. The network imagines a world for me, it imagines images of me, it projects its set of rules about what I am onto me. Because it is trained with the implicit biases of human society, it imposes those biases on my humanity and passes them off as fact, much as nineteenth-century practitioners like Alphonse Bertillon and Havelock Ellis projected their inherited biases onto their subjects as if they were fact. I have combined the computer-generated works with appropriated historical images of surveilled people, images that were instrumental in the foundations of modern lens-based surveillance, as well as ephemera from patents and documents relating to the use of imaging to control, police, and oppress subjects.

All rights reserved, 2020.