Boss, Bitch | 2018

Algorithms are sets of instructions created by humans for machines. Because humans build them, unconscious bias, including bias around gender, can slip in. What happens when we program a machine to recognize certain things but then expect it to give us neutral results? Will it be neutral?

For this project, I downloaded a library of images from Google Images using the words ‘boss’ and ‘bitch.’ I then ran my image data through a machine learning algorithm built with TensorFlow, training the machine to recognize these images. Afterwards, I downloaded image libraries for ‘men’ and ‘women,’ merged them, and asked the machine to filter and assess the images against two questions: “Which images are the most ‘boss’?” and “Which images are the most ‘bitch’?” The machine identified the images and assigned each a percentage prediction based on its confidence level. I took the top 30 for each word, averaged them, and created generative portraits of my results.
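The selection-and-averaging step could be sketched roughly as below. This is a minimal, hypothetical illustration: the filenames and confidence scores are invented, the images are stood in for by tiny flat pixel lists, and the real project used TensorFlow predictions and Photoshop compositing rather than this code.

```python
def top_k(preds, k=30):
    """Return the filenames of the k highest-confidence predictions."""
    return sorted(preds, key=preds.get, reverse=True)[:k]

def average_images(pixel_arrays):
    """Pixel-wise mean of equally sized images (here, flat grayscale lists)."""
    n = len(pixel_arrays)
    return [sum(px) / n for px in zip(*pixel_arrays)]

# Hypothetical confidences from a trained 'boss' classifier
preds = {f"img_{i:03d}.jpg": 1.0 - i * 0.01 for i in range(100)}
chosen = top_k(preds, k=30)

# Toy 2x2 "images" standing in for the selected photos
library = {name: [i, i, i, i] for i, name in enumerate(chosen)}
portrait = average_images([library[name] for name in chosen])
```

The averaged pixel values form a single composite image, which is the basic idea behind an averaged generative portrait.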

Note: This art project is merely a commentary on what I discovered, not a scientific experiment.

Tools: TensorFlow, Photoshop

A generative portrait of Boss from 2001-2018. 

A generative portrait of Bitch from 2001-2018.

A generative portrait of Boss from 2001-2010. 

A generative portrait of Bitch from 2001-2010.
