The previous installment of Moral Guillotines, “ChatGPT and Demonstrating Care”, focused specifically on the case of students who might use ChatGPT to complete academic assignments. In that post, I argued that ChatGPT is symptomatic of a larger systemic issue: one that emphasizes product over process, privileges polish over genuine engagement and substance, and makes failure catastrophic rather than desirable.
In this post, I want to examine how a similar impulse is at work in the way we treat creative workers, and how this plays out when AI is used to replace them (and of course there is myriad software aside from ChatGPT, across various disciplines, that makes this possible). I’m going to argue that demonstrating care for creative workers must go beyond demonstrating appreciation for creative products, so that we may accurately value the labour of the creative process that goes into those products. Furthermore, I’m going to suggest that this process of artistic labour reveals an important way in which artistic products create and are embedded in culture, and that AI-created artistic products place this culture at risk.
The Devaluation of (Creative) Labour
All labour deserves compensation, and appreciation. I discussed last week how a focus on product over process, when it comes to academic work, treats students as a means to an end rather than treating them and their learning as ends in and of themselves. Similarly, I think that the focus on AI outputs masks the process of artistic creation, and risks ignoring and undervaluing the labour involved in that process (and not just the process as a product itself, e.g. timelapse videos of creations).
Creatives already struggle to be compensated for the process parts of their labour: the hours of work and learning that go into their skills, as well as the hours of work a piece takes to produce. For instance, writers are often (under)paid per word, or per work, rather than per hour.
Additionally, I think the erasure of process labour, especially in the case of creative work, gives consumers of that work a skewed idea of how it is created. Although artistic creations can be produced through a process like any other kind of labour, in practice the process tends to be much more intimate, and is often connected to the artist’s growth as a person.
Think on how often the process of creation becomes part of the narrative of the creation itself: Tolkien’s works were largely influenced by both his academic work and his experiences as a veteran, Guillermo del Toro talks extensively about the personal and political influences on his films, and anyone who has ever listened to a Taylor Swift album would be hard pressed to suggest that her music could be divorced from her personal life. Even for those creatives who are not celebrities in their own right, it is difficult to imagine how the creative process might be completely divorced from the experience of living an embodied existence in a particular society and as part of a diverse larger culture. Disrespect for the process of creating artistic products is, in many cases, disrespect for the artist’s personhood. Unless, of course, you’re a machine learning algorithm.
The products of machine learning algorithms look like their artistic counterparts, in the same way the burger from The Menu might look like a burger from McDonald’s, or a handcrafted chair might look very similar to a factory-made, mass-produced one. This reveals a few things, I think. First, many different kinds of labour can be artistic, creative labours. Second, just as we have allowed mass-produced, mechanized labour to devalue human labour, shifting it to low- and middle-income countries where the human suffering inherent in that “robot” labour is camouflaged, the seemingly artistic products of ML algorithms rely on a similar devaluation of human labour, and on shifting that labour to places where its exploitation and suffering can be more easily camouflaged.
The Threat of Cultural Stagnation
Finally, while ML algorithms’ products might appear to be embedded in culture in the same way that they appear to resemble the culture they imitate, I just don’t see this being possible with the current state of the technology. ML algorithms are inherently biased by their datasets in a way that differs from an artist’s potential influences. An artist’s influences are bounded by the things they have experienced, and no human’s experience can be as vast as an algorithm’s corpus. Where an artist might be influenced secondhand (e.g. by experiencing a piece of art inspired by another piece of art), an algorithm will always have the full picture, and it will be hard for it to evolve.
In machine learning, the influence of previous training data generally cannot be deleted: a model cannot shed old chaff and keep functioning in a blanker, newer state. Nor can new data easily be trained into an existing model (such as in an effort to reduce bias) without degrading what it already learned, because of the issue of catastrophic forgetting. The idea of an algorithm learning, growing, adjusting, and being influenced by the changes of society and culture is simply not possible without the creation of a brand-new algorithm. This is fundamentally different from a human creator, and I think the risk of a stagnant culture as a result of ML-created artistic products is a frightening one, especially given that many marginalized folks need biases to be addressed and cultures to change in all kinds of ways, for all kinds of important reasons.
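For readers who haven’t seen catastrophic forgetting up close, here is a toy sketch of the phenomenon. It is not how any production art-generating model is built — it’s a deliberately minimal, adversarial setup (a single logistic-regression model trained on two conflicting tasks, with made-up data) meant only to show how sequential gradient training overwrites what came before:

```python
import numpy as np

# Toy illustration of catastrophic forgetting: one tiny logistic-regression
# model trained sequentially on two conflicting tasks. After training on
# task B, accuracy on task A collapses, because plain gradient descent
# overwrites the weights that encoded task A.
rng = np.random.default_rng(0)

def make_task(sign, n=200):
    # 2D points; the label depends on the sign of the first coordinate.
    X = rng.normal(size=(n, 2))
    y = (sign * X[:, 0] > 0).astype(float)
    return X, y

def train(w, X, y, lr=0.5, epochs=100):
    # Plain batch gradient descent on the logistic (log) loss.
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)   # gradient step
    return w

def accuracy(w, X, y):
    return float((((X @ w) > 0).astype(float) == y).mean())

Xa, ya = make_task(+1)   # task A: positive first coordinate -> class 1
Xb, yb = make_task(-1)   # task B: the opposite rule

w = np.zeros(2)
w = train(w, Xa, ya)
acc_a_before = accuracy(w, Xa, ya)   # model has learned task A well

w = train(w, Xb, yb)                 # continue training on task B only
acc_a_after = accuracy(w, Xa, ya)    # task A has been "forgotten"

print(acc_a_before, acc_a_after)
```

The point of the sketch is the asymmetry with human learning: the model has no way to hold both tasks at once under this naive training regime, and recovering task A would mean retraining on task A’s data again, not remembering it.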
What Can Be Done About It?
I’ve discussed previously that one of the major things we can do to support creative labour is to implement a Universal Basic Income (UBI). I’m also a huge advocate for unionization and collective action. Beyond this, though, I think that in the era of ubiquitous content creation, we all need to re-learn respect for creatives as human beings who are both separate from and intimately entangled with the products they produce.
If you consume an artistic product you love, really try to reckon with the hours and hours of labour that someone (or many someones!) put into it, how those hours might have impacted their personal lives, communities, and cultures, and how those lives, communities, and cultures might have impacted the work in turn. Let’s experiment together and see whether appreciating the things you love in this more networked way can help you find a deeper love for them, and for how they might connect to your own life, community, and culture.
I have a suspicion that if we cultivate a deeper mindfulness around these things, the shine of AI works might just wear off. It’s hard for me to imagine a world where our love for *things* is intrinsic to those things, rather than rooted in what they tell us about, and help us express about, our connected, embodied, social experience of life. Perhaps after some deliberation you’ll feel the same.
If you’re interested in my work and would like to reach out for any hiring opportunities, please contact me at Type-Driven Consulting.
Or if you like the work I do here, consider supporting it by buying me a ko-fi!