Cheating by Hariadhi, CC BY-SA 3.0
In the beginning…
Generative artificial intelligence (AI) is just another way to lie, cheat, steal, and avoid the hard work of writing, creating, or composing original work. At least, that was my initial thought before I spent any serious amount of time reading about how it actually works. Like many teachers, my first serious introduction to the use of AI was seeing my colleagues in the English department thrown into utter turmoil by the deluge of ChatGPT-generated essays being submitted as original work. This cemented the idea that generative AI was merely a new way of engaging in dishonesty, much like cheating on an exam or paying someone else to do your homework.
Legitimate Concerns
Of course, the reality is much, much more complicated than it initially appears. Artists, writers, and composers do have legitimate concerns regarding the use of their intellectual property to train AI. As Bernard Marr of Forbes articulated in his article "Is Generative AI Stealing From Artists?", AI companies are profiting from the legitimate work of creators without formally compensating or recognizing them. He worries that AI will inundate the market with derivative works created in the style of a particular creator, diminishing the demand for original work. While I agree that this may come to pass, part of me believes that original work will always hold more value than cheap knock-offs. The $20 Tiffany-style lamp sold on Amazon cannot hold a candle to an authentic hand-crafted piece from the turn of the 20th century. The rub lies in how good generative AI models become and in the nature of digital goods. If one cannot tell the difference or "feel" the craftsmanship of the item, will its provenance as an original hold as much weight? Marr further points out that an author's work may be used in ways that they find morally or ethically objectionable. What happens when a hate group creates a song in the style of an artist for a rally? What if a political figure creates posters using work from an artist who would never endorse them?
Legitimate Questions
Can AI be creative in its own right? What does it mean to be creative? Where is the line between intellectual property theft and fair dealing?
In his article, Advait Sarkar wrestles with some of these questions, and in the process he significantly reframed my understanding of the debate. I have summarized some of the salient points he raises below.
What matters most: process or product?
Sometimes creativity resides in how something was done rather than in what was produced. Sarkar considers artists like Jackson Pollock, whose motion can be felt in his paintings and is considered by some to be far more important than the quality of the final piece (I for one find Pollock's paintings aesthetically pleasing, but that is a discussion for a different blog entirely). This argument could be extended to AI-based algorithms whose programming may be written in a particularly novel or inventive way.
How much does the author’s intent impact the value of the work?
As Sarkar points out, "readymade" art has a long-established history. He notes that the value of Duchamp's "Fountain" is not just as a urinal; it became art because of the author's intentions. He argues that our culture of sharing memes gives a degree of ownership to the individual who shares and repurposes the image rather than to the person who is the subject of the photo or who initially took it. Could this idea extend to AI? For example, if one directs AI to create an image, is that enough ownership for the work not to be considered plagiarism?
How does one properly attribute art when it isn’t created in a vacuum?
If you look at the top of this web page you will see a drawing I did in 2014 of a bored chinchilla sitting at an office desk. By most standards, this work would be considered wholly original. But did I create this work solely on my own? No. First and foremost, I had to look at reference photographs of these little creatures because I have only ever interacted with one, at a petting zoo when I was six. Is my work influenced by other cartoonists? Absolutely. I grew up cutting out newspaper comic strips, reading comic books, and watching Saturday morning cartoons. All of these influences contributed to my particular "style," which is not completely original. Is any of this documented below the work? No. As Sarkar argues, understanding the context of a work of art is essential for evaluating creativity, but this is nearly impossible (and even if it were possible, would it be desirable?).
If using AI is plagiarism and theft, wouldn’t this logic apply to collage (and similar art forms)?
Here Sarkar points out that many art forms rely on the reuse of materials: photographs in collage, voice samples in hip-hop music, and quotations in jazz improvisation. If we deem these methods of expression acceptable, then, as he put it, "…the reuse of and non-attributive natures of AI are not universal grounds to deny creativity, or accuse it of plagiarism." He instead points to the context of the situation as a better guide: would this be acceptable in this particular situation?
Are we looking at this the wrong way?
One of the most compelling arguments that Sarkar makes (in my opinion) is that the nature of AI might change what we look for in both process and product. As he points out, perhaps writing essays will no longer be the standard for demonstrating knowledge in an English class; instead, one's ability to prompt AI well, check the output for errors and fallacies, and know how and when to integrate it into one's workflow will become the markers of success and failure. I don't know how I feel about this, as it is a fundamental shift in the role of teachers, but just as cameras diminished the need for skilled portraiture, perhaps AI will reduce the need for writing to the point that manually recording thoughts becomes obsolete.
The picture becomes a lot murkier as I dive deeper into the ethical implications of AI, that is for sure.