On Artists, Artisans, Art, & AI
Faithful Reproduction, Myths of Individual Genius, and LLMs under Capitalism
It’s been another loaded week for the SFF industry on social media, thanks to the launch and immediate ratio-ing of Sudowrite, an app based on GPT-4 technologies that its creators claim will help would-be writers craft long-form fiction—and that most everyone else in the field considers a troubling development.
Jason Sanford runs an excellent Genre Grapevine newsletter that recently covered all the ins and outs of the controversy around Sudowrite, which from stated intent to formative source materials has met with stern rebuke from those who regard the app as promoting plagiarism, devaluing creative work, and offering a shiny new tool for corporate enterprise already eager to cut wages and jobs.
Because frontline industry coverage is already handling all the immediate panic and outrage, though, I want to address some of the argumentative incoherence that necessarily arises when people start expressing disapproval of a given product. Very often, the outrage of fellow creators—while perhaps legitimate for other reasons—will trade on arguments that don’t hold up well to broader scrutiny of other industry practices. And that doesn’t serve us well in the pursuit of a more ethical industry overall.
As the contemporary philosopher Dan Dennett once wrote, “There's nothing I like less than bad arguments for a view I hold dear.”
But the discourse around large language models (LLMs, popularly referred to as “AIs”) is full of them.
To understand why, though, we’re going to dive into literary history.
Our literature of replication
Some of the most fundamental texts in our cultural history are either anonymous, or were given to us as having authors we cannot corroborate. The Epic of Gilgamesh, Beowulf, huge tracts of the Jewish, Christian, and Islamic holy texts: it is the tradition of our societies not only to accept some stories as “received” but also to celebrate fealty in the replication of those “received” tales as the core trait of good storytelling.
Certainly, concrete authorship exists alongside sites of anonymity: the gospels may have unknown authorship, thanks in part to the lost Q, M, and L sources, but seven of the thirteen writings ascribed to Paul are confidently viewed as his own. Anglo-Saxon authorship might be lost, but we have a wealth of Greco-Roman writers from preceding periods who gained clear fame from their poetry, plays, and philosophy.
History contains multitudes, in other words: traditions of replication with fealty, and traditions in which individual exceptionalism matters.
These histories continued with early manuscript culture, in which a scribe might sneak some sign of their name into the margins or accompanying artwork of an elaborate copy, but where the great romaunces of the age, even when rewritten with significant variation from one middle-class housewife’s home-book of tales, prayers, and recipes to the next, would still often be set down without explicit authorship. Books written by specific authors also existed—Boccaccio’s The Decameron, Chaucer’s Canterbury Tales—but in the same way that an alien culture might rightly consider Danielle Steel more important than many highbrow authors today, for sheer bulk of content present under her name, so too did widespread medieval popularity lie more often with unauthored but routinely copied chivalric and ecclesiastic tales.
Mind you, storytelling then was still a significantly fluid phenomenon, transpiring both on and off the page—which is why spelling conventions were also not of great importance until the work of nation-building compelled the formulation of a supporting literary canon, a history of “great” writers and rules befitting a “civilized” European tongue. (This was especially true for English, the self-conscious late-comer to such civilizing pressures in spelling, grammar, and canon.)
Today, when someone reads either a Christian Bible or a 19th-century novel, they are by and large reading the work entirely outside of its original tradition, for neither type of literature was ever first distributed in so controlled and complete a form. In the case of early Jewish and Christian texts, the living word preceded scribing, and scribing became a site of debate, interpretation, and deliberation over inclusions and exclusions for many centuries after the origin of each tale. For the 19th-century novel, so well known as a lengthy literary form today, the work more often than not began as chapters published before the whole was even written: the stuff of newspaper and magazine intrigue to be shared and read aloud one issue at a time—with early reviews often informing how the writer would then continue their massive tales.
Throughout these centuries, notions of publication and plagiarism also underwent significant transformations. The term “plagiarism” has an unusual history: though first put to use by Martial, a Roman poet who reacted to others stealing his work by writing verse about their actions, its actual roots speak to a very different relationship between author and text than the one we know today.
Put simply: plagiarius means “kidnapper, seducer, plunderer, one who kidnaps the child or slave of another”, and the difference between a “child” and a “slave” wasn’t as strong as it is today, back in Roman times when the paterfamilias (the eldest male of the household) held absolute rule to do as he saw fit with anyone in it, even to slaughter them. Unsurprisingly, then, the publishing of literary output was configured by some writers of the time as equivalent to having a slave or child who will run away and find someone else to claim them if the author doesn’t establish complete authority first. Other writers, though—like Horace—loathed the commercialism of publication, and configured it as equivalent to prostituting their property.
In that era, too, acclaim was the greatest achievement a human being could attain. There was no expectation of reward in the afterlife, save for the hope that one’s great deeds would live on in the memories of those left behind. Under early Christianity, that view gave way to a different notion of humanity’s highest calling, and a different set of rewards for actions within this one and precious life. Anxieties about plagiarism fell significantly from literary record during this transitional period, until well into the medieval centuries, when individual exceptionalism in philosophy, the nascent sciences, and the arts resurfaced as a pressing societal concern.
Even then, though, there were strong schools of apprenticeship and the rise of guilds, both of which sustained a significant amount of pride in the ability to replicate with fealty what had come before, rather than to strike out with something original. Artisans were a key class of creative labourers, and it wasn’t until the publication of a most sensational book by Giorgio Vasari, The Lives of the Most Excellent Painters, Sculptors, and Architects (1550), that the idea of artists as a distinct class of celebrities, people ever so much more interesting and enlightened than the common man, arose. It’s due to this first work of comprehensive art history, with all the salacious tidbits it shared about its chosen subjects, that we venerate a very select school of Renaissance artists today: not even close to the full range of artists working in the era, but most—by virtue of being in Vasari’s network—thereafter becoming Western canon.
(Suffice it to say: Good PR has never not been critical to success.)
I’m trained as a 19th-century scholar myself, so I know much more about the myth of individual genius as it comes to inform late-Enlightenment and early Romantic-era values. The reason people are often given a poem by Robert Browning to read in school isn’t just because his dramatic monologues represent a distinct poetic form—but also because his poems in that register were critical to the elevation of historical figures into Victorian-era popularity. Like Vasari with The Lives, pieces by Browning on Paracelsus, on Shakespearean characters, on the works of the ancient Greeks all carried memory of past literary and philosophical “geniuses” forward to his age.
Some of those pieces, though, also speak to a key way that literature had changed over all those long centuries of replication versus originality: there was now also an interest in writers who were in dialogue with past works of art. This dialogue long preceded Browning, of course: we see it in Chaucer building on Boccaccio, and in Miguel de Cervantes raising the spirit of prior chivalric fictions in Don Quixote expressly to mark the end of such a credulous era in prose. But the 19th century brought something new to this discourse: the possibility of mass, cheap publication through improved technologies and less prohibitive taxes on distribution.
And with the power of new technology (along with the fashionable practice among women of translating works from French, Spanish, Italian, and German into English, opening new worlds of international prose), came ever so much more interest in building whole careers of writing in dialogue with the work that had come before.
Literary criticism flourished for this reason. So too did the practice of building local tales inspired by trends taking off in other European regions (not least of which was the influence of German Romanticism). And plenty of works, from Jane Austen’s Northanger Abbey on, were also in the habit of gently critiquing whole other genres and book cultures, to make space for something new.
Mechanical replication, as an extension of artistic replication
Much more recently, thanks to the power of modern computing, we’ve been able not only to identify the frequency with which plagiarism happened as a matter of course in past eras, but also to locate places where our beloved classics indicate much more complicated patterns of joint authorship. And yet, there was plenty of variability built into notions of 19th-century authorship, too, with some authors plainly celebrated in their moment, but many others—especially women, but not exclusively so—publishing anonymously or under pseudonyms to no great public consternation.
Right from the start of contemporary science fiction, in 1920s pulp magazines and surrounding conferences and attempts to establish a canon of proto-fictions in the genre, anxieties about mechanical replacement abounded—but this wasn’t the only creative practice grappling with such Modernist concerns. A whole new medium, the motion picture, was also caught in a science-fictional sense of awe at its own possibilities, and the estranging industrial processes that underpinned them.
Early cinema, from the 1890s through the late 1910s, had been deeply interactive play: an extension of sensational vaudevillian acts that more often spoke to the fluid joy of artistic creation than any more concentrated project of auteur production. That latter concept would also come to the fore in the 1920s—with the rise of mainstream film reviews, and a shift from star power to the centrality of the director (a shift later reinforced by critics like André Bazin); but also with the sheer commercialism of the film industry, which from then on allowed a single work to be spread far and wide, potentially driving out other forms of artistic production—and all their local artists, absent similar mechanical reach.
In this way, modern SF (with its pulp presses) and film (with its mass distribution) very much grew up together as artistic mediums, both grappling with the tension between artistic innovation and the implications of replication.
Science fiction writers have been wrestling with such mechanical “replacement” anxieties ever since. In 1953, Robert Sheckley’s “Watchbird” (Galaxy Science Fiction) imagines surveillance bots replacing human security systems:
“Now what do you think of that?” Officer Celtrics demanded. “Fifteen years in Homicide and a machine is replacing me.” He wiped a large red hand across his forehead and leaned against the captain’s desk. “Ain’t science marvelous?”
Nor was this by any means a stretch of the imagination. By the time this story was published, Alan Turing’s “imitation game” (later known as the Turing Test) was three years old, and had sparked serious questions about whether machine thinking mattered as much as the ability of a machine to imitate (and by extension, to replace).
In 1973, Damon Knight’s “Down There” imagines a future US where white people have superficially wiped out all other racialized groups, and in which our protagonist “writes” stories with the help of an IBM program that auto-fills text based on a few selections on the part of the author:
Sunlight, he wrote, and the screen added promptly fell from the ceiling as—and here Norbert’s plunging finger stopped it; the words remained frozen on the screen while he frowned and sucked on his pipe, his gurgling briar. Fell wouldn’t do, to begin with, sunlight didn’t fall like a flowerpot. Streamed? Well, perhaps—No, wait, he had it. He touched the word with the light pen, then tapped out spilled. Good oh. Now the next part was too abrupt; there was your computer for you every time, hopeless when it came to expanding an idea; and he touched the space before ceiling and wrote, huge panes of the.
The text now read:
Sunlight spilled from the huge panes of the ceiling as
Norbert punched “Start” again and watched the sentence grow: … as Inez Trevelyan crossed the plaza among the hurrying throngs.
And this writing is entirely in keeping with anxieties of the era, wherein Cold War and space race pressures pushed people to wonder if technology was outpacing the rest of the human experience. By the time Knight wrote this piece, ELIZA had already served as a well-liked psychotherapy chat-bot for researchers at MIT in the mid-1960s, raising questions about whether humans would benefit just as much from artificial therapy. Willy Wonka & the Chocolate Factory had also been out for two years, with a scene between a programmer and an advanced (and lippy) computer attesting to a normalized cultural conversation about increasingly powerful machines.
(Of course, that conversation had already started with AI anxieties in 2001: A Space Odyssey, in 1968, and Arthur C. Clarke’s attendant novelization of his film script—but in some ways its offhand presence in a children’s film more potently attests to widespread cultural familiarity with the theme.)
Imitation as appreciation in an era of innovation
And then there was Star Trek: The Original Series, first running from 1966 to 1969, and ending just weeks before the moon landing on July 20, 1969. Technology on screen, echoed by technology leaping forward in real life. The launch of a great new era in science fiction, too, surely.
Except that it wasn’t simply innovation that carried Star Trek into enough popularity to yield subsequent series: a whole franchise, now, of adoring tales taking up the torch of the original. Fan fiction—and in particular, an avid fan-base imagining Kirk and Spock as lovers—played a significant role in keeping the rushed, low-budget production alive and well in the minds of future sci-fi fans and potential audiences.
Fan fiction is a term for our commercial era: a term that expressly codes, that is, for work derived from another, copyrighted production. Nevertheless, the practice of writing stories inspired by and using elements from another person’s preceding work is, as noted above, intrinsic to our history of literature. John Milton’s Paradise Lost only has relevance because the author uses characters from another story—the story of Genesis. Jean Rhys’s Wide Sargasso Sea (1966) is a remediating prequel to Charlotte Brontë’s Jane Eyre (1847). Browning’s poetry gives new life to Shakespearean characters. Pride and Prejudice and Zombies (2009)… well, you get the idea.
(And yes, my own Children of Doro, inspired by Fyodor Dostoevsky’s The Brothers Karamazov, falls squarely into this subset of literature. So too does my current work in progress, inspired by Thucydides’ The Peloponnesian War.)
This notion of “fanfic” is quite complicated. J. K. Rowling famously hated the rise of fan fiction around her work, in part because of its highly smutty general quality, but also because writers went so far as to write whole alternative subsequent books in the series while awaiting her next release. In other cultures, like Japan, doujinshi (usually erotic manga spinoffs of someone else’s creative work) can be published for profit without much backlash, but copyright is much more rigidly enforced in Western countries. As such, fan fiction has generally been seen as okay—so long as no one’s making a profit directly off someone else’s copyrighted work.
(AO3: Archive of Our Own, a nonprofit repository for fanfic, even won the 2019 Hugo Award for Best Related Work.)
In practice, though, those lines of profitability have always been murky.
Cassandra Clare was famous for her Lord of the Rings and Harry Potter fanfic, especially The Draco Trilogy—which is widely known to be where she developed characters and situations that would later inform The Mortal Instruments series, a wildly successful set of YA novels that received a TV adaptation running from 2016 to 2019.
E. L. James wrote a Twilight fanfic series called Master of the Universe, then later pulled the fanfic and reworked the characters a bit to create Fifty Shades of Grey: the fastest-selling paperback in UK history, and a series that also gained a major film adaptation.
And even on a much smaller level, if a donor supports the fundraising campaign of a writer who produces fanfic the donor then intends to read, are they not technically helping that writer profit off another’s copyrighted work?
This moralizing line is so strongly drawn around copyright that the industry also eagerly awaits every new year’s list of works falling out of copyright—and thus, into “fair game” territory for other writers to profit directly from crafting worlds built on someone else’s elements. Most recently, The Great Gatsby fell out of copyright—and that’s how we got the industry-backed The Chosen and the Beautiful by Nghi Vo (2021), as well as Jillian Cantor’s Beautiful Little Fools (2022).
Before that copyright expired, these would have been “derivative” works. Now they’re “retellings”—the same as any commercially published story drawing on J. M. Barrie’s Peter Pan (1904), or Thomas Malory’s Le Morte d’Arthur (1485).
The capitalist anxiety around AI and art
All of which brings us to the contemporary crisis around so-called “AI”, the LLMs that businesses like Sudowrite, Netflix, and Hollywood studios are champing at the bit to suggest will transform creative practice as we’ve known it.
In recent weeks and months, as the Writers Guild of America finds itself on strike against corporate exploitation, and as spammers have driven Clarkesworld and other major short fiction presses to dedicate far too much of their slush-reading time to weeding out junk submissions, a great deal of anguish has emerged among fellow writers around AI. And for good reason: there’s not much money in writing already, so hype around new technology as a ready replacement for existing labor only sharpens the reminder of how little stability is on offer in creative fields.
But in the process of expressing outrage at people eager to leverage the hype to personal benefit—people rushing to “get theirs” through technology trained on other people’s output—there is a thread of commentary that undermines the very richness of literary tradition we’re ostensibly fighting for in the first place.
It’s commentary that includes ideas such as:
You’re not a writer if you use a tool trained on other people’s writing.
And
Anyone who uses this technology is stealing from other creators.
These are positions that don’t hold up well against our long and often highly treasured history of writing that draws from the writing that came before, and even from storytelling that exists concurrently with our own.
These are also not positions necessary to argue against the core problem illustrated by our culture’s current bull-run from the failings of crypto and NFTs straight into the next Silicon Valley venture-capital hype-sink of magical “AI”.
What do we need to keep in mind instead?
Simply this:
The underlying and pervasive precarity of our economic systems, and the real threat behind automation. Not new technology, per se, but the people wielding it. The relentlessly exploitative status quo they’re hoping it will sustain.
Corporate monopoly is the “O.G.” artificial intelligence in our lives—and as works of capitalist critique from the 19th century made abundantly clear, long before film and science fiction arose as mediums of discourse around related themes, it is the ongoing vehicle of dehumanization and devaluation that we truly have to contend with, behind all the current smokescreens of specific AI products and scammers.
Yes, of course, we should grapple with those products and scammers, too—we have to. We have no choice. These are the conversations given to us by our times.
But we should also remember what it is, exactly, that we’re fighting for.
Is it simply the removal of these latest technological sites of economic peril?
A return to the industry status quo of even a few months prior, wherein corporate monopolies and globalization pressures were still severely destabilizing our economic pathways—but not with quite as many shiny machines as exist now?
Or perhaps it’s something far more revolutionary—and far better attuned to the messy creative histories of artistic practice out of which we gained this modern era.
Maybe, just maybe, it’s a world of more sustainable creative practice for us all.
Be well, be kind, and seek justice where you can.
ML
P.S. As promised, I’m continuing to share weekly reports for sales of Children of Doro, to help demystify the state of indie publishing and publishing in general. At 25 copies sold (huzzah!), I’m now almost halfway to paying off my initial investment costs in promoting the book. Once I cross that threshold, I’ll be in an equivalent position to authors who don’t start collecting royalties until their advance has been paid off via book sales. Fingers crossed I can make that cutoff and actually profit from this text—but hey, even if I don’t, I’ve already sold more copies than many folks unfortunately manage even via traditional publishing. And I’m getting some good reviews for the work, which is lovely! (So thanks for reading and reviewing, if you do!)