A new approach to AI and generated text.
Oct 27, 2019
I’m bad to watch films with. If I’m not making facetious commentary or instantly imitating any onscreen lines I find even slightly amusing, I’m spoiling the plot. Even if I haven’t seen it before.
I wasn’t always like this. Like most of these things, you can blame mum and dad. I vividly remember the day we were sat down in the auditorium of St. John’s College School for a guest speaker to give us a lively grounding in Joseph Campbell’s The Hero with a Thousand Faces. The fact that a comparative mythologist was brought in to talk to a group of Year Sevens as a treat should tell you a good deal about private school education in Cambridge, England – but I was surprised that the gist of the talk was actually both interesting and emphatically simple.
The man, whose name I have unfortunately since forgotten, told us that every film, book, play, every TV episode and cartoon short was essentially the same story – mutations, representations and reinterpretations of the same basic series of events, the ‘Hero’s Journey’, which Campbell formulated in his 1949 book as consisting essentially of a central character who undergoes a separation from his normal life, a personal transformation to overcome adversity, and finally a return home, changed. From this are extrapolated a series of more specific and linear stages, such as ‘Crossing the Threshold’, ‘Refusing the Call’ and ‘Divine Aid’, which Campbell had identified recurring across an incredible swathe of global mythology and literature.
I liked this, if only because it allowed me in social circles to interrupt, say, watching the final duel between Luke Skywalker and Darth Vader with ‘Ah yes, the Innermost Cave’, or Dirty Harry nursing a scotch at some twilit bar with ‘He’s going to Refuse the Call, I just know it’. My family and I quickly discovered that these tropes are indeed not only instantly recognisable, but recurrent in essentially equivalent forms across huge numbers of titles. This compulsive generalisation is made even worse by the fact that these events are all interlinked, natch – you know that after the hero Crosses the Threshold, Dreamlike Success will soon follow, followed by Complicating Action, etc., etc. It’s not even funny anymore, and I mean that in the most literal sense. You already know what’s coming.
This all sort of lay low in my subconscious for decades – even as I embarked on degrees in English and Comparative Literature, I increasingly came to see the speciousness of Campbell’s and others’ narrative structuralism. It hardly takes protracted thought to realise that, since the vagaries and figurativeness of something like the Innermost Cave can take on any number of expressions, the definition is essentially as meaningless as saying: ‘At some point this story will have some sort of a climax.’ If you squint, anything starts to look like a Mentor, or Goddess, or whatever, and the task of unifying all of literature, as in Campbell’s ‘monomyth’, essentially seems to consist of making sufficiently capacious categories.
Fast-forward to last year. I was in the bath (not for all of it, natch) and got a splendid idea, as does sometimes happen in such circumstances, once you let your mind wander. I was about to commit to a dissertation topic for my degree at the University of London, but was having trouble finding something punchy enough. While at Trinity, I had constantly been rubbing elbows in hall with some of the country’s greatest mathematicians and computer scientists (the college has always favoured these subjects above us Humanities peons, by dint of its historic links with Isaac Newton, Charles Babbage, Ramanujan, G. H. Hardy, etc.) and had learned some small amount about machine learning, a hot new topic for graduates looking for some measure of future employment. The flatmate with whom I first moved down to the city had completed his own MA project with a MATLAB program that taught an essentially blind machine to play the game Flappy Bird to near-perfection through a simple system of negative reinforcement, and in a single, semi-lucid flash it occurred to me that the same techniques and technology could be brought to bear on Campbell’s work in one interdisciplinary project.
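The spirit of that negative-reinforcement setup can be sketched in a few lines – this is a hypothetical toy in Python, not my flatmate’s actual MATLAB code, with a crudely simplified state (is the bird above or below the gap?) that I’ve invented for illustration. The agent is never told the rules; it only accumulates penalties for crashing:

```python
import random

# Toy negative-reinforcement loop in the spirit of the Flappy Bird project
# described above (a hypothetical sketch, not the original MATLAB program).
# The agent sees a crude state and learns, purely from crash penalties,
# which action keeps it alive.

random.seed(0)
q = {}  # (state, action) -> learned value, all unseen pairs default to 0.0

def choose(state, epsilon=0.1):
    """Mostly pick the best-known action; occasionally explore at random."""
    if random.random() < epsilon:
        return random.choice(["flap", "glide"])
    return max(["flap", "glide"], key=lambda a: q.get((state, a), 0.0))

def update(state, action, reward, alpha=0.5):
    """Nudge the stored value toward the reward just received."""
    key = (state, action)
    q[key] = q.get(key, 0.0) + alpha * (reward - q.get(key, 0.0))

for episode in range(500):
    state = random.choice(["below_gap", "above_gap"])
    action = choose(state)
    # Crashing (the wrong action for the state) earns a penalty;
    # survival earns nothing at all - the only signal is negative.
    crashed = (state == "below_gap") != (action == "flap")
    update(state, action, -1.0 if crashed else 0.0)

print(choose("below_gap", epsilon=0.0))  # prints "flap"
print(choose("above_gap", epsilon=0.0))  # prints "glide"
```

The point is that nothing resembling ‘how to play Flappy Bird’ appears anywhere in the code – the correct behaviour emerges from punishment alone.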
Generated text has been around almost as long as computers – or that more successful medium, the video game. Since I’d first learned to use a computer, in IT class while the teacher’s back was turned, I had dabbled in coding auto-text (which then consisted of typing ‘10 PRINT “FUCK”’ and ‘20 GOTO 10’ on the simplest coding platform and having the whole screen fill with infinite profanity) and remember being spooked by AI chatbots that could respond to your inputs with semi-coherent conversation. However, attempts at automatic, machine-written text have generally been snubbed by literary criticism, for pretty understandable reasons. Even if such long-form automated text could get past the sort of ridiculously garbled ‘because dog is hello boy himself towards gratefully’ constructs that come from those unavoidable extra-grammatical fuck-ups, machines fundamentally have nothing to say – nothing essential about the human condition that a human reader could relate to above a human author. No coder could input text for an infinite number of possible responses (which would rather defeat the point anyway), and a single glitch or special-case oversight can turn any number of response rules into garbled gibberish, since the machine has no experience of real-world people, speech or reality generally.
So machines, in a sense, can’t innovate. They can only produce recombinations of human language based on expectation, a success/failure learning parameter and whatever grammars have been programmed into them. AIs have become sophisticated enough to guess your next word in an auto-complete email – having learned, from millions of cases, that the word ‘hand’ generally follows the phrase ‘sleight of’, for example – but imitation of sophisticated, meaningful prose has always remained elusive. There are valid reasons for thinking it impossible, or pointless, or both. But why not? If, as I learned in the auditorium all those years ago, you can expect the content of a story to follow progressions as strict as sentence grammar, recognise and identify these narrative functions when and where they occur, and form an expectation of the upcoming story from what’s come before, why can’t a machine?
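That ‘sleight of’ → ‘hand’ trick can be sketched, in miniature, as a bigram frequency model – a toy Python illustration of the idea, on a one-sentence corpus I’ve made up, not how production auto-complete systems actually work:

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word in the corpus, which words follow it and how often."""
    words = text.split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(following, word):
    """Return the most frequently observed successor of `word`, or None."""
    if word not in following:
        return None
    return following[word].most_common(1)[0][0]

# A made-up training corpus, just to exercise the counts.
corpus = "sleight of hand and sleight of tongue are feats of hand and eye"
model = train_bigrams(corpus)
print(predict_next(model, "of"))  # prints "hand" ('hand' follows 'of' twice, 'tongue' once)
```

Swap in enough email text for the corpus and this crude counting already starts to feel like prediction – which is precisely the leap the dissertation idea makes: if sentences yield to frequency and expectation, perhaps narrative functions do too.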
Now I am in Santa Barbara – and, for the duration of my PhD, I plan to try to teach machines to tell stories. Try, at least.