Beyond Uncanny Valley – AI and the Movies


“Fears of a ‘great awakening’ in which robots become self-aware and decide that humankind itself is the greatest threat to the planet are not new.” Nik Glover on the AI debate currently raging – not least in Hollywood…

At the end of the last century, public debate on both sides of the Atlantic was marked by moral panics over drugs, AIDS, pornography, video games, and satanism (as embodied by Dungeons and Dragons). These terrors were out there, and they were trying to get into us.

All of these scares would find their way into our picture houses, some up to 30 years after they occurred. See 2021’s excellent Censor as evidence: a poisoned love letter to video nasties, it is worth anyone’s time.

“Cinema surveyed the wonders of science and asked the obvious question – what if it all goes wrong?”

Technophobic cinema has always existed; it just found new urgency via the mediums of broadcast signals and VHS. Videodrome (top, 1983) beamed a body-altering signal into our living rooms, an analogue process of insemination that perfectly reflected the horrors of the Reagan decade. Taking the creeping Red Scare of the McCarthy years, and turning it into a faceless and remote antagonist even more mysterious and malignant, 80s horror cinema surveyed the wonders of science and asked the obvious question – what if it all goes wrong? It’s a question we find ourselves struggling with once again.

In the intervening years, what has changed in cinema, and in the world, since those early, simpler days? Today’s debate surrounding the rapid development of Artificial Intelligence resonates with cinema of the late 20th century – fears of a ‘great awakening’ in which robots become self-aware (albeit in a recognisably ‘human’ way) and decide that humankind itself is the greatest threat to the planet (The Matrix, 1999).

“The contemporary debate is often seen in economic terms”

The contemporary debate is often seen in economic terms. What happens to all those writers, designers, visual artists, and other creatives when studios decide that handing their functions over to a reliable algorithm that can’t strike for better conditions is a better use of their resources? The current Writers Guild of America strike is in part attributed to concerns over programmes like ChatGPT being used to ‘strike break’ – filling in for pesky humans while they argue over money. The scenes of angry crowds attacking robots for taking away gainful employment in AI: Artificial Intelligence (2001) may not be as distant as we imagine.

So: where are we now? Programmes like ChatGPT need vast quantities of training data to be able to replicate story formulas with reasonable originality. Would 100 years of Disney animation movies, books, and series be enough to come up with the next live-action Little Mermaid? Could you really see a situation where a Fast and Furious sequel, or a Disney live-action remake, couldn’t be entirely scripted by AI?

“The National Association of Voice Actors has called for firmer regulation to sort the humans from the robots”

Actors are already monetising their rights over the use of their voices for generations to come. Darth Vader will probably never be recast, even when the venerable James Earl Jones passes on. The actor reportedly signed over the rights to the use of his archival voice work to create dialogue for the inevitable future episodes, enabling Ukrainian tech start-up Respeecher to use AI to do just that. The National Association of Voice Actors has called for firmer regulation to sort the humans from the robots. As far back as 2000, Oliver Reed was digitally resurrected in Gladiator. Peter Cushing, Marlon Brando, and Carrie Fisher, among others, have since been reanimated in similar fashion. As Lister observes in Red Dwarf, death’s not the handicap it used to be.

Perhaps our modern concept of inspiration itself is at stake. There is a cynicism inherent in inventions like ChatGPT about what lies at the heart of human creativity, and where it aligns with simple productivity. The algorithm crawls through millions of lines of script to find patterns – connections between subjects, opinions, linguistic tropes – and renders new interpretations on any subject requested by the user. The result is therefore a collation of human inspiration, rather than an invention. Of course, it could be argued that the majority of cinema designed by humans arrives along the same lines. This echoes Mark Cousins’ argument in The Story of Film that Billy Wilder’s The Apartment (1960) couldn’t exist without Lubitsch, Vidor, and Chaplin. This argument essentially draws on the Greek perception of inspiration as a gift derived from higher beings – an immaculate conception – rather than the later idea that it is born from reactions to the world around us, or from conflict within the unconscious of the artist.

The Age of the Uncanny

We all have foundational moments with art, and they are all rooted in our childhood. Terminator 2: Judgment Day (1991) was my formative experience of CGI – the too-smooth rendering that replaced the more endearingly performative stop-motion animation and actual puppetry. While this first encounter transformed audiences’ understanding of what could be achieved with CGI, it would not be long before lesser technicians would ground the technology once more. It’s just possible that an AI boom could be followed by bust for the same, very human reason – it’s incredibly easy to make Bad Things.

Because the 90s did not immediately represent the triumph of CGI. While James Cameron became its chief exponent and populariser (The Abyss, 1989; Titanic, 1997; Avatar, 2009), a legion of less judicious filmmakers decided that its possibilities were simply too exciting not to explore. The Langoliers (1995) is a curio for Stephen King completists only – unless you enjoy numbingly and distractingly poor computer graphics. Its depiction of plane-of-reality-devouring flying testicles with teeth is for the ages. It certainly can’t be said to reach the uncanny valley, the liminal space between the recognisably artificial and the convincingly human.

“In Titanic’s impressive wake, a slew of movies appeared which seemed to throw into question the wisdom of CGI-laden effects fests”

Things came to a head, ironically (or not), in 2001. Four years after Titanic had realistically depicted the world’s greatest ocean-going manifestation of hubris (at an astronomic cost), a slew of movies appeared which seemed to throw into question whether it had all been a fever dream. In the same year that Peter Jackson’s first Lord of the Rings movie premiered, a litany of camp, poorly rendered visual effects fests sloped into cinemas. The Mummy Returns, Spy Kids, and Pearl Harbor were hits. Evolution was not, although it’s well worth a watch.

Lord of the Rings was a watershed. Its computer-generated vistas and battle scenes (aided by incredible miniatures and makeup work) have been endlessly copied with diminishing returns ever since, particularly by Marvel, including in the recent and entirely unconvincing Ant-Man and the Wasp: Quantumania (2023). This, along with de-ageing technology, is probably the most pernicious result of the age of CGI. CGI doesn’t need AI to be bad; it just needs bad input.

Visual effects are one element that AI will undoubtedly influence – those endless lists of digital artists at the end of blockbusters might shrink in coming years – but the technology of automating the movie business goes far beyond our current perception of it. Actors, writers, and visual artists will all have to respond or risk losing human rights they didn’t even realise existed. The right to conceive stories. The right not to be resurrected. The right to innovate.

Frankenstein (1931)

The Great Awakening

Spike Jonze’s Her (2013) posits a love affair between a human and an AI programme. Falling in love with – and out of control of – your own creation is as common a story trope as they come, from Frankenstein (above) to The Duke of Burgundy (2014).

The powers that be in the movie industry may be falling in love with the idea of automation as a way of cutting costs and increasing their power over the creatives, but what price inspiration? Audiences have long been prepared to accept stock themes and content; will they baulk at seeing a gone-but-not-forgotten star speaking rehashed versions of popular old clichés against CGI backgrounds?

Well, the second Avatar movie took over $2 billion. Perhaps no one will notice much when the revolution comes.

Nik Glover

Images/media, from top: Videodrome still; Industrial Light & Magic; Frankenstein still

Posted on 31/05/2023 by thedoublenegative