Artificial Imagination: Art that no human eye has seen before – about 'The Crow'

Visual artist Glenn Marshall brings deep learning to the cinema: his AI short film 'The Crow' won prizes in Cannes and Linz. Thoughts on a new kind of art.

The Crow: Short Film by Glenn Marshall, Cannes Short Film Award 2022, Prix Ars Electronica 2022

(Image: Glenn Marshall)

Reading time: 11 min.

By Silke Hahn

(This article is also available in German.)

The short film 'The Crow', designed with AI technology, introduces a new category of computer-generated visual art and is already considered a pioneering work of a new genre for which a name is still missing. CyberArt? Artificial Imagination? At the Cannes Short Film Festival in August, the animated film won the Short Film Award (Best Short Short), and in Austria, it was awarded the Prix Ars Electronica 2022 (Honorary Mention) at the International Competition for CyberArts.

Conversation with Glenn Marshall

Glenn Marshall's professional career in computer animation spans more than 20 years, using experimental CGI, generative and AI technologies to pursue a philosophical vision of tomorrow's digital art. He was awarded the Arts Council of Northern Ireland (ACNI) Major Individual Award (2015), the highest recognition given to leading artists in the country. He is also a two-time winner of the Prix Ars Electronica and received the 2022 Cannes Short Film Award for 'The Crow'. He has collaborated with Peter Gabriel and Tangerine Dream on music videos and concert visuals.

We spoke to the person behind the work, the Northern Irish visual artist and coder Glenn Marshall, and asked him about technology and the background to his work, which he has called 'Artificial Imagination' since his first Prix Ars Electronica in 2008 – a participant at the festival of the same name in Linz had given him the idea. At that time, Marshall was awarded for the official music video for Peter Gabriel's song 'The Nest That Sailed The Sky'.


The now award-winning film 'The Crow' is a barely three-minute animation in which a dancer transforms into a crow by means of artificial intelligence. The result is 'a haunting and compelling piece that follows the crow through its brief dance in a landscape of post-apocalyptic barrenness, to its inevitable demise', as the film description puts it. What sounds dystopian is a sensual experience. Before her AI-driven transformation into a crow, the dancer appeared in the short film 'Painted' by Duncan McDowall and Dorotea Saykaly, dancing to a piece by the composer Erik Satie (Gnossienne n° 3). Excerpts as well as scattered references to the making-of are circulating on the net; the complete film is available on YouTube.

As far as could be gathered from the sparse notes, the film material was transformed with a PyTTI Colab notebook and custom code. Using AI (text-to-video), the footage of a flesh-and-blood dancer became a painting-like animation of a crow woman. But that's not all: at under three minutes, 'The Crow' not only makes a visual and technical statement but also comments on the relationship between humans and technology. The film becomes a kind of battle between the human and the AI, its director explained, with all the suggestive symbolism inherent in this charged confrontation.
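The making-of details are sparse, but the overall workflow can be pictured as a simple frame-by-frame loop. The following is a minimal Python sketch of such a pipeline, not Marshall's actual PyTTI notebook; the stylize_frame function is a hypothetical placeholder for the CLIP-guided step he describes later in the interview, and the file names are illustrative only.

import cv2  # OpenCV for reading and writing video frames

def stylize_frame(frame, prompt):
    # Hypothetical placeholder: in the real workflow this is where the
    # CLIP-guided text-to-image process repaints the frame (see the
    # sketch further down in the interview). Here it returns the frame
    # unchanged so the skeleton stays runnable.
    return frame

def transform_video(src_path, dst_path, prompt):
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"),
                          fps, (width, height))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out.write(stylize_frame(frame, prompt))
    cap.release()
    out.release()

if __name__ == "__main__":
    # File names are illustrative only.
    transform_video("painted_excerpt.mp4", "the_crow_draft.mp4",
                    "A painting of a crow in a desolate landscape")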


heise Developer: Glenn, what is your background – professional, scientific, artistic?

Glenn Marshall: It's a bit unconventional. As a kid, I got straight A's in art throughout primary school – I could paint and draw anything. But when I got to secondary school, it stifled me completely, and I failed art – the only thing I was good at and loved. But by then, I had a home computer. So I transferred all my creativity into a machine instead.

heise: What inspired you to create 'The Crow'?

Marshall: Seeing the original live-action 'Painted' short on YouTube – I knew my techniques would be perfect for it. The film became the basis for 'The Crow'.

heise: Oh, that sounds different from the earlier films where you mutate text into random images. Can you briefly explain your approach?

Marshall: The basic technique is feeding each video frame into an AI process that tries to change the image according to a CLIP-guided text-to-image prompt.
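For readers who want to picture what 'CLIP-guided' means, here is a heavily simplified sketch of the core idea: the pixels of a frame are nudged by gradient descent so that CLIP rates the image as more similar to the text prompt. Notebooks such as PyTTI optimise a generator's latent space (for example VQGAN) rather than raw pixels and add many augmentations and regularisers, so the specifics below are assumptions for illustration, not Marshall's code.

import torch
import clip  # OpenAI CLIP: pip install git+https://github.com/openai/CLIP.git

device = "cuda" if torch.cuda.is_available() else "cpu"
model, _ = clip.load("ViT-B/32", device=device)
model = model.float()  # keep everything in fp32 for a simple gradient loop

# CLIP's input normalisation constants
CLIP_MEAN = torch.tensor([0.48145466, 0.4578275, 0.40821073],
                         device=device).view(1, 3, 1, 1)
CLIP_STD = torch.tensor([0.26862954, 0.26130258, 0.27577711],
                        device=device).view(1, 3, 1, 1)

def clip_guided_frame(image, prompt, steps=50, lr=0.05):
    """image: float tensor of shape (1, 3, 224, 224) with values in [0, 1]."""
    with torch.no_grad():
        text_feat = model.encode_text(clip.tokenize([prompt]).to(device))
        text_feat = text_feat / text_feat.norm(dim=-1, keepdim=True)

    image = image.clone().to(device).requires_grad_(True)
    opt = torch.optim.Adam([image], lr=lr)
    for _ in range(steps):
        img_feat = model.encode_image((image - CLIP_MEAN) / CLIP_STD)
        img_feat = img_feat / img_feat.norm(dim=-1, keepdim=True)
        loss = -(img_feat * text_feat).sum()  # maximise cosine similarity
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            image.clamp_(0, 1)  # keep pixel values valid
    return image.detach()

Starting each optimisation from the live-action frame rather than from noise is what keeps the dancer's pose and motion recognisable in the output.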

heise: So for editing you used CLIP, Contrastive Language-Image Pre-Training from OpenAI. What was your starting point?

Marshall: I had been heavily getting into the idea of AI-style transfer, using video footage as a source. So every day, I would search for something on YouTube – or stock video sites – and try to make an interesting video by abstracting it or transforming it into something else with my techniques. It was during this time that I discovered 'Painted' on YouTube.

heise: To transform 'Painted' into your film 'The Crow,' you still needed a text prompt. Can you share it with us?

Marshall: In this case, the prompt was 'A painting of a crow in a desolate landscape.' I had already gained experience and an idea of what prompts would work. The video then transformed itself. There was a magic present which wasn't in my other videos. I think this is because, in the underlying video, the dancer involved is already mimicking a crow's movement and wearing a crow-like shawl.

It's this that makes the film work so well – as the AI is trying to make every live-action frame look like a painting with a crow in it – so I'm meeting it halfway – and the film becomes kind of a battle between the human and the AI – with all the suggestive symbolism.

heise: How is this different from creating static images?

Marshall: You can generate a moving, animated scene. It's a bit different than with single images. You can describe the action, the camera movement, and more.

heise: Some people talk about you as a composer. Do you also compose music? Or does that refer to your film composition with AI craft and technology?

Marshall: I do compose music for some of my films – but I found the designation as a composer a bit strange. It doesn't make sense and is just confusing.

heise: How do you describe yourself? Composer, developer, machine learner... coder, or perhaps freelance artist? If there is already a name for what you do.

Marshall: We are all still trying to figure out what we are, but I'm now a full-time AI artist/filmmaker and have had funded projects to this effect since last year. Although I am the coder kind – we now have the 'casual' type in light of Midjourney, DALL·E and so on. It's the coder and developer community that is always a step ahead of everyone else :)

heise: I see the wink... so is AI artist or AI filmmaker the best fit for you?

Marshall: The other option is Neural Art / Artist – but maybe pretentious.

heise: What drives you in what you do?

Marshall: There's something about immersing yourself in the latest technology (AI) and being part of the coding and development community that is creating art and animation that no human eye has seen before. That excites and inspires me – but you still need to craft something with these tools – it's not enough to spam upload every short quirky test you do to show the world how cool this tech is, which seems to be about 99 percent of what people do.

Marshall: I'm determined to lead the way in showing that it's merely a tool for creating a finished piece of art, like 'The Crow'.

heise: You have been a pioneer and trailblazer in the visual arts for quite some time and have used innovative techniques over the last twenty years. Most recently, deep learning and text-to-image synthesis. Is there a list of your artworks to trace the development?

Marshall: My YouTube channel documents how this AI work began and progressed, starting with text-to-image synthesis processes back in January 2021 up to the present day. One of my first films, 'A Bleak Midwinter', is an early attempt at visualising poetry using AI. And then there's my latest effort, 'Everything in its Right Place', a Radiohead video using the latest AI model, Stable Diffusion. I think it's one of the best things I've ever made.

Marshall: You should look into Stable Diffusion – an open-source AI model released only a few days ago that is revolutionising the AI art scene.

heise: Thanks for the tip! I share your assessment. We covered the public release of Stable Diffusion and will be staying on top of the topic.

Stable Diffusion and the Media Revolution
Stable Diffusion: public release in August 2022, an open-source text-to-image generator for everyone

The freely available text-to-image generator Stable Diffusion has been causing a stir since August 2022. However, AI-supported image generation does not stop at static images but touches on the production of moving images and films. New tools and techniques for filmmaking are emerging: text-to-video productions are likely to increase, and the team behind Stable Diffusion, among others, is working on further developing its model for such use, according to tweets. The former head of AI at Tesla, Andrej Karpathy, demonstrated a video created with Stable Diffusion, for which he provides the code on GitHub.
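To make the idea concrete: with today's tools, a single video frame can be re-painted along the following lines using Stable Diffusion's img2img mode via Hugging Face's diffusers library. This is a generic sketch, not the code referenced above; the checkpoint name, parameter values and file names are illustrative assumptions, and the parameter names follow recent diffusers versions.

import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

# Load the checkpoint released in August 2022 (illustrative choice).
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    torch_dtype=torch.float16,
).to("cuda")

# A single extracted video frame serves as the starting point.
init_frame = Image.open("frame_0001.png").convert("RGB").resize((512, 512))

result = pipe(
    prompt="A painting of a crow in a desolate landscape",
    image=init_frame,       # the original frame anchors composition and pose
    strength=0.5,           # how far the model may depart from the frame
    guidance_scale=7.5,     # how strongly to follow the text prompt
).images[0]

result.save("frame_0001_styled.png")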

Speculation about the future of cinema in the age of AI-generated images goes hand in hand with this technical development. It is likely to change what we know and are used to in terms of visual art and creative artistic processes. One question will be who creates films in the future and how today's filmmakers will work. The first works are already tangible and show that, in addition to imagination, a good deal of know-how and programming skill is still necessary to make film art with AI technology at present: art does not come at the push of a button but from talent. (sih)

Marshall: The speed and quality of the results are astounding. If I were to remake 'The Crow' today, it would probably look twice as good.

heise: When exactly did you make 'The Crow'?

Marshall: In December 2021. Yes, that's how FAST the development of AI is!

heise: Mind-blowing. How do you envision the future of filmmaking and cinema?

Marshall: You only have to look at 'The Mandalorian' – without giving away spoilers, some famous older characters from the original trilogy are brought back to 'life' with deepfake tech. An actor was used, but only as a reference for the AI to implant the entire face and voice of an AI-trained version of a famous character. I'd love to see another Beatles film, with all four of them convincingly brought back to life.

heise: I'm also thinking of the living: directors and actors. What does this technical evolution mean for the film industry?

Marshall: We are potentially headed into a new hybrid film form, between live-action, animation, and deep fake. Many actors could end up being puppets. Directors can now envision their own concept art and matte paintings for their sets and special FX. That, to me, empowers the director with a more personal and direct vision. Potentially, you could generate an entire feature film with AI. There are basic examples of this in development, but they are very primitive and not usable in any professional or artistic way. But as with all things AI, it won't be long.

heise: Is there a best and a worst-case scenario? How can artists and developers prepare for the future?

Marshall: To be honest, I don't think or care about these questions about the future of AI. I'm just in the moment, creating and having fun. But when you look at the present moment, the future is always there.

heise: True. T. S. Eliot, the eternal present... What are you working on now?

Marshall: I'm planning a follow-up to 'The Crow', but I'll be shooting the live-action required myself – so that I have complete creative control over the whole piece and how the AI interprets it – so many possibilities…

The interview was conducted by Silke Hahn, editor at iX and heise Developer.

(sih)