An Interview With Richard Ramchurn
Richard Ramchurn is a British filmmaker who premiered his piece, “The Moment”, at the 2018 Sheffield Documentary Festival. Ramchurn’s film is unique, as it analyzes a viewer’s brain waves to guide the scenes and dictate which characters are displayed the most. In this interview, I dove into Rich’s motivations for the interactive film and what role he sees interactivity playing in the future of films.
Chase: What was the motivation for using brain waves to guide the direction of the film?
Rich: This is my second brain-controlled film. The inspiration for the project came in 2013. I found this research by a guy named Shinji Nishimoto who reconstructed a video from people’s brain data using an fMRI. The images in the video looked super dream-like and abstract, but they still had some of the forms of what the people were shown. [This video] totally fired up my imagination of the future possibilities of using brain data to create artistic experiences. We made a prototype to see if we could use a NeuroSky headset to cut and mix audio and video footage, and that seemed to work.
Chase: Can you talk a little more about your first film, and how it got you to where you are?
Rich: I used Kickstarter to make my first film, an autobiographical, expressionistic arthouse film that put the viewer or the controller in the psychological position of the main character. [Using the viewer’s brain data, the film would send them] through moments of dream and reality. From that film we did some academic research and I started a PhD to further explore the topic. We got some further funding from the EPSRC, the Engineering and Physical Sciences Research Council, at the University of Nottingham, to build a more traditional genre narrative film.
Chase: What was the motivation of using a sci-fi, Orwellian plot for this experience?
Rich: TekWar, sci-fi thriller, I think is fair to say. The reason I chose this specific genre for the film is because I really wanted to explore the different possibilities of narrative. I chose a narrative that people were familiar with, and that would fit with the technology, in order to explore how we could go about constructing a narrative with this system. The problem we had with the last film was that people weren’t familiar with our film form, so it built an additional barrier to people understanding what the narrative construct was.
Chase: How have you felt about the progress of the film so far?
Rich: The feedback’s been amazing so far at Doc/Fest. We’ve had full bookings all week, and the feedback’s been overwhelmingly positive. I didn’t think the narrative structure would work as well as it has, even though there are big sections of non-narrative, abstract imagery in it. There have been more cases than I thought of people actually getting what it’s trying to say and why specific pieces are there. I’m getting some really rich data for how to scale up the project in terms of being able to share it with people. [The key questions are] can they see the purpose and how can they further explore the different versions of the film?
Chase: Do you see any further iterations in the future with the project?
Rich: It seems like the narrative’s working so well that [we may explore different genres]. We could use it for documentaries. People have even said comedies. I think working with other directors is something we’re going to explore. I’m going to try to make other artistic collaborations with this set-up.
Chase: What role do you think interaction will play in the future of films, whether it’s brain waves or maybe other senses?
Rich: I think EEGs are pretty limited for this kind of film. Other brain-computer interfaces are being developed at the moment. People are pushing a lot of other methods of getting data from the brain, and sensors are becoming cheaper. I think with each technology shift, there’s going to be more opportunities for artists to tell stories, particularly with film, but for artists of other mediums as well. There is a move towards the neuroactive. I’ve seen Netflix, BBC, and Fox use CtrlMovie, which is an interactive cinema platform. I think there’s a subtler question with video games about what the difference between a video game and an interactive movie is. I think as technologies become more versatile and people are becoming more open to non-linear narratives and non-linear storytelling, there’s going to be more opportunities for artists to tell different takes on stories. Another thing is that stories were always adaptive before the times of radio and TV. [Before stories were recorded], stories were inherently interactive. You would change how you tell a story depending on who you were telling it to, whether it be a small group around a fire or a large audience. As a storyteller, you would adapt that story depending on the reaction from that audience. So in a way, [current interactive storytelling is] looking back to some of the original ways of telling stories.
Chase: What did you work in prior to analyzing brain waves to make films?
Rich: I studied illustration and animation around about 2000. I’ve done lots of different disciplines. My practices include painting, theater, theater design, short films, music venues, interactive development, and VJing, so it’s been quite varied. And I think I bring that to the process. I’ve used all those tools to make [the film]. I think that the cross-discipline gives a lot of breadth to how I tell stories.