Quartet, Margie Medlin
In this second article on Australian/UK-based artists in receipt of prestigious Sciart Awards from the Wellcome Trust in the UK (see Gina Czarnecki, RT 75, p33), RealTime talks to Margie Medlin about her Sciart project.
Medlin is completing the performance stage of the project's research over the UK winter, with shows scheduled for February 2007 at the Great Hall of St Bartholomew's Hospital in central London. In this ornate historic setting, Medlin and her team will place a dancer, a musician, a robot camera and two screens, one framing a virtual dancer, the other the point of view of the robot camera.
The idea of using a musician's gestures to animate a virtual dancer is fascinating—it raises questions about a kinaesthetics of music: what movements produce which sounds, which in turn produce new choreographies? How does it work in Quartet?
A duet between the musician and virtual dancer [created by Holger Deuter] uses both gestural and audio data. Stevie [Wishart] plays the violin, works with her voice and controls 3 virtual instruments based on computational models of human hearing created by Todor Todoroff with the Physiological Lab at Cambridge. Stevie is wired with sensors so that any single action from her can do several things at once—creating sound from any of the 5 instruments, affecting the virtual dancer's body parts, her speed of movement... The virtual body control interface for the project was created by Nick Rothwell.
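To give a rough sense of the one-to-many mapping described above, where a single action from the musician drives sound and the virtual body at once, here is a minimal hypothetical sketch in Python; none of the names, values or structures below are taken from the Quartet software.

```python
# Hypothetical sketch: one gesture reading fans out to several targets at once.
# Nothing here is taken from the Quartet system; names and scalings are invented.

from dataclasses import dataclass

@dataclass
class BowReading:
    pressure: float   # 0.0-1.0, from a hypothetical bow-arm sensor
    velocity: float   # 0.0-1.0, speed of the bowing action

def route_gesture(reading: BowReading) -> dict:
    """One action from the musician drives several outputs simultaneously."""
    return {
        # excite one of the instruments
        "synth_amplitude": reading.pressure,
        # drive a body part of the virtual dancer
        "avatar_right_arm_angle": reading.velocity * 90.0,   # degrees
        # scale the avatar's overall speed of movement
        "avatar_speed": 0.5 + reading.velocity * 0.5,
    }

if __name__ == "__main__":
    print(route_gesture(BowReading(pressure=0.7, velocity=0.4)))
```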
Is Stevie improvising this performance live?
We are going to have to set a lot of it, although she is primarily an improviser and we began with improvisation. We have tried out many of Stevie's violin playing techniques, such as cross-bowing and plucking, to discover which actions are most useful for driving the dancer's movements and which have the most intuitive relationships for Stevie as an instrumentalist. Our work together in July and August this year was the first chance we've had to connect the two systems as a creative investigation rather than a systems test. The idea is that we make presets, or states of sensitivity, in the virtual body that Stevie can improvise within. This is very refined work and needs a lot of concentrated effort from Stevie and the team to calibrate her instruments and the interface with the virtual dancer. At the same time we are looking for visual keys that let the audience understand the connections playing out in real time.
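The "presets or states of sensitivity" can be pictured as per-limb gains that decide how strongly incoming gesture data moves each part of the virtual body; the sketch below is purely illustrative, and its preset names and values are invented.

```python
# Hypothetical sketch of "presets or states of sensitivity":
# each preset scales how strongly incoming gesture data moves each part
# of the virtual body. Preset names and gain values are invented.

PRESETS = {
    "close_up_arm": {"right_arm": 1.0, "left_leg": 0.0, "head": 0.1},
    "whole_body":   {"right_arm": 0.6, "left_leg": 0.6, "head": 0.4},
}

def apply_preset(preset_name: str, gesture_value: float) -> dict:
    """Scale one normalised gesture value (0.0-1.0) by the active preset."""
    gains = PRESETS[preset_name]
    return {part: gesture_value * gain for part, gain in gains.items()}

if __name__ == "__main__":
    # The same bowing action produces different body responses per preset.
    print(apply_preset("close_up_arm", 0.8))
    print(apply_preset("whole_body", 0.8))
```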
So Stevie is really choreographing the virtual dancer...?
That’s it, or rather her sound and gesture. The other duet is between the live dancer and the motion-controlled robot camera that Gerald Thompson, Glenn Anderson and Scott Ebdon created for the project. The dancer [Carlee Mellow] is wearing three sensors: two 2D sensors, one on her shin and one on her thigh, and a 3D sensor on her chest. The data collected from her sensors becomes the information that runs the motion-controlled robot camera.
The robot has three limbs and five motors, each operating on an axis, and then there’s a small surveillance camera mounted on the robot’s ‘forehead’ capturing its real-time point of view. This will be projected during the performance. So the actions of the real dancer are being replicated by the robot, and there will be a video projection of the footage shot by the robot camera. This moving image is a representation of the dancer’s movement—from her point of view.
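The dancer-to-robot relationship can likewise be imagined as a mapping from a handful of sensor channels onto the robot's five motor axes; the sketch below is an illustration under invented assumptions (channel names, axis assignments and ranges), not the project's control code.

```python
# Hypothetical sketch of dancer sensors driving a five-axis robot camera.
# Sensor channels, axis assignments and angle ranges are all invented.

def sensors_to_axes(shin: float, thigh: float, chest: tuple) -> list:
    """Map three sensor readings (all normalised 0.0-1.0) to five motor angles."""
    chest_x, chest_y, chest_z = chest
    return [
        shin * 180.0,     # axis 1: base rotation follows the shin sensor
        thigh * 90.0,     # axis 2: first limb follows the thigh sensor
        chest_x * 90.0,   # axis 3: second limb follows chest tilt
        chest_y * 45.0,   # axis 4: camera pan follows chest turn
        chest_z * 45.0,   # axis 5: camera tilt follows chest lean
    ]

if __name__ == "__main__":
    print(sensors_to_axes(shin=0.2, thigh=0.5, chest=(0.1, 0.6, 0.3)))
```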
And the data from the real dancer is also going to the virtual dancer, to her left leg and chest. The data from Stevie is going to the virtual dancer’s head, the right leg, the arms, the hands and the right fingers. Part of the reason we did that is that it’s incredibly difficult for Stevie to meaningfully control the virtual body, to get a sense of connection or flow through the body. In the first stages of Stevie’s connection to the virtual dancer we could only work with ‘close-ups’ on an arm or a leg. We needed to make connectivity through the whole of the virtual body and expand the connection between the dancer and the musician. This is a crucial point because the project is about the transfer of information, how it changes from medium to medium. So the musician and the dancer are controlling different parts of the virtual dancer’s body, but the live dancer’s stuff isn’t programmed at all—it’s responsive.
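The split Medlin describes, with the two performers driving different parts of the virtual dancer's body, suggests a simple routing table; again the code below is a hypothetical sketch rather than the project's actual patch, and the channel names are invented.

```python
# Hypothetical sketch of how the two performers' data streams might be
# divided across the virtual dancer's body, following the split described
# above. The routing table is illustrative, not the project's actual patch.

ROUTING = {
    "dancer":   ["left_leg", "chest"],
    "musician": ["head", "right_leg", "arms", "hands", "right_fingers"],
}

def route_frame(dancer_data: dict, musician_data: dict) -> dict:
    """Merge one frame of data from each performer onto the virtual body."""
    frame = {}
    for part in ROUTING["dancer"]:
        frame[part] = dancer_data.get(part, 0.0)
    for part in ROUTING["musician"]:
        frame[part] = musician_data.get(part, 0.0)
    return frame

if __name__ == "__main__":
    print(route_frame({"left_leg": 0.4, "chest": 0.9},
                      {"head": 0.2, "arms": 0.7}))
```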
So the ‘real’ dancer is filling a gap in the information—and is the dancer responding to the musician?
Sometimes yes. There are a number of segments within the performance, for example there’s a duet between Stevie and the real dancer, a duet between Stevie and the virtual dancer, a duet between the real and virtual dancers, a trio between the robot, the virtual dancer and the real dancer etc, building to the quartet.
So the project is about the relationship between the virtual, real and mechanical, but also the observation of those relationships?
An observation and a highlighting of each of those relationships and what happens with the transfer of material, data, intention between them. But also the essence of where that information is coming from—for instance the real people, the musician and the dancer. I’m hoping that the nature of the source of the data will be highlighted as well—what you are drawn to, what’s alluring, what catches and holds attention, where you think the power lies between the virtual, the mechanical and the live elements.
Your interest in the dancer’s point-of-view was apparent in the film you did with Sandra Parker, In the Heart of the Eye. What is it about this particular line of research that keeps you coming back?
It’s about the choreography of cinematic space. There’s been a whole other stage of this research in between, with my three-screen work Miss World 2002, which has not been shown in Australia, and it started with Elasticity and Volume [installation, 1998]. When you look at the footage of the dancer’s POV alone, it doesn’t make any sense and can’t hold your interest; it’s confusing and irritating. As soon as you put the dancer with it, it makes complete sense and is really fulfilling. I’m interested in the relation between that out-of-control image and the sense of embodiment you get when you put that image in a relationship with the ‘source’ dancer. It’s about the poetics of looking, creating an imagination of looking.
Rebecca Hilton has been the choreographer involved in the first two stages of development. The other choreographers involved in the project are Lisa Nelson (USA), Lea Anderson (UK) and Russell Maliphant (UK). What will they each contribute?
Rebecca made some of the material that Carlee will develop for the performance—a set of alphabetic building blocks—but Lea, Russell and Lisa haven’t been involved at all yet. They’ll come to the rehearsal and each have 10 days. I have asked Russell to work with Carlee on the dancer’s solo and to work with Stevie on the gestures and sounds that she makes and how that impacts on the virtual dancer or vice versa. I’ll ask Lea to work with Carlee and the robot camera and I’ll ask Lisa to work with Stevie and Carlee on the improvisational relationship between the two ‘real’ elements.
They are all very different choreographers in terms of style. Will that complicate things?
I think it’s all very complicated. Having different artists exploring these complicated systems gives an expanded idea of what is possible. I am interested to see how people can use these systems. They will not be given any time to make technical developments but will be asked what they can get from the systems we have developed.
Quartet will be presented at The Great Hall, St Bartholomew's Hospital, London, February 14-18, 2007. Quartet has been funded by Sciart Production Awards 2005-6 in collaboration with the Physiological Laboratory at Cambridge University and the Arts Council of England. It is co-produced by the Performance and Digital Media Department at the Institute of Contemporary Arts, London, with support from the Australia Council for the Arts, Arts Victoria, and ZKM Center for Art and Technology, Germany. www.quartetproject.net
RealTime issue #76 Dec-Jan 2006 pg. 28
© Erin Brannigan; for permission to reproduce apply to [email protected]