‘Zeropointenergy (Ryan Jordan) Interview’

Ellie Mills, IP 1 Magazine, 2007

Ryan Jordan talks about his recently developed Movement and Gesture Interface, which controls music through his own physical movements.

What is your commercial name, and why?
It generally depends on what I’m playing or performing, but I mainly go under the name Zeropointenergy when I’m doing noise or gabber stuff. I came across it when I was searching the internet for brainwave frequencies and found certain frequencies where, apparently, magic windows appear which can be sources of energy. I followed some links and got to Zero Point Energy, which is something to do with physics and extracting energy out of a vacuum; just do a Google search and you’ll get a better picture.

What exactly is the MGI?
MGI is just an abbreviation of Movement and Gesture Interface. Basically, it’s a MIDI controller that maps bodily movements and gestures by attaching two different types of sensors to the body: one to the head and one to the fingertips. It can control anything that uses MIDI, so, for example, you could control parameters in Reason, Cubase, soft synths and samplers, etc.
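
To give a rough idea of what controlling parameters over MIDI involves, here is a minimal Python sketch, not taken from the MGI itself, of how a raw sensor reading might be scaled into the 0–127 range a MIDI control change value expects; the sensor range used is an assumption made purely for the example.

    # Rough sketch (not from the MGI) of the scaling any MIDI controller
    # has to do: squeeze a raw sensor reading into the 0-127 range of a
    # MIDI control change value. The 10-bit sensor range is an assumption.

    RAW_MIN, RAW_MAX = 0, 1023      # hypothetical raw sensor range
    CC_MIN, CC_MAX = 0, 127         # MIDI CC values are 7-bit

    def sensor_to_cc(raw: int) -> int:
        """Clamp a raw reading and rescale it to a MIDI CC value."""
        raw = max(RAW_MIN, min(RAW_MAX, raw))
        return CC_MIN + (raw - RAW_MIN) * (CC_MAX - CC_MIN) // (RAW_MAX - RAW_MIN)

    print(sensor_to_cc(512))        # -> 63, roughly the middle of the CC range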

How did you come up with the idea?
It came from a feeling, well more of a frustration really, with composing and performing with a computer. I haven’t been trained to play a traditional musical instrument, and I like the idea of physically moving to generate sound because it’s a more direct way of connecting to the sound world; your whole body becomes involved. Using a computer you’re kind of limited to the keyboard, and I wanted a greater physical freedom than that.

How does it work?
There is an accelerometer attached to the head which measures tilt on two axes (up-down and left-right), and light sensors attached to the fingertips which measure the level of light in the immediate environment. These sensors send their data to a microcontroller (a Basic Stamp) which has been programmed to send out MIDI data to a computer.
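
The chip itself is programmed in PBASIC rather than anything shown here, but as a loose sketch of the same sensor-to-MIDI idea, the Python below (using the mido library) polls two stand-in sensor readings and sends them out as MIDI control changes. The port name, the CC numbers and the read_*() helpers are all assumptions made for illustration, not details from the interview.

    # Loose illustration of the sensor-to-MIDI loop, not the PBASIC that
    # actually runs on the Basic Stamp. mido, the port name, the CC
    # numbers and the read_*() stand-ins are assumptions.
    import random
    import time

    import mido  # pip install mido python-rtmidi

    def read_tilt_x() -> int:
        """Stand-in for the head accelerometer's x-axis reading (0-127)."""
        return random.randint(0, 127)

    def read_light() -> int:
        """Stand-in for a fingertip light-sensor reading (0-127)."""
        return random.randint(0, 127)

    with mido.open_output("MGI demo", virtual=True) as port:
        while True:
            # Head tilt on CC 1, fingertip light level on CC 2 (arbitrary choices).
            port.send(mido.Message("control_change", control=1, value=read_tilt_x()))
            port.send(mido.Message("control_change", control=2, value=read_light()))
            time.sleep(0.02)  # roughly 50 updates per second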

Who have you been influenced by, and how?
I think I’m influenced by pretty much anything that I come into contact with; it all affects me in some way. Lately I’ve been influenced mainly by my surroundings and friends, reading Jung and alchemy stuff, and doing a performance with KK Null and Z’ev. I’m unsure in what ways that has influenced my work; it’s probably easier for other people to answer that question.

What sort of music do you generally use to perform with the MGI?
Currently it’s noise and experimental stuff. I’m using samples that I have generated myself, triggering and manipulating them live with the MGI. I might expand from noise into some other kind of music, but I’m not sure yet as I’m still developing the software and hardware.

Could you use any type of music with the MGI, for instance rock or jazz?
Yeah, you can use it for anything you want because it works over MIDI; it’s up to the user to do what they want with it. It’s not designed for any specific musical style; it’s just an attempt at creating a freer way of performing when using a computer.

Would you class it as a new musical instrument?
Yes and no. It’s not really a brand new thing; many artists have used body mapping to control sound. It’s definitely not a traditional musical instrument. It isn’t really an instrument on its own either; it’s a controller, because it still needs software. Together with the software, whether that’s Reason, Max/MSP or whatever, it could be termed an instrument.

Would it ever be possible to form a band with a different aspect of the music being controlled by each different member?
Yes, it is very possible and it’s happening everywhere! A traditional rock band has each member controlling a different aspect of the music; so does an orchestra, and a choir too. Just imagine a choir with accelerometers strapped to their heads, all spinning around!

Could you ever integrate visuals that respond to the changes in sound you create, working in unison with the music?
Yeah, I’m actually working on a Max patch at the moment that does exactly that, and it should be ready for my next performance, so I’ll be controlling sound and visuals.

Would you ever consider making a version of the MGI that integrated movements from the whole body?
Yes, definitely. That’s the plan for the next year. I’m intending to put bend sensors on my wrists and across my back, motion sensors on either side of a stage and some other sensors that I haven’t decided on yet. There are already artists who have used pretty much the whole body, measuring everything from eye movements to pulse, brainwaves and even blood flow.

Would you call what you create art or music, or a mix of them both, e.g. audio-visual art? Or, simply, performance?
I think it’s somewhere in between all of them. It’s art-sound-interactive-performance.

What are your objectives for the future of the MGI?
Further development with more sensors! Also, the software needs more development to allow greater control in performances. Oh, and more gigs! If anyone wants any more info, feel free to e-mail me at: zero_point@hotmail.co.uk.