
“A high-tech music weekender, BEAM is a playground of homemade instruments and sonic robots, focused on electronic music being performed or created LIVE.”

Performing Music for Flesh II | Picture by Christos Michalakos

The BEAM Festival was an inspiring event: packed with an incredible line-up including Stelarc, Atau Tanaka and Adam Parkinson, Alex Nowitz, DJ Sniff, Ryan Jordan and many others, it offered three days of absolute sonic pleasure. My colleague Christos Michalakos and I were each awarded a bursary for “outstanding work”, for Frrriction and Music for Flesh II respectively, so we had the pleasure of performing at the Artaud Building during the evening concert sessions.
A complete report of the event is available on my personal blog.

STEIM Crackle Box

The Linux Audio Conference (LAC2011) was held at the National University of Ireland (NUI) Maynooth at the beginning of May.
My paper “Xth Sense: researching muscle sounds for an experimental paradigm of musical performance” and a related performance “Music for Flesh II” were both selected for the conference.
So far, the LAC has been one of the best conferences I have taken part in, both for the diverse, top-notch content on offer and for the ease of social networking. A great place to be, indeed.

Thanks go to Victor Lazzarini, John Lato, Frank Neumann, Robin Gareus, Rory Walsh and the whole NUI and LinuxAudio.org team. All presentations were streamed live and recorded. Definitely a great and valuable effort.
If you missed the event, make sure to check the program on-line; you will find an impressive source of ideas, knowledge and technical insights.
Here’s a complete video recording of my talk: an in-depth presentation of earlier methods and findings on the Xth Sense and muscle sounds.
The slides used in this presentation are also available for download at the LAC website.
All material is CC licensed, see the LAC website for detailed information.

Drawing on the stimulating conversations we all had in Maynooth, I was able to extract some more ideas around muscle sounds. Above all, some friends made me realize that I still hadn’t analyzed muscle sounds in depth in terms of FFT and detailed frequency responses; although FFT has not proved effective so far, some statistical documentation of the sonic matter I’m working with could well disclose new insights.
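As a first step in that direction, here is a minimal sketch of the kind of offline analysis I have in mind: an averaged magnitude spectrum of a recorded MMG signal. It is written in Python outside of the actual Xth Sense software, and the file name, window size and 100 Hz display limit are only assumptions for the example.

    # Averaged magnitude spectrum of a recorded MMG signal (sketch).
    # "mmg_take1.wav" is a placeholder for a mono recording of the raw sensor output.
    import numpy as np
    from scipy.io import wavfile

    rate, mmg = wavfile.read("mmg_take1.wav")
    mmg = mmg.astype(np.float64)
    peak = np.max(np.abs(mmg))
    if peak > 0:
        mmg /= peak                           # normalize to [-1, 1]

    win = 4096                                # analysis window in samples
    hop = win // 2
    frames = [mmg[i:i + win] * np.hanning(win)
              for i in range(0, len(mmg) - win, hop)]

    # Average the magnitude spectra over all frames to get a rough
    # statistical picture of where the energy of the muscle sound sits.
    spectra = np.abs(np.fft.rfft(np.array(frames), axis=1))
    mean_spectrum = spectra.mean(axis=0)
    freqs = np.fft.rfftfreq(win, d=1.0 / rate)

    # Print only the low end of the spectrum, where the MMG energy is expected.
    for f, m in zip(freqs, mean_spectrum):
        if f > 100:
            break
        print(f"{f:6.1f} Hz  {20 * np.log10(m + 1e-12):7.2f} dB")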
Besides, a colleague working in the field of biomedical engineering reminded me that the human body is composed of two different kinds of muscle, namely short and long muscles. Whereas long muscles are used for slow and light contractions, short muscles handle faster and stronger contractions. This observation might be useful when planning the location of four sensors on the performer’s body. I haven’t been able to try this myself yet, but it is one of the next ideas I would like to actualize this year.

The Xth Sense workshop has just ended, and it was an exciting experience. We were hosted by the almighty NK Berlin, an established independent venue for experimental music, DIY electronics and the like.
Over three days we built Xth Sense biosensors from scratch, installed and tried out the Xth software, and had a short training session on biophysical gestural control of music. Definitely a lot to do in only three days!

Building the Xth Sense biosensor from scratch

Participants multitasking

DIYing biophysical sensors

It was the first time the Xth Sense was unveiled in such detail. On the one hand, the sensor hardware proved fairly easy to build: even the participants with little prior experience of electronics and soldering successfully built their own sensors. On the other hand, I realized the software needs to be much more portable; we spent too much time dealing with the paths of the required libraries, and I eventually realized that we were struggling with a known bug!
Too bad. Lesson 1: check bug trackers regularly.

Ghost in the GUI

Patching the Xth Sense system

I was pretty satisfied with our workflow, although I realized that three days are not enough to achieve the ideal format for such a workshop. In the end everyone built their own sensor, and the Xth software was up and running on almost all the machines. The only real problem was getting the software to work on two Mac G4s.
It seemed they couldn’t run the software because the CPU usage was too high, so after some unlucky attempts and on-the-fly optimization of the software, we had to move to some other laptops.
Lesson 2: although it is sometimes cool to be able to use a very old machine, make sure beforehand that nobody will be using one during your workshop.

Students playing with the biophysical sensors

More playing, get those sounds out!

It was extremely satisfying to see my students playing around with the system on the last day. It was good to confirm that the Xth Sense has real potential to extend the portability of biosensing technologies and gestural control to a wider audience. I wish we had had more time for the hands-on training. Observing how differently each individual relates to sound has always fascinated me, and a course based on such a bodily paradigm provides a transparent insight into the different expectations and skills of each participant.
Lesson 3: DIY is good, but remember to leave enough time for the fun.

I want to thank the guys at NK; they were extremely helpful and lovely companions. I’ll have to come back to Berlin soon for another workshop session.

I started a small tour with the Xth Sense system.
The first event I was invited to was the ImagineCreate Festival, in Derry, Northern Ireland. ImagineCreate is “a digital arts festival which brings together a world-class line-up of talents and minds from the worlds of art, design, new media and software development.”
I gave a talk about the Xth Sense and biophysical music within a presentation titled “Performance and Pure Data” in collaboration with Richard Graham.

Here’s a short video excerpt of the showcase captured by James Alliban.

It was quite good to take part in the event, as I received some very good feedback and suggestions. Among the other invited speakers were the above-mentioned Richard Graham, James Alliban, Alex Beim (Tangible Interaction), John Crooks, Gregory Taylor (Cycling ’74) and Brendan McCloskey. A full list of speakers is available on-line.

Many interesting issues were raised after my talk, so I added some items to my to-do list.
First, I plan to implement tracking of the muscle sound contour. This way I could deploy the contour as a dynamic curve for data mapping, obtaining different results than with a preset curve such as a logarithmic or exponential one.
Secondly, I could compare the incoming data from each arm by passing them to a discrete function; the resulting value would represent the ratio between the energy applied by the two arms. It might also be interesting to track rest intervals over long periods of time.
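To make these two ideas a bit more concrete, here is a rough sketch of how they could work, written in Python outside of the actual Pd framework; the block size, smoothing factor and parameter ranges are arbitrary assumptions for the example.

    # Sketch: muscle-sound contour as a dynamic mapping curve, plus an
    # inter-arm energy ratio. Block size and smoothing are arbitrary choices.
    import numpy as np

    BLOCK = 512          # samples per analysis block (assumption)
    SMOOTH = 0.9         # one-pole smoothing factor for the contour

    def contour(block, previous, smooth=SMOOTH):
        """RMS of one block, smoothed against the previous contour value."""
        rms = np.sqrt(np.mean(block ** 2))
        return smooth * previous + (1.0 - smooth) * rms

    def map_with_contour(value, curve, lo=0.0, hi=1.0):
        """Scale a control value through the measured contour (0..1)
        instead of a fixed logarithmic or exponential curve."""
        return lo + (hi - lo) * value * np.clip(curve, 0.0, 1.0)

    def arm_energy_ratio(left_block, right_block, eps=1e-9):
        """Ratio between the energy applied by the left and right arm."""
        return np.sum(left_block ** 2) / (np.sum(right_block ** 2) + eps)

    # Example with noise standing in for the two MMG channels.
    left = np.random.randn(BLOCK) * 0.2
    right = np.random.randn(BLOCK) * 0.1
    c = contour(left, previous=0.0)
    print("contour:", c, "energy ratio:", arm_energy_ratio(left, right))
    print("mapped filter cutoff:", map_with_contour(0.5, c, lo=100.0, hi=4000.0))

Rest intervals could then be tracked simply by time-stamping the moments when the contour stays below a small threshold.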

Rehearsing Music for Flesh II at Inspace, UK.

I have just finished a new work residency at Inspace, home to a joint research partnership between the School of Informatics and New Media Scotland.
During the residency, Brendan F Doyle (Production Sound Engineering) and I worked on a participatory Musicircus, which included a complex, reactive sonic system for eight Xth Sense biosensors and eight audio channels.
The piece was premiered two days ago, on 15th March, during a public event which gathered a good number of people from different backgrounds: music, sound design, informatics, hackers and the general public.

It was awesome to see how positively the audience reacted to this unconventional paradigm of musical performance, namely generating and controlling electroacoustic sounds and music through muscle contractions. After some initial (and expected) disorientation, everybody started to work out how to use the sensors and how to interact with the system; only then did the actual Musicircus start.

The event was pretty successful: everything worked out fairly smoothly (except for the few things which always “have to” break just before the opening) and we received plenty of excited and stimulating feedback. Overall, a great and satisfying experience.

During the same night I also performed my new solo sonic piece Music for Flesh II, for augmented muscle sounds and Xth Sense technology. The acoustics of Inspace were challenging: glass walls all around, an empty floor acting as a resonant shield, and interesting sonic reflections happening throughout the space. I need to thank my colleague Brendan again for his great work on the sound engineering of the venue and his live mixing during my performance.

Picture courtesy of Chris Scott.

My paper “Xth Sense: researching muscle sounds for an experimental paradigm of musical performance” was accepted to the Pure Data mini conference, along with the related sound piece Music for Flesh II.
The Pd mini-con took place in Dublin, at Trinity College. It was my first academic presentation of the Xth Sense outside of my own department; definitely an exciting chance to get new feedback and establish a new group of peers.
The experience was great, and the comments from the audience and staff were positive. I also had a few small technical problems during the performance… some good food for thought!

Performing Music for Flesh II | Pure Data mini conference | 2011

A full report of the event is available on my on-line magazine.

Before the holiday break I wanted to set a milestone for the inquiry, so I agreed with my supervisor Martin Parker to host a short presentation reserved for the research staff of our departments (Sound Design and Digital Composition).

MMG biosensing device presentation setup | The University of Edinburgh | 2010

I had been setting up the Sound Lab at Alison House since the morning, and everything worked fine.
Even though some of the researchers could not make it, I was happy to see Martin, Owen Green and Sean Williams.

MMG biosensing device presentation | The University of Edinburgh | 2010

Although the piece I performed was, for me, more a first presentation than a proper concert, it definitely caught the attention of the listeners and earned some good feedback; however, what I was most hoping for was constructive criticism that would allow me to take a different viewpoint on the present stage of the project.
In fact, I did receive several pieces of advice, which can be roughly summarized as follows:

  • harmony: improve the overall harmony of the piece
  • silence: including silence in a musical piece demonstrates bravery and coherence
  • sonic gesture: theatrical gesture is very important in a musical performance of this kind
  • add envelope generators to the software to enhance automation
  • try to use a MIDI device
  • study better dynamics compression to be applied to the MMG source signal (see the sketch after this list)
  • delays and reverbs: if used improperly, such effects can destroy the sonic space instead of creating it
  • the distortion effects I first used can be ambiguous
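On the compression point, here is a rough sketch in Python of the kind of simple compressor I have in mind for taming the raw MMG signal before further processing; the threshold, ratio and smoothing values are placeholders, not settings taken from the actual Xth Sense patch.

    # Sketch of a simple dynamics compressor for the raw MMG signal.
    # Threshold, ratio and smoothing are placeholder values only.
    import numpy as np

    def compress(signal, threshold_db=-24.0, ratio=4.0, smooth=0.995):
        """Sample-by-sample compressor with a smoothed peak detector;
        returns the gain-reduced signal."""
        out = np.empty(len(signal), dtype=np.float64)
        env = 0.0
        for i, x in enumerate(signal):
            # Envelope follower: instant attack, one-pole release.
            env = max(abs(x), smooth * env)
            level_db = 20.0 * np.log10(env + 1e-12)
            # Reduce gain above the threshold according to the ratio.
            over = max(0.0, level_db - threshold_db)
            gain_db = -over * (1.0 - 1.0 / ratio)
            out[i] = x * 10.0 ** (gain_db / 20.0)
        return out

    # Example: a burst of noise standing in for a muscle contraction.
    burst = np.concatenate([np.zeros(100), np.random.randn(400), np.zeros(100)])
    print("peak before:", np.max(np.abs(burst)),
          "peak after:", np.max(np.abs(compress(burst))))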

I fully agreed with these criticisms, and I realized I could have prepared the presentation much better. However, it was important to listen to my colleagues’ feedback, and that night I came back home and worked until late to improve the piece for the forthcoming concert.
One night was not enough to work on all the issues raised after the presentation, but I was able to experiment further with silence and subtle processing effects, envelope generators (which in the end were not used) and a MIDI controller. The results seemed very good.
The day after, I worked a couple more hours and then went to Alison House to set up the equipment for the concert together with my colleagues Matthew and Marcin, who also performed on the same night. This time I prepared everything professionally, anxious as I was to present the revised piece.

Concert for Biosensing Techs | Edinburgh University | 2010

The setup consisted of the MMG sensor prototype, a Focusrite Saffire Pro40, a Behringer BCF2000 MIDI controller, a Dell machine running a customized Ubuntu Lucid Lynx with a real-time kernel, and the Pure Data-based software framework I’m developing.

Concert for Biosensing Techs | Edinburgh University | 2010

Audience feedback was very good, and what seemed to appeal most to the listeners was the authentic, neat and natural responsiveness of the system, along with an evocative coupling of sound and gesture. The concerts were supposed to be recorded, but sadly they were not.
Although some harmony issues remained, I was fairly satisfied with the outcome of the performance. During the winter break I plan to improve the prototype, possibly making it portable, and to refine the software, coding better control data mapping and fixing the omnipresent bugs.