Shelly's sonification residency

Posted April 4, 2017 by Shelly Knotts

During my week at FoAM I worked on Molecular Soundscapes, a project I am developing with chemists at Newcastle University. The aim of the project is to develop sound-based work that can accompany visualisations of protein dynamics, both to communicate the process of drug design to non-specialists and to aid scientists in analysing the effects of experimental drugs on specific proteins. The scientists I am working with have a long-term goal of developing drugs to tackle Alzheimer's disease, and the data they are working with relates to proteins in the human brain that are affected by the disease.

Alongside working on my own project I also helped out fellow Human-in-Residence Jo Garrett, who was recording underwater soundscapes as part of FoAM’s long-term Sonic Kayaks project. I helped Jo set up and troubleshoot frequency response tests of the pre-amps, sound cards and microphones used in the project's technical setup.

The Molecular Soundscapes project outputs include musical scores algorithmically generated from the data, and a sound installation which uses sonification - the translation of data into sound - to communicate the structure and movement of the protein. Prior to my week at FoAM I had developed the sound aspects of the installation and worked on multichannel spatialisation in the music studios at Culture Lab (Newcastle University’s Digital Cultures research lab). My main aims for the week were - with the advice of Dave and Amber - to resolve some of the conflicts between representing the science accurately and having an aesthetically pleasing sonification, and to think about ways of introducing interactive elements to the installation.

Here’s what I got up to during the week:

Monday:
We discussed approaches to sonification and various past FoAM projects involving sonification. We talked over the potential for adding interactive elements to my project, and the tensions between the artistic and scientific aims. Dave showed me his quipu sonification and we talked about the difficulty of finding patterns in dense and complex datasets. Dave suggested adding a threshold to the flexibility parameter - meaning only residues with a flexibility value over the threshold would be sonified - to thin out the texture.
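
To give a rough idea of the thresholding, here's a minimal sketch in SuperCollider - the variable names and flexibility values below are just illustrative, not the project's actual code or data:

    (
    ~flexibility = [0.12, 0.85, 0.33, 0.91, 0.05, 0.67];  // example per-residue values
    ~threshold = 0.5;

    // keep only the indices of residues whose flexibility is over the threshold
    ~activeResidues = (0..(~flexibility.size - 1)).select({ |i| ~flexibility[i] > ~threshold });
    ~activeResidues.postln;  // -> [ 1, 3, 5 ]
    )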

Tuesday:
I implemented the flexibility threshold and added a slider so I could change the threshold value. This was pretty effective: the sound became much less dense, and it was much easier to pick out patterns in the pitches that were left. Using the slider to vary the threshold also meant I could vary the density of the sonification and change the patterns being made by the pitches.
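
The slider itself only needs a few lines of SuperCollider GUI code. Something along these lines, assuming the sonification routine reads ~threshold elsewhere (again, the names are illustrative):

    (
    var win = Window("flexibility threshold", Rect(100, 100, 320, 60)).front;
    EZSlider(win, Rect(10, 10, 300, 30), "threshold",
        ControlSpec(0.0, 1.0, \lin, 0.01, 0.5),
        { |ez| ~threshold = ez.value });  // the sonification reads ~threshold when choosing residues
    )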

We also talked about some of the challenges of working on scientific projects, and Amber and Jo found some papers for me which helped clarify aspects of the science that I didn’t fully understand.

Wednesday:
Part of the sonification maps the distance of each residue from its starting position to pitch over time. When I played my sonification for the chemists in the studio at Culture Lab they said it was confusing that I had used the absolute values in the mapping - i.e. each residue's pitch trajectory started on a different pitch. So today I normalised the values so that the pitch trajectory starts on the same pitch for every residue.
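
In sketch form the normalisation looks something like this - the distance data and pitch range are made up for illustration, and the real mapping sits inside a larger patch:

    (
    ~distances = [              // one array of distances over time per residue (made-up values)
        [2.0, 2.3, 2.9, 2.4],
        [5.1, 5.3, 5.8, 6.2]
    ];

    // subtract each residue's starting distance so every trajectory begins at zero...
    ~normalised = ~distances.collect({ |res| res - res[0] });

    // ...then map the displacement onto pitch, so every residue starts on the same note
    ~pitches = ~normalised.collect({ |res|
        res.collect({ |d| d.linlin(0, 4, 60, 72) })  // 0-4 units maps to MIDI notes 60-72
    });
    ~pitches.postln;
    )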

Thursday:
Another challenge of the project has been collaboratively developing marketing text for the events that is both scientifically accurate and understandable to non-specialists. As Dave and Amber have a lot of experience in this area, I worked on some text with them to give an overview of the project.

FoAM's upcoming Viruscraft project also has a lot in common with Molecular Soundscapes in terms of the scientific concepts. We talked about Viruscraft and my project and looked at lots of different ways that scientists notate, codify and describe molecular biology. Amber made some drug-protein complexes out of plasticine to help clarify some issues.

Friday:
Final day at FoAM! I helped out with sending Jo and Aidan off in the kayak to record some deeper underwater habitats.

I also added some more interactive elements to the installation: volume sliders for each layer of the sonification, and a ‘time’ slider which lets you skip to different time frames in the data.
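
As a rough sketch - with made-up layer names, and assuming the playback routine reads the ~amps and ~frame variables - the extra controls look something like this:

    (
    var win = Window("installation controls", Rect(100, 100, 340, 170)).front;
    var layers = [\flexibility, \distance, \structure];  // stand-ins for the actual layers

    // one volume slider per layer, written into a dictionary the sound layers can read
    ~amps = ();
    layers.do({ |name, i|
        ~amps[name] = 0.5;
        EZSlider(win, Rect(10, 10 + (i * 35), 320, 30), name.asString,
            ControlSpec(0, 1, \lin, 0.01, 0.5),
            { |ez| ~amps[name] = ez.value });
    });

    // 'time' slider: jump to a different frame of the dataset (frame count made up here)
    EZSlider(win, Rect(10, 10 + (layers.size * 35), 320, 30), "time",
        ControlSpec(0, 499, \lin, 1, 0),
        { |ez| ~frame = ez.value.asInteger });
    )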


Finally I started working on sending messages from SuperCollider (which generates the sound) to Chimera (the protein visualisation software), so that I can highlight which part of the protein is being sonified at any given time.
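
On the SuperCollider side this is just sending messages to a network address. The sketch below assumes a small helper script running alongside Chimera that listens on a given port and translates the messages into highlighting commands - the address, port and message path are placeholders rather than an existing Chimera interface:

    (
    ~chimera = NetAddr("127.0.0.1", 9000);  // placeholder address and port for the bridge

    // tell the visualisation which residue is currently being sonified
    ~highlightResidue = { |index|
        ~chimera.sendMsg("/highlight", index);
    };

    ~highlightResidue.(42);  // e.g. highlight residue 42
    )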
