Feedback Piano
Password to watch the above Feedback Piano video: RGibson96
https://player.vimeo.com/video/428082627?h=a20e7d970e&dnt=1&app_id=122963
In this project, I wanted to explore how interactive music systems and feedback could encourage conscious musical decision making from audiences and interactors in an installation. This responds to the fact that installations featuring audience interaction typically use audience members as controllers, catalysts or performers; audiences rarely have the opportunity to contribute their own ideas within a sound installation.
At first, I was interested in creating a patch that allowed audiences to draw or type instructions, in the fashion of a graphic score, which were then to be interpreted by a set of performers. However, I was concerned that the audience would still be separated from any sound making: it would be up to the performers to understand and interpret these ideas, which might not reflect the intentions of the audience as written in the score.

Because of this, I decided to use the piano as a controller. A person can interact with it within a space and, using Max, an interactive patching environment, the immediate consequences of their actions are heard in the electronic part. I chose a piano rather than motion tracking or other controllers because it is linear in pitch and does not require correct technique or intonation to trigger sound, as it would on an instrument such as a violin or flute; the keys simply act as triggers for the strings to be struck when pressed. At the same time, we retain semantic knowledge of how the instrument functions because of its prominence in Western classical and popular music. A PlayStation controller, by contrast, may offer more sonic potential as a blank canvas for sound, but it is not traditionally used as a tool for sound making.

Interactors are encouraged to explore and experiment with the sounds possible by placing objects around the piano and on top of it, e.g. paper between the strings, or house keys resting on the strings. This sound is picked up by a microphone and input into Max via a DI box. To create a physically unified world of sound between the electronics and the acoustic instrument, a transducer speaker is placed within the piano, acting as the speaker for the electronics alongside the acoustic sounds created by the instrument.
Musical decision making is encouraged through the immediate cause-and-effect reactions created in Max. The fiddle~ object analyses the input frequencies and peakamp~ analyses the amplitude. This information triggers events through different thresholds: for example, loud, high sounds turn on a transposer, while soft, low sounds advance a counter that switches on the audio buffer. To extend how the sound triggers effects, positive and negative feedback systems change parameters of the delay, the transposition and the sample-playback tools. This creates a larger variety of resulting events than simply mapping pitch and amplitude directly onto effect parameters in the patch.
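The routing logic described above can be sketched in code. This is an illustrative Python sketch, not the actual Max patch: the threshold values, event names and feedback gain are hypothetical stand-ins for the parameters used in the installation.

```python
def route_events(pitch_hz, amplitude_db,
                 high_pitch=1000.0, loud_db=-12.0,
                 low_pitch=150.0, soft_db=-30.0):
    """Mimic fiddle~/peakamp~ analysis feeding threshold logic.

    Loud, high sounds enable the transposer; soft, low sounds
    advance the counter that switches on the audio buffer.
    All threshold values here are hypothetical.
    """
    events = []
    if pitch_hz >= high_pitch and amplitude_db >= loud_db:
        events.append("transposer_on")
    if pitch_hz <= low_pitch and amplitude_db <= soft_db:
        events.append("buffer_counter_tick")
    return events


def update_parameter(current, target, gain=0.3):
    """Negative-feedback style parameter update: nudge an effect
    parameter (e.g. delay time) toward a target value, damping
    runaway changes rather than jumping directly to the target."""
    return current + gain * (target - current)
```

The point of the sketch is the shape of the mapping: analysis values cross thresholds to trigger discrete events, while feedback loops continuously reshape effect parameters, so the same input can produce different results depending on the patch's current state.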
To avoid constant triggering and overloading within the patch, a noise gate is applied to the input sound, with a threshold adjustable to the environment the piano is in. A negative feedback filter and a limiter are added at the end, before all the effects are sent to the output, to stop sounds from becoming too loud or piercing. The buffer contains samples for playback when triggered: recordings of piano improvisations, edited in the software Cecilia5 to add more variety and interest to the sound world created, rather than relying strictly on the sounds created on the piano at any given moment.
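The two safety stages at either end of the chain can be sketched as simple per-sample operations. This is a minimal Python illustration of the idea, assuming normalised samples in the range -1.0 to 1.0; the threshold and ceiling values are hypothetical, and the actual patch works on audio signals in Max rather than Python lists.

```python
def noise_gate(samples, threshold=0.02):
    """Input stage: zero out samples quieter than the gate threshold,
    so room noise cannot constantly trigger the patch. The threshold
    is adjustable per venue."""
    return [s if abs(s) >= threshold else 0.0 for s in samples]


def limiter(samples, ceiling=0.9):
    """Output stage: hard-clip samples so the summed effects can
    never exceed the ceiling and become painfully loud."""
    return [max(-ceiling, min(ceiling, s)) for s in samples]
```

In the installation these sit at opposite ends of the signal path: the gate protects the analysis and triggering logic from low-level noise, and the limiter protects listeners from the combined output of the feedback-driven effects.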