The first few seconds of this video are deliberately black screened. Do not adjust your television set.

Over the past several years, miniaturisation in electronics has made it possible to move computing off the desktop or laptop and place processing power and sensors directly within interactive objects. This phenomenon is sometimes called pervasive computing. It means that, as Human-Computer Interaction (HCI) designers, we have the opportunity to rethink the way we interact with artefacts that rely on digital processing to do their job. We can break free from the overtly machine-like nature of interfaces, and create something more instinctive and ‘natural’.
It’s my contention that digital musical instruments are no exception. In this, my master’s thesis (for which I proudly received a 12), I created a digitally augmented snare drum. Aside from changing between presets, all interaction is based on accepted techniques for acoustic drums, and all the sound is processed, synthesised, amplified and transmitted from within the drum itself.
Evaluation showed that, as with all musical instruments, personal tastes vary, but new performance capabilities were clearly identified without the need to learn new techniques or battle with parameter adjustments.

While there are a great variety of digital musical interfaces available to the working musician, few offer the level of immediate, nuanced and instinctive control that one finds in an acoustic shaker. bEADS is a prototype of a digital musical instrument that utilises the gestural vocabulary associated with shaken idiophones and expands on the techniques and sonic possibilities associated with them. By using a bespoke physically informed synthesis engine, in conjunction with accelerometer and pressure sensor data, an actuated handheld instrument has been built that allows for quickly switching between widely differing percussive sound textures. The prototype has been evaluated by three experts with different levels of involvement in professional music making.
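To give a flavour of what “physically informed synthesis” of a shaker can mean, here is a minimal PhISEM-style sketch (Physically Informed Stochastic Event Modeling, in the tradition of Perry Cook's work): random bead collisions inject bursts of noise into a damped resonator. The bead count, decay constants and resonant frequency below are illustrative assumptions, not the values used in the actual bEADS engine.

```python
import numpy as np

def shaker(duration=0.5, sr=44100, n_beads=64,
           sound_decay=0.95, system_decay=0.999, freq=3200.0):
    """PhISEM-style shaker sketch: random bead collisions excite a
    damped two-pole resonator. All constants are illustrative."""
    n = int(duration * sr)
    out = np.zeros(n)
    shake_energy = 1.0      # energy injected by the shake gesture
    sound_level = 0.0       # envelope of summed collision noise
    # two-pole resonator state and coefficients
    y1 = y2 = 0.0
    r = 0.96
    c1 = 2.0 * r * np.cos(2 * np.pi * freq / sr)
    c2 = -r * r
    rng = np.random.default_rng(0)
    for i in range(n):
        shake_energy *= system_decay          # gesture energy dies away
        if rng.random() < n_beads / sr * 40:  # a bead collision occurs
            sound_level += shake_energy
        noise = sound_level * (2 * rng.random() - 1)
        sound_level *= sound_decay            # each collision rings down
        y = noise + c1 * y1 + c2 * y2         # resonate the noise burst
        y2, y1 = y1, y
        out[i] = y
    return out / (np.abs(out).max() + 1e-12)
```

Because the collisions are stochastic rather than sampled, changing a handful of parameters (bead count, decays, resonance) morphs the texture from maraca to tambourine-like sounds, which is what makes this family of models attractive for quickly switching between percussive textures.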

Toy or sculpture?

If you ever wanted a safe way to learn the art of snake charming — look no further!

This piece was designed to re-frame popular conceptions of technology. By creating a computer-mediated artefact that makes people more aware of their immediate surroundings, and the people in them, we aim to make people think about interaction on many levels.


Noise Floor Presentation Video from NoiseFloor on Vimeo.

Guthman Competition

I’m proud to be appearing at the Guthman Competition in Atlanta this year. From commercial products to imaginative stringed instruments to a contraption that allows a fetus to make music, this year’s competition pushes the boundaries of soundscapes and musical design.

I will be playing an augmented version of a medieval string instrument that I designed and built with Troels Hammer, Alex Baldwin and Edvinas Peciulis in 2015. We are amongst 24 semi-finalists, and the competition is hot!

Pat Metheny is amongst the list of previous winners, and judges. So that gives you an indication of how much practice I need to put in! Unless Troels decides he wants to perform instead, in which case I will be standing by with spare tennis strings if the instrument breaks.

Tromba Moderna


The Tromba Moderna

A Digitally Augmented Medieval Instrument

This was a wonderful project that came out of a proposal from Musikmuseet in Frederiksberg, Denmark.

We addressed their desire for increased visitor interaction by building a replica of an instrument in their collection that the public could play. The instrument we chose was slightly taller than the average double bass, had only one string, and sounded like a trumpet: The Tromba Marina.

We made the instrument more playable by feeding sound from the bridge, electronically, back into the body. We also simulated the sound made by the instrument’s vibrating bridge using Pure Data.
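The tromba marina gets its trumpet-like tone from a bridge with one free foot that rattles against the body. The actual simulation was a Pure Data patch, but the core nonlinearity can be sketched in a few lines: whenever the string signal exceeds a threshold, the bridge foot “slaps”, clipping the signal and adding buzz. The threshold and gain here are illustrative assumptions, not the values from the patch.

```python
import numpy as np

def bridge_buzz(string_signal, threshold=0.3, buzz_gain=0.7):
    """Crude sketch of a rattling tromba marina bridge: signal
    beyond the threshold is treated as the bridge foot slapping,
    producing a brassy buzz. Constants are illustrative."""
    x = np.asarray(string_signal, dtype=float)
    clipped = np.clip(x, -threshold, threshold)
    slap = x - clipped  # the part of the motion that lifts the foot
    # re-inject the excess with a nonlinear (square-root) shaping
    return clipped + buzz_gain * np.sign(slap) * np.sqrt(np.abs(slap))
```

Running a plain sine wave through this adds the dense upper harmonics that make the one-stringed instrument sound like a brass instrument rather than a bowed string.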

Great fun.

Speech Feature Extraction

For this project I designed an audio based computer game. The player controls the environment with their voice. The Pure Data patch that I wrote is able to distinguish between various vocal sounds.

  • Open vowels (‘aaaah’)
  • Hard consonants (plosives like ‘ka’ and ‘pa’)
  • Unvoiced sibilants (‘ssss’)
  • Voiced sibilants (‘zzzz’)

These four sounds were used to drive an external physical model of a D.C. motor.

Saying ‘ssss’ or ‘zzzz’ is effectively putting your foot on the gas, and saying ‘ka’ or ‘pa’ changes up a gear. ‘Aaaah’ turns on reverb and a ping-pong delay to simulate other traffic, and, for the icing on the cake, ‘sssss’ switches on a randomly sweeping notch filter, the centre frequency of which jumps further, at increased time intervals, the louder the input voice. This has the effect of making the motor sound like a speeding boat. ‘Zzzzz’ switches off the notch filter and engages a three-tap dynamic chorus effect, again controlled by amplitude. This makes the motor sound like a plane.
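The real feature extraction was done in Pure Data, but the basic idea behind separating these vocal sounds can be sketched with two classic frame-level features: energy and zero-crossing rate. The thresholds and category names below are illustrative assumptions; detecting plosives additionally needs an onset detector that compares energy across successive frames, which is omitted here for brevity.

```python
import numpy as np

def classify_frame(frame, sr=44100):
    """Toy frame classifier in the spirit of the patch described
    above. Heuristics (all thresholds are assumptions):
      - very low energy            -> silence
      - high zero-crossing rate    -> sibilant ('ssss'/'zzzz')
      - low ZCR with real energy   -> open vowel ('aaaah')
    """
    frame = np.asarray(frame, dtype=float)
    energy = float(np.mean(frame ** 2))
    # fraction of sample pairs where the waveform changes sign
    zcr = float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0)
    if energy < 1e-6:
        return "silence"
    if zcr > 0.3:        # hissy, noise-like spectrum
        return "sibilant"
    return "vowel"       # pitched, low-frequency-dominated
```

Voiced versus unvoiced sibilants (‘zzzz’ vs ‘ssss’) can then be separated by checking for low-frequency periodicity alongside the high ZCR, which is roughly what a pitch-detector object does in a Pure Data patch.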

Despite making people look quite silly, it proves to be quite an addictive waste of time.


Despite some glitches in the behavioural routines, this is the interactive product I am most proud of.

Having worked on this project as part of a small but superb team for a little over four years, I feel deeply invested in its future.

Move your hips!

How movement affects audience perception

In this experiment I set out to determine whether musicians can sell themselves more convincingly just by moving a bit more. It turns out they can. It may be obvious on an intuitive level, but obtaining proof is another matter!


You can download the full report here: PeceptionCognitionPeterWilliamsMP

The Awesome ACPAD

ACPAD – This new interface for musical expression looks like a great deal of fun. In fact, it is very similar to an idea I had recently, and was considering mounting on my short-scale bass guitar. I eventually decided to put that bass up for sale and watch an episode of Star Trek Voyager instead. This was no great loss to the music world. This device is clearly far more advanced than anything I was dreaming of. But I wish they had gone in a fundamentally different direction.

There is a tendency, when musical instruments and tools arise from technology, for the inventive mind to find ways to replace musicians rather than empower them. Here we see a guitarist performing the role of guitarist, drummer, keyboard player and programmer. It’s very impressive, and will no doubt facilitate the production of fresh music. But would it not be more exciting if the interface were just controlling the sound of the guitar, allowing varying amounts of delay, distortion, portamento and so on? Yes, these effects can be produced with pedals, but I suspect with less freedom and control.

A great example is Futureman, seen here playing the crazy home-made percussion instrument he started work on in 1988. It’s not a world away from the ACPAD, but I feel there is a stark contrast in the philosophical approach. Roy Wooten is a musician first, then an inventor, and he explains something of that process here.

For me, the most exciting thing about music is the interplay between musicians. Let’s not use technology to kill the band!
