GePS - Gesture-based Performance System


Software documentation - PureData

The gestures / movements are captured by the MPU6050 3-axis accelerometer and 3-axis gyroscope. The sensor data transmitted to the computer is interpreted and transformed into sound by software (the performance patch) written in Pure Data. The structure, functionality and programming guidelines for this performance patch are described in the following.



  • Raw data is read from the serial port the XBee dongle is connected to.

    (main > serial-status > pd serialport)

  • The sensor data is parsed from the data sent by the GePS unit and split into six data streams, one for each axis.

    Data streams are written to tables for visualisation.

    (main > serial-status > pd parse)

  • Data interpretation: sensor data is scaled and processed to obtain specific information such as derivatives and deltas, which in turn are interpreted to produce triggers and control signals for the DSP. This data is sent out to the instruments and the settings patcher.

    (main > serial-status > pd parse)

  • The performance can be structured by switching modes with gestures. This, as well as the initial settings, is handled in settings.

    (main > controls > pd settings)

  • Instruments are built to react to the interpreted sensor data. What you create or implement is up to you; it can range from simple sample playback to filtering, granulation, complex DSP like phase vocoders and physical modelling, or external instruments. Audio outputs are sent to the mixer.

    Two instruments are included in the GePS PureData package:

    • GePS Transients
    • GePS Freezer
  • The mixer collects the audio-signals from all instruments. There are options to process the summed signal and to record the performance.


Global patcher (GePS-MAIN.pd)

Graphical overview / interface of the main performance controls and modules. Everything can be accessed from here by opening the subpatchers (right-click open) or via the buttons in the "CONTROLS" section.


This element displays the status of the serial port or connection. It has a control to open, close or reset the serial connection, and it displays the average latency of the sensor data (7.3ms is a common value).


  • DSP: Turn the audio processing (Digital Signal Processing) on / off.
  • Perform: Start / stop performing. A performance outline can be predefined in settings; it is initialized and started, and DSP is turned on as well.
  • Plot sensor data: Check whether the sensor data is parsed correctly, visualized as graphs for each sensor / axis. Turn this off if you don't need it; it costs a lot of CPU cycles.
  • Show settings: View and edit global settings and the performance outline.
  • Mixer: open the audio mixer, only necessary if not defined in the performance outline in settings.


Instruments are the main building blocks to generate sound. The combination and control is handled by settings. Your performance can be extended by programming new instruments and combining them.

If you want to combine instruments and activate them by specific events, you need to implement a performance outline reacting to sensor data.

You can also turn them on/off manually, but keep in mind that the mixer is controlled by settings, which could mean that the instruments' outputs are muted.

Modules and Abstractions

Abstractions are files with code that is used in more than one place. Bear in mind that modifying an abstraction can change or completely break other modules that depend on it. If you want to modify an abstraction, copy its contents to a subpatcher or to a new patcher!

Filename                  Usage

Instrument modules
geps.freezer.pd           Instrument module
geps.transients.pd        Instrument module
geps.mod-template.pd      Instrument module template

DSP abstractions
geps.em-gyro.pd           Emphasize by movement (gyro)

General abstractions
geps.abstraction-header   Note to include into all abstractions. Used to scale sensor data


Sensor parsing / data interpretation

Every instrument needs its own sensor data curves. Therefore the sensor data is treated before it is sent to the instruments. In order to reduce calculation time, most of the treatment is done globally.

When using the gyroscope data, in some cases it does not matter in which direction (or on which axis) the hand moves. In these cases values below 0 are multiplied by -1 (absolute value), or the whole signal is squared. Squaring the control signal makes the gestures more dynamic – values close to 1 are affected less than values close to 0. This also reduces the noise floor of the signal, which is useful when the data is interpreted directly as the amplitude of an audio signal.
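The two treatments can be sketched in Python (illustrative only; the actual processing happens in the Pure Data patch), assuming gyro readings normalized to the range -1..1:

```python
def rectify(x):
    """Direction-insensitive control value: absolute value of a gyro reading."""
    return abs(x)

def emphasize(x):
    """Square a normalized gyro reading (range -1..1).
    Small values are compressed much more than large ones,
    which pushes the noise floor of the signal down."""
    return x * x

# Small (noisy) readings shrink far more than large (gesture) readings:
emphasize(0.0625)  # -> 0.00390625  (noise almost vanishes)
emphasize(0.5)     # -> 0.25        (strong movement reduced much less)
```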

An adaptive low pass filter (which treats the data differently depending on whether the values increase or decrease) is used for temporal control of the sensor data.

Triggering sounds is mostly done with the gyroscope data, because it is directly linked to the hand movement. The constant values of the accelerometer are used for slower and finer changes in sound. They can also be used for detecting the current hand position which can be applied to switching between different modes of control or parts of a performance.
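As an illustration of gesture-based triggering, here is a minimal Python sketch of a rising-edge threshold detector. The threshold value and the re-arming behaviour are assumptions for the example, not the patch's actual t.flick logic:

```python
def make_trigger(threshold=0.3):
    """Fires once when the control signal crosses the threshold upward,
    then re-arms only after the signal falls below the threshold again,
    so a sustained gesture produces a single trigger."""
    armed = True
    def step(x):
        nonlocal armed
        if armed and x >= threshold:
            armed = False
            return True
        if x < threshold:
            armed = True
        return False
    return step

trig = make_trigger(0.3)
stream = [0.0, 0.1, 0.5, 0.6, 0.2, 0.7]
[trig(x) for x in stream]  # -> [False, False, True, False, False, True]
```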

The sensor data is interpreted and turned into 5 different control signals. In case an instrument has more specific methods of interpretation, the scaled sensor data is also sent out directly.

Sensor axis data

accel_x,y,z Accelerometer data in 3 axes. Temporal control and squaring are case-specific.
gyro_x,y,z Gyroscope data in 3 axes. Temporal control and squaring are case-specific.

Interpreted data

gyrodelta_x,y,z The derivative of the gyroscope data in 3 axes. It shows peaks at the beginning and end of gestures, but not during the movement. The peaks coincide with the "visual peaks" of abrupt movements and are therefore useful for triggering sounds with a clearly defined attack.
gyro_all The absolute values of all gyroscope axes are added. This signal represents the amount of movement of the hand / arm as a whole and is not direction sensitive.
gyrodelta_all The absolute values of the derivatives of all gyroscope axes are added. This signal shows peaks at the beginning and end of movements of the hand / arm as a whole and is not direction sensitive.
t.flick Transient detection based on gyrodelta_all.
slide Variable lowpass filter; can be used to remove or smooth fast movement. y(n) = y(n-1) + ((x(n) - y(n-1)) / slide), with separate parameters for increasing and decreasing input. This treatment is available as an abstraction (slide.cs) as it takes individual parameters.
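The slide formula can be sketched in Python as follows. The parameter values are arbitrary examples; the real treatment lives in the slide abstraction of the patch:

```python
def make_slide(slide_up=5.0, slide_down=50.0):
    """Variable one-pole lowpass per the formula above:
        y(n) = y(n-1) + (x(n) - y(n-1)) / slide
    with a separate slide factor for rising and falling input,
    mirroring the [slide up] / [slide down] parameters."""
    y = 0.0
    def step(x):
        nonlocal y
        slide = slide_up if x > y else slide_down
        y = y + (x - y) / slide
        return y
    return step

smooth = make_slide(slide_up=2.0, slide_down=10.0)
smooth(1.0)  # -> 0.5  : rises quickly (slide_up = 2)
smooth(1.0)  # -> 0.75
smooth(0.0)  # falls back much more slowly (slide_down = 10)
```

A larger slide factor means slower response; using a small slide_up with a large slide_down gives a fast attack and a slow release on the control signal.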


Instrument Prototype

We prepared an instrument prototype (GePS_TEMPLATE) to get you started with implementing a new instrument. It is structured as follows:

  • Input to turn the instrument on/off, which includes starting / stopping any data interpretation and local DSP.
  • Example for receiving sensor data / trigger
  • Example for local data processing (slide)
  • Example for local DSP (sample playback and processing controlled by movement).
  • Output sent to mixer.

DSP modules

  • geps.sampleplayer
  • geps.em-gyro [slide up] [slide down]



During the development of the device the main focus was always on transmission speed. For a complete fusion of movement and sound the reaction of the sensor had to be as fast as possible. (Commercial products like the Wii-remote do not have that speed.)

The sensor data is sent byte by byte to the computer. One reading of all sensor values consists of 14 bytes: 6 axes with 2 bytes each (16bit) and 2 bytes as identifiers for the data parsing.
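A hypothetical Python parser for such a frame (the identifier bytes and the byte order are assumptions for the example; check the GePS firmware for the actual values):

```python
import struct

# Placeholder identifier bytes; the real values are defined by the firmware.
HEADER = b"\x7f\x7f"

def parse_frame(frame):
    """Parse one 14-byte sensor frame following the layout in the text:
    2 identifier bytes + 6 axes of 2 bytes (16 bit) each."""
    if len(frame) != 14 or frame[:2] != HEADER:
        raise ValueError("not a valid sensor frame")
    # '>6h': six signed 16-bit integers, big-endian (assumed byte order)
    ax, ay, az, gx, gy, gz = struct.unpack(">6h", frame[2:])
    return {"accel": (ax, ay, az), "gyro": (gx, gy, gz)}

# Round-trip example with synthetic values:
frame = HEADER + struct.pack(">6h", 100, -200, 300, -10, 20, -30)
parse_frame(frame)  # -> {'accel': (100, -200, 300), 'gyro': (-10, 20, -30)}
```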



When starting a performance, the configuration defined in Settings is sent out to the instruments and the mixer. You can implement a performance outline by triggering different instrument states based on interpreted sensor data.


mixer.ch0 time 10, gain 0 Set the gain of channel 0 in the mixer to 0 (close it) within 10 milliseconds.
mixer.summing type 1, factor 2 Amplify the sum by a factor of 2 and choose soft saturation as the summing type of the mixer (0 for hard clipping, 1 for soft saturation).


Further reading