This paper presents a new M4L module for Ableton Live that enables basic gesture tracking
and mapping, and describes its features. The module is based on the gesture-follower patch
from IRCAM's FTM library and makes use of the native capabilities of the Live API. The mapping
between gestures and audible output is defined within the module and enables simultaneous
control of macro-level features of the Ableton Live interface, such as clip selection and
per-track volume and panning.
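
As an illustration of the kind of Live API access the abstract refers to, the following is a minimal sketch written for Max's js object (the scripting context available inside an M4L device). It is not the paper's actual patch: the function name, track/clip indices, and parameter values are placeholders, and only the standard LiveAPI calls (fire, set) are assumed.

    // Illustrative sketch: driving macro-level Live controls through the
    // Live API after a gesture has been recognised. Indices and values
    // below are placeholders, not the module's actual mapping.
    function apply_gesture(trackIndex, clipIndex, volume, pan) {
        // Fire a clip in the given track's session slot.
        var slot = new LiveAPI("live_set tracks " + trackIndex
                               + " clip_slots " + clipIndex);
        slot.call("fire");

        // Set the track's volume (0.0 to 1.0).
        var vol = new LiveAPI("live_set tracks " + trackIndex
                              + " mixer_device volume");
        vol.set("value", volume);

        // Set the track's panning (-1.0 to 1.0).
        var panning = new LiveAPI("live_set tracks " + trackIndex
                                  + " mixer_device panning");
        panning.set("value", pan);
    }

In practice, the output of the gesture recognition stage would supply the arguments to such a function, so that a recognised gesture simultaneously triggers a clip and adjusts the track's mixer parameters.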
Publication
AISB 2014 Convention: The Society for the Study of Artificial Intelligence and Simulation of Behaviour