Posted on 2022-12-14, 15:49, authored by Ian R. O'Keeffe
This thesis presents an interactive system for investigating the emotive content of music. The system takes an empirical, computational approach that involves the subject directly in the process of creating emotional states through music. It consists of a series of sets of low-level musical transformations controlled by a user interface. These transformations include functions for altering key, scale, timbre, attack, articulation, tempo, and rhythm, and they are labeled in plain, general terms so that non-musicians can use them. Given an initial melody and a target emotion, the user applies the available functions to transform the melody until he or she is satisfied that the final result represents the requested emotion. The system tracks the sequence of processes used to arrive at the target emotion, and can store or reuse these "macros" either to recreate the emotive conversion or to study the captured data and isolate general features pertaining to the requested emotion.
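To make this mechanism concrete, the following minimal Python sketch (illustrative only, and not the implementation described in the thesis) shows one way such labeled transformation functions and a recorded "macro" of applied steps could be represented; the names Melody, Macro, raise_pitch, and speed_up are assumptions introduced here for the example.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical melody representation: MIDI pitch numbers, note lengths in
# beats, and a global tempo in BPM. Names are illustrative only.
@dataclass
class Melody:
    pitches: List[int]
    durations: List[float]
    tempo: float = 120.0

# Two example low-level transformations of the kind the abstract describes
# (key and tempo alteration), exposed under plain-language labels.
def raise_pitch(melody: Melody, semitones: int = 2) -> Melody:
    return Melody([p + semitones for p in melody.pitches],
                  melody.durations, melody.tempo)

def speed_up(melody: Melody, factor: float = 1.2) -> Melody:
    return Melody(melody.pitches, melody.durations, melody.tempo * factor)

# A "macro" is simply the ordered list of transformations the user applied,
# so the emotive conversion can be replayed or analysed later.
@dataclass
class Macro:
    steps: List[Callable[[Melody], Melody]] = field(default_factory=list)

    def apply(self, step: Callable[[Melody], Melody], melody: Melody) -> Melody:
        self.steps.append(step)      # record the step for later study
        return step(melody)

    def replay(self, melody: Melody) -> Melody:
        for step in self.steps:      # recreate the stored conversion
            melody = step(melody)
        return melody

# Usage: transform an initial melody toward a target emotion such as "excited".
melody = Melody(pitches=[60, 62, 64, 65], durations=[1, 1, 1, 1])
macro = Macro()
melody = macro.apply(raise_pitch, melody)
melody = macro.apply(speed_up, melody)
print(melody.tempo, melody.pitches)  # 144.0 [62, 64, 66, 67]
```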
This system emerged from a commercial research project that investigated automated soundtrack composition driven by an input video stream. My area of research involved the creation of the composition engine. The project did not include any element of emotional control, an area I regarded as essential in shaping the musical experience; it was my wish to incorporate such control that led me to enhance the existing system, expanding the scope and flexibility of the available musical manipulation functions and adding a suitable front end for user interaction.