AutomaTones – Anders Øland





Source Code & More

Download the whole project
Source code repository:


Anders Høst Kjærgaard (GUI & Design), ahkj AT itu DOT dk

Anders Øland (Cellular Automaton), anoe AT itu DOT dk

Anders Bech Mellson (MIDI & Translators), anbh AT itu DOT dk


AutomaTones maps the output pattern of up to six simultaneous one- or two-dimensional cellular automata to musical notation (MIDI), and facilitates playback and user control through MIDI hardware and instruments.


Several existing applications experiment with cellular automata in music creation or production; however, to our knowledge most of these offer very little flexibility and yield somewhat uninteresting results. AutomaTones is our attempt at creating an application that invites more creative use and produces more musically interesting results.

To achieve this we have focused on flexibility and expandability in our design, architecture, and implementation. In the current version, up to six automata may run simultaneously while outputting to individual MIDI channels. The states of the automata can be mapped to MIDI in different ways, and the receiving MIDI application can use the output in any way it sees fit, e.g. pitch control, instrument control, effects control, or sample playback. Furthermore, the user can control and view the automata on a connected Novation Launchpad, and can set configurations, add patterns, and modify cells on the fly. The user can also select different grid sizes, dimensions, and rules.
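To illustrate one possible state-to-MIDI mapping of the kind described above, the sketch below maps each live cell in a row to a pitch in a scale and uses the cell's state value (for non-binary automata) as note velocity. All names and the scale choice are illustrative assumptions, not taken from the AutomaTones source.

```python
# Hypothetical state-to-MIDI mapping sketch: cell column selects pitch,
# cell state selects velocity. Not the project's actual mapping.

C_MINOR_PENTATONIC = [0, 3, 5, 7, 10]  # semitone offsets within one octave

def row_to_notes(row, base_note=48, max_state=3):
    """Map a list of cell states to (pitch, velocity) MIDI note pairs."""
    notes = []
    for column, state in enumerate(row):
        if state == 0:
            continue  # dead cells are silent
        octave, degree = divmod(column, len(C_MINOR_PENTATONIC))
        pitch = base_note + 12 * octave + C_MINOR_PENTATONIC[degree]
        velocity = int(127 * state / max_state)  # higher states play louder
        notes.append((pitch, velocity))
    return notes
```

A downstream MIDI application could interpret these pairs however it likes, which is the point of keeping the mapping separate from the automaton itself.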

We have implemented an extended version of John Conway's famous Game of Life that works with non-binary cell states, yielding more options when mapping to MIDI and interacting with the Launchpad. One-dimensional elementary automata can be set up using any rule representable by a Wolfram code, even non-binary N-dimensional ones. The user can even define custom rules by specifying patterns of neighboring cells that determine the new state of the subject cell.
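The Wolfram-code convention mentioned above can be sketched compactly: the rule number's base-k digits enumerate the successor state for each possible neighborhood. This is a generic illustration of the encoding, with wrap-around boundaries assumed, not code from the project.

```python
def wolfram_step(cells, rule, states=2):
    """One step of a 1-D cellular automaton under a Wolfram-coded rule.

    The rule number's base-`states` digits give the successor state for
    each of the states**3 neighborhoods, indexed by the base-`states`
    value of (left, center, right). states=2 yields the classic 256
    elementary rules; higher values give non-binary variants.
    Toroidal (wrap-around) boundaries are assumed here.
    """
    n = len(cells)
    table = [(rule // states**i) % states for i in range(states**3)]
    return [
        table[cells[(i - 1) % n] * states**2
              + cells[i] * states
              + cells[(i + 1) % n]]
        for i in range(n)
    ]
```

For example, iterating `wolfram_step` with `rule=110` reproduces the well-known Rule 110 patterns.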

The architecture and implementation allow for future expansion and development. Experimentation with endless combinations of rules, MIDI mappings, dimensions, state spaces, etc. is easily attainable in the current version (though not entirely through the GUI as it stands). For instance, each cell may use individual rules and state spaces, and the rules use delegates that could be made multicast. Tuning any of these variables may result in radically different musical output.
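The per-cell-rule idea above can be sketched by storing one callable per cell, the Python analogue of a C# delegate (a list of callables would stand in for a multicast delegate). Names here are illustrative, not from the project source.

```python
# Sketch: each grid position carries its own rule function, so different
# regions of the grid can evolve under different dynamics.

def grid_step(grid, rules):
    """Advance a 2-D grid where rules[r][c] computes cell (r, c)'s next state."""
    rows, cols = len(grid), len(grid[0])

    def neighbors(r, c):
        # Moore neighborhood with toroidal wrap-around.
        return [grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)]

    return [[rules[r][c](grid[r][c], neighbors(r, c))
             for c in range(cols)] for r in range(rows)]

def life_rule(state, neighborhood):
    """Classic Game of Life as one such per-cell rule."""
    live = sum(1 for s in neighborhood if s)
    return 1 if live == 3 or (state and live == 2) else 0
```

Swapping `life_rule` for a different callable in part of the `rules` grid changes the dynamics of just that region, which is the kind of experimentation the paragraph above describes.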

The application uses a Model-View-Controller architecture where model and view never communicate directly – they can only interact through a controller. MIDI input from hardware is handled by a Sense-Compute-Control pattern attached to the controller.
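A minimal sketch of the mediation just described: the model and view know nothing of each other, and a controller wires them together and receives hardware input (the "control" step of a sense-compute-control loop). All class and method names are assumptions for illustration, not the project's actual API.

```python
# Hypothetical MVC wiring: model and view never reference each other.

class Model:
    def __init__(self):
        self.state = 0
        self.listeners = []          # callbacks registered by a controller

    def set_state(self, value):
        self.state = value
        for listener in self.listeners:
            listener(value)          # notify without knowing who listens

class View:
    def __init__(self):
        self.shown = None

    def render(self, value):
        self.shown = value           # e.g. light up Launchpad pads

class Controller:
    def __init__(self, model, view):
        self.model = model
        model.listeners.append(view.render)  # the only model-to-view link

    def on_midi_input(self, value):
        # "Control" step: hardware input mutates the model, which in
        # turn refreshes the view through the controller's wiring.
        self.model.set_state(value)
```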

Future work

*** Implement a learning algorithm, e.g. a genetic algorithm, to explore the complexity needed in the CA rule set to yield more musical results.

—— Needs a fitness function to measure the musicality of the output, e.g. a neural network trained on data from an online MIDI library.

*** Port to iPhone / iPad with sync between multiple units.

*** Add triggers to specific cells, allowing e.g. oscillating patterns to output suitable hi-hat / drum rhythms.

*** User-defined MIDI translators

AutomaTones GUI mockup

Sound examples:

Early video examples:
