== Overview ==
The MAMA system situates agents in a virtual space, where they play music in response to the actions of the agents around them. Agents "listen" more closely to agents who are near to them: the space simulates the effect of physical acoustic placement. To a listener, the output of agents nearer the bottom/front of the space is louder than that of those at the back, and an agent's left-right position controls its sound's position in the stereo field. A visual interface displays the positions of the agents, along with indicators for their dynamics, the number of notes they are playing, and their position in the score.
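The position-to-sound mapping described above can be sketched as a small function. This is an illustrative assumption, not the system's actual code: it assumes a unit-square space with x running left to right and y running from the front (0) to the back (1).

```python
def spatialise(x, y):
    """Map an agent's position to (gain, pan) for stereo output.

    gain: 1.0 at the front, fading towards the back (never fully silent).
    pan:  -1.0 hard left, 0.0 centre, +1.0 hard right.

    The 0.8 fade factor and the unit-square coordinates are assumptions
    made for illustration.
    """
    gain = 1.0 - 0.8 * y      # agents at the back of the space are quieter
    pan = 2.0 * x - 1.0       # left-right position maps to the stereo field
    return gain, pan
```

For example, an agent at the front centre of the space (`spatialise(0.5, 0.0)`) plays at full gain, dead centre.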
The AgentBox was created to allow a human "Conductor" to manipulate the agents in a fast and intuitive manner. By creating a tangible interface, human participants can shape the music being played without any need for musical experience.
== Box Mechanics ==
The box is internally lit and topped with translucent acrylic, with a webcam mounted inside. Specially designed counters are placed on the top and tracked by an EyesWeb patch. Briefly, this patch:
* looks for white regions (the white circles on each counter)
* extracts the region inside this (the coloured circle) and averages its colour
* sends the position and colour of each "blob" over the network using [http://www.cnmat.berkeley.edu/OpenSoundControl OSC] (Open Sound Control)
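The OSC messages in the last step can be encoded with no dependencies, since the wire format is standard. The `/blob` address and the argument layout (id, x, y, r, g, b) are assumptions about what the EyesWeb patch sends; only the encoding itself is fixed by the OSC specification.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, *args):
    """Encode a minimal OSC message supporting int and float arguments.

    Arguments are laid out as a ','-prefixed type tag string followed by
    big-endian 32-bit values, per the OSC 1.0 specification.
    """
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        else:
            tags += "f"
            payload += struct.pack(">f", a)
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# one tracked counter: id, position (x, y), averaged colour (r, g, b)
msg = osc_message("/blob", 1, 0.25, 0.75, 200.0, 30.0, 30.0)
```

A message built this way could be sent to the Java helper with an ordinary UDP socket (`socket.sendto`).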
This information is received by a Java program, which learns the colours of the counters being used; agents are identified by their colour, so the system gives a visual indication of any counters whose colours are too similar. The program then passes the colours on to the main agent system, which creates an agent of the appropriate colour for each counter. While the agents are running, the Java helper injects position updates into the system, along with any high-level gestures it extracts.
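The "too similar" check could be as simple as a distance threshold in RGB space. This is a sketch of the idea only: the real helper's learning step and similarity metric are not documented here, and the threshold value is an illustrative assumption.

```python
def too_similar(c1, c2, threshold=60.0):
    """Flag two counter colours the tracker could confuse.

    Colours are (r, g, b) tuples; the Euclidean-distance threshold of 60
    is an assumed value, chosen only for illustration.
    """
    dist = sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5
    return dist < threshold

def check_counters(colours):
    """Return index pairs of counters whose colours are too close."""
    clashes = []
    for i in range(len(colours)):
        for j in range(i + 1, len(colours)):
            if too_similar(colours[i], colours[j]):
                clashes.append((i, j))
    return clashes
```

Running `check_counters` on a red, a near-red, and a blue counter flags only the first pair, which is what the visual warning would report.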
== Performing In C ==
The first demo of the system is a piece called [http://en.wikipedia.org/wiki/In_C In C], by [http://en.wikipedia.org/wiki/Terry_Riley Terry Riley]. This piece is [http://www.otherminds.org/shtml/Scores.shtml scored] for "at least 30 people" as follows:
* every player receives the same score, containing 53 short fragments of music
* everyone starts playing the first fragment, repeating it until they decide to move on to the next
* there are a few rules: everyone should stay within a few fragments of everyone else; players may take breaks and offset their patterns as they see fit; everyone should reach unison at least once during the piece if possible
* when everyone reaches the final fragment, they play round it as players drop out one by one
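The rules above can be captured in a toy simulation. This is a sketch under stated assumptions: players decide to move on at random, and "within a few fragments" is taken to mean a spread of at most 3, which is a performance judgement rather than anything the score specifies.

```python
import random

FRAGMENTS = 53   # In C consists of 53 short fragments
MAX_SPREAD = 3   # assumed tolerance for "within a few fragments"

def step(positions, rng=random):
    """Advance each player by at most one fragment per step.

    A player who decides to move on may only do so if that keeps them
    within MAX_SPREAD fragments of the slowest player.
    """
    back = min(positions)
    new = []
    for pos in positions:
        wants_to_move = rng.random() < 0.2   # occasional decision to move on
        if wants_to_move and pos < FRAGMENTS and pos + 1 - back <= MAX_SPREAD:
            new.append(pos + 1)
        else:
            new.append(pos)
    return new
```

Iterating `step` from everyone at fragment 1 keeps the ensemble drifting forward through the score while never spreading further apart than the assumed tolerance.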
The AgentBox acts as an interface to the agent system playing this, with a few modifications:
* the human conductor controls the dynamics of each agent by moving it forwards or backwards in the space
* the conductor asks agents to take breaks by moving them off the board
* the conductor can ask an agent to move on to the next section of the score by wiggling its counter
* the agents feel a "pull" to move through the score, based on the position of the agents around them (this is the lower of the two curved bars in the top diagram).
* if an agent has been on a break for a while and is then repositioned on the board, it will start playing the section which those around it are playing
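The "pull" through the score and the rejoin-after-break behaviour both depend on what nearby agents are playing. One plausible sketch, assuming inverse-distance weighting (the system's actual formula is not given here):

```python
def score_pull(agent_pos, agent_section, others):
    """Illustrative sketch of the 'pull' an agent feels through the score.

    `others` is a list of ((x, y), section) pairs for the surrounding
    agents. Nearer agents count for more; the 1/(d + 0.1) weighting is
    an assumption made for illustration.
    """
    weight_sum = 0.0
    weighted_sections = 0.0
    for (x, y), section in others:
        d = ((x - agent_pos[0]) ** 2 + (y - agent_pos[1]) ** 2) ** 0.5
        w = 1.0 / (d + 0.1)               # nearer agents weigh more
        weight_sum += w
        weighted_sections += w * section
    neighbourhood = weighted_sections / weight_sum
    return neighbourhood - agent_section  # positive: pulled forward in the score
```

Under this sketch, an agent returning from a break could simply start at the rounded neighbourhood section, matching the rejoin behaviour described above.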
This allows the human conductor to manipulate the timbre, dynamics and density of the resulting musical output. Agents can be brought forward to take solos; agents on a break can be brought back in to play counterpoint to a soloing agent; the instrumentation can be stripped down to a single player and then built back up to full strength.
Many thanks to the Centre for Intelligent Systems and Applications at the University of Edinburgh for generously funding this work.