A visualiser for Sonic PI (resizable version)

I received a request to see if the visualiser I recently published could a) be used on a split screen with Sonic Pi, each taking up half the screen, and b) be used on a second monitor attached to the computer. I looked at the code and produced this version, which uses a resizable window that can also be dragged onto a second monitor if one is attached to the main computer. The main change is in the initial screen setup, which is now all done in the setup function rather than in the settings function, which becomes redundant. Also, all references to displayHeight and displayWidth are replaced by height and width respectively.
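
The relevant difference can be seen by comparing the top of the two sketches (both full listings appear below):

// Original full-screen version
void settings() {
  fullScreen(P3D);
}

// New resizable version: no settings() function, everything is done in setup()
void setup() {
  size(400, 400, P2D);
  frame.setResizable(true);
  // ...rest of setup() as in the full listing below
}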

In testing the new version, I found that the star routine could cause it to crash. This was down to the way I was generating the parameters, and I have altered the star function calls so that the second radius parameter no longer uses a division when computing its value. The two calls are shown below:

First version:

star(random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, circles[i]*stardata[0], circles[i]/stardata[1], int(stardata[2]+random(stardata[3])));

New resizable version:

star(random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, circles[i]*stardata[0]*0.25, circles[i]*stardata[1]*0.25, int(stardata[2]+random(stardata[3])));

The new processing code is shown below:

//Visualiser for use with Sonic Pi 3 written by Robin Newman, September 2017
// based on an original sketch https://github.com/andrele/Starburst-Music-Viz
//THIS VERSION FOR RESIZABLE SCREEN: CAN BE DRAGGED TO SECOND MONITOR
//Changes: changed to resizable screen, updated for Processing 3, added colour, added rectangle and star shapes
//added OSC input (from Sonic Pi) to alter parameters as it runs, removed slider inputs.
//input OSC:   /viz/float       updates STROKE_MAX, STROKE_MIN and audioThresh
//             /viz/pos         updates XY random offset values (can be zero)
//             /viz/col         updates RGB values
//             /viz/shapes      sets shapes to be used from S, E and R (or combinations thereof)
//             /viz/stardata    sets data for star shapes
//             /viz/rotval      turns rotation of shapes on/off
//             /viz/shift       turns XY shift across screen on/off
//             /viz/stop        initiates sending stop all signal back to Sonic Pi port 4557

import ddf.minim.analysis.FFT;
import ddf.minim.*;
import oscP5.*; //to support OSC server
import netP5.*;

Minim minim;
AudioInput input;
FFT fftLog;

int recvPort = 5000; //can change to whatever is convenient. Match with use_osc command in Sonic Pi
OscP5 oscP5;
NetAddress myRemoteLocation; //used to send stop command back to Sonic Pi

// Setup params
color bgColor = color(0, 0, 0);

// Modifiable parameters
float STROKE_MAX = 10;
float STROKE_MIN = 2;
float audioThresh = .9;
float[] circles = new float[29];
float DECAY_RATE = 2;
//variables for OSC input
float [] fvalues = new float[5]; //STROKE_MAX, STROKE_MIN,audioThresh values
int [] cols = new int[3]; //r,g,b colours
int [] positions = new int[2];// random offset scales for X,Y
int [] stardata = new int[4];// data for star shape, number of points, random variation
int shiftflag = 0; //flag to control xy drift across the screen set by OSC message
int two = 0; //variable to force concentric shapes when more than one is displayed
String shapes = "E"; //shapes to be displayed, including multiples from S,E,R
int rotval =0;
int xoffset = 0,yoffset = 0;
int xdirflag = 1,ydirflag = 1;

void setup() {
  size(400, 400,P2D);
  frame.setResizable(true);
  //noLoop();
  frameRate(60);
  myRemoteLocation = new NetAddress("127.0.0.1",4557); //address to send commands to Sonic Pi
  minim = new Minim(this);
  input = minim.getLineIn(Minim.MONO, 2048); //nb static field MONO referenced from class not instance hence Minim not minim

  fftLog = new FFT( input.bufferSize(), input.sampleRate()); //setup logarithmic fast fourier transform
  fftLog.logAverages( 22, 3); // see http://code.compartmental.net/minim/fft_method_logaverages.html

  noFill();
  ellipseMode(RADIUS); //first two coords centre,3&4 width/2 and height/2
  fvalues[0]=1.0;
  fvalues[1]=0.0;
  fvalues[2]=0.32;
  cols[0] = 255;
  cols[1]=0;
  cols[2]=150;
  positions[0] = 50;
  positions[1]=40;
  stardata[0]=2;
  stardata[1]=4;
  stardata[2]=3;
  stardata[3]=5;
  /* start oscP5, listening for incoming messages at recvPort */
  oscP5 = new OscP5(this, recvPort);
  background(0);
}

void draw() {
  background(0);
  pushMatrix();
  //calculate changing xy offsets: shiftflag set to 0 to switch this off
  xoffset += 10*xdirflag*shiftflag;
  yoffset += 10*ydirflag*shiftflag;
  if(shiftflag==0){xoffset=0;yoffset=0;} //reset offset values to zero if shifting is off
  //reverse directions of shifting when limits reached
  if (xoffset > width/3){xdirflag=-1;}
  if (xoffset < -width/3){xdirflag=1;}
  if (yoffset > height/3){ydirflag=-1;}
  if (yoffset < -height/3){ydirflag=1;}
  //transform to new shift settings
  translate(width/2+xoffset, height/2+yoffset); //half of screen width and height (ie centre) plus shift values

  //optional rotate set by OSC call
  rotate(float(rotval)*(2*PI)/360);
  
  //get limits for stroke values and audioThresh from OSC data received
  STROKE_MIN=fvalues[0];
  STROKE_MAX=fvalues[1];
  audioThresh=fvalues[2]; 
  //println("fvalues: ",STROKE_MIN,STROKE_MAX,audioThresh); //for debugging

  // Push new audio samples to the FFT
  fftLog.forward(input.left);

  // Loop through frequencies and compute width for current shape stroke widths, and amplitude for size
  for (int i = 0; i < 29; i++) {

    // What is the average height in relation to the screen height?
    float amplitude = fftLog.getAvg(i);

    // If we hit a threshold, then set the "circle" radius to new value (originally circles, but applies to other shapes used)
    if (amplitude < audioThresh) {
      circles[i] = amplitude*(height/2);
    } else {
      // Otherwise, decay slowly
      circles[i] = max(0, min(height, circles[i]-DECAY_RATE));
    }
    pushStyle();
    // Set colour and opacity for this shape (opacity depends on amplitude)
    if (1 > random(2)) {
      stroke(cols[0], cols[1], cols[2], amplitude*255);
    } else {
      stroke(cols[1], cols[2], cols[0], amplitude*255);
    }
    strokeWeight(map(amplitude, 0, 1, STROKE_MIN, STROKE_MAX)); //weight stroke according to amplitude value

    if (shapes.length()>1) { //if more than one shape being drawn, set two to 0 to draw them concentrically
      two = 0;
    } else {
      two = 1;
    }
    // draw current shapes
    if (shapes.contains("e")) {
      // Draw an ellipse for this frequency
      ellipse(random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, 1.4*circles[i], circles[i]);
    }
    if (shapes.contains("r")) {
      rectMode(RADIUS); 
      rect( random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, 1.4*circles[i], circles[i]);
    }
    if (shapes.contains("s")) {
      strokeWeight(3); //use fixed stroke weight when drawing stars
      //star data Xcentre,Ycentre,radius1,radius2,number of points
      star(random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, circles[i]*stardata[0]*0.25, circles[i]*stardata[1]*0.25, int(stardata[2]+random(stardata[3])));
    }
    popStyle();

    //System.out.println( i+" "+circles[i]); //for debugging
  } //end of for loop
  popMatrix();
}

void oscEvent(OscMessage msg) { //function to receive and parse OSC messages
  System.out.println("### got a message " + msg);
  System.out.println( msg);
  System.out.println( msg.typetag().length());

  if (msg.checkAddrPattern("/viz/float")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      fvalues[i] = msg.get(i).floatValue();
      System.out.print("float number " + i + ": " + msg.get(i).floatValue() + "\n");
    }
  }

  if (msg.checkAddrPattern("/viz/pos")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      positions[i] = msg.get(i).intValue();
      System.out.print("pos number " + i + ": " + msg.get(i).intValue() + "\n");
    }
  }

  if (msg.checkAddrPattern("/viz/col")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      cols[i] = msg.get(i).intValue();
      System.out.print("col number " + i + ": " + msg.get(i).intValue() + "\n");
    }
  }
  if (msg.checkAddrPattern("/viz/shapes")==true) {
    shapes=msg.get(0).stringValue();
    //for(int i =0; i<msg.typetag().length(); i++) {
    // shapes += msg.get(i).stringValue().toLowercase();      
    //}
    System.out.print("shapes code "+ shapes + "\n");
  }
  if (msg.checkAddrPattern("/viz/stardata")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      stardata[i] = msg.get(i).intValue();
      System.out.print("stardata number " + i + ": " + msg.get(i).intValue() + "\n");
    }
  }
  if (msg.checkAddrPattern("/viz/rotval")==true) {
    rotval =msg.get(0).intValue();
    System.out.print("rotval code "+ rotval + "\n");
  }
  if (msg.checkAddrPattern("/viz/shift")==true) {
    shiftflag =msg.get(0).intValue();
    System.out.print("shiftflag code "+ shiftflag + "\n");
  }
  if (msg.checkAddrPattern("/viz/stop")==true) {
    kill(); //stop Sonic Pi from running
  }
}

//function to draw a star (and polygons)
void star(float x, float y, float radius1, float radius2, int npoints) {
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}

void kill(){ //function to send stop message to Sonic Pi on local machine
  OscMessage myMessage = new OscMessage("/stop-all-jobs");
   myMessage.add("RBN_GUID"); //any value here. Need guid to make Sonic PI accept command
  oscP5.send(myMessage, myRemoteLocation); 
}

A modified Sonic Pi driver program is shown below.

#Program to drive Sonic Pi 3 visualiser written in "processing"
#by Robin Newman, September 2017
#see article at https://rbnrpi.wordpress.com
#This program for resizeable version of visualiser
#set up OSC address of processing sketch
use_osc '127.0.0.1',5000
#select shapes to show
osc "/viz/shapes","se"  #"s" "e" "r" Star,Ellipse, Rectangle or combination
sleep 0.1

live_loop :c do
  #choose starting colour for shapes
  osc "/viz/col",rrand_i(0,64),rrand_i(128,255),rrand_i(0,255)
  sleep 0.1
end

live_loop :f do
  #set Stroke max min widths and audioThreshold
  osc "/viz/float",([8.0,5.0,3.0].choose),[1.0,2.0].choose,(0.4+rand(0.3))
  sleep 2
end

#set range of random positional offset (can be 0,0)
#automatically disabled when showing more than one shape
osc "/viz/pos",10,0

#control "bouncing" shapes around the screen 1 for on 0 for off
osc "/viz/shift",0

live_loop :s do
  #setup star data inner/outer circle radius, number of points
  #and random variation of number of points
  osc "/viz/stardata",[1,2,3].choose,[1,2,4].choose,5,1
  sleep 2
end

rv=0 #variable for current rotation
live_loop :r do
  rv+=5*[1,1].choose # choose rotation increment
  rv=rv%360
  osc "/viz/rotval",rv #change rv to 0 to disable rotation
  sleep 0.1
end

#Now setup the sounds to play which will trigger the visualiser
use_bpm 60
set_volume! 5
use_random_seed 999

with_fx :level do |v|
  control v,amp: 0 #control the volume using fx :level
  sleep 0.1
  
  in_thread do #this loop does the volume control
    control v,amp: 1,amp_slide: 10 #fade in
    sleep 140
    control v,amp: 0,amp_slide: 10 #fade out
    sleep 10
    osc "/viz/stop" #send /viz/stop OSC message to sketch
    #sketch sends back a /stop-all-jobs command to port 4557
  end
  
  #  This drum loop is written by Eli see https://groups.google.com/forum/#!topic/sonic-pi/u71MnHnmkVY
  #  used with his permission. I liked it, and it has good percussive output
  #  to drive a visualiser
  live_loop :drums do
    this_sample = [:loop_compus, :loop_tabla, :loop_safari].ring
    start = [ 0.0 , 0.125 , 0.25 , 0.375 , 0.5 , 0.625 , 0.75 , 0.875 ].ring
    sample this_sample.look , beat_stretch: 4, start: start.look, rate: 0.5
    sleep 1
    tick
  end
end

If you are using the original version, I suggest you alter the line that calls the star drawing routine as for the resizable version, and use the new Sonic Pi driver program with it.

You can download the new versions here

The original article shows how to set the system up.


A visualiser for Sonic Pi 3

Ever since Sonic Pi had a transparency mode added (not available on the Raspberry Pi versions) I have been interested in adding a visualiser graphics display as a backdrop. The built in Scope can give attractive displays, but I wanted something with a bit more colour, and which covered the whole screen. I did some experiments with the iTunes visualiser, which made quite a nice backdrop, but only with a significant delay between the audio and the fluctuations of the display. The recent arrival of Sonic Pi 3 allowed for further possibilities, because it enabled the use of OSC messaging. I did a trawl of the internet and came across a promising-looking Processing sketch which produced a pattern that reacted to incoming audio. It was written several years ago and was monochrome only. I played around with this, upgrading it to work on the latest version of Processing, and added some colour. I experimented with adding further basic shapes to the display (it originally used an ellipse primitive, set up to give concentric circles, which could also be driven to produce a starburst effect). I added rectangles and star shapes and experimented with offsetting these as the program ran, and also with using more than one basic shape at the same time. I then added some code so that the sketch could receive incoming OSC messages sent from Sonic Pi 3, which could be used to control various parameters for the shapes, such as stroke width, colour, and offsets on the screen. I added further flexibility such as the ability to rotate the shapes, and to shift the whole display vertically and horizontally across the screen. The final setup works well with Sonic Pi 3, which controls the patterns both with the audio signal it produces and with OSC messages which can be sent to the sketch display code. These can be timed appropriately with the musical content.

The code for the sketch is shown below

//Visualiser for use with Sonic Pi 3 written by Robin Newman, September 2017
// based on an original sketch https://github.com/andrele/Starburst-Music-Viz
//Changes: changed to full screen, updated for Processing 3, added colour, added rectangle and star shapes
//added OSC input (from Sonic Pi) to alter parameters as it runs, removed slider inputs.
//input OSC:   /viz/float       updates STROKE_MAX, STROKE_MIN and audioThresh
//             /viz/pos         updates XY random offset values (can be zero)
//             /viz/col         updates RGB values
//             /viz/shapes      sets shapes to be used from S, E and R (or combinations thereof)
//             /viz/stardata    sets data for star shapes
//             /viz/rotval      turns rotation of shapes on/off
//             /viz/shift       turns XY shift across screen on/off
//             /viz/stop        initiates sending stop all signal back to Sonic Pi port 4557

import ddf.minim.analysis.FFT;
import ddf.minim.*;
import oscP5.*; //to support OSC server
import netP5.*;

Minim minim;
AudioInput input;
FFT fftLog;

int recvPort = 5000; //can change to whatever is convenient. Match with use_osc command in Sonic Pi
OscP5 oscP5;
NetAddress myRemoteLocation; //used to send stop command back to Sonic Pi

// Setup params
color bgColor = color(0, 0, 0);

// Modifiable parameters
float STROKE_MAX = 10;
float STROKE_MIN = 2;
float audioThresh = .9;
float[] circles = new float[29];
float DECAY_RATE = 2;
//variables for OSC input
float [] fvalues = new float[5]; //STROKE_MAX, STROKE_MIN,audioThresh values
int [] cols = new int[3]; //r,g,b colours
int [] positions = new int[2];// random offset scales for X,Y
int [] stardata = new int[4];// data for star shape, number of points, random variation
int shiftflag = 0; //flag to control xy drift across the screen set by OSC message
int two = 0; //variable to force concentric shapes when more than one is displayed
String shapes = "E"; //shapes to be displayed, including multiples from S,E,R
int rotval =0;
int xoffset = 0,yoffset = 0;
int xdirflag = 1,ydirflag = 1;

void settings() {
  fullScreen(P3D);
}

void setup() {  
  frameRate(60);
  myRemoteLocation = new NetAddress("127.0.0.1",4557); //address to send commands to Sonic Pi
  minim = new Minim(this);
  input = minim.getLineIn(Minim.MONO, 2048); //nb static field MONO referenced from class not instance hence Minim not minim

  fftLog = new FFT( input.bufferSize(), input.sampleRate()); //setup logarithmic fast fourier transform
  fftLog.logAverages( 22, 3); // see http://code.compartmental.net/minim/fft_method_logaverages.html

  noFill();
  ellipseMode(RADIUS); //first two coords centre,3&4 width/2 and height/2
  fvalues[0]=1.0;
  fvalues[1]=0.0;
  fvalues[2]=0.32;
  cols[0] = 255;
  cols[1]=0;
  cols[2]=150;
  positions[0] = 50;
  positions[1]=40;
  stardata[0]=2;
  stardata[1]=4;
  stardata[2]=3;
  stardata[3]=5;
  /* start oscP5, listening for incoming messages at recvPort */
  oscP5 = new OscP5(this, recvPort);
  background(0);
}

void draw() {
  background(0);
  pushMatrix();
  //calculate changing xy offsets: shiftflag set to 0 to switch this off
  xoffset += 10*xdirflag*shiftflag;
  yoffset += 10*ydirflag*shiftflag;
  if(shiftflag==0){xoffset=0;yoffset=0;} //reset offset values to zero if shifting is off
  //reverse directions of shifting when limits reached
  if (xoffset > displayWidth/3){xdirflag=-1;}
  if (xoffset < -displayWidth/3){xdirflag=1;}
  if (yoffset > displayHeight/3){ydirflag=-1;}
  if (yoffset < -displayHeight/3){ydirflag=1;}
  //transform to new shift settings
  translate(displayWidth/2+xoffset, displayHeight/2+yoffset); //half of screen width and height (ie centre) plus shift values

  //optional rotate set by OSC call
  rotate(float(rotval)*(2*PI)/360);
  
  //get limits for stroke values and audioThresh from OSC data received
  STROKE_MIN=fvalues[0];
  STROKE_MAX=fvalues[1];
  audioThresh=fvalues[2]; 
  //println("fvalues: ",STROKE_MIN,STROKE_MAX,audioThresh); //for debugging

  // Push new audio samples to the FFT
  fftLog.forward(input.left);

  // Loop through frequencies and compute width for current shape stroke widths, and amplitude for size
  for (int i = 0; i < 29; i++) {

    // What is the average height in relation to the screen height?
    float amplitude = fftLog.getAvg(i);

    // If we hit a threshold, then set the "circle" radius to new value (originally circles, but applies to other shapes used)
    if (amplitude < audioThresh) {
      circles[i] = amplitude*(displayHeight/2);
    } else {
      // Otherwise, decay slowly
      circles[i] = max(0, min(displayHeight, circles[i]-DECAY_RATE));
    }
    pushStyle();
    // Set colour and opacity for this shape (opacity depends on amplitude)
    if (1 > random(2)) {
      stroke(cols[0], cols[1], cols[2], amplitude*255);
    } else {
      stroke(cols[1], cols[2], cols[0], amplitude*255);
    }
    strokeWeight(map(amplitude, 0, 1, STROKE_MIN, STROKE_MAX)); //weight stroke according to amplitude value

    if (shapes.length()>1) { //if more than one shape being drawn, set two to 0 to draw them concentrically
      two = 0;
    } else {
      two = 1;
    }
    // draw current shapes
    if (shapes.contains("e")) {
      // Draw an ellipse for this frequency
      ellipse(random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, 1.4*circles[i], circles[i]);
    }
    if (shapes.contains("r")) {
      rectMode(RADIUS); 
      rect( random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, 1.4*circles[i], circles[i]);
    }
    if (shapes.contains("s")) {
      strokeWeight(3); //use fixed stroke weight when drawing stars
      //star data Xcentre,Ycentre,radius1,radius2,number of points
      star(random(-1, 1)*amplitude*positions[0]*two, random(-1, 1)*amplitude*positions[1]*two, circles[i]*stardata[0], circles[i]/stardata[1], int(stardata[2]+random(stardata[3])));
    }
    popStyle();

    //System.out.println( i+" "+circles[i]); //for debugging
  } //end of for loop
  popMatrix();
}

void oscEvent(OscMessage msg) { //function to receive and parse OSC messages
  System.out.println("### got a message " + msg);
  System.out.println( msg);
  System.out.println( msg.typetag().length());

  if (msg.checkAddrPattern("/viz/float")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      fvalues[i] = msg.get(i).floatValue();
      System.out.print("float number " + i + ": " + msg.get(i).floatValue() + "\n");
    }
  }

  if (msg.checkAddrPattern("/viz/pos")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      positions[i] = msg.get(i).intValue();
      System.out.print("pos number " + i + ": " + msg.get(i).intValue() + "\n");
    }
  }

  if (msg.checkAddrPattern("/viz/col")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      cols[i] = msg.get(i).intValue();
      System.out.print("col number " + i + ": " + msg.get(i).intValue() + "\n");
    }
  }
  if (msg.checkAddrPattern("/viz/shapes")==true) {
    shapes=msg.get(0).stringValue();
    //for(int i =0; i<msg.typetag().length(); i++) {
    // shapes += msg.get(i).stringValue().toLowercase();      
    //}
    System.out.print("shapes code "+ shapes + "\n");
  }
  if (msg.checkAddrPattern("/viz/stardata")==true) {
    for (int i =0; i<msg.typetag().length(); i++) {
      stardata[i] = msg.get(i).intValue();
      System.out.print("stardata number " + i + ": " + msg.get(i).intValue() + "\n");
    }
  }
  if (msg.checkAddrPattern("/viz/rotval")==true) {
    rotval =msg.get(0).intValue();
    System.out.print("rotval code "+ rotval + "\n");
  }
  if (msg.checkAddrPattern("/viz/shift")==true) {
    shiftflag =msg.get(0).intValue();
    System.out.print("shiftflag code "+ shiftflag + "\n");
  }
  if (msg.checkAddrPattern("/viz/stop")==true) {
    kill(); //stop Sonic Pi from running
  }
}

//function to draw a star (and polygons)
void star(float x, float y, float radius1, float radius2, int npoints) {
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}

void kill(){ //function to send stop message to Sonic Pi on local machine
  OscMessage myMessage = new OscMessage("/stop-all-jobs");
   myMessage.add("RBN_GUID"); //any value here. Need guid to make Sonic PI accept command
  oscP5.send(myMessage, myRemoteLocation); 
}

The basis of a visualiser depends on doing a fast Fourier transform analysis of the incoming audio, and calculating amplitudes related to the different frequency components of the audio, which is continuously monitored input buffer by input buffer. I don’t intend to go anywhere near the complex mathematics involved, but there are a lot of useful articles at different levels which you can read on the subject. I quite liked this one on the Fourier Transform: https://betterexplained.com/articles/an-interactive-guide-to-the-fourier-transform/. It is also useful to look at the documentation of the minim library analysis section http://code.compartmental.net/minim/index_analysis.html if you want more detailed information on the calls employed.
You may find it easier to look at the original sketch from which I started, https://github.com/andrele/Starburst-Music-Viz, before taking on board the additions I have made to make the sketch more flexible and wide-ranging.
I have added various transformations. Starting at the beginning of the main draw() function, there are x and y offsets which increase each time the loop iterates, until a maximum offset is reached, when they reverse direction. This causes the shapes to move smoothly left and right and up and down the screen. A shiftflag enables this to be switched on and off by one of the control OSC messages.
There follows an optional rotate command which can rotate the axis of the shapes being drawn, again controlled by an incoming OSC message.
Next values for setting the limits on the stroke size being used to render the shapes are read from data received by an OSC message, together with an audioThreshold setting.
A buffer from the input audio is now processed, and amplitude values for different frequency ranges are stored in an array of “circle” settings. NB the name circles is used for the variable as this was the only shape used in the original; perhaps it might be better named shapes now, as there are three different classes of shape used. A new value for the “radius” is stored if the amplitude is less than the audioThreshold setting; otherwise, a “decayed” version of the value stored on the previous iteration is kept (suitable minimum values are set).
RGB colour values are set using values received from an OSC message, and these are then swapped at random, before setting the stroke colour attributes to be used.
Next a flag, two, is set according to whether one or more than one shape has been selected. In the latter case the shapes are forced to be drawn concentrically, by nullifying the offset values by setting two = 0. The selected shapes are then drawn, before the loop starts a further iteration.

The oscEvent(OscMessage msg) function is triggered when an OSC message is received. It contains sections to parse the various OSC messages which can be sent from Sonic Pi to the Processing sketch. The parameters for each command are used to update various lists containing information used by the draw function, e.g. cols[ ] holds RGB values, stardata[ ] holds the parameters for the star shapes, and fvalues[ ] holds the floating point values for the STROKE_MAX, STROKE_MIN and audioThresh settings. These and other similar settings are updated when the relevant OSC messages are received, so Sonic Pi can control the operation of the sketch as it runs.
The star function draws the star shape. It is lifted straight from the star example here: https://processing.org/examples/star.html.
The final function, kill( ), is used to send a /stop-all-jobs OSC message back to Sonic Pi to stop all programs running on that machine. It can be triggered by a /viz/stop OSC message being sent from Sonic Pi to the sketch.

As far as Sonic Pi is concerned, it produces two sorts of input. First, the audio that it produces is fed back to the sketch, which looks at the default audio input. On my Mac I used the program Loopback ( https://rogueamoeba.com/loopback/ ) to do this. This is a paid-for program, but you can use it for free for 10 minutes or so at a time. It is based on the previous free Soundflower utility, but Soundflower has not been fully updated for recent macOS versions and you may find it difficult to get it to work instead. The setup I used is shown below: note that Sonic Pi is added as a source and that its output is monitored through the speakers, so that the audio is fed both there and to the Processing sketch. This loopback device will appear in the list of devices in the Audio MIDI Setup program, and should be selected as the default input device.

Secondly, Sonic Pi is used to send OSC messages to the Processing sketch to control its various parameters. This can be done by sending “one off” messages or by putting the message sender into a live loop, sending messages at regular intervals. An example is shown below, where the OSC messages are combined with the program producing the music, but they can be run as a separate program in a different buffer, which is an advantage if you are visualising a linear piece and do not want to restart it every time you press Run to update the OSC messages sent.

#Program to drive Sonic Pi 3 visualiser written in "processing"
#by Robin Newman, September 2017
#see article at https://rbnrpi.wordpress.com
#set up OSC address of processing sketch
use_osc '127.0.0.1',5000
#select shapes to show
osc "/viz/shapes","e"  #"s" "e" "r" Star,Ellipse, Rectangle or combination
sleep 0.1

live_loop :c do
  #choose starting colour for shapes
  osc "/viz/col",rrand_i(0,64),rrand_i(128,255),rrand_i(0,255)
  sleep 0.1
end

live_loop :f do
  #set Stroke max min widths and audioThreshold
  osc "/viz/float",([8.0,5.0,3.0].choose),[1.0,2.0].choose,(0.4+rand(0.3))
  sleep 2
end

#set range of random positional offset (can be 0,0)
#automatically disabled when showing more than one shape
osc "/viz/pos",20,0

#control "bouncing" shapes around the screen 1 for on 0 for off
osc "/viz/shift",0

live_loop :s do
  #setup star data inner/outer circle radius, number of points
  #and random variation of number of points
  osc "/viz/stardata",[1,2].choose,[1,2,3].choose,5,2
  sleep 4
end

rv=0 #variable for current rotation
live_loop :r do
  rv+=5*[-8,1].choose # choose rotation increment
  rv=rv%360
  osc "/viz/rotval",rv #change rv to 0 to disable rotation
  sleep 0.1
end

#Now setup the sounds to play which will trigger the visualiser
use_bpm 60
set_volume! 5
use_random_seed 999

with_fx :level do |v|
  control v,amp: 0 #control the volume using fx :level
  sleep 0.1
  
  in_thread do #this loop does the volume control
    control v,amp: 1,amp_slide: 10 #fade in
    sleep 140
    control v,amp: 0,amp_slide: 10 #fade out
    sleep 10
    osc "/viz/stop" #send /viz/stop OSC message to sketch
    #sketch sends back a /stop-all-jobs command to port 4557
  end
  
  #This drum loop is written by Eli see https://groups.google.com/forum/#!topic/sonic-pi/u71MnHnmkVY
  #used with his permission. I liked it, and it has good percussive output
  #to drive a visualiser
  live_loop :drums do
    this_sample = [:loop_compus, :loop_tabla, :loop_safari].ring
    start = [ 0.0 , 0.125 , 0.25 , 0.375 , 0.5 , 0.625 , 0.75 , 0.875 ].ring
    sample this_sample.look , beat_stretch: 4, start: start.look, rate: 0.5
    sleep 1
    tick
  end
end

As this program runs, you can alter the parameters, e.g. the shape(s) chosen, the shift parameter etc., and rerun, effectively live coding the visualiser. The program uses a great percussion loop written by Eli, which he said I might use: https://groups.google.com/forum/#!topic/sonic-pi/u71MnHnmkVY
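
For example, while the music is still playing you could change just a couple of the OSC lines and press Run again, perhaps switching the shapes shown and turning the bouncing shift on:

osc "/viz/shapes","ser"  #show stars, ellipses and rectangles together
osc "/viz/shift",1       #turn the "bouncing" shift on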

There is a video of the program in operation on youtube here

and you can download the code for the sketch and Sonic Pi programs here

You can download and install Processing from https://processing.org/download/

You will also have to install the libraries Minim and oscP5 from the Sketch=>Import Library… menu.
To use, set up the loopback audio and select it as the default input device. Load and play the sketch in Processing, then run the Sonic Pi program on the same computer. You can adjust the transparency of the Sonic Pi screen on the Preferences Visuals tab to make it semi-transparent, or minimise Sonic Pi once it is running, if you are not going to do live coding of the control OSC messages.

8 note polyphonic gated synth for Sonic Pi 3

A problem with playing midi input on Sonic Pi is that it is difficult to use the incoming note duration directly. This is because a midi signal specifies the start of a note, but doesn’t specify its duration, instead sending another signal when the note finishes. Conversely, Sonic Pi programs its internal synths to play a given pitch for a set duration, which is specified when the note starts playing. Thus you have to compromise and set a convenient duration for a note when it starts, which is hopefully not far off the correct duration. You can mask this to a certain extent by using :release to specify the time, so that the note dies away as it is played, and adding reverb can also help to mask the note durations. There are two alternatives. One is to process the midi input in advance, and use an algorithm to convert it to Sonic Pi notation. Such an algorithm was produced by Hiroshi TACHIBANA and I wrote an article about its use here, and have used it extensively to convert midi files for Sonic Pi use. However, this is no good for live performances, and so I have recently worked on a second method to solve the problem. That is to use a gated synth. In this case, it works by switching on a note as the midi note_on signal is received, and switching it off again as the corresponding note_off signal is received, so the note can be any duration, as specified by the length of the incoming midi note.

I had recently experimented with a “long note” synth, where I started a note of long duration (say 1000 seconds) and then switched it on and off by altering the volume of the note. You could also alter the pitch of the note and other characteristics, such as the cutoff value, as the note played. It looked promising, so I developed this into a polyphonic system with up to 8 parallel notes playing, attempting to switch them on and off by gating the volume setting. In the event it became rather complicated, and the processing power needed caused timing errors and program drop outs, so eventually I abandoned this in favour of a similar although simpler system. In the new system, instead of switching the note’s volume on and off, I started a new note at the beginning of each midi note_on event, and then maintained that note until the corresponding note_off event. I decided on having up to 8 possible notes playing at the same time, and devised a rather crude scheduling system which allocated a free “note generator” to each note as it was demanded, and then kept track of that note until it was finished, when the “note generator” was again marked as available for re-use. I just had to keep track of which note was which, so that I knew which one to release when a note_off event was received. (In fact the note_off events were just note_on events with the velocity (or volume) parameter set to zero.) The timing of the loops concerned was fairly critical, and I made a lot of use of the set and get commands available in Sonic Pi 3 for recording events and values in the timing log, from which they could be retrieved. The system is running close to the limit, and produces several timing error warnings, but it does seem robust enough to keep going regardless. However, one consequence is that at present it is not viable to run on a Raspberry Pi 3, although I have tested it satisfactorily on a Mac PowerBook and on Ubuntu 17.04.
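
To give a flavour of the gating idea, here is a minimal one-voice sketch (not the full 8-voice scheduler): it assumes the sonicpi_connect midi port set up in the Sonic Pi 3.0 article later in this post, treats note_off as a note_on with velocity 0 as described above, and simply fades the stored note out rather than doing any real voice management:

playing = {}  #map of midi note number -> synth node currently sounding
live_loop :gated_notes do
  use_real_time
  use_synth :tri
  n, v = sync "/midi/sonicpi_connect/*/*/note_on"  #note_off arrives as velocity 0
  if v > 0
    #start a long note, with a short amp_slide ready for the fade out, and remember its node
    playing[n] = play n, sustain: 1000, amp: v/127.0, amp_slide: 0.05
  elsif playing[n]
    control playing[n], amp: 0  #fade the stored note out quickly
    playing.delete(n)
  end
end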

The program is split into two parts, because it is too long for a single buffer. However the first part, which merely sets up state information, only needs to be run once, and then remains active until Sonic Pi is shut down. Otherwise the program is fairly easy to follow, the heart consisting of 8 identical synth engines, together with a midi input live_loop which contains the logic to allocate the synths to the received notes.
To keep the program short enough to fit the buffers, I am afraid that I have not added my usual copious comments to it.

You can download the program files here, together with an xml file to generate the TouchOSC file. It would be fairly easy to substitute another means of supplying the external input (choice of synth etc.) if required.

There is also a video of the project in action here

Sonic Pi 3.0 arrives. Get going with its MIDI and OSC commands.

It is six months since the last release of Sonic Pi, and you may be forgiven for wondering what has been going on. The answer is that over this time Sam Aaron has been working extremely hard to bring midi connectivity to Sonic Pi, as well as enabling the program to receive audio input from external sources. This may sound fairly modest and not a big deal, but it has involved a huge amount of work to ensure that midi input and output are fully synced with Sonic Pi’s extremely accurate timing system. During the same period Sam’s funding to develop Sonic Pi has reduced and comes to an end at Christmas, making it difficult for him to spend the time needed to develop the application further.
The initial release of version 3.0 is for the Mac, and set up details in this article are specific to that version.
The image below shows the new Pro Icon Set (optional) and the new IO tab in the Preferences panel. The new version was developed as version 2.12.0, then 2.12.0-midi-alpha1 to 8, but the major changes involved justified a bump to version 3.0, named “IO” (Input Output). Sonic Pi was developed as a platform where children could be introduced to coding via the medium of music. It became a hit in many schools, but led to many requests to enable it to accept external input, or to be able to give output to drive external synths or music modules. The established mechanism over the years (since the 70s) has been to use midi protocols to do this, although on its own midi can be a bit limiting, and more modern music communication methods also use OSC, short for Open Sound Control. In fact Sonic Pi has always used OSC internally to allow its various parts, namely the GUI (Graphical User Interface), the Ruby server and scsynth (the SuperCollider synth module), to communicate. Sam collaborated with Joe Armstrong to use the language Erlang, which is designed for situations where concurrency and the passing of large numbers of messages are concerned, and so was ideal to handle the scheduling of midi information into Sonic Pi. This was helped by the development of two small applications by Luis Lloret to convert midi messages to OSC messages as they came into Sonic Pi, and OSC messages to midi as they left Sonic Pi for the outside world.
This image of the info screen lists some of the new facilities in Sonic Pi 3.0

Let’s look first at midi. Of course for this to work, you need to have either something which can produce midi signals which can be sent to Sonic Pi, or something which can receive and react to them. This will of course cost money if you use hardware devices, but you can try out midi on a Mac (and on other platforms with appropriate software) by using suitable software programs loaded onto your computer. One free program that I have used is MuseScore2, which you can download here. Alternatively you may have GarageBand on your Mac. Either can be used both to receive midi from Sonic Pi to be played using software instruments, or to play a midi file which can be received by Sonic Pi and played using Sonic Pi’s own synths. Another software synth which can be used is Helm, which is donate-ware, i.e. you contribute what you think it is worth, here. Finally you can download a free virtual keyboard called vmpk from here. It is a bit old and slightly flaky on macOS Sierra, but it does work, although it can take a little playing around with to get everything right. The key is the midi connections entry on the Edit menu. You can configure it to act as a player using the built in fluid synth software synthesiser, or to act as a simple keyboard. The picture below shows it set up as a player. Disable Midi In and change Midi Out to CoreMIDI, SonicPi Connect, to use it as a keyboard input.

All the above software is effectively free (if you are miserly and don’t want to donate to Helm). However, on the Mac there are two pieces of software which I would recommend considering buying. The first is an excellent little midi player called MidiPlayer X, which is available on the App Store for a very modest £1.99, so it doesn’t break the bank. You can drag and drop midi files onto it and either play them on the built in sound module, or select any available midi device from a drop down menu. You can loop the file or build a playlist, and also alter the tempo and selectively mute channels. I have used it a lot with the development version of Sonic Pi over the last few months. The second item, TouchOSC, lets you interact with Sonic Pi from an iPhone or iPad, and there is also an Android version, although I have not tried that. I have built several quite substantial projects using this, utilising not just the midi connectivity of Sonic Pi 3.0 but also its OSC external messaging facilities. These have included a substantial midi player controller project, which lets you choose synths, envelopes, volumes, pan settings etc. of Sonic Pi, utilising it as a 16 track midi player, with built in support for a full GM midi drumkit as well. Another, simpler project creates a virtual midi keyboard (editor build screen shown below) with selection of sustain times and choice of synth for Sonic Pi. A third project modified the Hexome Project published in the MagPi Issue 58 to work with Sonic Pi 3.0. This software is a little more expensive at £4.99, but still pretty cheap, although you do need a suitable device from which to run it. Now that Sonic Pi 3.0 is released I hope to publish the software for these, but you can already see development videos of them on my YouTube channel here.
In all cases, one further vital link is required. Midi signals are passed between devices. Initially these were physical boxes like keyboards and synthesiser modules, and they were connected together by physical 5 pin DIN leads. When computers came on the scene, external devices started to use USB leads as a means of connection, and if you plug such a device, like my small M-Audio Oxygen8 keyboard, into the Mac it is recognised as a midi input device. However, programs like Sonic Pi have no physical presence, and so the Mac lets you create virtual midi devices which let programs like Sonic Pi and GarageBand talk to each other via midi.
The key program to set this up on the Mac is called Audio MIDI Setup and it can be found inside the Utilities folder in your main Applications folder. Start up Audio MIDI Setup and select Show MIDI Studio from the View menu (if it says Hide MIDI Studio the window should already be visible). (Double click the image below to expand it). Find the icon entitled IAC Driver and double click to open it. If you see a flag saying More Information, click on it too. (Double click the image below to expand it). Now find the entry IAC Bus 1, click it and rename it to something more memorable. I called mine SonicPi. I also renamed the port as Connect. Finally click the Device is online flag, and the Apply button. (See the image below).

You can of course use any names you like, or leave it as the default setting, IAC_Driver, with the default port name, IAC Bus 1. Note: Sonic Pi converts all midi device names to lowercase, and replaces any spaces with _ characters, to make life a little less confusing.
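
With the names I chose, the device therefore shows up inside Sonic Pi as sonicpi_connect, and that is the name which appears in the cue paths and port: options used later in this article:

b = sync "/midi/sonicpi_connect/*/*/note_on"  #listening for midi coming in from the renamed device
midi 72, port: "sonicpi_connect"              #sending midi out to the same named port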

If you launch Sonic Pi 3.0 and select the new IO tab on the preferences window, you should see sonicpi_connect listed under the MIDI inputs and MIDI outputs sections. Now launch a suitable program to receive midi from Sonic Pi. I tried GarageBand and MuseScore2. For GarageBand, open the application and choose a new empty project. Select Software Instrument to insert one instrument, set by default to Classic Electric Piano. In Sonic Pi type

midi 72

Run the program, and all being well you should hear a piano note played in GarageBand. Sam has achieved his aim of producing something that an 8 year old can do!
You can change the instrument sound in GarageBand. Use the library on the left. Here a marimba is chosen.
To do the same thing in MuseScore2, launch that app (NB an updated version was released very recently). Close the Score Centre popup window. Select the preferences on the MuseScore menu, select the Note Input tab and make sure enable midi input is ticked. Then select the I/O tab and select the SonicPi Connect midi input as shown. Note the message about restarting MuseScore. Click OK. Restart MuseScore, again closing the Score Centre popup window, and then you can run the midi command from Sonic Pi again. All being well you should hear a piano note being played. If not, check that the midi din-plug icon at the top of the MuseScore window is highlighted (i.e. active) and try again. You can also change the sound in MuseScore from the Mixer window on the View menu (NB NOT the Instruments entry on the Edit menu; that is for something else). The picture below shows a Glockenspiel sound being selected for the Piano output. You can even have MuseScore and GarageBand played at the same time by Sonic Pi if both are set up together as described!

You can then try a slightly more sophisticated program to play notes chosen at random from a scale.

live_loop :midi_out do
  n=scale(:c4,:major).choose
  midi n,sustain: 0.2
  sleep 0.2
end

Try experimenting by altering the sleep and sustain times. You could also choose a different scale, or maybe transpose by using midi n+3 instead of midi n.
You can add further control by using the option vel_f:. This is followed by a number in the range 0 to 1 which specifies the velocity with which a standard midi keyboard note is pressed, i.e. the volume. Try changing the line to

midi n,sustain: 0.2,vel_f: 0.3

The instrument you hear is entirely controlled by the receiving program. In GarageBand you can choose a different instrument from those available in the library, e.g. a bright punchy synth; or in MuseScore2 open the mixer on the View menu and choose, say, a Clavinet from the dropdown list as illustrated above.

So much for midi output. What about input? To handle this Sonic Pi utilises its cue and sync system. The cues are provided by incoming midi events, such as when a note is received from a connected midi device. The midi “cues” can be generated by various actions: when a note turns on, and when it turns off again. Also, midi control signals can cause events; these might be used to change a synth, for example. Another type of event is generated by a midi pitch-bend wheel. All of these and more can be catered for in Sonic Pi. (They can also be sent out from Sonic Pi, as well as the simple midi command used to play notes, which in fact automatically uses both note_on and note_off events.) The code below will receive midi note_on and note_off events (in fact note_off events can also be interpreted as note_on events with zero volume).

live_loop :midi_input do
  use_real_time #gives fast response by overriding the sched ahead time
  use_synth :tri
  #wait for a note_on event from midi source sonicpi_connect
  b = sync "/midi/sonicpi_connect/*/*/note_on"
  #b is a list with two entries.
  #The note value in b[0] and the velocity value  in b[1] 
  puts b
  #b[1] has range 0-127. Convert to float
  #then scale it to range 0-1 by dividing by 127
  play b[0],release: 0.2,amp: b[1].to_f/127 #play the note 
end

#you can use two variables say b,c to get the information from the sync
#b,c = sync "/midi/sonicpi_connect/*/*/note_on"
#if you prefer to do so, Amend the program appropriately
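
The same sync mechanism covers the other event types mentioned above. As a sketch (the exact address pattern and the order of the values for control_change are an assumption, by analogy with the note_on example, so check the cue log to see exactly what your device sends):

live_loop :midi_cc do
  use_real_time
  cc_num, cc_val = sync "/midi/sonicpi_connect/*/*/control_change"
  puts "controller #{cc_num} changed to #{cc_val}"  #both values are in the midi range 0-127
end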

To use the :midi_input loop we can produce midi input from the virtual keyboard. First, however, we can check it out even more quickly by combining it with the midi send program discussed above. That gives us this complete program.

live_loop :midi_out do
  n=scale(:c4,:major).choose
  v=0.7
  midi n,sustain: 0.2,vel_f: v,port: "sonicpi_connect"
  sleep 0.2
end

live_loop :midi_input do
  use_real_time #gives fast response by overriding the sched ahead time
  use_synth :tri
  #wait for a note_on event from midi source sonicpi_connect
  b = sync "/midi/sonicpi_connect/*/*/note_on"
  #b is a list with two entries.
  #The note value in b[0] and the velocity value  in b[1] 
  puts b
  #b[1] has range 0-127. Convert to float
  #then scale it to range 0-1 by dividing by 127
  play b[0],release: 0.2,amp: b[1].to_f/127 #play the note 
end

I have slightly altered the first part of the program to include a velocity setting, and I’ve also explicitly named the midi port to be used, rather than sending to all of the available ones. Try altering the v setting, say to 0.3, and press Run again. Or perhaps put n+12 in the midi send line in the first loop and go up an octave. Note that if you still have GarageBand and/or MuseScore running then they will play along too! You can mute them using the mute icon beside the instrument name in GarageBand, and by clicking the midi DIN icon in MuseScore to toggle off midi input.

To try the keyboard, first comment out the first live loop so it doesn’t send any midi. Then launch vmpk. Go to the Edit menu and select Midi Connections. Make sure that midi input is not ticked. Select CoreMIDI for the Midi OUT Driver and choose SonicPi Connect for the Output Midi Connection (as shown in the screenshots above). Because the keyboard program generates further midi connections, you need to update Sonic Pi. In the Sonic Pi Preferences I/O tab click the Reset MIDI button. This takes a few seconds, but eventually you will see the connection lists updated: there will be some vmpk entries, although we are not using them here. Now run the Sonic Pi program (with the first live_loop commented out) and play the virtual keyboard. You should hear the notes you click with the mouse. Also, in the vmpk preferences under the vmpk menu you can enable your Mac (typing) keyboard to activate note input. With this selected, you can type the appropriate keys and play Sonic Pi, when the vmpk program is selected. Note that when you quit the vmpk program, it will remove its virtual midi devices, and you will have to reset the Sonic Pi midi setup to keep it going, using the Reset MIDI button.

Not only does Sonic Pi 3.0 add midi in and out, it also enables you to feed audio input directly into Sonic Pi, where you can modify it by applying fx like :reverb or :flanger. To do this you WILL need some additional hardware to feed audio in. I use a Steinberg UR22 MkII audio/midi interface, which gives me two audio in/out channels as well as allowing a hardware midi in/out connection which can be connected to an external midi device such as a keyboard or music module. I have an old Korg X5DR which works with this. However, you CAN try it out using input from the built-in microphone on a Mac. Best if you have a set of headphones to listen with, so that you don’t get feedback! To set things up you use our old friend Audio MIDI Setup again. This time you want to look at the Audio Devices window on the View menu. Make sure that Built-in Microphone is used as the default input device. (That WILL be the case if you don’t have any additional audio devices connected to your computer). Note that if you change the selected audio devices, Sonic Pi will NOT realise any changes have been made until you restart it. Unlike the midi changes, there is not a reset button to accommodate this, as it happens less frequently! Restart Sonic Pi if you have changed the settings. Now select an empty Sonic Pi 3.0 buffer and type in:

live_audio :mic,amp: 5

Put headphones on, connected to your Mac, and press run. You should be able to hear yourself, and will also see a trace on the Scope if you turn that on in Sonic Pi. You may have to get quite close to the mike to get sufficient input. If it is very quiet, try changing the amp: 5 to amp: 10. Now amend the program as shown below, and rerun.

with_fx :compressor do
  with_fx :reverb,room: 1 do
    live_audio :mic
  end
end
sleep 10
live_audio :mic,:stop #turns off the live_audio feed
#program still running here. Press stop to finish

Here we put in some reverb and also take out the amplification of the live_audio input, but instead put the whole section inside an fx :compressor to boost the overall output. You can already see from this the potential of live_audio if you add the hardware to access other external audio sources. If it is too loud, you can add amp: 0.5 in the compressor line, giving with_fx :compressor, amp: 0.5 do. At the end I show a program which sends midi out to an external synth, which sends audio back via live_audio to Sonic Pi 3.0. Also incorporated is a synchronised rhythm drum track generated by Sonic Pi. The volume fades up and down, controlled by an at statement, also utilising the new .scale method applicable to rings, which scales the values within a ring.
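
As a small illustration of that .scale idea (the numbers here are just invented for the example), it simply multiplies every value in a ring by a factor:

amps = (ring 0, 0.25, 0.5, 1).scale(0.5)
puts amps  #each value is halved: 0, 0.125, 0.25, 0.5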

The final major addition to Sonic Pi 3.0 is the ability to record a buffer. This means that it is possible to record some live_audio input, store it, and then reuse it as part of a running program, for example to produce a loop. I have already published a video example of this, where I record 4 separate buffers with different versions of Frere Jaques, and then use Sonic Pi 3.0 to play them as a round, and then manipulate them, playing the round faster and faster.

Here we’ll try something a bit simpler. Amend the program above by adding commands to record up to 16 beats worth of sound.

with_fx :record,buffer: buffer[:micbuffer,16] do
  with_fx :compressor do
    with_fx :reverb,room: 1 do
      live_audio :mic
    end
  end
end
at 16 do #stop the live audio feed when finished recording
  live_audio :mic,:stop
end
#press stop to finish the program

The extra wrapper around the original program uses the new with_fx :record effect. This uses as its destination the buffer described at the end of the first line. The buffer is named :micbuffer, and the 16 specifies its duration in beats. By default it will be 8 beats long unless specified otherwise. Run the program and record a session of speech. At the normal bpm setting of 60 this can be up to 16 seconds long. If bpm is set to 120 it will only be 8 seconds long. When you have completed the recording, press stop, then comment out this section and add at the bottom:

sample buffer[:micbuffer,16]

Run the program again and you should hear your speech back again.
You can have fun with it now. What about Mickey Mouse? Try:

sample buffer[:micbuffer,16],rpitch: 12

This puts it up an octave and plays it twice as fast. One word of warning: do not change the bpm when you have a recorded buffer, or it will reconfigure the buffer for the new bpm value (as it will have a different duration) and destroy its contents in the process. I had to take account of this when I used the record technique to produce a round version of Frere Jaques using 4 recorded buffers, here.
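
The bpm dependence is easy to check; the duration is just the buffer length in beats converted to seconds at the current bpm:

beats = 16
[60, 120].each do |bpm|
  puts "#{beats} beats at #{bpm} bpm = #{beats*60.0/bpm} seconds"  #16 seconds at 60 bpm, 8 seconds at 120 bpm
end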

The final new introduction I want to mention is the easy availability of OSC messaging. Sonic Pi 2.11 had the ability to receive OSC based cues, although it wasn’t documented, and the syntax was different. In Sonic Pi 3.0 it is a fully functional feature, as is the ability to send OSC messages, crucially not only to other programs running on your computer, but also to external computers and programs via network connections. There is a facility to enable or disable this feature on the new IO prefs tab, as it can potentially be a security risk. The facility is particularly useful for interacting with programs such as TouchOSC, which, as the name implies, is designed to use bi-directional OSC messaging, although it can also send midi messages as well. To use OSC you need a program that can send messages and a program that can receive them. There are OSC monitors that can be built or used, but they can require a bit of setting up, so to keep things simple we will use Sonic Pi to send OSC messages to itself.

In a new window type

use_osc "localhost",4559
osc "/hello/play",:c4,1,0.5,:tri

b = sync "/osc/hello/play"
puts b
use_synth b[3]
play b[0],amp: b[1],sustain: b[2],release: 0
#if you prefer it, use four separate variables to get the data

 

When you run this, you should hear the note :c4 played with a :tri synth, with a duration of 0.5 seconds and an amplitude of 1. These values are passed as the data associated with the address /hello/play. Sonic Pi prepends the /osc to distinguish the source, and the sync command is triggered by the arrival of the OSC message. If you look at the output of the puts b line you will see that the data is passed in a list which is assigned to b (the variable we specified), and the various parts can then be accessed using b[0], b[1] etc. Here is the version using separate variables to extract the received data:

use_osc "localhost",4559
osc "/hello/play",:c4,1,0.5,:tri

n,a,d,s = sync "/osc/hello/play"
use_synth s
play n,amp: a,sustain: d,release: 0

Now I amplify this to play my old favourite Frere Jaques entirely with OSC messages.

#Frere Jaques played on Sonic Pi 3.0 entirely using OSC messages
use_osc "localhost",4559
t=180
set :tempo,t #use set to store values that will be passed to live_loops
use_bpm t

p=0.2;m=0.5;f=1 # volume settings
#store data using set function so that it can be retrieved in live_loops
set :notes,(ring :c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4,:e4,:f4,:g4,:e4,:f4,:g4,:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4,:c4,:g3,:c4,:c4,:g3,:c4)
set :durations,(ring 1,1,1,1,1,1,1,1 ,1,1,2,1,1,2, 0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1, 1,1,2,1,1,2)
set :vols,(ring p,p,m,p,p,p,m,p,p,m,f,p,m,f,m,m,m,m,f,m,m,m,m,m,f,m,f,f,f,f,f,f)
set :synths,(ring :tri,:saw,:fm,:tb303)

live_loop :playosc do # this loop plays the received osc data
  use_real_time
  n,d,v,s,tempo= sync "/osc/hello/play" #retrieve data from OSC message
  
  use_bpm tempo
  use_synth s
  play n,amp: v,sustain: d*0.9,release: d*0.1
end

live_loop :sendosc do
  #retrieve data from main program using get functions
  s=get(:synths).tick
  notes=get(:notes)
  durations=get(:durations)
  vols=get(:vols)
  tempo=get(:tempo)
  use_bpm tempo #set local tempo for this loop
  notes.zip(durations,vols).each do |n,d,v|
    osc "/hello/play",n,d,v,s,tempo #send OSC message with note data
    sleep d
  end
end

The tune, note durations and volumes for each note are held in three rings. The data is passed to the live_loop that will send it as OSC messages using another new feature in Sonic Pi 3: the set and get functions. Previously I, for one, have just declared variables in the main program and used them inside live loops. Whilst this will work most of the time, it is bad practice, as you might get two live_loops trying to alter variables at the same time and causing confusion. Using the set and get functions keeps things in order. See section 10.1 of the Sonic Pi built in tutorial for more detail on this. Somewhat oddly the two live loops to send and receive the data via OSC messages are presented in what may seem the wrong order, but this will make sense a bit later on.

The second live_loop, :sendosc, first chooses a different synth to be used on each iteration, using a tick to sequence through a list of synths (which you can alter if you like). Then I use one of my favourite constructs in Ruby, which enables you to iterate through two or more (in this case three) lists which are zipped together (see the small sketch below). The way it works is that on the first iteration n, d and v hold the first values in the three rings (:c4, 1 and 0.2); on the next iteration they hold the second set of values, and so on. These are then combined into an OSC message, which has two parts. First an address, which can be anything you like, but each section must be preceded by a /. Here we have /hello/play, but we could equally have /hi/dothis, as long as we look for the right address when we try to receive it. This is followed by a list of data items, which can be numbers, strings or symbols; in this case five items are sent. The destination is specified in the separate use_osc command, here use_osc "localhost",4559. This specifies that the local machine (ie the one we are using) will receive the message on port 4559 (you will see this port specified in the new I/O prefs; Sonic Pi is set up to monitor it). We sleep for the duration of the note and then the next OSC message is sent.
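
To see what zip does in isolation, here is a tiny sketch (the values are just the first few entries of the rings above):

notes=(ring :c4,:d4,:e4)
durations=(ring 1,1,1)
vols=(ring 0.2,0.2,0.5)
notes.zip(durations,vols).each do |n,d,v|
  puts n,d,v #first pass gives :c4, 1, 0.2, then :d4, 1, 0.2 and so on
end
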

Turning to the receiving live_loop :playosc, this waits for sync events with the format "/osc/hello/play" (remember that Sonic Pi prepends the /osc to signify where the sync event has originated from). As described for the first OSC program, the data is extracted to the variables n, d, v, s and tempo, and the various parts are then used to specify the synth, bpm, note, amp: and duration settings to use. In this case an envelope is used with separate sustain and release times. We don't need to specify the time between notes as this is taken care of by the sync, which depends on the time between the OSC messages set by the sending loop. The received tempo is used to set the local bpm, so the timing of the note durations is interpreted correctly. Hopefully when you run this program it will play Frere Jaques for you continuously, cycling through the synths specified.

Now for more of a good thing! Add a third loop, with a delay of 8 beats, to the end of the program:

live_loop :sendosc2,delay: 8 do
  #retrieve data from main program using get functions
  s=get(:synths).tick
  notes=get(:notes)
  durations=get(:durations)
  vols=get(:vols)
  tempo=get(:tempo)
  use_bpm tempo #set local tempo for this loop
  notes.zip(durations,vols).each do |n,d,v|
    osc "/hello/play",n,d,v,s,tempo #send OSC message with note data
    sleep d
  end
end

This is identical to the other send loop, except in name and in the delay: 8 in the first line, which means that it starts 8 beats after the other live_loops. If you now run the program again, both sendosc loops will broadcast the tune, but the second one delayed by the time for the first two lines of Frere Jaques to play. Because they both send to the same OSC address, their information streams will both be picked up and played by the :playosc loop.

Finally, if you are lucky enough to have access to a second machine with Sonic Pi 3 on it, you can copy the live_loop :playosc code to the second machine and adjust the main program to send OSC messages to it, and it will join in, fully synchronised. I've just had two Macs and a Raspberry Pi running my initial build of Sonic Pi 3 all playing together, synchronised by OSC messages. Sounds great. The change you have to make to the main program is to the osc message lines. An example is shown below, for a machine on ip address 192.168.1.128:

osc "/hello/play",n,d,v,s,tempo #sends to local machine
osc_send "192.168.1.128",4559,"/hello/play",n,d,v,s,tempo #sends to remote machine

You can add a third machine by adding a third live_loop :sendosc3, delay: 16, otherwise the same as the other two send loops; this could use the appropriate address for that machine. If you prefer, you can use one send loop and put two or more appropriate osc or osc_send commands one after the other, so that the machines play the same notes together, as in the sketch below.
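
For example, a single send loop serving both the local and a remote machine (in place of the two separate loops) might look like this; the remote address is just the example one above, so substitute your own:

live_loop :sendosc do
  #retrieve data from main program using get functions
  s=get(:synths).tick
  notes=get(:notes)
  durations=get(:durations)
  vols=get(:vols)
  tempo=get(:tempo)
  use_bpm tempo #set local tempo for this loop
  notes.zip(durations,vols).each do |n,d,v|
    osc "/hello/play",n,d,v,s,tempo #send to the local machine
    osc_send "192.168.1.128",4559,"/hello/play",n,d,v,s,tempo #send to the remote machine
    sleep d
  end
end
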
Here is a link to a tweet video showing an early (naughty) version of the program in action, which didn't use set and get to transfer the note information!

Finally I promised earlier to include a program incorporating midi, live audio and a locally generated rhythm track all nicely synchronised together.

#Sonic Pi 3.0 Example showing midi out, live_audio in
#synchronised drum track and use of at to control volumes
#written by Robin Newman, July 2017
use_debug false
use_osc_logging false
use_midi_logging false
use_bpm 100
#set up rhythm tracks and volumes 0->9
set :bass_rhythm,ring(9, 0, 9, 0,  0, 0, 0, 0,  9, 0, 0, 3,  0, 0, 0, 0)
set :snare_rhythm,ring(0, 0, 0, 0,  9, 0, 0, 2,  0, 1, 0, 0,  9, 0, 0, 1)
set :hat_rhythm,ring(5, 0, 5, 0,  5, 0, 5, 0,  5, 0, 5, 0,  5, 0, 5, 0)

with_fx :level do |v|
  control v,amp: 0 #start at 0 volume
  sleep 0.05 #allow amp value to settle without clicks
  at [1,26],[1,0] do |n|
    control v,amp: n,amp_slide: 25 #fade in and out over 25 beats each
  end
  
  
  live_loop :drums do
    sample :drum_bass_hard, amp: 0.1*get(:bass_rhythm).tick
    sample :drum_snare_hard, amp: 0.1*get(:snare_rhythm).look
    sample :drum_cymbal_closed,amp: 0.1*get(:hat_rhythm).look
    sleep 0.2
    stop if look==249
  end
  
  #audio input section
  
  with_fx :compressor, pre_amp: 3,amp: 4 do
    #audio from helm synth fed back using loopback utility
    live_audio :helm_synth,stereo: true #audio from CM bells selected on helm synth
  end
  
end #fx_level

at 30 do  #stop audio input at the end
  live_audio :helm_synth,:stop #name must match the live_audio stream started above
end

#send out midi note to play (sent to helm synth CM bells)
live_loop :midi_out,  sync: :drums do
  tick
  n=scale(:c4,:minor_pentatonic).choose
  vel=0.7
  midi n,sustain: 0.1,vel_f: vel,port: "sonicpi_connect",channel: 1
  sleep 0.2
  stop if look==249
end

This program requires an external synth (I used the Helm software synth and fed the midi to it via the sonicpi_connect interface we set up earlier). I used the utility Loopback (free to try out) to set up an audio input connection, and set this as the default audio input in Audio MIDI Setup so that Sonic Pi selected it (rather than the built-in microphone that we used before) when it started. Remember you need to restart Sonic Pi if you want to alter either where its sound output is fed OR where it receives audio input from. It always uses the system selected audio devices active when it starts up.
The program has a live_loop playing the percussion samples, controlled by rings containing the volume setting for each iteration. There is a live_loop which sends midi notes out to the Helm synth, and this is synced to the percussion loop. The audio output of Helm is fed back to Sonic Pi as live_audio via the Loopback interface. Also of interest is the use of the at command, which fades the percussion and live_audio volume in and out by controlling a with_fx :level wrapper placed around the :drums live_loop and the live_audio input. When the program is run, the total sound output builds from zero to a maximum and is then faded out again, with the loops being stopped at an appropriate point by counting the elapsed ticks in each case.
You will have to adjust the settings in the program to suit any external midi/audio interface that you may have. Even if you can’t run it, some of the techniques employed may be useful to you.
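
To isolate the fade technique, here is a stripped down sketch of the same idea, with a single ticking cymbal standing in for the drum and live_audio sections (my own example, not part of the program above):

with_fx :level do |v|
  control v,amp: 0 #start silent
  sleep 0.05 #allow the amp value to settle without clicks
  at [1,26],[1,0] do |n|
    control v,amp: n,amp_slide: 25 #fade in over 25 beats, then out again
  end
  live_loop :ticker do
    tick
    sample :drum_cymbal_closed
    sleep 0.5
    stop if look==110 #enough iterations to hear the full fade in and out
  end
end
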

The FrereJaques program and the Helm with percussion program are also on my gist site here and here, and you can hear their output on SoundCloud here and here

 

Well, this has been quite a quick gallop through what Sonic Pi 3.0 has to offer. I hope you find some of the examples useful in getting you going. I think that this release adds fantastic new opportunities to Sonic Pi, and Sam is to be congratulated on the amazing job he has done. I know that it has involved a huge effort, many long nights and frustrations, and that few appreciate just what has been involved. If you like Sonic Pi and use it, and particularly if you want to see it developed further, consider supporting Sam via the patreon site at https://patreon.com/samaaron. If enough people sponsor a modest monthly amount then enough funds can be raised to enable him to devote the time that this mammoth job needs. As a supporter you will also gain access to interim development releases of Sonic Pi along the way.

I have quite a number of published resources for Sonic Pi produced over the last few years. This blog is one. Also available are sound files on SoundCloud here, here and here.
Many of these have associated software on my gist site. I also publish videos on YouTube.

Now that Sonic Pi 3.0 is released, I have quite a bit of resource material which makes use of it, so watch this space for further articles on its great capabilities.

twitter address @rbnman

High time for a post! What’s happening with Sonic Pi and how you can help

I feel somewhat guilty that I have not added anything new to this blog for some time. This is not because I have not been busy using Sonic Pi (far from it), but rather that the things I have been involved in have not lent themselves readily to any articles. Part of this is because I always work with the cutting edge version of Sonic Pi, and most people do not have access to it. However, one bit of good news on this front is that if you use a Mac, then you CAN easily obtain a development copy of Sonic Pi 2.12.0-alpha-midi by supporting Sam Aaron, the creator of Sonic Pi, on the patreon site. Sam needs financial support to continue developing Sonic Pi, so if you like it then go to https://www.patreon.com/samaaron and give your support. As a reward you will have access to downloading a recent version of 2.12.0-alpha-midi. This will let you try out using midi in and out and OSC out messages from Sonic Pi.

I build the latest version on my Mac, and have been using midi with Sonic Pi since Christmas. In my latest work I have produced a program to play midi files and/or keyboard input from Sonic Pi, with control of the synths used and some of their parameters via a TouchOSC front end running on my iPad. This gives a new slant to using Sonic Pi: not so much for live coding but rather for live playing, with real time alteration of the synths used and what they sound like as a piece plays. I also use a little midi player called Midi Player X, available on the Mac for £1.99 from the App Store, which is ideal for use with Sonic Pi.

I have developed two versions of the program, the second one being more versatile, and produced videos of each program in action to give an idea of what it can do. It does require a reasonable understanding of how things work to set it up and use it, because you have to set up and configure midi ports to enable connections between the various parts. I have not published the program at present for this reason, and also because this is an alpha version of Sonic Pi and some of the commands may be altered (some already have been!) before it reaches beta and then release stages. However I may in a future post give some useful ideas to get you going on using midi with Sonic Pi. I have spent many hours playing with it, studying the code, and working out how the commands work. There is increasing documentation, but it is not all present at this moment in time.

So why not view the videos below, then resolve to support the further development of Sonic Pi AND get a copy of the development version to try out on a Mac.

Here is a picture of the latest TouchOSC screen, and there are links to the two videos below.

Video for version 1

Video for version 2

Patreon link for Sam

Sonic Pi remote gui to control play/stop and the starting position within a piece

Following on from my previous post, and some "playing" with the midi facilities in the latest Sonic Pi 2.12.0-midi-alpha1, I developed a Processing script to control the transport mechanism in Logic Pro, which I was using to play the Sonic Pi produced midi. I have now further developed this, and applied it to the existing release version Sonic Pi 2.11.1 so that it can provide remote control functionality. (You can also use it on version 2.11, the current release version on the Raspberry Pi, with a slight modification to two or three lines.) With the addition of a few lines to any existing Sonic Pi music program file it can provide play and stop functions, which can be used repeatedly without touching Sonic Pi. Effectively the stop function stops the program from running and then re-runs it so that it awaits a subsequent "play" command from the remote control. If the music is a linear piece of say 200 bars, then with some more pervasive changes the music code can be written in such a way that it is possible to start at the beginning of any specified bar in the music, and this bar selection can also be done from the remote control.

In this article, I am going to show two separate pieces which utilise the remote control. The first is a rendition of a four part round of Frere Jaques with a twist! At the end of each repeated line the tempo increases, until the fourth part has finished playing; then the round plays again, this time starting at the new fast tempo and decreasing at the end of each repeated line, until the fourth entry finishes at the original slow tempo. The piece is 28 bars in length, and you can start playing at the beginning of any designated bar, controlled by the remote interface. To see how this is developed, first here is the code for a play-through of Frere Jaques for a single part, first speeding up, then slowing down again:

#FrereJaques-1part.rb
a1=[ ]
b1=[ ]
a1[0]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
a1[1]=[:e4,:f4,:g4,:e4,:f4,:g4]
a1[2]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
a1[3]=[:c4,:g3,:c4,:c4,:g3,:c4]
a1[4]=[:r]*8
a1[5]=[:r]*8
a1[6]=[:r]*8+a1[0]
a1[7]=a1[1]
a1[8]=a1[2]
a1[9]=a1[3]
a1[10]=a1[4]
a1[11]=a1[5]
a1[12]=[:r]*8
b1[0]=[1,1,1,1,1,1,1,1]
b1[1]=[1,1,2,1,1,2]
b1[2]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
b1[3]=[1,1,2,1,1,2]
b1[4]=[1]*8
b1[5]=[1]*8
b1[6]=[1]*8+b1[0]
b1[7]=b1[1]
b1[8]=b1[2]
b1[9]=b1[3]
b1[10]=b1[4]
b1[11]=b1[5]
b1[12]=[1]*8

c1=[100,120,140,160,180,200,220,200,180,160,140,120,100]
use_synth :beep
in_thread do
 for i in 0..a1.length-1
 use_bpm c1[i]
 for j in 0..a1[i].length-1
 play a1[i][j],sustain: b1[i][j]*0.9,release: b1[i][j]*0.1
 sleep b1[i][j]
 end
 end
end

In order to accommodate the tempo changes, the notes are held in two bar sections, inside an array a1[ ]. Thus the first two bars of the tune :c4, :d4, :e4, :c4, :c4, :d4, :e4, :c4 are held in the first entry of this array, a1[0]. The corresponding note lengths are held in the first entry of array b1[ ], b1[0], consisting of 8 equal notes of duration 1 (a crotchet). These two bars will be played at the first tempo in the list
c1=[100,120,140,160,180,200,220,200,180,160,140,120,100]
namely 100 bpm. At the end of the data section, a synth is chosen (:beep) and the notes are played in a thread, as we will want all four parts to play together later on. Two "for" loops are used to play the two bar sections one after another. The outer loop selects the tempo in bpm for each two bar section, and the inner loop uses a play command to play each note with its accompanying duration, with sustain and release parameters included (pan is added in the full four part version). For each two bar section the tempo is bumped up by 20. When the tune finishes playing (a1[3] and corresponding b1[3]) three further two bar sections follow, each playing rests as the remaining three parts finish playing. In fact the last of these three additional sections is 4 bars in length (a1[6] and b1[6]). The additional 2 bars consist of the first line of the tune (a1[0] and b1[0]), which is played again, this time at the tempo of 220, and you will see that subsequent two bar sections consist of the remainder of the tune being played again, but this time the value of the tempo is reduced by 20 for each section, until the final section is played at the original tempo of 100. As before, three "rest" sections are included, allowing the other three parts to finish off.

If we now add in the other three parts, you will see that they are very similar to the first part. The only difference is that the tune is shifted for each part, so that it actually starts playing 2 bars later than for the previous part. So part 2, which is held in the arrays a2[ ] and b2[ ], has a rest section for its initial 2 bars before the tune starts playing, this time in a2[1] compared to a1[0] for the first part. Similarly the actual tune for part 3 starts playing in a3[2], and for the 4th part in a4[3]. The complete tune playing code is shown below for all 4 parts. Each part has a different synth and pan setting to make them stand out.

#4 Frere Jaques round played twice, speeds up then slows down
p1=-1;p2=-0.33;p3=0.33;p4=1

a1=[]
b1=[]
a1[0]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
a1[1]=[:e4,:f4,:g4,:e4,:f4,:g4]
a1[2]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
a1[3]=[:c4,:g3,:c4,:c4,:g3,:c4]
a1[4]=[:r]*8
a1[5]=[:r]*8
a1[6]=[:r]*8+a1[0]
a1[7]=a1[1]
a1[8]=a1[2]
a1[9]=a1[3]
a1[10]=a1[4]
a1[11]=a1[5]
a1[12]=[:r]*8
b1[0]=[1,1,1,1,1,1,1,1]
b1[1]=[1,1,2,1,1,2]
b1[2]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
b1[3]=[1,1,2,1,1,2]
b1[4]=[1]*8
b1[5]=[1]*8
b1[6]=[1]*8+b1[0]
b1[7]=b1[1]
b1[8]=b1[2]
b1[9]=b1[3]
b1[10]=b1[4]
b1[11]=b1[5]
b1[12]=[1]*8

c1=[100,120,140,160,180,200,220,200,180,160,140,120,100]
use_synth :beep
in_thread do
 for i in 0..a1.length-1
 use_bpm c1[i]
 for j in 0..a1[i].length-1
 play a1[i][j],sustain: b1[i][j]*0.9,release: b1[i][j]*0.1,pan: p1
 sleep b1[i][j]
 end
 end
end

a2=[]
b2=[]
a2[0]=[:r]*8
a2[1]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
a2[2]=[:e4,:f4,:g4,:e4,:f4,:g4]
a2[3]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
a2[4]=[:c4,:g3,:c4,:c4,:g3,:c4]
a2[5]=[:r]*8
a2[6]=[:r]*8+a2[0]
a2[7]=a2[1]
a2[8]=a2[2]
a2[9]=a2[3]
a2[10]=a2[4]
a2[11]=a2[5]
a2[12]=[:r]*8
b2[0]=[1]*8
b2[1]=[1,1,1,1,1,1,1,1]
b2[2]=[1,1,2,1,1,2]
b2[3]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
b2[4]=[1,1,2,1,1,2]
b2[5]=[1]*8
b2[6]=[1]*8+b2[0]
b2[7]=b2[1]
b2[8]=b2[2]
b2[9]=b2[3]
b2[10]=b2[4]
b2[11]=b2[5]
b2[12]=[1]*8

c2=[100,120,140,160,180,200,220,200,180,160,140,120,100]
use_synth :blade
in_thread do
 for i in 0..a2.length-1
 use_bpm c2[i]
 for j in 0..a2[i].length-1
 play a2[i][j],sustain: b2[i][j]*0.9,release: b2[i][j]*0.1,pan: p2
 sleep b2[i][j]
 end
 end
end

a3=[]
b3=[]
a3[0]=[:r]*8
a3[1]=[:r]*8
a3[2]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
a3[3]=[:e4,:f4,:g4,:e4,:f4,:g4]
a3[4]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
a3[5]=[:c4,:g3,:c4,:c4,:g3,:c4]
a3[6]=[:r]*8+a3[0]
a3[7]=a3[1]
a3[8]=a3[2]
a3[9]=a3[3]
a3[10]=a3[4]
a3[11]=a3[5]
a3[12]=[:r]*8
b3[0]=[1]*8
b3[1]=[1]*8
b3[2]=[1,1,1,1,1,1,1,1]
b3[3]=[1,1,2,1,1,2]
b3[4]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
b3[5]=[1,1,2,1,1,2]
b3[6]=[1]*8+b3[0]
b3[7]=b3[1]
b3[8]=b3[2]
b3[9]=b3[3]
b3[10]=b3[4]
b3[11]=b3[5]
b3[12]=[1]*8

c3=[100,120,140,160,180,200,220,200,180,160,140,120,100]
use_synth :tri
in_thread do
 for i in 0..a3.length-1
 use_bpm c3[i]
 for j in 0..a3[i].length-1
 play a3[i][j],sustain: b3[i][j]*0.9,release: b3[i][j]*0.1,pan: p3
 sleep b3[i][j]
 end
 end
end

a4=[]
b4=[]
a4[0]=[:r]*8
a4[1]=[:r]*8
a4[2]=[:r]*8
a4[3]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
a4[4]=[:e4,:f4,:g4,:e4,:f4,:g4]
a4[5]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
a4[6]=[:c4,:g3,:c4,:c4,:g3,:c4]+a4[0]
a4[7]=a4[1]
a4[8]=a4[2]
a4[9]=a4[3]
a4[10]=a4[4]
a4[11]=a4[5]
a4[12]=[:c4,:g3,:c4,:c4,:g3,:c4]
b4[0]=[1]*8
b4[1]=[1]*8
b4[2]=[1]*8
b4[3]=[1,1,1,1,1,1,1,1]
b4[4]=[1,1,2,1,1,2]
b4[5]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
b4[6]=[1,1,2,1,1,2]+b4[0]
b4[7]=b4[1]
b4[8]=b4[2]
b4[9]=b4[3]
b4[10]=b4[4]
b4[11]=b4[5]
b4[12]=[1,1,2,1,1,2]

c4=[100,120,140,160,180,200,220,200,180,160,140,120,100]
use_synth :saw
in_thread do
 for i in 0..a4.length-1
 use_bpm c4[i]
 for j in 0..a4[i].length-1
 play a4[i][j],sustain: b4[i][j]*0.9,release: b4[i][j]*0.1,pan: p4
 sleep b4[i][j]
 end
 end
end

So now that we have the 4 part tune set up, how can we control it, so that we can play and stop at will, and also specify at which bar of the 28 available we start playing? In order to do this we need to supply externally two pieces of information. First a play/stop code (which is set to 1 to play and -1 to stop) and secondly a bar start code bs (which is set to the bar number from 1…28). In fact we can set bs greater than 28, and the program will determine that that number is too big and clamp it to a maximum of 28 (or whatever the relevant value is for the piece we are playing). So we have two problems to look at. First, generating the play/stop code and the bs code and sending them to Sonic Pi from our remote GUI, and secondly receiving and decoding them in the Sonic Pi piece we are playing, and acting upon them.

As a break from Sonic Pi, we will look first at the remote control GUI. Last summer, I was introduced to Processing by Hiroshi Tachiban, who had written a script using this application to convert midi files to Sonic Pi code, via an intermediate MusicXML format. I have used and developed this script, and in the process have begun to appreciate the power of Processing. In the conversion script its graphical properties are not utilised, but looking at the examples on the Processing site (processing.org), and prompted by others who were using it to generate OSC commands (which are utilised by Sonic Pi to communicate between its various parts: GUI, Server, Scsynth), I decided to try to use it for this purpose. It also has the advantage of being easily installable on Macs, Windows PCs, linux and Raspberry Pi, so it can be utilised on all the Sonic Pi platforms.

Basically I decided to use a very small GUI screen which had 5 small clickable rectangles on it. Two of these were used to select Play or Stop, and the remaining three let you increase the current (displayed) bar start, decrease it, or reset it to bar 1 (the start of the piece). The current bar start was held internally in the GUI code, and transmitted to Sonic Pi whenever the Play or Stop rectangle was clicked. This enabled subsequent Play commands to start at the current bar start setting repeatedly, without having to set it each time. Processing refers to the graphics window it sets up using an xy coordinate system where the origin x=0,y=0 is top left, x increases moving right across the screen, and y increases moving down the screen. I set a small screen size of 100 x 120 pixels, as I didn't want the GUI to use much screen real estate. Processing has built in variables, mouseX and mouseY, giving the mouse position. It can also determine when the mouse button is pressed, and when the mouse is moved. The program for the Processing window is called a sketch. Those of you who have used the Arduino interface on a Mac or Windows will be at home with the programming interface, as the Arduino IDE is written using Processing; both use Java. Unfortunately the code would not render properly in WordPress, so you can see it by using this paste-link instead

To try out the program, you first need to download Processing from processing.org. There are versions available for Mac, Windows PC, linux and Raspberry Pi (use the linux ARM version). Unzip and run the application. Create a new sketch called StartBarSelector and paste in the code. You will also need to load the oscP5 library via the Sketch–>Import Library… menu. I suggest you run the app initially from the Processing IDE. Later, when you are sure it is working OK, you can create a standalone app using Export Application… from the File menu.

If you view the code you will see that the first entry is import oscP5.*;
This utilises an external library, oscP5, which you must add as detailed above. The script creates a small (100x120 pixel) window which contains 5 small rectangles. These are set up as clickable zones, and trigger various operations when they are clicked. The three rectangles in a vertical line control the value of a variable bs. This is initially set to 1, but can have its value increased by clicking the top rectangle marked bs+. The bottom rectangle marked bs- decreases its value, provided that it is greater than 1, and the central rectangle resets its value to 1. When one of these rectangles is clicked it turns green and remains so until the mouse button is released. The rectangle on the right is filled in red and has the label play. When it is clicked it sends the value 1 to Sonic Pi, together with the current value of the bs (bar start) variable. It will also highlight the left hand rectangle in red and reveal the caption stop, whilst removing its own caption and recolouring the play rectangle to the background colour. When the left hand (stop) rectangle is clicked it sends the value -1 to Sonic Pi, together with the current value of the bs variable.
Much of the code deals with the generation of the rectangles and the captions, and with the detection of the mouse pointer position when the mouse is clicked. The initial function setup creates the window and the rectangles and sets their initial states and colours. It also sets up an OSC UDP socket which is used to communicate with Sonic Pi, which is assumed to be running on the local machine, although the controller will work with a remote Sonic Pi provided that it is run with the appropriate address inserted for SonicPi in the program. The display is set to refresh 60 times a second.

The second function sendOscData composes the OSC message to be sent to Sonic Pi. This consists of the "address" /transport followed by two integer arguments: tr, which is either 1 for play or -1 for stop, and bs, an integer specifying the start bar to use.

The third function draw saves the mouse position given by mouseX and mouseY to two variables mx and my, and then proceeds through a series of tests which detect whether the mouse was clicked inside one of the five rectangles, and if so sets the values of the tr and bs variables as appropriate. In the case of the three "bs" rectangles it also shades the rectangle green by repainting it in that colour, and in the case of the stop and play rectangles it clears the current rectangle colour and caption and enables the colour fill and caption for the other one, again repainting the rectangles and pasting a background filled area to hide the captions before rewriting the appropriate one. These two sections of code also send OSC messages with the current data to Sonic Pi.

After these sections of code which test where the mouse has been clicked, 6 lines of code paste over the old displayed bs value and then repaint the current value. When the mouse button is released, the bottom section of the draw loop activates and removes the green fill from all of the "bs" rectangles. A flag variable also makes sure that only one OSC message is sent when a click in the play or stop rectangles is detected, even though the mouse button may remain depressed for several passes of the draw loop. Finally a small delay is added, set so that the bs value will increase fairly rapidly if the bs+ button is clicked and the mouse held down, whilst still allowing it to be clicked quickly to give a single increment in the value.

Now we turn our attention back to Sonic Pi. In order to detect the OSC messages being sent from the StartBarSelector GUI, I have used an undocumented feature of Sonic Pi, which is that it can respond to cues received in the form of OSC messages sent to port 4559. There is a slight difference in that response between SP version 2.11 (the current release version on the Raspberry Pi) and SP version 2.11.1 (the current release version on the Mac and Windows PC). However it only requires a minor change to a few lines in the program. Here Sam Aaron, the lead developer of Sonic Pi, would want me to point out that this is an experimental feature and that it may well change in form, or even disappear, in future versions, although currently it is still the same in the latest development version 2.12.0-midi-alpha, and of course is fixed in SP 2.11 and 2.11.1.

A simple program which can be used to test the operation of the external GUI is shown below. It also illustrates the small code difference required between SP version 2.11 and 2.11.1 in decoding the received messages.

#test communication with StartBarSelector processing GUI
define :ver do
 return version.to_s.split('.')
end

loop do
 v= version
 tr=0
 until tr==1
 s=sync '/transport'
 if version.to_s=="v2.11.1" or ver[2].to_i > 11
 tr=s[0]
 bs=s[1]
 else
 tr=s[:args][0]
 bs=s[:args][1]
 end
 
 puts "play/stop tr variable is "+tr.to_s
 puts "bs bar start variable is "+bs.to_s
 end
 
 until tr==-1
 s=sync '/transport'
 if version.to_s=="v2.11.1" or ver[2].to_i > 11
 tr=s[0]
 bs=s[1]
 else
 tr=s[:args][0]
 bs=s[:args][1]
 end
 
 puts "play/stop tr variable is "+tr.to_s
 puts "bs bar start variable is "+bs.to_s
 end
end

The operative line is s=sync ‘/transport’
This acts like a standard sync command in Sonic Pi, only this time it waits until an OSC message with the address '/transport' is received on port 4559. Depending on the SP version, the arguments added to this OSC message are extracted and displayed by puts statements. You can see how tr is set to 1 when play is clicked and -1 when stop is pressed, and in each case the bs value is also sent.

So all that remains is to add similar code to the program we wish to control, and then to figure out ways of getting the program to stop and to play again when a stop and play command are received in succession. The other problem to solve is how to set the program to start at a specified bar rather than at the beginning. The stop/play problem is solved by means of an external ruby helper program. This utilises the sonic-pi-cli gem, which enables you to send commands to Sonic Pi from a command line. First you have to install it. On a Mac this should be done using the system ruby installation, as the root user, with the command
sudo /usr/bin/gem install sonic-pi-cli
This should install /usr/local/bin/sonic_pi which should automatically be placed in your PATH so that it can be accessed with sonic_pi. If you are using a Raspberry Pi then you can use the procedure below:
Start a terminal window and type the following:

sudo mkdir /var/lib/gems
sudo chown pi /var/lib/gems
sudo chown pi /usr/local/bin
gem install sonic-pi-cli

When the install has completed, reset the ownership of /usr/local/bin

sudo chown root /usr/local/bin

It is a good idea to test it at this stage. Get any piece you like running in Sonic Pi, then from a command line type sonic_pi stop and the sonic pi cli should stop the program from running. Also if you type sonic_pi by itself you should get some information back from the program.
On my system, Sonic Pi programs to play are stored in ~/Documents/SPfromXML and the program we are going to control is called FrereJaquesControlled-RF.rb with a path of
~/Documents/SPfromXML/FrereJaquesControlled-RF.rb
I put the -RF on the end of the file name to remind me that the file is too long to run in a Sonic Pi buffer, and requires to be run from Sonic Pi using the command
run_file “~/Documents/SPfromXML/FrereJaquesControlled-RF.rb”
Similarly the helper program which I use is stored in the same folder and is called FrereJaquesControlled-RFauto.rb It has the contents below

#!/usr/bin/ruby
`/usr/local/bin/sonic_pi stop`
`/usr/local/bin/sonic_pi "run_file '~/Documents/SPfromXML/FrereJaquesControlled-RF.rb'"`

This sends two commands to Sonic Pi via the sonic_pi gem. The first causes Sonic Pi to stop all running programs. The second issues a run_file command which restarts the FrereJaques program again from the beginning.
Now we are in a position to build up the FrereJaquesControlled-RF program. We start with the four part round introduced at the beginning of this post. We have seen how we can wait for and detect a sync OSC message which can send us a command to start the program, but also the bar at which to start. So the next problem is how can we set up the program to react to this?
What we need to do is to determine which of the 13 sections a1[0] to a1[12] we need to select, and whether we are starting at the beginning of that selected section or in the middle of it. In order to help us we add some functions at the beginning of the program. The complete FrereJaquesControlled-RF.rb program is listed below

#FrereJaquesControlled.rb
restart="~/Documents/SPfromXML/FrereJaquesControlled-RFauto.rb"
use_debug false #turn off log_synths
use_arg_checks false #turn off log_cues
bs=1 #starting bar number: give it an initial value here
bpba=[4]*13 #set up list of section beats per bar

#puts bpba
st=[] #holds info for start section and remaining bars to process: set global here
#part pan positions
p1=-1;p2=-0.33;p3=0.33;p4=1

############### define functions used in the script
define :numbeats do |durations| #return number of crotchet beats in a note durations list
 l=0.0
 durations.each do |d|
 l+=d
 end
 return l
end

#find starting section, and number of bars in that section to be processed
#to determine the starting note index
define :startDetails do |bn,bNumberSecStart,durations|
 startSecIndex=0
 remainingBars=bn
 #iterate until the remaining bn is within the section
 while bn>bNumberSecStart[startSecIndex]
 remainingBars=bn-bNumberSecStart[startSecIndex]
 startSecIndex+=1
 end
 #return the section to start playing and number of bars to determine starting note index
 return startSecIndex-1,remainingBars
end

define :getmatchd do |bn,bpb,durations| #works out the note index for a given bar number
 matchbeat=(bn-1)*bpb #target number of beats to find
 l=0.0;x=0
 until l>=matchbeat || (l-matchbeat).abs < 0.0625 #0.0625 is smallest quantisation to consider
 l+=durations[x]
 x+=1
 end
 return [x ,l-matchbeat] #return the matched beat note index, plus sleep for tied note (if any)
 #nb if the bar start coincides with a tied note, then the part will start with the next
 #note and a sleep command will be issued for the remaining duration of the tied note
end

define :ver do
 return version.to_s.split('.')
end
##########################
#wait for an OSC cue to be received from the Processing GUI sketch
#This sends two parameters: First controls Play (1) Stop (-1)
#second gives requested bar start number
tr=0
until tr==1 #wait for PLAY cue from processing GUI (first parameter will be set to 1)
 s=sync '/transport'
 if version.to_s=="v2.11.1" or ver[2].to_i > 11
 tr=s[0]
 bs=s[1]
 else
 tr=s[:args][0] #tr governs play/stop value is 1 for play -1 for stop
 bs=s[:args][1] #bs is start bar,the second parameter received
 end
end

puts "BS selected is "+bs.to_s
##########################
#start polling for an OSC cue to stop playing from the Processing GUI sketch
#this runs continuously in a thread
in_thread do #this thread polls for an OSC cue to stop the program
 tr=0
 until tr==-1 #the first parameter will be set to -1 for a STOP signal
 s=sync '/transport'
 if version.to_s=="v2.11.1" or ver[2].to_i > 11
 tr=s[0]
 bs=s[1]
 else
 tr=s[:args][0] #tr governs play/stop value is 1 for play -1 for stop
 bs=s[:args][1] #bs is start bar,the second parameter received
 end
 end
 #stop command detected
 puts"stopping"
 puts "running sonic pi cli script to restart"
 
 system(restart+" &") #run the auto script to stop and rerun the code
end
##########################
with_fx :reverb, room: 0.8 do
 with_fx :level,amp: 0.7 do
 #part 1 data
 a1=[]
 b1=[]
 a1[0]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
 a1[1]=[:e4,:f4,:g4,:e4,:f4,:g4]
 a1[2]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
 a1[3]=[:c4,:g3,:c4,:c4,:g3,:c4]
 a1[4]=[:r]*8
 a1[5]=[:r]*8
 a1[6]=[:r]*8+a1[0]
 a1[7]=a1[1]
 a1[8]=a1[2]
 a1[9]=a1[3]
 a1[10]=a1[4]
 a1[11]=a1[5]
 a1[12]=[:r]*8
 b1[0]=[1,1,1,1,1,1,1,1]
 b1[1]=[1,1,2,1,1,2]
 b1[2]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
 b1[3]=[1,1,2,1,1,2]
 b1[4]=[1]*8
 b1[5]=[1]*8
 b1[6]=[1]*8+b1[0]
 b1[7]=b1[1]
 b1[8]=b1[2]
 b1[9]=b1[3]
 b1[10]=b1[4]
 b1[11]=b1[5]
 b1[12]=[1]*8
 c1=[100,120,140,160,180,200,220,200,180,160,140,120,100]
 ###################### calculate starting data

 #calc bar offset for start of each tempo change. Held in bNumberSecStart list
 bNumberSecStart=[]
 bNumberSecStart[0]=0
 bNumber=0
 b1.length.times do |z|
 bNumber+= numbeats(b1[z])/bpba[z]
 bNumberSecStart[z+1]=bNumber
 end
 #puts bNumberSecStart #for debugging
 #calc number of bars in the piece
 bmax=bNumberSecStart[b1.length]
 puts "Total number of bars="+bmax.to_s
 #adjust requested bar start number if too large
 if bs>bmax
 bs=bmax
 puts "Start bar exceeds piece length: changed to :"+bs.to_s
 end
 #calculate info for starting sector containing bar start requested,
 #and number of remaining bars to process to get starting index
 st=startDetails(bs,bNumberSecStart,b1)
 startSec=st[0]
 remainingBars=st[1]
 puts "Start Section="+st[0].to_s
 puts "Remaining Bars to find starting index="+st[1].to_s
 puts

 ################### now ready to process and play each part in turn (played together in threads)
 #each part is processed in exactly the same way

 #calc starting index and any sleep for tied notes for part 1
 sv1=getmatchd(remainingBars,bpba[startSec],b1[startSec])
 
 puts "1: "+sv1.to_s #print start index and sleep time
 use_synth :beep
 in_thread do
 for i in startSec..a1.length-1
 use_bpm c1[i]
 sleep sv1[1] #sleep for tied note (>0 if tied)
 for j in sv1[0]..a1[i].length-1
 play a1[i][j],sustain: b1[i][j]*0.9,release: b1[i][j]*0.1,pan: p1
 sleep b1[i][j]
 end
 sv1=[0,0] #reset so subsequent iterations of j loop in full and no tied sleep
 end
 end


 a2=[]
 b2=[]
 a2[0]=[:r]*8
 a2[1]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
 a2[2]=[:e4,:f4,:g4,:e4,:f4,:g4]
 a2[3]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
 a2[4]=[:c4,:g3,:c4,:c4,:g3,:c4]
 a2[5]=[:r]*8
 a2[6]=[:r]*8+a2[0]
 a2[7]=a2[1]
 a2[8]=a2[2]
 a2[9]=a2[3]
 a2[10]=a2[4]
 a2[11]=a2[5]
 a2[12]=[:r]*8
 b2[0]=[1]*8
 b2[1]=[1,1,1,1,1,1,1,1]
 b2[2]=[1,1,2,1,1,2]
 b2[3]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
 b2[4]=[1,1,2,1,1,2]
 b2[5]=[1]*8
 b2[6]=[1]*8+b2[0]
 b2[7]=b2[1]
 b2[8]=b2[2]
 b2[9]=b2[3]
 b2[10]=b2[4]
 b2[11]=b2[5]
 b2[12]=[1]*8 
 c2=[100,120,140,160,180,200,220,200,180,160,140,120,100]
 #calc starting index and any sleep for tied notes for part 2
 sv2=getmatchd(remainingBars,bpba[startSec],b2[startSec])

 puts "2: "+sv2.to_s #print start index and sleep time
 use_synth :blade
 in_thread do
 for i in startSec..a2.length-1
 use_bpm c2[i]
 sleep sv2[1] #sleep for tied note (>0 if tied)
 for j in sv2[0]..a2[i].length-1
 play a2[i][j],sustain: b2[i][j]*0.9,release: b2[i][j]*0.1,pan: p2
 sleep b2[i][j]
 end
 sv2=[0,0] #reset so subsequent iterations of j loop in full and no tied sleep
 end
 end

 a3=[]
 b3=[]
 a3[0]=[:r]*8
 a3[1]=[:r]*8
 a3[2]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
 a3[3]=[:e4,:f4,:g4,:e4,:f4,:g4]
 a3[4]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
 a3[5]=[:c4,:g3,:c4,:c4,:g3,:c4]
 a3[6]=[:r]*8+a3[0]
 a3[7]=a3[1]
 a3[8]=a3[2]
 a3[9]=a3[3]
 a3[10]=a3[4]
 a3[11]=a3[5]
 a3[12]=[:r]*8
 b3[0]=[1]*8
 b3[1]=[1]*8
 b3[2]=[1,1,1,1,1,1,1,1]
 b3[3]=[1,1,2,1,1,2]
 b3[4]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
 b3[5]=[1,1,2,1,1,2]
 b3[6]=[1]*8+b3[0]
 b3[7]=b3[1]
 b3[8]=b3[2]
 b3[9]=b3[3]
 b3[10]=b3[4]
 b3[11]=b3[5]
 b3[12]=[1]*8
 c3=[100,120,140,160,180,200,220,200,180,160,140,120,100]
 #calc starting index and any sleep for tied notes for part 3
 sv3=getmatchd(remainingBars,bpba[startSec],b3[startSec])

 puts "3: "+sv3.to_s #print start index and sleep time

 use_synth :tri
 in_thread do
 for i in startSec..a3.length-1
 use_bpm c3[i]
 sleep sv3[1] #sleep for tied note (>0 if tied)
 for j in sv3[0]..a3[i].length-1
 play a3[i][j],sustain: b3[i][j]*0.9,release: b3[i][j]*0.1,pan: p3
 sleep b3[i][j]
 end
 sv3=[0,0] #reset so subsequent iterations of j loop in full and no tied sleep
 end
 end

 a4=[]
 b4=[]
 a4[0]=[:r]*8
 a4[1]=[:r]*8
 a4[2]=[:r]*8
 a4[3]=[:c4,:d4,:e4,:c4,:c4,:d4,:e4,:c4]
 a4[4]=[:e4,:f4,:g4,:e4,:f4,:g4]
 a4[5]=[:g4,:a4,:g4,:f4,:e4,:c4,:g4,:a4,:g4,:f4,:e4,:c4]
 a4[6]=[:c4,:g3,:c4,:c4,:g3,:c4]+[:r]*6 #Tied note added here: to show how its dealt with start at bars 14 then 15
 a4[7]=a4[1]
 a4[8]=a4[2]
 a4[9]=a4[3]
 a4[10]=a4[4]
 a4[11]=a4[5]
 a4[12]=[:c4,:g3,:c4,:c4,:g3,:c4]
 b4[0]=[1]*8
 b4[1]=[1]*8
 b4[2]=[1]*8
 b4[3]=[1,1,1,1,1,1,1,1]
 b4[4]=[1,1,2,1,1,2]
 b4[5]=[0.5,0.5,0.5,0.5,1,1,0.5,0.5,0.5,0.5,1,1]
 b4[6]=[1,1,2,1,1,4]+[1]*6 #Tied note added here: to show how its dealt with start at bars 14 then 15
 b4[7]=b4[1]
 b4[8]=b4[2]
 b4[9]=b4[3]
 b4[10]=b4[4]
 b4[11]=b4[5]
 b4[12]=[1,1,2,1,1,2]
 c4=[100,120,140,160,180,200,220,200,180,160,140,120,100]
 #calc starting index and any sleep for tied notes for part 4
 sv4=getmatchd(remainingBars,bpba[startSec],b4[startSec])

 puts "4: "+sv4.to_s #print start index and sleep time

 use_synth :saw
 in_thread do
 for i in startSec..a4.length-1
 use_bpm c4[i]
 sleep sv4[1] #sleep for tied note (>0 if tied)
 for j in sv4[0]..a4[i].length-1
 play a4[i][j],sustain: b4[i][j]*0.9,release: b4[i][j]*0.1,pan: p4
 sleep b4[i][j]
 end
 sv4=[0,0] #reset so subsequent iterations of j loop in full and no tied sleep
 end
 end

 end #level
end #fx

At the beginning of the program, restart is set as a variable holding the command to run the FrereJaquesControlled-RFauto program referred to above. The next two lines turn off some of the output in the log, to make it easier to see the printed statements the program produces. bpba is an array or list which holds the number of beats per bar for each of the 13 sections in the piece. In this example the time signature of 4/4, or 4 crotchets per bar, is constant throughout, but the code will handle pieces where the time signature changes between sections, as is the case in the second example detailed later on. In this case each of the 13 entries is set to 4.
st[ ] is an array which will hold details about the starting section and the remaining bars to process to determine the starting note. p1 to p4 are the pan positions for each of the four parts.
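
Returning to bpba for a moment: for a piece which changes time signature part way through (as the Beatus Vir example mentioned at the end of this article does), you would simply mix values in the list, for example (hypothetical figures, not taken from that piece):

bpba=[4]*10+[6]*3 #first ten sections in 4/4, last three in 6/4
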

The first additional procedure is numbeats. This calculates the number of crotchet beats in a given section. Thus puts numbeats(b1[0]) would print 8 in the log window, as there are 8 crotchets in the first section, but puts numbeats(b1[6]) would give 16 as this middle section is twice as long as the others.

The second added procedure, startDetails, determines which section we need to start playing from, and how many bars remain to be processed from the start of this section in order to determine the index or position of the first note to be played. Although it is listed at this point, to keep the procedure definitions together, it requires some data which can only be ascertained after the note and duration data of the first part have been defined. If you look just below that point in the program listing you will see some code which sets up the list bNumberSecStart. This calculates and holds the starting bar number of each section, but counting from 0 rather than 1. Basically it calculates the number of beats in each section and divides it by the number of beats per bar for that section, which is held in the bpba array referred to above. Also in this section of the program the maximum number of bars in the piece, bmax, is calculated, and it is used to limit the requested start bar bs if this is too high. Returning to the startDetails function, this is fed with three parameters: the start bar requested bs, the list of starting bar numbers for each section bNumberSecStart, and the array b1 which holds the lists of durations for the 13 sections b1[0] to b1[12].
It calculates which section to start from by subtracting the number of bars from each section in turn from the value of bs, until the remaining number is less than the number in the current section. The starting section is then one less than this number, and the remaining bars are those from which to work out what the starting note index for the bar request is.
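
As a concrete check, here is what the calculation gives for a requested start at bar 15 of this piece (my own trace, using the section start bars the program works out, section 6 being the 4 bar one; the third argument is accepted but not actually used in the calculation):

bNumberSecStart=[0,2,4,6,8,10,12,16,18,20,22,24,26,28]
puts startDetails(15,bNumberSecStart,b1)
#prints [6, 3] - start in section 6, with 3 bars still to account for
#(the program itself shows 3.0, as it holds the bar counts as floats)
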

This task is now passed to the third additional procedure, getmatchd. This is passed three parameters: bn, the number of bars remaining to be processed after removing those which have already taken place in the previous sections; the beats per bar bpb for the current section; and the list of durations for that section (durations), eg b1[6] if we are going to start within that section. matchbeat is set to the target beat to find within the section; the -1 adjusts for the fact that the start bar counts from 1, whereas the calculation of elapsed bars essentially counts from 0. We then set up two variables l and x. x indexes the position, starting from 0, as we move through the section in a loop, while l holds a running total of the durations within the loop. We continue going round the loop until the increasing value of l matches the target beat count in matchbeat within an error less than the smallest note duration we will use, OR until l just exceeds this target. This latter case will occur if there is a tied note over the target beat. For example, if we have two bars each with four crotchets and the fourth crotchet is tied to the fifth one "over the bar line" (which is the target start bar) so that we sound a minim note, then the match will actually be after the fifth crotchet, at the start of the sixth crotchet. In this case, we will start that part playing at the sixth crotchet, but we will insert a rest so that it is delayed and will start playing in synchronism with the other 3 parts. So the procedure getmatchd will return two pieces of information: first the starting note index to be played in the section, and secondly the rest value (if any) required, which will allow for a tied note match.
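
To put numbers on the tied note case, here is the call for part 4 when starting 3 bars into section 6 (these are the same figures the program prints for a bar 15 start, shown further down):

#part 4 durations for section 6 are [1,1,2,1,1,4]+[1]*6 - the 4 is the tied note
puts getmatchd(3,4,[1,1,2,1,1,4,1,1,1,1,1,1])
#prints [6, 2.0] - start at note index 6, after sleeping 2 beats for the tail of the tied note
#for comparison, part 1 (b1[6] is sixteen crotchets) gives [8, 0.0]
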

The final small procedure uses some ruby code to split the returned version number of the Sonic Pi being used, enabling us to determine whether it is version 2.11 or 2.11.1 and so adjust the OSC sync code as shown previously. In the first part of the main code, we wait for an OSC sync to be received containing the play command code tr=1. Once this has been received we proceed to the next part, which is placed in a thread so that it can continue to operate as the rest of the program proceeds on its way. This thread waits for another OSC message to be received, this time with tr set to -1, the stop code. When this is received it uses a system command to call the FrereJaquesControlled-RFauto.rb program, using the command line variable restart set up at the start of the program. As discussed, this will cause the program to stop and then rerun, awaiting another play command from the remote GUI.

The remainder of the program is largely the same as the 4 part round we discussed towards the beginning of this article. However there are one or two changes. First, the whole program is wrapped in two fx calls: with_fx :reverb adds some reverb to the round as it is played, and with_fx :level sets an overall :amp value of 0.7 for the volume. Below the data for the first part is the extra code to work out bNumberSecStart and bmax as discussed above. The startDetails procedure is called to work out the starting section and the number of bars to be processed for this part. In fact the same data can be used for all four parts in this example, so it doesn't have to be calculated more than once. The results are stored in the list st[ ], with startSec being set to st[0] and remainingBars to st[1]. From there on each of the four parts is processed in exactly the same way.

First we use the getmatchd procedure to find the starting note index and the sleep value (if any) to compensate for a tied note. This piece wouldn't normally have any tied notes, but to illustrate what happens I have introduced one at the end of the first tune played in part 4. You will see this in the comments alongside the lines for a4[6] and b4[6]; you can compare them with the corresponding lines in the second listing in the article. We will discuss this further when the program is played. The call to the getmatchd procedure is sv1=getmatchd(remainingBars,bpba[startSec],b1[startSec]); the two values this returns are stored in sv1[0], the starting note index, and sv1[1], the sleep value (if any).

Now all we have to do to adjust the starting note is to alter the starting point of the two loops i and j which control the playing of the notes. Instead of for i in 0..a1.length-1 we now have for i in startSec..a1.length-1, and instead of for j in 0..a1[i].length-1 we now have for j in sv1[0]..a1[i].length-1. Also, just before the j loop, we insert sleep sv1[1]. If there is a tied note involved, this sleep value will be greater than 0 and will allow for the fact that this loop is starting a bit later than the other parts. Finally we reset the values in sv1 to 0 after the j loop has finished, so that for subsequent sections all of the loop contents are used, with j starting from 0 and with no sleep value before the section start. If you look at the code you will see that the three remaining parts are processed in exactly the same way, each with their own sv values sv2, sv3 and sv4 calculated and applied.

So after that mammoth discussion we can try out the code. Because the program is so long it will not run directly in a Sonic Pi buffer. Instead, in an empty buffer you need to type

run_file “~/Documents/SPfromXML/FrereJaquesControlled-RF.rb”

Obviously alter the path if you have the file stored in another folder. Make sure that the FrereJaquesControlled-RFauto.rb file is in place as well, that the restart variable is pointing to its correct location, and that the auto file has the correct location of the FrereJaquesControlled-RF.rb file in it. It is easy to get one of these wrong, so check them carefully. Now start the StartBarSelector GUI and then run the Sonic Pi program. It won't do a lot, as it is waiting for a play command from the GUI. Click on the red play rectangle and all being well it should start playing. Click on the red stop rectangle and it should stop AND relaunch the Sonic Pi program. All being well you can click on play again to restart it. If that doesn't happen, look at the debugging section later on. You should be able to play and stop at will. The buttons are highlighted to show which should be pressed next. You should also be able to alter the displayed start bar at any time, using the three rectangles provided; the next time you press play, the displayed value will be implemented. To see how the tied note works, start from bar 14. You should hear the tied note held over and playing at the same time as part 1 restarts for the second time. Now change and start from bar 15. You will hear part 1 starting, but the second half of the tied note in part 4 is replaced with a sleep command, and part 4 starts from the next note (which is a rest), so you will hear nothing from part 4 until it comes in (at the correct time) for the second time through. You can have a look at the screen output, where you will see

“BS selected is 15”
“Total number of bars=28.0”
“Start Section=6”
“Remaining Bars to find starting index=3.0”

“1: [8, 0.0]”
“2: [8, 0.0]”
“3: [8, 0.0]”
“4: [6, 2.0]”

You will see that part 4 starts section 6 with an index of 6 and a sleep value of 2.0, corresponding to the overrun of the tied note, whereas the other three parts have a 0.0 sleep value. Part 4 starts playing with index 6, which is the seventh entry in section 6, on the third beat of the third bar, whereas the other parts, which all have 8 crotchet rests at the start of section 6, start on the 9th entry (index 8) at the start of the third bar in that section. The sleep 2 will delay part 4 so that it starts on the third beat in synchronism with the other three parts when they get there.

a4[6]=[:c4,:g3,:c4,:c4,:g3,:c4]+[:r]*6  #Tied note added here: to show how its dealt with start at bars 14 then 15
b4[6]=[1,1,2,1,1,4]+[1]*6   #Tied note added here: to show how its dealt with start at bars 14 then 15

For comparison part 1 without the tied note is shown below

a1[6]=[:r]*8+a1[0]
b1[6]=[1]*8+b1[0]

Debugging
It can be quite tricky to sort things out if the system doesn’t work. From my experience the usual culprit is an incorrect path or filename in the places where these are included in the scripts. Check very carefully the saved names of the two programs FrereJaquesControlled-RF.rb and FrereJaquesControlled-RFauto.rb and the places where these are referenced in the main program and in the auto program.
You can check the behaviour of the auto program by running it from the command line with
/usr/bin/ruby ~/Documents/SPfromXML/FrereJaquesControlled-RFauto.rb
You can check the StartBarSelector GUI with the test program detailed in the article. You can also run the GUI from the Processing IDE and uncomment the debugging print statements in the program, which will give you some output in the terminal window beneath the program.

You can download all the programs in the project, including the source file for the GUI (in text format) which you can paste into a new blank sketch window and then save as StartBarSelector. The link is here

Finally, there is also a second example playing a slightly longer piece by Monteverdi (Beatus Vir) which can also be controlled by the GUI. It incorporates a time signature change at bar 62 which is also handled by the software. Try starting two bars earlier to hear the time signature change take place from 4/4 to 6/4 time.
The files for this second example, BeatusVirControlled-RF.rb and BeatusVirControlled-RFauto.rb, are included with the link above.

I have recorded a series of 6 videos which go through the operation of these programs. You can access them here.

Using Processing to control Sonic Pi (updated)

UPDATE ADDED FOR SONIC PI 3.0 AND LATER

The line in the Sonic Pi program

nv=sync "/notesend"

is altered to

nv=sync "/osc/notesend"

Having been inspired by the superb Christmas Card from MeHackit (do look at it and download and play with the code) I resolved to take another look at Processing and how it can be used to control Sonic Pi. Previously I had only used it to run a conversion script to convert MusicXML files to Sonic Pi format, but seeing this Christmas Card showed that it is capable of far more. As a newbie to using the program in earnest I looked at some of the numerous examples at https://processing.org/examples/ and saw how easy it was to get information on mouse coordinates. I chose the example constrain, which has a filled ellipse follow the mouse coordinates but bounded by an enclosing box, and decided to modify this to send coordinates to Sonic Pi using the OSC cue/sync features added in Sonic Pi 2.11. With reference to the MeHackit code, it was easy to add OSC commands to send the mouse x and y coordinates to Sonic Pi, where they could be received and scaled to control the note pitch and cutoff values for notes played with the tb303 synth. This is just an example. You could control any parameters you wish, or add detection for mouse down as well in a more complex example. The Processing script used is:

import oscP5.*; //libraries required
import netP5.*;

OscP5 oscP5;
NetAddress sonicPi;

float mx;
float my;
float easing = 1; //1 gives immediate following; smaller values make the ellipse ease towards the mouse
int radius = 24;
int edge = 100;
int inner = edge + radius;

void setup() {
size(640, 360);
noStroke();
ellipseMode(RADIUS);
rectMode(CORNERS);
oscP5 = new OscP5(this, 8000);
sonicPi = new NetAddress("127.0.0.1",4559);

}
void sendOscNote(float mx,float my) {
OscMessage toSend = new OscMessage("/notesend");
toSend.add(mx); //add mx and my values as floating numbers
toSend.add(my);
oscP5.send(toSend, sonicPi);
println(toSend);
}
void draw() {
background(51);

if (abs(mouseX - mx) > 0.1) {
mx = mx + (mouseX - mx) * easing;
}
if (abs(mouseY - my) > 0.1) {
my = my + (mouseY- my) * easing;
}

mx = constrain(mx, inner, width - inner);
my = constrain(my, inner, height - inner);
fill(76);
rect(edge, edge, width-edge, height-edge);
fill(255);
ellipse(mx, my, radius, radius);
sendOscNote(mx,my); //send the mx and my values to SP
}

To use it, install Processing 3 from https://processing.org/ and paste the script into the sketch window which opens when you run it. Save the sketch with a suitable name and location. You need to add the oscP5 library from the Sketch=>Import Library…=>Add Library menu selection. You can then run the sketch and the ellipse should follow the mouse around inside its rectangle.

On the Sonic Pi side paste in the code below and run it.

use_synth :tb303
live_loop :os do
  nv=sync "/osc/notesend" #for Sonic PI 3 and later
  #puts nv #uncomment and comment next line to see OSC input
  #scale the mx and my values in nv[0] and nv[1] appropriately
  #raw mx 124(left)-516(right) and raw my 124 (top)-236(bottom
  puts (40+(nv[0]-124)/392*60).to_i.to_s+" "+(190-nv[1]/2).to_s
  play (40+(nv[0]-124)/392*60).to_i,cutoff: (190-nv[1]/2),sustain: 0.04,release: 0.01
  sleep 0.05
end

Run the SP program, which will wait for input from the Processing script. Run the Processing script and move the mouse around to alter the position of the filled ellipse (circle). Moving it left-right will alter the pitch of the note from 40 to 100. Moving it up and down will alter the cutoff from 72 (bottom) to 128 (top). Note it is possible to silence the note by moving the mouse to the bottom right, where the pitch 100 is significantly above the cutoff value 72, so you hear nothing.
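
To check the scaling, you can plug the extreme raw values from the comments into the same formulas (a quick sketch of the arithmetic):

puts (40+(124-124)/392.0*60).to_i #=> 40, the note at the far left
puts (40+(516-124)/392.0*60).to_i #=> 100, the note at the far right
puts 190-124/2 #=> 128, the cutoff at the top
puts 190-236/2 #=> 72, the cutoff at the bottom
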

I hope that this simple example will inspire both you and me to explore further the use of Processing with Sonic Pi.

Here is a link to a video of the files in action