Converting MusicXML or MIDI files to work with Sonic Pi

Recently I came across a script by the Japanese user Hiroshi TACHIBANA, running under the Processing app, which can convert MusicXML files into a format that Sonic Pi can play. I have spent some time using and amending this script, and find it very useful for transcribing music to run on Sonic Pi that I might not otherwise have attempted, because of the time taken to type in all the notes and their durations.

I have written two articles about this. In the first I use the sample Japanese piece that Hiroshi was converting with the script, and in the second I introduce the amendments I have made to the script and walk through the conversion of a simple piece to illustrate its features. I also give a download link to several pieces that I have converted, in MIDI, MuseScore and Sonic Pi versions, so that you can try them out yourself.

The first article is here
The second article is here

Sonic-Pi controlled conversation between two McRoboFaces

IMG_4843

Following on from my previous project with a single McRoboFace, 4Tronix have kindly supplied me with a second face, enabling me to develop the idea and control two McRoboFaces with Sonic Pi. I have amended the previous project to feed the left and right audio channels to two separate ADC inputs on the Picon Zero board, and daisy-chained the two McRoboFaces (you merely connect the Dout pin of the first to the Din pin of the second), addressing the LEDs on the second McRoboFace with an offset of 17. I have developed routines in the Python driver program to control each face separately. Each mouth can be set to a fixed position (closed, open, smile or sad), or can be fed from the audio input via the ADC, so that it is triggered to open when the signal exceeds a preset threshold.
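The daisy-chain addressing and the threshold trigger can be sketched as below. Note this is illustrative only: `FACE_OFFSET`, `pixel_index` and `mouth_open` are hypothetical names, not the actual Picon Zero library API, and the threshold value is an assumption.

```python
# Sketch of the two-face addressing scheme and ADC threshold idea.
# The names below are illustrative stand-ins, not the real
# Picon Zero library calls used in the project.

FACE_OFFSET = 17  # the second McRoboFace's LEDs start at index 17

def pixel_index(face, led):
    """Map (face number 0 or 1, local LED 0-16) to the daisy-chained index."""
    return led + face * FACE_OFFSET

def mouth_open(adc_value, threshold=200):
    """Mouth is triggered open when the audio signal exceeds a preset threshold."""
    return adc_value > threshold
```

So LED 3 on the second face is addressed as pixel 20, exactly as if it were a longer single NeoPixel chain.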

In order to provide greater control, and to synchronise it with the audio feed from Sonic Pi, I have added Ruby routines to the Sonic Pi program which can send text strings to the Python program via a text file. These strings can set the mouth state for each face, and also alter the colours of the LEDs. Because there is only a common brightness setting for both faces (using PWM), if only one face is receiving audio I use that output to control the brightness of both faces. If both faces are set to receive audio then I set the brightness at a fixed value.
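On the Python side, control strings like these need to be turned into state changes. The project's actual string format isn't documented here, so this sketch assumes a simple comma-separated "key=value" scheme purely as an example:

```python
# Illustrative parser for the control strings sent from Sonic Pi.
# The "key=value,key=value" format is an assumption for this sketch;
# the project's real format may differ.

def parse_controls(text):
    """Turn e.g. 'mouth1=open,mouth2=audio,bright=50' into a dict."""
    state = {}
    for item in text.strip().split(","):
        if "=" in item:
            key, value = item.split("=", 1)
            state[key.strip()] = value.strip()
    return state
```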

The conversation is entirely controlled from Sonic Pi. It plays the audio for each face via a series of pre-recorded samples, giving each face a separate audio channel by setting the pan: value either to -1 or to 1. Before each sample is played, control signals are sent via the text file to set up the required state for each face. At the end of the presentation both faces receive audio input together as they “sing” along to a round of Frère Jacques. Finally a control signal is sent to reduce the brightness to zero, effectively switching off all the LEDs.

Writing and reading the data via a text file is perhaps not the most elegant way to do things, but it does seem to work OK. I used a technique I developed previously when reading in large numbers of sample files: the Sonic Pi program is “held up” using a cue and sync while the writing completes; otherwise you can run into “too far behind” errors. On the receiving side, at the start of its main loop the Python program polls for the existence of the text file and, if it finds one, reads the data and then deletes the file. It then alters parameters according to the received data. It took quite a lot of experimentation to get the timings and consistent operation of the two programs correct, but having done so, the final system is quite stable. I boost the audio level to amp: 4 in Sonic Pi, which gives a good signal for the ADC inputs to latch on to.
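The poll-read-delete step on the Python side can be sketched as follows. `CONTROL_FILE` is an assumed path for illustration; the project's actual filename and location may differ.

```python
import os

# Sketch of the receiving side: once per loop, the Python program
# checks for the control file, and reads then deletes it if present.
# CONTROL_FILE is an assumed name, not necessarily the project's path.

CONTROL_FILE = "/tmp/spcontrol.txt"

def poll_control_file(path=CONTROL_FILE):
    """Return the file's contents and remove it, or None if it is absent."""
    if not os.path.exists(path):
        return None
    with open(path) as f:
        data = f.read()
    os.remove(path)
    return data
```

Deleting the file after reading means its mere existence acts as the "new data ready" flag, so no further handshaking is needed between the two programs.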

Setup is fairly straightforward. The calibrate button used in the single-face project is utilised again, and sets separate offsets for each channel, and the code used to modulate the mouths is very similar to that used in the previous project. Once set, the Sonic Pi program can be run several times, leaving the Python program running continuously.

I have enjoyed this project, which has brought together Sonic Pi, Ruby and Python in an interesting way, not to mention recording and processing the samples with Audacity, and I hope you enjoy the video of the final system. I hope it may be possible to write up the system more fully in the future, but it will be quite a big job to do so.

You can see the video here


Sonic Pi driven Sound Bar Graph built on RasPiO ProHat

SPBarGraphMainLayout

I’ve recently added an article describing how to use the RasPiO ProHat to build a four-LED bar graph which can be driven by Sonic Pi. It also uses components from RasPiO’s Experimenter’s Kit. The article contains full constructional details, with links to the Python program and videos showing the construction and use of the project.

Read the article here

McRoboFace project to sing along with Sonic Pi

IMG_4773

Recently I was sent a pre-production version of the 4Tronix McRoboFace, which is a small face whose features light up with 17 NeoPixel LEDs. It is ideally suited as a companion to their Picon Zero controller board, which already has software in its library to accommodate it. I experimented with the board, and the result was a project to produce a talking/singing face, which can respond to an audio input fed either from a microphone, or internally from a Raspberry Pi running Sonic Pi.

An article here gives full constructional details and links to the software used.

4Tronix hope to launch a Kickstarter project for the McRoboFace soon. If you support it, this will give you a nice project with which to use it.

A video link to the project in action, explaining itself!

PS3-controlled Edukit robot

ps3robot

This is the Edukit 3 robot kit, revamped with a 4Tronix Picon Zero board and a Python program to control it with a PS3 wireless controller. The Pi Zero has the dongle for the PS3 controller in its single USB port, and is arranged to boot automatically into the Python program driving the robot. A red LED lights when the system is ready for action. The left-hand joystick is sensed and used to control the motor speed and direction. A button on the controller is sensed to initiate shutdown when finished, so the whole process is automatic and doesn’t require any keyboard, mouse or screen. The software is available for download at:
https://gist.github.com/rbnpi/ee3b60f200a4ef9b927d2faa0241f7b0
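The joystick-to-motor mapping is the heart of such a driver program. This is a minimal sketch of one common mixing scheme, not necessarily the one in the gist: it assumes axis values in the range -1.0 to 1.0 (as pygame-style joystick libraries report them), with positive y meaning forward.

```python
# Illustrative differential-drive mixing: forward/back on one axis,
# turn on the other, clamped to the valid motor speed range.
# Sign conventions and scaling are assumptions for this sketch.

def joystick_to_motors(x, y):
    """Mix turn (x) and forward/back (y) into (left, right) motor speeds."""
    left = max(-1.0, min(1.0, y + x))
    right = max(-1.0, min(1.0, y - x))
    return left, right
```

Pushing the stick straight forward drives both motors equally; pushing it sideways drives the motors in opposite directions, spinning the robot on the spot.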

A video of the robot in operation is here.