Preparing a new project

I had some minor success this evening preparing for a new project to build a MIDI controller utilizing raveloxmidi.

I wanted to use a Raspberry Pi Zero W as the main CPU and I set up a breadboarded version of how it’s going to work along with some Python code to poll the hardware and integrate with raveloxmidi.
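The polling code boils down to reading the hardware in a loop and firing MIDI bytes at raveloxmidi over a local socket. Here is a minimal sketch of that shape; the port number and the raw-MIDI-bytes-over-UDP framing are assumptions on my part, so check your raveloxmidi configuration for the actual values:

```python
import socket
import struct

# Assumed address/port for the local raveloxmidi socket; adjust to match
# your raveloxmidi configuration.
RAVELOXMIDI_ADDR = ("127.0.0.1", 5006)

def build_note_on(channel, note, velocity):
    """Pack a 3-byte MIDI NOTE ON message (status 0x9n)."""
    return struct.pack("BBB", 0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F)

def build_note_off(channel, note):
    """Pack a 3-byte MIDI NOTE OFF message (status 0x8n), velocity 0."""
    return struct.pack("BBB", 0x80 | (channel & 0x0F), note & 0x7F, 0)

def send_note(sock, msg):
    """Send a single MIDI message to the local raveloxmidi socket."""
    sock.sendto(msg, RAVELOXMIDI_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Example: strike and release middle C on channel 1 (0-indexed 0).
    send_note(sock, build_note_on(0, 60, 100))
    send_note(sock, build_note_off(0, 60))
```

In the real loop, the hardware poll sits between the two sends and decides the note and velocity.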

It actually works but the latency is too great so I’m going to switch the Pi Zero out with a Pi 2 or 3 if I can find one in my pile.

I’m pretty excited because I’ve gone from concept to working prototype in a couple of hours and I already know what the final product is going to look like.

raveloxmidi v0.5.0

This release is a big one for me in that it provides ALSA support. This opens up more possibilities for interfacing MIDI devices on the Raspberry Pi with music-making software like Logic on a remote machine.

I’ve never programmed anything for ALSA before, but the rawmidi interface made it easy: I can read a chunk of data as a binary blob and then process it. This made it possible to integrate into the existing data-reading code without too many changes.
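The read-a-blob-then-parse approach looks roughly like this (the real code is C against the ALSA rawmidi API; this Python sketch only shows carving a binary blob into channel-voice messages, and deliberately ignores running status and SysEx):

```python
# Expected lengths of channel-voice messages, keyed by the high nibble
# of the status byte. Note on/off, poly pressure, CC and pitch bend are
# 3 bytes; program change and channel pressure are 2.
_MSG_LEN = {0x8: 3, 0x9: 3, 0xA: 3, 0xB: 3, 0xC: 2, 0xD: 2, 0xE: 3}

def split_midi_blob(blob):
    """Split a raw byte blob (as read from a rawmidi device) into
    individual MIDI messages, skipping anything unrecognised."""
    messages = []
    i = 0
    while i < len(blob):
        length = _MSG_LEN.get(blob[i] >> 4)
        if length is None or i + length > len(blob):
            i += 1  # not a known status byte here: resync by one byte
            continue
        messages.append(bytes(blob[i:i + length]))
        i += length
    return messages
```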

It all works!

After giving up for a couple of days because it wasn’t doing what I wanted, I took everything apart and rewired it. This time, it functioned as I wanted it to!

There are more things that I need to do:

  1. Get another MCP3008 ADC so that it can be used to clear an input once it’s been read.
  2. Wire up more than 1 drum input using more LM324 op-amps.
  3. Rewrite the Python code in C so that latency is as low as possible. I may attach the Raspberry Pi to a network hub/switch so that WiFi isn’t part of the equation.
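For reference, the MCP3008 conversation is simple enough to sketch. Only the bit-level framing is shown here so it runs anywhere; the actual transfer would go through an SPI library such as spidev, e.g. `reply = spi.xfer2(list(mcp3008_request(channel)))`:

```python
def mcp3008_request(channel):
    """Three-byte command for a single-ended MCP3008 read:
    start bit, single-ended mode + channel in the high nibble, padding."""
    if not 0 <= channel <= 7:
        raise ValueError("MCP3008 has channels 0-7")
    return bytes([0x01, (0x08 | channel) << 4, 0x00])

def mcp3008_decode(reply):
    """Extract the 10-bit conversion result from the three reply bytes:
    the low 2 bits of byte 1 are the high bits, byte 2 is the rest."""
    return ((reply[1] & 0x03) << 8) | reply[2]
```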

Information burst

More reading, more changes of mind.

As my requirements for this MIDI interface are limited to sending notes, I came to the realisation that I don’t need to handle inbound journals, and the only journal I need to create outbound is a Chapter N for NOTE ON/OFF events. After playing with the reference implementation and copying most of the sample code from the RFC, I’ve taken it all out again and I’m starting from scratch. The RFC code isn’t the best example of a Chapter N journal, so I need to go over the details again and translate the concept. I’m sure it’s relatively simple; I just need to wrap my head around it (I’m a bear of little brain).
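For what it’s worth, my current reading of the Chapter N layout boils down to something like the sketch below. The S/Y bit defaults and the LOW=15/HIGH=0 “no OFFBITS follow” convention are my interpretation of the RFC and need double-checking against the spec:

```python
import struct

def chapter_n(note_ons):
    """Encode a Chapter N journal body with note-on logs only.

    note_ons: list of (note, velocity) pairs. Header is two octets:
    B bit (0 here) + 7-bit log count, then LOW/HIGH nibbles. Per my
    reading, LOW=15 and HIGH=0 signal that no OFFBITS octets follow.
    Each note log is two octets: S bit + note, Y bit + velocity; S and
    Y are set to 1 here as placeholder defaults.
    """
    header = struct.pack("BB", len(note_ons) & 0x7F, (15 << 4) | 0)
    logs = b"".join(
        struct.pack("BB", 0x80 | (note & 0x7F), 0x80 | (vel & 0x7F))
        for note, vel in note_ons
    )
    return header + logs
```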

I have 2 binaries right now. The first is a simple Avahi registration app that publishes the host, port 5004 and the service name (_apple-midi._udp). The second listens on the ports for events. The listener app is correctly unpacking the Apple MIDI events. I did clean up some segfaults when the OS X MIDI app ends the session (by sending a BY event). I’m debating whether I want to handle just one session or multiple.
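The dispatch that tripped over the BY segfault boils down to something like this sketch (the function names and the dict-based session state are illustrative, not the actual code; the 0xFFFF signature and two-character ASCII commands are how the session packets appear on the wire):

```python
import struct

APPLEMIDI_SIGNATURE = 0xFFFF

def parse_command(packet):
    """Return the two-character session command ('IN', 'OK', 'BY', ...)
    or None if the packet doesn't carry the 0xFFFF signature."""
    if len(packet) < 4:
        return None
    signature, command = struct.unpack("!H2s", packet[:4])
    if signature != APPLEMIDI_SIGNATURE:
        return None
    return command.decode("ascii", errors="replace")

def handle_packet(packet, session):
    """Guard against the end-of-session crash: a BY tears the session
    state down instead of leaving dangling references to it."""
    command = parse_command(packet)
    if command == "BY" and session is not None:
        session.clear()  # assuming session state lives in a dict
        return "closed"
    return command
```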

On the hardware side, having got a little bored of staring at C code, I spent some money at Radio Shack and other places to get capacitors, resistors, diodes and also some 3.5mm audio inputs that are breadboard compatible.

I did some reading up on how to handle the drum pad inputs. As the teardown showed, it’s a piezo trigger and, according to the internet, that requires some special handling.

I found a couple of pages with useful information; they show the circuits needed to handle a piezo input.
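On the software side, once the conditioned signal reaches an ADC, hit detection is mostly thresholding plus a mask time to ignore the ringing after a strike. A sketch, with entirely made-up threshold and mask values to be tuned against real readings:

```python
def detect_hits(samples, threshold=80, mask=4):
    """Scan ADC readings for hits: a reading at or above `threshold`
    counts as a strike, and the next `mask` samples are ignored so
    the piezo's ringing doesn't register as extra hits."""
    hits = []
    masked_until = 0
    for i, value in enumerate(samples):
        if i >= masked_until and value >= threshold:
            hits.append((i, value))  # (sample index, strike value)
            masked_until = i + mask
    return hits
```

The strike value can later be mapped to MIDI velocity; a peak-hold over the mask window would give a better velocity figure than the first sample over threshold.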

Here’s my problem right now. I built the circuit using the info from Peter Vieth’s page and, if I connect up my multimeter, I can see there is a change in voltage when I hit a pad, but it’s very small. One of the many questions I have is whether I’m supposed to be seeing such a small voltage change (which I saw when I first tested the pad a couple of months ago). In some cases, with light hits, I don’t see a change at all.

I connected the drum pads back up to the controller they came with and they play fine. Light hits are registering, so I know I haven’t broken anything (hard to do, seeing as it’s a piezo trigger). The controller uses a 9v DC input and I’ve hooked my circuit to a 9v battery, but I see little change.

The other question I have is whether I’m actually using my multimeter correctly, or if I should even be seeing anything register with a multimeter. The info on the Leucos site shows some good data using an oscilloscope, but I don’t want to get into that at my stage of life ($$$).

I guess more reading is in order. I have some work colleagues who are part of Noisebridge, and they have sessions on a Monday which help members learn about circuits’n’stuff, so I may check them out.

MIDI over a network

MIDI over a network is actually defined in RFCs.

However, those definitions are only for transferring the MIDI data over RTP once the connection has been made. The next trick is to work out how to initiate the connection.

This is where Wireshark comes in useful. I have AC-7 on my iPad, which is a MIDI-based DAW interface, so I’ve been able to capture the traffic between the iPad and my Mac. The fields used in the initial protocol have been identified and are in the standard Wireshark release.

There are a couple of steps involved here. The first part is to determine where to connect. Zeroconf comes into play here by searching for addresses offering _apple-midi._udp. Once the address has been identified, the connection can be made.

The second part is initiating the session.
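From the capture, the invitation that opens a session looks roughly like the sketch below: the 0xFFFF signature, the ‘IN’ command, protocol version 2, a token chosen by the initiator, the sender’s SSRC, and a NUL-terminated name. The field layout here is my reading of the capture, so treat it as a sketch to verify against Wireshark:

```python
import struct

def build_invitation(initiator_token, ssrc, name):
    """Build an 'IN' session invitation packet: signature, command,
    protocol version 2, initiator token, SSRC, NUL-terminated name."""
    return (
        struct.pack("!H2sIII", 0xFFFF, b"IN", 2, initiator_token, ssrc)
        + name.encode("utf-8") + b"\x00"
    )
```

The same packet goes to both the control and data ports, and the remote side answers with an ‘OK’ (or ‘NO’) carrying its own SSRC and name.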

I’ve had this brilliant idea…

So, it’s like this, you see…

I decided today that I needed a project to consolidate my time and give me a goal to achieve instead of aimlessly wandering around Azeroth beating people up.

I want a Raspberry Pi, but I want to have a reason to get one and not leave it in a box somewhere doing nothing.

Today’s brilliant idea is to put together a MIDI interface for my ION Sound Sessions drum kit that’s gathering dust in the garage.