Thursday 2 June 2011

Network MIDI on iOS - Part 3

In this part I will discuss how incoming MIDI is received into the application and subsequently processed. The source code for this project is available for download in Part 1 of this article.

The MIDI controller class (see MIDIController.h) provides a formal protocol, MIDIReceivedDelegate, and a corresponding delegate property. The protocol defines the following methods:

- (void) midiControllerUpdated:(Byte)controller onChannel:(Byte)channel toValue:(Byte)value;
- (void) midiNoteOnOff:(Byte)note onChannel:(Byte)channel withVelocity:(Byte)velocity on:(BOOL)on;

These methods encapsulate the complexity of receiving MIDI data - but how is this accomplished behind the scenes?

When we created the MIDI client in part 2 of this article, we also created a MIDI input port and passed it a pointer to a callback function, MIDIInputReadProc. The operating system calls this function when MIDI data arrives at the port. Because it is a real-time callback that may be re-entered when the system is under load, certain restrictions apply. In particular, the function should avoid:
  • Allocating and deallocating memory
  • Acquiring locks
  • Performing lengthy operations
The function walks the list of MIDI packets received as follows:

For each MIDI packet received:
  • The packet's length and data are written into a structured circular buffer (see MIDIPacketBuffer.h)
  • A Mach semaphore is incremented to signal the rest of the application that another packet is available for processing.
Note that the structured buffer in this example disregards the timestamp of the MIDI packet. This will be covered in a later example.
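
To make this concrete, here is a minimal sketch of the read proc. The helper names gPacketBuffer, gPacketSemaphore, MIDIPacketBufferWrite and MIDIPacketBufferRead used in these sketches are illustrative only; the real implementations are in MIDIPacketBuffer.h and the accompanying source.

#include <CoreMIDI/CoreMIDI.h>
#include <mach/mach.h>

static void MIDIInputReadProc(const MIDIPacketList *pktlist,
                              void *readProcRefCon, void *srcConnRefCon)
{
    const MIDIPacket *packet = &pktlist->packet[0];
    for (UInt32 i = 0; i < pktlist->numPackets; ++i) {
        // Write the packet's length and data into the circular buffer;
        // no allocation, no locks and no blocking calls here.
        MIDIPacketBufferWrite(&gPacketBuffer, packet->data, packet->length);
        // One signal per packet wakes the processing thread.
        semaphore_signal(gPacketSemaphore);
        packet = MIDIPacketNext(packet);
    }
}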

Another thread (see midiInputThreadProc) waits on this semaphore. When the semaphore is signalled, the thread retrieves the length of the next MIDI packet from the circular buffer and copies that many bytes into a regular buffer.
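
In sketch form, with the same illustrative helper names as above, the thread's main loop looks something like this:

static void *midiInputThreadProc(void *arg)
{
    Byte packetData[256]; // ample for any single MIDI packet
    for (;;) {
        // Block until the read proc signals that a packet is waiting.
        semaphore_wait(gPacketSemaphore);
        // Fetch the next packet's length, then copy that many bytes
        // from the circular buffer into a regular buffer for parsing.
        UInt16 length = MIDIPacketBufferNextLength(&gPacketBuffer);
        MIDIPacketBufferRead(&gPacketBuffer, packetData, length);
        // ... parse packetData[0..length - 1] as described below ...
    }
    return NULL;
}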

The data in this buffer is then parsed. The following MIDI commands are identified in this example:

  • Control changes
  • Note on/off
Other commands are discarded. When a recognised command is received, Grand Central Dispatch is used to enqueue a block that invokes the controller's delegate asynchronously on the main queue. The delegate then responds to the invocation and performs any necessary UI updates and so on.
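
For example, a control change could be parsed and forwarded along these lines (the variable names are illustrative and continue the sketches above):

Byte status  = packetData[0] & 0xF0;
Byte channel = packetData[0] & 0x0F;
if (status == 0xB0) { // control change
    Byte controller = packetData[1];
    Byte value = packetData[2];
    dispatch_async(dispatch_get_main_queue(), ^{
        [delegate midiControllerUpdated:controller onChannel:channel toValue:value];
    });
}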

By these means, the application decouples the processing of MIDI data from its receipt and conforms to the requirements imposed by the CoreMIDI port model.

14 comments:

  1. Hello!
    I've been studying your code for handling incoming MIDI messages, and I am trying to implement it in my own project. In my application I need to handle incoming SysEx messages that will be up to 72 bytes in length. How would you handle this SysEx data? I guess it is best to process it outside of the midiInputThreadProc, just like you do with other commands through Grand Central Dispatch. But how do I copy the bytes of the message?

    Thanks in advance!

  2. Hi, if you need to handle short sysex messages then first you need to define constants for the start and end control bytes e.g. in MIDIMacros.h:

    #define MIDI_SYSEX 0xF0
    #define MIDI_SYSEX_END 0xF7

    Then add something like this to the MIDIReceivedDelegate protocol:

    @optional

    - (void)midiReceivedSysEx:(NSData *)sysExData;

    Then in the controller's midiInputThreadProc add something like (apologies for any loop index errors etc., but you get the idea):

    case MIDI_SYSEX:
    {
        // Ideally declared somewhere central in the program...
        const Byte MyManufacturerID = 0x69;

        Byte manufacturerID = data[packetContentsIndex + 1];
        if (manufacturerID == MyManufacturerID) {
            // TODO: optionally read a model number etc. if your spec supports it
            packetContentsIndex = packetContentsIndex + 1;
            NSMutableData *sysexData = [NSMutableData dataWithCapacity:length - 2];
            [sysexData setLength:length - 2];
            Byte *receivedData = (Byte *)[sysexData mutableBytes];
            NSUInteger bytesCopied = 0;
            // Copy everything up to (but not including) the end-of-sysex byte
            while (packetContentsIndex < length && data[packetContentsIndex] != MIDI_SYSEX_END) {
                *receivedData = data[packetContentsIndex];
                ++packetContentsIndex;
                ++receivedData;
                ++bytesCopied;
            }
            // Trim to the number of bytes actually copied
            [sysexData setLength:bytesCopied];
            // TODO: the last byte of the received data may contain a checksum,
            // so you can process that here and truncate the data by one byte if necessary
            if ([delegate respondsToSelector:@selector(midiReceivedSysEx:)]) {
                dispatch_async(dispatch_get_main_queue(), ^(void) {
                    [delegate midiReceivedSysEx:sysexData];
                });
            }
        }
    }
    break;

    You should then be able to process the sysex asynchronously in the Objective-C part of your program by implementing the corresponding method on the delegate. I'll update the example when I get some time (and include an example of responding to sysex data requests as well), but hopefully this should get you going for now.
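
    For example, a delegate implementation could be as simple as:

    - (void)midiReceivedSysEx:(NSData *)sysExData
    {
        NSLog(@"Received %lu bytes of SysEx", (unsigned long)[sysExData length]);
        // Interpret the payload according to your own spec here...
    }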

    Replies
    1. Great tutorial!
      Have you managed to update the post for SysEx anywhere yet?

  3. Thanks a lot for your answer! I got this working in the way you suggested.

    However, I am struggling with getting my app to synchronize playback via MIDI clock. I use your code to receive MIDI data and process it in midiInputThreadProc. When my app receives MIDI clock messages from another app acting as master clock, playback in my app is rather unstable, with beats often occurring slightly delayed. I am not performing any smoothing on the incoming clock stream, but the timestamps of the clock messages received indicate a rather stable stream.

    I suspect that the handling of incoming MIDI clock is not done efficiently enough in my app. Do you think that this way of processing incoming MIDI data in its own thread, combined with a circular buffer, semaphore signalling and dispatch_async, is a sufficiently fast method of handling real-time data? I also wonder if the priority of the midiInputThreadProc should be adjusted?

    Currently, I am not using delegation to message MIDI processing methods, I just call a method in the same class directly. Don't know if this has any implications for the timing...

    I really hope you can share some thoughts on this topic.
    Thank you!

  4. Hi, if you're looking to generate MIDI or audio output in real time in response to the input then it's the dispatch_async to the main queue that's the problem; that's only necessary if you are trying to update the UI anyway. It depends what type of app you are trying to build, i.e. is it a soft synth or a phrase sequencer (or both).

    Either way, the key thing is that when you receive the MIDI input packets you should update the internal state of your app directly (for instance using the functions in OSAtomic.h; see the sketch at the end of this comment). You could either do this directly in MIDIInputReadProc (messy!) or add another similar thread, semaphore and circular buffer for processing this information. The overhead of signalling another semaphore and copying the packets to another buffer is minimal, and I expect this would give a cleaner implementation as you are then separating UI updates from internal state updates. You could then bump up the priority of this thread, but it's probably not necessary. I would stick to C in this part and avoid locks, blocking operations etc.

    The resulting output should then be generated either:

    • in the high-priority callback you use to populate the buffers for Core Audio, for a soft synth.
    • in a separate real-time thread, for a MIDI phrase sequencer, arpeggiator etc.

    The first case is pretty straightforward as Core Audio will manage the threading for you. In the latter case, you need to use the real time thread capabilities in the mach APIs to get really accurate timing (it might be worth checking what Apple's current policy on iOS app store apps using this is if that is your target platform). I'm just about to upload an updated version of my arpeggiator program for OS X which may be worth a look if you're doing something like that. The shared state is held as an array of ints indicating which MIDI notes are on or off. The old version used various kernel structures that AFAIK weren't available on iOS (and which didn't achieve much apart from obfuscating the code) but the updated version should be pretty portable. I'll update it so it syncs to external clock at some point. Let me know how you get on!
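
    As a rough sketch of what I mean by atomic state updates (gNoteState and setNoteState are illustrative names; the arpeggiator code holds similar state):

    #include <libkern/OSAtomic.h>
    #include <stdint.h>

    static volatile int32_t gNoteState[128]; // one on/off flag per MIDI note

    static void setNoteState(uint8_t note, int on)
    {
        // An aligned 32-bit store is atomic; the barrier makes the update
        // visible to the real-time output thread promptly.
        gNoteState[note] = on ? 1 : 0;
        OSMemoryBarrier();
    }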

  5. See http://www.mediafire.com/?gflqasz8x75pk2q for the updated arpeggiator code, I haven't added it to the main blog timeline yet.

  6. Thanks for your thorough answer! And thanks for sharing your code! I have little experience when it comes to handling multiple threads and having these access shared variables, but your code seems like a neat way to solve such problems. I will definitely try using the atomic operations in OSAtomic.h to solve some errors in my code!

    My app is running a simple sequencer (divided into 16 steps per bar), triggering samples and phrases, and it should act either as a master clock device or as a slave device syncing to incoming clock. Your implementation of the RealTimeThread looks like a smart way to get precise timing without using an audio callback, but as I tried to port this code to iOS, it seems the call to thread_policy_set() is not available. Michael Tyson of A Tasty Pixel has done some experiments with precise timing on iOS using mach_wait_until; the results show that the timing is occasionally rather inaccurate.
    http://atastypixel.com/blog/experiments-with-precise-timing-in-ios/
    I guess this calls for some other solution when it comes to implementing your example on iOS, unfortunately...

  7. Oh, that's unfortunate; those functions certainly used to be available from iOS 4 onwards, but it looks like they've been commented out now, so they are kernel-only in iOS 6. Of course, you could cheat and create an audio unit (e.g. Remote IO) callback that generates silence and use that to invoke your sequencer code, as the thread the callback runs on will have real-time priority.
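
    A rough sketch of that trick, where mySequencerTick is an illustrative function and the Remote IO unit setup is omitted:

    #include <AudioUnit/AudioUnit.h>
    #include <string.h>

    static OSStatus renderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber, UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        // Output silence...
        for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        // ...and borrow the callback's real-time thread to drive the sequencer.
        mySequencerTick(inTimeStamp, inNumberFrames);
        return noErr;
    }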

  8. Hi Simon,

    Any chance you know how to make this MIDI program work with Pro Tools? Everything works with Logic just fine, but I was hoping to make a cross-platform app.

  9. Hi Andrew, does Pro Tools not use the standard Audio MIDI Setup application to manage its connections? If so, it should work the same as Logic. I must admit I haven't used it for some years; I mainly use Ableton Live these days. Is there some specific problem you're seeing?

  10. Thanks for the quick reply!

    It seems Pro Tools only supports approved AVID devices. So never mind.

    However, I am building my app using Air for iOS. I have Bonjour capabilities and can publish a service on the network. Is publishing the proper service (I'm guessing it's something like _apple-midi._udp) the only necessary step to be seen by the Network MIDI application, and then to send and receive MIDI messages between iOS and OS X? Or are NSNetService and the CoreMIDI framework the only way to set up the proper MIDI network connection between my app and my laptop?

    Replies
    1. Basically, I am wondering if I need to re-write my app using Xcode and Objective-C, or if I can make my Flash version work.

  11. Hi Andrew, I think unless there's already an ActionScript wrapper for the CoreMIDI libraries it would be pretty difficult to implement this as a pure ActionScript solution, as you would need to handle the rtpMIDI protocol communication. I think the easiest way would be to create an iOS static library in C/Objective-C that wrapped up the necessary service interfaces and the sending and receiving of MIDI messages via CoreMIDI and then use that as a native extension from your existing Air UI as described here:

    http://www.adobe.com/devnet/air/articles/building-ane-ios-android-pt3.html

    If you take that approach you won't have to throw away the work you've already done.

  12. Fantastic!

    Thank you very much.
