MIDI

MIDI Route Settings

General Settings

  • Channel: the MIDI channel.
  • Type of messages: there are a few to choose from. Some will reveal an extra 'Filter' parameter.
  • Filter: ON/OFF -> enables the 'Select' parameter.
  • Select: the value of a specific controller, or the velocity of a specific note.


Rescale

These parameters are intended to map the raw MIDI values (0-127 for 7-bit, and 0-16383 for 14-bit) to a more useful range in Blender. If, on the Blender line, you set "Low" to -5 and "High" to 200, a 7-bit controller (0-127) will be remapped so that 0 equals -5 and 127 equals 200. With a 14-bit controller (0-16383), it is of course 16383 that will equal 200 (but with better granularity). This feature now has 4 modes and works when receiving and/or when sending (see the sketch after the list below).

  • "Direct" disables the rescaling and you get in Blender the exact MIDI values unaffected (faster). 
  • "Auto" is the simplest mode, in just means that the add-on automatically sets the proper MIDI range (0-127 or 0-16383) depending of the type of messages.
  • “Cut” ignores values outside a MIDI range (Low/High), first example of use to come in mind is to filter out the 0 velocity of the “false” note-off, in the case we want to focus on non null velocity (Low = 1, High =127 -> outside this range, no pass).
  • "Wrap" is very similar to "Cut" except it will not reject messages outside the range but constraint them. With MIDI Low = 0 and High = 1, all the values outside this very limited range will be changed either to 0 or 1. Since midi message cannot be negative, the "Low" value will never act as a limitation. Consequently all positive values will change to 1. Then this “1” can be scaled further to match an angle with the second row of "Low/High" (Blender world).

Envelope Settings

  • The group parameter allows you to gather F-curves in a group (mandatory for envelope "post-processing").
  • Envelope settings are shown only if you enable them in the MIDI Config panel. The intended use is to mimic instruments like the piano.
  • Attack is actually a pre-attack, occurring before a note is played, as when you hit a piano key. Both Attack and Release are expressed in milliseconds (see the sketch below).
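
Since both times are in milliseconds while the animation runs in frames, the conversion at stake is essentially this (a minimal sketch, assuming a fixed scene frame rate; the function name is mine):

    def envelope_frames(attack_ms, release_ms, fps=24):
        # attack keyframes land before the note-on, release after the note-off
        return attack_ms / 1000.0 * fps, release_ms / 1000.0 * fps

    # e.g. a 50 ms attack at 24 fps spans 1.2 frames before the note starts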

Multi routing

This mode is mostly pertinent for note velocities and polyphonic aftertouch:

  1. Once enabled, you can set the number of routes with "Instances".
  2. The main principle is based on internally replacing the keyword "VAR" with the route number, and incrementing the "Select" parameter as well.
  3. The keyword VAR can be placed either in the name of the elements you target (objects, armatures, shape keys) or in the data-path of the property, allowing different kinds of use.
  4. Offset lets the numbering of the VAR variable start at a number other than 0 (the default).
  5. There is a little trick when using VAR, related to how Blender deals with item names. VAR can be used in a composed string name (e.g. "Key_VAR") or alone without quotes (it is then interpreted as an index into an array of items, if there is one).

To explain point 5 a little further (a sketch follows the list):

  1. In the "objects_multi" example, VAR is used alone in the name, and is therefore implicitly used as the index of all the objects present in the scene. That's a little new and funny, but it works because Blender can refer internally to all objects through "bpy.data.objects[array_index]".
  2. In my keyboard example, VAR is used to complete a string ("Key_"), so it's not an index, and explicitly named corresponding objects have to exist in the scene. See the outliner.
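
Here is a minimal sketch of that substitution principle (illustrative only, not the add-on's actual code), with Instances = 3 and Offset = 0:

    import bpy

    for i in range(3):                           # i plays the role of VAR
        obj = bpy.data.objects[i]                # bare VAR: an array index
        key = bpy.data.objects["Key_%d" % i]     # "Key_VAR": an explicit name
        # the "Select" parameter would be incremented by i as well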

Here are two examples of multi-route use:

The first example is a mix of various objects scaled on Z with one route.
http://www.jpfep.net/static/objects_multi.blend

The second is my own piano example from AddMIDI, but revisited. This file also serves as an example in the MIDIFile Conversion section below.

 

MIDI Configuration

Devices

As stated on the main page, the MIDI engine can now have a default "system wide" configuration, active as soon as you start Blender and open a project. This is handy when your MIDI setup is stable, and it makes the use of, for instance, a MIDI fader box with some system routes acting on the Blender GUI more immediate.

Note: if you edit the system configuration from the project panel, you have to explicitly click "Save preferences", as these properties belong to the add-on preferences. They are just exposed here for convenience.

It's still possible, however, to have a project-based configuration, as shown in the example below, if you prefer.


Debug: displays what happens (incoming/outgoing events) in the console or in the text editor. It can be slower due to the extra computation; however, with MIDI the data rate is rather low and it shouldn't be a problem.

Synchronization

Currently synchronization needs to be reworked and is mostly experimental. But let's recall a few things:

  • MIDI Clock: despite its name, this protocol doesn't only send MIDI ticks; it also sends 3 transport messages (START/CONTINUE/STOP).
  • SPP: Song Position Pointer. This protocol sends positional information and is actually responsible for remotely moving the play head.

For a complete remote-playing feature you need MIDI Clock and SPP at the same time, even though they can be selected individually.
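
For reference, these are the standard status bytes involved:

    MIDI_CLOCK    = 0xF8   # sent 24 times per quarter note
    MIDI_START    = 0xFA
    MIDI_CONTINUE = 0xFB
    MIDI_STOP     = 0xFC
    MIDI_SPP      = 0xF2   # followed by a 14-bit position (LSB, MSB),
                           # counted in MIDI beats (1 beat = 6 clock ticks)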

Note 1: Received MIDI ticks are not used to adjust the frame rate yet, but this might be implemented soon. It's nevertheless usable over short periods if you manage to keep the FPS steady in Blender.

Note 2: Sending ticks is somewhat jerky and has not been tested much.

Note 3: The Play and Pause buttons are two ugly workarounds to use sync out correctly, as I cannot detect the user interaction (when you use the space bar to start playing, for instance). So use these instead if you need sync out. Still trying to find a proper solution...

Convert Note off

The MIDI standard has 2 ways to terminate a note: either with a note-off event or with a note-on message with a null velocity. The latter option is used most of the time by cheap keyboards. Real note-off events have the luxury of carrying a velocity value that reflects the way the key is released. The add-on can exploit both situations, but sometimes you might find it simpler to ignore that extra velocity information. This option converts note-off events into note-on events with a null velocity.
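
On a raw 3-byte message, the conversion amounts to this (a minimal sketch; the function name is mine):

    def convert_note_off(status, note, velocity):
        if status & 0xF0 == 0x80:                    # a real note-off
            return 0x90 | (status & 0x0F), note, 0   # note-on, velocity 0
        return status, note, velocity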

The Tempo parameter is used when MIDI Clock and SPP are sent from Blender. This value will reflect the tempo information contained in a MIDIFile as well.
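
Since MIDI Clock runs at 24 ticks per quarter note, the tick interval follows directly from the tempo:

    def tick_interval(bpm):
        return 60.0 / (bpm * 24)   # seconds between two ticks

    # e.g. at 120 BPM: 60 / 2880 ≈ 0.0208 s, about 48 ticks per second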

MIDIFile options

Note: currently you have to place the MIDIFile in the Blender project folder and fill in the name field manually.

A new option to avoid "extrapolation", to the right of the MIDIFile name, inserts a supplementary initialization keyframe each time a note is played for the first time. If a note occurs at the very beginning of the MIDIFile, the null keyframe is inserted one frame before in the Blender animation system; otherwise it goes at the first starting frame. A minimal sketch of the idea follows.
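
Here the note is assumed to drive a custom property on the active object (all names in this sketch are mine); the point is simply to keep earlier frames from extrapolating the note's value:

    import bpy

    obj = bpy.context.object
    first_frame = 100                    # assumed first occurrence of the note
    obj["note_60"] = 0.0                 # null value before the note starts
    obj.keyframe_insert('["note_60"]', frame=max(first_frame - 1, 1))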

Realtime playing and rendering 

The add-on doesn't force you to convert the MIDIFile into F-curves. Depending on your choices, the MIDI data contained in the MIDIFile will update the Blender properties in realtime (either while playing or while rendering). That removes a lot of the tedious work of post-editing animation data, which is most welcome for notes.

However, if you are after realistic representations of instruments, a piano for instance, you might want to see the subtle movements of keys as in real life. For that you might prefer to work with F-curves and the new envelope feature.

Note: currently this realtime contribution works best for rendering (see the workflow considerations below). Contributing while playing does in fact work "correctly", but it can play tricks on you by injecting events into the Blender scene while you are still working on it. By the way, Blender has a little bug: when you start playing an animation, the first 2 frames are not reported by its built-in Python "handler" (sketched below). So always use the frame offset feature to reintroduce a few frames as a safety margin; otherwise the first MIDI events won't be injected into your routes.
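
The handler in question is the generic frame-change hook; this minimal sketch, run in Blender's text editor, shows which frames actually get reported during playback:

    import bpy

    def on_frame(scene, *args):   # *args absorbs the depsgraph in newer versions
        print("frame reported:", scene.frame_current)

    bpy.app.handlers.frame_change_pre.append(on_frame)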

Conversion

First and foremost, check this blendfile example to see how MIDIFile conversion actually works, with a piece by Erik Satie. You will see the piano keys moving while listening to the corresponding mp3 in the VSE. A provided text lists the steps; it's very simple to get the result.

To explain a little more: conversion is always made according to your route settings (set to "RECEIVE" and with REC enabled). So you have to select which track(s) and event(s) are converted, otherwise nothing happens.

There are now 2 modes of conversion, which mostly concern note events:

  1. The default mode acts very "robotically": one MIDI event -> one keyframe insertion. In this simple scenario you might want to enable the option that prevents "extrapolation" (unless you intend to turn your F-curves into NLA clips, which have an option for that too). You might also want to read the "More about MIDI and recording" section below, because the Blender animation system is different from a MIDI sequencer. After conversion, be sure to set the interpolation mode to CONSTANT in the graph editor for the F-curves representing note events (see the sketch after this list).
  2. The new method is to use the envelope feature.
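
If you prefer to do it from Python rather than in the graph editor, a minimal sketch for the active object could look like this:

    import bpy

    action = bpy.context.object.animation_data.action
    for fcurve in action.fcurves:
        for point in fcurve.keyframe_points:
            point.interpolation = 'CONSTANT'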

Note: in all cases, tempo values are evaluated along the conversion, allowing the fine tempo variations that some musicians appreciate.

Note 2: the envelope feature is still in its infancy and expects some specific conditions to work correctly (such as being applied just after a MIDI conversion, without manual editing of the F-curves). The group you choose should be void of any previous keyframes; if not, the add-on will try to process them and might fail. You should be fine most of the time, but someone once got a Python error (division by zero) that was in fact related to some orphaned data. Apparently a Blender Action object was still there as a phantom, and the add-on was in turn able to access the keyframes of that object because they were in the same group as another active route. Things went back to normal after cleaning the orphaned data listed in the outliner.

More about MIDI and recording

Many people will want to "record" MIDI events with the Blender animation system. There are however a few fundamental differences that are not so easy to grasp.

Here are some facts about the Blender animation framework:

  1. it's "monophonic";
  2. it allows different interpolation modes (a MIDIFile is in CONSTANT);
  3. each channel copes with one-dimensional data (unlike notes, which gather key number and velocity in one event);
  4. a single key sets a value for all frames (even before the key in the timeline) and forces "extrapolation" (unless the F-curves are turned into NLA clips).

Consequently, it is fundamentally not suited to recording polyphonic tracks, and even a monophonic melody would have to be spread over 2 channels (one for note values, one for velocities), as in the sketch below. MIDI controllers, on the other hand, are compatible with the Blender system (except poly AT).
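
Purely as an illustration of that split (the property names are mine), here is one melody note keyframed over two channels:

    import bpy

    obj = bpy.context.object
    obj["note"], obj["velocity"] = 60, 100     # key number and velocity
    obj.keyframe_insert('["note"]', frame=10)
    obj.keyframe_insert('["velocity"]', frame=10)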

It can be done, however, either in realtime or by converting a MIDIFile; the advantage is that once the MIDI events are converted, the result is more predictable and further processing can be done on the F-curves.

The second important task is to set the interpolation mode to CONSTANT for every converted note. Otherwise you will see some very weird, slow playing on screen. Alternatively, you can use the new envelope feature, which avoids this problem.

If you make a realtime "recording" in Blender of MIDI events sent by a running digital audio workstation, you can manually apply the envelope to the note events after the take.