MIDI, OSC and More

MIDI, OSC and More is now replaced by AddRoutes (see this page)

Introduction

MIDI, OSC and More is a new Add-on for Blender 2.8 gathering AddMIDI and AddOSC in the same package. As a novelty, an application for smartphones and tablets is currently being developed to ease the remote control of Blender.

The Add-on is now located in the "N panel" on the right, and has its own tab.

License and Status

MIDI, OSC and More (M.O.M for friends) inherits the Blender license, which is the GNU GPL (like all Blender Add-ons).

The mobile application "Blemote", not being an Add-on, is under no obligation to be released under the GPL, and currently I'm not sure what its license will be.

M.O.M is still considered a work in progress: some functionalities are reliable but others are far from beta quality, and therefore future versions might be incompatible with the Blender projects you are working on with the current release.

Download

By downloading this software you understand that M.O.M is still under development and that some problems might occur during its use.

For the latest release (Windows, OSX, and Linux), here is the LINK.

For installation, like any Add-on, don't unzip the archive: use "Install From file", then look in the "System" category to enable it.

Funding and voting for the next features

Development and maintenance of this Add-on has been a lot of work. If you can, please support this project financially.

Even though the Add-on can already be useful, it needs some polishing, and handy new features and improvements can still be wished for. Please share your thoughts.

General Principle

A route is the basic element connecting an external source to a Blender property. Currently routes are either MIDI or OSC, and can optionally have some extra parameters for Blemote.

These common settings target a Blender property:

  • Data-block: In the Blender philosophy each object belongs to a category (mesh objects, curves, materials, textures). When you edit a route manually, the first thing is to use the drop-down menu to choose the right category. In the example below the user is using "Key" (which is for shape keys):
  • Item Name: Once the right ID-block is selected, you have to pick one of its members. Choose one in the drop-down list, or use the search feature by typing a few characters here.
  • Datapath: This represents the path toward the final property you want to control. Please see the paragraph "Things to know" below about the particular cases you might encounter, as Blender sometimes uses "bridges" that are not always supported by the Add-on (especially for recording).
  • Deg: This checkbox appears when the property is an angle, and will convert a degree value into radians for you (Blender works in radians internally through the Python API).
  • Actualization: this new menu gives access to a new expression feature, basically a Python eval() function where 2 reserved keywords are available: IN and PROP. IN is the MIDI value, PROP is the current Blender property value. Note that IN reflects the result converted into radians when you deal with angles and use the "Deg" feature; Blender expects the expression to return a result in radians as well. The default mode is "replace", which simply sets the Blender property to the incoming value. Note you can use Numpy with the "np" prefix, like this: np.log(IN) (a small sketch follows this list).
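
To make the expression mode more concrete, here is a minimal sketch of how such an expression could be evaluated. Only IN, PROP and the "np" prefix come from the add-on itself; evaluate_route_expression and its arguments are illustrative names, not the add-on's actual code:

    import math
    import numpy as np

    def evaluate_route_expression(expression, incoming, current):
        # IN is the incoming (possibly rescaled) value, PROP the current property.
        names = {"IN": incoming, "PROP": current, "np": np}
        return eval(expression, {"__builtins__": {}}, names)

    # Blend the incoming value with the current property value:
    print(evaluate_route_expression("0.5 * IN + 0.5 * PROP", 1.0, 0.2))  # -> 0.6
    # The Numpy example from the text:
    print(evaluate_route_expression("np.log(IN)", math.e, 0.0))          # -> 1.0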

Easy adding

Because manually editing each route datapath can be tedious, there is a new entry in the contextual menu to do it automatically. When the mouse cursor is over a targeted property in its related panel, invoke the menu with the right mouse button and choose "Create realtime route". A new route will then appear in the list with the proper parameters.

Routing parameters

You then need to choose whether it will be a MIDI or an OSC route. Then you have to set the related additional parameters and decide whether your new route is for sending or receiving, or both at the same time!

Note: If a route is "non functional" and you try to set it for real use, its fields will appear in red until you fix the issue.

MIDI Settings:

  • Channel: MIDI channel.
  • Type of messages: There are a few to choose from. Some will display an extra parameter, 'Filter'.
  • Filter: With this parameter you can pick, for instance, a specific CC7 controller, or the velocity of a specific note.
  • Rescale: These parameters are intended to map the raw MIDI values (0-127 for 7-bit, and 0-16383 for 14-bit) to a more useful range in Blender. If you set "Low" to -5 and "High" to 200 on the Blender line, a 7-bit controller (0-127) will be remapped so that 0 equals -5 and 127 equals 200. If it is a 14-bit controller (0-16383), it is of course 16383 that will equal 200 (but with better granularity). This feature now has 4 modes and works when receiving and/or when sending (see the sketch after this list).
  • "Direct" disables the rescaling and you get the exact MIDI values in Blender, unaffected (faster).
  • "Auto" is the simplest mode; it just means that the add-on automatically sets the proper MIDI range (0-127 or 0-16383) depending on the type of messages.
  • "Cut" ignores values outside a MIDI range (Low/High); the first use that comes to mind is filtering out the 0 velocity of the "false" note-off, when you want to focus on non-null velocities (Low = 1, High = 127: outside this range, nothing passes).
  • "Wrap" is very similar to "Cut" except that it will not reject messages outside the range but constrain them. With MIDI Low = 0 and High = 1, all the values outside this very limited range will be changed either to 0 or 1. Since MIDI messages cannot be negative, the "Low" value will never act as a limitation; consequently all positive values will be changed to 1. This "1" can then be scaled further to match an angle with the second row of "Low/High" (Blender world).
  • If you click "Rec" you get 6 settings for keyframe insertion (check the Blender documentation).
  • The Group parameter allows gathering F-curves in a group (mandatory for envelope "post-processing").
  • Envelope settings are shown only if you enable them in the MIDI Config panel. The intended use is to mimic instruments like the piano.
  • Attack is actually a pre-attack, occurring before a note is played, like when you press a piano key. Both Attack and Release are expressed in milliseconds.
  • Multi routing: this mode is mostly relevant for note velocities and poly aftertouch.
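
Here is the promised sketch of the four rescale modes on the receiving side; the rescale function and its arguments are illustrative, not the add-on's actual API:

    def rescale(value, mode, midi_low=0, midi_high=127, out_low=0.0, out_high=1.0):
        if mode == "Direct":                    # raw MIDI values pass through
            return value
        if mode == "Cut" and not (midi_low <= value <= midi_high):
            return None                         # outside the range: no pass
        if mode == "Wrap":                      # clamp into the MIDI range
            value = min(max(value, midi_low), midi_high)
        # "Auto" simply uses the full range of the message type (0-127 or 0-16383).
        span = midi_high - midi_low
        return out_low + (value - midi_low) / span * (out_high - out_low)

    print(rescale(127, "Auto", 0, 127, -5.0, 200.0))   # -> 200.0
    print(rescale(0, "Cut", 1, 127))                   # -> None (filtered out)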

More info on the multi routing option:

  1. Once enabled, you can set the number of routes with "Instances".
  2. The main principle is based on internally replacing the keyword "VAR" with the route number, and incrementing the "Select" parameter as well.
  3. The keyword VAR can be placed either in the name of the elements you target (objects, armatures, shape keys) or in the data-path of the property, allowing different kinds of use.
  4. Offset is a way to start the numbering of the VAR variable at a number other than 0 (the default).
  5. There is a little trick when using VAR, related to how Blender deals with item names. VAR can be used in a composed string name (e.g. "Key_VAR") or on its own without quotes (it will then be interpreted as an index in an array of items, if there is one).

And to explain point n°5 a little bit more:

  1. In the example "objects_multi", VAR is used alone in the name, and therefore is implicitly used as the index over all the objects present in the scene. That's a little bit new and funny, but it works because Blender can refer internally to all the objects through "bpy.data.objects[array_index]".
  2. In my keyboard example, VAR is used to complete a string ("Key_"), therefore it's not an index, and correspondingly named objects have to exist explicitly in the scene. See the outliner.
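
A minimal sketch of this substitution principle, assuming object targets; make_instance_path is a hypothetical helper, not the add-on's code:

    def make_instance_path(name_pattern, route_number, offset=0):
        index = route_number + offset
        if name_pattern == "VAR":          # bare VAR: interpreted as an array index
            return "bpy.data.objects[%d]" % index
        # VAR inside a composed string: correspondingly named items must exist
        return "bpy.data.objects['%s']" % name_pattern.replace("VAR", str(index))

    print(make_instance_path("VAR", 2))        # -> bpy.data.objects[2]
    print(make_instance_path("Key_VAR", 2))    # -> bpy.data.objects['Key_2']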

Here are 2 examples of multi route use:

The first example is a mix of various objects scaled on Z with one route.
http://www.jpfep.net/static/objects_multi.blend

The second is my own piano example from AddMIDI, but revisited. This file also serves as an example in the MIDI file conversion section below.


MIDI Configuration

Devices

Important Note: The Play and Pause buttons are two ugly workarounds to use sync out correctly, as I cannot detect the user interaction (when you use the space bar to start playing, for instance). So use these instead if you need sync out. Still trying to find a proper solution...

Synchronization

Currently synchronization needs to be reworked and is mostly experimental. But let's recall a few things:

  • Midi Clock: Despite its name, this protocol doesn't only send MIDI ticks; it also sends 3 messages (START/CONTINUE/STOP).
  • SPP: Song Position Pointer. This protocol sends positional information and is actually responsible for remotely moving the play head.

For a complete remote playing feature you need Midi Clock and SPP at the same time, even though they can be selected individually.
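
For reference, these are the standard raw MIDI bytes involved (this is general MIDI knowledge, not add-on code):

    CLOCK, START, CONTINUE, STOP = 0xF8, 0xFA, 0xFB, 0xFC  # MIDI Clock messages
    SPP = 0xF2  # Song Position Pointer: 0xF2, LSB, MSB

    def spp_beats(lsb, msb):
        # The 14-bit SPP value counts MIDI beats (1 beat = 6 clocks = a 16th note).
        return (msb << 7) | lsb

    print(spp_beats(0x00, 0x02))  # -> 256 MIDI beats from the song start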

Note 1: Received MIDI ticks are not used to adjust the frame rate yet, but this might be implemented soon. It's however usable over short periods if you manage to keep the FPS steady in Blender.

Note 2: Sending ticks is somewhat jerky and has not been tested much.

Convert Note off

The MIDI standard has 2 ways to terminate a note: either by using a note-off event or with a note-on message with a null velocity. The latter option is most of the time used by cheap keyboards. Real note-off events have the luxury of carrying a velocity value to reflect the way the key is released. The Add-on can exploit both situations, but sometimes you might find it simpler to ignore that extra velocity information. This option converts your note-off events into note-on events with a null velocity.
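
In terms of raw MIDI bytes, the conversion amounts to something like this (a minimal sketch; convert_note_off is an illustrative name):

    NOTE_ON, NOTE_OFF = 0x90, 0x80  # status bytes, high nibble

    def convert_note_off(status, note, velocity):
        if status & 0xF0 == NOTE_OFF:
            channel = status & 0x0F
            return (NOTE_ON | channel, note, 0)  # drop the release velocity
        return (status, note, velocity)

    print(convert_note_off(0x80, 60, 64))  # -> (144, 60, 0): note-on, null velocity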

The Tempo parameter is used when Midi Clock and SPP are sent from Blender. This value will also reflect tempo information contained in a MIDI file.

MIDIFile options

Note: Currently you have to place the MIDI file in the Blender project folder and fill in the name field manually.

A new option to avoid "extrapolation", at the right of the MIDI file name, will insert some supplementary initialization keyframes each time a note is played for the first time. If a note occurs at the very beginning of the MIDI file, a null keyframe will be inserted one frame earlier in the Blender animation system; otherwise it will be at the first starting frame.

Realtime playing and rendering 

The Add-on doesn't force you to convert the MIDI file into F-curves. Depending on your choices, the MIDI data contained in the file will update the Blender properties in realtime (either while playing or while rendering). That way you avoid a lot of tedious post-editing of the animation data. This is most welcome for notes.

However, if you are after realistic representations of instruments, like a piano for instance, you might want to see subtle key movements like in real life. For that you might prefer to work with F-curves and the new envelope feature.

Note: currently this realtime contribution works best for rendering (see workflow considerations below). Contributing while playing does in fact work "correctly", but it can play tricks on you by injecting events into the Blender scene while you are still working on it. By the way, Blender has a little bug: when you start playing an animation, the first 2 frames are not reported by its built-in Python "handler". So always use the frame offset feature to reintroduce a few frames as a safety margin; otherwise the first MIDI events won't be injected into your routes.
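
For context, this is the kind of per-frame handler Blender exposes (a minimal sketch; inject_midi_events is a hypothetical name, but bpy.app.handlers.frame_change_pre is the real hook):

    import bpy

    def inject_midi_events(scene):
        # Called on every frame change; the first 2 frames after pressing
        # play may not be reported, hence the frame offset advice above.
        print("frame", scene.frame_current)

    bpy.app.handlers.frame_change_pre.append(inject_midi_events)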

Conversion

First and foremost, check this blend file example to see how MIDI file conversion actually works, with a piece by Erik Satie. You will see the piano keys moving while listening to the corresponding mp3 in the VSE. A provided text lists the steps; it's very simple to get the result.

To explain a little bit more: conversion is always made according to your route settings (set to "RECEIVE" and with REC enabled). So you have to select which track(s) and event(s) are converted, otherwise nothing happens.

There are now 2 modes of conversion, which mostly concern note events:

  1. The default mode acts very "robotically": a MIDI event -> a keyframe insertion. In this simple scenario you might want to enable the option that prevents "extrapolation" (unless you intend to turn your F-curves into an NLA clip, which has an option for that too). You might also want to read the chapter "About Recording" below, because the Blender animation system is different from a MIDI sequencer. After conversion, be sure to set the interpolation mode to CONSTANT in the Graph Editor for the F-curves representing note events (see the sketch after this list).
  2. The new method is to use the envelope feature.
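
Setting CONSTANT interpolation can also be done in a few lines of Python (a minimal sketch; the action name "PianoAction" is hypothetical):

    import bpy

    action = bpy.data.actions["PianoAction"]
    for fcurve in action.fcurves:
        for keyframe in fcurve.keyframe_points:
            keyframe.interpolation = 'CONSTANT'  # hold values between note events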

Note: In all cases tempo values will be evaluated along the conversion, allowing the fine tempo variations that some musicians appreciate.

Note 2: The envelope feature is still in its infancy and expects some specific conditions to work correctly (such as being used just after a MIDI conversion, without manual editing of the F-curves). The group you choose should be void of any previous keyframes; if not, the add-on will try to process these and might fail. You should be fine most of the time, but someone once got a Python error (division by zero) which was in fact related to some orphaned data. Apparently a Blender Action object was still there as a phantom, and the add-on was in turn able to access the keyframes of that object because they were in the same group as another active route. Things went back to normal after cleaning the orphaned data listed in the outliner.
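
To give an idea of the timing involved, here is a sketch of where envelope keyframes would land around a note, assuming the millisecond values are converted to frames at the scene FPS (all names and the exact keyframe placement are illustrative assumptions):

    def envelope_frames(note_start, note_end, attack_ms, release_ms, fps=24):
        # The attack is a pre-attack: the ramp starts before the note begins.
        attack = attack_ms / 1000.0 * fps
        release = release_ms / 1000.0 * fps
        return (note_start - attack, note_start, note_end, note_end + release)

    print(envelope_frames(48, 72, attack_ms=125, release_ms=250))
    # -> (45.0, 48, 72, 78.0) at 24 fps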

OSC Configuration

"0.0.0.0" allows to listen to all the network devices of your computer, you might want to restrict that later but it's handy for tests. The connections are updated as soon as you change a parameter if enabled.

The input connection can sometimes appear in red; this reveals a problem with another piece of software blocking the port. The server will try to reconnect automatically and revert to normal as soon as the problem is solved on the other side.

The 2 switches allow temporarily disabling sending or receiving.

The "Debug" option will copy in the console a report as soon something is received by the add-on. For that you need of course to start Blender from a shell window.  

OSC Route Settings

Location being a Vector, you can set an index value to get only X,Y or Z. But since OSC offers more possibilities than the MIDI protocol, it's now possible to send and receive the whole array of values (for vectors or colors).

When you set a route to receive a full array, you have to make sure that the number of values sent matches what Blender expects (and it's now required to set the "n" parameter accordingly). For colors, for instance, it can be either 3 or 4 values depending on whether an extra alpha channel is added to the R,G,B triplet.

"From" and "n", allow to extract in a list of messages a subset by specifying the rank of the first item and the number of items.  "0" is the item just after the OSC address (like /blender) and generally doesn't need to change for simple message. "n" however has to reflect the size of the targeted Blender array.  

Pick button: This allows automatically filling in the Address, based on the last received OSC message.

Multi routing: Please refer to the MIDI multi route settings, as they are mostly the same. In the context of OSC, the purpose is to address messages like the ones presented by Face Cap, notably the 52 blendshape parameters (Shape Key values), formatted like this:

/W + 1 Int + 1 Float (blendshape index, value)

When enabling multi routing for OSC, the add-on will search for an index at position "0" of the list and assign it to "VAR"; consequently you have to set the "From" parameter to "1" to skip the index and focus on the payload just after. With the Face Cap blendshape parameters, "n" will stay at "1" since there is just a single float value to pick.
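
Put together for the Face Cap case, a receiving multi route would behave roughly like this (a sketch; handle_facecap is an illustrative name):

    def handle_facecap(args, from_index=1, n=1):
        var = args[0]                               # the index becomes VAR
        value = args[from_index:from_index + n][0]  # single float payload
        return var, value

    index, value = handle_facecap([12, 0.75])  # e.g. "/W 12 0.75"
    print(index, value)  # -> route instance 12 receives 0.75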

Note: OSC multi routes can now send. The add-on will insert the index automatically for you.  

Note 2: the index in multi routes counts from 0 (for the first item).

Note 3: if you use the new actualization mode "expression", be careful: IN may represent an array of values, and in that case your expression should output an array of the same size. Fortunately Numpy, a scientific module for Python, makes it easy to manipulate arrays (and even simple values most of the time) and is exposed through the "np" prefix.
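
For example, with IN holding the 3 values of a received color, the expression must return 3 values as well (a sketch of what would be evaluated, under the same assumptions as the expression example earlier):

    import numpy as np

    IN = np.array([0.2, 0.5, 0.9])          # a received R,G,B triplet
    result = np.clip(IN * 1.5, 0.0, 1.0)    # route expression: np.clip(IN * 1.5, 0, 1)
    print(result)                           # -> [0.3 0.75 1.0], same size as IN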

Tools

This new panel will probably grow in the future to offer a few facilities. You can now:

  • Create a new category with the "+" icon
  • Remove a category with the "x" icon (the "contained" routes are not deleted but reassigned to 'Default')
  • Rename a category (select it first with the drop-down menu)
  • Copy a category and its routes to a different scene (idem)

Route sorting, if set to "Category", will display only the routes belonging to the selected category.

The options in the "Extra route parameters" section allow showing/hiding some recently added parameters, in order to improve the readability of the route settings and keep the presentation simple. The route category is hidden by default.


Blemote

Blemote is a Kivy application made in Python, for Android smartphones and tablets, to remotely control the properties set in the routes of the Add-on. See this page for more information.

About Recording

Please be cautious with the "rec" button. It relies on the "is_playing" flag that Blender turns on and off when you play the current animation OR when you move the playhead manually.

You might be better off disabling "rec" on a route just after recording, to avoid unwanted keyframe insertions when you move the playhead inadvertently.

You don't need to enable the auto-keying button in the Timeline; it's a different feature.

About MIDI and recording:

Many people will want to "record" MIDI events with the Blender animation system. There are however a few fundamental differences that are not so easy to grasp.

Here are some facts about the Blender animation framework:

  1. it's "monophonic"
  2. it allows different interpolation modes (a MIDI file is in CONSTANT)
  3. each channel copes with one-dimensional data (unlike notes, which gather key number and velocity in one event)
  4. a single key sets a value for all frames (even before the key in the timeline) and forces "extrapolation" (unless the F-curves are turned into an NLA clip).

Consequently, it is fundamentally not suited to recording polyphonic tracks, and even a monophonic melody would have to be spread over 2 channels (for note values and velocities). MIDI controllers, on the other hand, are compatible with the Blender system (except poly aftertouch).

It can be done, however, either in realtime or by converting a MIDI file; the advantage is that once the MIDI events are converted, the result is more predictable and further processing can be done on the F-curves.

The second important task is to set the interpolation mode to CONSTANT for every converted note. If not, you will see some very weird and slow playing on the screen. Alternatively, you might want to use the new envelope feature, which avoids this problem.

If you perform a realtime "recording" in Blender of MIDI events sent by a running digital audio workstation, you can manually apply the envelope on note events after the take.

Workflow

It's a very personal subject, but if I could share a few hints, I would advise working as much as possible in realtime without bothering with MIDI file conversion. The "revolutionary" aspect of this add-on is, for the first time in history, being able to work on music and 3D visuals at the same time, under some very good conditions. EEVEE is a blessing and a strong opportunity for a new kind of creator who would want to push the art of video clips way beyond its current state. Computers are fast, storage and RAM are cheap; it's time to invest in some new hardware and be the first to seize this opportunity.

These new conditions will nonetheless still require some reflection about means in art. You might still have to keep yourself from using objects or scenes that are too heavy, to keep the frame rate steady. Art is about emotion, not technique, and time is the matter.

Then you might want to produce a first final video; things might get less clear at this point, as several strategies can lead to the final product.

To get a very predictable result, and tight syncing, exporting to a MIDI file (or to a text file in the case of OSC, later) is probably the best option. For MIDI data, converting notes to keyframes can probably be avoided. A limit could be that you need more than 16 tracks; I would then advise converting only monophonic channels, or breaking the project into several MIDI files.

On this subject, it might be a good idea, especially with a project that lasts longer than a simple video clip, to break each part into several MIDI files and maybe into several D.A.W. files as well. Blender would render each sequence as frames that you would in turn gather into a final video project (either using the Blender VSE or another application).

Things to know

  • Don't use "location.x" or "location[0]" when you set a data-path manually, but "location"; then set the index that will appear. The same goes for any property that is an array (like colors).
  • Frame dropping as a sync method might produce incoherent results while making a MIDI file contribute during playing.
  • Using an animation step > 1 is not supported for MIDI file contribution, but shouldn't be a problem for converting.
  • MIDI CCs 98 and 100 are reserved for RPN/NRPN and currently cannot be used as simple 7-bit continuous controllers (this might change later).
  • Angles are expressed in radians internally through the Python API by Blender (and therefore the add-on complies), and not in degrees as in the interface. 360° = 2*pi. The Add-on however offers an option for the conversion (done in Python).
  • Using drivers is fast (no Python calculation) and offers nice added features. Instead of targeting an object property directly, you can create an empty object and use its location on the X, Y, Z axes as the support for the information you want to inject into Blender. Then, with a driver, you can further control that information and shape it with a curve to control the object you really want to animate.
  • You can enable both "Tooltips" and "Python Tooltips" in Preferences/Interface/Display to get information about the path of a property when the mouse hovers over it. This helps a lot to solve some problems when setting routes manually.
  • Blender sometimes uses "bridges" internally to describe the path of a property (called nested properties in the documentation). For clarity, you might want to select the proper category "Key" (for shape keys) instead of using "Object" and then having a bridge (like "data") in the path. For instance, these 2 paths point to the same property:
    bpy.data.objects['head'].data.shape_keys.key_blocks['eyeBlink_L'].value
    bpy.data.shape_keys['Key'].key_blocks['eyeBlink_L'].value
  • Another oddity is that, historically, the Node System was only for material shaders; it was then generalized to many uses. However, Blender keeps a special treatment for material shaders: when you enable the node system, you won't find your material available in the Nodes category. Therefore the add-on has special code to handle this frequent exception.

Credits

This Add-on relies on a few libraries: