MIDI, OSC and More is a new Add-on for Blender 2.8 gathering AddMIDI and AddOSC in the same package. As a novelty, an application for smartphones and tablets is currently being developed to ease the remote control of Blender.
The Add-on is now located in the "N panel", on the right, and has its own tab.
MIDI, OSC and More (M.O.M. for short) inherits the Blender license, the GNU GPL (like all Blender Add-ons).
The mobile application "Blemote", not being an Add-on, is under no obligation to be released under the GPL license, and I'm not yet sure what its license will be.
M.O.M is still considered a work in progress: some functionalities are reliable, but others are far from beta quality, so future versions might be incompatible with the Blender projects you are working on with the current release.
By downloading this software, you understand that M.O.M is still under development and that some problems might occur during its use.
For the last release (Windows, OSX, and Linux), here is the LINK.
For installation, as with any Add-on, don't unzip the file: use "Install From File", then look in the "System" category to enable it.
Developing and maintaining this Add-on has been a lot of work. If you can, please support this project financially.
Even though the Add-on can already be useful, it needs some polishing, and there is room for handy new features and improvements. Please share your thoughts.
A route is the basic element connecting an external source to a Blender property. Currently routes are either MIDI or OSC, and can optionally carry some extra parameters for Blemote.
These common settings apply to a Blender property:
Actualization: this new menu gives access to a new expression feature, basically a Python eval() where two reserved keywords are available: IN and PROP. IN is the incoming MIDI value, PROP is the current value of the Blender property. Note that when you deal with angles and use the "Deg" feature, IN reflects the value already converted to radians, and Blender expects the expression to return radians as well. The default mode is "replace", which simply sets the Blender property to the incoming value. Note that you can use NumPy with the "np" prefix, like this: np.log(IN)
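As an illustration, here is a minimal sketch in plain Python of how such an expression could be evaluated outside Blender (apply_expression is a hypothetical helper for this example, not the add-on's actual code):

```python
import math
import numpy as np

def apply_expression(expression, midi_value, prop_value, use_degrees=False):
    """Hypothetical sketch of the "expression" actualization: IN is the
    incoming value, PROP the current Blender property value. With the
    "Deg" feature, IN is first converted to radians."""
    IN = math.radians(midi_value) if use_degrees else midi_value
    # Only IN, PROP and the "np" prefix are exposed to the expression.
    return eval(expression, {"np": np}, {"IN": IN, "PROP": prop_value})

# The default "replace" mode is equivalent to the identity expression:
print(apply_expression("IN", 0.5, 0.0))                      # 0.5
# Smooth an incoming controller value against the current property:
print(apply_expression("0.9 * PROP + 0.1 * IN", 1.0, 0.0))   # 0.1
# The np prefix works as in the documentation example:
print(apply_expression("np.log(IN)", math.e, 0.0))
```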
Because manually editing each route's data path can be tedious, there is a new entry in the contextual menu to do it automatically. With the mouse cursor over the targeted property in its panel, invoke the menu with the right button and choose "Create realtime route". A new route will then appear in the list with the proper parameters.
You then need to choose whether it will be a MIDI or an OSC route, set the related additional parameters, and decide whether your new route is for sending, receiving, or both at the same time!
Note: if a route is "non functional" and you try to set it for real use, its fields will appear in red until you fix the issue.
More info on this last option:
And to explain point n°5 a little more:
Here are 2 examples of multi route use:
The first example is a mix of various objects scaled on Z with one route.
http://www.jpfep.net/static/objects_multi.blend
The second is my own piano example from AddMIDI, revisited. This file also serves as an example in the Midifile Conversion section below.
Important note: the Play and Pause buttons are two ugly workarounds needed to use sync out correctly, as I cannot detect the user interaction (when you press the space bar to start playing, for instance). So use these buttons instead if you need sync out. I'm still trying to find a proper solution...
Currently, synchronization needs to be reworked and is mostly experimental. But let's recall a few things:
You need MIDI Clock and SPP at the same time for a complete remote playing feature, even though they can be selected individually.
Note 1: received MIDI ticks are not used to adjust the frame rate yet, but this might be implemented soon. It's however usable over short periods if you manage to keep the FPS steady in Blender.
Note 2: sending ticks is somewhat jerky and has not been tested much.
The MIDI standard has 2 ways to terminate a note: either a note-off event, or a note-on message with a null velocity. This latter option is what most cheap keyboards use. Real note-off events have the luxury of carrying a velocity value that reflects the way the key is released. The Add-on can exploit both situations, but sometimes you might find it simpler to ignore that extra velocity information. This option converts your note-off events into note-on events with a null velocity.
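The conversion this option performs can be sketched in a few lines of plain Python (normalize_note_off is an illustrative name, not the add-on's code); in MIDI, a note-off has status byte 0x8n and a note-on 0x9n, where n is the channel:

```python
def normalize_note_off(status, data1, data2):
    """Turn a real note-off (status 0x8n) into a note-on (0x9n) with a
    null velocity, discarding the release velocity."""
    if status & 0xF0 == 0x80:                      # note-off, any channel
        return (0x90 | (status & 0x0F), data1, 0)
    return (status, data1, data2)                  # other events untouched

# Note-off for middle C (60) on channel 3 with release velocity 64:
print(normalize_note_off(0x82, 60, 64))   # (146, 60, 0): note-on, velocity 0
```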
The Tempo parameter is used when MIDI Clock and SPP are sent from Blender. This value will also reflect any tempo information contained in a midifile.
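For reference, MIDI Clock runs at 24 pulses per quarter note, and SPP counts 16th notes (6 clocks each). The relation between tempo, frame rate, and ticks can be sketched like this (function names are illustrative, not the add-on's code):

```python
def clocks_per_frame(bpm, fps):
    """MIDI Clock sends 24 pulses per quarter note, so this is the
    number of clock ticks to emit per played or rendered frame."""
    return bpm / 60.0 * 24.0 / fps

def song_position(frame, bpm, fps):
    """SPP counts MIDI beats (16th notes = 6 clocks) since the start."""
    return int(frame * clocks_per_frame(bpm, fps) // 6)

print(clocks_per_frame(120, 24))    # 2.0 ticks per frame at 120 BPM, 24 fps
print(song_position(48, 120, 24))   # 16 sixteenth notes after 2 seconds
```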
Note: currently you have to place the midifile in the Blender project folder and fill in the name field manually.
A new option to avoid "extrapolation", located to the right of the midifile name, inserts supplementary initialization keyframes each time a note is played for the first time. If a note occurs at the very beginning of the midifile, a null keyframe is inserted one frame before in the Blender animation system; otherwise it is inserted at the first starting frame.
The Add-on doesn't force you to convert the midifile into F-curves. Depending on your choices, the MIDI data contained in the midifile will update the Blender properties in realtime (either while playing or while rendering). This removes a lot of tedious post-editing of the animation data, and is most welcome for notes.
However, if you are after realistic representations of instruments, like a piano for instance, you might want to see the subtle movement of keys as in real life. For that you may prefer to work with F-curves and the new envelope feature.
Note: currently this realtime contribution works best for rendering (see workflow considerations below). Contributing while playing does in fact work "correctly", but it can play tricks on you by injecting events into the Blender scene while you are still working on it. By the way, Blender has a little bug: when you start playing an animation, the first 2 frames are not reported by its built-in Python "handler". So always use the frame offset feature to reintroduce a few frames as a safety margin; otherwise the first MIDI events won't be injected into your routes.
First and foremost, check this blendfile example to see how midifile conversion actually works, with a piece by Erik Satie. You will see the piano keys moving while listening to the corresponding mp3 in the VSE. A provided text lists the steps; it's very simple to get the result.
To explain a little more: conversion is always made according to your route settings (set to "RECEIVE" and with REC enabled). So you have to select which track(s) and event(s) are converted, otherwise nothing happens.
There are now 2 modes of conversion, which mostly concern note events:
Note: in all cases tempo values are evaluated along the conversion, allowing the fine tempo variations that some musicians appreciate.
Note 2: the envelope feature is still in its infancy and expects some specific conditions to work correctly (such as being used just after a MIDI conversion, without manual editing of the F-curves). The group you choose should be void of any previous keyframes; if not, the add-on will try to process these and might fail. You should be fine most of the time, but someone once had a Python error (division by zero) that was in fact related to some orphaned data: a Blender Action object was still there as a phantom, and the add-on was able to access the keyframes of that object because they were in the same group as another active route. Things went back to normal after cleaning the orphaned data listed in the outliner.
"0.0.0.0" allows to listen to all the network devices of your computer, you might want to restrict that later but it's handy for tests. The connections are updated as soon as you change a parameter if enabled.
The input connection can sometimes appears in red, this reveals a problem with another software blocking the port. The server will try to reconnect automatically and revert to the normal as soon the problem is solved on the other side.
The 2 switches allow to disable temporary sending or receiving.
The "Debug" option will copy in the console a report as soon something is received by the add-on. For that you need of course to start Blender from a shell window.
Location being a Vector, you can set an index value to get only X, Y or Z. But since OSC offers more possibilities than the MIDI protocol, it's now possible to send and receive the whole array of values (for vectors or colors).
When you set a route to receive a full array, you have to make sure that the number of values sent matches what Blender expects (and the "n" parameter now has to be set accordingly). For colors, for instance, it can be either 3 or 4 values, depending on whether an extra alpha channel is added to the R,G,B triplet.
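To see what such a full-array message looks like on the wire, here is a minimal OSC encoder using only the Python standard library (a sketch for illustration; a real client would normally use an OSC library). OSC pads strings to multiples of 4 bytes and stores floats as big-endian 32-bit values:

```python
import struct

def osc_message(address, floats):
    """Encode a minimal OSC message carrying a list of floats,
    e.g. an RGB or RGBA color for a full-array route."""
    def pad(b):
        # OSC strings are NUL-terminated and padded to a multiple of 4.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(("," + "f" * len(floats)).encode())
    for value in floats:
        msg += struct.pack(">f", value)
    return msg

# An RGB color is 3 floats, so "n" on the Blender side must be 3:
packet = osc_message("/blender", [1.0, 0.5, 0.0])
print(len(packet))   # 12 (address) + 8 (type tags) + 12 (floats) = 32 bytes
```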
"From" and "n", allow to extract in a list of messages a subset by specifying the rank of the first item and the number of items. "0" is the item just after the OSC address (like /blender) and generally doesn't need to change for simple message. "n" however has to reflect the size of the targeted Blender array.
Pick button: automatically fills the Address based on the last received OSC message.
Multi routing: please refer to the MIDI multi route settings, as they are mostly the same. In the context of OSC, the purpose is to handle messages like the ones sent by Face Cap, notably the 52 blendshape parameters (Shape Key values), formatted like this:
/W + 1 Int + 1 Float (blendshape index, value)
When the multi route is enabled for OSC, the add-on will search for an index at position "0" of the list and assign it to "VAR"; consequently you have to set the "From" parameter to "1" to skip the index and focus on the payload just after it. With the Face Cap blendshape parameters, "n" stays at "1" since there is just a single float value to pick.
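As a sketch, receiving one of Face Cap's /W messages with these settings (From = 1, n = 1) could look like this in plain Python (parse_multi_route is an illustrative name, not the add-on's code):

```python
def parse_multi_route(args, start=1, n=1):
    """Multi-route reception: the value at rank 0 becomes VAR (the
    blendshape index); with From=1 and n=1 the payload is the single
    float just after it."""
    var = args[0]
    payload = args[start:start + n]
    return var, payload[0] if n == 1 else payload

# /W 12 0.85  ->  blendshape index 12 set to 0.85
print(parse_multi_route([12, 0.85]))   # (12, 0.85)
```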
Note: OSC multi routes can now send. The add-on will insert the index automatically for you.
Note 2: the index in multi routes counts from 0 (for the first item).
Note 3: if you use the new actualization mode "expression", be aware that IN may represent an array of values, and in this case your expression should output an array of the same size. Fortunately NumPy, a scientific module for Python, makes it easy to manipulate arrays (and even simple values most of the time) and is exposed through the "np" prefix.
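For example, with a full-array route the same expression can transform a whole color at once thanks to NumPy broadcasting (a standalone sketch of the evaluation, outside Blender):

```python
import numpy as np

# With a full-array route, IN is a vector and the expression must
# return a vector of the same size. Broadcasting makes the same
# expression work for a single float or a whole RGB triplet.
expression = "np.clip(IN * 2.0, 0.0, 1.0)"

IN = np.array([0.1, 0.4, 0.8])   # an incoming RGB color
result = eval(expression, {"np": np}, {"IN": IN})
print(result.shape)              # (3,) -- same size as IN
```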
This new panel will probably grow in the future to offer a few facilities. You can now:
Routes sorting, when set to "Category", will display only the routes belonging to that category.
The options in the "Extra route parameters" section allow you to show or hide some of the recently added parameters, to keep the route settings readable and the presentation simple. The route category is hidden by default.
Blemote is a Kivy application written in Python, for Android smartphones and tablets, that remotely controls the properties set in the Add-on's routes. See this page for more information.
Please be cautious with the "rec" button. It relies on the "is_playing" flag that Blender turns "on" and "off" when you play the current animation OR when you move the playhead manually.
You might be better off disabling "rec" on a route right after recording, to avoid unwanted keyframe insertions if you move the playhead inadvertently.
You don't need to enable the auto-keying button in the Timeline, it's a different feature.
About MIDI and recording:
Many people will want to "record" MIDI events with the Blender animation system. There are however a few fundamental differences that are not so easy to grasp.
Here are some facts about the Blender animation framework:
Consequently, it is fundamentally not suited to recording polyphonic tracks, and even a monophonic melody has to be spread over 2 channels (one for note values, one for velocities). MIDI controllers, on the other hand, are compatible with the Blender system (except poly aftertouch).
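To make the two-channel point concrete, here is a sketch (hypothetical names, events given as (frame, note, velocity) tuples) of how a monophonic melody splits into a note channel and a velocity channel; both channels would then need CONSTANT interpolation so values jump rather than glide:

```python
def melody_to_channels(events):
    """Spread a monophonic note stream over two keyframe channels,
    one for note values and one for velocities, since an F-curve
    holds a single value per frame. A velocity of 0 ends the note
    (see the note-off section above)."""
    note_keys, velocity_keys = [], []
    for frame, note, velocity in events:
        note_keys.append((frame, note))
        velocity_keys.append((frame, velocity))
    return note_keys, velocity_keys

# C4 from frame 10 to 20, then E4 from frame 22 to 30:
events = [(10, 60, 100), (20, 60, 0), (22, 64, 90), (30, 64, 0)]
notes, velocities = melody_to_channels(events)
print(notes)        # [(10, 60), (20, 60), (22, 64), (30, 64)]
print(velocities)   # [(10, 100), (20, 0), (22, 90), (30, 0)]
```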
It can be done however, either in realtime or by converting a midifile; the advantage is that once the MIDI events are converted, the result is more predictable and further processing can be done on the F-curves.
The second important task is to set the interpolation mode to CONSTANT for every converted note; otherwise you will see some very weird, slow playing on screen. Alternatively, the new envelope feature avoids this problem.
If you perform a realtime "recording" in Blender of MIDI events sent by a running digital audio workstation, you can manually apply the envelope to the note events after the take.
It's a very personal subject, but if I may share a few hints, I would advise working as much as possible in realtime, without bothering with midifile conversion. The "revolutionary" aspect of this add-on is, for the first time in history, being able to work on music and 3D visuals at the same time, under very good conditions. EEVEE is a blessing and a strong opportunity for a new kind of creator wanting to push the art of videoclips way beyond its current state. Computers are fast, storage and RAM are cheap; it's time to invest in some new hardware and be the first to seize this opportunity.
These new conditions will nonetheless still call for some reflection about means in art. You might still have to refrain from using overly heavy objects or scenes in order to keep the frame rate steady. Art is about emotion, not technique, and time is what matters.
Then you might want to produce a first final video. Things may get less clear at this point, as several strategies can lead to the final product.
To get a very predictable result and tight syncing, exporting to a midifile (or, later, to a textfile in the case of OSC) is probably the best option. For MIDI data, converting notes to keyframes can probably be avoided. One limit could be that you need more than 16 tracks; I would then advise converting only monophonic channels, or breaking the project into several midifiles.
On this subject, it might be a good idea, especially for a project longer than a simple video clip, to break each part into several midifiles, and maybe into several D.A.W. files as well. Blender would render each sequence as frames that you would in turn gather into a final video project (using either the Blender VSE or another application).
This Add-on relies on a few libraries: