3D Organ
This is a continuation of my 3D Piano project, so if you haven’t seen that, I’d recommend taking a look at it first!
Once the visualization project came together for the piano, it was pretty clear to me that I wanted to expand it to work with the organ. I figured that, at a high level, most of the work had already been done and could be adapted to work with my organ MIDI console as well.
While I created the piano models myself in Blender, I was fortunate to have a nice starting point for this project. The company from which I bought the MIDI hardware for my console has modeled many of their components and made those models available in SketchUp's 3D Warehouse. I placed these in a scene, positioned and adjusted them as necessary, and added my own structural pieces such as the console table, key cheeks, and music stand. A fair amount of tweaking to the initial components was necessary to get the exported file structure into a hierarchy that was easy to work with in Unity (plus, randomly, fixing one pedal note that had been modeled backward)! The current state of the model closely resembles my actual practice console.
As this project is split into two separate applications, I’ll highlight some of the changes for each.
MIDI Processor
Since this console also sends MIDI signals, the data handled here is quite similar to the data for the piano visualization. However, the MIDI channel of each Note-On/-Off event now matters, because the pedalboard and each manual (keyboard) send MIDI events on different channels. Additionally, there is a new type of OSC message to send, coming from the “swell shoes” — the expression pedals used for controlling dynamics. On my MIDI console, these are basically just foot-controlled potentiometers, and they send control-change messages just like those sent by the knobs on my MPK Mini.
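To make that concrete, here is a small illustrative sketch of how a processor might turn raw MIDI bytes into OSC-style messages. The project itself is not shown here, so the function name and the OSC address patterns are assumptions, not the actual API; the status-byte handling follows the standard MIDI conventions (message type in the upper nibble, channel in the lower one).

```python
# Hypothetical sketch of the MIDI Processor's routing logic.
# Address strings like "/organ/noteOn" are invented for illustration.

def route_midi_event(status: int, data1: int, data2: int):
    """Translate a raw 3-byte MIDI message into an (address, args) tuple."""
    kind = status & 0xF0       # upper nibble: message type
    channel = status & 0x0F    # lower nibble: which manual/pedalboard

    if kind == 0x90 and data2 > 0:                      # Note-On
        return ("/organ/noteOn", [channel, data1, data2])
    if kind == 0x80 or (kind == 0x90 and data2 == 0):   # Note-Off
        return ("/organ/noteOff", [channel, data1])
    if kind == 0xB0:                                    # Control Change,
        return ("/organ/expression", [channel, data1, data2])  # e.g. a swell shoe
    return None                                         # ignore everything else

# A Note-On on channel 2 at velocity 100:
# route_midi_event(0x92, 36, 100) -> ("/organ/noteOn", [2, 36, 100])
```

Note that a Note-On with velocity 0 is treated as a Note-Off, which is a common convention among MIDI devices.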
Visualizer
The visualizer application is where the bulk of the changes took place.
For the 3D Piano project, most of the events were sent to a class called PianoController, which kept track of the various visualization parameters and triggered the visual changes necessary when notes were pressed and released. For this project, I refactored this logic into two layers: an InstrumentController class and a KeyboardController class. Most of the previous PianoController functionality now lives in a KeyboardController subclass named PianoKeyboardController, and I added a PianoInstrumentController class, mainly as a thin wrapper.
The benefits of an InstrumentController are clearer in the case of the OrganInstrumentController subclass — when receiving a Note-On/-Off event, the OrganInstrumentController forwards the event to the proper “keyboard” (in this case, one of the three manuals or the pedalboard) based on the MIDI channel of the signal. The movement of the swell shoes (expression pedals) is also controlled at the Instrument level, since they are not necessarily associated directly with any particular keyboard.
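The two-layer split described above can be sketched roughly as follows. The real project is a Unity/C# application, so this Python is only illustrative; the channel-to-keyboard mapping and the manual names are assumptions.

```python
# Illustrative sketch of the Instrument/Keyboard split, with invented
# channel assignments and manual names.

class KeyboardController:
    """Handles note visuals for a single manual or the pedalboard."""
    def __init__(self, name):
        self.name = name
        self.active_notes = set()

    def note_on(self, note):
        self.active_notes.add(note)       # would trigger particles, emission, etc.

    def note_off(self, note):
        self.active_notes.discard(note)

class OrganInstrumentController:
    """Forwards events to the right keyboard based on MIDI channel."""
    def __init__(self):
        self.keyboards = {
            0: KeyboardController("choir"),
            1: KeyboardController("great"),
            2: KeyboardController("swell"),
            3: KeyboardController("pedal"),
        }
        # Swell-shoe positions live at the Instrument level, since they
        # aren't tied to any one keyboard.
        self.expression = {}

    def handle_note(self, channel, note, is_on):
        keyboard = self.keyboards[channel]
        (keyboard.note_on if is_on else keyboard.note_off)(note)

    def handle_expression(self, shoe, value):
        self.expression[shoe] = value / 127.0   # normalize MIDI 0-127 to 0-1
```

The point of the structure is that the Instrument layer only does routing; everything key-specific stays inside the KeyboardController it forwards to.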
KeyboardController subclasses (ManualController and PedalboardController) handle the differing placements of the visual effects on the various types of keys (the particles emitted from the natural pedal notes spawn from a different area of the pedals than the particles emitted from the natural manual notes). Each KeyboardController subclass also defines the proper indexing offset for translating MIDI messages to specific keys (the MIDI note number associated with the lowest key varies between an 88-key piano, a 61-key organ manual, and a 32-key pedalboard). The separation of concerns between the Instrument and the Keyboards also allows me to make the entire organ exhibit the same visual effects (particle color, key emission color, etc.) or to have separate effect settings for each keyboard.
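The indexing offset mentioned above amounts to subtracting the MIDI note number of each keyboard's lowest key. A brief sketch, using typical MIDI conventions (A0 = 21 for an 88-key piano, low C = 36 for organ manuals and pedalboards) — the project's actual values may differ:

```python
# Hedged sketch of the per-keyboard index translation; the offsets are
# common MIDI conventions, not confirmed values from the project.

LOWEST_NOTE = {
    "piano": 21,        # A0 on an 88-key piano
    "manual": 36,       # low C on a 61-key organ manual
    "pedalboard": 36,   # low C on a 32-note pedalboard
}

def key_index(keyboard_type: str, midi_note: int) -> int:
    """Map a MIDI note number to a zero-based key index on this keyboard."""
    return midi_note - LOWEST_NOTE[keyboard_type]

# Middle C (MIDI 60) lands on different key indices per keyboard:
# key_index("piano", 60) -> 39, key_index("manual", 60) -> 24
```

Each KeyboardController subclass would carry its own offset, so the rest of the code can work purely in key indices.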