I’m really curious about how Hapax might work as a more dynamic melodic quantizer. I’ll send this as a feature request, but before doing so, I want to see if anyone else has similar ideas.
Right now, we can set a global scale for a project via the pScale setting, but my projects usually contain key changes and/or modulations, so the idea of a single project-level key+color+scale doesn’t really add any value for me.
I’ve spent a lot of time working on this exact kind of thing in my modular: an O_c operating as a quad Turing machine feeds an Instruo QPQ, which quantizes the O_c’s CV based on MIDI melodies/chords played live or sequenced. The quantized outputs then go into a second O_c operating as a quad S&H, whose triggers are driven by a Gatestorm.
What I would like to see in Hapax is some kind of real-time analysis of one or more tracks’ MIDI note data, which would set the pScale (or a similar setting) on the fly. This could be as simple as setting the pScale to whatever notes are currently being played on a given track at any given time, allowing the Scaler effect on any track to quantize that track’s MIDI note data to just those notes, or as complex as inferring the scale and key from the notes played, so that missing notes could be generated as well.
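To make the “simple” version concrete, here’s a rough sketch of what quantizing one track to the notes currently held on another track might look like. Everything here (the `quantize_to_held` name, the held-note set) is my own illustration, not anything Hapax actually exposes:

```python
# Hypothetical sketch: snap an incoming MIDI note to the nearest pitch class
# currently sounding on a reference track. Names are illustrative only.

def quantize_to_held(note: int, held_notes: set[int]) -> int:
    """Snap a MIDI note (0-127) to the nearest currently-held pitch class."""
    if not held_notes:
        return note  # nothing to quantize against, pass through
    allowed = {n % 12 for n in held_notes}
    # search outward from the input note for the closest allowed pitch
    for offset in range(12):
        for candidate in (note - offset, note + offset):
            if 0 <= candidate <= 127 and candidate % 12 in allowed:
                return candidate
    return note

# e.g. with a C major triad held, C#4 (61) snaps down to C4 (60)
# quantize_to_held(61, {60, 64, 67}) -> 60
```

The equal-distance tie is broken downward here; a real implementation would presumably want that direction (and the pass-through behavior on an empty set) to be configurable.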
I’ve actually been playing with some ideas for how to intelligently infer a scale from a given set of inputs; for example, this rudimentary scale finder can do something along these lines. The next step would be to set up weights for each possible output scale and then build Web MIDI API support into a working prototype so I can try it out.
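For anyone curious what I mean by weighting, here’s a minimal sketch of the kind of scoring I have in mind: count how often each pitch class was played, then score every root/scale candidate by in-scale hits minus out-of-scale hits. The tiny scale table and the scoring rule are assumptions for illustration, nothing more:

```python
# Rough sketch of weighted scale inference. Score each (root, scale) candidate
# by how well its pitch classes cover the observed notes, weighted by how
# often each pitch class was actually played. Illustrative assumptions only.

from collections import Counter

# interval patterns in semitones from the root (deliberately tiny table)
SCALES = {
    "major":         [0, 2, 4, 5, 7, 9, 11],
    "natural minor": [0, 2, 3, 5, 7, 8, 10],
}

def infer_scale(notes: list[int]) -> tuple[int, str]:
    """Return the (root_pitch_class, scale_name) best covering the input."""
    counts = Counter(n % 12 for n in notes)
    best, best_score = (0, "major"), float("-inf")
    for root in range(12):
        for name, intervals in SCALES.items():
            member = {(root + i) % 12 for i in intervals}
            # reward in-scale pitch classes, penalize out-of-scale ones,
            # each weighted by its play count
            score = sum(c if pc in member else -c for pc, c in counts.items())
            if score > best_score:
                best, best_score = (root, name), score
    return best
```

Note that relative keys (C major vs. A minor) tie under this scoring and are broken by table order, which is exactly the kind of ambiguity the per-scale weights would be there to resolve.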
It would be amazing if Hapax could do something like the above!