
I am tearing myself away from the code to write this post, just because it has been way too long since my last one. Usually I write a post when I hit some kind of coding or design milestone, but all through January I have been working away on the same thing and it's not quite done yet. When it is done, nothing will outwardly change, because it's a big internal refactoring which needs to be done for the instrument system.

I will go into detail on what exactly I am doing at the moment but first of all: I am alive and well. I am still working on Blockhead every day. I am still happy with the progress I am making even if from the outside it seems like things are moving a bit slowly.

I haven't run out of money yet. I am still living off Patreon and occasional contracting work. I wrote a cry-post on Twitter a while ago about possibly open-sourcing the project, with the idea that doing so might generate more attention for the project and therefore more Patreon subscribers. Shortly after that I was contacted by a nice man with a very impressive background in the audio industry who has expressed an interest in helping me out, possibly by helping me start an actual company and get more developers on board. This is all still completely up in the air and nothing is certain yet. But there is a possibility of Blockhead becoming an actual product in the future rather than whatever it is now.

I have changed my mind a lot about how the instrument system is going to work, so a lot of what I said in the last devlog is now wrong. A summary of my current plans: the instrument system in Blockhead is going to work similarly to the way it works in other DAWs. The key points (there's a rough sketch after this list) are:

  • There will be a new MIDI block representing note on/off events, which at a basic level will work a lot like the blocks in a typical piano roll interface, with the left edge of the block representing the note-on event and the right edge representing the note-off.
  • A MIDI block will be set up to point at a MIDI instrument. A MIDI instrument will be a floating device that you can add to the project; it just sits there waiting for events, similar to how instruments work in normal DAWs.
  • A MIDI instrument will either be a third-party plugin (VST/CLAP) or a user-defined instrument built up using the instrument editor.
  • The user-defined instruments will basically be a bunch of blocks mapped to key-zones and velocity levels (plus a bunch of bells and whistles).
  • The manipulator system will be extended to support generating MIDI-based modulation events which can be routed to MIDI parameter targets. Currently manipulators can only target parameters exposed by Blockhead's native plugin format (Blink). If I do everything correctly then we should be able to use manipulators to control arbitrary MIDI parameters with a similar level of expressiveness. This brings me to what I'm working on now...
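
To make those points a bit more concrete, here is a very rough sketch of the kind of data layout they describe. None of these types exist in Blockhead; the names (MidiBlock, MidiInstrument, KeyZone, and so on) are just my own illustrative guesses at how the pieces could relate, not the actual design.

```cpp
#include <cstdint>
#include <string>
#include <variant>
#include <vector>

// A note-on/note-off pair, like a block in a piano roll: the left edge of the
// block is the note-on time, the right edge is the note-off time.
struct MidiNote {
    std::int64_t start;    // note-on position
    std::int64_t end;      // note-off position
    std::uint8_t key;      // MIDI note number
    std::uint8_t velocity;
};

// Part of a user-defined instrument: a block mapped to a key zone and a
// velocity range (the "bells and whistles" are omitted here).
struct KeyZone {
    std::uint8_t low_key, high_key;
    std::uint8_t low_velocity, high_velocity;
    std::uint64_t block_id; // the block that plays when a note lands in this zone
};

struct UserInstrument {
    std::vector<KeyZone> zones;
};

// A third-party plugin instrument (VST/CLAP).
struct PluginInstrument {
    std::string plugin_path;
};

// A MIDI instrument is either a third-party plugin or a user-defined
// instrument built in the instrument editor.
struct MidiInstrument {
    std::variant<PluginInstrument, UserInstrument> impl;
};

// A MIDI block on the timeline just points at an instrument and holds notes.
struct MidiBlock {
    std::uint64_t instrument_id;
    std::vector<MidiNote> notes;
};
```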

The parameter and manipulator systems are not currently set up in a way that allows me to easily introduce MIDI manipulators into the mix, so I've been refactoring everything to do with how parameters and manipulators are represented internally. I thought this would be a small refactor but it turned out to be very large and has taken up the entirety of January so far.

Up until now, the only kind of parameters have been those exposed by Blink plugins. Blink plugins and their parameters work very differently from typical audio plugin formats, so I can't do anything generic here. The manipulator system currently expects everything to be a Blink parameter and generates data in a format that is ready to be sent on to the Blink plugins.

This is not good enough if manipulators are also supposed to work with MIDI parameters and generate MIDI data, so I've been refactoring everything so that manipulators are sort of "target-agnostic". Everything to do with Blink is now decoupled from the main manipulation system. Blockhead now knows how to generate specialized manipulators which operate on the various kinds of Blink parameter (envelopes, chords, options and sliders). Once this is all sorted out, adding MIDI manipulators should be much less painful than trying to tack them on to the old system.
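
As a rough illustration of what "target-agnostic" could mean in practice, here is a hypothetical shape for that refactor: the manipulator core talks to an abstract target, and the Blink- and MIDI-specific behaviour lives behind it. These class names are made up for this sketch and are not Blockhead's actual types.

```cpp
#include <memory>

// The manipulator core no longer assumes its target is a Blink parameter;
// it just pushes modulation values at some abstract target.
struct ManipulatorTarget {
    virtual ~ManipulatorTarget() = default;
    virtual void apply(double position, double value) = 0;
};

// Blink-specific targets live behind the interface, one specialization per
// Blink parameter kind (envelope, chord, option, slider).
struct BlinkEnvelopeTarget : ManipulatorTarget {
    void apply(double position, double value) override {
        // write into the envelope point data that the Blink plugin reads
    }
};

struct BlinkSliderTarget : ManipulatorTarget {
    void apply(double position, double value) override {
        // set the slider value on the Blink plugin
    }
};

// Later, a MIDI target can slot in without touching the manipulator core:
// it would translate values into CC / pitch-bend events instead.
struct MidiCCTarget : ManipulatorTarget {
    void apply(double position, double value) override {
        // emit a MIDI CC event at `position`, scaled from `value`
    }
};

struct Manipulator {
    std::unique_ptr<ManipulatorTarget> target;
    void process(double position, double value) { target->apply(position, value); }
};
```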

MIDI modulation is represented as a stream of MIDI events, which is completely different from how Blink parameters work. I am not sure what the correct terminology is here but Blink parameters sort of work "offline" in that all the parameter data for the entire duration of the block is always available. For example when the playback hits the start of a Blink plugin block, the plugin already knows about all the envelope points to the right of the playback position, because that's just data that it has free access to. This way of getting envelope data to the plugins is also important for the sampler plugins to be able to generate sample waveforms very quickly.
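
A tiny sketch of what that "offline" access enables: because every envelope point is already known, the value at any position, including positions in the future, can simply be looked up and interpolated, which is also what makes fast waveform previews possible. This is illustrative C++, not Blink's actual envelope representation.

```cpp
#include <algorithm>
#include <vector>

struct EnvelopePoint { double x, y; };

// Evaluate a fully-known envelope at any position via linear interpolation.
double evaluate(const std::vector<EnvelopePoint>& points, double x) {
    if (points.empty()) return 0.0;
    if (x <= points.front().x) return points.front().y;
    if (x >= points.back().x) return points.back().y;
    auto next = std::lower_bound(points.begin(), points.end(), x,
        [](const EnvelopePoint& p, double v) { return p.x < v; });
    auto prev = next - 1;
    const double t = (x - prev->x) / (next->x - prev->x);
    return prev->y + t * (next->y - prev->y);
}
```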

MIDI is different because it's meant to be this generic way to send data between hardware devices or whatever, so the receiving device doesn't know what an envelope is going to do in the future. It just has to wait for the MIDI events and then respond to them as soon as it gets them. These two worlds are very different so this is why I've tried to establish this strong separation between MIDI and Blink in the codebase.
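
By contrast, an event-driven receiver can only react to what has already arrived. A minimal sketch, using made-up structs rather than any real MIDI API:

```cpp
#include <cstdint>

struct MidiEvent {
    std::uint32_t frame;   // offset within the current audio buffer
    std::uint8_t status, data1, data2;
};

struct SimpleSynthState {
    double cutoff = 1.0;
};

// Hypothetical per-buffer processing: apply each event the moment it arrives.
void process_events(SimpleSynthState& state, const MidiEvent* events, int count) {
    for (int i = 0; i < count; ++i) {
        const MidiEvent& e = events[i];
        const bool is_cc = (e.status & 0xF0) == 0xB0; // control change
        if (is_cc && e.data1 == 74) {                  // CC 74 often maps to cutoff
            state.cutoff = e.data2 / 127.0;            // respond right now...
        }
        // ...but nothing here can know what values will arrive later.
    }
}
```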

Interestingly, I do eventually want to add the ability to "play" Blink plugins in realtime, like MIDI instruments. This is not currently possible - a Blink block is just a static thing on the timeline. You can jiggle a Blink parameter around while it's playing back and it won't click or glitch, but that's only because of a trick the engine does: every time you adjust a parameter, it automatically crossfades out the old version of the block while crossfading in the new version, to cover up the clicks.
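
For anyone curious what that crossfade trick might look like in code, here is a minimal sketch. It assumes the engine has a render of the block with the old parameter state and a render with the new state available for a short window; the equal-power fade is my assumption, not necessarily what Blockhead actually does.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Blend an "old state" render into a "new state" render over one window.
// Assumes old_render and new_render are at least out.size() samples long.
void crossfade(const std::vector<float>& old_render,
               const std::vector<float>& new_render,
               std::vector<float>& out)
{
    const std::size_t n = out.size();
    if (n < 2) return;
    for (std::size_t i = 0; i < n; ++i) {
        // Equal-power fade so the combined level stays roughly constant.
        const float t = static_cast<float>(i) / static_cast<float>(n - 1);
        const float gain_old = std::cos(t * 1.5707963f); // fade old render out
        const float gain_new = std::sin(t * 1.5707963f); // fade new render in
        out[i] = old_render[i] * gain_old + new_render[i] * gain_new;
    }
}
```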

When it comes to playing back Blink plugins live while modulating parameters and so on, I don't think this kind of state-crossfading strategy is very good (sonically or scalability-wise). I'd prefer to be able to send Blink plugins MIDI-style event data and have the plugins themselves respond in an appropriate way. I still need to keep the current "static" parameter data (again, it's crucial to the way the sampler plugins generate waveforms, among other things).

So I'm planning to extend the Blink API to add an extra "realtime parameter" layer to allow plugins to receive event-style data on top of what they are already doing. To borrow some shader terminology from the world of computer graphics, the non-realtime parameter data could be thought of as "uniform" while the realtime event data would be "varying".
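
Purely as an illustration of that uniform/varying split (none of these names are part of the real Blink API, and this isn't a proposed design), the extra layer might conceptually look something like this:

```cpp
#include <cstdint>

// "Uniform" data: the full, static parameter state for the block, known up
// front - the existing offline-style parameter data.
struct UniformParameterData {
    const void* envelope_points; // whole envelopes, option values, etc.
};

// "Varying" data: realtime events that arrive while the block is playing,
// which the plugin must respond to as they happen.
struct RealtimeEvent {
    std::uint32_t frame;     // offset within the current buffer
    std::uint32_t param_id;  // which parameter this event targets
    double value;
};

// A hypothetical process context carrying both layers: the static data the
// plugin already gets today, plus a stream of realtime events on top.
struct ProcessContext {
    UniformParameterData uniform;   // non-realtime, "offline" layer
    const RealtimeEvent* varying;   // realtime event layer
    int varying_count;
};
```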

Anyway, sorry I don't have anything new and cool to show yet. Progress is happening every day. I am fairly well and sleeping okay. Once I get through this refactor and get the parameter/manipulation system working again, I will likely release another build. I think the latest build is a little buggy, so I may decide to put things on hold and work exclusively on bugs and stability for a few weeks. I don't like to do this very often because I often end up fixing bugs in code that is going to be changed anyway, but it is also nice to have a good build just for playing around with.

Hope you are all well
