The coding aspect is novel, I'll admit, and something an audience may find interesting, but I've yet to hear any examples of live-coded music (or even coded music) that I'd actually want to listen to. They almost always take the form of bog-standard house or techno, which I don't find that enjoyable.
Additionally, the technique is fun for demonstrating how sound synthesis works (like in the OP article), but anything more complex or nuanced is never explored or attempted. Sequencing a nuanced instrumental part (or several) requires a lot of moment-to-moment detail, dynamics, and variation, all of which is tedious to sequence and simply doesn't play to this format's strengths.
So again, I want to integrate this skill into my music production tool set, but aside from the novelty of coding live, it doesn't appear well-suited to making interesting music in real time. And for offline sequencing there are better, more sophisticated tools, like DAWs or trackers.
Consider this: there are teenagers today, out there somewhere, learning to code music. Remember when synthesisers were young and cool and there was an explosion of different engines and implementations?
This is happening for the kids, again.
Try to use this new technology to replicate the modern sound, then the old sound, and then discover new sounds, like we synth nerds have been doing for decades.
Aside from the novelty factor (due to the very different UI/UX) and the idea that you can use generative code to make music (which has become even more interesting in the age of LLMs), I agree.
And even the generative-code part I mentioned is itself a novelty, and isn't really practical for someone whose end goal is actually making music (rather than experimenting with the tech, or with how far one can get with a music-as-code UI/UX).
I think this format of composition is going to encourage a highly repetitive structure in your music. Good programming languages constrain you and prevent the construction of bad programs. Applying that to music is effectively going to give you quantization of every dimension of composition.
I'm sure it's possible to break out of that, but you're fighting an uphill battle.
There's a learning curve for sure, but it's not too bad once you learn the basics of how audio and MIDI are handled, plus the general JUCE application structure.
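In case it helps, here's a rough sketch of what "how audio and MIDI are handled" boils down to: the host repeatedly hands you an audio buffer and the MIDI events for that block, and you write into the buffer in place. In a real plugin this is your AudioProcessor::processBlock() override with the usual boilerplate around it; the function name, the level variable, and the gain logic below are made up for illustration.

    #include <juce_audio_basics/juce_audio_basics.h>

    // Shape of the per-block callback: consume this block's MIDI, then
    // produce/modify audio in place. 'level' stands in for plugin state.
    static void renderBlock (juce::AudioBuffer<float>& buffer,
                             const juce::MidiBuffer& midi,
                             float& level)
    {
        for (const auto metadata : midi)            // MIDI events for this block
        {
            const auto msg = metadata.getMessage();
            if (msg.isNoteOn())
                level = msg.getFloatVelocity();     // react to note-ons
        }

        buffer.applyGain (level);                   // write/modify the audio
    }

Once that callback makes sense, the rest of a JUCE plugin is mostly plumbing around it.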
Two tips:
Don't bother with the Projucer; use the CMake example to get going, especially if you don't use Xcode or Visual Studio.
If you're on a Mac, you might need to self-sign the VST. I don't remember the exact process, but it's something I had to do once I got an M4 Mac.
If you see it as yet another instrument you have to master, then you can go pretty far. I'm finding myself exploring rhythms and sounds in ways I could never do as fast in a DAW, but at the same time I do find a lot of things limiting, especially sequencing.
So far I haven't gotten beyond a good-sounding loop, hence the name "loopmaster", and maybe that's the limit. That's why I made a two-deck "dual" mode in the editor, so it can be played as a DJ set where you don't really need that much progression.
That said, it's quite fun to play with it and experiment with sounds, and whenever you make something you enjoy, you can export a certain length and use it as a track in your mix.
My goal is certainly to be able to create full-length tracks with the nuance and variation you describe; I'm just not entirely sure how to fit that into the format right now.
Feedback[0] is appreciated!
[0] https://news.ycombinator.com/item?id=46052478

[1] Nice example: https://m.youtube.com/watch?v=GWXCCBsOMSg
I must say the narrated trance piece by switch angel blew my socks right off; to me it feels like this should be a genre in itself.
The tools/frameworks have become more plentiful, approachable, and mature over the past 10-15 years, to the point where you can just go to strudel.cc and start coding music right from your browser.
I'll shamelessly plug my weirdo version in a Forth variant, also a house loop running in the browser: https://audiomasher.org/patch/WRZXQH
Well, maybe it's closer to trance than house. It's also considerably more esoteric and less commented! Win-win?
Also, there is an AI DJ mode[0] where you set the mood/genre and the AI generates and plays music for you infinitely.
I don't imagine making a full song out of this, but it would be a great instrument to have.
I'll put $50 down right now.
[0]: https://loopmaster.xyz/loop/75a00008-2788-44a5-8f82-ae854e87...
The janky way to do this would be to run it locally and set up a watch job that reloads the audio file into a VST plugin every time the file changes.
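A minimal sketch of that janky approach, assuming a simple polling loop is good enough and that some external command exists to make the plugin re-read the file (the file name and the "reload_sample" command below are placeholders, not real tools):

    #include <chrono>
    #include <cstdlib>
    #include <filesystem>
    #include <iostream>
    #include <thread>

    int main()
    {
        namespace fs = std::filesystem;
        const fs::path exported = "loop.wav";   // hypothetical export target
        auto lastWrite = fs::exists (exported) ? fs::last_write_time (exported)
                                               : fs::file_time_type {};

        while (true)
        {
            std::this_thread::sleep_for (std::chrono::milliseconds (500));
            if (! fs::exists (exported))
                continue;

            const auto current = fs::last_write_time (exported);
            if (current != lastWrite)
            {
                lastWrite = current;
                // Whatever actually triggers the reload in your plugin/host;
                // "reload_sample" is a made-up placeholder command.
                std::system ("reload_sample loop.wav");
                std::cout << "Change detected, reload triggered\n";
            }
        }
    }

An inotify/FSEvents-based watcher would be less janky, but a half-second poll is plenty for something like this.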
The license at https://github.com/juce-framework/JUCE/blob/master/LICENSE.m... indicates you can just license any module under the AGPL and avoid the JUCE 8 license (which, to be fair, I'm not bothering to read).
And sure, you can license under the AGPL; it should be obvious why that's undesirable.
I'm not going to test it, but couldn't you just load a JSON file with all the params?
Various instructions, etc.
I can't believe it's not code!
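For what it's worth, the line blurs fast: once the JSON carries "various instructions" that something has to walk and interpret, the data is doing the job of code. A toy sketch of that idea, using nlohmann/json (the file name and schema are invented for the example):

    #include <fstream>
    #include <iostream>
    #include <nlohmann/json.hpp>   // https://github.com/nlohmann/json

    int main()
    {
        // e.g. loop.json: {"bpm": 128, "steps": ["kick", "hat", "snare", "hat"]}
        std::ifstream in ("loop.json");
        const auto doc = nlohmann::json::parse (in);

        std::cout << "bpm: " << doc.value ("bpm", 120) << '\n';

        // Walking the "instructions" is where data quietly turns back into code.
        for (const auto& step : doc.at ("steps"))
            std::cout << "trigger " << step.get<std::string>() << '\n';
    }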
Not a fringe, unknown one either, but one with over 20 years of history that's now owned by Beatport.