It's cool in some ways, but when I spend hours tracking down a bug instead of playing music it's a bit of a trap as well.
What inspired me? Well, I owned a DX7 when they first came out and every now and again I miss it. I (naively) thought it would be a fairly simple task to program: it was designed in the late 1970s, computers now have a heap more processing power than they did then, and I'd seen that some people had written a C version with the source code available. So I just thought I'd port their code and integrate it with the Raspberry Pi's MIDI and audio support. What I didn't realise was that the DX7 used dedicated logic hardware to do most of the sound generation, so even with optimised code the 1 GHz Raspberry Pi is stretched to do the calculations in time at 48 kHz. On top of that, the open-source code I got had serious errors in it, and it was hard to follow: everything in one function (slight exaggeration :-)) with obscure variable names and the like. So I ended up redesigning and starting from scratch.
Quite interesting to try out new ideas though. The voice assignment is interesting because Yamaha gave the first note priority: if all 16 voices are in use, any extra keys you play are ignored. Roland do it the other way, dropping the oldest note to allow the new note to sound. I'd always thought the Roland way was better, but I coded both and the Yamaha method sounds best: you tend to latch on to the first note, and when it is terminated abruptly you notice it far more than the notes that never played.
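The two schemes are easy to sketch side by side. This is just an illustration, not my emulator's actual code; the names (`Voice`, `alloc_first_note`, `alloc_steal_oldest`) are made up, and I've used 4 voices instead of 16 to keep the example short.

```c
#include <assert.h>
#include <stdint.h>

#define NUM_VOICES 4   /* the DX7 has 16; 4 keeps the example short */

typedef struct {
    int      active;   /* 1 while the voice is sounding */
    uint8_t  note;     /* MIDI note number */
    uint32_t order;    /* allocation counter: lower = older */
} Voice;

static uint32_t counter = 0;

/* Yamaha style: first-note priority. If every voice is busy the
   new note is simply ignored. Returns the voice index, or -1. */
int alloc_first_note(Voice v[], uint8_t note)
{
    for (int i = 0; i < NUM_VOICES; i++) {
        if (!v[i].active) {
            v[i].active = 1;
            v[i].note   = note;
            v[i].order  = counter++;
            return i;
        }
    }
    return -1;  /* all voices busy: drop the NEW note */
}

/* Roland style: steal the oldest sounding voice for the new note. */
int alloc_steal_oldest(Voice v[], uint8_t note)
{
    int oldest = 0;
    for (int i = 0; i < NUM_VOICES; i++) {
        if (!v[i].active) { oldest = i; break; }      /* free voice: take it */
        if (v[i].order < v[oldest].order) oldest = i; /* track the oldest */
    }
    v[oldest].active = 1;
    v[oldest].note   = note;
    v[oldest].order  = counter++;
    return oldest;
}
```

The audible difference is exactly what I described: with `alloc_steal_oldest` the note you've been holding longest gets cut off mid-sound, which your ear catches immediately, whereas with `alloc_first_note` the extra note just quietly never happens.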
I should finish it. I took it as far as I could without a DX7 to compare against. Then I got a Volca FM and hooked them both up to my oscilloscope to compare. I've fixed the envelopes so that they match, but that has thrown the sine-wave modulation out, so I need to check varying degrees of modulation against the waveform produced. One day, when I have some free time ...
I wrote a MIDI sequencer program in the 80s, which I guess was similar to a DAW but without any audio processing. The operating system was single-user, so I used two separate hardware interrupts to run the background processes. One came from a timer and simply updated the current song-position pointer. The other emptied the MIDI input queue, time-stamped each event, and fed notes into the output queue when it was time to play them. The rest of the program updated the screen, trying to move the cursor and scroll the display in time with what was being played. If it couldn't keep up the display would get jittery, but the music playing wouldn't get interrupted.
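The shape of that scheme can be sketched as below. Here the two interrupt service routines are simulated as plain function calls (on the real machine they were wired to hardware interrupts), and all names (`timer_isr`, `midi_isr`, `MidiEvent`) are illustrative, not from the original program. The key point is that the ISRs do only the time-critical work, so the screen code can fall behind without the music stuttering.

```c
#include <assert.h>
#include <stdint.h>

#define QUEUE_LEN 64

typedef struct {
    uint32_t tick;              /* song position at which to send it */
    uint8_t  status, data1, data2;
} MidiEvent;

static volatile uint32_t song_pos = 0;   /* advanced only by the timer ISR */

static MidiEvent out_q[QUEUE_LEN];       /* circular output queue */
static int out_head = 0, out_tail = 0;
static int notes_sent = 0;               /* stands in for the MIDI UART */

/* Interrupt 1: the timer tick just advances the song position. */
void timer_isr(void) { song_pos++; }

/* The foreground code queues events for future playback. */
void enqueue(MidiEvent e)
{
    out_q[out_tail] = e;
    out_tail = (out_tail + 1) % QUEUE_LEN;
}

/* Interrupt 2: send any queued events whose time has come.
   (The real handler also time-stamped incoming MIDI; omitted here.) */
void midi_isr(void)
{
    while (out_head != out_tail && out_q[out_head].tick <= song_pos) {
        notes_sent++;   /* here the real code wrote to the MIDI port */
        out_head = (out_head + 1) % QUEUE_LEN;
    }
}
```

Everything else (cursor movement, scrolling) ran in the main loop and could be arbitrarily late, which is exactly why only the display got jittery under load.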
So anyway - a long-winded way of saying that I think you're right: the UI part of the DAW will definitely be sending messages, or somehow communicating, with the code that handles the levels. Perhaps it simply never deletes a message once it has acted on it, or multiple messages get sent?
Long reply ... sorry!
- This reply was modified 5 years ago by Jonathan Marshall.