How did they do it?
-
I'm working on complicated MIDI wizardry for IoT gadgets, so I can make MIDI "smart" pedals and controllers and such. Playing a multitrack MIDI file without reading it into memory is a bear. I have just not been able to get this code right.

The trouble is that each MIDI event has a delta attached to it, which is the offset in "MIDI ticks"* from the previous event. (*A MIDI tick is a fixed time duration based on the tempo and the timebase.) With a multitrack MIDI file, each track has its own sequence of events, and the deltas are all relative to that track. However, in order to play them, you must merge all the tracks into one event stream, adjusting the deltas. The actual adjusting of the deltas isn't so bad, but the logic to figure out when to pull from which track - I'm not even sure I have it right yet, because my code has other issues.

My point in all this is that MIDI is an early-1980s protocol, and multitrack MIDI isn't exactly brand spanking new. Sequencers with scant amounts of RAM were doing this. I feel like in so many ways MIDI was designed to make it possible to do things on little devices without much RAM. But for this particular operation - in C# I just merged the tracks in memory before I played them. I can't afford the RAM or the CPU to do that here. I have to stream everything.

And I've convinced myself I'm overcomplicating things. I hate when I do that - it means I have tunnel vision and/or am missing something big and important. I don't like knowing that I don't know something I need to know, you know? It bugs me, like a song that's stuck in my head that I can't remember the entire hook to.
To err is human. Fortune favors the monsters.
-
honey the codewitch wrote:
However, in order to play them, you must merge all the tracks into one event stream, adjusting the deltas.
Why? Why not have a 'pre-stream' of events that are the absolute times of the next 'event' of each track? Then you just keep polling the track with the next playable event until its 'time' falls behind another track with an upcoming playable event. Poll that one until ...
Our Forgotten Astronomy | Object Oriented Programming with C++
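For what it's worth, a minimal C++ sketch of that idea (the names - TrackCursor, readDeltaAndEvent, and so on - are made up for illustration, not taken from anyone's actual code): each track keeps a running absolute tick count, so the per-track deltas never have to be rewritten, and the merge just emits from whichever track's pending event has the smallest absolute time.

#include <cstdint>
#include <vector>

// A decoded MIDI event; a real one would also carry meta/sysex payloads.
struct MidiEvent { uint8_t status = 0, data1 = 0, data2 = 0; };

// One of these per track. Only the single pending event is held in RAM.
struct TrackCursor {
    uint32_t absTicks = 0;     // absolute time (in ticks) of the pending event
    MidiEvent pending;
    bool done = false;

    // Placeholder for the real parser: read the next delta/event pair
    // straight from the file. Stubbed out here; it just reports end-of-track.
    bool readDeltaAndEvent(uint32_t &, MidiEvent &) { return false; }

    // Pull the next event and turn its per-track delta into an absolute time.
    void advance() {
        uint32_t delta;
        if (readDeltaAndEvent(delta, pending))
            absTicks += delta;          // delta is relative to *this* track only
        else
            done = true;
    }
};

// Emit the next event across all tracks, in absolute-time order.
// Returns false once every track is exhausted.
bool nextMergedEvent(std::vector<TrackCursor> &tracks,
                     uint32_t &when, MidiEvent &ev) {
    TrackCursor *best = nullptr;
    for (auto &t : tracks)
        if (!t.done && (best == nullptr || t.absTicks < best->absTicks))
            best = &t;
    if (best == nullptr)
        return false;
    when = best->absTicks;
    ev = best->pending;
    best->advance();                    // refill that one track's pending slot
    return true;
}

Each TrackCursor would get one advance() call up front to prime it; after that the whole file streams with only one pending event per track in memory, which matches the "n contexts" approach honey describes below.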
-
honey the codewitch wrote:
However, in order to play them, you must merge all the tracks into one event stream, adjusting the deltas.
Why? Why not have a 'pre-stream' of events that are the absolute times of the next 'event' of each track? Then you just keep polling the track with the next playable event until its 'time' falls behind another track with an upcoming playable event. Poll that one until ...
Our Forgotten Astronomy | Object Oriented Programming with C++
That's essentially what I do as far as the deltas go. My pre-stream is n contexts, where n is the number of tracks. I use those to pull events out in the right order.
To err is human. Fortune favors the monsters.
-
honey the codewitch wrote:
However, in order to play them, you must merge all the tracks into one event stream, adjusting the deltas.
Even older than MIDI: mergesort on multiple mag tapes. The issue sounds awfully similar.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
-
Even older than MIDI: mergesort on multiple mag tapes. The issue sounds awfully similar.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
It does. It really does, assuming those mag tapes are interleaved, like the "RAIDed" kind.
To err is human. Fortune favors the monsters.
-
honey the codewitch wrote:
My point in all this is that MIDI is an early-1980s protocol, and multitrack MIDI isn't exactly brand spanking new. Sequencers with scant amounts of RAM were doing this.
From what I recall, having done some MIDI stuff in the days before time on a 1 MHz 8-bit CPU with 4k of RAM, the data stream was at 31.25 kbaud. [insert math equation here] Which worked out to one tick every 10 milliseconds, all 16 channels combined. So all I had to do was preprocess everything in less than 10 ms.
Nothing succeeds like a budgie without teeth. To err is human, to arr is pirate.
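For the curious, a hedged back-of-the-envelope for that missing equation, using nothing but the standard MIDI figures (31250 baud, 8-N-1 framing, so 10 bits on the wire per byte) and the tick definition from the original post:

\[
t_{\text{byte}} = \frac{10\ \text{bits}}{31250\ \text{bits/s}} = 320\ \mu\text{s},
\qquad
t_{\text{3-byte message}} \approx 0.96\ \text{ms}
\]
\[
t_{\text{tick}} = \frac{\text{tempo, in }\mu\text{s per quarter note}}{\text{ticks per quarter note (PPQN)}}
\]

At the SMF default tempo of 500,000 µs per quarter note and a 96 PPQN timebase, for example, a tick works out to roughly 5.2 ms.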
-
honey the codewitch wrote:
And I've convinced myself I'm overcomplicating things.
I think you're overcomplicating things. You could use a simple class for each track. Initialise it with the track length, byte offset/state, and a getNextByte(offset/state) callback function. Implement getNextEventTime(), and getNextEvent(). Now you can get the minimum getNextEventTime() of all tracks, and then getNextEvent() for any track that matched that minimum time. Calling getNextEvent() will update that track's next event time. Rinse & repeat. Rinsing is optional.
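Read as a C++ interface, that suggestion might look something like the sketch below. The method names getNextEventTime(), getNextEvent(), and the getNextByte callback come from the post above; everything else, including the callback signature and the stubbed bodies, is assumed for illustration.

#include <cstdint>
#include <functional>

struct MidiEvent { uint8_t status = 0, data1 = 0, data2 = 0; };

class MidiTrack {
public:
    // Byte-source callback: returns the byte at 'offset', or -1 past the
    // end of the track. It abstracts wherever the data actually lives.
    using GetNextByte = std::function<int(uint32_t offset)>;

    MidiTrack(uint32_t trackLength, uint32_t startOffset, GetNextByte getByte)
        : length(trackLength), offset(startOffset), nextByte(std::move(getByte)) {}

    // Absolute tick time of the next event (a running total of the deltas).
    uint32_t getNextEventTime() const { return nextEventTime; }

    // Stub: a real version would decode the event at 'offset' here, then
    // read the following delta so getNextEventTime() stays one step ahead.
    MidiEvent getNextEvent() { return MidiEvent{}; }

private:
    uint32_t length;
    uint32_t offset;
    uint32_t nextEventTime = 0;
    GetNextByte nextByte;
};

The playback loop then just takes the minimum getNextEventTime() across all tracks and pulls from every track that matches it - which is what the playNextNotes() snippet further down does, with getNextTimestamp() as the name.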
-
I think you're overcomplicating things. You could use a simple class for each track. Initialise it with the track length, byte offset/state, and a getNextByte(offset/state) callback function. Implement getNextEventTime(), and getNextEvent(). Now you can get the minimum getNextEventTime() of all tracks, and then getNextEvent() for any track that matched that minimum time. Calling getNextEvent() will update that track's next event time. Rinse & repeat. Rinsing is optional.
How do I know which track to pull an event from next? That's where it gets weird.
To err is human. Fortune favors the monsters.
-
How do I know which track to pull an event from next? That's where it gets weird.
To err is human. Fortune favors the monsters.
From whichever tracks have the next time equal to the minimum-next-time of all tracks:
void playNextNotes()
{
    // Find the earliest next-event time across all tracks.
    int nextTime = MAX_INTVAL;
    for (auto &track : tracks)
    {
        if (track.getNextTimestamp() < nextTime)
            nextTime = track.getNextTimestamp();
    }
    waitUntil(nextTime);
    // Play every event due at that time; getNextEvent() advances the track.
    for (auto &track : tracks)
    {
        if (track.getNextTimestamp() == nextTime)
            midi.playEvent(track.getNextEvent());
    }
}
-
honey the codewitch wrote:
It bugs me, like a song that's stuck in my head that I can't remember the entire hook to.
I know this feel. The problem is always present in your mind like an earworm. It invades your dreams and you can't escape it even though that is what it will take to gain a fresh perspective on it. Have you tried any of the meditation apps? Maybe you could attend a yoga class? Sometimes writing down or talking out everything you know about the issue can help offload some of the brain activity. There is a solution. Try to be settled by that.
-
From whichever tracks have the next time equal to the minimum-next-time of all tracks: ...
I got it all working last night. The trick was in implementing your "getNextEvent()" method correctly (I don't call mine that, but same-o same-o)
To err is human. Fortune favors the monsters.
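For anyone curious what "correctly" tends to involve here: the parts that usually make a getNextEvent()-style reader hard to get right are the variable-length delta encoding and running status. Below is a rough sketch of just that decoding step - ByteSource, readTrackEvent, and the rest are invented names, meta and sysex payloads are merely skipped, and this is not honey's actual code.

#include <cstdint>
#include <cstddef>

// Minimal stand-in for wherever the track bytes actually come from.
struct ByteSource {
    const uint8_t *data = nullptr;
    size_t pos = 0, len = 0;
    uint8_t readByte() { return pos < len ? data[pos++] : 0xFF; }
    void skip(uint32_t n) { pos += n; }
};

struct MidiEvent {
    uint32_t delta = 0;                 // ticks since the previous event on this track
    uint8_t  status = 0, data1 = 0, data2 = 0;
};

// MIDI variable-length quantity: 7 bits per byte, high bit set on every
// byte except the last.
uint32_t readVarLen(ByteSource &s) {
    uint32_t v = 0;
    uint8_t b;
    do {
        b = s.readByte();
        v = (v << 7) | (b & 0x7F);
    } while (b & 0x80);
    return v;
}

// Data-byte count for a channel message: program change and channel
// pressure take one data byte, everything else takes two.
int dataBytes(uint8_t status) {
    uint8_t hi = status & 0xF0;
    return (hi == 0xC0 || hi == 0xD0) ? 1 : 2;
}

// Read one delta + event from a track chunk. 'runningStatus' must persist
// across calls for the same track. Meta and sysex payloads are skipped.
MidiEvent readTrackEvent(ByteSource &s, uint8_t &runningStatus) {
    MidiEvent ev;
    ev.delta = readVarLen(s);           // the per-track delta, in ticks
    uint8_t b = s.readByte();
    if (b < 0x80) {                     // running status: reuse the last status byte
        ev.status = runningStatus;
        ev.data1  = b;
    } else {
        ev.status = b;
        if (b == 0xFF) {                // meta event: type, length, data
            ev.data1 = s.readByte();
            s.skip(readVarLen(s));
            return ev;
        }
        if (b == 0xF0 || b == 0xF7) {   // sysex: length, data
            s.skip(readVarLen(s));
            return ev;
        }
        runningStatus = b;
        ev.data1 = s.readByte();
    }
    if (dataBytes(ev.status) == 2)
        ev.data2 = s.readByte();
    return ev;
}

A real reader would also act on some of the meta events it skips here - most importantly the tempo meta event, since that changes how long a tick lasts.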
-
From what I recall, having done some MIDI stuff in the days before time on a 1 MHz 8-bit CPU with 4k of RAM, the data stream was at 31.25 kbaud. [insert math equation here] Which worked out to one tick every 10 milliseconds, all 16 channels combined. So all I had to do was preprocess everything in less than 10 ms.
Nothing succeeds like a budgie without teeth. To err is human, to arr is pirate.
Yeah, it wasn't really the speed that was my problem. It was the difficulty of streaming MIDI file tracks while merging them, without loading more than I absolutely had to into RAM at once. I got it working. It only keeps N messages in memory at a time, where N is the number of tracks. That's about as good as it gets, I think.
To err is human. Fortune favors the monsters.
-
I know this feel. The problem is always present in your mind like an earworm. It invades your dreams and you can't escape it even though that is what it will take to gain a fresh perspective on it. Have you tried any of the meditation apps? Maybe you could attend a yoga class? Sometimes writing down or talking out everything you know about the issue can help offload some of the brain activity. There is a solution. Try to be settled by that.
I found the solution. It couldn't hide from me forever. :)
To err is human. Fortune favors the monsters.
-
honey the codewitch wrote:
I feel like in so many ways MIDI was designed to make it possible to do things on little devices without much RAM.
For a moment, forget about your environment and try to think like the original developers. You've got very limited RAM and slightly less limited code space. This means your code has to be clever. It also implies you don't necessarily have to handle the fully general case, where every possibility the parameters seem to allow gets covered. As an example, suppose you have a signed 16-bit value to handle. Does the usage really need to allow for negative values? What about zero (0)? What's the actual, practical range for the value? Figuring out the actual, implicit (and undocumented) constraints can help figure out a practical algorithm.
Software Zen:
delete this;
-
Figuring out the actual, implicit (and undocumented) constraints can help figure out a practical algorithm.
I totally agree with this, and when I originally planned this I was writing clever code. :) I use signed values for most things because MIDI is largely a 7-bit protocol. That way, if I accidentally set the sign bit, the number jumps out at me as negative. In the end I solved it. It took moving my code to a real PC so I could fire up a debugger. It was just too complicated to work through without one.
To err is human. Fortune favors the monsters.
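As an aside, one way to picture the sign-bit trick mentioned here (a toy illustration, not code from the actual project): store 7-bit MIDI values in a signed 8-bit type, and any stray high bit immediately reads as a negative number.

#include <cstdint>
#include <cassert>

int main() {
    int8_t velocity = 0x45;      // a legal 7-bit MIDI value (0..127)
    assert(velocity >= 0);       // fine: the sign bit is clear

    velocity |= 0x80;            // oops: the sign bit gets set by mistake
    assert(velocity < 0);        // the error "jumps out" as a negative value
    return 0;
}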