I figured I'd extract this question from my XG/GS insert post to make it more visible.
How does one go about calculating millisecond time from delta time or vice versa?
I know there are two bytes in the MIDI header that (somehow) specify the ticks/division value, though I am not 100% clear on how they work. Currently I read them and throw them away, but with every new thing I add to my program, I find I have to be more respectful of some of these formerly throw-away bytes.
The first bit selects the format: "0" means "normal (ticks per quarter-note)" and "1" means "SMPTE (ticks per frame)". (I will probably ignore SMPTE, as it has been determined to be HIGHLY uncommon in MIDI files.) How do the rest of the bits go together? Is the value just a straight read of the remaining 15 bits?
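For what it's worth, here is how I currently understand the decoding would go (a Python sketch; `parse_division` is my own name, and the SMPTE branch is my reading of the spec, not tested against real files):

```python
import struct

def parse_division(raw):
    """Decode the two division bytes from a MIDI MThd chunk."""
    (value,) = struct.unpack(">H", raw)
    if value & 0x8000:
        # SMPTE: high byte is the negative frame rate in two's complement
        # (-24, -25, -29, -30), low byte is ticks per frame.
        frames_per_sec = 256 - (value >> 8)
        ticks_per_frame = value & 0xFF
        return ("smpte", frames_per_sec, ticks_per_frame)
    # Metrical timing: the remaining 15 bits are ticks per quarter note.
    return ("metrical", value & 0x7FFF)

print(parse_division(b"\x01\xe0"))  # a common value: 480 ticks per quarter note
```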
I know there are two META events, FF 51 (tempo) and FF 58 (time signature), and tempo seems most relevant.
However, this comment seems to indicate that time signature is also important:
BPM = (60/(500,000e-6))*b/4, with b the lower numeral of the time signature
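As far as I can tell, the time signature denominator only changes the BPM number you would display; converting ticks to wall-clock time needs just the tempo (FF 51 gives microseconds per quarter note) and the header's ticks per quarter note. A sketch of both calculations, with my own function names:

```python
def bpm(tempo_us_per_qn, ts_denominator=4):
    # The quoted formula: at the default tempo of 500,000 us per quarter
    # note and an x/4 time signature, this gives 120 BPM.
    return (60.0 / (tempo_us_per_qn * 1e-6)) * ts_denominator / 4.0

def ms_per_tick(tempo_us_per_qn, ticks_per_qn):
    # One quarter note lasts tempo_us_per_qn microseconds and spans
    # ticks_per_qn ticks, so each tick is this many milliseconds.
    return tempo_us_per_qn / 1000.0 / ticks_per_qn
```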
My goal is to figure out how much delta time I need to insert per MIDI file after I add GM/GM2 System On/Reset (50 ms), XG System Reset (100 ms), and GS System Reset (50 ms) SysEx events to files.
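If I have the pieces above right, the delta-time value covering a given delay in milliseconds would come out as follows (a sketch assuming metrical timing and a known current tempo; `ticks_for_ms` is my own name):

```python
import math

def ticks_for_ms(ms, tempo_us_per_qn, ticks_per_qn):
    # Round up so the inserted pause is at least `ms` long.
    return math.ceil(ms * 1000.0 * ticks_per_qn / tempo_us_per_qn)

# At the default 500,000 us per quarter note with 480 ticks per quarter
# note: 50 ms -> 48 ticks, 100 ms -> 96 ticks.
```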